May 13 14:21:51.971217 kernel: Linux version 6.12.28-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 13 11:28:50 -00 2025 May 13 14:21:51.971261 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=7099d7ee582d4f3e6d25a3763207cfa25fb4eb117c83034e2c517b959b8370a1 May 13 14:21:51.971280 kernel: BIOS-provided physical RAM map: May 13 14:21:51.971300 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable May 13 14:21:51.971314 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved May 13 14:21:51.971328 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved May 13 14:21:51.971345 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable May 13 14:21:51.974584 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved May 13 14:21:51.974595 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved May 13 14:21:51.974603 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved May 13 14:21:51.974611 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable May 13 14:21:51.974619 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved May 13 14:21:51.974631 kernel: NX (Execute Disable) protection: active May 13 14:21:51.974639 kernel: APIC: Static calls initialized May 13 14:21:51.974648 kernel: SMBIOS 3.0.0 present. May 13 14:21:51.974657 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014 May 13 14:21:51.974665 kernel: DMI: Memory slots populated: 1/1 May 13 14:21:51.974674 kernel: Hypervisor detected: KVM May 13 14:21:51.974683 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 May 13 14:21:51.974691 kernel: kvm-clock: using sched offset of 4817587620 cycles May 13 14:21:51.974700 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns May 13 14:21:51.974709 kernel: tsc: Detected 1996.249 MHz processor May 13 14:21:51.974717 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 13 14:21:51.974726 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 13 14:21:51.974735 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000 May 13 14:21:51.974743 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs May 13 14:21:51.974754 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 13 14:21:51.974762 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000 May 13 14:21:51.974771 kernel: ACPI: Early table checksum verification disabled May 13 14:21:51.974779 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS ) May 13 14:21:51.974788 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 13 14:21:51.974796 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 13 14:21:51.974805 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 13 14:21:51.974813 kernel: ACPI: FACS 0x00000000BFFE0000 000040 May 13 14:21:51.974822 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 13 
14:21:51.974832 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 13 14:21:51.974841 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc] May 13 14:21:51.974849 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48] May 13 14:21:51.974858 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] May 13 14:21:51.974867 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] May 13 14:21:51.974878 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] May 13 14:21:51.974893 kernel: No NUMA configuration found May 13 14:21:51.974925 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] May 13 14:21:51.974955 kernel: NODE_DATA(0) allocated [mem 0x13fff5dc0-0x13fffcfff] May 13 14:21:51.974981 kernel: Zone ranges: May 13 14:21:51.975007 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 13 14:21:51.975033 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 13 14:21:51.975059 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] May 13 14:21:51.975088 kernel: Device empty May 13 14:21:51.975114 kernel: Movable zone start for each node May 13 14:21:51.975149 kernel: Early memory node ranges May 13 14:21:51.975176 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 13 14:21:51.975198 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] May 13 14:21:51.975227 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] May 13 14:21:51.975254 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] May 13 14:21:51.975280 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 13 14:21:51.975309 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 13 14:21:51.975336 kernel: On node 0, zone Normal: 35 pages in unavailable ranges May 13 14:21:51.975533 kernel: ACPI: PM-Timer IO Port: 0x608 May 13 14:21:51.975548 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 13 14:21:51.975557 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 13 14:21:51.975566 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 13 14:21:51.975575 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 13 14:21:51.975584 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 13 14:21:51.975593 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 13 14:21:51.975602 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 13 14:21:51.975610 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 13 14:21:51.975619 kernel: CPU topo: Max. logical packages: 2 May 13 14:21:51.975630 kernel: CPU topo: Max. logical dies: 2 May 13 14:21:51.975638 kernel: CPU topo: Max. dies per package: 1 May 13 14:21:51.975647 kernel: CPU topo: Max. threads per core: 1 May 13 14:21:51.975655 kernel: CPU topo: Num. cores per package: 1 May 13 14:21:51.975664 kernel: CPU topo: Num. 
threads per package: 1 May 13 14:21:51.975673 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs May 13 14:21:51.975681 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 13 14:21:51.975690 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices May 13 14:21:51.975698 kernel: Booting paravirtualized kernel on KVM May 13 14:21:51.975710 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 13 14:21:51.975719 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 13 14:21:51.975728 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 May 13 14:21:51.975736 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 May 13 14:21:51.975745 kernel: pcpu-alloc: [0] 0 1 May 13 14:21:51.975753 kernel: kvm-guest: PV spinlocks disabled, no host support May 13 14:21:51.975763 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=7099d7ee582d4f3e6d25a3763207cfa25fb4eb117c83034e2c517b959b8370a1 May 13 14:21:51.975773 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 13 14:21:51.975783 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 13 14:21:51.975792 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 13 14:21:51.975801 kernel: Fallback order for Node 0: 0 May 13 14:21:51.975810 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048443 May 13 14:21:51.975818 kernel: Policy zone: Normal May 13 14:21:51.975827 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 13 14:21:51.975836 kernel: software IO TLB: area num 2. May 13 14:21:51.975844 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 13 14:21:51.975853 kernel: ftrace: allocating 40071 entries in 157 pages May 13 14:21:51.975864 kernel: ftrace: allocated 157 pages with 5 groups May 13 14:21:51.975872 kernel: Dynamic Preempt: voluntary May 13 14:21:51.975881 kernel: rcu: Preemptible hierarchical RCU implementation. May 13 14:21:51.975891 kernel: rcu: RCU event tracing is enabled. May 13 14:21:51.975899 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 13 14:21:51.975908 kernel: Trampoline variant of Tasks RCU enabled. May 13 14:21:51.975917 kernel: Rude variant of Tasks RCU enabled. May 13 14:21:51.975926 kernel: Tracing variant of Tasks RCU enabled. May 13 14:21:51.975934 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 13 14:21:51.975943 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 13 14:21:51.975954 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 13 14:21:51.975963 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 13 14:21:51.975972 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 13 14:21:51.975981 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 May 13 14:21:51.975989 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
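The BIOS-e820 map near the top of this log marks exactly three ranges as usable and leaves the rest firmware-reserved. As a purely illustrative cross-check (addresses copied verbatim from the log), those usable ranges add up to the roughly 4 GiB the kernel later reports as "Memory: 3961272K/4193772K available":

# Illustrative only: the three "usable" ranges from the BIOS-e820 map in this log.
usable = [
    (0x0000000000000000, 0x000000000009fbff),
    (0x0000000000100000, 0x00000000bffdcfff),
    (0x0000000100000000, 0x000000013fffffff),
]
total = sum(end - start + 1 for start, end in usable)
print(f"{total} bytes = {total // 1024} KiB = {total / 2**30:.2f} GiB")
# Prints 4294429696 bytes = 4193779 KiB = 4.00 GiB, within a few KiB of the
# 4193772K total the kernel reports after trimming pages it reserves itself
# (e.g. the "e820: update [mem 0x00000000-0x00000fff] usable ==> reserved" entry).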
May 13 14:21:51.975998 kernel: Console: colour VGA+ 80x25 May 13 14:21:51.976007 kernel: printk: legacy console [tty0] enabled May 13 14:21:51.976015 kernel: printk: legacy console [ttyS0] enabled May 13 14:21:51.976024 kernel: ACPI: Core revision 20240827 May 13 14:21:51.976035 kernel: APIC: Switch to symmetric I/O mode setup May 13 14:21:51.976043 kernel: x2apic enabled May 13 14:21:51.976052 kernel: APIC: Switched APIC routing to: physical x2apic May 13 14:21:51.976061 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 13 14:21:51.976070 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized May 13 14:21:51.976085 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249) May 13 14:21:51.976096 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 May 13 14:21:51.976106 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 May 13 14:21:51.976115 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 13 14:21:51.976124 kernel: Spectre V2 : Mitigation: Retpolines May 13 14:21:51.976134 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT May 13 14:21:51.976144 kernel: Speculative Store Bypass: Vulnerable May 13 14:21:51.976154 kernel: x86/fpu: x87 FPU will use FXSAVE May 13 14:21:51.976163 kernel: Freeing SMP alternatives memory: 32K May 13 14:21:51.976172 kernel: pid_max: default: 32768 minimum: 301 May 13 14:21:51.976181 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 13 14:21:51.976192 kernel: landlock: Up and running. May 13 14:21:51.976201 kernel: SELinux: Initializing. May 13 14:21:51.976210 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 13 14:21:51.976220 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 13 14:21:51.976229 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) May 13 14:21:51.976238 kernel: Performance Events: AMD PMU driver. May 13 14:21:51.976247 kernel: ... version: 0 May 13 14:21:51.976256 kernel: ... bit width: 48 May 13 14:21:51.976265 kernel: ... generic registers: 4 May 13 14:21:51.976276 kernel: ... value mask: 0000ffffffffffff May 13 14:21:51.976285 kernel: ... max period: 00007fffffffffff May 13 14:21:51.976294 kernel: ... fixed-purpose events: 0 May 13 14:21:51.976303 kernel: ... event mask: 000000000000000f May 13 14:21:51.976312 kernel: signal: max sigframe size: 1440 May 13 14:21:51.976321 kernel: rcu: Hierarchical SRCU implementation. May 13 14:21:51.976330 kernel: rcu: Max phase no-delay instances is 400. May 13 14:21:51.980078 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 13 14:21:51.980093 kernel: smp: Bringing up secondary CPUs ... May 13 14:21:51.980103 kernel: smpboot: x86: Booting SMP configuration: May 13 14:21:51.980116 kernel: .... 
node #0, CPUs: #1 May 13 14:21:51.980126 kernel: smp: Brought up 1 node, 2 CPUs May 13 14:21:51.980135 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) May 13 14:21:51.980145 kernel: Memory: 3961272K/4193772K available (14336K kernel code, 2430K rwdata, 9948K rodata, 54420K init, 2548K bss, 227296K reserved, 0K cma-reserved) May 13 14:21:51.980155 kernel: devtmpfs: initialized May 13 14:21:51.980164 kernel: x86/mm: Memory block size: 128MB May 13 14:21:51.980174 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 13 14:21:51.980183 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 13 14:21:51.980192 kernel: pinctrl core: initialized pinctrl subsystem May 13 14:21:51.980203 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 13 14:21:51.980212 kernel: audit: initializing netlink subsys (disabled) May 13 14:21:51.980222 kernel: audit: type=2000 audit(1747146107.992:1): state=initialized audit_enabled=0 res=1 May 13 14:21:51.980231 kernel: thermal_sys: Registered thermal governor 'step_wise' May 13 14:21:51.980240 kernel: thermal_sys: Registered thermal governor 'user_space' May 13 14:21:51.980249 kernel: cpuidle: using governor menu May 13 14:21:51.980259 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 13 14:21:51.980268 kernel: dca service started, version 1.12.1 May 13 14:21:51.980277 kernel: PCI: Using configuration type 1 for base access May 13 14:21:51.980288 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. May 13 14:21:51.980298 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 13 14:21:51.980307 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 13 14:21:51.980316 kernel: ACPI: Added _OSI(Module Device) May 13 14:21:51.980325 kernel: ACPI: Added _OSI(Processor Device) May 13 14:21:51.980334 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 13 14:21:51.980373 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 13 14:21:51.980384 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 13 14:21:51.980401 kernel: ACPI: Interpreter enabled May 13 14:21:51.980431 kernel: ACPI: PM: (supports S0 S3 S5) May 13 14:21:51.980458 kernel: ACPI: Using IOAPIC for interrupt routing May 13 14:21:51.980481 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 13 14:21:51.980508 kernel: PCI: Using E820 reservations for host bridge windows May 13 14:21:51.980538 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F May 13 14:21:51.980558 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 13 14:21:51.980700 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] May 13 14:21:51.980805 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] May 13 14:21:51.980896 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge May 13 14:21:51.980910 kernel: acpiphp: Slot [3] registered May 13 14:21:51.980919 kernel: acpiphp: Slot [4] registered May 13 14:21:51.980929 kernel: acpiphp: Slot [5] registered May 13 14:21:51.980938 kernel: acpiphp: Slot [6] registered May 13 14:21:51.980947 kernel: acpiphp: Slot [7] registered May 13 14:21:51.980956 kernel: acpiphp: Slot [8] registered May 13 14:21:51.980965 kernel: acpiphp: Slot [9] registered May 13 14:21:51.980977 kernel: acpiphp: Slot [10] 
registered May 13 14:21:51.980986 kernel: acpiphp: Slot [11] registered May 13 14:21:51.980995 kernel: acpiphp: Slot [12] registered May 13 14:21:51.981004 kernel: acpiphp: Slot [13] registered May 13 14:21:51.981013 kernel: acpiphp: Slot [14] registered May 13 14:21:51.981022 kernel: acpiphp: Slot [15] registered May 13 14:21:51.981031 kernel: acpiphp: Slot [16] registered May 13 14:21:51.981040 kernel: acpiphp: Slot [17] registered May 13 14:21:51.981049 kernel: acpiphp: Slot [18] registered May 13 14:21:51.981060 kernel: acpiphp: Slot [19] registered May 13 14:21:51.981069 kernel: acpiphp: Slot [20] registered May 13 14:21:51.981078 kernel: acpiphp: Slot [21] registered May 13 14:21:51.981087 kernel: acpiphp: Slot [22] registered May 13 14:21:51.981096 kernel: acpiphp: Slot [23] registered May 13 14:21:51.981105 kernel: acpiphp: Slot [24] registered May 13 14:21:51.981114 kernel: acpiphp: Slot [25] registered May 13 14:21:51.981124 kernel: acpiphp: Slot [26] registered May 13 14:21:51.981133 kernel: acpiphp: Slot [27] registered May 13 14:21:51.981142 kernel: acpiphp: Slot [28] registered May 13 14:21:51.981153 kernel: acpiphp: Slot [29] registered May 13 14:21:51.981162 kernel: acpiphp: Slot [30] registered May 13 14:21:51.981171 kernel: acpiphp: Slot [31] registered May 13 14:21:51.981180 kernel: PCI host bridge to bus 0000:00 May 13 14:21:51.981276 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 13 14:21:51.981372 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] May 13 14:21:51.981454 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 13 14:21:51.981531 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 13 14:21:51.981612 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] May 13 14:21:51.981687 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 13 14:21:51.981791 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint May 13 14:21:51.981894 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint May 13 14:21:51.981994 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint May 13 14:21:51.982082 kernel: pci 0000:00:01.1: BAR 4 [io 0xc120-0xc12f] May 13 14:21:51.982173 kernel: pci 0000:00:01.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk May 13 14:21:51.982259 kernel: pci 0000:00:01.1: BAR 1 [io 0x03f6]: legacy IDE quirk May 13 14:21:51.982347 kernel: pci 0000:00:01.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk May 13 14:21:51.985522 kernel: pci 0000:00:01.1: BAR 3 [io 0x0376]: legacy IDE quirk May 13 14:21:51.985631 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint May 13 14:21:51.985722 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI May 13 14:21:51.985816 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB May 13 14:21:51.985913 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint May 13 14:21:51.986003 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref] May 13 14:21:51.986090 kernel: pci 0000:00:02.0: BAR 2 [mem 0xc000000000-0xc000003fff 64bit pref] May 13 14:21:51.986177 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff] May 13 14:21:51.986263 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref] May 13 14:21:51.986350 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 
0x000c0000-0x000dffff] May 13 14:21:51.987500 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint May 13 14:21:51.987591 kernel: pci 0000:00:03.0: BAR 0 [io 0xc080-0xc0bf] May 13 14:21:51.987679 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff] May 13 14:21:51.987799 kernel: pci 0000:00:03.0: BAR 4 [mem 0xc000004000-0xc000007fff 64bit pref] May 13 14:21:51.987888 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref] May 13 14:21:51.987985 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint May 13 14:21:51.988074 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f] May 13 14:21:51.988168 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff] May 13 14:21:51.988254 kernel: pci 0000:00:04.0: BAR 4 [mem 0xc000008000-0xc00000bfff 64bit pref] May 13 14:21:51.988529 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint May 13 14:21:51.988628 kernel: pci 0000:00:05.0: BAR 0 [io 0xc0c0-0xc0ff] May 13 14:21:51.988715 kernel: pci 0000:00:05.0: BAR 4 [mem 0xc00000c000-0xc00000ffff 64bit pref] May 13 14:21:51.988812 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint May 13 14:21:51.988901 kernel: pci 0000:00:06.0: BAR 0 [io 0xc100-0xc11f] May 13 14:21:51.988994 kernel: pci 0000:00:06.0: BAR 1 [mem 0xfeb93000-0xfeb93fff] May 13 14:21:51.989084 kernel: pci 0000:00:06.0: BAR 4 [mem 0xc000010000-0xc000013fff 64bit pref] May 13 14:21:51.989098 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 13 14:21:51.989108 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 13 14:21:51.989117 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 13 14:21:51.989127 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 13 14:21:51.989137 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 May 13 14:21:51.989146 kernel: iommu: Default domain type: Translated May 13 14:21:51.989159 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 13 14:21:51.989169 kernel: PCI: Using ACPI for IRQ routing May 13 14:21:51.989179 kernel: PCI: pci_cache_line_size set to 64 bytes May 13 14:21:51.989189 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 13 14:21:51.989198 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] May 13 14:21:51.989285 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device May 13 14:21:51.989466 kernel: pci 0000:00:02.0: vgaarb: bridge control possible May 13 14:21:51.989561 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 13 14:21:51.989575 kernel: vgaarb: loaded May 13 14:21:51.989588 kernel: clocksource: Switched to clocksource kvm-clock May 13 14:21:51.989598 kernel: VFS: Disk quotas dquot_6.6.0 May 13 14:21:51.989607 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 13 14:21:51.989617 kernel: pnp: PnP ACPI init May 13 14:21:51.989711 kernel: pnp 00:03: [dma 2] May 13 14:21:51.989726 kernel: pnp: PnP ACPI: found 5 devices May 13 14:21:51.989736 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 13 14:21:51.989745 kernel: NET: Registered PF_INET protocol family May 13 14:21:51.989758 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 13 14:21:51.989767 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 13 14:21:51.989777 kernel: Table-perturb hash table entries: 65536 
(order: 6, 262144 bytes, linear) May 13 14:21:51.990482 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 13 14:21:51.990509 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 13 14:21:51.990520 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 13 14:21:51.990530 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 13 14:21:51.990555 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 13 14:21:51.990565 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 13 14:21:51.990581 kernel: NET: Registered PF_XDP protocol family May 13 14:21:51.990722 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 13 14:21:51.990807 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 13 14:21:51.990885 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 13 14:21:51.990963 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window] May 13 14:21:51.991039 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] May 13 14:21:51.991141 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release May 13 14:21:51.991236 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 13 14:21:51.991254 kernel: PCI: CLS 0 bytes, default 64 May 13 14:21:51.991263 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 13 14:21:51.991274 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) May 13 14:21:51.991283 kernel: Initialise system trusted keyrings May 13 14:21:51.991293 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 13 14:21:51.991302 kernel: Key type asymmetric registered May 13 14:21:51.991311 kernel: Asymmetric key parser 'x509' registered May 13 14:21:51.991321 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 13 14:21:51.991330 kernel: io scheduler mq-deadline registered May 13 14:21:51.991341 kernel: io scheduler kyber registered May 13 14:21:51.991374 kernel: io scheduler bfq registered May 13 14:21:51.991385 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 13 14:21:51.991395 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 May 13 14:21:51.991405 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 May 13 14:21:51.991414 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 May 13 14:21:51.991423 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 May 13 14:21:51.991433 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 13 14:21:51.991442 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 13 14:21:51.991454 kernel: random: crng init done May 13 14:21:51.991463 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 13 14:21:51.991473 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 13 14:21:51.991482 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 13 14:21:51.991580 kernel: rtc_cmos 00:04: RTC can wake from S4 May 13 14:21:51.991595 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 13 14:21:51.991671 kernel: rtc_cmos 00:04: registered as rtc0 May 13 14:21:51.991749 kernel: rtc_cmos 00:04: setting system clock to 2025-05-13T14:21:51 UTC (1747146111) May 13 14:21:51.991832 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram May 13 14:21:51.991846 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 13 14:21:51.991856 kernel: NET: Registered 
PF_INET6 protocol family May 13 14:21:51.991865 kernel: Segment Routing with IPv6 May 13 14:21:51.991874 kernel: In-situ OAM (IOAM) with IPv6 May 13 14:21:51.991883 kernel: NET: Registered PF_PACKET protocol family May 13 14:21:51.991892 kernel: Key type dns_resolver registered May 13 14:21:51.991902 kernel: IPI shorthand broadcast: enabled May 13 14:21:51.991911 kernel: sched_clock: Marking stable (3557007762, 186137989)->(3754958032, -11812281) May 13 14:21:51.991923 kernel: registered taskstats version 1 May 13 14:21:51.991932 kernel: Loading compiled-in X.509 certificates May 13 14:21:51.991942 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.28-flatcar: d81efc2839896c91a2830d4cfad7b0572af8b26a' May 13 14:21:51.991951 kernel: Demotion targets for Node 0: null May 13 14:21:51.991960 kernel: Key type .fscrypt registered May 13 14:21:51.991969 kernel: Key type fscrypt-provisioning registered May 13 14:21:51.991978 kernel: ima: No TPM chip found, activating TPM-bypass! May 13 14:21:51.991987 kernel: ima: Allocated hash algorithm: sha1 May 13 14:21:51.991998 kernel: ima: No architecture policies found May 13 14:21:51.992007 kernel: clk: Disabling unused clocks May 13 14:21:51.992017 kernel: Warning: unable to open an initial console. May 13 14:21:51.992026 kernel: Freeing unused kernel image (initmem) memory: 54420K May 13 14:21:51.992036 kernel: Write protecting the kernel read-only data: 24576k May 13 14:21:51.992045 kernel: Freeing unused kernel image (rodata/data gap) memory: 292K May 13 14:21:51.992054 kernel: Run /init as init process May 13 14:21:51.992063 kernel: with arguments: May 13 14:21:51.992073 kernel: /init May 13 14:21:51.992083 kernel: with environment: May 13 14:21:51.992092 kernel: HOME=/ May 13 14:21:51.992102 kernel: TERM=linux May 13 14:21:51.992110 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 13 14:21:51.992121 systemd[1]: Successfully made /usr/ read-only. May 13 14:21:51.992135 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 14:21:51.992148 systemd[1]: Detected virtualization kvm. May 13 14:21:51.992165 systemd[1]: Detected architecture x86-64. May 13 14:21:51.992176 systemd[1]: Running in initrd. May 13 14:21:51.992186 systemd[1]: No hostname configured, using default hostname. May 13 14:21:51.992197 systemd[1]: Hostname set to . May 13 14:21:51.992206 systemd[1]: Initializing machine ID from VM UUID. May 13 14:21:51.992216 systemd[1]: Queued start job for default target initrd.target. May 13 14:21:51.992228 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 14:21:51.992239 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 14:21:51.992250 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 13 14:21:51.992260 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 13 14:21:51.992270 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 13 14:21:51.992281 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... 
May 13 14:21:51.992293 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 13 14:21:51.992305 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 13 14:21:51.992315 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 14:21:51.992325 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 14:21:51.992335 systemd[1]: Reached target paths.target - Path Units. May 13 14:21:51.992428 systemd[1]: Reached target slices.target - Slice Units. May 13 14:21:51.992438 systemd[1]: Reached target swap.target - Swaps. May 13 14:21:51.992449 systemd[1]: Reached target timers.target - Timer Units. May 13 14:21:51.992459 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 13 14:21:51.992469 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 14:21:51.992482 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 13 14:21:51.992493 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 13 14:21:51.992503 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 14:21:51.992513 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 14:21:51.992523 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 13 14:21:51.992533 systemd[1]: Reached target sockets.target - Socket Units. May 13 14:21:51.992544 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 13 14:21:51.992554 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 14:21:51.992566 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 13 14:21:51.992577 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 13 14:21:51.992588 systemd[1]: Starting systemd-fsck-usr.service... May 13 14:21:51.992599 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 14:21:51.992609 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 14:21:51.992621 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 14:21:51.992631 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 13 14:21:51.992666 systemd-journald[212]: Collecting audit messages is disabled. May 13 14:21:51.992694 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 14:21:51.992705 systemd[1]: Finished systemd-fsck-usr.service. May 13 14:21:51.992715 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 14:21:51.992726 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 13 14:21:51.992736 kernel: Bridge firewalling registered May 13 14:21:51.992746 systemd-journald[212]: Journal started May 13 14:21:51.992771 systemd-journald[212]: Runtime Journal (/run/log/journal/0f871238a9714740baa32d7fc61ac72d) is 8M, max 78.5M, 70.5M free. May 13 14:21:51.993558 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
May 13 14:21:51.944110 systemd-modules-load[214]: Inserted module 'overlay' May 13 14:21:52.033987 systemd[1]: Started systemd-journald.service - Journal Service. May 13 14:21:51.989835 systemd-modules-load[214]: Inserted module 'br_netfilter' May 13 14:21:52.034716 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 14:21:52.035667 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 14:21:52.039461 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 14:21:52.042454 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 14:21:52.043641 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 14:21:52.067884 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 14:21:52.077111 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 14:21:52.080427 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 14:21:52.083630 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 14:21:52.085204 systemd-tmpfiles[233]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 13 14:21:52.088635 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 14:21:52.094471 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 13 14:21:52.098469 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 14:21:52.116178 dracut-cmdline[251]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=7099d7ee582d4f3e6d25a3763207cfa25fb4eb117c83034e2c517b959b8370a1 May 13 14:21:52.137880 systemd-resolved[252]: Positive Trust Anchors: May 13 14:21:52.138546 systemd-resolved[252]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 14:21:52.138588 systemd-resolved[252]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 14:21:52.144478 systemd-resolved[252]: Defaulting to hostname 'linux'. May 13 14:21:52.145351 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 14:21:52.146696 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 14:21:52.188411 kernel: SCSI subsystem initialized May 13 14:21:52.198425 kernel: Loading iSCSI transport class v2.0-870. 
May 13 14:21:52.210431 kernel: iscsi: registered transport (tcp) May 13 14:21:52.233954 kernel: iscsi: registered transport (qla4xxx) May 13 14:21:52.234022 kernel: QLogic iSCSI HBA Driver May 13 14:21:52.257395 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 13 14:21:52.275910 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 13 14:21:52.278241 systemd[1]: Reached target network-pre.target - Preparation for Network. May 13 14:21:52.344972 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 13 14:21:52.347636 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 13 14:21:52.407427 kernel: raid6: sse2x4 gen() 13122 MB/s May 13 14:21:52.425466 kernel: raid6: sse2x2 gen() 15142 MB/s May 13 14:21:52.443786 kernel: raid6: sse2x1 gen() 10119 MB/s May 13 14:21:52.443847 kernel: raid6: using algorithm sse2x2 gen() 15142 MB/s May 13 14:21:52.462807 kernel: raid6: .... xor() 9440 MB/s, rmw enabled May 13 14:21:52.462869 kernel: raid6: using ssse3x2 recovery algorithm May 13 14:21:52.484443 kernel: xor: measuring software checksum speed May 13 14:21:52.484505 kernel: prefetch64-sse : 16132 MB/sec May 13 14:21:52.486926 kernel: generic_sse : 16878 MB/sec May 13 14:21:52.486987 kernel: xor: using function: generic_sse (16878 MB/sec) May 13 14:21:52.683421 kernel: Btrfs loaded, zoned=no, fsverity=no May 13 14:21:52.692155 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 13 14:21:52.695489 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 14:21:52.744616 systemd-udevd[461]: Using default interface naming scheme 'v255'. May 13 14:21:52.757495 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 14:21:52.764533 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 13 14:21:52.790186 dracut-pre-trigger[471]: rd.md=0: removing MD RAID activation May 13 14:21:52.825975 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 13 14:21:52.828434 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 14:21:52.891909 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 14:21:52.899729 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 13 14:21:52.986416 kernel: libata version 3.00 loaded. May 13 14:21:52.989470 kernel: ata_piix 0000:00:01.1: version 2.13 May 13 14:21:52.994425 kernel: scsi host0: ata_piix May 13 14:21:53.002388 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues May 13 14:21:53.018384 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB) May 13 14:21:53.018559 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 May 13 14:21:53.024480 kernel: scsi host1: ata_piix May 13 14:21:53.024651 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 lpm-pol 0 May 13 14:21:53.036163 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 lpm-pol 0 May 13 14:21:53.036203 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 13 14:21:53.036216 kernel: GPT:17805311 != 20971519 May 13 14:21:53.036228 kernel: GPT:Alternate GPT header not at the end of the disk. 
May 13 14:21:53.036245 kernel: GPT:17805311 != 20971519 May 13 14:21:53.036256 kernel: GPT: Use GNU Parted to correct GPT errors. May 13 14:21:53.036269 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 14:21:53.039881 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 14:21:53.040024 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 14:21:53.040824 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 13 14:21:53.043573 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 14:21:53.045137 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 13 14:21:53.107301 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 14:21:53.282126 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 13 14:21:53.283697 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 13 14:21:53.303323 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 13 14:21:53.312690 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 13 14:21:53.314329 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 13 14:21:53.338327 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 13 14:21:53.340617 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 13 14:21:53.341991 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 14:21:53.344101 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 14:21:53.348513 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 13 14:21:53.353583 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 13 14:21:53.378007 disk-uuid[568]: Primary Header is updated. May 13 14:21:53.378007 disk-uuid[568]: Secondary Entries is updated. May 13 14:21:53.378007 disk-uuid[568]: Secondary Header is updated. May 13 14:21:53.386613 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 13 14:21:53.399608 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 14:21:54.421560 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 14:21:54.423341 disk-uuid[573]: The operation has completed successfully. May 13 14:21:54.507012 systemd[1]: disk-uuid.service: Deactivated successfully. May 13 14:21:54.507846 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 13 14:21:54.554126 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 13 14:21:54.583719 sh[587]: Success May 13 14:21:54.609269 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 13 14:21:54.609321 kernel: device-mapper: uevent: version 1.0.3 May 13 14:21:54.612378 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 13 14:21:54.626407 kernel: device-mapper: verity: sha256 using shash "sha256-ssse3" May 13 14:21:54.721794 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 13 14:21:54.728557 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
May 13 14:21:54.751408 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 13 14:21:54.778411 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 13 14:21:54.787424 kernel: BTRFS: device fsid 3042589c-b63f-42f0-9a6f-a4369b1889f9 devid 1 transid 40 /dev/mapper/usr (253:0) scanned by mount (599) May 13 14:21:54.798462 kernel: BTRFS info (device dm-0): first mount of filesystem 3042589c-b63f-42f0-9a6f-a4369b1889f9 May 13 14:21:54.798564 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 13 14:21:54.798596 kernel: BTRFS info (device dm-0): using free-space-tree May 13 14:21:54.826410 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 13 14:21:54.828455 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 13 14:21:54.830498 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 13 14:21:54.833588 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 13 14:21:54.840567 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 13 14:21:54.889422 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (636) May 13 14:21:54.905144 kernel: BTRFS info (device vda6): first mount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1 May 13 14:21:54.905211 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 13 14:21:54.905244 kernel: BTRFS info (device vda6): using free-space-tree May 13 14:21:54.919376 kernel: BTRFS info (device vda6): last unmount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1 May 13 14:21:54.920855 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 13 14:21:54.923134 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 13 14:21:54.951082 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 14:21:54.956277 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 14:21:54.997695 systemd-networkd[769]: lo: Link UP May 13 14:21:54.997706 systemd-networkd[769]: lo: Gained carrier May 13 14:21:54.998787 systemd-networkd[769]: Enumeration completed May 13 14:21:54.998897 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 14:21:54.999582 systemd-networkd[769]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 14:21:54.999587 systemd-networkd[769]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 14:21:55.000482 systemd-networkd[769]: eth0: Link UP May 13 14:21:55.000486 systemd-networkd[769]: eth0: Gained carrier May 13 14:21:55.000495 systemd-networkd[769]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 14:21:55.000725 systemd[1]: Reached target network.target - Network. 
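verity-setup.service in the entries above assembles /dev/mapper/usr as a dm-verity device whose root hash is pinned on the kernel command line (verity.usrhash=7099d7ee...). The sketch below is only one hedged way to inspect the resulting mapping on the booted host; it assumes the stock veritysetup and dmsetup CLIs are available, that the mapping keeps the name "usr" seen in /dev/mapper/usr, and that it runs as root. It is not part of Flatcar's own tooling.

import subprocess

# Query the dm-verity mapping behind /dev/mapper/usr (illustrative only).
for cmd in (["veritysetup", "status", "usr"],   # verity parameters and status
            ["dmsetup", "table", "usr"]):       # raw device-mapper table line
    result = subprocess.run(cmd, capture_output=True, text=True)
    print("$", " ".join(cmd))
    print(result.stdout or result.stderr)

dm-verity protects a read-only volume, block-checked against that root hash on every read, which is consistent with the mount.usrflags=ro flag on the command line.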
May 13 14:21:55.014419 systemd-networkd[769]: eth0: DHCPv4 address 172.24.4.33/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 13 14:21:55.136638 ignition[735]: Ignition 2.21.0 May 13 14:21:55.136665 ignition[735]: Stage: fetch-offline May 13 14:21:55.136729 ignition[735]: no configs at "/usr/lib/ignition/base.d" May 13 14:21:55.136750 ignition[735]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 13 14:21:55.136947 ignition[735]: parsed url from cmdline: "" May 13 14:21:55.136955 ignition[735]: no config URL provided May 13 14:21:55.136968 ignition[735]: reading system config file "/usr/lib/ignition/user.ign" May 13 14:21:55.136985 ignition[735]: no config at "/usr/lib/ignition/user.ign" May 13 14:21:55.142334 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 13 14:21:55.136996 ignition[735]: failed to fetch config: resource requires networking May 13 14:21:55.139191 ignition[735]: Ignition finished successfully May 13 14:21:55.147305 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 13 14:21:55.174017 ignition[780]: Ignition 2.21.0 May 13 14:21:55.174034 ignition[780]: Stage: fetch May 13 14:21:55.174200 ignition[780]: no configs at "/usr/lib/ignition/base.d" May 13 14:21:55.174212 ignition[780]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 13 14:21:55.174308 ignition[780]: parsed url from cmdline: "" May 13 14:21:55.174313 ignition[780]: no config URL provided May 13 14:21:55.174319 ignition[780]: reading system config file "/usr/lib/ignition/user.ign" May 13 14:21:55.174329 ignition[780]: no config at "/usr/lib/ignition/user.ign" May 13 14:21:55.174532 ignition[780]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 May 13 14:21:55.174597 ignition[780]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... May 13 14:21:55.174665 ignition[780]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... May 13 14:21:55.425525 ignition[780]: GET result: OK May 13 14:21:55.425795 ignition[780]: parsing config with SHA512: 29d90146ce04ba3d4ab0150be54ab4542b7e0c4b7dcb98613c5176ffc7b8278b6ed3193f3961c693959b57dddf14eab7cfc772936ec971623ae73b11c3e4440d May 13 14:21:55.441843 unknown[780]: fetched base config from "system" May 13 14:21:55.441867 unknown[780]: fetched base config from "system" May 13 14:21:55.442708 ignition[780]: fetch: fetch complete May 13 14:21:55.441881 unknown[780]: fetched user config from "openstack" May 13 14:21:55.442720 ignition[780]: fetch: fetch passed May 13 14:21:55.449952 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 13 14:21:55.442804 ignition[780]: Ignition finished successfully May 13 14:21:55.454224 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 13 14:21:55.512809 ignition[786]: Ignition 2.21.0 May 13 14:21:55.512849 ignition[786]: Stage: kargs May 13 14:21:55.513203 ignition[786]: no configs at "/usr/lib/ignition/base.d" May 13 14:21:55.513227 ignition[786]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 13 14:21:55.515313 ignition[786]: kargs: kargs passed May 13 14:21:55.517957 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 13 14:21:55.515440 ignition[786]: Ignition finished successfully May 13 14:21:55.523110 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
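In the fetch stage logged above, Ignition finds no config drive (neither /dev/disk/by-label/config-2 nor CONFIG-2 ever appears) and falls back to the OpenStack metadata service at http://169.254.169.254/openstack/latest/user_data, which returns "GET result: OK". Below is a minimal Python sketch of that same lookup order, illustrative only; Ignition itself is a Go program and additionally retries, validates, and merges the base configs noted in the log.

import os
import urllib.request

# Labels and URL taken from the Ignition entries above.
CONFIG_DRIVE_LABELS = ("config-2", "CONFIG-2")
METADATA_URL = "http://169.254.169.254/openstack/latest/user_data"

def fetch_user_data() -> bytes:
    # Prefer an attached config drive (none showed up in this boot).
    for label in CONFIG_DRIVE_LABELS:
        if os.path.exists(f"/dev/disk/by-label/{label}"):
            raise RuntimeError(f"config drive '{label}' present; mount it and read its user_data instead")
    # The link-local metadata address is only reachable from inside the guest.
    with urllib.request.urlopen(METADATA_URL, timeout=10) as resp:
        return resp.read()

if __name__ == "__main__":
    print(f"fetched {len(fetch_user_data())} bytes of user_data")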
May 13 14:21:55.568632 ignition[792]: Ignition 2.21.0 May 13 14:21:55.568665 ignition[792]: Stage: disks May 13 14:21:55.569023 ignition[792]: no configs at "/usr/lib/ignition/base.d" May 13 14:21:55.569048 ignition[792]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 13 14:21:55.574267 ignition[792]: disks: disks passed May 13 14:21:55.574431 ignition[792]: Ignition finished successfully May 13 14:21:55.577745 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 13 14:21:55.579886 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 13 14:21:55.582161 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 13 14:21:55.585149 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 14:21:55.588122 systemd[1]: Reached target sysinit.target - System Initialization. May 13 14:21:55.590775 systemd[1]: Reached target basic.target - Basic System. May 13 14:21:55.596316 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 13 14:21:55.658583 systemd-fsck[801]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks May 13 14:21:55.673233 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 13 14:21:55.677969 systemd[1]: Mounting sysroot.mount - /sysroot... May 13 14:21:55.883979 kernel: EXT4-fs (vda9): mounted filesystem ebf7ca75-051f-4154-b098-5ec24084105d r/w with ordered data mode. Quota mode: none. May 13 14:21:55.888640 systemd[1]: Mounted sysroot.mount - /sysroot. May 13 14:21:55.890783 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 13 14:21:55.895547 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 14:21:55.905537 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 13 14:21:55.910912 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 13 14:21:55.917623 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... May 13 14:21:55.921552 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 13 14:21:55.921622 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 13 14:21:55.933235 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 13 14:21:55.940593 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 13 14:21:55.943935 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (809) May 13 14:21:55.945535 kernel: BTRFS info (device vda6): first mount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1 May 13 14:21:55.945581 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 13 14:21:55.945604 kernel: BTRFS info (device vda6): using free-space-tree May 13 14:21:55.984077 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 14:21:56.064003 initrd-setup-root[838]: cut: /sysroot/etc/passwd: No such file or directory May 13 14:21:56.069425 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:21:56.073884 initrd-setup-root[846]: cut: /sysroot/etc/group: No such file or directory May 13 14:21:56.081641 initrd-setup-root[853]: cut: /sysroot/etc/shadow: No such file or directory May 13 14:21:56.090137 initrd-setup-root[860]: cut: /sysroot/etc/gshadow: No such file or directory May 13 14:21:56.185623 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 13 14:21:56.188193 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 13 14:21:56.189875 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 13 14:21:56.202636 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 13 14:21:56.205504 kernel: BTRFS info (device vda6): last unmount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1 May 13 14:21:56.226826 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 13 14:21:56.233205 ignition[928]: INFO : Ignition 2.21.0 May 13 14:21:56.233205 ignition[928]: INFO : Stage: mount May 13 14:21:56.234401 ignition[928]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 14:21:56.234401 ignition[928]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 13 14:21:56.234401 ignition[928]: INFO : mount: mount passed May 13 14:21:56.234401 ignition[928]: INFO : Ignition finished successfully May 13 14:21:56.235801 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 13 14:21:57.021762 systemd-networkd[769]: eth0: Gained IPv6LL May 13 14:21:57.114443 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:21:59.125429 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:03.141432 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:03.148949 coreos-metadata[811]: May 13 14:22:03.148 WARN failed to locate config-drive, using the metadata service API instead May 13 14:22:03.189719 coreos-metadata[811]: May 13 14:22:03.189 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 13 14:22:03.206120 coreos-metadata[811]: May 13 14:22:03.206 INFO Fetch successful May 13 14:22:03.206120 coreos-metadata[811]: May 13 14:22:03.206 INFO wrote hostname ci-9999-9-100-9699b4e791.novalocal to /sysroot/etc/hostname May 13 14:22:03.209764 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. May 13 14:22:03.209990 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. May 13 14:22:03.218558 systemd[1]: Starting ignition-files.service - Ignition (files)... May 13 14:22:03.263051 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 14:22:03.298484 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (946) May 13 14:22:03.307123 kernel: BTRFS info (device vda6): first mount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1 May 13 14:22:03.307218 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 13 14:22:03.311553 kernel: BTRFS info (device vda6): using free-space-tree May 13 14:22:03.325467 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 14:22:03.377646 ignition[964]: INFO : Ignition 2.21.0 May 13 14:22:03.377646 ignition[964]: INFO : Stage: files May 13 14:22:03.380568 ignition[964]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 14:22:03.380568 ignition[964]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 13 14:22:03.380568 ignition[964]: DEBUG : files: compiled without relabeling support, skipping May 13 14:22:03.386500 ignition[964]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 13 14:22:03.386500 ignition[964]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 13 14:22:03.386500 ignition[964]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 13 14:22:03.386500 ignition[964]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 13 14:22:03.394720 ignition[964]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 13 14:22:03.394720 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 13 14:22:03.394720 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 13 14:22:03.387097 unknown[964]: wrote ssh authorized keys file for user: core May 13 14:22:03.536803 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 13 14:22:04.300829 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 13 14:22:04.300829 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 13 14:22:04.307262 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 13 14:22:04.307262 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 13 14:22:04.307262 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 13 14:22:04.307262 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 14:22:04.307262 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 14:22:04.307262 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 14:22:04.307262 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 14:22:04.322084 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 13 14:22:04.322084 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 13 14:22:04.322084 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 14:22:04.322084 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 14:22:04.322084 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 14:22:04.322084 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 May 13 14:22:05.206512 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 13 14:22:07.543990 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 14:22:07.543990 ignition[964]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 13 14:22:07.550228 ignition[964]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 14:22:07.558051 ignition[964]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 14:22:07.558051 ignition[964]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 13 14:22:07.558051 ignition[964]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 13 14:22:07.558051 ignition[964]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 13 14:22:07.570287 ignition[964]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 13 14:22:07.570287 ignition[964]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 13 14:22:07.570287 ignition[964]: INFO : files: files passed May 13 14:22:07.570287 ignition[964]: INFO : Ignition finished successfully May 13 14:22:07.561904 systemd[1]: Finished ignition-files.service - Ignition (files). May 13 14:22:07.569627 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 13 14:22:07.577607 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 13 14:22:07.591063 systemd[1]: ignition-quench.service: Deactivated successfully. May 13 14:22:07.596555 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 13 14:22:07.617892 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 14:22:07.617892 initrd-setup-root-after-ignition[993]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 13 14:22:07.623520 initrd-setup-root-after-ignition[996]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 14:22:07.626941 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 14:22:07.629340 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 13 14:22:07.633650 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 13 14:22:07.701860 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 13 14:22:07.702124 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 13 14:22:07.705606 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
May 13 14:22:07.707974 systemd[1]: Reached target initrd.target - Initrd Default Target. May 13 14:22:07.711012 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 13 14:22:07.712769 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 13 14:22:07.752776 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 14:22:07.758077 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 13 14:22:07.796688 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 13 14:22:07.798405 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 14:22:07.801639 systemd[1]: Stopped target timers.target - Timer Units. May 13 14:22:07.804728 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 13 14:22:07.805030 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 14:22:07.808078 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 13 14:22:07.810187 systemd[1]: Stopped target basic.target - Basic System. May 13 14:22:07.813172 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 13 14:22:07.816009 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 13 14:22:07.818621 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 13 14:22:07.821738 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 13 14:22:07.824646 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 13 14:22:07.827695 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 13 14:22:07.830854 systemd[1]: Stopped target sysinit.target - System Initialization. May 13 14:22:07.833626 systemd[1]: Stopped target local-fs.target - Local File Systems. May 13 14:22:07.836749 systemd[1]: Stopped target swap.target - Swaps. May 13 14:22:07.839438 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 13 14:22:07.839823 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 13 14:22:07.842816 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 13 14:22:07.844760 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 14:22:07.847270 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 13 14:22:07.848028 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 14:22:07.850304 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 13 14:22:07.850754 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 13 14:22:07.854564 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 13 14:22:07.854881 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 14:22:07.856805 systemd[1]: ignition-files.service: Deactivated successfully. May 13 14:22:07.857180 systemd[1]: Stopped ignition-files.service - Ignition (files). May 13 14:22:07.862806 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 13 14:22:07.871799 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 13 14:22:07.873842 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
May 13 14:22:07.874837 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 13 14:22:07.877898 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 13 14:22:07.878100 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 13 14:22:07.887681 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 13 14:22:07.891642 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 13 14:22:07.904893 ignition[1017]: INFO : Ignition 2.21.0 May 13 14:22:07.906733 ignition[1017]: INFO : Stage: umount May 13 14:22:07.906733 ignition[1017]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 14:22:07.906733 ignition[1017]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 13 14:22:07.909874 ignition[1017]: INFO : umount: umount passed May 13 14:22:07.909874 ignition[1017]: INFO : Ignition finished successfully May 13 14:22:07.911416 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 13 14:22:07.912025 systemd[1]: ignition-mount.service: Deactivated successfully. May 13 14:22:07.913234 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 13 14:22:07.914577 systemd[1]: sysroot-boot.service: Deactivated successfully. May 13 14:22:07.914674 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 13 14:22:07.915958 systemd[1]: ignition-disks.service: Deactivated successfully. May 13 14:22:07.916020 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 13 14:22:07.916777 systemd[1]: ignition-kargs.service: Deactivated successfully. May 13 14:22:07.916817 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 13 14:22:07.917764 systemd[1]: ignition-fetch.service: Deactivated successfully. May 13 14:22:07.917802 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 13 14:22:07.918703 systemd[1]: Stopped target network.target - Network. May 13 14:22:07.919681 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 13 14:22:07.919726 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 13 14:22:07.920784 systemd[1]: Stopped target paths.target - Path Units. May 13 14:22:07.921792 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 13 14:22:07.922044 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 14:22:07.922829 systemd[1]: Stopped target slices.target - Slice Units. May 13 14:22:07.923848 systemd[1]: Stopped target sockets.target - Socket Units. May 13 14:22:07.924864 systemd[1]: iscsid.socket: Deactivated successfully. May 13 14:22:07.924898 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 13 14:22:07.925999 systemd[1]: iscsiuio.socket: Deactivated successfully. May 13 14:22:07.926031 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 14:22:07.927145 systemd[1]: ignition-setup.service: Deactivated successfully. May 13 14:22:07.927190 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 13 14:22:07.928139 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 13 14:22:07.928178 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 13 14:22:07.929306 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 13 14:22:07.929372 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. 
May 13 14:22:07.930675 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 13 14:22:07.931672 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 13 14:22:07.934729 systemd[1]: systemd-resolved.service: Deactivated successfully. May 13 14:22:07.934836 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 13 14:22:07.937750 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 13 14:22:07.938221 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 13 14:22:07.938265 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 14:22:07.942763 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 13 14:22:07.944943 systemd[1]: systemd-networkd.service: Deactivated successfully. May 13 14:22:07.945036 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 13 14:22:07.946851 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 13 14:22:07.947118 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 13 14:22:07.947755 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 13 14:22:07.947797 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 13 14:22:07.949691 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 13 14:22:07.950907 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 13 14:22:07.950953 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 14:22:07.952769 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 13 14:22:07.952835 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 13 14:22:07.954511 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 13 14:22:07.954560 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 13 14:22:07.955158 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 14:22:07.957272 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 13 14:22:07.964833 systemd[1]: systemd-udevd.service: Deactivated successfully. May 13 14:22:07.965468 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 14:22:07.966626 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 13 14:22:07.966682 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 13 14:22:07.967185 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 13 14:22:07.967215 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 13 14:22:07.967776 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 13 14:22:07.967817 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 13 14:22:07.970469 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 13 14:22:07.970513 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 13 14:22:07.971843 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 14:22:07.971890 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 14:22:07.975492 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
May 13 14:22:07.976104 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 13 14:22:07.976152 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 13 14:22:07.977736 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 13 14:22:07.977780 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 14:22:07.978935 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 13 14:22:07.978976 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 14:22:07.980205 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 13 14:22:07.980246 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 13 14:22:07.981302 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 14:22:07.981342 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 14:22:07.985124 systemd[1]: network-cleanup.service: Deactivated successfully. May 13 14:22:07.985220 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 13 14:22:07.989528 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 13 14:22:07.989610 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 13 14:22:07.990324 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 13 14:22:07.991974 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 13 14:22:08.010291 systemd[1]: Switching root. May 13 14:22:08.044419 systemd-journald[212]: Journal stopped May 13 14:22:09.796544 systemd-journald[212]: Received SIGTERM from PID 1 (systemd). May 13 14:22:09.796610 kernel: SELinux: policy capability network_peer_controls=1 May 13 14:22:09.796629 kernel: SELinux: policy capability open_perms=1 May 13 14:22:09.796641 kernel: SELinux: policy capability extended_socket_class=1 May 13 14:22:09.796653 kernel: SELinux: policy capability always_check_network=0 May 13 14:22:09.796668 kernel: SELinux: policy capability cgroup_seclabel=1 May 13 14:22:09.796680 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 13 14:22:09.796691 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 13 14:22:09.796702 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 13 14:22:09.796713 kernel: SELinux: policy capability userspace_initial_context=0 May 13 14:22:09.796724 kernel: audit: type=1403 audit(1747146128.682:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 13 14:22:09.796741 systemd[1]: Successfully loaded SELinux policy in 69.180ms. May 13 14:22:09.796759 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.798ms. May 13 14:22:09.796773 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 14:22:09.796786 systemd[1]: Detected virtualization kvm. May 13 14:22:09.796798 systemd[1]: Detected architecture x86-64. May 13 14:22:09.796810 systemd[1]: Detected first boot. May 13 14:22:09.796822 systemd[1]: Hostname set to . May 13 14:22:09.796834 systemd[1]: Initializing machine ID from VM UUID. 
May 13 14:22:09.796849 zram_generator::config[1061]: No configuration found. May 13 14:22:09.796862 kernel: Guest personality initialized and is inactive May 13 14:22:09.796873 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 13 14:22:09.796884 kernel: Initialized host personality May 13 14:22:09.796895 kernel: NET: Registered PF_VSOCK protocol family May 13 14:22:09.796906 systemd[1]: Populated /etc with preset unit settings. May 13 14:22:09.796922 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 13 14:22:09.796934 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 13 14:22:09.796948 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 13 14:22:09.796960 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 13 14:22:09.796973 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 13 14:22:09.796985 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 13 14:22:09.796997 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 13 14:22:09.797009 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 13 14:22:09.797021 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 13 14:22:09.797034 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 13 14:22:09.797046 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 13 14:22:09.797060 systemd[1]: Created slice user.slice - User and Session Slice. May 13 14:22:09.797072 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 14:22:09.797084 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 14:22:09.797096 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 13 14:22:09.797109 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 13 14:22:09.797121 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 13 14:22:09.797135 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 13 14:22:09.797147 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 13 14:22:09.797159 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 14:22:09.797175 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 14:22:09.797187 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 13 14:22:09.797199 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 13 14:22:09.797211 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 13 14:22:09.797223 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 13 14:22:09.797235 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 14:22:09.797248 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 14:22:09.797261 systemd[1]: Reached target slices.target - Slice Units. May 13 14:22:09.797273 systemd[1]: Reached target swap.target - Swaps. May 13 14:22:09.797285 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. 
May 13 14:22:09.797297 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 13 14:22:09.797309 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 13 14:22:09.797321 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 14:22:09.797333 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 14:22:09.797348 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 13 14:22:09.803094 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 13 14:22:09.803121 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 13 14:22:09.803134 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 13 14:22:09.803147 systemd[1]: Mounting media.mount - External Media Directory... May 13 14:22:09.803160 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 14:22:09.803173 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 13 14:22:09.803185 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 13 14:22:09.803198 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 13 14:22:09.803212 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 13 14:22:09.803227 systemd[1]: Reached target machines.target - Containers. May 13 14:22:09.803241 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 13 14:22:09.803254 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 14:22:09.803267 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 14:22:09.803281 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 13 14:22:09.803294 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 14:22:09.803307 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 14:22:09.803320 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 14:22:09.803335 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 13 14:22:09.803348 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 14:22:09.803383 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 13 14:22:09.803398 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 13 14:22:09.803410 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 13 14:22:09.803423 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 13 14:22:09.803435 systemd[1]: Stopped systemd-fsck-usr.service. May 13 14:22:09.803449 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 14:22:09.803465 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 14:22:09.803477 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
May 13 14:22:09.803490 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 13 14:22:09.803502 kernel: loop: module loaded May 13 14:22:09.803514 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 13 14:22:09.803527 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 13 14:22:09.803542 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 14:22:09.803557 systemd[1]: verity-setup.service: Deactivated successfully. May 13 14:22:09.803569 systemd[1]: Stopped verity-setup.service. May 13 14:22:09.803583 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 14:22:09.803596 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 13 14:22:09.803611 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 13 14:22:09.803624 systemd[1]: Mounted media.mount - External Media Directory. May 13 14:22:09.803637 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 13 14:22:09.803650 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 13 14:22:09.803663 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 13 14:22:09.803675 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 14:22:09.803688 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 13 14:22:09.803701 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 13 14:22:09.803714 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 14:22:09.803729 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 14:22:09.803744 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 14:22:09.803757 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 14:22:09.803769 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 14:22:09.803781 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 14:22:09.803792 kernel: fuse: init (API version 7.41) May 13 14:22:09.803803 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 14:22:09.803815 kernel: ACPI: bus type drm_connector registered May 13 14:22:09.803826 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 13 14:22:09.803840 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 14:22:09.803852 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 14:22:09.803863 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 13 14:22:09.803875 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 13 14:22:09.803889 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 13 14:22:09.803901 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 13 14:22:09.803956 systemd-journald[1144]: Collecting audit messages is disabled. May 13 14:22:09.803984 systemd[1]: Reached target network-pre.target - Preparation for Network. May 13 14:22:09.803998 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 13 14:22:09.804010 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
May 13 14:22:09.804023 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 13 14:22:09.804036 systemd-journald[1144]: Journal started May 13 14:22:09.804061 systemd-journald[1144]: Runtime Journal (/run/log/journal/0f871238a9714740baa32d7fc61ac72d) is 8M, max 78.5M, 70.5M free. May 13 14:22:09.382433 systemd[1]: Queued start job for default target multi-user.target. May 13 14:22:09.401618 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. May 13 14:22:09.402088 systemd[1]: systemd-journald.service: Deactivated successfully. May 13 14:22:09.809423 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 14:22:09.814412 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 13 14:22:09.818092 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 13 14:22:09.818137 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 14:22:09.828115 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 13 14:22:09.828172 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 14:22:09.841413 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 13 14:22:09.845391 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 14:22:09.854482 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 14:22:09.857425 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 13 14:22:09.865394 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 14:22:09.873383 systemd[1]: Started systemd-journald.service - Journal Service. May 13 14:22:09.879608 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 13 14:22:09.880571 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 14:22:09.882112 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 13 14:22:09.883837 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 13 14:22:09.886456 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 13 14:22:09.892958 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 14:22:09.914424 kernel: loop0: detected capacity change from 0 to 146240 May 13 14:22:09.920782 systemd-tmpfiles[1182]: ACLs are not supported, ignoring. May 13 14:22:09.920800 systemd-tmpfiles[1182]: ACLs are not supported, ignoring. May 13 14:22:09.928820 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 14:22:09.931407 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 13 14:22:09.937208 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 13 14:22:09.941822 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 13 14:22:09.944490 systemd[1]: Starting systemd-sysusers.service - Create System Users... 
May 13 14:22:09.968901 systemd-journald[1144]: Time spent on flushing to /var/log/journal/0f871238a9714740baa32d7fc61ac72d is 23.116ms for 981 entries. May 13 14:22:09.968901 systemd-journald[1144]: System Journal (/var/log/journal/0f871238a9714740baa32d7fc61ac72d) is 8M, max 584.8M, 576.8M free. May 13 14:22:10.043824 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 13 14:22:10.043878 systemd-journald[1144]: Received client request to flush runtime journal. May 13 14:22:10.043916 kernel: loop1: detected capacity change from 0 to 8 May 13 14:22:10.043940 kernel: loop2: detected capacity change from 0 to 205544 May 13 14:22:10.045952 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 13 14:22:10.065382 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 13 14:22:10.096271 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 13 14:22:10.104030 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 14:22:10.110255 kernel: loop3: detected capacity change from 0 to 113872 May 13 14:22:10.146494 systemd-tmpfiles[1222]: ACLs are not supported, ignoring. May 13 14:22:10.146513 systemd-tmpfiles[1222]: ACLs are not supported, ignoring. May 13 14:22:10.158390 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 14:22:10.168655 kernel: loop4: detected capacity change from 0 to 146240 May 13 14:22:10.232203 kernel: loop5: detected capacity change from 0 to 8 May 13 14:22:10.232320 kernel: loop6: detected capacity change from 0 to 205544 May 13 14:22:10.294397 kernel: loop7: detected capacity change from 0 to 113872 May 13 14:22:10.323767 (sd-merge)[1226]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. May 13 14:22:10.324828 (sd-merge)[1226]: Merged extensions into '/usr'. May 13 14:22:10.335573 systemd[1]: Reload requested from client PID 1181 ('systemd-sysext') (unit systemd-sysext.service)... May 13 14:22:10.335592 systemd[1]: Reloading... May 13 14:22:10.438389 zram_generator::config[1249]: No configuration found. May 13 14:22:10.605794 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 14:22:10.740573 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 13 14:22:10.740749 systemd[1]: Reloading finished in 404 ms. May 13 14:22:10.761307 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 13 14:22:10.771873 systemd[1]: Starting ensure-sysext.service... May 13 14:22:10.777636 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 14:22:10.802508 systemd[1]: Reload requested from client PID 1307 ('systemctl') (unit ensure-sysext.service)... May 13 14:22:10.802525 systemd[1]: Reloading... May 13 14:22:10.817069 systemd-tmpfiles[1308]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 13 14:22:10.817446 systemd-tmpfiles[1308]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 13 14:22:10.817735 systemd-tmpfiles[1308]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
May 13 14:22:10.818076 systemd-tmpfiles[1308]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 13 14:22:10.819156 systemd-tmpfiles[1308]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 13 14:22:10.819524 systemd-tmpfiles[1308]: ACLs are not supported, ignoring. May 13 14:22:10.819703 systemd-tmpfiles[1308]: ACLs are not supported, ignoring. May 13 14:22:10.824250 systemd-tmpfiles[1308]: Detected autofs mount point /boot during canonicalization of boot. May 13 14:22:10.824409 systemd-tmpfiles[1308]: Skipping /boot May 13 14:22:10.834881 systemd-tmpfiles[1308]: Detected autofs mount point /boot during canonicalization of boot. May 13 14:22:10.835036 systemd-tmpfiles[1308]: Skipping /boot May 13 14:22:10.900395 zram_generator::config[1333]: No configuration found. May 13 14:22:10.916008 ldconfig[1177]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 13 14:22:11.026341 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 14:22:11.126641 systemd[1]: Reloading finished in 323 ms. May 13 14:22:11.147458 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 13 14:22:11.148550 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 13 14:22:11.149346 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 14:22:11.164490 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 14:22:11.167674 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 13 14:22:11.170538 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 13 14:22:11.178000 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 14:22:11.182584 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 14:22:11.189888 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 13 14:22:11.198665 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 14:22:11.198855 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 14:22:11.201769 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 14:22:11.211071 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 14:22:11.214975 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 14:22:11.215697 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 14:22:11.215820 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 14:22:11.215948 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 14:22:11.219871 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
May 13 14:22:11.224796 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 14:22:11.224995 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 14:22:11.225171 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 14:22:11.225276 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 14:22:11.225448 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 14:22:11.232250 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 14:22:11.232553 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 14:22:11.250156 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 14:22:11.251568 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 14:22:11.251707 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 14:22:11.251911 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 14:22:11.257477 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 14:22:11.257667 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 14:22:11.263844 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 13 14:22:11.265976 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 14:22:11.266147 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 14:22:11.267911 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 14:22:11.268412 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 14:22:11.273798 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 14:22:11.276749 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 14:22:11.279044 systemd[1]: Finished ensure-sysext.service. May 13 14:22:11.285686 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 13 14:22:11.288086 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 14:22:11.288165 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 14:22:11.288833 systemd-udevd[1399]: Using default interface naming scheme 'v255'. May 13 14:22:11.293870 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 13 14:22:11.296476 systemd[1]: Starting systemd-update-done.service - Update is Completed... 
May 13 14:22:11.321554 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 13 14:22:11.330766 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 13 14:22:11.331969 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 14:22:11.333915 augenrules[1435]: No rules May 13 14:22:11.336067 systemd[1]: audit-rules.service: Deactivated successfully. May 13 14:22:11.337326 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 14:22:11.341813 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 14:22:11.345582 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 14:22:11.357842 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 13 14:22:11.476025 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 13 14:22:11.509690 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 13 14:22:11.527559 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 13 14:22:11.554804 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 13 14:22:11.597847 systemd-networkd[1445]: lo: Link UP May 13 14:22:11.598123 systemd-networkd[1445]: lo: Gained carrier May 13 14:22:11.598868 systemd-networkd[1445]: Enumeration completed May 13 14:22:11.599016 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 14:22:11.601920 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 13 14:22:11.610511 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 13 14:22:11.631425 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 13 14:22:11.643383 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 May 13 14:22:11.649374 kernel: mousedev: PS/2 mouse device common for all mice May 13 14:22:11.649423 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 May 13 14:22:11.652130 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD May 13 14:22:11.660314 systemd-networkd[1445]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 14:22:11.660676 systemd-networkd[1445]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 14:22:11.661539 systemd-networkd[1445]: eth0: Link UP May 13 14:22:11.661778 systemd-networkd[1445]: eth0: Gained carrier May 13 14:22:11.662415 systemd-networkd[1445]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 14:22:11.674280 kernel: ACPI: button: Power Button [PWRF] May 13 14:22:11.674421 systemd-networkd[1445]: eth0: DHCPv4 address 172.24.4.33/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 13 14:22:11.683252 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 13 14:22:11.686753 systemd[1]: Reached target time-set.target - System Time Set. 
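For reference, a small stdlib-only Python sketch deriving the network parameters implied by the DHCPv4 lease logged above for eth0 (address 172.24.4.33/24, gateway 172.24.4.1 from 172.24.4.1); the input values are taken from the log, the derivation is added here only for readability.

# Stdlib-only derivation of the network parameters implied by the DHCPv4 lease
# logged above for eth0 (address 172.24.4.33/24, gateway 172.24.4.1).
import ipaddress

iface = ipaddress.ip_interface("172.24.4.33/24")
gateway = ipaddress.ip_address("172.24.4.1")

print("network:          ", iface.network)                    # 172.24.4.0/24
print("netmask:          ", iface.network.netmask)            # 255.255.255.0
print("broadcast:        ", iface.network.broadcast_address)  # 172.24.4.255
print("gateway in subnet:", gateway in iface.network)         # True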
May 13 14:22:11.693048 systemd-resolved[1398]: Positive Trust Anchors: May 13 14:22:11.693063 systemd-resolved[1398]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 14:22:11.693106 systemd-resolved[1398]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 14:22:11.700072 systemd-resolved[1398]: Using system hostname 'ci-9999-9-100-9699b4e791.novalocal'. May 13 14:22:11.701647 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 14:22:11.702272 systemd[1]: Reached target network.target - Network. May 13 14:22:11.702747 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 14:22:11.703324 systemd[1]: Reached target sysinit.target - System Initialization. May 13 14:22:11.705046 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 13 14:22:11.705622 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 13 14:22:11.706154 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 13 14:22:11.706874 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 13 14:22:11.707586 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 13 14:22:11.708373 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 13 14:22:11.708901 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 13 14:22:11.708941 systemd[1]: Reached target paths.target - Path Units. May 13 14:22:11.709417 systemd[1]: Reached target timers.target - Timer Units. May 13 14:22:11.712595 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 13 14:22:11.714503 systemd[1]: Starting docker.socket - Docker Socket for the API... May 13 14:22:11.719732 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 13 14:22:11.720461 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 13 14:22:11.720993 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 13 14:22:11.723717 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 13 14:22:11.731587 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 13 14:22:11.733745 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 13 14:22:11.736109 systemd[1]: Reached target sockets.target - Socket Units. May 13 14:22:11.736629 systemd[1]: Reached target basic.target - Basic System. May 13 14:22:11.737471 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 13 14:22:11.737501 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. 
May 13 14:22:11.738960 systemd[1]: Starting containerd.service - containerd container runtime... May 13 14:22:11.743566 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 13 14:22:11.746441 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 13 14:22:11.761540 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 13 14:22:11.784456 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:11.790951 systemd-timesyncd[1426]: Contacted time server 148.113.194.34:123 (0.flatcar.pool.ntp.org). May 13 14:22:11.791025 systemd-timesyncd[1426]: Initial clock synchronization to Tue 2025-05-13 14:22:11.756447 UTC. May 13 14:22:11.794886 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 13 14:22:11.798483 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 13 14:22:11.799017 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 13 14:22:11.808436 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 13 14:22:11.814535 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 13 14:22:11.817171 jq[1514]: false May 13 14:22:11.818582 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 13 14:22:11.824490 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 13 14:22:11.827657 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 13 14:22:11.836907 systemd[1]: Starting systemd-logind.service - User Login Management... May 13 14:22:11.838387 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 13 14:22:11.838918 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 13 14:22:11.839631 systemd[1]: Starting update-engine.service - Update Engine... 
May 13 14:22:11.847261 extend-filesystems[1517]: Found loop4 May 13 14:22:11.848206 extend-filesystems[1517]: Found loop5 May 13 14:22:11.848206 extend-filesystems[1517]: Found loop6 May 13 14:22:11.848206 extend-filesystems[1517]: Found loop7 May 13 14:22:11.848206 extend-filesystems[1517]: Found vda May 13 14:22:11.848206 extend-filesystems[1517]: Found vda1 May 13 14:22:11.848206 extend-filesystems[1517]: Found vda2 May 13 14:22:11.848206 extend-filesystems[1517]: Found vda3 May 13 14:22:11.848206 extend-filesystems[1517]: Found usr May 13 14:22:11.848206 extend-filesystems[1517]: Found vda4 May 13 14:22:11.848206 extend-filesystems[1517]: Found vda6 May 13 14:22:11.848206 extend-filesystems[1517]: Found vda7 May 13 14:22:11.848206 extend-filesystems[1517]: Found vda9 May 13 14:22:11.848206 extend-filesystems[1517]: Checking size of /dev/vda9 May 13 14:22:11.943441 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks May 13 14:22:11.943470 kernel: EXT4-fs (vda9): resized filesystem to 2014203 May 13 14:22:11.858200 oslogin_cache_refresh[1518]: Refreshing passwd entry cache May 13 14:22:11.943592 google_oslogin_nss_cache[1518]: oslogin_cache_refresh[1518]: Refreshing passwd entry cache May 13 14:22:11.943592 google_oslogin_nss_cache[1518]: oslogin_cache_refresh[1518]: Failure getting users, quitting May 13 14:22:11.943592 google_oslogin_nss_cache[1518]: oslogin_cache_refresh[1518]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 13 14:22:11.943592 google_oslogin_nss_cache[1518]: oslogin_cache_refresh[1518]: Refreshing group entry cache May 13 14:22:11.943592 google_oslogin_nss_cache[1518]: oslogin_cache_refresh[1518]: Failure getting groups, quitting May 13 14:22:11.943592 google_oslogin_nss_cache[1518]: oslogin_cache_refresh[1518]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 13 14:22:11.848697 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 13 14:22:11.943885 extend-filesystems[1517]: Resized partition /dev/vda9 May 13 14:22:11.876801 oslogin_cache_refresh[1518]: Failure getting users, quitting May 13 14:22:11.868537 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 13 14:22:11.947939 extend-filesystems[1542]: resize2fs 1.47.2 (1-Jan-2025) May 13 14:22:11.876819 oslogin_cache_refresh[1518]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 13 14:22:11.869402 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 13 14:22:11.948712 jq[1526]: true May 13 14:22:11.876870 oslogin_cache_refresh[1518]: Refreshing group entry cache May 13 14:22:11.869623 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 13 14:22:11.948942 update_engine[1525]: I20250513 14:22:11.926208 1525 main.cc:92] Flatcar Update Engine starting May 13 14:22:11.894512 oslogin_cache_refresh[1518]: Failure getting groups, quitting May 13 14:22:11.870881 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 13 14:22:11.894524 oslogin_cache_refresh[1518]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 13 14:22:11.874618 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 13 14:22:11.896222 systemd[1]: google-oslogin-cache.service: Deactivated successfully. 
May 13 14:22:11.950198 jq[1534]: true May 13 14:22:11.897292 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 13 14:22:11.930755 systemd[1]: motdgen.service: Deactivated successfully. May 13 14:22:11.930995 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 13 14:22:11.944461 (ntainerd)[1551]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 13 14:22:11.976388 extend-filesystems[1542]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 13 14:22:11.976388 extend-filesystems[1542]: old_desc_blocks = 1, new_desc_blocks = 1 May 13 14:22:11.976388 extend-filesystems[1542]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long. May 13 14:22:11.969555 dbus-daemon[1507]: [system] SELinux support is enabled May 13 14:22:11.969566 systemd[1]: extend-filesystems.service: Deactivated successfully. May 13 14:22:12.001685 update_engine[1525]: I20250513 14:22:11.983143 1525 update_check_scheduler.cc:74] Next update check in 7m24s May 13 14:22:12.001715 extend-filesystems[1517]: Resized filesystem in /dev/vda9 May 13 14:22:12.008538 tar[1532]: linux-amd64/helm May 13 14:22:11.969766 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 13 14:22:11.970714 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 13 14:22:11.979056 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 13 14:22:11.979259 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 13 14:22:11.984860 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 13 14:22:11.985011 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 13 14:22:11.990165 systemd[1]: Started update-engine.service - Update Engine. May 13 14:22:11.998451 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 14:22:12.005631 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 13 14:22:12.107515 bash[1574]: Updated "/home/core/.ssh/authorized_keys" May 13 14:22:12.110773 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 13 14:22:12.115392 systemd[1]: Starting sshkeys.service... May 13 14:22:12.135577 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 May 13 14:22:12.160654 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 13 14:22:12.162694 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 13 14:22:12.183383 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:12.201005 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 13 14:22:12.204384 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console May 13 14:22:12.254787 systemd-logind[1524]: Watching system buttons on /dev/input/event2 (Power Button) May 13 14:22:12.254810 systemd-logind[1524]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 13 14:22:12.255074 systemd-logind[1524]: New seat seat0. 
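update-engine and locksmithd together handle Flatcar's automatic updates and coordinated reboots; the scheduler above has already queued its first check ("Next update check in 7m24s"). A short sketch of inspecting that state from a shell on the node, assuming the stock Flatcar client tools:

    # Query the update engine's current operation and the reboot manager's state.
    update_engine_client -status   # e.g. CURRENT_OP=UPDATE_STATUS_IDLE
    locksmithctl status            # shows whether a reboot lock is currently held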
May 13 14:22:12.255953 systemd[1]: Started systemd-logind.service - User Login Management. May 13 14:22:12.260104 locksmithd[1567]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 13 14:22:12.400281 kernel: Console: switching to colour dummy device 80x25 May 13 14:22:12.400827 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 14:22:12.415037 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 13 14:22:12.415089 kernel: [drm] features: -context_init May 13 14:22:12.433607 kernel: [drm] number of scanouts: 1 May 13 14:22:12.433699 kernel: [drm] number of cap sets: 0 May 13 14:22:12.434330 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 14:22:12.434637 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 14:22:12.444489 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 13 14:22:12.450564 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0 May 13 14:22:12.447697 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 14:22:12.449347 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 13 14:22:12.532982 containerd[1551]: time="2025-05-13T14:22:12Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 13 14:22:12.534471 containerd[1551]: time="2025-05-13T14:22:12.534446655Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 13 14:22:12.549540 containerd[1551]: time="2025-05-13T14:22:12.549493915Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.728µs" May 13 14:22:12.549656 containerd[1551]: time="2025-05-13T14:22:12.549640175Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 13 14:22:12.549721 containerd[1551]: time="2025-05-13T14:22:12.549706567Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 13 14:22:12.549934 containerd[1551]: time="2025-05-13T14:22:12.549916259Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 13 14:22:12.550016 containerd[1551]: time="2025-05-13T14:22:12.549999627Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 13 14:22:12.550090 containerd[1551]: time="2025-05-13T14:22:12.550075315Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 14:22:12.550205 containerd[1551]: time="2025-05-13T14:22:12.550186545Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 14:22:12.550262 containerd[1551]: time="2025-05-13T14:22:12.550248188Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 14:22:12.550598 containerd[1551]: time="2025-05-13T14:22:12.550543266Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs 
type=io.containerd.snapshotter.v1 May 13 14:22:12.550669 containerd[1551]: time="2025-05-13T14:22:12.550653336Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 14:22:12.550728 containerd[1551]: time="2025-05-13T14:22:12.550714059Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 14:22:12.550779 containerd[1551]: time="2025-05-13T14:22:12.550766316Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 13 14:22:12.550915 containerd[1551]: time="2025-05-13T14:22:12.550897619Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 13 14:22:12.551182 containerd[1551]: time="2025-05-13T14:22:12.551163896Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 14:22:12.551264 containerd[1551]: time="2025-05-13T14:22:12.551247053Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 14:22:12.551318 containerd[1551]: time="2025-05-13T14:22:12.551305756Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 13 14:22:12.551434 containerd[1551]: time="2025-05-13T14:22:12.551417156Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 13 14:22:12.551821 containerd[1551]: time="2025-05-13T14:22:12.551790164Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 13 14:22:12.551926 containerd[1551]: time="2025-05-13T14:22:12.551910311Z" level=info msg="metadata content store policy set" policy=shared May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563097772Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563149588Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563167483Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563186568Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563201114Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563213741Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563228467Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563245012Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563260558Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563277743Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563288780Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563308265Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563495673Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 13 14:22:12.563764 containerd[1551]: time="2025-05-13T14:22:12.563524945Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563540811Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563552317Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563563235Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563577611Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563595925Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563608142Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563621199Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563632925Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563645282Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563722161Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 13 14:22:12.564083 containerd[1551]: time="2025-05-13T14:22:12.563738786Z" level=info msg="Start snapshots syncer" May 13 14:22:12.566033 containerd[1551]: time="2025-05-13T14:22:12.565377755Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 13 14:22:12.569954 containerd[1551]: time="2025-05-13T14:22:12.569780704Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 13 14:22:12.570711 containerd[1551]: time="2025-05-13T14:22:12.570593190Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 13 14:22:12.572156 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
May 13 14:22:12.572466 containerd[1551]: time="2025-05-13T14:22:12.572280766Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573465610Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573496402Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573510208Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573522485Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573536921Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573548707Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573560764Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573589236Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573601723Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 13 14:22:12.573647 containerd[1551]: time="2025-05-13T14:22:12.573614129Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574398943Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574425246Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574436373Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574498416Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574510083Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574522519Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574535116Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574552361Z" level=info msg="runtime interface created" May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574557760Z" level=info 
msg="created NRI interface" May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574565768Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574582973Z" level=info msg="Connect containerd service" May 13 14:22:12.574639 containerd[1551]: time="2025-05-13T14:22:12.574609105Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 13 14:22:12.582487 containerd[1551]: time="2025-05-13T14:22:12.579914266Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 14:22:12.844559 containerd[1551]: time="2025-05-13T14:22:12.844459295Z" level=info msg="Start subscribing containerd event" May 13 14:22:12.844559 containerd[1551]: time="2025-05-13T14:22:12.844533474Z" level=info msg="Start recovering state" May 13 14:22:12.844693 containerd[1551]: time="2025-05-13T14:22:12.844654181Z" level=info msg="Start event monitor" May 13 14:22:12.844719 containerd[1551]: time="2025-05-13T14:22:12.844690391Z" level=info msg="Start cni network conf syncer for default" May 13 14:22:12.844719 containerd[1551]: time="2025-05-13T14:22:12.844701897Z" level=info msg="Start streaming server" May 13 14:22:12.844719 containerd[1551]: time="2025-05-13T14:22:12.844713244Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 13 14:22:12.844790 containerd[1551]: time="2025-05-13T14:22:12.844722032Z" level=info msg="runtime interface starting up..." May 13 14:22:12.844790 containerd[1551]: time="2025-05-13T14:22:12.844729060Z" level=info msg="starting plugins..." May 13 14:22:12.844790 containerd[1551]: time="2025-05-13T14:22:12.844742646Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 13 14:22:12.845714 containerd[1551]: time="2025-05-13T14:22:12.845682747Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 13 14:22:12.846061 containerd[1551]: time="2025-05-13T14:22:12.846044648Z" level=info msg=serving... address=/run/containerd/containerd.sock May 13 14:22:12.846190 containerd[1551]: time="2025-05-13T14:22:12.846175682Z" level=info msg="containerd successfully booted in 0.313591s" May 13 14:22:12.846286 systemd[1]: Started containerd.service - containerd container runtime. May 13 14:22:12.889406 tar[1532]: linux-amd64/LICENSE May 13 14:22:12.889812 tar[1532]: linux-amd64/README.md May 13 14:22:12.905330 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 13 14:22:13.021480 systemd-networkd[1445]: eth0: Gained IPv6LL May 13 14:22:13.027400 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 13 14:22:13.030991 systemd[1]: Reached target network-online.target - Network is Online. May 13 14:22:13.036880 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:22:13.044042 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 13 14:22:13.110480 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 13 14:22:13.164507 sshd_keygen[1549]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 13 14:22:13.187232 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 13 14:22:13.191391 systemd[1]: Starting issuegen.service - Generate /run/issue... 
May 13 14:22:13.193514 systemd[1]: Started sshd@0-172.24.4.33:22-172.24.4.1:55908.service - OpenSSH per-connection server daemon (172.24.4.1:55908). May 13 14:22:13.207195 systemd[1]: issuegen.service: Deactivated successfully. May 13 14:22:13.208790 systemd[1]: Finished issuegen.service - Generate /run/issue. May 13 14:22:13.213400 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 13 14:22:13.235882 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 13 14:22:13.237954 systemd[1]: Started getty@tty1.service - Getty on tty1. May 13 14:22:13.240652 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 13 14:22:13.240962 systemd[1]: Reached target getty.target - Login Prompts. May 13 14:22:13.542416 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:13.545421 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:14.120184 sshd[1641]: Accepted publickey for core from 172.24.4.1 port 55908 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:14.124735 sshd-session[1641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:14.155656 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 13 14:22:14.158214 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 13 14:22:14.169562 systemd-logind[1524]: New session 1 of user core. May 13 14:22:14.197707 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 13 14:22:14.201855 systemd[1]: Starting user@500.service - User Manager for UID 500... May 13 14:22:14.214561 (systemd)[1654]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 13 14:22:14.222104 systemd-logind[1524]: New session c1 of user core. May 13 14:22:14.422733 systemd[1654]: Queued start job for default target default.target. May 13 14:22:14.429212 systemd[1654]: Created slice app.slice - User Application Slice. May 13 14:22:14.429239 systemd[1654]: Reached target paths.target - Paths. May 13 14:22:14.429279 systemd[1654]: Reached target timers.target - Timers. May 13 14:22:14.433447 systemd[1654]: Starting dbus.socket - D-Bus User Message Bus Socket... May 13 14:22:14.441882 systemd[1654]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 13 14:22:14.443268 systemd[1654]: Reached target sockets.target - Sockets. May 13 14:22:14.443312 systemd[1654]: Reached target basic.target - Basic System. May 13 14:22:14.443347 systemd[1654]: Reached target default.target - Main User Target. May 13 14:22:14.443559 systemd[1654]: Startup finished in 205ms. May 13 14:22:14.443775 systemd[1]: Started user@500.service - User Manager for UID 500. May 13 14:22:14.455563 systemd[1]: Started session-1.scope - Session 1 of User core. May 13 14:22:14.672984 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 14:22:14.688825 (kubelet)[1667]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 14:22:14.940687 systemd[1]: Started sshd@1-172.24.4.33:22-172.24.4.1:32896.service - OpenSSH per-connection server daemon (172.24.4.1:32896). 
May 13 14:22:15.570481 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:15.570621 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:15.770448 kubelet[1667]: E0513 14:22:15.770373 1667 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 14:22:15.775743 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 14:22:15.776550 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 14:22:15.777777 systemd[1]: kubelet.service: Consumed 1.740s CPU time, 236.1M memory peak. May 13 14:22:16.654335 sshd[1676]: Accepted publickey for core from 172.24.4.1 port 32896 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:16.656895 sshd-session[1676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:16.667655 systemd-logind[1524]: New session 2 of user core. May 13 14:22:16.678760 systemd[1]: Started session-2.scope - Session 2 of User core. May 13 14:22:17.295727 sshd[1681]: Connection closed by 172.24.4.1 port 32896 May 13 14:22:17.298048 sshd-session[1676]: pam_unix(sshd:session): session closed for user core May 13 14:22:17.313143 systemd[1]: sshd@1-172.24.4.33:22-172.24.4.1:32896.service: Deactivated successfully. May 13 14:22:17.316663 systemd[1]: session-2.scope: Deactivated successfully. May 13 14:22:17.319041 systemd-logind[1524]: Session 2 logged out. Waiting for processes to exit. May 13 14:22:17.323858 systemd-logind[1524]: Removed session 2. May 13 14:22:17.326003 systemd[1]: Started sshd@2-172.24.4.33:22-172.24.4.1:32898.service - OpenSSH per-connection server daemon (172.24.4.1:32898). May 13 14:22:18.423840 login[1648]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 13 14:22:18.424114 login[1647]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 13 14:22:18.434886 systemd-logind[1524]: New session 4 of user core. May 13 14:22:18.441785 systemd[1]: Started session-4.scope - Session 4 of User core. May 13 14:22:18.448269 sshd[1687]: Accepted publickey for core from 172.24.4.1 port 32898 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:18.451482 systemd-logind[1524]: New session 3 of user core. May 13 14:22:18.454318 sshd-session[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:18.458477 systemd[1]: Started session-3.scope - Session 3 of User core. May 13 14:22:18.481124 systemd-logind[1524]: New session 5 of user core. May 13 14:22:18.496836 systemd[1]: Started session-5.scope - Session 5 of User core. May 13 14:22:19.188237 sshd[1696]: Connection closed by 172.24.4.1 port 32898 May 13 14:22:19.189207 sshd-session[1687]: pam_unix(sshd:session): session closed for user core May 13 14:22:19.197798 systemd[1]: sshd@2-172.24.4.33:22-172.24.4.1:32898.service: Deactivated successfully. May 13 14:22:19.201912 systemd[1]: session-5.scope: Deactivated successfully. May 13 14:22:19.204317 systemd-logind[1524]: Session 5 logged out. Waiting for processes to exit. May 13 14:22:19.207458 systemd-logind[1524]: Removed session 5. 
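The kubelet exits above because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-managed node that file is written during `kubeadm init` or `kubeadm join`, so the crash loop resolves itself once the node is bootstrapped. For reference, a minimal hand-written configuration in the standard kubelet.config.k8s.io/v1beta1 schema might look like the sketch below (the field values are illustrative assumptions, not taken from this system):

    # Hypothetical minimal /var/lib/kubelet/config.yaml
    sudo mkdir -p /var/lib/kubelet
    sudo tee /var/lib/kubelet/config.yaml >/dev/null <<'EOF'
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd          # matches the CRI runtime's SystemdCgroup=true above
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    authentication:
      anonymous:
        enabled: false
    EOF
    sudo systemctl restart kubelet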
May 13 14:22:19.601452 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:19.601632 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 13 14:22:19.618605 coreos-metadata[1506]: May 13 14:22:19.618 WARN failed to locate config-drive, using the metadata service API instead May 13 14:22:19.625580 coreos-metadata[1582]: May 13 14:22:19.625 WARN failed to locate config-drive, using the metadata service API instead May 13 14:22:19.668911 coreos-metadata[1582]: May 13 14:22:19.668 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 May 13 14:22:19.669573 coreos-metadata[1506]: May 13 14:22:19.669 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 May 13 14:22:19.858784 coreos-metadata[1506]: May 13 14:22:19.858 INFO Fetch successful May 13 14:22:19.858784 coreos-metadata[1506]: May 13 14:22:19.858 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 13 14:22:19.872989 coreos-metadata[1506]: May 13 14:22:19.872 INFO Fetch successful May 13 14:22:19.872989 coreos-metadata[1506]: May 13 14:22:19.872 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 May 13 14:22:19.886460 coreos-metadata[1506]: May 13 14:22:19.886 INFO Fetch successful May 13 14:22:19.886460 coreos-metadata[1506]: May 13 14:22:19.886 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 May 13 14:22:19.900183 coreos-metadata[1506]: May 13 14:22:19.900 INFO Fetch successful May 13 14:22:19.900183 coreos-metadata[1506]: May 13 14:22:19.900 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 May 13 14:22:19.912682 coreos-metadata[1506]: May 13 14:22:19.912 INFO Fetch successful May 13 14:22:19.912682 coreos-metadata[1506]: May 13 14:22:19.912 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 May 13 14:22:19.925185 coreos-metadata[1506]: May 13 14:22:19.925 INFO Fetch successful May 13 14:22:19.961557 coreos-metadata[1582]: May 13 14:22:19.960 INFO Fetch successful May 13 14:22:19.961557 coreos-metadata[1582]: May 13 14:22:19.961 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 May 13 14:22:19.968904 coreos-metadata[1582]: May 13 14:22:19.968 INFO Fetch successful May 13 14:22:19.971941 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 13 14:22:19.973747 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 13 14:22:19.974460 unknown[1582]: wrote ssh authorized keys file for user: core May 13 14:22:20.003810 update-ssh-keys[1730]: Updated "/home/core/.ssh/authorized_keys" May 13 14:22:20.004309 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 13 14:22:20.006751 systemd[1]: Finished sshkeys.service. May 13 14:22:20.007803 systemd[1]: Reached target multi-user.target - Multi-User System. May 13 14:22:20.009602 systemd[1]: Startup finished in 3.705s (kernel) + 16.950s (initrd) + 11.396s (userspace) = 32.053s. May 13 14:22:25.888182 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 13 14:22:25.891290 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:22:26.238654 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
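With no config-drive attached (the repeated "Can't lookup blockdev" messages for /dev/disk/by-label/config-2), coreos-metadata falls back to the link-local metadata service and fetches the endpoints listed above. The same data can be queried by hand from the node:

    # Query the OpenStack/EC2-compatible metadata service the agent fell back to.
    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/latest/meta-data/local-ipv4
    curl -s http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key
    curl -s http://169.254.169.254/openstack/2012-08-10/meta_data.json | head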
May 13 14:22:26.254897 (kubelet)[1742]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 14:22:26.341723 kubelet[1742]: E0513 14:22:26.341636 1742 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 14:22:26.350438 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 14:22:26.350969 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 14:22:26.352140 systemd[1]: kubelet.service: Consumed 306ms CPU time, 93.7M memory peak. May 13 14:22:29.204204 systemd[1]: Started sshd@3-172.24.4.33:22-172.24.4.1:38238.service - OpenSSH per-connection server daemon (172.24.4.1:38238). May 13 14:22:30.372212 sshd[1750]: Accepted publickey for core from 172.24.4.1 port 38238 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:30.374953 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:30.386467 systemd-logind[1524]: New session 6 of user core. May 13 14:22:30.397662 systemd[1]: Started session-6.scope - Session 6 of User core. May 13 14:22:30.974654 sshd[1752]: Connection closed by 172.24.4.1 port 38238 May 13 14:22:30.976902 sshd-session[1750]: pam_unix(sshd:session): session closed for user core May 13 14:22:30.988293 systemd[1]: sshd@3-172.24.4.33:22-172.24.4.1:38238.service: Deactivated successfully. May 13 14:22:30.991779 systemd[1]: session-6.scope: Deactivated successfully. May 13 14:22:30.993997 systemd-logind[1524]: Session 6 logged out. Waiting for processes to exit. May 13 14:22:30.999803 systemd[1]: Started sshd@4-172.24.4.33:22-172.24.4.1:38240.service - OpenSSH per-connection server daemon (172.24.4.1:38240). May 13 14:22:31.001999 systemd-logind[1524]: Removed session 6. May 13 14:22:32.153565 sshd[1758]: Accepted publickey for core from 172.24.4.1 port 38240 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:32.156213 sshd-session[1758]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:32.167920 systemd-logind[1524]: New session 7 of user core. May 13 14:22:32.186733 systemd[1]: Started session-7.scope - Session 7 of User core. May 13 14:22:32.780055 sshd[1760]: Connection closed by 172.24.4.1 port 38240 May 13 14:22:32.784115 sshd-session[1758]: pam_unix(sshd:session): session closed for user core May 13 14:22:32.798222 systemd[1]: sshd@4-172.24.4.33:22-172.24.4.1:38240.service: Deactivated successfully. May 13 14:22:32.802191 systemd[1]: session-7.scope: Deactivated successfully. May 13 14:22:32.804404 systemd-logind[1524]: Session 7 logged out. Waiting for processes to exit. May 13 14:22:32.810417 systemd[1]: Started sshd@5-172.24.4.33:22-172.24.4.1:38242.service - OpenSSH per-connection server daemon (172.24.4.1:38242). May 13 14:22:32.814544 systemd-logind[1524]: Removed session 7. May 13 14:22:33.936929 sshd[1766]: Accepted publickey for core from 172.24.4.1 port 38242 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:33.939529 sshd-session[1766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:33.951484 systemd-logind[1524]: New session 8 of user core. 
May 13 14:22:33.958772 systemd[1]: Started session-8.scope - Session 8 of User core. May 13 14:22:34.679403 sshd[1768]: Connection closed by 172.24.4.1 port 38242 May 13 14:22:34.678338 sshd-session[1766]: pam_unix(sshd:session): session closed for user core May 13 14:22:34.695271 systemd[1]: sshd@5-172.24.4.33:22-172.24.4.1:38242.service: Deactivated successfully. May 13 14:22:34.698781 systemd[1]: session-8.scope: Deactivated successfully. May 13 14:22:34.701070 systemd-logind[1524]: Session 8 logged out. Waiting for processes to exit. May 13 14:22:34.706625 systemd[1]: Started sshd@6-172.24.4.33:22-172.24.4.1:56176.service - OpenSSH per-connection server daemon (172.24.4.1:56176). May 13 14:22:34.708929 systemd-logind[1524]: Removed session 8. May 13 14:22:35.858736 sshd[1774]: Accepted publickey for core from 172.24.4.1 port 56176 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:35.861557 sshd-session[1774]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:35.872874 systemd-logind[1524]: New session 9 of user core. May 13 14:22:35.881659 systemd[1]: Started session-9.scope - Session 9 of User core. May 13 14:22:36.316226 sudo[1777]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 13 14:22:36.316952 sudo[1777]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 14:22:36.334828 sudo[1777]: pam_unix(sudo:session): session closed for user root May 13 14:22:36.387933 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 13 14:22:36.391794 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:22:36.502036 sshd[1776]: Connection closed by 172.24.4.1 port 56176 May 13 14:22:36.501810 sshd-session[1774]: pam_unix(sshd:session): session closed for user core May 13 14:22:36.520475 systemd[1]: sshd@6-172.24.4.33:22-172.24.4.1:56176.service: Deactivated successfully. May 13 14:22:36.524608 systemd[1]: session-9.scope: Deactivated successfully. May 13 14:22:36.528543 systemd-logind[1524]: Session 9 logged out. Waiting for processes to exit. May 13 14:22:36.533717 systemd[1]: Started sshd@7-172.24.4.33:22-172.24.4.1:56178.service - OpenSSH per-connection server daemon (172.24.4.1:56178). May 13 14:22:36.536733 systemd-logind[1524]: Removed session 9. May 13 14:22:36.725755 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 14:22:36.740243 (kubelet)[1792]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 14:22:36.826458 kubelet[1792]: E0513 14:22:36.826409 1792 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 14:22:36.831587 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 14:22:36.831723 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 14:22:36.832190 systemd[1]: kubelet.service: Consumed 278ms CPU time, 96.1M memory peak. 
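systemd keeps restarting the kubelet on a timer (the restart counter climbs with each attempt), which is why the same config-file error repeats throughout the log. A quick way to watch the loop from a shell, assuming standard systemd tooling:

    # Inspect the restart loop: counter, last exit status, and recent log lines.
    systemctl show kubelet.service -p NRestarts -p ExecMainStatus -p Result
    journalctl -u kubelet.service -n 20 --no-pager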
May 13 14:22:37.771584 sshd[1786]: Accepted publickey for core from 172.24.4.1 port 56178 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:37.773867 sshd-session[1786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:37.781173 systemd-logind[1524]: New session 10 of user core. May 13 14:22:37.791504 systemd[1]: Started session-10.scope - Session 10 of User core. May 13 14:22:38.245964 sudo[1802]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 13 14:22:38.247146 sudo[1802]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 14:22:38.257599 sudo[1802]: pam_unix(sudo:session): session closed for user root May 13 14:22:38.268496 sudo[1801]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 13 14:22:38.269078 sudo[1801]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 14:22:38.292579 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 14:22:38.362611 augenrules[1824]: No rules May 13 14:22:38.364577 systemd[1]: audit-rules.service: Deactivated successfully. May 13 14:22:38.365094 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 14:22:38.367714 sudo[1801]: pam_unix(sudo:session): session closed for user root May 13 14:22:38.512025 sshd[1800]: Connection closed by 172.24.4.1 port 56178 May 13 14:22:38.513461 sshd-session[1786]: pam_unix(sshd:session): session closed for user core May 13 14:22:38.531887 systemd[1]: sshd@7-172.24.4.33:22-172.24.4.1:56178.service: Deactivated successfully. May 13 14:22:38.535245 systemd[1]: session-10.scope: Deactivated successfully. May 13 14:22:38.537408 systemd-logind[1524]: Session 10 logged out. Waiting for processes to exit. May 13 14:22:38.542280 systemd[1]: Started sshd@8-172.24.4.33:22-172.24.4.1:56194.service - OpenSSH per-connection server daemon (172.24.4.1:56194). May 13 14:22:38.545685 systemd-logind[1524]: Removed session 10. May 13 14:22:39.655270 sshd[1833]: Accepted publickey for core from 172.24.4.1 port 56194 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:22:39.657245 sshd-session[1833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:22:39.667524 systemd-logind[1524]: New session 11 of user core. May 13 14:22:39.679730 systemd[1]: Started session-11.scope - Session 11 of User core. May 13 14:22:40.128834 sudo[1836]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 13 14:22:40.130226 sudo[1836]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 14:22:40.852472 systemd[1]: Starting docker.service - Docker Application Container Engine... May 13 14:22:40.876951 (dockerd)[1854]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 13 14:22:41.392214 dockerd[1854]: time="2025-05-13T14:22:41.392125482Z" level=info msg="Starting up" May 13 14:22:41.393194 dockerd[1854]: time="2025-05-13T14:22:41.393163917Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 13 14:22:41.490212 dockerd[1854]: time="2025-05-13T14:22:41.490073849Z" level=info msg="Loading containers: start." 
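After the install script removes the shipped rule files, augenrules reports "No rules", i.e. the kernel audit subsystem is left with an empty ruleset. Re-populating it follows the same pattern in reverse; the watch rule below is purely illustrative:

    # Drop a new rule file and reload the ruleset the same way the log does.
    echo '-w /etc/kubernetes/ -p wa -k kube-config' | sudo tee /etc/audit/rules.d/90-kube.rules
    sudo systemctl restart audit-rules   # runs augenrules over /etc/audit/rules.d/*.rules
    sudo auditctl -l                     # list the rules now active in the kernel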
May 13 14:22:41.514905 kernel: Initializing XFRM netlink socket May 13 14:22:41.880929 systemd-networkd[1445]: docker0: Link UP May 13 14:22:41.890043 dockerd[1854]: time="2025-05-13T14:22:41.889936888Z" level=info msg="Loading containers: done." May 13 14:22:41.912644 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2013038201-merged.mount: Deactivated successfully. May 13 14:22:41.918151 dockerd[1854]: time="2025-05-13T14:22:41.918066423Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 13 14:22:41.918295 dockerd[1854]: time="2025-05-13T14:22:41.918220060Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 13 14:22:41.918508 dockerd[1854]: time="2025-05-13T14:22:41.918454062Z" level=info msg="Initializing buildkit" May 13 14:22:41.972294 dockerd[1854]: time="2025-05-13T14:22:41.972183253Z" level=info msg="Completed buildkit initialization" May 13 14:22:41.989169 dockerd[1854]: time="2025-05-13T14:22:41.989063005Z" level=info msg="Daemon has completed initialization" May 13 14:22:41.989426 dockerd[1854]: time="2025-05-13T14:22:41.989243684Z" level=info msg="API listen on /run/docker.sock" May 13 14:22:41.989918 systemd[1]: Started docker.service - Docker Application Container Engine. May 13 14:22:43.591254 containerd[1551]: time="2025-05-13T14:22:43.591170634Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\"" May 13 14:22:44.358958 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2623234882.mount: Deactivated successfully. May 13 14:22:46.056422 containerd[1551]: time="2025-05-13T14:22:46.056371222Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:46.057815 containerd[1551]: time="2025-05-13T14:22:46.057749968Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=27960995" May 13 14:22:46.058622 containerd[1551]: time="2025-05-13T14:22:46.058553613Z" level=info msg="ImageCreate event name:\"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:46.061802 containerd[1551]: time="2025-05-13T14:22:46.061769445Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:46.062943 containerd[1551]: time="2025-05-13T14:22:46.062917915Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"27957787\" in 2.470895901s" May 13 14:22:46.063031 containerd[1551]: time="2025-05-13T14:22:46.063016155Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\"" May 13 14:22:46.065291 containerd[1551]: time="2025-05-13T14:22:46.065252935Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\"" May 13 
14:22:46.887181 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 13 14:22:46.892713 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:22:47.261867 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 14:22:47.274599 (kubelet)[2119]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 14:22:47.415706 kubelet[2119]: E0513 14:22:47.415593 2119 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 14:22:47.419325 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 14:22:47.419606 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 14:22:47.420177 systemd[1]: kubelet.service: Consumed 221ms CPU time, 95.8M memory peak. May 13 14:22:48.115602 containerd[1551]: time="2025-05-13T14:22:48.115531807Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:48.116848 containerd[1551]: time="2025-05-13T14:22:48.116819421Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=24713784" May 13 14:22:48.118489 containerd[1551]: time="2025-05-13T14:22:48.118458721Z" level=info msg="ImageCreate event name:\"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:48.122314 containerd[1551]: time="2025-05-13T14:22:48.122263887Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:48.124435 containerd[1551]: time="2025-05-13T14:22:48.124405283Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"26202149\" in 2.059118983s" May 13 14:22:48.124534 containerd[1551]: time="2025-05-13T14:22:48.124518001Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\"" May 13 14:22:48.125046 containerd[1551]: time="2025-05-13T14:22:48.125006404Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\"" May 13 14:22:49.839154 containerd[1551]: time="2025-05-13T14:22:49.839096533Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:49.840269 containerd[1551]: time="2025-05-13T14:22:49.840241024Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=18780394" May 13 14:22:49.841762 containerd[1551]: time="2025-05-13T14:22:49.841735472Z" level=info msg="ImageCreate event 
name:\"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:49.846371 containerd[1551]: time="2025-05-13T14:22:49.845797590Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:49.846472 containerd[1551]: time="2025-05-13T14:22:49.846448793Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"20268777\" in 1.7214014s" May 13 14:22:49.846537 containerd[1551]: time="2025-05-13T14:22:49.846523269Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\"" May 13 14:22:49.846998 containerd[1551]: time="2025-05-13T14:22:49.846974756Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\"" May 13 14:22:51.186647 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1073378863.mount: Deactivated successfully. May 13 14:22:51.757740 containerd[1551]: time="2025-05-13T14:22:51.757692179Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:51.758792 containerd[1551]: time="2025-05-13T14:22:51.758766557Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=30354633" May 13 14:22:51.759857 containerd[1551]: time="2025-05-13T14:22:51.759812769Z" level=info msg="ImageCreate event name:\"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:51.762256 containerd[1551]: time="2025-05-13T14:22:51.762216110Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:51.763139 containerd[1551]: time="2025-05-13T14:22:51.762744870Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"30353644\" in 1.915738061s" May 13 14:22:51.763139 containerd[1551]: time="2025-05-13T14:22:51.762785068Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\"" May 13 14:22:51.763257 containerd[1551]: time="2025-05-13T14:22:51.763232700Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 13 14:22:52.415568 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount745868153.mount: Deactivated successfully. 
May 13 14:22:53.589501 containerd[1551]: time="2025-05-13T14:22:53.589445631Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:53.591260 containerd[1551]: time="2025-05-13T14:22:53.591232740Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" May 13 14:22:53.592579 containerd[1551]: time="2025-05-13T14:22:53.592490417Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:53.595200 containerd[1551]: time="2025-05-13T14:22:53.595144943Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:53.596373 containerd[1551]: time="2025-05-13T14:22:53.596202444Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.832935576s" May 13 14:22:53.596373 containerd[1551]: time="2025-05-13T14:22:53.596233348Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" May 13 14:22:53.596625 containerd[1551]: time="2025-05-13T14:22:53.596606331Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 13 14:22:54.355171 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3983589345.mount: Deactivated successfully. 
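The PullImage sequence above is the control-plane image pre-pull into containerd's k8s.io namespace; such pulls are typically driven by kubeadm (`kubeadm config images pull` performs the same step). The cached images can be listed or fetched by hand with the CRI or containerd CLIs, e.g.:

    # List what is already cached in the CRI image store, then pull one image by hand.
    crictl images | grep registry.k8s.io
    crictl pull registry.k8s.io/kube-apiserver:v1.31.8
    # Equivalent with the containerd CLI (note the k8s.io namespace):
    sudo ctr -n k8s.io images pull registry.k8s.io/kube-apiserver:v1.31.8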
May 13 14:22:54.368530 containerd[1551]: time="2025-05-13T14:22:54.368443742Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 14:22:54.370136 containerd[1551]: time="2025-05-13T14:22:54.370046199Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 13 14:22:54.371625 containerd[1551]: time="2025-05-13T14:22:54.371487336Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 14:22:54.376759 containerd[1551]: time="2025-05-13T14:22:54.376621192Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 14:22:54.379276 containerd[1551]: time="2025-05-13T14:22:54.378344017Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 781.648804ms" May 13 14:22:54.379276 containerd[1551]: time="2025-05-13T14:22:54.378446896Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 13 14:22:54.379675 containerd[1551]: time="2025-05-13T14:22:54.379559924Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 13 14:22:55.025667 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3267084368.mount: Deactivated successfully. May 13 14:22:56.968884 update_engine[1525]: I20250513 14:22:56.968827 1525 update_attempter.cc:509] Updating boot flags... May 13 14:22:57.636966 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 13 14:22:57.640601 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:22:58.094420 containerd[1551]: time="2025-05-13T14:22:58.094266592Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:58.097880 containerd[1551]: time="2025-05-13T14:22:58.097757144Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780021" May 13 14:22:58.100179 containerd[1551]: time="2025-05-13T14:22:58.099958149Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:58.105622 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 14:22:58.114095 containerd[1551]: time="2025-05-13T14:22:58.113985783Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:22:58.120562 containerd[1551]: time="2025-05-13T14:22:58.119936277Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 3.740318121s" May 13 14:22:58.120562 containerd[1551]: time="2025-05-13T14:22:58.120013694Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" May 13 14:22:58.122237 (kubelet)[2268]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 14:22:58.198032 kubelet[2268]: E0513 14:22:58.197983 2268 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 14:22:58.200265 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 14:22:58.200425 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 14:22:58.200706 systemd[1]: kubelet.service: Consumed 181ms CPU time, 96M memory peak. May 13 14:23:02.149108 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 14:23:02.150034 systemd[1]: kubelet.service: Consumed 181ms CPU time, 96M memory peak. May 13 14:23:02.154447 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:23:02.206079 systemd[1]: Reload requested from client PID 2300 ('systemctl') (unit session-11.scope)... May 13 14:23:02.206108 systemd[1]: Reloading... May 13 14:23:02.314392 zram_generator::config[2348]: No configuration found. May 13 14:23:02.428638 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 14:23:02.567878 systemd[1]: Reloading finished in 360 ms. May 13 14:23:02.626874 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 13 14:23:02.627083 systemd[1]: kubelet.service: Failed with result 'signal'. May 13 14:23:02.627825 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 14:23:02.627923 systemd[1]: kubelet.service: Consumed 103ms CPU time, 83.6M memory peak. May 13 14:23:02.631549 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:23:02.992063 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 14:23:03.008915 (kubelet)[2412]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 14:23:03.093633 kubelet[2412]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 14:23:03.095402 kubelet[2412]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 13 14:23:03.095402 kubelet[2412]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 14:23:03.095402 kubelet[2412]: I0513 14:23:03.094541 2412 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 14:23:03.377192 kubelet[2412]: I0513 14:23:03.375776 2412 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 13 14:23:03.377192 kubelet[2412]: I0513 14:23:03.375807 2412 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 14:23:03.377192 kubelet[2412]: I0513 14:23:03.376208 2412 server.go:929] "Client rotation is on, will bootstrap in background" May 13 14:23:03.736045 kubelet[2412]: E0513 14:23:03.735935 2412 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.24.4.33:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.24.4.33:6443: connect: connection refused" logger="UnhandledError" May 13 14:23:03.736974 kubelet[2412]: I0513 14:23:03.736925 2412 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 14:23:03.756048 kubelet[2412]: I0513 14:23:03.755874 2412 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 14:23:03.765865 kubelet[2412]: I0513 14:23:03.765826 2412 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 13 14:23:03.766242 kubelet[2412]: I0513 14:23:03.766212 2412 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 13 14:23:03.766745 kubelet[2412]: I0513 14:23:03.766681 2412 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 14:23:03.767435 kubelet[2412]: I0513 14:23:03.767002 2412 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-9-100-9699b4e791.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 13 14:23:03.768252 kubelet[2412]: I0513 14:23:03.767740 2412 topology_manager.go:138] "Creating topology manager with none policy" May 13 14:23:03.768252 kubelet[2412]: I0513 14:23:03.767778 2412 container_manager_linux.go:300] "Creating device plugin manager" May 13 14:23:03.768252 kubelet[2412]: I0513 14:23:03.767970 2412 state_mem.go:36] "Initialized new in-memory state store" May 13 14:23:03.775707 kubelet[2412]: I0513 14:23:03.775633 2412 kubelet.go:408] "Attempting to sync node with API server" May 13 14:23:03.776199 kubelet[2412]: I0513 14:23:03.776172 2412 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 14:23:03.776434 kubelet[2412]: I0513 14:23:03.776410 2412 kubelet.go:314] "Adding apiserver pod source" May 13 14:23:03.776607 kubelet[2412]: I0513 14:23:03.776585 2412 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 14:23:03.800729 kubelet[2412]: W0513 14:23:03.800168 2412 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-9-100-9699b4e791.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused May 13 14:23:03.800729 kubelet[2412]: E0513 14:23:03.800305 2412 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://172.24.4.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-9-100-9699b4e791.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.33:6443: connect: connection refused" logger="UnhandledError" May 13 14:23:03.803280 kubelet[2412]: W0513 14:23:03.803199 2412 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.33:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused May 13 14:23:03.803584 kubelet[2412]: E0513 14:23:03.803538 2412 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.33:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.33:6443: connect: connection refused" logger="UnhandledError" May 13 14:23:03.803876 kubelet[2412]: I0513 14:23:03.803843 2412 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 13 14:23:03.808151 kubelet[2412]: I0513 14:23:03.808116 2412 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 14:23:03.808467 kubelet[2412]: W0513 14:23:03.808439 2412 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 13 14:23:03.811331 kubelet[2412]: I0513 14:23:03.811284 2412 server.go:1269] "Started kubelet" May 13 14:23:03.814339 kubelet[2412]: I0513 14:23:03.814273 2412 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 14:23:03.831847 kubelet[2412]: I0513 14:23:03.831501 2412 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 14:23:03.837420 kubelet[2412]: I0513 14:23:03.837141 2412 server.go:460] "Adding debug handlers to kubelet server" May 13 14:23:03.838450 kubelet[2412]: E0513 14:23:03.832755 2412 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.33:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.33:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-9999-9-100-9699b4e791.novalocal.183f1c3288bf28a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-9-100-9699b4e791.novalocal,UID:ci-9999-9-100-9699b4e791.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-9-100-9699b4e791.novalocal,},FirstTimestamp:2025-05-13 14:23:03.81121348 +0000 UTC m=+0.795067085,LastTimestamp:2025-05-13 14:23:03.81121348 +0000 UTC m=+0.795067085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-9-100-9699b4e791.novalocal,}" May 13 14:23:03.842878 kubelet[2412]: I0513 14:23:03.842527 2412 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 14:23:03.843426 kubelet[2412]: I0513 14:23:03.843079 2412 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 14:23:03.843426 kubelet[2412]: I0513 14:23:03.843299 2412 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 14:23:03.843690 kubelet[2412]: I0513 14:23:03.843639 2412 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 14:23:03.844268 kubelet[2412]: E0513 14:23:03.844224 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:03.848862 kubelet[2412]: I0513 14:23:03.848432 2412 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 14:23:03.848862 kubelet[2412]: I0513 14:23:03.848582 2412 reconciler.go:26] "Reconciler: start to sync state" May 13 14:23:03.851223 kubelet[2412]: E0513 14:23:03.851109 2412 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-9-100-9699b4e791.novalocal?timeout=10s\": dial tcp 172.24.4.33:6443: connect: connection refused" interval="200ms" May 13 14:23:03.852828 kubelet[2412]: I0513 14:23:03.852780 2412 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 14:23:03.856163 kubelet[2412]: I0513 14:23:03.856116 2412 factory.go:221] Registration of the containerd container factory successfully May 13 14:23:03.857286 kubelet[2412]: I0513 14:23:03.856307 2412 factory.go:221] Registration of the systemd container factory successfully May 13 14:23:03.863297 kubelet[2412]: I0513 14:23:03.863188 2412 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 14:23:03.864588 kubelet[2412]: I0513 14:23:03.864205 2412 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 13 14:23:03.864588 kubelet[2412]: I0513 14:23:03.864231 2412 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 14:23:03.864588 kubelet[2412]: I0513 14:23:03.864254 2412 kubelet.go:2321] "Starting kubelet main sync loop" May 13 14:23:03.864588 kubelet[2412]: E0513 14:23:03.864292 2412 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 14:23:03.868896 kubelet[2412]: W0513 14:23:03.868726 2412 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused May 13 14:23:03.868896 kubelet[2412]: E0513 14:23:03.868789 2412 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.33:6443: connect: connection refused" logger="UnhandledError" May 13 14:23:03.868896 kubelet[2412]: E0513 14:23:03.868840 2412 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 14:23:03.872380 kubelet[2412]: W0513 14:23:03.872178 2412 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused May 13 14:23:03.872380 kubelet[2412]: E0513 14:23:03.872279 2412 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.33:6443: connect: connection refused" logger="UnhandledError" May 13 14:23:03.878714 kubelet[2412]: I0513 14:23:03.878698 2412 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 14:23:03.878852 kubelet[2412]: I0513 14:23:03.878841 2412 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 14:23:03.879005 kubelet[2412]: I0513 14:23:03.878916 2412 state_mem.go:36] "Initialized new in-memory state store" May 13 14:23:03.885610 kubelet[2412]: I0513 14:23:03.885598 2412 policy_none.go:49] "None policy: Start" May 13 14:23:03.886388 kubelet[2412]: I0513 14:23:03.886209 2412 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 14:23:03.886388 kubelet[2412]: I0513 14:23:03.886229 2412 state_mem.go:35] "Initializing new in-memory state store" May 13 14:23:03.898742 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 13 14:23:03.911492 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 13 14:23:03.916237 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 13 14:23:03.935829 kubelet[2412]: I0513 14:23:03.935353 2412 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 14:23:03.936292 kubelet[2412]: I0513 14:23:03.936262 2412 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 14:23:03.936423 kubelet[2412]: I0513 14:23:03.936293 2412 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 14:23:03.938549 kubelet[2412]: I0513 14:23:03.938523 2412 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 14:23:03.941341 kubelet[2412]: E0513 14:23:03.941310 2412 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:03.987648 systemd[1]: Created slice kubepods-burstable-podc0677dee366b63fb79c1bd93ea41fcda.slice - libcontainer container kubepods-burstable-podc0677dee366b63fb79c1bd93ea41fcda.slice. May 13 14:23:04.019622 systemd[1]: Created slice kubepods-burstable-pod86b9df13483696eee49702439b4fa312.slice - libcontainer container kubepods-burstable-pod86b9df13483696eee49702439b4fa312.slice. 
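
The container manager NodeConfig logged a little earlier flattens the hard-eviction settings into one long JSON blob. Re-keyed as a plain mapping purely for readability (values copied from that entry; the log expresses the percentages as fractions such as 0.1):

    # Hard eviction thresholds from the NodeConfig entry above, re-typed for readability.
    HARD_EVICTION_THRESHOLDS = {
        "memory.available":   "100Mi",  # absolute quantity
        "nodefs.available":   "10%",    # logged as Percentage: 0.1
        "nodefs.inodesFree":  "5%",     # logged as Percentage: 0.05
        "imagefs.available":  "15%",    # logged as Percentage: 0.15
        "imagefs.inodesFree": "5%",     # logged as Percentage: 0.05
    }

    for signal, threshold in HARD_EVICTION_THRESHOLDS.items():
        print(f"evict when {signal} falls below {threshold}")
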
May 13 14:23:04.040610 kubelet[2412]: I0513 14:23:04.039918 2412 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.040610 kubelet[2412]: E0513 14:23:04.040511 2412 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.33:6443/api/v1/nodes\": dial tcp 172.24.4.33:6443: connect: connection refused" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.044007 systemd[1]: Created slice kubepods-burstable-pod7ed13ac78677dba22a5edae258734a38.slice - libcontainer container kubepods-burstable-pod7ed13ac78677dba22a5edae258734a38.slice. May 13 14:23:04.049323 kubelet[2412]: I0513 14:23:04.049239 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-kubeconfig\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.049323 kubelet[2412]: I0513 14:23:04.049318 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.049556 kubelet[2412]: I0513 14:23:04.049413 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c0677dee366b63fb79c1bd93ea41fcda-kubeconfig\") pod \"kube-scheduler-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"c0677dee366b63fb79c1bd93ea41fcda\") " pod="kube-system/kube-scheduler-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.049556 kubelet[2412]: I0513 14:23:04.049480 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/86b9df13483696eee49702439b4fa312-ca-certs\") pod \"kube-apiserver-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"86b9df13483696eee49702439b4fa312\") " pod="kube-system/kube-apiserver-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.049556 kubelet[2412]: I0513 14:23:04.049525 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/86b9df13483696eee49702439b4fa312-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"86b9df13483696eee49702439b4fa312\") " pod="kube-system/kube-apiserver-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.049744 kubelet[2412]: I0513 14:23:04.049569 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.049744 kubelet[2412]: I0513 14:23:04.049613 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/86b9df13483696eee49702439b4fa312-k8s-certs\") pod \"kube-apiserver-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"86b9df13483696eee49702439b4fa312\") " pod="kube-system/kube-apiserver-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.049744 kubelet[2412]: I0513 14:23:04.049653 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-ca-certs\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.049744 kubelet[2412]: I0513 14:23:04.049702 2412 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-k8s-certs\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.052945 kubelet[2412]: E0513 14:23:04.052861 2412 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-9-100-9699b4e791.novalocal?timeout=10s\": dial tcp 172.24.4.33:6443: connect: connection refused" interval="400ms" May 13 14:23:04.244224 kubelet[2412]: I0513 14:23:04.243467 2412 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.244224 kubelet[2412]: E0513 14:23:04.244036 2412 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.33:6443/api/v1/nodes\": dial tcp 172.24.4.33:6443: connect: connection refused" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.314006 containerd[1551]: time="2025-05-13T14:23:04.313926595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-9-100-9699b4e791.novalocal,Uid:c0677dee366b63fb79c1bd93ea41fcda,Namespace:kube-system,Attempt:0,}" May 13 14:23:04.337436 containerd[1551]: time="2025-05-13T14:23:04.337133482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-9-100-9699b4e791.novalocal,Uid:86b9df13483696eee49702439b4fa312,Namespace:kube-system,Attempt:0,}" May 13 14:23:04.354479 containerd[1551]: time="2025-05-13T14:23:04.352700383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal,Uid:7ed13ac78677dba22a5edae258734a38,Namespace:kube-system,Attempt:0,}" May 13 14:23:04.370351 containerd[1551]: time="2025-05-13T14:23:04.370214300Z" level=info msg="connecting to shim 15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231" address="unix:///run/containerd/s/0b41f5261cae1f653a9b4e5563ccc7a69f9d5dbcbf30f8209919c09039078f72" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:04.426260 containerd[1551]: time="2025-05-13T14:23:04.426186396Z" level=info msg="connecting to shim 5ffbc552d6076f34312f1bbd040919efdd6b31a0b699e9946d72743266fd28bd" address="unix:///run/containerd/s/734c044e8c9ded1f6fa1df9d620d52c0794652332d7414619b5db82ec9fba5e6" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:04.428336 containerd[1551]: time="2025-05-13T14:23:04.428307225Z" level=info msg="connecting to shim 3573e1c82bd0e8e7b96303b61b5e3eac700999f370c28f80ebb3075f957f3d31" 
address="unix:///run/containerd/s/75fdf800c0523479793b1f522d9e262868833a644761943ad21edab938ea9b19" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:04.440557 systemd[1]: Started cri-containerd-15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231.scope - libcontainer container 15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231. May 13 14:23:04.453412 kubelet[2412]: E0513 14:23:04.453307 2412 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-9-100-9699b4e791.novalocal?timeout=10s\": dial tcp 172.24.4.33:6443: connect: connection refused" interval="800ms" May 13 14:23:04.468089 systemd[1]: Started cri-containerd-3573e1c82bd0e8e7b96303b61b5e3eac700999f370c28f80ebb3075f957f3d31.scope - libcontainer container 3573e1c82bd0e8e7b96303b61b5e3eac700999f370c28f80ebb3075f957f3d31. May 13 14:23:04.473229 systemd[1]: Started cri-containerd-5ffbc552d6076f34312f1bbd040919efdd6b31a0b699e9946d72743266fd28bd.scope - libcontainer container 5ffbc552d6076f34312f1bbd040919efdd6b31a0b699e9946d72743266fd28bd. May 13 14:23:04.537533 containerd[1551]: time="2025-05-13T14:23:04.537148841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-9-100-9699b4e791.novalocal,Uid:c0677dee366b63fb79c1bd93ea41fcda,Namespace:kube-system,Attempt:0,} returns sandbox id \"15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231\"" May 13 14:23:04.543046 containerd[1551]: time="2025-05-13T14:23:04.542971826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-9-100-9699b4e791.novalocal,Uid:86b9df13483696eee49702439b4fa312,Namespace:kube-system,Attempt:0,} returns sandbox id \"5ffbc552d6076f34312f1bbd040919efdd6b31a0b699e9946d72743266fd28bd\"" May 13 14:23:04.544812 containerd[1551]: time="2025-05-13T14:23:04.544763132Z" level=info msg="CreateContainer within sandbox \"15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 13 14:23:04.552740 containerd[1551]: time="2025-05-13T14:23:04.552473805Z" level=info msg="CreateContainer within sandbox \"5ffbc552d6076f34312f1bbd040919efdd6b31a0b699e9946d72743266fd28bd\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 13 14:23:04.565477 containerd[1551]: time="2025-05-13T14:23:04.565442057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal,Uid:7ed13ac78677dba22a5edae258734a38,Namespace:kube-system,Attempt:0,} returns sandbox id \"3573e1c82bd0e8e7b96303b61b5e3eac700999f370c28f80ebb3075f957f3d31\"" May 13 14:23:04.567997 containerd[1551]: time="2025-05-13T14:23:04.567969487Z" level=info msg="CreateContainer within sandbox \"3573e1c82bd0e8e7b96303b61b5e3eac700999f370c28f80ebb3075f957f3d31\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 13 14:23:04.574385 containerd[1551]: time="2025-05-13T14:23:04.574331684Z" level=info msg="Container 1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:04.584437 containerd[1551]: time="2025-05-13T14:23:04.583691338Z" level=info msg="Container fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:04.593022 containerd[1551]: time="2025-05-13T14:23:04.592810538Z" level=info msg="Container 
caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:04.602483 containerd[1551]: time="2025-05-13T14:23:04.602454584Z" level=info msg="CreateContainer within sandbox \"15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587\"" May 13 14:23:04.603526 containerd[1551]: time="2025-05-13T14:23:04.603456447Z" level=info msg="StartContainer for \"1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587\"" May 13 14:23:04.606643 containerd[1551]: time="2025-05-13T14:23:04.606576185Z" level=info msg="connecting to shim 1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587" address="unix:///run/containerd/s/0b41f5261cae1f653a9b4e5563ccc7a69f9d5dbcbf30f8209919c09039078f72" protocol=ttrpc version=3 May 13 14:23:04.621148 containerd[1551]: time="2025-05-13T14:23:04.621116607Z" level=info msg="CreateContainer within sandbox \"3573e1c82bd0e8e7b96303b61b5e3eac700999f370c28f80ebb3075f957f3d31\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f\"" May 13 14:23:04.622441 containerd[1551]: time="2025-05-13T14:23:04.622267229Z" level=info msg="CreateContainer within sandbox \"5ffbc552d6076f34312f1bbd040919efdd6b31a0b699e9946d72743266fd28bd\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4\"" May 13 14:23:04.623440 containerd[1551]: time="2025-05-13T14:23:04.623422089Z" level=info msg="StartContainer for \"caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f\"" May 13 14:23:04.624505 containerd[1551]: time="2025-05-13T14:23:04.624383360Z" level=info msg="StartContainer for \"fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4\"" May 13 14:23:04.624769 containerd[1551]: time="2025-05-13T14:23:04.624747686Z" level=info msg="connecting to shim caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f" address="unix:///run/containerd/s/75fdf800c0523479793b1f522d9e262868833a644761943ad21edab938ea9b19" protocol=ttrpc version=3 May 13 14:23:04.627991 containerd[1551]: time="2025-05-13T14:23:04.627955542Z" level=info msg="connecting to shim fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4" address="unix:///run/containerd/s/734c044e8c9ded1f6fa1df9d620d52c0794652332d7414619b5db82ec9fba5e6" protocol=ttrpc version=3 May 13 14:23:04.632653 systemd[1]: Started cri-containerd-1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587.scope - libcontainer container 1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587. May 13 14:23:04.647079 kubelet[2412]: I0513 14:23:04.647057 2412 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.647642 kubelet[2412]: E0513 14:23:04.647621 2412 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.33:6443/api/v1/nodes\": dial tcp 172.24.4.33:6443: connect: connection refused" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:04.656593 systemd[1]: Started cri-containerd-caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f.scope - libcontainer container caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f. 
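
The lease-controller errors above show the retry interval doubling on every failed attempt (200ms, then 400ms, then 800ms) while the API server at 172.24.4.33:6443 is still refusing connections. A minimal sketch of that kind of capped exponential backoff; the cap, the exception type and the retry loop itself are illustrative assumptions rather than the kubelet's actual implementation:

    import time

    def retry_with_backoff(op, base=0.2, factor=2.0, cap=7.0, attempts=10):
        """Call op() until it succeeds, doubling the sleep after each failure."""
        delay = base
        for attempt in range(1, attempts + 1):
            try:
                return op()
            except OSError as err:  # e.g. "connection refused" while the apiserver is down
                print(f"attempt {attempt} failed ({err}); retrying in {delay:.1f}s")
                time.sleep(delay)
                delay = min(delay * factor, cap)
        raise RuntimeError(f"gave up after {attempts} attempts")
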
May 13 14:23:04.662800 systemd[1]: Started cri-containerd-fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4.scope - libcontainer container fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4. May 13 14:23:04.733175 containerd[1551]: time="2025-05-13T14:23:04.733077409Z" level=info msg="StartContainer for \"caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f\" returns successfully" May 13 14:23:04.733892 containerd[1551]: time="2025-05-13T14:23:04.733846365Z" level=info msg="StartContainer for \"1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587\" returns successfully" May 13 14:23:04.770905 kubelet[2412]: W0513 14:23:04.770789 2412 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused May 13 14:23:04.770905 kubelet[2412]: E0513 14:23:04.770861 2412 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.33:6443: connect: connection refused" logger="UnhandledError" May 13 14:23:04.774834 containerd[1551]: time="2025-05-13T14:23:04.774721066Z" level=info msg="StartContainer for \"fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4\" returns successfully" May 13 14:23:04.781310 kubelet[2412]: W0513 14:23:04.781245 2412 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-9-100-9699b4e791.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused May 13 14:23:04.781873 kubelet[2412]: E0513 14:23:04.781314 2412 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.24.4.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-9-100-9699b4e791.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.33:6443: connect: connection refused" logger="UnhandledError" May 13 14:23:05.451147 kubelet[2412]: I0513 14:23:05.450924 2412 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:06.641402 kubelet[2412]: E0513 14:23:06.641345 2412 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-9999-9-100-9699b4e791.novalocal\" not found" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:06.785351 kubelet[2412]: I0513 14:23:06.785031 2412 kubelet_node_status.go:75] "Successfully registered node" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:06.785351 kubelet[2412]: E0513 14:23:06.785062 2412 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-9999-9-100-9699b4e791.novalocal\": node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:06.805828 kubelet[2412]: E0513 14:23:06.805797 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:06.906715 kubelet[2412]: E0513 14:23:06.906620 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:07.007232 
kubelet[2412]: E0513 14:23:07.007196 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:07.107447 kubelet[2412]: E0513 14:23:07.107389 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:07.208268 kubelet[2412]: E0513 14:23:07.208033 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:07.308327 kubelet[2412]: E0513 14:23:07.308249 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:07.409437 kubelet[2412]: E0513 14:23:07.409287 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:07.510463 kubelet[2412]: E0513 14:23:07.510203 2412 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:07.805998 kubelet[2412]: I0513 14:23:07.805850 2412 apiserver.go:52] "Watching apiserver" May 13 14:23:07.849888 kubelet[2412]: I0513 14:23:07.849805 2412 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 14:23:09.244905 systemd[1]: Reload requested from client PID 2685 ('systemctl') (unit session-11.scope)... May 13 14:23:09.244942 systemd[1]: Reloading... May 13 14:23:09.355681 zram_generator::config[2730]: No configuration found. May 13 14:23:09.505967 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 14:23:09.667235 systemd[1]: Reloading finished in 421 ms. May 13 14:23:09.693301 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:23:09.707610 systemd[1]: kubelet.service: Deactivated successfully. May 13 14:23:09.707898 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 14:23:09.707942 systemd[1]: kubelet.service: Consumed 977ms CPU time, 115.2M memory peak. May 13 14:23:09.712565 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 14:23:09.946663 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 14:23:09.955895 (kubelet)[2793]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 14:23:10.021629 kubelet[2793]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 14:23:10.023407 kubelet[2793]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 13 14:23:10.023407 kubelet[2793]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
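
Every request the first kubelet instance (PID 2412) made against https://172.24.4.33:6443 above ended in "connection refused"; the errors stop once the kube-apiserver static pod launched just above starts answering, and the node registers successfully a few entries later. A hedged sketch of probing that endpoint by hand during such a window (/healthz is the apiserver's standard health endpoint; TLS verification is skipped here only because the cluster CA may not be trusted on the client yet):

    import ssl
    import urllib.request

    # Probe the apiserver address that appears in the kubelet errors above.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    try:
        with urllib.request.urlopen("https://172.24.4.33:6443/healthz",
                                    context=ctx, timeout=3) as resp:
            print("apiserver answered:", resp.status, resp.read().decode())
    except OSError as err:
        print("apiserver not reachable yet:", err)
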
May 13 14:23:10.023407 kubelet[2793]: I0513 14:23:10.022002 2793 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 14:23:10.027822 kubelet[2793]: I0513 14:23:10.027806 2793 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 13 14:23:10.027897 kubelet[2793]: I0513 14:23:10.027888 2793 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 14:23:10.028179 kubelet[2793]: I0513 14:23:10.028166 2793 server.go:929] "Client rotation is on, will bootstrap in background" May 13 14:23:10.029722 kubelet[2793]: I0513 14:23:10.029708 2793 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 13 14:23:10.037158 kubelet[2793]: I0513 14:23:10.037114 2793 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 14:23:10.042418 kubelet[2793]: I0513 14:23:10.041997 2793 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 14:23:10.044852 kubelet[2793]: I0513 14:23:10.044837 2793 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 13 14:23:10.045027 kubelet[2793]: I0513 14:23:10.045017 2793 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 13 14:23:10.045227 kubelet[2793]: I0513 14:23:10.045203 2793 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 14:23:10.045488 kubelet[2793]: I0513 14:23:10.045287 2793 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-9-100-9699b4e791.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 13 14:23:10.045611 kubelet[2793]: I0513 14:23:10.045600 2793 topology_manager.go:138] "Creating topology manager with none policy" May 13 14:23:10.045666 kubelet[2793]: I0513 14:23:10.045659 2793 container_manager_linux.go:300] "Creating 
device plugin manager" May 13 14:23:10.045744 kubelet[2793]: I0513 14:23:10.045735 2793 state_mem.go:36] "Initialized new in-memory state store" May 13 14:23:10.045888 kubelet[2793]: I0513 14:23:10.045877 2793 kubelet.go:408] "Attempting to sync node with API server" May 13 14:23:10.045971 kubelet[2793]: I0513 14:23:10.045961 2793 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 14:23:10.046047 kubelet[2793]: I0513 14:23:10.046038 2793 kubelet.go:314] "Adding apiserver pod source" May 13 14:23:10.046110 kubelet[2793]: I0513 14:23:10.046102 2793 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 14:23:10.049023 kubelet[2793]: I0513 14:23:10.047288 2793 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 13 14:23:10.049023 kubelet[2793]: I0513 14:23:10.047900 2793 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 14:23:10.049023 kubelet[2793]: I0513 14:23:10.048529 2793 server.go:1269] "Started kubelet" May 13 14:23:10.050789 kubelet[2793]: I0513 14:23:10.050768 2793 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 14:23:10.055084 kubelet[2793]: I0513 14:23:10.055053 2793 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 14:23:10.056387 kubelet[2793]: I0513 14:23:10.055955 2793 server.go:460] "Adding debug handlers to kubelet server" May 13 14:23:10.059197 kubelet[2793]: I0513 14:23:10.057570 2793 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 14:23:10.059197 kubelet[2793]: I0513 14:23:10.057867 2793 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 14:23:10.059197 kubelet[2793]: I0513 14:23:10.058111 2793 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 14:23:10.059994 kubelet[2793]: I0513 14:23:10.059972 2793 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 14:23:10.060220 kubelet[2793]: E0513 14:23:10.060201 2793 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-9699b4e791.novalocal\" not found" May 13 14:23:10.063797 kubelet[2793]: I0513 14:23:10.063778 2793 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 14:23:10.065364 kubelet[2793]: I0513 14:23:10.063885 2793 reconciler.go:26] "Reconciler: start to sync state" May 13 14:23:10.066457 kubelet[2793]: I0513 14:23:10.066440 2793 factory.go:221] Registration of the systemd container factory successfully May 13 14:23:10.066625 kubelet[2793]: I0513 14:23:10.066606 2793 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 14:23:10.067973 kubelet[2793]: I0513 14:23:10.067935 2793 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 14:23:10.069184 kubelet[2793]: I0513 14:23:10.069164 2793 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 13 14:23:10.069184 kubelet[2793]: I0513 14:23:10.069182 2793 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 14:23:10.069253 kubelet[2793]: I0513 14:23:10.069197 2793 kubelet.go:2321] "Starting kubelet main sync loop" May 13 14:23:10.069375 kubelet[2793]: I0513 14:23:10.069344 2793 factory.go:221] Registration of the containerd container factory successfully May 13 14:23:10.069445 kubelet[2793]: E0513 14:23:10.069253 2793 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 14:23:10.101218 kubelet[2793]: E0513 14:23:10.101146 2793 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 14:23:10.139861 kubelet[2793]: I0513 14:23:10.139792 2793 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 14:23:10.140085 kubelet[2793]: I0513 14:23:10.140013 2793 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 14:23:10.140085 kubelet[2793]: I0513 14:23:10.140034 2793 state_mem.go:36] "Initialized new in-memory state store" May 13 14:23:10.140286 kubelet[2793]: I0513 14:23:10.140272 2793 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 13 14:23:10.140377 kubelet[2793]: I0513 14:23:10.140334 2793 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 13 14:23:10.140438 kubelet[2793]: I0513 14:23:10.140430 2793 policy_none.go:49] "None policy: Start" May 13 14:23:10.141199 kubelet[2793]: I0513 14:23:10.141179 2793 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 14:23:10.141285 kubelet[2793]: I0513 14:23:10.141209 2793 state_mem.go:35] "Initializing new in-memory state store" May 13 14:23:10.141436 kubelet[2793]: I0513 14:23:10.141416 2793 state_mem.go:75] "Updated machine memory state" May 13 14:23:10.147740 kubelet[2793]: I0513 14:23:10.147715 2793 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 14:23:10.149942 kubelet[2793]: I0513 14:23:10.149061 2793 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 14:23:10.149942 kubelet[2793]: I0513 14:23:10.149080 2793 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 14:23:10.150532 kubelet[2793]: I0513 14:23:10.150504 2793 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 14:23:10.192894 kubelet[2793]: W0513 14:23:10.192861 2793 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 13 14:23:10.194371 kubelet[2793]: W0513 14:23:10.193819 2793 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 13 14:23:10.194856 kubelet[2793]: W0513 14:23:10.194837 2793 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 13 14:23:10.252574 kubelet[2793]: I0513 14:23:10.252485 2793 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.262815 kubelet[2793]: I0513 14:23:10.262758 2793 kubelet_node_status.go:111] "Node was previously registered" node="ci-9999-9-100-9699b4e791.novalocal" May 
13 14:23:10.262997 kubelet[2793]: I0513 14:23:10.262856 2793 kubelet_node_status.go:75] "Successfully registered node" node="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.265057 kubelet[2793]: I0513 14:23:10.265034 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/86b9df13483696eee49702439b4fa312-k8s-certs\") pod \"kube-apiserver-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"86b9df13483696eee49702439b4fa312\") " pod="kube-system/kube-apiserver-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.265990 kubelet[2793]: I0513 14:23:10.265068 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/86b9df13483696eee49702439b4fa312-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"86b9df13483696eee49702439b4fa312\") " pod="kube-system/kube-apiserver-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.265990 kubelet[2793]: I0513 14:23:10.265112 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-ca-certs\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.265990 kubelet[2793]: I0513 14:23:10.265135 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.265990 kubelet[2793]: I0513 14:23:10.265153 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-k8s-certs\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.266109 kubelet[2793]: I0513 14:23:10.265173 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/86b9df13483696eee49702439b4fa312-ca-certs\") pod \"kube-apiserver-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"86b9df13483696eee49702439b4fa312\") " pod="kube-system/kube-apiserver-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.266109 kubelet[2793]: I0513 14:23:10.265191 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-kubeconfig\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.266109 kubelet[2793]: I0513 14:23:10.265209 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/7ed13ac78677dba22a5edae258734a38-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"7ed13ac78677dba22a5edae258734a38\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:10.266109 kubelet[2793]: I0513 14:23:10.265228 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c0677dee366b63fb79c1bd93ea41fcda-kubeconfig\") pod \"kube-scheduler-ci-9999-9-100-9699b4e791.novalocal\" (UID: \"c0677dee366b63fb79c1bd93ea41fcda\") " pod="kube-system/kube-scheduler-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:11.048148 kubelet[2793]: I0513 14:23:11.047750 2793 apiserver.go:52] "Watching apiserver" May 13 14:23:11.065145 kubelet[2793]: I0513 14:23:11.065026 2793 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 14:23:11.136037 kubelet[2793]: W0513 14:23:11.135465 2793 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 13 14:23:11.136037 kubelet[2793]: E0513 14:23:11.135888 2793 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-9-100-9699b4e791.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:11.216407 kubelet[2793]: I0513 14:23:11.216330 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-9999-9-100-9699b4e791.novalocal" podStartSLOduration=1.216315984 podStartE2EDuration="1.216315984s" podCreationTimestamp="2025-05-13 14:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 14:23:11.215139404 +0000 UTC m=+1.254161330" watchObservedRunningTime="2025-05-13 14:23:11.216315984 +0000 UTC m=+1.255337900" May 13 14:23:11.243010 kubelet[2793]: I0513 14:23:11.242936 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-9999-9-100-9699b4e791.novalocal" podStartSLOduration=1.242919352 podStartE2EDuration="1.242919352s" podCreationTimestamp="2025-05-13 14:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 14:23:11.231765424 +0000 UTC m=+1.270787340" watchObservedRunningTime="2025-05-13 14:23:11.242919352 +0000 UTC m=+1.281941268" May 13 14:23:11.243277 kubelet[2793]: I0513 14:23:11.243141 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-9999-9-100-9699b4e791.novalocal" podStartSLOduration=1.243134005 podStartE2EDuration="1.243134005s" podCreationTimestamp="2025-05-13 14:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 14:23:11.241067547 +0000 UTC m=+1.280089473" watchObservedRunningTime="2025-05-13 14:23:11.243134005 +0000 UTC m=+1.282155921" May 13 14:23:13.847957 kubelet[2793]: I0513 14:23:13.847916 2793 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 13 14:23:13.849688 kubelet[2793]: I0513 14:23:13.848971 2793 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 13 
14:23:13.849758 containerd[1551]: time="2025-05-13T14:23:13.848738651Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 13 14:23:14.869207 systemd[1]: Created slice kubepods-besteffort-pod54d12e7c_3e59_4de5_b9b8_46d7a2706f80.slice - libcontainer container kubepods-besteffort-pod54d12e7c_3e59_4de5_b9b8_46d7a2706f80.slice. May 13 14:23:14.894578 kubelet[2793]: I0513 14:23:14.894536 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/54d12e7c-3e59-4de5-b9b8-46d7a2706f80-kube-proxy\") pod \"kube-proxy-g2ffw\" (UID: \"54d12e7c-3e59-4de5-b9b8-46d7a2706f80\") " pod="kube-system/kube-proxy-g2ffw" May 13 14:23:14.894578 kubelet[2793]: I0513 14:23:14.894573 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/54d12e7c-3e59-4de5-b9b8-46d7a2706f80-xtables-lock\") pod \"kube-proxy-g2ffw\" (UID: \"54d12e7c-3e59-4de5-b9b8-46d7a2706f80\") " pod="kube-system/kube-proxy-g2ffw" May 13 14:23:14.894966 kubelet[2793]: I0513 14:23:14.894594 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54d12e7c-3e59-4de5-b9b8-46d7a2706f80-lib-modules\") pod \"kube-proxy-g2ffw\" (UID: \"54d12e7c-3e59-4de5-b9b8-46d7a2706f80\") " pod="kube-system/kube-proxy-g2ffw" May 13 14:23:14.894966 kubelet[2793]: I0513 14:23:14.894615 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8mz8\" (UniqueName: \"kubernetes.io/projected/54d12e7c-3e59-4de5-b9b8-46d7a2706f80-kube-api-access-r8mz8\") pod \"kube-proxy-g2ffw\" (UID: \"54d12e7c-3e59-4de5-b9b8-46d7a2706f80\") " pod="kube-system/kube-proxy-g2ffw" May 13 14:23:14.940427 systemd[1]: Created slice kubepods-besteffort-pod7ddd3bb9_88a3_4f6b_af6e_da21735ffcc0.slice - libcontainer container kubepods-besteffort-pod7ddd3bb9_88a3_4f6b_af6e_da21735ffcc0.slice. 
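The kube-proxy volumes being verified in the reconciler entries above are the usual kubeadm-style set: a ConfigMap with the proxy configuration, hostPath mounts for the iptables lock file and kernel modules, and a projected service-account token. A minimal Go sketch of those volume sources follows; the host paths /run/xtables.lock and /lib/modules are assumptions (kubeadm defaults), since the log records only the volume names.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Illustrative reconstruction of the kube-proxy pod volumes named in the
	// reconciler entries above. Host paths are assumed kubeadm defaults, not
	// values taken from this log.
	hostPath := func(p string) corev1.VolumeSource {
		return corev1.VolumeSource{HostPath: &corev1.HostPathVolumeSource{Path: p}}
	}
	volumes := []corev1.Volume{
		{Name: "kube-proxy", VolumeSource: corev1.VolumeSource{
			ConfigMap: &corev1.ConfigMapVolumeSource{
				LocalObjectReference: corev1.LocalObjectReference{Name: "kube-proxy"},
			},
		}},
		{Name: "xtables-lock", VolumeSource: hostPath("/run/xtables.lock")},
		{Name: "lib-modules", VolumeSource: hostPath("/lib/modules")},
	}
	for _, v := range volumes {
		fmt.Println("verify attach/mount for volume:", v.Name)
	}
}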
May 13 14:23:14.994954 kubelet[2793]: I0513 14:23:14.994865 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7ddd3bb9-88a3-4f6b-af6e-da21735ffcc0-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-5zfwn\" (UID: \"7ddd3bb9-88a3-4f6b-af6e-da21735ffcc0\") " pod="tigera-operator/tigera-operator-6f6897fdc5-5zfwn" May 13 14:23:14.994954 kubelet[2793]: I0513 14:23:14.994930 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjnx\" (UniqueName: \"kubernetes.io/projected/7ddd3bb9-88a3-4f6b-af6e-da21735ffcc0-kube-api-access-ctjnx\") pod \"tigera-operator-6f6897fdc5-5zfwn\" (UID: \"7ddd3bb9-88a3-4f6b-af6e-da21735ffcc0\") " pod="tigera-operator/tigera-operator-6f6897fdc5-5zfwn" May 13 14:23:15.183605 containerd[1551]: time="2025-05-13T14:23:15.183544737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-g2ffw,Uid:54d12e7c-3e59-4de5-b9b8-46d7a2706f80,Namespace:kube-system,Attempt:0,}" May 13 14:23:15.231890 containerd[1551]: time="2025-05-13T14:23:15.231222723Z" level=info msg="connecting to shim 037609973ca763cf2a118b389deeb57ab6d18409776c5adbd914f9e8d59e5c73" address="unix:///run/containerd/s/ec9997976ef2f4916aa3d37dad8264b8f1cb368317f091ccf6525892d6a66dea" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:15.245414 containerd[1551]: time="2025-05-13T14:23:15.245257007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-5zfwn,Uid:7ddd3bb9-88a3-4f6b-af6e-da21735ffcc0,Namespace:tigera-operator,Attempt:0,}" May 13 14:23:15.293040 containerd[1551]: time="2025-05-13T14:23:15.292994333Z" level=info msg="connecting to shim 48563a5bbec53f9a55f1ec0e0439077594beddaf302de1333e8dca2b993490ff" address="unix:///run/containerd/s/2726cc54310c4035aab63c3056eb3cef624304738df4ca12774a3823adbaa39e" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:15.294709 systemd[1]: Started cri-containerd-037609973ca763cf2a118b389deeb57ab6d18409776c5adbd914f9e8d59e5c73.scope - libcontainer container 037609973ca763cf2a118b389deeb57ab6d18409776c5adbd914f9e8d59e5c73. May 13 14:23:15.324677 systemd[1]: Started cri-containerd-48563a5bbec53f9a55f1ec0e0439077594beddaf302de1333e8dca2b993490ff.scope - libcontainer container 48563a5bbec53f9a55f1ec0e0439077594beddaf302de1333e8dca2b993490ff. 
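The RunPodSandbox requests above arrive over the CRI API that containerd serves on its local gRPC socket; the "connecting to shim" messages show only the per-sandbox ttrpc sockets. A minimal sketch of listing those sandboxes with the Go CRI client, assuming the conventional endpoint /run/containerd/containerd.sock (the endpoint itself is not shown in this log):

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Assumed CRI endpoint: containerd's main gRPC socket, not the shim
	// sockets shown in the "connecting to shim" entries above.
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	client := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	resp, err := client.ListPodSandbox(ctx, &runtimeapi.ListPodSandboxRequest{})
	if err != nil {
		log.Fatal(err)
	}
	for _, sb := range resp.Items {
		// Would include kube-proxy-g2ffw and tigera-operator-6f6897fdc5-5zfwn
		// from the sandboxes created above.
		fmt.Println(sb.Id[:12], sb.Metadata.Namespace+"/"+sb.Metadata.Name, sb.State)
	}
}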
May 13 14:23:15.329628 containerd[1551]: time="2025-05-13T14:23:15.328820406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-g2ffw,Uid:54d12e7c-3e59-4de5-b9b8-46d7a2706f80,Namespace:kube-system,Attempt:0,} returns sandbox id \"037609973ca763cf2a118b389deeb57ab6d18409776c5adbd914f9e8d59e5c73\"" May 13 14:23:15.335062 containerd[1551]: time="2025-05-13T14:23:15.335033140Z" level=info msg="CreateContainer within sandbox \"037609973ca763cf2a118b389deeb57ab6d18409776c5adbd914f9e8d59e5c73\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 13 14:23:15.348753 containerd[1551]: time="2025-05-13T14:23:15.348700229Z" level=info msg="Container ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:15.360429 containerd[1551]: time="2025-05-13T14:23:15.360401905Z" level=info msg="CreateContainer within sandbox \"037609973ca763cf2a118b389deeb57ab6d18409776c5adbd914f9e8d59e5c73\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54\"" May 13 14:23:15.362500 containerd[1551]: time="2025-05-13T14:23:15.362476059Z" level=info msg="StartContainer for \"ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54\"" May 13 14:23:15.366799 containerd[1551]: time="2025-05-13T14:23:15.366706397Z" level=info msg="connecting to shim ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54" address="unix:///run/containerd/s/ec9997976ef2f4916aa3d37dad8264b8f1cb368317f091ccf6525892d6a66dea" protocol=ttrpc version=3 May 13 14:23:15.378840 containerd[1551]: time="2025-05-13T14:23:15.378803771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-5zfwn,Uid:7ddd3bb9-88a3-4f6b-af6e-da21735ffcc0,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"48563a5bbec53f9a55f1ec0e0439077594beddaf302de1333e8dca2b993490ff\"" May 13 14:23:15.382740 containerd[1551]: time="2025-05-13T14:23:15.382699072Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 13 14:23:15.398615 systemd[1]: Started cri-containerd-ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54.scope - libcontainer container ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54. May 13 14:23:15.441918 containerd[1551]: time="2025-05-13T14:23:15.441514837Z" level=info msg="StartContainer for \"ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54\" returns successfully" May 13 14:23:15.870817 sudo[1836]: pam_unix(sudo:session): session closed for user root May 13 14:23:16.151257 sshd[1835]: Connection closed by 172.24.4.1 port 56194 May 13 14:23:16.152490 sshd-session[1833]: pam_unix(sshd:session): session closed for user core May 13 14:23:16.163105 systemd[1]: sshd@8-172.24.4.33:22-172.24.4.1:56194.service: Deactivated successfully. May 13 14:23:16.167841 systemd[1]: session-11.scope: Deactivated successfully. May 13 14:23:16.168498 systemd[1]: session-11.scope: Consumed 6.998s CPU time, 229.7M memory peak. May 13 14:23:16.173125 systemd-logind[1524]: Session 11 logged out. Waiting for processes to exit. May 13 14:23:16.177009 systemd-logind[1524]: Removed session 11. 
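The PullImage request for quay.io/tigera/operator:v1.36.7 issued above is handled by containerd in the k8s.io namespace and completes roughly 2.5 s later (see the pull-complete entry that follows). A minimal sketch of the same pull through the containerd Go client, assuming the default containerd socket; this is illustrative only, since the kubelet actually goes through the CRI ImageService rather than this client.

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Assumed socket path for containerd's gRPC API.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed images live in the "k8s.io" namespace, matching the
	// namespace=k8s.io field in the shim entries above.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "quay.io/tigera/operator:v1.36.7", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled", img.Name(), "digest", img.Target().Digest)
}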
May 13 14:23:16.192596 kubelet[2793]: I0513 14:23:16.192430 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-g2ffw" podStartSLOduration=2.192294151 podStartE2EDuration="2.192294151s" podCreationTimestamp="2025-05-13 14:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 14:23:16.189146151 +0000 UTC m=+6.228168117" watchObservedRunningTime="2025-05-13 14:23:16.192294151 +0000 UTC m=+6.231316117" May 13 14:23:17.137275 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount794011477.mount: Deactivated successfully. May 13 14:23:17.855058 containerd[1551]: time="2025-05-13T14:23:17.854998995Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:17.856869 containerd[1551]: time="2025-05-13T14:23:17.856826791Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 13 14:23:17.858388 containerd[1551]: time="2025-05-13T14:23:17.858323664Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:17.861142 containerd[1551]: time="2025-05-13T14:23:17.861099176Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:17.861910 containerd[1551]: time="2025-05-13T14:23:17.861766822Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.479033087s" May 13 14:23:17.861910 containerd[1551]: time="2025-05-13T14:23:17.861802599Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 13 14:23:17.864981 containerd[1551]: time="2025-05-13T14:23:17.864939129Z" level=info msg="CreateContainer within sandbox \"48563a5bbec53f9a55f1ec0e0439077594beddaf302de1333e8dca2b993490ff\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 13 14:23:17.878655 containerd[1551]: time="2025-05-13T14:23:17.878622558Z" level=info msg="Container 0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:17.887854 containerd[1551]: time="2025-05-13T14:23:17.887819675Z" level=info msg="CreateContainer within sandbox \"48563a5bbec53f9a55f1ec0e0439077594beddaf302de1333e8dca2b993490ff\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8\"" May 13 14:23:17.888321 containerd[1551]: time="2025-05-13T14:23:17.888299173Z" level=info msg="StartContainer for \"0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8\"" May 13 14:23:17.889081 containerd[1551]: time="2025-05-13T14:23:17.889048522Z" level=info msg="connecting to shim 0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8" 
address="unix:///run/containerd/s/2726cc54310c4035aab63c3056eb3cef624304738df4ca12774a3823adbaa39e" protocol=ttrpc version=3 May 13 14:23:17.909508 systemd[1]: Started cri-containerd-0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8.scope - libcontainer container 0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8. May 13 14:23:17.940955 containerd[1551]: time="2025-05-13T14:23:17.940876953Z" level=info msg="StartContainer for \"0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8\" returns successfully" May 13 14:23:21.292611 kubelet[2793]: I0513 14:23:21.292530 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-5zfwn" podStartSLOduration=4.810537559 podStartE2EDuration="7.292514733s" podCreationTimestamp="2025-05-13 14:23:14 +0000 UTC" firstStartedPulling="2025-05-13 14:23:15.380852387 +0000 UTC m=+5.419874303" lastFinishedPulling="2025-05-13 14:23:17.862829561 +0000 UTC m=+7.901851477" observedRunningTime="2025-05-13 14:23:18.192351532 +0000 UTC m=+8.231373498" watchObservedRunningTime="2025-05-13 14:23:21.292514733 +0000 UTC m=+11.331536649" May 13 14:23:21.301508 systemd[1]: Created slice kubepods-besteffort-podaa36223e_4de2_4292_8119_66c685569191.slice - libcontainer container kubepods-besteffort-podaa36223e_4de2_4292_8119_66c685569191.slice. May 13 14:23:21.340243 kubelet[2793]: I0513 14:23:21.340208 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/aa36223e-4de2-4292-8119-66c685569191-typha-certs\") pod \"calico-typha-58d75b9f4c-9xjhh\" (UID: \"aa36223e-4de2-4292-8119-66c685569191\") " pod="calico-system/calico-typha-58d75b9f4c-9xjhh" May 13 14:23:21.340243 kubelet[2793]: I0513 14:23:21.340247 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbfx\" (UniqueName: \"kubernetes.io/projected/aa36223e-4de2-4292-8119-66c685569191-kube-api-access-5jbfx\") pod \"calico-typha-58d75b9f4c-9xjhh\" (UID: \"aa36223e-4de2-4292-8119-66c685569191\") " pod="calico-system/calico-typha-58d75b9f4c-9xjhh" May 13 14:23:21.341447 kubelet[2793]: I0513 14:23:21.340270 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa36223e-4de2-4292-8119-66c685569191-tigera-ca-bundle\") pod \"calico-typha-58d75b9f4c-9xjhh\" (UID: \"aa36223e-4de2-4292-8119-66c685569191\") " pod="calico-system/calico-typha-58d75b9f4c-9xjhh" May 13 14:23:21.486839 systemd[1]: Created slice kubepods-besteffort-pod3af519b6_40af_4414_bde1_91242298fdf4.slice - libcontainer container kubepods-besteffort-pod3af519b6_40af_4414_bde1_91242298fdf4.slice. 
May 13 14:23:21.541385 kubelet[2793]: I0513 14:23:21.541280 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-var-run-calico\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.541385 kubelet[2793]: I0513 14:23:21.541319 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-flexvol-driver-host\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.541934 kubelet[2793]: I0513 14:23:21.541563 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-cni-bin-dir\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.541934 kubelet[2793]: I0513 14:23:21.541592 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3af519b6-40af-4414-bde1-91242298fdf4-tigera-ca-bundle\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.541934 kubelet[2793]: I0513 14:23:21.541612 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3af519b6-40af-4414-bde1-91242298fdf4-node-certs\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.541934 kubelet[2793]: I0513 14:23:21.541690 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-var-lib-calico\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.542936 kubelet[2793]: I0513 14:23:21.542121 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-cni-net-dir\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.542936 kubelet[2793]: I0513 14:23:21.542149 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-policysync\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.542936 kubelet[2793]: I0513 14:23:21.542839 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-xtables-lock\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.542936 kubelet[2793]: I0513 14:23:21.542866 2793 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7g99\" (UniqueName: \"kubernetes.io/projected/3af519b6-40af-4414-bde1-91242298fdf4-kube-api-access-w7g99\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.543578 kubelet[2793]: I0513 14:23:21.543126 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-lib-modules\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.543578 kubelet[2793]: I0513 14:23:21.543222 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3af519b6-40af-4414-bde1-91242298fdf4-cni-log-dir\") pod \"calico-node-4lklq\" (UID: \"3af519b6-40af-4414-bde1-91242298fdf4\") " pod="calico-system/calico-node-4lklq" May 13 14:23:21.580512 kubelet[2793]: E0513 14:23:21.580469 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:21.608526 containerd[1551]: time="2025-05-13T14:23:21.608486327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-58d75b9f4c-9xjhh,Uid:aa36223e-4de2-4292-8119-66c685569191,Namespace:calico-system,Attempt:0,}" May 13 14:23:21.645746 kubelet[2793]: E0513 14:23:21.645438 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.645746 kubelet[2793]: W0513 14:23:21.645551 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.645746 kubelet[2793]: E0513 14:23:21.645699 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.646467 kubelet[2793]: E0513 14:23:21.645956 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.646467 kubelet[2793]: W0513 14:23:21.645969 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.646467 kubelet[2793]: E0513 14:23:21.646042 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.646467 kubelet[2793]: E0513 14:23:21.646314 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.646467 kubelet[2793]: W0513 14:23:21.646323 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.646467 kubelet[2793]: E0513 14:23:21.646385 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.647666 kubelet[2793]: E0513 14:23:21.647191 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.647666 kubelet[2793]: W0513 14:23:21.647204 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.647666 kubelet[2793]: E0513 14:23:21.647317 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.647666 kubelet[2793]: W0513 14:23:21.647324 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.647666 kubelet[2793]: E0513 14:23:21.647402 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.647666 kubelet[2793]: E0513 14:23:21.647442 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.648297 kubelet[2793]: E0513 14:23:21.648035 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.648297 kubelet[2793]: W0513 14:23:21.648047 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.648297 kubelet[2793]: E0513 14:23:21.648062 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.648297 kubelet[2793]: E0513 14:23:21.648250 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.648297 kubelet[2793]: W0513 14:23:21.648258 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.648297 kubelet[2793]: E0513 14:23:21.648268 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.649091 kubelet[2793]: E0513 14:23:21.648430 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.649091 kubelet[2793]: W0513 14:23:21.648438 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.649091 kubelet[2793]: E0513 14:23:21.648583 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.649091 kubelet[2793]: E0513 14:23:21.648650 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.649091 kubelet[2793]: W0513 14:23:21.648656 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.649091 kubelet[2793]: I0513 14:23:21.648608 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb-socket-dir\") pod \"csi-node-driver-g62pr\" (UID: \"b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb\") " pod="calico-system/csi-node-driver-g62pr" May 13 14:23:21.649091 kubelet[2793]: E0513 14:23:21.648744 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.649091 kubelet[2793]: E0513 14:23:21.648841 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.649091 kubelet[2793]: W0513 14:23:21.648848 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.649091 kubelet[2793]: E0513 14:23:21.649031 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.649320 kubelet[2793]: W0513 14:23:21.649039 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.649320 kubelet[2793]: E0513 14:23:21.649178 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.649320 kubelet[2793]: W0513 14:23:21.649186 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.649320 kubelet[2793]: E0513 14:23:21.649195 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.651907 kubelet[2793]: E0513 14:23:21.649331 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.651907 kubelet[2793]: W0513 14:23:21.649340 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.651907 kubelet[2793]: E0513 14:23:21.649349 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.651907 kubelet[2793]: E0513 14:23:21.649560 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.651907 kubelet[2793]: W0513 14:23:21.649568 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.651907 kubelet[2793]: E0513 14:23:21.649577 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.651907 kubelet[2793]: E0513 14:23:21.649907 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.651907 kubelet[2793]: W0513 14:23:21.649915 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.651907 kubelet[2793]: E0513 14:23:21.649924 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.651907 kubelet[2793]: E0513 14:23:21.650308 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.652146 kubelet[2793]: W0513 14:23:21.650316 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.652146 kubelet[2793]: E0513 14:23:21.650326 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.652146 kubelet[2793]: E0513 14:23:21.650569 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.652146 kubelet[2793]: W0513 14:23:21.650577 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.652146 kubelet[2793]: E0513 14:23:21.650586 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.652146 kubelet[2793]: E0513 14:23:21.650786 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.652146 kubelet[2793]: W0513 14:23:21.650795 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.652146 kubelet[2793]: E0513 14:23:21.651667 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.652443 kubelet[2793]: E0513 14:23:21.652416 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.652490 kubelet[2793]: E0513 14:23:21.652457 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.653291 kubelet[2793]: E0513 14:23:21.653278 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.653411 kubelet[2793]: W0513 14:23:21.653396 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.654446 kubelet[2793]: E0513 14:23:21.654429 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.654898 kubelet[2793]: E0513 14:23:21.654686 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.654898 kubelet[2793]: W0513 14:23:21.654884 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.654970 kubelet[2793]: E0513 14:23:21.654903 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.655053 kubelet[2793]: E0513 14:23:21.655038 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.655053 kubelet[2793]: W0513 14:23:21.655049 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.655168 kubelet[2793]: E0513 14:23:21.655063 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.655196 kubelet[2793]: E0513 14:23:21.655183 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.655196 kubelet[2793]: W0513 14:23:21.655191 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.655246 kubelet[2793]: E0513 14:23:21.655199 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.655431 kubelet[2793]: E0513 14:23:21.655413 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.655431 kubelet[2793]: W0513 14:23:21.655426 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.656511 kubelet[2793]: E0513 14:23:21.655441 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.656511 kubelet[2793]: E0513 14:23:21.655550 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.656511 kubelet[2793]: W0513 14:23:21.655558 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.656511 kubelet[2793]: E0513 14:23:21.655566 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.656511 kubelet[2793]: E0513 14:23:21.655778 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.656511 kubelet[2793]: W0513 14:23:21.655787 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.656511 kubelet[2793]: E0513 14:23:21.655796 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.656883 kubelet[2793]: E0513 14:23:21.656570 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.656883 kubelet[2793]: W0513 14:23:21.656585 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.656883 kubelet[2793]: E0513 14:23:21.656600 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.658080 containerd[1551]: time="2025-05-13T14:23:21.658039595Z" level=info msg="connecting to shim d4ba9713c0b367d8a4fa8e6c68c7aa3fb6718a4fa53788d961c51e72138891f2" address="unix:///run/containerd/s/c7c469ca1ce642a85e7974d4e8662779ca6f8edcf6ec61e4216e02a8d7de3fea" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:21.659390 kubelet[2793]: E0513 14:23:21.658571 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.659390 kubelet[2793]: W0513 14:23:21.658858 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.659390 kubelet[2793]: E0513 14:23:21.658872 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.659736 kubelet[2793]: E0513 14:23:21.659573 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.659968 kubelet[2793]: W0513 14:23:21.659857 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.662881 kubelet[2793]: E0513 14:23:21.660483 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.663720 kubelet[2793]: E0513 14:23:21.663130 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.663720 kubelet[2793]: W0513 14:23:21.663142 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.663720 kubelet[2793]: E0513 14:23:21.663153 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.663720 kubelet[2793]: I0513 14:23:21.663171 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb-varrun\") pod \"csi-node-driver-g62pr\" (UID: \"b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb\") " pod="calico-system/csi-node-driver-g62pr" May 13 14:23:21.663919 kubelet[2793]: E0513 14:23:21.663907 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.664104 kubelet[2793]: W0513 14:23:21.663970 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.666825 kubelet[2793]: E0513 14:23:21.664401 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.666943 kubelet[2793]: E0513 14:23:21.666194 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.666984 kubelet[2793]: W0513 14:23:21.666942 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.667030 kubelet[2793]: I0513 14:23:21.666921 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb-kubelet-dir\") pod \"csi-node-driver-g62pr\" (UID: \"b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb\") " pod="calico-system/csi-node-driver-g62pr" May 13 14:23:21.667099 kubelet[2793]: E0513 14:23:21.667085 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.668795 kubelet[2793]: E0513 14:23:21.668740 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.668795 kubelet[2793]: W0513 14:23:21.668755 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.668956 kubelet[2793]: E0513 14:23:21.668891 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.668956 kubelet[2793]: W0513 14:23:21.668909 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.668956 kubelet[2793]: E0513 14:23:21.668913 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.668956 kubelet[2793]: E0513 14:23:21.668941 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.669710 kubelet[2793]: E0513 14:23:21.669678 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.669710 kubelet[2793]: W0513 14:23:21.669690 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.670225 kubelet[2793]: E0513 14:23:21.670198 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.672469 kubelet[2793]: E0513 14:23:21.672434 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.672469 kubelet[2793]: W0513 14:23:21.672449 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.673610 kubelet[2793]: E0513 14:23:21.673571 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.673610 kubelet[2793]: W0513 14:23:21.673587 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.673929 kubelet[2793]: E0513 14:23:21.673696 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.673929 kubelet[2793]: W0513 14:23:21.673708 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.673929 kubelet[2793]: E0513 14:23:21.673805 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.673929 kubelet[2793]: W0513 14:23:21.673812 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.678138 kubelet[2793]: E0513 14:23:21.677411 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.678138 kubelet[2793]: W0513 14:23:21.677427 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.678138 kubelet[2793]: E0513 14:23:21.677442 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.678138 kubelet[2793]: E0513 14:23:21.677458 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.678138 kubelet[2793]: E0513 14:23:21.677924 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.678138 kubelet[2793]: W0513 14:23:21.677934 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.678138 kubelet[2793]: E0513 14:23:21.677943 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.678464 kubelet[2793]: E0513 14:23:21.678181 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.678464 kubelet[2793]: W0513 14:23:21.678190 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.678464 kubelet[2793]: E0513 14:23:21.678199 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.681385 kubelet[2793]: E0513 14:23:21.679067 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.681385 kubelet[2793]: E0513 14:23:21.679084 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.681385 kubelet[2793]: E0513 14:23:21.679095 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.681385 kubelet[2793]: I0513 14:23:21.679116 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frt7b\" (UniqueName: \"kubernetes.io/projected/b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb-kube-api-access-frt7b\") pod \"csi-node-driver-g62pr\" (UID: \"b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb\") " pod="calico-system/csi-node-driver-g62pr" May 13 14:23:21.681537 kubelet[2793]: E0513 14:23:21.681423 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.681537 kubelet[2793]: W0513 14:23:21.681434 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.681537 kubelet[2793]: E0513 14:23:21.681452 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.683650 kubelet[2793]: E0513 14:23:21.683255 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.683650 kubelet[2793]: W0513 14:23:21.683268 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.683650 kubelet[2793]: E0513 14:23:21.683283 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.683650 kubelet[2793]: E0513 14:23:21.683525 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.683650 kubelet[2793]: W0513 14:23:21.683534 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.683650 kubelet[2793]: E0513 14:23:21.683544 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.684226 kubelet[2793]: E0513 14:23:21.684011 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.684226 kubelet[2793]: W0513 14:23:21.684043 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.684878 kubelet[2793]: E0513 14:23:21.684493 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.684878 kubelet[2793]: E0513 14:23:21.684833 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.684878 kubelet[2793]: W0513 14:23:21.684841 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.685338 kubelet[2793]: E0513 14:23:21.685184 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.685700 kubelet[2793]: E0513 14:23:21.685373 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.685700 kubelet[2793]: W0513 14:23:21.685382 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.685700 kubelet[2793]: E0513 14:23:21.685612 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.685885 kubelet[2793]: E0513 14:23:21.685783 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.685885 kubelet[2793]: W0513 14:23:21.685791 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.686319 kubelet[2793]: E0513 14:23:21.686124 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.686319 kubelet[2793]: E0513 14:23:21.686205 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.686319 kubelet[2793]: W0513 14:23:21.686212 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.687379 kubelet[2793]: E0513 14:23:21.687312 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.688271 kubelet[2793]: E0513 14:23:21.687516 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.688271 kubelet[2793]: W0513 14:23:21.687530 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.688271 kubelet[2793]: E0513 14:23:21.687576 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.688271 kubelet[2793]: E0513 14:23:21.687662 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.688271 kubelet[2793]: W0513 14:23:21.687670 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.688271 kubelet[2793]: E0513 14:23:21.687768 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.688271 kubelet[2793]: W0513 14:23:21.687775 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.688271 kubelet[2793]: E0513 14:23:21.687891 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.688271 kubelet[2793]: E0513 14:23:21.687905 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.688271 kubelet[2793]: E0513 14:23:21.688005 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.688588 kubelet[2793]: W0513 14:23:21.688013 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.688588 kubelet[2793]: E0513 14:23:21.688022 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.688588 kubelet[2793]: I0513 14:23:21.688053 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb-registration-dir\") pod \"csi-node-driver-g62pr\" (UID: \"b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb\") " pod="calico-system/csi-node-driver-g62pr" May 13 14:23:21.688588 kubelet[2793]: E0513 14:23:21.688186 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.688588 kubelet[2793]: W0513 14:23:21.688195 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.688588 kubelet[2793]: E0513 14:23:21.688211 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.688588 kubelet[2793]: E0513 14:23:21.688396 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.688588 kubelet[2793]: W0513 14:23:21.688404 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.688588 kubelet[2793]: E0513 14:23:21.688413 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.688798 kubelet[2793]: E0513 14:23:21.688574 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.688798 kubelet[2793]: W0513 14:23:21.688582 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.688798 kubelet[2793]: E0513 14:23:21.688631 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.688798 kubelet[2793]: E0513 14:23:21.688790 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.688798 kubelet[2793]: W0513 14:23:21.688799 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.688914 kubelet[2793]: E0513 14:23:21.688811 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.689641 kubelet[2793]: E0513 14:23:21.688952 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.689641 kubelet[2793]: W0513 14:23:21.688965 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.689641 kubelet[2793]: E0513 14:23:21.688973 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.689641 kubelet[2793]: E0513 14:23:21.689172 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.689641 kubelet[2793]: W0513 14:23:21.689180 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.689641 kubelet[2793]: E0513 14:23:21.689193 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.689641 kubelet[2793]: E0513 14:23:21.689341 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.689641 kubelet[2793]: W0513 14:23:21.689349 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.689641 kubelet[2793]: E0513 14:23:21.689393 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.689641 kubelet[2793]: E0513 14:23:21.689512 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.690346 kubelet[2793]: W0513 14:23:21.689519 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.690346 kubelet[2793]: E0513 14:23:21.689527 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.690346 kubelet[2793]: E0513 14:23:21.689689 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.690346 kubelet[2793]: W0513 14:23:21.689729 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.690346 kubelet[2793]: E0513 14:23:21.689820 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.690346 kubelet[2793]: E0513 14:23:21.689913 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.690346 kubelet[2793]: W0513 14:23:21.689920 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.690346 kubelet[2793]: E0513 14:23:21.689933 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.690346 kubelet[2793]: E0513 14:23:21.690074 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.690346 kubelet[2793]: W0513 14:23:21.690081 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.692060 kubelet[2793]: E0513 14:23:21.690095 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.692060 kubelet[2793]: E0513 14:23:21.690261 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.692060 kubelet[2793]: W0513 14:23:21.690269 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.692060 kubelet[2793]: E0513 14:23:21.690277 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.711288 kubelet[2793]: E0513 14:23:21.711256 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.711615 kubelet[2793]: W0513 14:23:21.711591 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.711744 kubelet[2793]: E0513 14:23:21.711617 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.719529 systemd[1]: Started cri-containerd-d4ba9713c0b367d8a4fa8e6c68c7aa3fb6718a4fa53788d961c51e72138891f2.scope - libcontainer container d4ba9713c0b367d8a4fa8e6c68c7aa3fb6718a4fa53788d961c51e72138891f2. 
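The burst of kubelet errors above is one underlying problem repeated on every plugin probe: the FlexVolume directory nodeagent~uds exists under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, but the driver binary uds inside it cannot be executed ("executable file not found in $PATH"), so the init call produces no output and driver-call.go fails to unmarshal an empty string as JSON. As a rough illustration of the contract kubelet expects from a FlexVolume driver, here is a minimal, hypothetical stub (not the real nodeagent~uds binary; the file name and behaviour are assumptions, not taken from the log) that answers init with valid JSON and declines everything else:

    // flexvol_init_stub.go - minimal, hypothetical FlexVolume driver stub.
    // It only answers the "init" call kubelet issues while probing
    // /opt/libexec/kubernetes/kubelet-plugins/volume/exec/<vendor~driver>/<driver>;
    // all other operations report "Not supported". Illustrative only.
    package main

    import (
    	"encoding/json"
    	"fmt"
    	"os"
    )

    type driverStatus struct {
    	Status       string          `json:"status"`
    	Message      string          `json:"message,omitempty"`
    	Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
    	op := ""
    	if len(os.Args) > 1 {
    		op = os.Args[1]
    	}

    	var out driverStatus
    	switch op {
    	case "init":
    		// kubelet unmarshals this JSON; an empty reply is what produces
    		// the "unexpected end of JSON input" errors in driver-call.go above.
    		out = driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}}
    	default:
    		out = driverStatus{Status: "Not supported", Message: fmt.Sprintf("operation %q not implemented", op)}
    	}

    	if err := json.NewEncoder(os.Stdout).Encode(out); err != nil {
    		os.Exit(1)
    	}
    }

With a working executable in place, the init probe would return {"status":"Success",...} and the repeated plugins.go "Error dynamically probing plugins" entries would stop; with no executable at all, kubelet keeps re-probing on every volume reconcile pass, which is why the same three-line error group recurs throughout this section.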
May 13 14:23:21.791611 kubelet[2793]: E0513 14:23:21.791189 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.791611 kubelet[2793]: W0513 14:23:21.791615 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.791817 kubelet[2793]: E0513 14:23:21.791796 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.792702 kubelet[2793]: E0513 14:23:21.792680 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.792702 kubelet[2793]: W0513 14:23:21.792695 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.792990 containerd[1551]: time="2025-05-13T14:23:21.792916010Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4lklq,Uid:3af519b6-40af-4414-bde1-91242298fdf4,Namespace:calico-system,Attempt:0,}" May 13 14:23:21.793052 kubelet[2793]: E0513 14:23:21.792706 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.793777 kubelet[2793]: E0513 14:23:21.793741 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.793777 kubelet[2793]: W0513 14:23:21.793757 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.793862 kubelet[2793]: E0513 14:23:21.793847 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.795368 kubelet[2793]: E0513 14:23:21.794692 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.795368 kubelet[2793]: W0513 14:23:21.794734 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.795368 kubelet[2793]: E0513 14:23:21.794817 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.795368 kubelet[2793]: E0513 14:23:21.795006 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.795368 kubelet[2793]: W0513 14:23:21.795015 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.795368 kubelet[2793]: E0513 14:23:21.795090 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.795368 kubelet[2793]: E0513 14:23:21.795278 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.795368 kubelet[2793]: W0513 14:23:21.795287 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.795603 kubelet[2793]: E0513 14:23:21.795378 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.795603 kubelet[2793]: E0513 14:23:21.795535 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.795603 kubelet[2793]: W0513 14:23:21.795544 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.795603 kubelet[2793]: E0513 14:23:21.795560 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.796662 kubelet[2793]: E0513 14:23:21.796613 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.796662 kubelet[2793]: W0513 14:23:21.796628 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.797148 kubelet[2793]: E0513 14:23:21.796639 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.797148 kubelet[2793]: E0513 14:23:21.797034 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.797148 kubelet[2793]: W0513 14:23:21.797042 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.797148 kubelet[2793]: E0513 14:23:21.797059 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.798443 kubelet[2793]: E0513 14:23:21.797530 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.798443 kubelet[2793]: W0513 14:23:21.797538 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.798443 kubelet[2793]: E0513 14:23:21.798270 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.798546 containerd[1551]: time="2025-05-13T14:23:21.798087272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-58d75b9f4c-9xjhh,Uid:aa36223e-4de2-4292-8119-66c685569191,Namespace:calico-system,Attempt:0,} returns sandbox id \"d4ba9713c0b367d8a4fa8e6c68c7aa3fb6718a4fa53788d961c51e72138891f2\"" May 13 14:23:21.798583 kubelet[2793]: E0513 14:23:21.798489 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.798583 kubelet[2793]: W0513 14:23:21.798496 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.798583 kubelet[2793]: E0513 14:23:21.798508 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.798663 kubelet[2793]: E0513 14:23:21.798623 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.798663 kubelet[2793]: W0513 14:23:21.798631 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.798663 kubelet[2793]: E0513 14:23:21.798640 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.799433 kubelet[2793]: E0513 14:23:21.798814 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.799433 kubelet[2793]: W0513 14:23:21.798827 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.799433 kubelet[2793]: E0513 14:23:21.798976 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.800456 kubelet[2793]: E0513 14:23:21.799943 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.800456 kubelet[2793]: W0513 14:23:21.799951 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.800456 kubelet[2793]: E0513 14:23:21.800218 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.800456 kubelet[2793]: W0513 14:23:21.800226 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.803369 kubelet[2793]: E0513 14:23:21.800562 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.803369 kubelet[2793]: W0513 14:23:21.800575 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.803369 kubelet[2793]: E0513 14:23:21.800740 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.803369 kubelet[2793]: E0513 14:23:21.800756 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.803369 kubelet[2793]: E0513 14:23:21.800767 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.803369 kubelet[2793]: E0513 14:23:21.801191 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.803369 kubelet[2793]: W0513 14:23:21.801201 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.803369 kubelet[2793]: E0513 14:23:21.801211 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.803369 kubelet[2793]: E0513 14:23:21.801629 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.803369 kubelet[2793]: W0513 14:23:21.801707 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.803648 kubelet[2793]: E0513 14:23:21.801721 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.803648 kubelet[2793]: E0513 14:23:21.802185 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.803648 kubelet[2793]: W0513 14:23:21.802205 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.803648 kubelet[2793]: E0513 14:23:21.803243 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.803648 kubelet[2793]: E0513 14:23:21.803565 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.803648 kubelet[2793]: W0513 14:23:21.803575 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.803790 kubelet[2793]: E0513 14:23:21.803733 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.804413 kubelet[2793]: E0513 14:23:21.803957 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.804413 kubelet[2793]: W0513 14:23:21.803970 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.804413 kubelet[2793]: E0513 14:23:21.804034 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.804413 kubelet[2793]: E0513 14:23:21.804347 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.804413 kubelet[2793]: W0513 14:23:21.804384 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.804561 containerd[1551]: time="2025-05-13T14:23:21.804293164Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 13 14:23:21.804592 kubelet[2793]: E0513 14:23:21.804479 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.805296 kubelet[2793]: E0513 14:23:21.804813 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.805296 kubelet[2793]: W0513 14:23:21.804850 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.805296 kubelet[2793]: E0513 14:23:21.804890 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:21.805296 kubelet[2793]: E0513 14:23:21.805283 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.805296 kubelet[2793]: W0513 14:23:21.805291 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.805296 kubelet[2793]: E0513 14:23:21.805300 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.805783 kubelet[2793]: E0513 14:23:21.805774 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.805783 kubelet[2793]: W0513 14:23:21.805783 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.805834 kubelet[2793]: E0513 14:23:21.805792 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.824619 kubelet[2793]: E0513 14:23:21.824556 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:21.824619 kubelet[2793]: W0513 14:23:21.824574 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:21.824822 kubelet[2793]: E0513 14:23:21.824590 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:21.842681 containerd[1551]: time="2025-05-13T14:23:21.842519962Z" level=info msg="connecting to shim 7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8" address="unix:///run/containerd/s/a0d227a6010b8a8ac8c9fa12b4b997b58e1f68375a53dc42c760d387bc2c240d" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:21.871537 systemd[1]: Started cri-containerd-7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8.scope - libcontainer container 7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8. 
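At this point containerd has connected a shim for the calico-node-4lklq sandbox (address unix:///run/containerd/s/a0d227a6..., namespace=k8s.io) and systemd has started the matching cri-containerd-7ff433....scope, alongside the calico-typha sandbox d4ba9713... created earlier. For inspecting that state from code rather than from the journal, a small sketch along these lines could be used; it assumes the classic github.com/containerd/containerd Go client import path and the default socket /run/containerd/containerd.sock, neither of which appears in the log itself:

    // list_k8s_containers.go - sketch: enumerate the containers behind the
    // cri-containerd-<id>.scope units reported by systemd above.
    // Run on the node with root privileges.
    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	"github.com/containerd/containerd"
    	"github.com/containerd/containerd/namespaces"
    )

    func main() {
    	client, err := containerd.New("/run/containerd/containerd.sock")
    	if err != nil {
    		log.Fatalf("connect to containerd: %v", err)
    	}
    	defer client.Close()

    	// CRI-managed containers live in the "k8s.io" namespace, matching the
    	// namespace=k8s.io field in the "connecting to shim" messages above.
    	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
    	ctrs, err := client.Containers(ctx)
    	if err != nil {
    		log.Fatalf("list containers: %v", err)
    	}
    	for _, c := range ctrs {
    		fmt.Println(c.ID()) // e.g. d4ba9713c0b3... and 7ff433576b9a... from this log
    	}
    }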
May 13 14:23:21.914037 containerd[1551]: time="2025-05-13T14:23:21.913816482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4lklq,Uid:3af519b6-40af-4414-bde1-91242298fdf4,Namespace:calico-system,Attempt:0,} returns sandbox id \"7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8\"" May 13 14:23:23.070615 kubelet[2793]: E0513 14:23:23.070513 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:23.442695 kubelet[2793]: E0513 14:23:23.442668 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.442695 kubelet[2793]: W0513 14:23:23.442688 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.442908 kubelet[2793]: E0513 14:23:23.442704 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.442908 kubelet[2793]: E0513 14:23:23.442827 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.442908 kubelet[2793]: W0513 14:23:23.442835 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.442908 kubelet[2793]: E0513 14:23:23.442843 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.443084 kubelet[2793]: E0513 14:23:23.442954 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.443084 kubelet[2793]: W0513 14:23:23.442964 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.443084 kubelet[2793]: E0513 14:23:23.442972 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.443084 kubelet[2793]: E0513 14:23:23.443083 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.443292 kubelet[2793]: W0513 14:23:23.443091 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.443292 kubelet[2793]: E0513 14:23:23.443099 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:23.443292 kubelet[2793]: E0513 14:23:23.443215 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.443292 kubelet[2793]: W0513 14:23:23.443223 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.443292 kubelet[2793]: E0513 14:23:23.443231 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.443569 kubelet[2793]: E0513 14:23:23.443338 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.443569 kubelet[2793]: W0513 14:23:23.443347 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.443569 kubelet[2793]: E0513 14:23:23.443378 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.443569 kubelet[2793]: E0513 14:23:23.443489 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.443569 kubelet[2793]: W0513 14:23:23.443496 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.443569 kubelet[2793]: E0513 14:23:23.443504 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.443859 kubelet[2793]: E0513 14:23:23.443609 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.443859 kubelet[2793]: W0513 14:23:23.443617 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.443859 kubelet[2793]: E0513 14:23:23.443625 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.443859 kubelet[2793]: E0513 14:23:23.443763 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.443859 kubelet[2793]: W0513 14:23:23.443771 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.443859 kubelet[2793]: E0513 14:23:23.443779 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:23.444178 kubelet[2793]: E0513 14:23:23.443887 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.444178 kubelet[2793]: W0513 14:23:23.443896 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.444178 kubelet[2793]: E0513 14:23:23.443904 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.444178 kubelet[2793]: E0513 14:23:23.444032 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.444178 kubelet[2793]: W0513 14:23:23.444040 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.444178 kubelet[2793]: E0513 14:23:23.444050 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.444329 kubelet[2793]: E0513 14:23:23.444196 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.444329 kubelet[2793]: W0513 14:23:23.444204 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.444329 kubelet[2793]: E0513 14:23:23.444212 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.444329 kubelet[2793]: E0513 14:23:23.444327 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.444442 kubelet[2793]: W0513 14:23:23.444335 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.444442 kubelet[2793]: E0513 14:23:23.444343 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.444492 kubelet[2793]: E0513 14:23:23.444457 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.444492 kubelet[2793]: W0513 14:23:23.444465 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.444492 kubelet[2793]: E0513 14:23:23.444473 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:23.444704 kubelet[2793]: E0513 14:23:23.444598 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.444704 kubelet[2793]: W0513 14:23:23.444606 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.444704 kubelet[2793]: E0513 14:23:23.444614 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.444959 kubelet[2793]: E0513 14:23:23.444730 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.444959 kubelet[2793]: W0513 14:23:23.444738 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.444959 kubelet[2793]: E0513 14:23:23.444746 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.444959 kubelet[2793]: E0513 14:23:23.444918 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.444959 kubelet[2793]: W0513 14:23:23.444926 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.444959 kubelet[2793]: E0513 14:23:23.444935 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.445317 kubelet[2793]: E0513 14:23:23.445046 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.445317 kubelet[2793]: W0513 14:23:23.445054 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.445317 kubelet[2793]: E0513 14:23:23.445063 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.445317 kubelet[2793]: E0513 14:23:23.445183 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.445317 kubelet[2793]: W0513 14:23:23.445191 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.445317 kubelet[2793]: E0513 14:23:23.445200 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:23.445955 kubelet[2793]: E0513 14:23:23.445331 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.445955 kubelet[2793]: W0513 14:23:23.445339 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.445955 kubelet[2793]: E0513 14:23:23.445379 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.445955 kubelet[2793]: E0513 14:23:23.445554 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.445955 kubelet[2793]: W0513 14:23:23.445585 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.445955 kubelet[2793]: E0513 14:23:23.445595 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.445955 kubelet[2793]: E0513 14:23:23.445738 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.445955 kubelet[2793]: W0513 14:23:23.445747 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.445955 kubelet[2793]: E0513 14:23:23.445780 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.445955 kubelet[2793]: E0513 14:23:23.445899 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.446182 kubelet[2793]: W0513 14:23:23.445907 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.446182 kubelet[2793]: E0513 14:23:23.445915 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:23.446182 kubelet[2793]: E0513 14:23:23.446074 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.446182 kubelet[2793]: W0513 14:23:23.446083 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.446182 kubelet[2793]: E0513 14:23:23.446092 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:23.446293 kubelet[2793]: E0513 14:23:23.446266 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:23.446293 kubelet[2793]: W0513 14:23:23.446275 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:23.446293 kubelet[2793]: E0513 14:23:23.446284 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:25.070891 kubelet[2793]: E0513 14:23:25.070156 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:25.654012 containerd[1551]: time="2025-05-13T14:23:25.653962781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:25.655219 containerd[1551]: time="2025-05-13T14:23:25.655175494Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 13 14:23:25.656312 containerd[1551]: time="2025-05-13T14:23:25.656271649Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:25.660824 containerd[1551]: time="2025-05-13T14:23:25.660740708Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:25.661375 containerd[1551]: time="2025-05-13T14:23:25.661134510Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.85681087s" May 13 14:23:25.661375 containerd[1551]: time="2025-05-13T14:23:25.661276814Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 13 14:23:25.663297 containerd[1551]: time="2025-05-13T14:23:25.663209463Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 13 14:23:25.673859 containerd[1551]: time="2025-05-13T14:23:25.673036132Z" level=info msg="CreateContainer within sandbox \"d4ba9713c0b367d8a4fa8e6c68c7aa3fb6718a4fa53788d961c51e72138891f2\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 14:23:25.687608 containerd[1551]: time="2025-05-13T14:23:25.687573618Z" level=info msg="Container 76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:25.693537 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2773486262.mount: Deactivated successfully. 
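The typha image pull above reports a repo size of 31919484 bytes fetched in 3.85681087s (with 30426870 bytes read when the "stop pulling" event fired). Purely as a back-of-the-envelope check of the effective pull rate from those two printed figures:

    // pull_rate.go - throughput implied by the "Pulled image ... in 3.85681087s"
    // containerd event above; the numbers are copied from the log, the
    // calculation itself is illustrative.
    package main

    import "fmt"

    func main() {
    	const bytesPulled = 31919484 // repo size from the Pulled image event
    	const seconds = 3.85681087   // reported pull duration

    	bps := bytesPulled / seconds
    	fmt.Printf("%.2f MB/s (%.2f MiB/s)\n", bps/1e6, bps/(1024*1024))
    }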
May 13 14:23:25.704874 containerd[1551]: time="2025-05-13T14:23:25.703888707Z" level=info msg="CreateContainer within sandbox \"d4ba9713c0b367d8a4fa8e6c68c7aa3fb6718a4fa53788d961c51e72138891f2\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83\"" May 13 14:23:25.705425 containerd[1551]: time="2025-05-13T14:23:25.705384274Z" level=info msg="StartContainer for \"76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83\"" May 13 14:23:25.706638 containerd[1551]: time="2025-05-13T14:23:25.706583641Z" level=info msg="connecting to shim 76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83" address="unix:///run/containerd/s/c7c469ca1ce642a85e7974d4e8662779ca6f8edcf6ec61e4216e02a8d7de3fea" protocol=ttrpc version=3 May 13 14:23:25.734551 systemd[1]: Started cri-containerd-76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83.scope - libcontainer container 76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83. May 13 14:23:25.790748 containerd[1551]: time="2025-05-13T14:23:25.790715981Z" level=info msg="StartContainer for \"76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83\" returns successfully" May 13 14:23:26.265811 kubelet[2793]: E0513 14:23:26.265755 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.265811 kubelet[2793]: W0513 14:23:26.265796 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.265811 kubelet[2793]: E0513 14:23:26.265833 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.267538 kubelet[2793]: E0513 14:23:26.266188 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.267538 kubelet[2793]: W0513 14:23:26.266210 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.267538 kubelet[2793]: E0513 14:23:26.266232 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.267538 kubelet[2793]: E0513 14:23:26.266596 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.267538 kubelet[2793]: W0513 14:23:26.266615 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.267538 kubelet[2793]: E0513 14:23:26.266635 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:26.267538 kubelet[2793]: E0513 14:23:26.266898 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.267538 kubelet[2793]: W0513 14:23:26.266916 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.267538 kubelet[2793]: E0513 14:23:26.266937 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.267538 kubelet[2793]: E0513 14:23:26.267223 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.270338 kubelet[2793]: W0513 14:23:26.267242 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.270338 kubelet[2793]: E0513 14:23:26.267262 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.270338 kubelet[2793]: E0513 14:23:26.267619 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.270338 kubelet[2793]: W0513 14:23:26.267639 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.270338 kubelet[2793]: E0513 14:23:26.267660 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.270338 kubelet[2793]: E0513 14:23:26.268026 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.270338 kubelet[2793]: W0513 14:23:26.268348 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.270338 kubelet[2793]: E0513 14:23:26.268453 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.270338 kubelet[2793]: E0513 14:23:26.268768 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.270338 kubelet[2793]: W0513 14:23:26.268786 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.271031 kubelet[2793]: E0513 14:23:26.268807 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:26.271031 kubelet[2793]: E0513 14:23:26.269538 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.271031 kubelet[2793]: W0513 14:23:26.269559 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.271031 kubelet[2793]: E0513 14:23:26.269622 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.271031 kubelet[2793]: E0513 14:23:26.269920 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.271031 kubelet[2793]: W0513 14:23:26.269942 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.271031 kubelet[2793]: E0513 14:23:26.269976 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.271031 kubelet[2793]: E0513 14:23:26.270780 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.271031 kubelet[2793]: W0513 14:23:26.270801 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.271031 kubelet[2793]: E0513 14:23:26.270838 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.272714 kubelet[2793]: E0513 14:23:26.271096 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.272714 kubelet[2793]: W0513 14:23:26.271115 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.272714 kubelet[2793]: E0513 14:23:26.271149 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.272714 kubelet[2793]: E0513 14:23:26.271632 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.272714 kubelet[2793]: W0513 14:23:26.271669 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.272714 kubelet[2793]: E0513 14:23:26.271692 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:26.272714 kubelet[2793]: E0513 14:23:26.272018 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.272714 kubelet[2793]: W0513 14:23:26.272051 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.272714 kubelet[2793]: E0513 14:23:26.272075 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.272714 kubelet[2793]: E0513 14:23:26.272393 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.273280 kubelet[2793]: W0513 14:23:26.272440 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.273280 kubelet[2793]: E0513 14:23:26.272461 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.336542 kubelet[2793]: E0513 14:23:26.336483 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.336542 kubelet[2793]: W0513 14:23:26.336524 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.337296 kubelet[2793]: E0513 14:23:26.336559 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.337296 kubelet[2793]: E0513 14:23:26.336949 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.337296 kubelet[2793]: W0513 14:23:26.336969 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.337296 kubelet[2793]: E0513 14:23:26.337002 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.337778 kubelet[2793]: E0513 14:23:26.337734 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.337983 kubelet[2793]: W0513 14:23:26.337940 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.338220 kubelet[2793]: E0513 14:23:26.338179 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:26.338683 kubelet[2793]: E0513 14:23:26.338591 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.338683 kubelet[2793]: W0513 14:23:26.338620 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.338683 kubelet[2793]: E0513 14:23:26.338653 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.339604 kubelet[2793]: E0513 14:23:26.338976 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.339604 kubelet[2793]: W0513 14:23:26.338996 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.339604 kubelet[2793]: E0513 14:23:26.339206 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.339604 kubelet[2793]: E0513 14:23:26.339259 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.339604 kubelet[2793]: W0513 14:23:26.339277 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.339604 kubelet[2793]: E0513 14:23:26.339516 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.340152 kubelet[2793]: E0513 14:23:26.339666 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.340152 kubelet[2793]: W0513 14:23:26.339686 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.340152 kubelet[2793]: E0513 14:23:26.339721 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.340739 kubelet[2793]: E0513 14:23:26.340714 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.341059 kubelet[2793]: W0513 14:23:26.340862 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.341059 kubelet[2793]: E0513 14:23:26.340917 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:26.341522 kubelet[2793]: E0513 14:23:26.341344 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.341522 kubelet[2793]: W0513 14:23:26.341423 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.341522 kubelet[2793]: E0513 14:23:26.341497 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.342107 kubelet[2793]: E0513 14:23:26.342074 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.342396 kubelet[2793]: W0513 14:23:26.342290 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.342577 kubelet[2793]: E0513 14:23:26.342432 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.343133 kubelet[2793]: E0513 14:23:26.342969 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.343133 kubelet[2793]: W0513 14:23:26.342994 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.343133 kubelet[2793]: E0513 14:23:26.343067 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.343710 kubelet[2793]: E0513 14:23:26.343684 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.343965 kubelet[2793]: W0513 14:23:26.343853 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.343965 kubelet[2793]: E0513 14:23:26.343939 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.344496 kubelet[2793]: E0513 14:23:26.344433 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.344496 kubelet[2793]: W0513 14:23:26.344463 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.344852 kubelet[2793]: E0513 14:23:26.344788 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:26.345462 kubelet[2793]: E0513 14:23:26.345271 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.345462 kubelet[2793]: W0513 14:23:26.345298 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.345614 kubelet[2793]: E0513 14:23:26.345468 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.346214 kubelet[2793]: E0513 14:23:26.346031 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.346214 kubelet[2793]: W0513 14:23:26.346061 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.347061 kubelet[2793]: E0513 14:23:26.346584 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.347061 kubelet[2793]: W0513 14:23:26.346611 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.347061 kubelet[2793]: E0513 14:23:26.346634 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.347061 kubelet[2793]: E0513 14:23:26.346956 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.347061 kubelet[2793]: W0513 14:23:26.346978 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.347061 kubelet[2793]: E0513 14:23:26.346982 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.347061 kubelet[2793]: E0513 14:23:26.347003 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:26.347596 kubelet[2793]: E0513 14:23:26.347564 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:26.347660 kubelet[2793]: W0513 14:23:26.347586 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:26.347720 kubelet[2793]: E0513 14:23:26.347666 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:27.070519 kubelet[2793]: E0513 14:23:27.070401 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:27.195189 kubelet[2793]: I0513 14:23:27.195082 2793 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 14:23:27.280455 kubelet[2793]: E0513 14:23:27.280294 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.280455 kubelet[2793]: W0513 14:23:27.280331 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.280455 kubelet[2793]: E0513 14:23:27.280402 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.281444 kubelet[2793]: E0513 14:23:27.280741 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.281444 kubelet[2793]: W0513 14:23:27.280761 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.281444 kubelet[2793]: E0513 14:23:27.280782 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.281444 kubelet[2793]: E0513 14:23:27.281137 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.281444 kubelet[2793]: W0513 14:23:27.281156 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.281444 kubelet[2793]: E0513 14:23:27.281176 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.282005 kubelet[2793]: E0513 14:23:27.281530 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.282005 kubelet[2793]: W0513 14:23:27.281551 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.282005 kubelet[2793]: E0513 14:23:27.281571 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:27.282005 kubelet[2793]: E0513 14:23:27.281886 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.282005 kubelet[2793]: W0513 14:23:27.281912 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.282005 kubelet[2793]: E0513 14:23:27.281941 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.282512 kubelet[2793]: E0513 14:23:27.282302 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.282512 kubelet[2793]: W0513 14:23:27.282323 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.282512 kubelet[2793]: E0513 14:23:27.282343 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.282764 kubelet[2793]: E0513 14:23:27.282662 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.282764 kubelet[2793]: W0513 14:23:27.282682 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.282764 kubelet[2793]: E0513 14:23:27.282702 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.283017 kubelet[2793]: E0513 14:23:27.282979 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.283017 kubelet[2793]: W0513 14:23:27.282999 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.283200 kubelet[2793]: E0513 14:23:27.283020 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.283338 kubelet[2793]: E0513 14:23:27.283303 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.283338 kubelet[2793]: W0513 14:23:27.283329 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.283546 kubelet[2793]: E0513 14:23:27.283352 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:27.283724 kubelet[2793]: E0513 14:23:27.283694 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.283724 kubelet[2793]: W0513 14:23:27.283721 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.283893 kubelet[2793]: E0513 14:23:27.283741 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.284044 kubelet[2793]: E0513 14:23:27.284013 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.284044 kubelet[2793]: W0513 14:23:27.284039 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.284201 kubelet[2793]: E0513 14:23:27.284059 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.284384 kubelet[2793]: E0513 14:23:27.284324 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.284384 kubelet[2793]: W0513 14:23:27.284350 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.284660 kubelet[2793]: E0513 14:23:27.284416 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.284724 kubelet[2793]: E0513 14:23:27.284685 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.284724 kubelet[2793]: W0513 14:23:27.284704 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.284834 kubelet[2793]: E0513 14:23:27.284724 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.285008 kubelet[2793]: E0513 14:23:27.284978 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.285008 kubelet[2793]: W0513 14:23:27.285004 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.285179 kubelet[2793]: E0513 14:23:27.285025 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:27.285306 kubelet[2793]: E0513 14:23:27.285272 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.285306 kubelet[2793]: W0513 14:23:27.285304 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.285485 kubelet[2793]: E0513 14:23:27.285323 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.349585 kubelet[2793]: E0513 14:23:27.349423 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.349585 kubelet[2793]: W0513 14:23:27.349457 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.349585 kubelet[2793]: E0513 14:23:27.349485 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.351070 kubelet[2793]: E0513 14:23:27.350996 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.351070 kubelet[2793]: W0513 14:23:27.351031 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.351592 kubelet[2793]: E0513 14:23:27.351223 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.352864 kubelet[2793]: E0513 14:23:27.352735 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.352864 kubelet[2793]: W0513 14:23:27.352764 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.352864 kubelet[2793]: E0513 14:23:27.352841 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.354165 kubelet[2793]: E0513 14:23:27.353825 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.354165 kubelet[2793]: W0513 14:23:27.353878 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.354165 kubelet[2793]: E0513 14:23:27.353990 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:27.355617 kubelet[2793]: E0513 14:23:27.355592 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.355903 kubelet[2793]: W0513 14:23:27.355777 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.355903 kubelet[2793]: E0513 14:23:27.355882 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.356643 kubelet[2793]: E0513 14:23:27.356448 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.356643 kubelet[2793]: W0513 14:23:27.356497 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.356957 kubelet[2793]: E0513 14:23:27.356892 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.357265 kubelet[2793]: E0513 14:23:27.357147 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.357265 kubelet[2793]: W0513 14:23:27.357169 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.357638 kubelet[2793]: E0513 14:23:27.357235 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.358024 kubelet[2793]: E0513 14:23:27.357909 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.358024 kubelet[2793]: W0513 14:23:27.357933 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.358175 kubelet[2793]: E0513 14:23:27.358019 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.358694 kubelet[2793]: E0513 14:23:27.358587 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.358694 kubelet[2793]: W0513 14:23:27.358609 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.358876 kubelet[2793]: E0513 14:23:27.358699 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:27.359498 kubelet[2793]: E0513 14:23:27.359425 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.359498 kubelet[2793]: W0513 14:23:27.359451 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.360232 kubelet[2793]: E0513 14:23:27.359964 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.360232 kubelet[2793]: E0513 14:23:27.360151 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.360232 kubelet[2793]: W0513 14:23:27.360173 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.360826 kubelet[2793]: E0513 14:23:27.360476 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.360826 kubelet[2793]: E0513 14:23:27.360710 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.360826 kubelet[2793]: W0513 14:23:27.360731 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.361194 kubelet[2793]: E0513 14:23:27.361097 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.361194 kubelet[2793]: E0513 14:23:27.361129 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.361194 kubelet[2793]: W0513 14:23:27.361189 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.363224 kubelet[2793]: E0513 14:23:27.362610 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.363224 kubelet[2793]: W0513 14:23:27.362642 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.363224 kubelet[2793]: E0513 14:23:27.361234 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.363224 kubelet[2793]: E0513 14:23:27.362706 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:27.364176 kubelet[2793]: E0513 14:23:27.364130 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.364176 kubelet[2793]: W0513 14:23:27.364161 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.364347 kubelet[2793]: E0513 14:23:27.364197 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.364709 kubelet[2793]: E0513 14:23:27.364675 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.364709 kubelet[2793]: W0513 14:23:27.364706 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.364858 kubelet[2793]: E0513 14:23:27.364780 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.365344 kubelet[2793]: E0513 14:23:27.365311 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.365344 kubelet[2793]: W0513 14:23:27.365339 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.365559 kubelet[2793]: E0513 14:23:27.365437 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 14:23:27.366622 kubelet[2793]: E0513 14:23:27.366591 2793 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 14:23:27.366622 kubelet[2793]: W0513 14:23:27.366613 2793 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 14:23:27.366778 kubelet[2793]: E0513 14:23:27.366635 2793 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 14:23:29.071211 kubelet[2793]: E0513 14:23:29.070558 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:29.200717 containerd[1551]: time="2025-05-13T14:23:29.200670142Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:29.201604 containerd[1551]: time="2025-05-13T14:23:29.201573781Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 13 14:23:29.202695 containerd[1551]: time="2025-05-13T14:23:29.202629575Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:29.205140 containerd[1551]: time="2025-05-13T14:23:29.205094998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:29.205829 containerd[1551]: time="2025-05-13T14:23:29.205715100Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 3.542269318s" May 13 14:23:29.205829 containerd[1551]: time="2025-05-13T14:23:29.205746338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 13 14:23:29.208201 containerd[1551]: time="2025-05-13T14:23:29.208156428Z" level=info msg="CreateContainer within sandbox \"7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 14:23:29.223014 containerd[1551]: time="2025-05-13T14:23:29.222117381Z" level=info msg="Container b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:29.236006 containerd[1551]: time="2025-05-13T14:23:29.235969021Z" level=info msg="CreateContainer within sandbox \"7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0\"" May 13 14:23:29.236852 containerd[1551]: time="2025-05-13T14:23:29.236826224Z" level=info msg="StartContainer for \"b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0\"" May 13 14:23:29.240882 containerd[1551]: time="2025-05-13T14:23:29.240806023Z" level=info msg="connecting to shim b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0" address="unix:///run/containerd/s/a0d227a6010b8a8ac8c9fa12b4b997b58e1f68375a53dc42c760d387bc2c240d" protocol=ttrpc version=3 May 13 14:23:29.265497 systemd[1]: Started 
cri-containerd-b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0.scope - libcontainer container b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0. May 13 14:23:29.312063 containerd[1551]: time="2025-05-13T14:23:29.311740767Z" level=info msg="StartContainer for \"b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0\" returns successfully" May 13 14:23:29.319645 systemd[1]: cri-containerd-b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0.scope: Deactivated successfully. May 13 14:23:29.322662 containerd[1551]: time="2025-05-13T14:23:29.322213134Z" level=info msg="received exit event container_id:\"b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0\" id:\"b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0\" pid:3526 exited_at:{seconds:1747146209 nanos:321895654}" May 13 14:23:29.323266 containerd[1551]: time="2025-05-13T14:23:29.322886246Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0\" id:\"b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0\" pid:3526 exited_at:{seconds:1747146209 nanos:321895654}" May 13 14:23:29.347424 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0-rootfs.mount: Deactivated successfully. May 13 14:23:30.233963 containerd[1551]: time="2025-05-13T14:23:30.233907219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 13 14:23:30.265382 kubelet[2793]: I0513 14:23:30.264256 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-58d75b9f4c-9xjhh" podStartSLOduration=5.402910285 podStartE2EDuration="9.264241562s" podCreationTimestamp="2025-05-13 14:23:21 +0000 UTC" firstStartedPulling="2025-05-13 14:23:21.801046672 +0000 UTC m=+11.840068598" lastFinishedPulling="2025-05-13 14:23:25.662377949 +0000 UTC m=+15.701399875" observedRunningTime="2025-05-13 14:23:26.22324736 +0000 UTC m=+16.262269326" watchObservedRunningTime="2025-05-13 14:23:30.264241562 +0000 UTC m=+20.303263478" May 13 14:23:31.070681 kubelet[2793]: E0513 14:23:31.070565 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:33.070159 kubelet[2793]: E0513 14:23:33.070086 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:35.070549 kubelet[2793]: E0513 14:23:35.070508 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:36.348639 containerd[1551]: time="2025-05-13T14:23:36.348601245Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:36.349988 
containerd[1551]: time="2025-05-13T14:23:36.349964744Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 13 14:23:36.351249 containerd[1551]: time="2025-05-13T14:23:36.351208048Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:36.353725 containerd[1551]: time="2025-05-13T14:23:36.353666605Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:36.354463 containerd[1551]: time="2025-05-13T14:23:36.354325341Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 6.120355826s" May 13 14:23:36.354463 containerd[1551]: time="2025-05-13T14:23:36.354385624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 13 14:23:36.357376 containerd[1551]: time="2025-05-13T14:23:36.356864128Z" level=info msg="CreateContainer within sandbox \"7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 14:23:36.364757 containerd[1551]: time="2025-05-13T14:23:36.364728139Z" level=info msg="Container 35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:36.386266 containerd[1551]: time="2025-05-13T14:23:36.386202468Z" level=info msg="CreateContainer within sandbox \"7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e\"" May 13 14:23:36.387098 containerd[1551]: time="2025-05-13T14:23:36.387067690Z" level=info msg="StartContainer for \"35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e\"" May 13 14:23:36.389150 containerd[1551]: time="2025-05-13T14:23:36.389113177Z" level=info msg="connecting to shim 35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e" address="unix:///run/containerd/s/a0d227a6010b8a8ac8c9fa12b4b997b58e1f68375a53dc42c760d387bc2c240d" protocol=ttrpc version=3 May 13 14:23:36.422499 systemd[1]: Started cri-containerd-35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e.scope - libcontainer container 35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e. 
May 13 14:23:36.465507 containerd[1551]: time="2025-05-13T14:23:36.465470063Z" level=info msg="StartContainer for \"35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e\" returns successfully" May 13 14:23:37.070944 kubelet[2793]: E0513 14:23:37.069882 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:37.674987 containerd[1551]: time="2025-05-13T14:23:37.674854670Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 14:23:37.678480 systemd[1]: cri-containerd-35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e.scope: Deactivated successfully. May 13 14:23:37.678987 systemd[1]: cri-containerd-35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e.scope: Consumed 694ms CPU time, 171.8M memory peak, 154M written to disk. May 13 14:23:37.687165 containerd[1551]: time="2025-05-13T14:23:37.687097206Z" level=info msg="TaskExit event in podsandbox handler container_id:\"35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e\" id:\"35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e\" pid:3587 exited_at:{seconds:1747146217 nanos:685982941}" May 13 14:23:37.687438 containerd[1551]: time="2025-05-13T14:23:37.687392566Z" level=info msg="received exit event container_id:\"35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e\" id:\"35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e\" pid:3587 exited_at:{seconds:1747146217 nanos:685982941}" May 13 14:23:37.722646 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e-rootfs.mount: Deactivated successfully. May 13 14:23:37.739350 kubelet[2793]: I0513 14:23:37.739280 2793 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 13 14:23:38.353771 systemd[1]: Created slice kubepods-besteffort-podf1d2f603_6e50_45a5_983a_f14916c24ded.slice - libcontainer container kubepods-besteffort-podf1d2f603_6e50_45a5_983a_f14916c24ded.slice. May 13 14:23:38.364053 systemd[1]: Created slice kubepods-burstable-podc330e9e0_2c3a_4bf9_a416_6d9d0dfb79ea.slice - libcontainer container kubepods-burstable-podc330e9e0_2c3a_4bf9_a416_6d9d0dfb79ea.slice. 
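containerd keeps reporting NetworkReady=false (the recurring "cni plugin not initialized" pod_workers entries) until it can load a network config from /etc/cni/net.d; the reload error above fired on a write to /etc/cni/net.d/calico-kubeconfig, which is a kubeconfig rather than a network config, so nothing was loadable yet. The snippet below is a local diagnostic sketch, not containerd code; the directory comes from the log, and the file-extension filter is an assumption that mirrors the loader's usual behaviour.

    import glob
    import json

    # Sketch: list CNI network configs the runtime could load from the directory named
    # in the log. Extension filter (.conf/.conflist/.json) is an assumption; note that
    # calico-kubeconfig is skipped because it matches none of them.
    CNI_DIR = "/etc/cni/net.d"

    for path in sorted(glob.glob(CNI_DIR + "/*.conf") +
                       glob.glob(CNI_DIR + "/*.conflist") +
                       glob.glob(CNI_DIR + "/*.json")):
        try:
            with open(path) as f:
                cfg = json.load(f)
            print(path, "->", cfg.get("name", "<unnamed>"))
        except (OSError, json.JSONDecodeError) as exc:
            print(path, "-> not loadable:", exc)

On a Calico install the file that eventually satisfies this check is typically a 10-calico.conflist written by the install-cni container that just ran (name assumed, not shown in this log).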
May 13 14:23:38.437239 kubelet[2793]: I0513 14:23:38.437135 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcbld\" (UniqueName: \"kubernetes.io/projected/c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea-kube-api-access-kcbld\") pod \"coredns-6f6b679f8f-g9gj9\" (UID: \"c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea\") " pod="kube-system/coredns-6f6b679f8f-g9gj9" May 13 14:23:38.437804 kubelet[2793]: I0513 14:23:38.437260 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvkk\" (UniqueName: \"kubernetes.io/projected/f1d2f603-6e50-45a5-983a-f14916c24ded-kube-api-access-9nvkk\") pod \"calico-kube-controllers-558d4878cb-w59hw\" (UID: \"f1d2f603-6e50-45a5-983a-f14916c24ded\") " pod="calico-system/calico-kube-controllers-558d4878cb-w59hw" May 13 14:23:38.437804 kubelet[2793]: I0513 14:23:38.437316 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d2f603-6e50-45a5-983a-f14916c24ded-tigera-ca-bundle\") pod \"calico-kube-controllers-558d4878cb-w59hw\" (UID: \"f1d2f603-6e50-45a5-983a-f14916c24ded\") " pod="calico-system/calico-kube-controllers-558d4878cb-w59hw" May 13 14:23:38.437804 kubelet[2793]: I0513 14:23:38.437450 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea-config-volume\") pod \"coredns-6f6b679f8f-g9gj9\" (UID: \"c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea\") " pod="kube-system/coredns-6f6b679f8f-g9gj9" May 13 14:23:38.609692 systemd[1]: Created slice kubepods-besteffort-pod293aca7c_7ada_43c0_90e4_5a055324f6a5.slice - libcontainer container kubepods-besteffort-pod293aca7c_7ada_43c0_90e4_5a055324f6a5.slice. May 13 14:23:38.639475 systemd[1]: Created slice kubepods-besteffort-pod39290338_b216_4ae5_9c4f_676b2992736d.slice - libcontainer container kubepods-besteffort-pod39290338_b216_4ae5_9c4f_676b2992736d.slice. May 13 14:23:38.655745 systemd[1]: Created slice kubepods-burstable-pod7ab065bc_8b20_44be_b443_9e6e35fcebd5.slice - libcontainer container kubepods-burstable-pod7ab065bc_8b20_44be_b443_9e6e35fcebd5.slice. 
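The kubepods-*.slice units created above follow the kubelet's systemd cgroup-driver naming: QoS class plus the pod UID with dashes replaced by underscores. The helper below exists only to make that mapping visible; the function name is made up, and the example UID is taken from the coredns-6f6b679f8f-g9gj9 entries above.

    # Illustrative only: reproduce the slice unit names seen in the systemd entries above.
    def pod_slice(qos_class: str, pod_uid: str) -> str:
        return "kubepods-{}-pod{}.slice".format(qos_class, pod_uid.replace("-", "_"))

    print(pod_slice("burstable", "c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea"))
    # -> kubepods-burstable-podc330e9e0_2c3a_4bf9_a416_6d9d0dfb79ea.slice (matches the log)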
May 13 14:23:38.663040 containerd[1551]: time="2025-05-13T14:23:38.662916367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-558d4878cb-w59hw,Uid:f1d2f603-6e50-45a5-983a-f14916c24ded,Namespace:calico-system,Attempt:0,}" May 13 14:23:38.739731 kubelet[2793]: I0513 14:23:38.739693 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ab065bc-8b20-44be-b443-9e6e35fcebd5-config-volume\") pod \"coredns-6f6b679f8f-xc4qq\" (UID: \"7ab065bc-8b20-44be-b443-9e6e35fcebd5\") " pod="kube-system/coredns-6f6b679f8f-xc4qq" May 13 14:23:38.741562 kubelet[2793]: I0513 14:23:38.741403 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl69g\" (UniqueName: \"kubernetes.io/projected/7ab065bc-8b20-44be-b443-9e6e35fcebd5-kube-api-access-tl69g\") pod \"coredns-6f6b679f8f-xc4qq\" (UID: \"7ab065bc-8b20-44be-b443-9e6e35fcebd5\") " pod="kube-system/coredns-6f6b679f8f-xc4qq" May 13 14:23:38.741562 kubelet[2793]: I0513 14:23:38.741441 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/39290338-b216-4ae5-9c4f-676b2992736d-calico-apiserver-certs\") pod \"calico-apiserver-6b4d8b97f4-rpkpt\" (UID: \"39290338-b216-4ae5-9c4f-676b2992736d\") " pod="calico-apiserver/calico-apiserver-6b4d8b97f4-rpkpt" May 13 14:23:38.741562 kubelet[2793]: I0513 14:23:38.741480 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/293aca7c-7ada-43c0-90e4-5a055324f6a5-calico-apiserver-certs\") pod \"calico-apiserver-6b4d8b97f4-w976v\" (UID: \"293aca7c-7ada-43c0-90e4-5a055324f6a5\") " pod="calico-apiserver/calico-apiserver-6b4d8b97f4-w976v" May 13 14:23:38.741562 kubelet[2793]: I0513 14:23:38.741503 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd967\" (UniqueName: \"kubernetes.io/projected/293aca7c-7ada-43c0-90e4-5a055324f6a5-kube-api-access-wd967\") pod \"calico-apiserver-6b4d8b97f4-w976v\" (UID: \"293aca7c-7ada-43c0-90e4-5a055324f6a5\") " pod="calico-apiserver/calico-apiserver-6b4d8b97f4-w976v" May 13 14:23:38.741562 kubelet[2793]: I0513 14:23:38.741527 2793 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66jpw\" (UniqueName: \"kubernetes.io/projected/39290338-b216-4ae5-9c4f-676b2992736d-kube-api-access-66jpw\") pod \"calico-apiserver-6b4d8b97f4-rpkpt\" (UID: \"39290338-b216-4ae5-9c4f-676b2992736d\") " pod="calico-apiserver/calico-apiserver-6b4d8b97f4-rpkpt" May 13 14:23:38.749315 containerd[1551]: time="2025-05-13T14:23:38.749263369Z" level=error msg="Failed to destroy network for sandbox \"17fa67200c68269cda1ab35956b9ba5c79651869022864eabf5f0d20b5e01b5c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:38.751307 systemd[1]: run-netns-cni\x2d70f31681\x2d9458\x2dcd6e\x2d6cc0\x2d5cf7c6a2daa5.mount: Deactivated successfully. 
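Every sandbox failure from here on reduces to the precondition spelled out in the error text itself: the Calico CNI plugin needs /var/lib/calico/nodename, which the calico/node container writes once it is running with /var/lib/calico/ mounted. The check below is a minimal local diagnostic sketch, not part of Calico; the path is taken verbatim from the log.

    import os
    import sys

    # Diagnostic sketch: verify the file the "stat /var/lib/calico/nodename" errors refer to.
    NODENAME = "/var/lib/calico/nodename"

    if os.path.exists(NODENAME):
        with open(NODENAME) as f:
            print("calico nodename:", f.read().strip())
    else:
        sys.exit("missing {} - calico-node is not up on this host yet, so sandboxes "
                 "that need Calico networking cannot be created".format(NODENAME))

Once calico-node is running on the host, the RunPodSandbox retries for the coredns, calico-kube-controllers, and calico-apiserver pods below would be expected to succeed.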
May 13 14:23:38.753902 containerd[1551]: time="2025-05-13T14:23:38.753810589Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-558d4878cb-w59hw,Uid:f1d2f603-6e50-45a5-983a-f14916c24ded,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"17fa67200c68269cda1ab35956b9ba5c79651869022864eabf5f0d20b5e01b5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:38.754793 kubelet[2793]: E0513 14:23:38.754740 2793 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17fa67200c68269cda1ab35956b9ba5c79651869022864eabf5f0d20b5e01b5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:38.754856 kubelet[2793]: E0513 14:23:38.754810 2793 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17fa67200c68269cda1ab35956b9ba5c79651869022864eabf5f0d20b5e01b5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-558d4878cb-w59hw" May 13 14:23:38.754856 kubelet[2793]: E0513 14:23:38.754833 2793 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17fa67200c68269cda1ab35956b9ba5c79651869022864eabf5f0d20b5e01b5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-558d4878cb-w59hw" May 13 14:23:38.754915 kubelet[2793]: E0513 14:23:38.754877 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-558d4878cb-w59hw_calico-system(f1d2f603-6e50-45a5-983a-f14916c24ded)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-558d4878cb-w59hw_calico-system(f1d2f603-6e50-45a5-983a-f14916c24ded)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"17fa67200c68269cda1ab35956b9ba5c79651869022864eabf5f0d20b5e01b5c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-558d4878cb-w59hw" podUID="f1d2f603-6e50-45a5-983a-f14916c24ded" May 13 14:23:38.923918 containerd[1551]: time="2025-05-13T14:23:38.923882151Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b4d8b97f4-w976v,Uid:293aca7c-7ada-43c0-90e4-5a055324f6a5,Namespace:calico-apiserver,Attempt:0,}" May 13 14:23:38.951059 containerd[1551]: time="2025-05-13T14:23:38.951021286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b4d8b97f4-rpkpt,Uid:39290338-b216-4ae5-9c4f-676b2992736d,Namespace:calico-apiserver,Attempt:0,}" May 13 14:23:38.971629 containerd[1551]: time="2025-05-13T14:23:38.971522037Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-6f6b679f8f-g9gj9,Uid:c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea,Namespace:kube-system,Attempt:0,}" May 13 14:23:38.974538 containerd[1551]: time="2025-05-13T14:23:38.974501005Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xc4qq,Uid:7ab065bc-8b20-44be-b443-9e6e35fcebd5,Namespace:kube-system,Attempt:0,}" May 13 14:23:38.985833 containerd[1551]: time="2025-05-13T14:23:38.985792424Z" level=error msg="Failed to destroy network for sandbox \"9a5896cccdc2d1d9404a50675ccf89714c9208fc36ce081a0219763c4b96e9a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.008434 containerd[1551]: time="2025-05-13T14:23:39.008338396Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b4d8b97f4-w976v,Uid:293aca7c-7ada-43c0-90e4-5a055324f6a5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a5896cccdc2d1d9404a50675ccf89714c9208fc36ce081a0219763c4b96e9a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.008749 kubelet[2793]: E0513 14:23:39.008554 2793 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a5896cccdc2d1d9404a50675ccf89714c9208fc36ce081a0219763c4b96e9a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.008749 kubelet[2793]: E0513 14:23:39.008606 2793 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a5896cccdc2d1d9404a50675ccf89714c9208fc36ce081a0219763c4b96e9a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b4d8b97f4-w976v" May 13 14:23:39.008749 kubelet[2793]: E0513 14:23:39.008628 2793 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a5896cccdc2d1d9404a50675ccf89714c9208fc36ce081a0219763c4b96e9a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b4d8b97f4-w976v" May 13 14:23:39.008881 kubelet[2793]: E0513 14:23:39.008677 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b4d8b97f4-w976v_calico-apiserver(293aca7c-7ada-43c0-90e4-5a055324f6a5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b4d8b97f4-w976v_calico-apiserver(293aca7c-7ada-43c0-90e4-5a055324f6a5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9a5896cccdc2d1d9404a50675ccf89714c9208fc36ce081a0219763c4b96e9a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-6b4d8b97f4-w976v" podUID="293aca7c-7ada-43c0-90e4-5a055324f6a5" May 13 14:23:39.051343 containerd[1551]: time="2025-05-13T14:23:39.051242005Z" level=error msg="Failed to destroy network for sandbox \"fcc9d7a8c8722c6e5fb806c8c16ff219dc91b84f48f8ccededd9237796e98873\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.053271 containerd[1551]: time="2025-05-13T14:23:39.053145830Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b4d8b97f4-rpkpt,Uid:39290338-b216-4ae5-9c4f-676b2992736d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcc9d7a8c8722c6e5fb806c8c16ff219dc91b84f48f8ccededd9237796e98873\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.053928 kubelet[2793]: E0513 14:23:39.053867 2793 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcc9d7a8c8722c6e5fb806c8c16ff219dc91b84f48f8ccededd9237796e98873\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.054006 kubelet[2793]: E0513 14:23:39.053929 2793 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcc9d7a8c8722c6e5fb806c8c16ff219dc91b84f48f8ccededd9237796e98873\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b4d8b97f4-rpkpt" May 13 14:23:39.054006 kubelet[2793]: E0513 14:23:39.053954 2793 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcc9d7a8c8722c6e5fb806c8c16ff219dc91b84f48f8ccededd9237796e98873\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b4d8b97f4-rpkpt" May 13 14:23:39.054077 kubelet[2793]: E0513 14:23:39.053996 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b4d8b97f4-rpkpt_calico-apiserver(39290338-b216-4ae5-9c4f-676b2992736d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b4d8b97f4-rpkpt_calico-apiserver(39290338-b216-4ae5-9c4f-676b2992736d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fcc9d7a8c8722c6e5fb806c8c16ff219dc91b84f48f8ccededd9237796e98873\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b4d8b97f4-rpkpt" podUID="39290338-b216-4ae5-9c4f-676b2992736d" May 13 14:23:39.073097 containerd[1551]: time="2025-05-13T14:23:39.072970910Z" level=error msg="Failed to destroy network for sandbox 
\"584d0d4df4971f6cfaea02ac495a95ae9c1676eb8096ecf02ee8898559efbc90\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.074815 containerd[1551]: time="2025-05-13T14:23:39.074785410Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-g9gj9,Uid:c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"584d0d4df4971f6cfaea02ac495a95ae9c1676eb8096ecf02ee8898559efbc90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.075183 kubelet[2793]: E0513 14:23:39.075157 2793 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"584d0d4df4971f6cfaea02ac495a95ae9c1676eb8096ecf02ee8898559efbc90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.075488 kubelet[2793]: E0513 14:23:39.075433 2793 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"584d0d4df4971f6cfaea02ac495a95ae9c1676eb8096ecf02ee8898559efbc90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-g9gj9" May 13 14:23:39.075488 kubelet[2793]: E0513 14:23:39.075459 2793 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"584d0d4df4971f6cfaea02ac495a95ae9c1676eb8096ecf02ee8898559efbc90\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-g9gj9" May 13 14:23:39.075746 kubelet[2793]: E0513 14:23:39.075699 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-g9gj9_kube-system(c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-g9gj9_kube-system(c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"584d0d4df4971f6cfaea02ac495a95ae9c1676eb8096ecf02ee8898559efbc90\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-g9gj9" podUID="c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea" May 13 14:23:39.080287 systemd[1]: Created slice kubepods-besteffort-podb34a8f80_a3ed_4e61_8b2d_450b9d0e1edb.slice - libcontainer container kubepods-besteffort-podb34a8f80_a3ed_4e61_8b2d_450b9d0e1edb.slice. 
May 13 14:23:39.083791 containerd[1551]: time="2025-05-13T14:23:39.083761149Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g62pr,Uid:b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb,Namespace:calico-system,Attempt:0,}" May 13 14:23:39.090584 containerd[1551]: time="2025-05-13T14:23:39.090457502Z" level=error msg="Failed to destroy network for sandbox \"b910a0d4c96ce18ef901e5bb2cf0c8f049f0812e6c85d856743709d3c9bb76e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.092606 containerd[1551]: time="2025-05-13T14:23:39.092505708Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xc4qq,Uid:7ab065bc-8b20-44be-b443-9e6e35fcebd5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b910a0d4c96ce18ef901e5bb2cf0c8f049f0812e6c85d856743709d3c9bb76e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.094380 kubelet[2793]: E0513 14:23:39.093571 2793 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b910a0d4c96ce18ef901e5bb2cf0c8f049f0812e6c85d856743709d3c9bb76e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.094380 kubelet[2793]: E0513 14:23:39.093625 2793 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b910a0d4c96ce18ef901e5bb2cf0c8f049f0812e6c85d856743709d3c9bb76e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xc4qq" May 13 14:23:39.094380 kubelet[2793]: E0513 14:23:39.093658 2793 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b910a0d4c96ce18ef901e5bb2cf0c8f049f0812e6c85d856743709d3c9bb76e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xc4qq" May 13 14:23:39.094510 kubelet[2793]: E0513 14:23:39.093702 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xc4qq_kube-system(7ab065bc-8b20-44be-b443-9e6e35fcebd5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xc4qq_kube-system(7ab065bc-8b20-44be-b443-9e6e35fcebd5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b910a0d4c96ce18ef901e5bb2cf0c8f049f0812e6c85d856743709d3c9bb76e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xc4qq" podUID="7ab065bc-8b20-44be-b443-9e6e35fcebd5" May 13 14:23:39.138218 containerd[1551]: time="2025-05-13T14:23:39.138178174Z" level=error msg="Failed to destroy network for 
sandbox \"aa0bc14113346fcdd10a699b239300e978bba328952e146a94b05dc35b4f83b3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.140390 containerd[1551]: time="2025-05-13T14:23:39.140162721Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g62pr,Uid:b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa0bc14113346fcdd10a699b239300e978bba328952e146a94b05dc35b4f83b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.140558 kubelet[2793]: E0513 14:23:39.140345 2793 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa0bc14113346fcdd10a699b239300e978bba328952e146a94b05dc35b4f83b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 14:23:39.140558 kubelet[2793]: E0513 14:23:39.140415 2793 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa0bc14113346fcdd10a699b239300e978bba328952e146a94b05dc35b4f83b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-g62pr" May 13 14:23:39.140558 kubelet[2793]: E0513 14:23:39.140434 2793 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa0bc14113346fcdd10a699b239300e978bba328952e146a94b05dc35b4f83b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-g62pr" May 13 14:23:39.140658 kubelet[2793]: E0513 14:23:39.140478 2793 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-g62pr_calico-system(b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-g62pr_calico-system(b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa0bc14113346fcdd10a699b239300e978bba328952e146a94b05dc35b4f83b3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-g62pr" podUID="b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb" May 13 14:23:39.287510 containerd[1551]: time="2025-05-13T14:23:39.287177230Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 14:23:42.025745 kubelet[2793]: I0513 14:23:42.024670 2793 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 14:23:47.779202 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3773197437.mount: Deactivated successfully. 
May 13 14:23:47.826060 containerd[1551]: time="2025-05-13T14:23:47.825847812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:47.827339 containerd[1551]: time="2025-05-13T14:23:47.827312453Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 13 14:23:47.828539 containerd[1551]: time="2025-05-13T14:23:47.828497033Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:47.831958 containerd[1551]: time="2025-05-13T14:23:47.831771679Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:23:47.832742 containerd[1551]: time="2025-05-13T14:23:47.832703488Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 8.545419629s" May 13 14:23:47.832791 containerd[1551]: time="2025-05-13T14:23:47.832734745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 13 14:23:47.845232 containerd[1551]: time="2025-05-13T14:23:47.845163496Z" level=info msg="CreateContainer within sandbox \"7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 14:23:47.859982 containerd[1551]: time="2025-05-13T14:23:47.859945854Z" level=info msg="Container c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:47.875399 containerd[1551]: time="2025-05-13T14:23:47.875339814Z" level=info msg="CreateContainer within sandbox \"7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\"" May 13 14:23:47.877122 containerd[1551]: time="2025-05-13T14:23:47.875980810Z" level=info msg="StartContainer for \"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\"" May 13 14:23:47.877658 containerd[1551]: time="2025-05-13T14:23:47.877629965Z" level=info msg="connecting to shim c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332" address="unix:///run/containerd/s/a0d227a6010b8a8ac8c9fa12b4b997b58e1f68375a53dc42c760d387bc2c240d" protocol=ttrpc version=3 May 13 14:23:47.898511 systemd[1]: Started cri-containerd-c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332.scope - libcontainer container c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332. May 13 14:23:47.944130 containerd[1551]: time="2025-05-13T14:23:47.944037977Z" level=info msg="StartContainer for \"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" returns successfully" May 13 14:23:48.008274 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 13 14:23:48.008369 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. May 13 14:23:48.339532 kubelet[2793]: I0513 14:23:48.338235 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-4lklq" podStartSLOduration=1.421632395 podStartE2EDuration="27.338073s" podCreationTimestamp="2025-05-13 14:23:21 +0000 UTC" firstStartedPulling="2025-05-13 14:23:21.916905381 +0000 UTC m=+11.955927297" lastFinishedPulling="2025-05-13 14:23:47.833345986 +0000 UTC m=+37.872367902" observedRunningTime="2025-05-13 14:23:48.33725308 +0000 UTC m=+38.376274996" watchObservedRunningTime="2025-05-13 14:23:48.338073 +0000 UTC m=+38.377094916" May 13 14:23:48.380347 containerd[1551]: time="2025-05-13T14:23:48.380277341Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"f1704f877403a976b1ff8ac18ed16a2a6f4c932616f21c77fc6c9762b1b08605\" pid:3885 exit_status:1 exited_at:{seconds:1747146228 nanos:379796594}" May 13 14:23:49.613339 containerd[1551]: time="2025-05-13T14:23:49.613299590Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"f86453321e9d58b5f92e6aa1edc50501d63b5f68bc6a886103a2deea1171dfbd\" pid:3949 exit_status:1 exited_at:{seconds:1747146229 nanos:612935981}" May 13 14:23:49.959484 systemd-networkd[1445]: vxlan.calico: Link UP May 13 14:23:49.960091 systemd-networkd[1445]: vxlan.calico: Gained carrier May 13 14:23:51.071458 containerd[1551]: time="2025-05-13T14:23:51.071304482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b4d8b97f4-rpkpt,Uid:39290338-b216-4ae5-9c4f-676b2992736d,Namespace:calico-apiserver,Attempt:0,}" May 13 14:23:51.325701 systemd-networkd[1445]: vxlan.calico: Gained IPv6LL May 13 14:23:51.358519 systemd-networkd[1445]: calic97c2e09d0e: Link UP May 13 14:23:51.359847 systemd-networkd[1445]: calic97c2e09d0e: Gained carrier May 13 14:23:51.393133 containerd[1551]: 2025-05-13 14:23:51.191 [INFO][4108] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0 calico-apiserver-6b4d8b97f4- calico-apiserver 39290338-b216-4ae5-9c4f-676b2992736d 683 0 2025-05-13 14:23:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b4d8b97f4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-9-100-9699b4e791.novalocal calico-apiserver-6b4d8b97f4-rpkpt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic97c2e09d0e [] []}} ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-rpkpt" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-" May 13 14:23:51.393133 containerd[1551]: 2025-05-13 14:23:51.191 [INFO][4108] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-rpkpt" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" May 13 14:23:51.393133 containerd[1551]: 2025-05-13 14:23:51.252 [INFO][4119] ipam/ipam_plugin.go 225: Calico CNI IPAM 
request count IPv4=1 IPv6=0 ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" HandleID="k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.266 [INFO][4119] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" HandleID="k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051bd0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-9-100-9699b4e791.novalocal", "pod":"calico-apiserver-6b4d8b97f4-rpkpt", "timestamp":"2025-05-13 14:23:51.252943732 +0000 UTC"}, Hostname:"ci-9999-9-100-9699b4e791.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.266 [INFO][4119] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.266 [INFO][4119] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.266 [INFO][4119] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-9699b4e791.novalocal' May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.269 [INFO][4119] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.275 [INFO][4119] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.281 [INFO][4119] ipam/ipam.go 489: Trying affinity for 192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.284 [INFO][4119] ipam/ipam.go 155: Attempting to load block cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.393336 containerd[1551]: 2025-05-13 14:23:51.288 [INFO][4119] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.394820 containerd[1551]: 2025-05-13 14:23:51.288 [INFO][4119] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.21.0/26 handle="k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.394820 containerd[1551]: 2025-05-13 14:23:51.292 [INFO][4119] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f May 13 14:23:51.394820 containerd[1551]: 2025-05-13 14:23:51.320 [INFO][4119] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.21.0/26 handle="k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.394820 containerd[1551]: 2025-05-13 14:23:51.349 [INFO][4119] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.21.1/26] block=192.168.21.0/26 handle="k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.394820 containerd[1551]: 2025-05-13 14:23:51.349 [INFO][4119] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.21.1/26] handle="k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:51.394820 containerd[1551]: 2025-05-13 14:23:51.349 [INFO][4119] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 14:23:51.394820 containerd[1551]: 2025-05-13 14:23:51.349 [INFO][4119] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.1/26] IPv6=[] ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" HandleID="k8s-pod-network.5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" May 13 14:23:51.394976 containerd[1551]: 2025-05-13 14:23:51.354 [INFO][4108] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-rpkpt" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0", GenerateName:"calico-apiserver-6b4d8b97f4-", Namespace:"calico-apiserver", SelfLink:"", UID:"39290338-b216-4ae5-9c4f-676b2992736d", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b4d8b97f4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"", Pod:"calico-apiserver-6b4d8b97f4-rpkpt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic97c2e09d0e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:51.395042 containerd[1551]: 2025-05-13 14:23:51.354 [INFO][4108] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.21.1/32] ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-rpkpt" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" May 13 14:23:51.395042 containerd[1551]: 2025-05-13 14:23:51.354 [INFO][4108] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic97c2e09d0e 
ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-rpkpt" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" May 13 14:23:51.395042 containerd[1551]: 2025-05-13 14:23:51.357 [INFO][4108] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-rpkpt" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" May 13 14:23:51.395111 containerd[1551]: 2025-05-13 14:23:51.358 [INFO][4108] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-rpkpt" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0", GenerateName:"calico-apiserver-6b4d8b97f4-", Namespace:"calico-apiserver", SelfLink:"", UID:"39290338-b216-4ae5-9c4f-676b2992736d", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b4d8b97f4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f", Pod:"calico-apiserver-6b4d8b97f4-rpkpt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic97c2e09d0e", MAC:"1a:3a:ba:d6:b6:b4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:51.395172 containerd[1551]: 2025-05-13 14:23:51.390 [INFO][4108] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-rpkpt" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--rpkpt-eth0" May 13 14:23:51.842395 containerd[1551]: time="2025-05-13T14:23:51.842156221Z" level=info msg="connecting to shim 5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f" address="unix:///run/containerd/s/2ea99c8f03e45079d9ab730c353515d1cd92cff9055d12c0ddd026b62bfda2c5" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:51.930537 systemd[1]: Started cri-containerd-5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f.scope - libcontainer container 
5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f. May 13 14:23:51.989191 containerd[1551]: time="2025-05-13T14:23:51.989158100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b4d8b97f4-rpkpt,Uid:39290338-b216-4ae5-9c4f-676b2992736d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f\"" May 13 14:23:51.991438 containerd[1551]: time="2025-05-13T14:23:51.991309534Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 14:23:52.070853 containerd[1551]: time="2025-05-13T14:23:52.070630177Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-558d4878cb-w59hw,Uid:f1d2f603-6e50-45a5-983a-f14916c24ded,Namespace:calico-system,Attempt:0,}" May 13 14:23:52.210511 systemd-networkd[1445]: calic7d0d4205ab: Link UP May 13 14:23:52.210679 systemd-networkd[1445]: calic7d0d4205ab: Gained carrier May 13 14:23:52.227786 containerd[1551]: 2025-05-13 14:23:52.114 [INFO][4188] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0 calico-kube-controllers-558d4878cb- calico-system f1d2f603-6e50-45a5-983a-f14916c24ded 677 0 2025-05-13 14:23:21 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:558d4878cb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-9-100-9699b4e791.novalocal calico-kube-controllers-558d4878cb-w59hw eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic7d0d4205ab [] []}} ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Namespace="calico-system" Pod="calico-kube-controllers-558d4878cb-w59hw" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-" May 13 14:23:52.227786 containerd[1551]: 2025-05-13 14:23:52.114 [INFO][4188] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Namespace="calico-system" Pod="calico-kube-controllers-558d4878cb-w59hw" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" May 13 14:23:52.227786 containerd[1551]: 2025-05-13 14:23:52.164 [INFO][4201] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" HandleID="k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.175 [INFO][4201] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" HandleID="k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003345e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-9-100-9699b4e791.novalocal", "pod":"calico-kube-controllers-558d4878cb-w59hw", 
"timestamp":"2025-05-13 14:23:52.164483172 +0000 UTC"}, Hostname:"ci-9999-9-100-9699b4e791.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.175 [INFO][4201] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.175 [INFO][4201] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.175 [INFO][4201] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-9699b4e791.novalocal' May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.177 [INFO][4201] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.182 [INFO][4201] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.187 [INFO][4201] ipam/ipam.go 489: Trying affinity for 192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.189 [INFO][4201] ipam/ipam.go 155: Attempting to load block cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.228186 containerd[1551]: 2025-05-13 14:23:52.192 [INFO][4201] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.229030 containerd[1551]: 2025-05-13 14:23:52.192 [INFO][4201] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.21.0/26 handle="k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.229030 containerd[1551]: 2025-05-13 14:23:52.194 [INFO][4201] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414 May 13 14:23:52.229030 containerd[1551]: 2025-05-13 14:23:52.198 [INFO][4201] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.21.0/26 handle="k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.229030 containerd[1551]: 2025-05-13 14:23:52.205 [INFO][4201] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.21.2/26] block=192.168.21.0/26 handle="k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.229030 containerd[1551]: 2025-05-13 14:23:52.205 [INFO][4201] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.21.2/26] handle="k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:52.229030 containerd[1551]: 2025-05-13 14:23:52.205 [INFO][4201] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 14:23:52.229030 containerd[1551]: 2025-05-13 14:23:52.206 [INFO][4201] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.2/26] IPv6=[] ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" HandleID="k8s-pod-network.71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" May 13 14:23:52.229539 containerd[1551]: 2025-05-13 14:23:52.208 [INFO][4188] cni-plugin/k8s.go 386: Populated endpoint ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Namespace="calico-system" Pod="calico-kube-controllers-558d4878cb-w59hw" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0", GenerateName:"calico-kube-controllers-558d4878cb-", Namespace:"calico-system", SelfLink:"", UID:"f1d2f603-6e50-45a5-983a-f14916c24ded", ResourceVersion:"677", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"558d4878cb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"", Pod:"calico-kube-controllers-558d4878cb-w59hw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic7d0d4205ab", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:52.229697 containerd[1551]: 2025-05-13 14:23:52.208 [INFO][4188] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.21.2/32] ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Namespace="calico-system" Pod="calico-kube-controllers-558d4878cb-w59hw" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" May 13 14:23:52.229697 containerd[1551]: 2025-05-13 14:23:52.208 [INFO][4188] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic7d0d4205ab ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Namespace="calico-system" Pod="calico-kube-controllers-558d4878cb-w59hw" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" May 13 14:23:52.229697 containerd[1551]: 2025-05-13 14:23:52.210 [INFO][4188] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Namespace="calico-system" Pod="calico-kube-controllers-558d4878cb-w59hw" 
WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" May 13 14:23:52.229826 containerd[1551]: 2025-05-13 14:23:52.211 [INFO][4188] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Namespace="calico-system" Pod="calico-kube-controllers-558d4878cb-w59hw" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0", GenerateName:"calico-kube-controllers-558d4878cb-", Namespace:"calico-system", SelfLink:"", UID:"f1d2f603-6e50-45a5-983a-f14916c24ded", ResourceVersion:"677", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"558d4878cb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414", Pod:"calico-kube-controllers-558d4878cb-w59hw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic7d0d4205ab", MAC:"4a:39:12:ba:1b:d3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:52.229970 containerd[1551]: 2025-05-13 14:23:52.224 [INFO][4188] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" Namespace="calico-system" Pod="calico-kube-controllers-558d4878cb-w59hw" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--kube--controllers--558d4878cb--w59hw-eth0" May 13 14:23:52.269577 containerd[1551]: time="2025-05-13T14:23:52.269439287Z" level=info msg="connecting to shim 71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414" address="unix:///run/containerd/s/ba7d5b06294cb04f60f865fdbb4dae7ad53913c0d9f90fe1297c8b3c95bbcd62" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:52.305661 systemd[1]: Started cri-containerd-71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414.scope - libcontainer container 71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414. 
May 13 14:23:52.374604 containerd[1551]: time="2025-05-13T14:23:52.374533680Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-558d4878cb-w59hw,Uid:f1d2f603-6e50-45a5-983a-f14916c24ded,Namespace:calico-system,Attempt:0,} returns sandbox id \"71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414\"" May 13 14:23:52.413531 systemd-networkd[1445]: calic97c2e09d0e: Gained IPv6LL May 13 14:23:52.606455 containerd[1551]: time="2025-05-13T14:23:52.606213855Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"4c88e0ca1494fccc6f9a30bc284ed7338842cf6e376d9eaaf6c5452218f66ee2\" pid:4276 exited_at:{seconds:1747146232 nanos:605921098}" May 13 14:23:53.071404 containerd[1551]: time="2025-05-13T14:23:53.071254471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g62pr,Uid:b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb,Namespace:calico-system,Attempt:0,}" May 13 14:23:53.072475 containerd[1551]: time="2025-05-13T14:23:53.071805580Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-g9gj9,Uid:c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea,Namespace:kube-system,Attempt:0,}" May 13 14:23:53.276974 systemd-networkd[1445]: calie05ae6471a6: Link UP May 13 14:23:53.277181 systemd-networkd[1445]: calie05ae6471a6: Gained carrier May 13 14:23:53.296702 containerd[1551]: 2025-05-13 14:23:53.169 [INFO][4288] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0 csi-node-driver- calico-system b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb 582 0 2025-05-13 14:23:21 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-9999-9-100-9699b4e791.novalocal csi-node-driver-g62pr eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie05ae6471a6 [] []}} ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Namespace="calico-system" Pod="csi-node-driver-g62pr" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-" May 13 14:23:53.296702 containerd[1551]: 2025-05-13 14:23:53.170 [INFO][4288] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Namespace="calico-system" Pod="csi-node-driver-g62pr" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" May 13 14:23:53.296702 containerd[1551]: 2025-05-13 14:23:53.212 [INFO][4315] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" HandleID="k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.229 [INFO][4315] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" HandleID="k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" 
Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00011a170), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-9-100-9699b4e791.novalocal", "pod":"csi-node-driver-g62pr", "timestamp":"2025-05-13 14:23:53.212891559 +0000 UTC"}, Hostname:"ci-9999-9-100-9699b4e791.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.230 [INFO][4315] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.230 [INFO][4315] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.230 [INFO][4315] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-9699b4e791.novalocal' May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.232 [INFO][4315] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.240 [INFO][4315] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.248 [INFO][4315] ipam/ipam.go 489: Trying affinity for 192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.250 [INFO][4315] ipam/ipam.go 155: Attempting to load block cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297088 containerd[1551]: 2025-05-13 14:23:53.254 [INFO][4315] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297766 containerd[1551]: 2025-05-13 14:23:53.254 [INFO][4315] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.21.0/26 handle="k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297766 containerd[1551]: 2025-05-13 14:23:53.256 [INFO][4315] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68 May 13 14:23:53.297766 containerd[1551]: 2025-05-13 14:23:53.262 [INFO][4315] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.21.0/26 handle="k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297766 containerd[1551]: 2025-05-13 14:23:53.270 [INFO][4315] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.21.3/26] block=192.168.21.0/26 handle="k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297766 containerd[1551]: 2025-05-13 14:23:53.270 [INFO][4315] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.21.3/26] handle="k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.297766 containerd[1551]: 2025-05-13 14:23:53.270 [INFO][4315] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 14:23:53.297766 containerd[1551]: 2025-05-13 14:23:53.271 [INFO][4315] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.3/26] IPv6=[] ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" HandleID="k8s-pod-network.11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" May 13 14:23:53.298113 containerd[1551]: 2025-05-13 14:23:53.273 [INFO][4288] cni-plugin/k8s.go 386: Populated endpoint ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Namespace="calico-system" Pod="csi-node-driver-g62pr" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb", ResourceVersion:"582", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"", Pod:"csi-node-driver-g62pr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie05ae6471a6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:53.298226 containerd[1551]: 2025-05-13 14:23:53.273 [INFO][4288] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.21.3/32] ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Namespace="calico-system" Pod="csi-node-driver-g62pr" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" May 13 14:23:53.298226 containerd[1551]: 2025-05-13 14:23:53.274 [INFO][4288] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie05ae6471a6 ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Namespace="calico-system" Pod="csi-node-driver-g62pr" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" May 13 14:23:53.298226 containerd[1551]: 2025-05-13 14:23:53.277 [INFO][4288] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Namespace="calico-system" Pod="csi-node-driver-g62pr" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" May 13 14:23:53.298489 containerd[1551]: 2025-05-13 14:23:53.278 [INFO][4288] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Namespace="calico-system" Pod="csi-node-driver-g62pr" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb", ResourceVersion:"582", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68", Pod:"csi-node-driver-g62pr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie05ae6471a6", MAC:"02:2e:1b:a0:c5:08", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:53.298619 containerd[1551]: 2025-05-13 14:23:53.294 [INFO][4288] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" Namespace="calico-system" Pod="csi-node-driver-g62pr" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-csi--node--driver--g62pr-eth0" May 13 14:23:53.328598 containerd[1551]: time="2025-05-13T14:23:53.328477525Z" level=info msg="connecting to shim 11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68" address="unix:///run/containerd/s/4bbe3e29c5b61152dabe88686d9d7627b5659ba090fe7ec7629ccd5f972706bb" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:53.370543 systemd[1]: Started cri-containerd-11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68.scope - libcontainer container 11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68. 
May 13 14:23:53.391600 systemd-networkd[1445]: cali232df9cf724: Link UP May 13 14:23:53.392780 systemd-networkd[1445]: cali232df9cf724: Gained carrier May 13 14:23:53.414950 containerd[1551]: 2025-05-13 14:23:53.181 [INFO][4290] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0 coredns-6f6b679f8f- kube-system c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea 680 0 2025-05-13 14:23:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-9-100-9699b4e791.novalocal coredns-6f6b679f8f-g9gj9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali232df9cf724 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Namespace="kube-system" Pod="coredns-6f6b679f8f-g9gj9" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-" May 13 14:23:53.414950 containerd[1551]: 2025-05-13 14:23:53.182 [INFO][4290] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Namespace="kube-system" Pod="coredns-6f6b679f8f-g9gj9" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" May 13 14:23:53.414950 containerd[1551]: 2025-05-13 14:23:53.225 [INFO][4320] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" HandleID="k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.249 [INFO][4320] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" HandleID="k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000313440), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-9-100-9699b4e791.novalocal", "pod":"coredns-6f6b679f8f-g9gj9", "timestamp":"2025-05-13 14:23:53.225376821 +0000 UTC"}, Hostname:"ci-9999-9-100-9699b4e791.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.249 [INFO][4320] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.270 [INFO][4320] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.270 [INFO][4320] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-9699b4e791.novalocal' May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.340 [INFO][4320] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.349 [INFO][4320] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.357 [INFO][4320] ipam/ipam.go 489: Trying affinity for 192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.359 [INFO][4320] ipam/ipam.go 155: Attempting to load block cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415160 containerd[1551]: 2025-05-13 14:23:53.364 [INFO][4320] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415768 containerd[1551]: 2025-05-13 14:23:53.364 [INFO][4320] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.21.0/26 handle="k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415768 containerd[1551]: 2025-05-13 14:23:53.368 [INFO][4320] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75 May 13 14:23:53.415768 containerd[1551]: 2025-05-13 14:23:53.373 [INFO][4320] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.21.0/26 handle="k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415768 containerd[1551]: 2025-05-13 14:23:53.384 [INFO][4320] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.21.4/26] block=192.168.21.0/26 handle="k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415768 containerd[1551]: 2025-05-13 14:23:53.384 [INFO][4320] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.21.4/26] handle="k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:53.415768 containerd[1551]: 2025-05-13 14:23:53.384 [INFO][4320] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
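The Calico lines embedded in containerd's output share a fixed shape: timestamp, [LEVEL][id], source file and line, then the message. A small sketch of splitting one such line into fields with a regular expression; the field labels are mine for illustration, not anything Calico defines.

```go
// Sketch: parse one Calico CNI/IPAM log line of the form
//   2025-05-13 14:23:53.270 [INFO][4320] ipam/ipam.go 107: Auto-assign 1 ipv4, ...
package main

import (
	"fmt"
	"regexp"
)

var calicoLine = regexp.MustCompile(
	`^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) \[(\w+)\]\[(\d+)\] (\S+) (\d+): (.*)$`)

func main() {
	line := `2025-05-13 14:23:53.270 [INFO][4320] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-9699b4e791.novalocal'`
	m := calicoLine.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Println("time:  ", m[1])
	fmt.Println("level: ", m[2])
	fmt.Println("id:    ", m[3]) // the second bracketed number, e.g. 4320
	fmt.Println("source:", m[4], "line", m[5])
	fmt.Println("msg:   ", m[6])
}
```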
May 13 14:23:53.415768 containerd[1551]: 2025-05-13 14:23:53.384 [INFO][4320] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.4/26] IPv6=[] ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" HandleID="k8s-pod-network.783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" May 13 14:23:53.415936 containerd[1551]: 2025-05-13 14:23:53.387 [INFO][4290] cni-plugin/k8s.go 386: Populated endpoint ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Namespace="kube-system" Pod="coredns-6f6b679f8f-g9gj9" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"", Pod:"coredns-6f6b679f8f-g9gj9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali232df9cf724", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:53.415936 containerd[1551]: 2025-05-13 14:23:53.387 [INFO][4290] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.21.4/32] ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Namespace="kube-system" Pod="coredns-6f6b679f8f-g9gj9" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" May 13 14:23:53.415936 containerd[1551]: 2025-05-13 14:23:53.387 [INFO][4290] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali232df9cf724 ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Namespace="kube-system" Pod="coredns-6f6b679f8f-g9gj9" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" May 13 14:23:53.415936 containerd[1551]: 2025-05-13 14:23:53.391 [INFO][4290] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Namespace="kube-system" 
Pod="coredns-6f6b679f8f-g9gj9" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" May 13 14:23:53.415936 containerd[1551]: 2025-05-13 14:23:53.392 [INFO][4290] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Namespace="kube-system" Pod="coredns-6f6b679f8f-g9gj9" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75", Pod:"coredns-6f6b679f8f-g9gj9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali232df9cf724", MAC:"12:43:1c:a2:94:ab", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:53.415936 containerd[1551]: 2025-05-13 14:23:53.408 [INFO][4290] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" Namespace="kube-system" Pod="coredns-6f6b679f8f-g9gj9" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--g9gj9-eth0" May 13 14:23:53.424547 containerd[1551]: time="2025-05-13T14:23:53.424468892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g62pr,Uid:b34a8f80-a3ed-4e61-8b2d-450b9d0e1edb,Namespace:calico-system,Attempt:0,} returns sandbox id \"11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68\"" May 13 14:23:53.456931 containerd[1551]: time="2025-05-13T14:23:53.456883659Z" level=info msg="connecting to shim 783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75" address="unix:///run/containerd/s/e74913375f4177b976d4e7618e601d4ccb1257504bbf1446f89ae36a61d8d17a" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:53.481519 systemd[1]: Started cri-containerd-783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75.scope - libcontainer container 
783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75. May 13 14:23:53.528178 containerd[1551]: time="2025-05-13T14:23:53.528133579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-g9gj9,Uid:c330e9e0-2c3a-4bf9-a416-6d9d0dfb79ea,Namespace:kube-system,Attempt:0,} returns sandbox id \"783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75\"" May 13 14:23:53.531780 containerd[1551]: time="2025-05-13T14:23:53.531758384Z" level=info msg="CreateContainer within sandbox \"783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 14:23:53.547210 containerd[1551]: time="2025-05-13T14:23:53.547126536Z" level=info msg="Container 10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:53.559626 containerd[1551]: time="2025-05-13T14:23:53.559535497Z" level=info msg="CreateContainer within sandbox \"783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e\"" May 13 14:23:53.560521 containerd[1551]: time="2025-05-13T14:23:53.560476903Z" level=info msg="StartContainer for \"10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e\"" May 13 14:23:53.561827 containerd[1551]: time="2025-05-13T14:23:53.561766911Z" level=info msg="connecting to shim 10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e" address="unix:///run/containerd/s/e74913375f4177b976d4e7618e601d4ccb1257504bbf1446f89ae36a61d8d17a" protocol=ttrpc version=3 May 13 14:23:53.581496 systemd[1]: Started cri-containerd-10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e.scope - libcontainer container 10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e. 
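The containerd timestamps are RFC 3339 with nanoseconds, so gaps between related events can be computed directly. A sketch using the coredns sandbox's "connecting to shim" and "returns sandbox id" times copied from the log (roughly 71 ms apart); only the subtraction is added.

```go
// Sketch: elapsed time between two containerd events, timestamps taken from the log.
package main

import (
	"fmt"
	"time"
)

func main() {
	start, _ := time.Parse(time.RFC3339Nano, "2025-05-13T14:23:53.456883659Z") // connecting to shim
	done, _ := time.Parse(time.RFC3339Nano, "2025-05-13T14:23:53.528133579Z")  // returns sandbox id
	fmt.Println("sandbox setup took", done.Sub(start)) // ~71ms
}
```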
May 13 14:23:53.613751 containerd[1551]: time="2025-05-13T14:23:53.613689488Z" level=info msg="StartContainer for \"10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e\" returns successfully" May 13 14:23:54.072200 containerd[1551]: time="2025-05-13T14:23:54.072108233Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xc4qq,Uid:7ab065bc-8b20-44be-b443-9e6e35fcebd5,Namespace:kube-system,Attempt:0,}" May 13 14:23:54.072908 containerd[1551]: time="2025-05-13T14:23:54.072881386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b4d8b97f4-w976v,Uid:293aca7c-7ada-43c0-90e4-5a055324f6a5,Namespace:calico-apiserver,Attempt:0,}" May 13 14:23:54.141618 systemd-networkd[1445]: calic7d0d4205ab: Gained IPv6LL May 13 14:23:54.245927 systemd-networkd[1445]: calie45151fda29: Link UP May 13 14:23:54.246598 systemd-networkd[1445]: calie45151fda29: Gained carrier May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.138 [INFO][4477] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0 coredns-6f6b679f8f- kube-system 7ab065bc-8b20-44be-b443-9e6e35fcebd5 682 0 2025-05-13 14:23:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-9-100-9699b4e791.novalocal coredns-6f6b679f8f-xc4qq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie45151fda29 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Namespace="kube-system" Pod="coredns-6f6b679f8f-xc4qq" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.139 [INFO][4477] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Namespace="kube-system" Pod="coredns-6f6b679f8f-xc4qq" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.179 [INFO][4500] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" HandleID="k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.200 [INFO][4500] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" HandleID="k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000307c00), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-9-100-9699b4e791.novalocal", "pod":"coredns-6f6b679f8f-xc4qq", "timestamp":"2025-05-13 14:23:54.179781625 +0000 UTC"}, Hostname:"ci-9999-9-100-9699b4e791.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.200 [INFO][4500] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.201 [INFO][4500] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.201 [INFO][4500] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-9699b4e791.novalocal' May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.205 [INFO][4500] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.211 [INFO][4500] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.216 [INFO][4500] ipam/ipam.go 489: Trying affinity for 192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.218 [INFO][4500] ipam/ipam.go 155: Attempting to load block cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.221 [INFO][4500] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.221 [INFO][4500] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.21.0/26 handle="k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.223 [INFO][4500] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.229 [INFO][4500] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.21.0/26 handle="k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.239 [INFO][4500] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.21.5/26] block=192.168.21.0/26 handle="k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.239 [INFO][4500] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.21.5/26] handle="k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.239 [INFO][4500] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 14:23:54.266250 containerd[1551]: 2025-05-13 14:23:54.239 [INFO][4500] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.5/26] IPv6=[] ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" HandleID="k8s-pod-network.353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" May 13 14:23:54.268224 containerd[1551]: 2025-05-13 14:23:54.242 [INFO][4477] cni-plugin/k8s.go 386: Populated endpoint ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Namespace="kube-system" Pod="coredns-6f6b679f8f-xc4qq" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"7ab065bc-8b20-44be-b443-9e6e35fcebd5", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"", Pod:"coredns-6f6b679f8f-xc4qq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie45151fda29", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:54.268224 containerd[1551]: 2025-05-13 14:23:54.242 [INFO][4477] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.21.5/32] ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Namespace="kube-system" Pod="coredns-6f6b679f8f-xc4qq" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" May 13 14:23:54.268224 containerd[1551]: 2025-05-13 14:23:54.242 [INFO][4477] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie45151fda29 ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Namespace="kube-system" Pod="coredns-6f6b679f8f-xc4qq" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" May 13 14:23:54.268224 containerd[1551]: 2025-05-13 14:23:54.246 [INFO][4477] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Namespace="kube-system" 
Pod="coredns-6f6b679f8f-xc4qq" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" May 13 14:23:54.268224 containerd[1551]: 2025-05-13 14:23:54.247 [INFO][4477] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Namespace="kube-system" Pod="coredns-6f6b679f8f-xc4qq" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"7ab065bc-8b20-44be-b443-9e6e35fcebd5", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee", Pod:"coredns-6f6b679f8f-xc4qq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie45151fda29", MAC:"ae:3a:97:95:cc:fd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:54.268224 containerd[1551]: 2025-05-13 14:23:54.262 [INFO][4477] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" Namespace="kube-system" Pod="coredns-6f6b679f8f-xc4qq" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-coredns--6f6b679f8f--xc4qq-eth0" May 13 14:23:54.301570 containerd[1551]: time="2025-05-13T14:23:54.301515000Z" level=info msg="connecting to shim 353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee" address="unix:///run/containerd/s/59486430ae1fcb47bfc57522a53d4d0b03f90a8e024c5ba84eb812c11cf93cf4" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:54.330207 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1190160811.mount: Deactivated successfully. May 13 14:23:54.351117 systemd[1]: Started cri-containerd-353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee.scope - libcontainer container 353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee. 
May 13 14:23:54.379138 systemd-networkd[1445]: cali3fa83d7ae22: Link UP May 13 14:23:54.381529 systemd-networkd[1445]: cali3fa83d7ae22: Gained carrier May 13 14:23:54.406262 kubelet[2793]: I0513 14:23:54.406198 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-g9gj9" podStartSLOduration=40.406180085 podStartE2EDuration="40.406180085s" podCreationTimestamp="2025-05-13 14:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 14:23:54.389691627 +0000 UTC m=+44.428713563" watchObservedRunningTime="2025-05-13 14:23:54.406180085 +0000 UTC m=+44.445202001" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.146 [INFO][4487] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0 calico-apiserver-6b4d8b97f4- calico-apiserver 293aca7c-7ada-43c0-90e4-5a055324f6a5 681 0 2025-05-13 14:23:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b4d8b97f4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-9-100-9699b4e791.novalocal calico-apiserver-6b4d8b97f4-w976v eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3fa83d7ae22 [] []}} ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-w976v" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.146 [INFO][4487] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-w976v" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.193 [INFO][4505] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" HandleID="k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.208 [INFO][4505] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" HandleID="k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051bf0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-9-100-9699b4e791.novalocal", "pod":"calico-apiserver-6b4d8b97f4-w976v", "timestamp":"2025-05-13 14:23:54.193758303 +0000 UTC"}, Hostname:"ci-9999-9-100-9699b4e791.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 14:23:54.410906 containerd[1551]: 
2025-05-13 14:23:54.208 [INFO][4505] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.240 [INFO][4505] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.240 [INFO][4505] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-9699b4e791.novalocal' May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.306 [INFO][4505] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.314 [INFO][4505] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.325 [INFO][4505] ipam/ipam.go 489: Trying affinity for 192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.329 [INFO][4505] ipam/ipam.go 155: Attempting to load block cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.336 [INFO][4505] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.21.0/26 host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.336 [INFO][4505] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.21.0/26 handle="k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.338 [INFO][4505] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.348 [INFO][4505] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.21.0/26 handle="k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.371 [INFO][4505] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.21.6/26] block=192.168.21.0/26 handle="k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.371 [INFO][4505] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.21.6/26] handle="k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" host="ci-9999-9-100-9699b4e791.novalocal" May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.371 [INFO][4505] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 14:23:54.410906 containerd[1551]: 2025-05-13 14:23:54.371 [INFO][4505] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.6/26] IPv6=[] ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" HandleID="k8s-pod-network.21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Workload="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" May 13 14:23:54.412773 containerd[1551]: 2025-05-13 14:23:54.377 [INFO][4487] cni-plugin/k8s.go 386: Populated endpoint ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-w976v" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0", GenerateName:"calico-apiserver-6b4d8b97f4-", Namespace:"calico-apiserver", SelfLink:"", UID:"293aca7c-7ada-43c0-90e4-5a055324f6a5", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b4d8b97f4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"", Pod:"calico-apiserver-6b4d8b97f4-w976v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3fa83d7ae22", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:54.412773 containerd[1551]: 2025-05-13 14:23:54.377 [INFO][4487] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.21.6/32] ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-w976v" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" May 13 14:23:54.412773 containerd[1551]: 2025-05-13 14:23:54.377 [INFO][4487] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3fa83d7ae22 ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-w976v" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" May 13 14:23:54.412773 containerd[1551]: 2025-05-13 14:23:54.382 [INFO][4487] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-w976v" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" May 13 14:23:54.412773 containerd[1551]: 
2025-05-13 14:23:54.382 [INFO][4487] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-w976v" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0", GenerateName:"calico-apiserver-6b4d8b97f4-", Namespace:"calico-apiserver", SelfLink:"", UID:"293aca7c-7ada-43c0-90e4-5a055324f6a5", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 14, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b4d8b97f4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-9699b4e791.novalocal", ContainerID:"21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf", Pod:"calico-apiserver-6b4d8b97f4-w976v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3fa83d7ae22", MAC:"da:f4:e0:55:1b:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 14:23:54.412773 containerd[1551]: 2025-05-13 14:23:54.408 [INFO][4487] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" Namespace="calico-apiserver" Pod="calico-apiserver-6b4d8b97f4-w976v" WorkloadEndpoint="ci--9999--9--100--9699b4e791.novalocal-k8s-calico--apiserver--6b4d8b97f4--w976v-eth0" May 13 14:23:54.453573 containerd[1551]: time="2025-05-13T14:23:54.453522754Z" level=info msg="connecting to shim 21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf" address="unix:///run/containerd/s/1af876f59ffa3ea3cfb30e8a1ca332ab2915379c943283b0efc798cce4519d89" namespace=k8s.io protocol=ttrpc version=3 May 13 14:23:54.493910 containerd[1551]: time="2025-05-13T14:23:54.493820252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xc4qq,Uid:7ab065bc-8b20-44be-b443-9e6e35fcebd5,Namespace:kube-system,Attempt:0,} returns sandbox id \"353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee\"" May 13 14:23:54.500192 containerd[1551]: time="2025-05-13T14:23:54.500093260Z" level=info msg="CreateContainer within sandbox \"353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 14:23:54.512552 systemd[1]: Started cri-containerd-21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf.scope - libcontainer container 21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf. 
May 13 14:23:54.536395 containerd[1551]: time="2025-05-13T14:23:54.535488839Z" level=info msg="Container ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc: CDI devices from CRI Config.CDIDevices: []" May 13 14:23:54.551549 containerd[1551]: time="2025-05-13T14:23:54.551502450Z" level=info msg="CreateContainer within sandbox \"353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc\"" May 13 14:23:54.553047 containerd[1551]: time="2025-05-13T14:23:54.552334404Z" level=info msg="StartContainer for \"ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc\"" May 13 14:23:54.554768 containerd[1551]: time="2025-05-13T14:23:54.554706581Z" level=info msg="connecting to shim ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc" address="unix:///run/containerd/s/59486430ae1fcb47bfc57522a53d4d0b03f90a8e024c5ba84eb812c11cf93cf4" protocol=ttrpc version=3 May 13 14:23:54.580537 systemd[1]: Started cri-containerd-ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc.scope - libcontainer container ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc. May 13 14:23:54.586381 containerd[1551]: time="2025-05-13T14:23:54.586229761Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b4d8b97f4-w976v,Uid:293aca7c-7ada-43c0-90e4-5a055324f6a5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf\"" May 13 14:23:54.622552 containerd[1551]: time="2025-05-13T14:23:54.622447495Z" level=info msg="StartContainer for \"ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc\" returns successfully" May 13 14:23:54.653556 systemd-networkd[1445]: cali232df9cf724: Gained IPv6LL May 13 14:23:54.781637 systemd-networkd[1445]: calie05ae6471a6: Gained IPv6LL May 13 14:23:55.432528 kubelet[2793]: I0513 14:23:55.431867 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-xc4qq" podStartSLOduration=41.431673712 podStartE2EDuration="41.431673712s" podCreationTimestamp="2025-05-13 14:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 14:23:55.394560299 +0000 UTC m=+45.433582275" watchObservedRunningTime="2025-05-13 14:23:55.431673712 +0000 UTC m=+45.470695678" May 13 14:23:55.869717 systemd-networkd[1445]: cali3fa83d7ae22: Gained IPv6LL May 13 14:23:55.933648 systemd-networkd[1445]: calie45151fda29: Gained IPv6LL May 13 14:24:00.031153 containerd[1551]: time="2025-05-13T14:24:00.031096408Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:00.032827 containerd[1551]: time="2025-05-13T14:24:00.032789340Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 13 14:24:00.034000 containerd[1551]: time="2025-05-13T14:24:00.033945339Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:00.037267 containerd[1551]: time="2025-05-13T14:24:00.037191040Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:00.037877 containerd[1551]: time="2025-05-13T14:24:00.037755765Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 8.046369238s" May 13 14:24:00.037877 containerd[1551]: time="2025-05-13T14:24:00.037785962Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 13 14:24:00.039533 containerd[1551]: time="2025-05-13T14:24:00.039508048Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 13 14:24:00.040794 containerd[1551]: time="2025-05-13T14:24:00.040463282Z" level=info msg="CreateContainer within sandbox \"5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 14:24:00.051777 containerd[1551]: time="2025-05-13T14:24:00.051746926Z" level=info msg="Container 01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198: CDI devices from CRI Config.CDIDevices: []" May 13 14:24:00.064098 containerd[1551]: time="2025-05-13T14:24:00.064050936Z" level=info msg="CreateContainer within sandbox \"5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198\"" May 13 14:24:00.065393 containerd[1551]: time="2025-05-13T14:24:00.065266035Z" level=info msg="StartContainer for \"01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198\"" May 13 14:24:00.066531 containerd[1551]: time="2025-05-13T14:24:00.066509698Z" level=info msg="connecting to shim 01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198" address="unix:///run/containerd/s/2ea99c8f03e45079d9ab730c353515d1cd92cff9055d12c0ddd026b62bfda2c5" protocol=ttrpc version=3 May 13 14:24:00.096513 systemd[1]: Started cri-containerd-01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198.scope - libcontainer container 01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198. 
May 13 14:24:00.168320 containerd[1551]: time="2025-05-13T14:24:00.168234015Z" level=info msg="StartContainer for \"01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198\" returns successfully" May 13 14:24:00.436971 kubelet[2793]: I0513 14:24:00.436912 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b4d8b97f4-rpkpt" podStartSLOduration=31.38853462 podStartE2EDuration="39.436896769s" podCreationTimestamp="2025-05-13 14:23:21 +0000 UTC" firstStartedPulling="2025-05-13 14:23:51.990762082 +0000 UTC m=+42.029783998" lastFinishedPulling="2025-05-13 14:24:00.039124231 +0000 UTC m=+50.078146147" observedRunningTime="2025-05-13 14:24:00.436872223 +0000 UTC m=+50.475894139" watchObservedRunningTime="2025-05-13 14:24:00.436896769 +0000 UTC m=+50.475918695" May 13 14:24:01.388467 kubelet[2793]: I0513 14:24:01.388433 2793 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 14:24:04.721478 kubelet[2793]: I0513 14:24:04.721131 2793 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 14:24:06.740489 containerd[1551]: time="2025-05-13T14:24:06.740425269Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:06.741860 containerd[1551]: time="2025-05-13T14:24:06.741652312Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 13 14:24:06.743075 containerd[1551]: time="2025-05-13T14:24:06.743044613Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:06.746292 containerd[1551]: time="2025-05-13T14:24:06.746257537Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:06.747312 containerd[1551]: time="2025-05-13T14:24:06.747276571Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 6.70774029s" May 13 14:24:06.747378 containerd[1551]: time="2025-05-13T14:24:06.747305916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 13 14:24:06.749077 containerd[1551]: time="2025-05-13T14:24:06.749052680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 13 14:24:06.780482 containerd[1551]: time="2025-05-13T14:24:06.779985159Z" level=info msg="CreateContainer within sandbox \"71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 13 14:24:06.796888 containerd[1551]: time="2025-05-13T14:24:06.795424349Z" level=info msg="Container c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb: CDI devices from CRI Config.CDIDevices: []" May 13 14:24:06.810411 containerd[1551]: time="2025-05-13T14:24:06.810377510Z" level=info 
msg="CreateContainer within sandbox \"71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\"" May 13 14:24:06.811380 containerd[1551]: time="2025-05-13T14:24:06.810938989Z" level=info msg="StartContainer for \"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\"" May 13 14:24:06.812284 containerd[1551]: time="2025-05-13T14:24:06.812176892Z" level=info msg="connecting to shim c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb" address="unix:///run/containerd/s/ba7d5b06294cb04f60f865fdbb4dae7ad53913c0d9f90fe1297c8b3c95bbcd62" protocol=ttrpc version=3 May 13 14:24:06.836539 systemd[1]: Started cri-containerd-c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb.scope - libcontainer container c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb. May 13 14:24:06.906093 containerd[1551]: time="2025-05-13T14:24:06.906043549Z" level=info msg="StartContainer for \"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" returns successfully" May 13 14:24:07.432864 kubelet[2793]: I0513 14:24:07.432347 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-558d4878cb-w59hw" podStartSLOduration=32.059574452 podStartE2EDuration="46.431454805s" podCreationTimestamp="2025-05-13 14:23:21 +0000 UTC" firstStartedPulling="2025-05-13 14:23:52.376156679 +0000 UTC m=+42.415178605" lastFinishedPulling="2025-05-13 14:24:06.748037042 +0000 UTC m=+56.787058958" observedRunningTime="2025-05-13 14:24:07.432318599 +0000 UTC m=+57.471340515" watchObservedRunningTime="2025-05-13 14:24:07.431454805 +0000 UTC m=+57.470476731" May 13 14:24:07.486826 containerd[1551]: time="2025-05-13T14:24:07.486791813Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"db6ecebcfcc7e401a1b6799f8069f6e33af7e46dbe688956f512daab6b85252e\" pid:4788 exited_at:{seconds:1747146247 nanos:485245985}" May 13 14:24:17.361491 containerd[1551]: time="2025-05-13T14:24:17.361433716Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:17.362741 containerd[1551]: time="2025-05-13T14:24:17.362704402Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 13 14:24:17.364485 containerd[1551]: time="2025-05-13T14:24:17.364441199Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:17.367135 containerd[1551]: time="2025-05-13T14:24:17.367090293Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:17.367696 containerd[1551]: time="2025-05-13T14:24:17.367658385Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 10.61857623s" May 13 14:24:17.367747 
containerd[1551]: time="2025-05-13T14:24:17.367697328Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 13 14:24:17.369270 containerd[1551]: time="2025-05-13T14:24:17.369242728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 14:24:17.370202 containerd[1551]: time="2025-05-13T14:24:17.370173498Z" level=info msg="CreateContainer within sandbox \"11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 13 14:24:17.389340 containerd[1551]: time="2025-05-13T14:24:17.387530583Z" level=info msg="Container 05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc: CDI devices from CRI Config.CDIDevices: []" May 13 14:24:17.401415 containerd[1551]: time="2025-05-13T14:24:17.401385530Z" level=info msg="CreateContainer within sandbox \"11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc\"" May 13 14:24:17.402578 containerd[1551]: time="2025-05-13T14:24:17.402553984Z" level=info msg="StartContainer for \"05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc\"" May 13 14:24:17.404429 containerd[1551]: time="2025-05-13T14:24:17.404349271Z" level=info msg="connecting to shim 05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc" address="unix:///run/containerd/s/4bbe3e29c5b61152dabe88686d9d7627b5659ba090fe7ec7629ccd5f972706bb" protocol=ttrpc version=3 May 13 14:24:17.430497 systemd[1]: Started cri-containerd-05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc.scope - libcontainer container 05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc. 
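
The kubelet pod_startup_latency_tracker records above report firstStartedPulling and lastFinishedPulling timestamps next to podStartE2EDuration, so the share of startup time spent pulling images can be read straight out of the journal. Below is a minimal Python sketch of that calculation; the field layout follows the records shown here, while the helper names and the truncation of nanoseconds to microseconds are assumptions of the sketch.

    # Sketch: compute the image-pull window from a kubelet
    # "Observed pod startup duration" record like the ones logged above.
    # Assumes one record per string; helper names are arbitrary.
    import re
    from datetime import datetime

    REC = re.compile(
        r'pod="(?P<pod>[^"]+)".*?'
        r'firstStartedPulling="(?P<first>[^"]+?) \+0000 UTC.*?'
        r'lastFinishedPulling="(?P<last>[^"]+?) \+0000 UTC'
    )

    def _parse(ts: str) -> datetime:
        # kubelet prints nanoseconds; strptime's %f only accepts microseconds
        head, frac = ts.split(".")
        return datetime.strptime(f"{head}.{frac[:6]}", "%Y-%m-%d %H:%M:%S.%f")

    def pull_window(record: str):
        """Return (pod, seconds between first and last image pull) or None."""
        m = REC.search(record)
        if m is None:
            return None
        return m.group("pod"), (_parse(m.group("last")) - _parse(m.group("first"))).total_seconds()

    sample = ('pod="calico-apiserver/calico-apiserver-6b4d8b97f4-rpkpt" '
              'podStartSLOduration=31.38853462 podStartE2EDuration="39.436896769s" '
              'firstStartedPulling="2025-05-13 14:23:51.990762082 +0000 UTC m=+42.029783998" '
              'lastFinishedPulling="2025-05-13 14:24:00.039124231 +0000 UTC m=+50.078146147"')
    print(pull_window(sample))   # about 8 seconds of image pulling for this pod
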
May 13 14:24:17.475219 containerd[1551]: time="2025-05-13T14:24:17.475121163Z" level=info msg="StartContainer for \"05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc\" returns successfully" May 13 14:24:17.928823 containerd[1551]: time="2025-05-13T14:24:17.928591870Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:17.930221 containerd[1551]: time="2025-05-13T14:24:17.930138613Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 13 14:24:17.938221 containerd[1551]: time="2025-05-13T14:24:17.938070895Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 568.754709ms" May 13 14:24:17.938221 containerd[1551]: time="2025-05-13T14:24:17.938152467Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 13 14:24:17.941546 containerd[1551]: time="2025-05-13T14:24:17.940760213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 13 14:24:17.946294 containerd[1551]: time="2025-05-13T14:24:17.946019216Z" level=info msg="CreateContainer within sandbox \"21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 14:24:17.967107 containerd[1551]: time="2025-05-13T14:24:17.967007801Z" level=info msg="Container ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95: CDI devices from CRI Config.CDIDevices: []" May 13 14:24:17.989007 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3717061214.mount: Deactivated successfully. May 13 14:24:18.001962 containerd[1551]: time="2025-05-13T14:24:18.001893914Z" level=info msg="CreateContainer within sandbox \"21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95\"" May 13 14:24:18.005241 containerd[1551]: time="2025-05-13T14:24:18.005139152Z" level=info msg="StartContainer for \"ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95\"" May 13 14:24:18.010247 containerd[1551]: time="2025-05-13T14:24:18.010206357Z" level=info msg="connecting to shim ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95" address="unix:///run/containerd/s/1af876f59ffa3ea3cfb30e8a1ca332ab2915379c943283b0efc798cce4519d89" protocol=ttrpc version=3 May 13 14:24:18.036543 systemd[1]: Started cri-containerd-ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95.scope - libcontainer container ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95. 
May 13 14:24:18.087528 containerd[1551]: time="2025-05-13T14:24:18.087470445Z" level=info msg="StartContainer for \"ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95\" returns successfully" May 13 14:24:19.467294 kubelet[2793]: I0513 14:24:19.466924 2793 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 14:24:22.634334 containerd[1551]: time="2025-05-13T14:24:22.634280240Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"f673916347c6350da4fb70a86cc3967703a566c233908e1ed374d66ecaa10b82\" pid:4894 exited_at:{seconds:1747146262 nanos:633876545}" May 13 14:24:29.427841 containerd[1551]: time="2025-05-13T14:24:29.427676191Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"91a63fc64073ac15a47d110b1e529ffd12480e8c6c7612b55f1696e604fa0a18\" pid:4919 exited_at:{seconds:1747146269 nanos:427157040}" May 13 14:24:30.384462 containerd[1551]: time="2025-05-13T14:24:30.384143073Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:30.386586 containerd[1551]: time="2025-05-13T14:24:30.386453646Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 13 14:24:30.388134 containerd[1551]: time="2025-05-13T14:24:30.388083836Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:30.392287 containerd[1551]: time="2025-05-13T14:24:30.392229252Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 14:24:30.394656 containerd[1551]: time="2025-05-13T14:24:30.394525648Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 12.453704171s" May 13 14:24:30.394656 containerd[1551]: time="2025-05-13T14:24:30.394591351Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 13 14:24:30.399455 containerd[1551]: time="2025-05-13T14:24:30.399417269Z" level=info msg="CreateContainer within sandbox \"11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 13 14:24:30.412531 containerd[1551]: time="2025-05-13T14:24:30.410957760Z" level=info msg="Container 7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870: CDI devices from CRI Config.CDIDevices: []" May 13 14:24:30.419529 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2289290463.mount: Deactivated successfully. 
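
Each "Pulled image ... in <duration>" record above carries the image reference and the wall-clock pull time (6.7 s for kube-controllers, 10.6 s for csi, 568 ms for apiserver, 12.5 s for node-driver-registrar). Below is a sketch for tabulating those durations from the raw journal text; the escaped quotes and the ms/s units mirror how containerd's messages appear in this log, and the function name is an arbitrary choice.

    # Sketch: tabulate containerd "Pulled image ... in <duration>" records.
    # The \" escaping and the ms/s units mirror the journal lines above;
    # the function name and output format are assumed.
    import re

    PULLED = re.compile(
        r'Pulled image \\?"(?P<image>[^\\"]+)\\?".*? in (?P<dur>[\d.]+)(?P<unit>ms|s)'
    )

    def pull_times(lines):
        """Yield (image, seconds) for every pull-completion record found."""
        for line in lines:
            for m in PULLED.finditer(line):
                secs = float(m.group("dur"))
                if m.group("unit") == "ms":
                    secs /= 1000.0
                yield m.group("image"), secs

    demo = [r'msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id '
            r'\"sha256:4c37db56...\", size \"9405520\" in 10.61857623s"']
    for image, secs in pull_times(demo):
        print(f"{image}: {secs:.2f}s")   # ghcr.io/flatcar/calico/csi:v3.29.3: 10.62s
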
May 13 14:24:30.426995 containerd[1551]: time="2025-05-13T14:24:30.426954698Z" level=info msg="CreateContainer within sandbox \"11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870\"" May 13 14:24:30.427558 containerd[1551]: time="2025-05-13T14:24:30.427475402Z" level=info msg="StartContainer for \"7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870\"" May 13 14:24:30.429934 containerd[1551]: time="2025-05-13T14:24:30.429868610Z" level=info msg="connecting to shim 7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870" address="unix:///run/containerd/s/4bbe3e29c5b61152dabe88686d9d7627b5659ba090fe7ec7629ccd5f972706bb" protocol=ttrpc version=3 May 13 14:24:30.465514 systemd[1]: Started cri-containerd-7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870.scope - libcontainer container 7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870. May 13 14:24:30.538169 containerd[1551]: time="2025-05-13T14:24:30.538114623Z" level=info msg="StartContainer for \"7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870\" returns successfully" May 13 14:24:30.861321 containerd[1551]: time="2025-05-13T14:24:30.861246629Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"c6fcb912e1fd91eba55e154e91683c7e1f3b797538b15e3421b8c84db83e98a2\" pid:4982 exited_at:{seconds:1747146270 nanos:860720996}" May 13 14:24:31.205991 kubelet[2793]: I0513 14:24:31.205949 2793 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 13 14:24:31.206749 kubelet[2793]: I0513 14:24:31.206511 2793 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 13 14:24:31.545503 kubelet[2793]: I0513 14:24:31.545165 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b4d8b97f4-w976v" podStartSLOduration=47.19355314 podStartE2EDuration="1m10.545136729s" podCreationTimestamp="2025-05-13 14:23:21 +0000 UTC" firstStartedPulling="2025-05-13 14:23:54.588565311 +0000 UTC m=+44.627587227" lastFinishedPulling="2025-05-13 14:24:17.94014885 +0000 UTC m=+67.979170816" observedRunningTime="2025-05-13 14:24:18.4791072 +0000 UTC m=+68.518129166" watchObservedRunningTime="2025-05-13 14:24:31.545136729 +0000 UTC m=+81.584158695" May 13 14:24:31.547698 kubelet[2793]: I0513 14:24:31.547592 2793 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-g62pr" podStartSLOduration=33.577978305 podStartE2EDuration="1m10.547539754s" podCreationTimestamp="2025-05-13 14:23:21 +0000 UTC" firstStartedPulling="2025-05-13 14:23:53.42647751 +0000 UTC m=+43.465499426" lastFinishedPulling="2025-05-13 14:24:30.396038959 +0000 UTC m=+80.435060875" observedRunningTime="2025-05-13 14:24:31.542679612 +0000 UTC m=+81.581701538" watchObservedRunningTime="2025-05-13 14:24:31.547539754 +0000 UTC m=+81.586561711" May 13 14:24:33.828292 kubelet[2793]: I0513 14:24:33.827509 2793 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 14:24:52.602317 containerd[1551]: time="2025-05-13T14:24:52.602211997Z" level=info msg="TaskExit event in 
podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"1d999046234da9159fa507e0947c96f4256bf925b5175fddc4f2e734099cd12d\" pid:5012 exited_at:{seconds:1747146292 nanos:600644243}" May 13 14:25:00.891871 containerd[1551]: time="2025-05-13T14:25:00.891797637Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"b1ea2ad2e0f418be61e1943e0a1c2bb028f85c2c8b7f80e66655028edaf58a60\" pid:5037 exited_at:{seconds:1747146300 nanos:890883497}" May 13 14:25:22.621677 containerd[1551]: time="2025-05-13T14:25:22.621612652Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"c23d468a6c30f3d6d1f77e056b362e23ef8b7f51006e6e73bfa6889f2d54fc21\" pid:5072 exited_at:{seconds:1747146322 nanos:620956102}" May 13 14:25:29.445171 containerd[1551]: time="2025-05-13T14:25:29.445112470Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"3e8fe341d073d195cdd93a7fb4028b972bc619e219fc1180fedd45b9730c26b0\" pid:5114 exited_at:{seconds:1747146329 nanos:444695066}" May 13 14:25:30.912146 containerd[1551]: time="2025-05-13T14:25:30.912062200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"471d08d6fa34511c719bfcbf0457b5bc81ed011af5416bce443a7fcf6c78e8b4\" pid:5136 exited_at:{seconds:1747146330 nanos:911610562}" May 13 14:25:52.615699 containerd[1551]: time="2025-05-13T14:25:52.615607793Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"d31ba0a3c68c134e144f3b1d84f4e98d38d3ec8447829bcc29814f75baab23b1\" pid:5163 exited_at:{seconds:1747146352 nanos:614732084}" May 13 14:26:00.915803 containerd[1551]: time="2025-05-13T14:26:00.915690689Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"5357d865e8e6e553aee64a563e2c32afb1273e696518af8ca2116d1ebfe18fc1\" pid:5187 exited_at:{seconds:1747146360 nanos:914214166}" May 13 14:26:22.598829 containerd[1551]: time="2025-05-13T14:26:22.598764913Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"a46792dd0cf20ae3443a200af403478ff10d3ad9bdc355e3c74ad2a7a38f0e8b\" pid:5214 exited_at:{seconds:1747146382 nanos:597850547}" May 13 14:26:29.443887 containerd[1551]: time="2025-05-13T14:26:29.443842398Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"a2adf0c3767773fd97be2c8fdc9830c7d11a83040eec25d27682a9ee109201dd\" pid:5238 exited_at:{seconds:1747146389 nanos:443416004}" May 13 14:26:30.885513 containerd[1551]: time="2025-05-13T14:26:30.885208739Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"e14b9312da674871a833ac22f4b8a4501d995fe0110a5efc93cfa854bbc18f57\" pid:5266 exited_at:{seconds:1747146390 nanos:883838606}" May 13 14:26:52.598248 containerd[1551]: time="2025-05-13T14:26:52.598126401Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" 
id:\"dd1a187df40e01b9a386867af06684640e480ec9d681e2e23779213e44782682\" pid:5292 exited_at:{seconds:1747146412 nanos:597834290}" May 13 14:27:00.889304 containerd[1551]: time="2025-05-13T14:27:00.889251472Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"de04bafa52a20201c18655672a096740412ae1a05261449c550e01bdb1675ec1\" pid:5334 exited_at:{seconds:1747146420 nanos:889023943}" May 13 14:27:11.706193 systemd[1]: Started sshd@9-172.24.4.33:22-172.24.4.1:43440.service - OpenSSH per-connection server daemon (172.24.4.1:43440). May 13 14:27:13.106380 sshd[5352]: Accepted publickey for core from 172.24.4.1 port 43440 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:13.110679 sshd-session[5352]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:13.127676 systemd-logind[1524]: New session 12 of user core. May 13 14:27:13.136696 systemd[1]: Started session-12.scope - Session 12 of User core. May 13 14:27:13.947383 sshd[5357]: Connection closed by 172.24.4.1 port 43440 May 13 14:27:13.948447 sshd-session[5352]: pam_unix(sshd:session): session closed for user core May 13 14:27:13.956635 systemd[1]: sshd@9-172.24.4.33:22-172.24.4.1:43440.service: Deactivated successfully. May 13 14:27:13.962636 systemd[1]: session-12.scope: Deactivated successfully. May 13 14:27:13.966497 systemd-logind[1524]: Session 12 logged out. Waiting for processes to exit. May 13 14:27:13.970119 systemd-logind[1524]: Removed session 12. May 13 14:27:18.972537 systemd[1]: Started sshd@10-172.24.4.33:22-172.24.4.1:51838.service - OpenSSH per-connection server daemon (172.24.4.1:51838). May 13 14:27:20.084007 sshd[5372]: Accepted publickey for core from 172.24.4.1 port 51838 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:20.087788 sshd-session[5372]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:20.103511 systemd-logind[1524]: New session 13 of user core. May 13 14:27:20.115888 systemd[1]: Started session-13.scope - Session 13 of User core. May 13 14:27:20.924785 sshd[5374]: Connection closed by 172.24.4.1 port 51838 May 13 14:27:20.925923 sshd-session[5372]: pam_unix(sshd:session): session closed for user core May 13 14:27:20.933850 systemd[1]: sshd@10-172.24.4.33:22-172.24.4.1:51838.service: Deactivated successfully. May 13 14:27:20.940069 systemd[1]: session-13.scope: Deactivated successfully. May 13 14:27:20.943079 systemd-logind[1524]: Session 13 logged out. Waiting for processes to exit. May 13 14:27:20.946465 systemd-logind[1524]: Removed session 13. May 13 14:27:22.603787 containerd[1551]: time="2025-05-13T14:27:22.603688768Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"9d2874324c23358302b1adfc6563a5998e156eaa00e8b9c0f8bd7ee8043a9422\" pid:5397 exited_at:{seconds:1747146442 nanos:603155325}" May 13 14:27:25.950264 systemd[1]: Started sshd@11-172.24.4.33:22-172.24.4.1:33842.service - OpenSSH per-connection server daemon (172.24.4.1:33842). May 13 14:27:27.213677 sshd[5409]: Accepted publickey for core from 172.24.4.1 port 33842 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:27.216393 sshd-session[5409]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:27.227953 systemd-logind[1524]: New session 14 of user core. 
May 13 14:27:27.236638 systemd[1]: Started session-14.scope - Session 14 of User core. May 13 14:27:28.049056 sshd[5411]: Connection closed by 172.24.4.1 port 33842 May 13 14:27:28.051648 sshd-session[5409]: pam_unix(sshd:session): session closed for user core May 13 14:27:28.069830 systemd[1]: sshd@11-172.24.4.33:22-172.24.4.1:33842.service: Deactivated successfully. May 13 14:27:28.077731 systemd[1]: session-14.scope: Deactivated successfully. May 13 14:27:28.084494 systemd-logind[1524]: Session 14 logged out. Waiting for processes to exit. May 13 14:27:28.091965 systemd[1]: Started sshd@12-172.24.4.33:22-172.24.4.1:33844.service - OpenSSH per-connection server daemon (172.24.4.1:33844). May 13 14:27:28.096886 systemd-logind[1524]: Removed session 14. May 13 14:27:29.307579 sshd[5424]: Accepted publickey for core from 172.24.4.1 port 33844 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:29.310153 sshd-session[5424]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:29.321519 systemd-logind[1524]: New session 15 of user core. May 13 14:27:29.327867 systemd[1]: Started session-15.scope - Session 15 of User core. May 13 14:27:29.425740 containerd[1551]: time="2025-05-13T14:27:29.425671201Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"9e7aa7e73c3d869f983dc29393861df4b969b2104c9ca7f690b3e5dbfab803a3\" pid:5439 exited_at:{seconds:1747146449 nanos:425226345}" May 13 14:27:30.041669 sshd[5426]: Connection closed by 172.24.4.1 port 33844 May 13 14:27:30.043631 sshd-session[5424]: pam_unix(sshd:session): session closed for user core May 13 14:27:30.057601 systemd[1]: sshd@12-172.24.4.33:22-172.24.4.1:33844.service: Deactivated successfully. May 13 14:27:30.063690 systemd[1]: session-15.scope: Deactivated successfully. May 13 14:27:30.066813 systemd-logind[1524]: Session 15 logged out. Waiting for processes to exit. May 13 14:27:30.076294 systemd[1]: Started sshd@13-172.24.4.33:22-172.24.4.1:33852.service - OpenSSH per-connection server daemon (172.24.4.1:33852). May 13 14:27:30.079253 systemd-logind[1524]: Removed session 15. May 13 14:27:30.895967 containerd[1551]: time="2025-05-13T14:27:30.895837317Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"3598a3c969807129c2f4c16899b8798650baa0f9c1c577b720fbb5a621bd2715\" pid:5471 exited_at:{seconds:1747146450 nanos:895320275}" May 13 14:27:31.319764 sshd[5458]: Accepted publickey for core from 172.24.4.1 port 33852 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:31.323148 sshd-session[5458]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:31.336508 systemd-logind[1524]: New session 16 of user core. May 13 14:27:31.343667 systemd[1]: Started session-16.scope - Session 16 of User core. May 13 14:27:32.192690 sshd[5480]: Connection closed by 172.24.4.1 port 33852 May 13 14:27:32.192514 sshd-session[5458]: pam_unix(sshd:session): session closed for user core May 13 14:27:32.199971 systemd[1]: sshd@13-172.24.4.33:22-172.24.4.1:33852.service: Deactivated successfully. May 13 14:27:32.207138 systemd[1]: session-16.scope: Deactivated successfully. May 13 14:27:32.209976 systemd-logind[1524]: Session 16 logged out. Waiting for processes to exit. May 13 14:27:32.213533 systemd-logind[1524]: Removed session 16. 
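
The TaskExit records that recur through this stretch (container c8a5f385... roughly every thirty seconds, c11508733a... on its own schedule) each carry an exited_at Unix timestamp, so their cadence can be measured directly; they look like periodic exec probes, though the log itself does not say what spawns them. A sketch under that reading, with the message shape taken from the records above and the rest assumed:

    # Sketch: measure how often each long-running container logs a TaskExit
    # event. Message shape and the exited_at field follow the records above;
    # names and the return format are arbitrary.
    import re
    from collections import defaultdict

    TASK_EXIT = re.compile(
        r'TaskExit event in podsandbox handler '
        r'container_id:\\?"(?P<cid>[0-9a-f]+)\\?".*?exited_at:\{seconds:(?P<sec>\d+)'
    )

    def task_exit_intervals(text: str) -> dict:
        """Map container_id -> gaps in seconds between consecutive TaskExit events."""
        times = defaultdict(list)
        for m in TASK_EXIT.finditer(text):
            times[m.group("cid")].append(int(m.group("sec")))
        return {cid: [b - a for a, b in zip(ts, ts[1:])] for cid, ts in times.items()}

    demo = (r'msg="TaskExit event in podsandbox handler container_id:\"c8a5f385\" '
            r'id:\"aaa\" pid:1 exited_at:{seconds:1747146262 nanos:0}" '
            r'msg="TaskExit event in podsandbox handler container_id:\"c8a5f385\" '
            r'id:\"bbb\" pid:2 exited_at:{seconds:1747146292 nanos:0}"')
    print(task_exit_intervals(demo))   # {'c8a5f385': [30]}
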
May 13 14:27:37.217563 systemd[1]: Started sshd@14-172.24.4.33:22-172.24.4.1:33268.service - OpenSSH per-connection server daemon (172.24.4.1:33268). May 13 14:27:38.461744 sshd[5492]: Accepted publickey for core from 172.24.4.1 port 33268 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:38.463253 sshd-session[5492]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:38.476468 systemd-logind[1524]: New session 17 of user core. May 13 14:27:38.485213 systemd[1]: Started session-17.scope - Session 17 of User core. May 13 14:27:39.327275 sshd[5497]: Connection closed by 172.24.4.1 port 33268 May 13 14:27:39.329683 sshd-session[5492]: pam_unix(sshd:session): session closed for user core May 13 14:27:39.336185 systemd[1]: sshd@14-172.24.4.33:22-172.24.4.1:33268.service: Deactivated successfully. May 13 14:27:39.340754 systemd[1]: session-17.scope: Deactivated successfully. May 13 14:27:39.343978 systemd-logind[1524]: Session 17 logged out. Waiting for processes to exit. May 13 14:27:39.347806 systemd-logind[1524]: Removed session 17. May 13 14:27:44.350140 systemd[1]: Started sshd@15-172.24.4.33:22-172.24.4.1:53148.service - OpenSSH per-connection server daemon (172.24.4.1:53148). May 13 14:27:45.647453 sshd[5510]: Accepted publickey for core from 172.24.4.1 port 53148 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:45.649615 sshd-session[5510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:45.656403 systemd-logind[1524]: New session 18 of user core. May 13 14:27:45.663184 systemd[1]: Started session-18.scope - Session 18 of User core. May 13 14:27:46.249576 sshd[5514]: Connection closed by 172.24.4.1 port 53148 May 13 14:27:46.250685 sshd-session[5510]: pam_unix(sshd:session): session closed for user core May 13 14:27:46.259977 systemd[1]: sshd@15-172.24.4.33:22-172.24.4.1:53148.service: Deactivated successfully. May 13 14:27:46.265195 systemd[1]: session-18.scope: Deactivated successfully. May 13 14:27:46.268738 systemd-logind[1524]: Session 18 logged out. Waiting for processes to exit. May 13 14:27:46.271175 systemd-logind[1524]: Removed session 18. May 13 14:27:51.271627 systemd[1]: Started sshd@16-172.24.4.33:22-172.24.4.1:53160.service - OpenSSH per-connection server daemon (172.24.4.1:53160). May 13 14:27:52.611064 containerd[1551]: time="2025-05-13T14:27:52.611000517Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"b4796a64d633cabdf2d4a5de8ce673d0a77f7ca181eab7901ef4fdd5372b5a9b\" pid:5541 exited_at:{seconds:1747146472 nanos:610730239}" May 13 14:27:52.641288 sshd[5527]: Accepted publickey for core from 172.24.4.1 port 53160 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:52.642713 sshd-session[5527]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:52.648302 systemd-logind[1524]: New session 19 of user core. May 13 14:27:52.654494 systemd[1]: Started session-19.scope - Session 19 of User core. May 13 14:27:53.327429 sshd[5552]: Connection closed by 172.24.4.1 port 53160 May 13 14:27:53.327944 sshd-session[5527]: pam_unix(sshd:session): session closed for user core May 13 14:27:53.337204 systemd[1]: sshd@16-172.24.4.33:22-172.24.4.1:53160.service: Deactivated successfully. May 13 14:27:53.342959 systemd[1]: session-19.scope: Deactivated successfully. 
May 13 14:27:53.347512 systemd-logind[1524]: Session 19 logged out. Waiting for processes to exit. May 13 14:27:53.350101 systemd-logind[1524]: Removed session 19. May 13 14:27:58.351738 systemd[1]: Started sshd@17-172.24.4.33:22-172.24.4.1:39944.service - OpenSSH per-connection server daemon (172.24.4.1:39944). May 13 14:27:59.643452 sshd[5563]: Accepted publickey for core from 172.24.4.1 port 39944 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:27:59.646594 sshd-session[5563]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:27:59.658479 systemd-logind[1524]: New session 20 of user core. May 13 14:27:59.667731 systemd[1]: Started session-20.scope - Session 20 of User core. May 13 14:28:00.568567 sshd[5565]: Connection closed by 172.24.4.1 port 39944 May 13 14:28:00.569163 sshd-session[5563]: pam_unix(sshd:session): session closed for user core May 13 14:28:00.581955 systemd[1]: sshd@17-172.24.4.33:22-172.24.4.1:39944.service: Deactivated successfully. May 13 14:28:00.583670 systemd[1]: session-20.scope: Deactivated successfully. May 13 14:28:00.587248 systemd-logind[1524]: Session 20 logged out. Waiting for processes to exit. May 13 14:28:00.591607 systemd[1]: Started sshd@18-172.24.4.33:22-172.24.4.1:39948.service - OpenSSH per-connection server daemon (172.24.4.1:39948). May 13 14:28:00.592943 systemd-logind[1524]: Removed session 20. May 13 14:28:00.860275 containerd[1551]: time="2025-05-13T14:28:00.859916582Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"374210108edc9a737f0db12029c381a818fbf6d611d53fe1ad1fea2eb00506a5\" pid:5592 exited_at:{seconds:1747146480 nanos:859072537}" May 13 14:28:01.938718 sshd[5577]: Accepted publickey for core from 172.24.4.1 port 39948 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:01.942637 sshd-session[5577]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:01.963607 systemd-logind[1524]: New session 21 of user core. May 13 14:28:01.974587 systemd[1]: Started session-21.scope - Session 21 of User core. May 13 14:28:03.057400 sshd[5602]: Connection closed by 172.24.4.1 port 39948 May 13 14:28:03.055566 sshd-session[5577]: pam_unix(sshd:session): session closed for user core May 13 14:28:03.073184 systemd[1]: sshd@18-172.24.4.33:22-172.24.4.1:39948.service: Deactivated successfully. May 13 14:28:03.077478 systemd[1]: session-21.scope: Deactivated successfully. May 13 14:28:03.080657 systemd-logind[1524]: Session 21 logged out. Waiting for processes to exit. May 13 14:28:03.086681 systemd[1]: Started sshd@19-172.24.4.33:22-172.24.4.1:39956.service - OpenSSH per-connection server daemon (172.24.4.1:39956). May 13 14:28:03.091132 systemd-logind[1524]: Removed session 21. May 13 14:28:04.133202 sshd[5612]: Accepted publickey for core from 172.24.4.1 port 39956 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:04.136134 sshd-session[5612]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:04.147930 systemd-logind[1524]: New session 22 of user core. May 13 14:28:04.153016 systemd[1]: Started session-22.scope - Session 22 of User core. 
May 13 14:28:04.537866 containerd[1551]: time="2025-05-13T14:28:04.537562817Z" level=warning msg="container event discarded" container=15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231 type=CONTAINER_CREATED_EVENT May 13 14:28:04.537866 containerd[1551]: time="2025-05-13T14:28:04.537748645Z" level=warning msg="container event discarded" container=15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231 type=CONTAINER_STARTED_EVENT May 13 14:28:04.553858 containerd[1551]: time="2025-05-13T14:28:04.553725655Z" level=warning msg="container event discarded" container=5ffbc552d6076f34312f1bbd040919efdd6b31a0b699e9946d72743266fd28bd type=CONTAINER_CREATED_EVENT May 13 14:28:04.553858 containerd[1551]: time="2025-05-13T14:28:04.553823659Z" level=warning msg="container event discarded" container=5ffbc552d6076f34312f1bbd040919efdd6b31a0b699e9946d72743266fd28bd type=CONTAINER_STARTED_EVENT May 13 14:28:04.576577 containerd[1551]: time="2025-05-13T14:28:04.576392921Z" level=warning msg="container event discarded" container=3573e1c82bd0e8e7b96303b61b5e3eac700999f370c28f80ebb3075f957f3d31 type=CONTAINER_CREATED_EVENT May 13 14:28:04.576577 containerd[1551]: time="2025-05-13T14:28:04.576506505Z" level=warning msg="container event discarded" container=3573e1c82bd0e8e7b96303b61b5e3eac700999f370c28f80ebb3075f957f3d31 type=CONTAINER_STARTED_EVENT May 13 14:28:04.611942 containerd[1551]: time="2025-05-13T14:28:04.611822098Z" level=warning msg="container event discarded" container=1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587 type=CONTAINER_CREATED_EVENT May 13 14:28:04.624594 containerd[1551]: time="2025-05-13T14:28:04.624434117Z" level=warning msg="container event discarded" container=fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4 type=CONTAINER_CREATED_EVENT May 13 14:28:04.624594 containerd[1551]: time="2025-05-13T14:28:04.624515700Z" level=warning msg="container event discarded" container=caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f type=CONTAINER_CREATED_EVENT May 13 14:28:04.739008 containerd[1551]: time="2025-05-13T14:28:04.738847099Z" level=warning msg="container event discarded" container=1063e9badd72086353af62cc2d25e69902848ae97aac4744a8101059224b1587 type=CONTAINER_STARTED_EVENT May 13 14:28:04.739426 containerd[1551]: time="2025-05-13T14:28:04.738974037Z" level=warning msg="container event discarded" container=caa3ca584da2cf00c3acdb0996ef3b07c8c6c3ea76b52d3305db485f78feba0f type=CONTAINER_STARTED_EVENT May 13 14:28:04.783821 containerd[1551]: time="2025-05-13T14:28:04.783654381Z" level=warning msg="container event discarded" container=fa395d556188400b016fa66ccbdece5952e0670c2db44331a07996a6c80191e4 type=CONTAINER_STARTED_EVENT May 13 14:28:07.806664 sshd[5614]: Connection closed by 172.24.4.1 port 39956 May 13 14:28:07.809425 sshd-session[5612]: pam_unix(sshd:session): session closed for user core May 13 14:28:07.833532 systemd[1]: sshd@19-172.24.4.33:22-172.24.4.1:39956.service: Deactivated successfully. May 13 14:28:07.839943 systemd[1]: session-22.scope: Deactivated successfully. May 13 14:28:07.840536 systemd[1]: session-22.scope: Consumed 953ms CPU time, 70.3M memory peak. May 13 14:28:07.844475 systemd-logind[1524]: Session 22 logged out. Waiting for processes to exit. May 13 14:28:07.850063 systemd[1]: Started sshd@20-172.24.4.33:22-172.24.4.1:37460.service - OpenSSH per-connection server daemon (172.24.4.1:37460). May 13 14:28:07.857933 systemd-logind[1524]: Removed session 22. 
May 13 14:28:09.019024 sshd[5631]: Accepted publickey for core from 172.24.4.1 port 37460 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:09.024702 sshd-session[5631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:09.040483 systemd-logind[1524]: New session 23 of user core. May 13 14:28:09.047805 systemd[1]: Started session-23.scope - Session 23 of User core. May 13 14:28:09.829138 sshd[5633]: Connection closed by 172.24.4.1 port 37460 May 13 14:28:09.829787 sshd-session[5631]: pam_unix(sshd:session): session closed for user core May 13 14:28:09.848948 systemd[1]: sshd@20-172.24.4.33:22-172.24.4.1:37460.service: Deactivated successfully. May 13 14:28:09.854208 systemd[1]: session-23.scope: Deactivated successfully. May 13 14:28:09.859683 systemd-logind[1524]: Session 23 logged out. Waiting for processes to exit. May 13 14:28:09.867637 systemd[1]: Started sshd@21-172.24.4.33:22-172.24.4.1:37462.service - OpenSSH per-connection server daemon (172.24.4.1:37462). May 13 14:28:09.876038 systemd-logind[1524]: Removed session 23. May 13 14:28:10.909771 sshd[5643]: Accepted publickey for core from 172.24.4.1 port 37462 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:10.912585 sshd-session[5643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:10.925905 systemd-logind[1524]: New session 24 of user core. May 13 14:28:10.932744 systemd[1]: Started session-24.scope - Session 24 of User core. May 13 14:28:11.761268 sshd[5647]: Connection closed by 172.24.4.1 port 37462 May 13 14:28:11.762590 sshd-session[5643]: pam_unix(sshd:session): session closed for user core May 13 14:28:11.772112 systemd-logind[1524]: Session 24 logged out. Waiting for processes to exit. May 13 14:28:11.772254 systemd[1]: sshd@21-172.24.4.33:22-172.24.4.1:37462.service: Deactivated successfully. May 13 14:28:11.777560 systemd[1]: session-24.scope: Deactivated successfully. May 13 14:28:11.788343 systemd-logind[1524]: Removed session 24. 
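
The SSH sessions in this part of the journal follow a fixed lifecycle: Accepted publickey, "New session N of user core", then "session-N.scope: Deactivated successfully" and "Removed session N". Pairing the open and close records by session number gives per-session durations. A sketch that does so, assuming one journal record per line as journalctl prints it; the timestamp prefix format follows the lines above, and the hard-coded year is an assumption because the prefix omits it.

    # Sketch: pair "New session N of user core" with "session-N.scope:
    # Deactivated successfully" and report how long each SSH session lasted.
    # Assumes one journal record per line and supplies the year by hand,
    # since the "May 13 14:27:13.127676" prefix above does not carry it.
    import re
    from datetime import datetime

    OPEN = re.compile(r'^(?P<ts>\w+ \d+ [\d:.]+) .*New session (?P<n>\d+) of user core', re.M)
    CLOSE = re.compile(r'^(?P<ts>\w+ \d+ [\d:.]+) .*session-(?P<n>\d+)\.scope: Deactivated successfully', re.M)

    def _ts(prefix: str) -> datetime:
        return datetime.strptime(f"2025 {prefix}", "%Y %b %d %H:%M:%S.%f")

    def session_durations(journal: str) -> dict:
        opened = {m.group("n"): _ts(m.group("ts")) for m in OPEN.finditer(journal)}
        closed = {m.group("n"): _ts(m.group("ts")) for m in CLOSE.finditer(journal)}
        return {n: (closed[n] - opened[n]).total_seconds() for n in opened if n in closed}

    demo = ("May 13 14:27:13.127676 systemd-logind[1524]: New session 12 of user core.\n"
            "May 13 14:27:13.962636 systemd[1]: session-12.scope: Deactivated successfully.\n")
    print(session_durations(demo))   # session 12 lasted well under a second
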
May 13 14:28:15.339062 containerd[1551]: time="2025-05-13T14:28:15.338859330Z" level=warning msg="container event discarded" container=037609973ca763cf2a118b389deeb57ab6d18409776c5adbd914f9e8d59e5c73 type=CONTAINER_CREATED_EVENT May 13 14:28:15.339062 containerd[1551]: time="2025-05-13T14:28:15.339045670Z" level=warning msg="container event discarded" container=037609973ca763cf2a118b389deeb57ab6d18409776c5adbd914f9e8d59e5c73 type=CONTAINER_STARTED_EVENT May 13 14:28:15.370510 containerd[1551]: time="2025-05-13T14:28:15.370401255Z" level=warning msg="container event discarded" container=ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54 type=CONTAINER_CREATED_EVENT May 13 14:28:15.389869 containerd[1551]: time="2025-05-13T14:28:15.389687423Z" level=warning msg="container event discarded" container=48563a5bbec53f9a55f1ec0e0439077594beddaf302de1333e8dca2b993490ff type=CONTAINER_CREATED_EVENT May 13 14:28:15.389869 containerd[1551]: time="2025-05-13T14:28:15.389750872Z" level=warning msg="container event discarded" container=48563a5bbec53f9a55f1ec0e0439077594beddaf302de1333e8dca2b993490ff type=CONTAINER_STARTED_EVENT May 13 14:28:15.450995 containerd[1551]: time="2025-05-13T14:28:15.450757190Z" level=warning msg="container event discarded" container=ba433d09fb7c8c37cb06100c777e7cf7047cc4defdc2733ddb3b7c448552ed54 type=CONTAINER_STARTED_EVENT May 13 14:28:16.790473 systemd[1]: Started sshd@22-172.24.4.33:22-172.24.4.1:33976.service - OpenSSH per-connection server daemon (172.24.4.1:33976). May 13 14:28:17.897416 containerd[1551]: time="2025-05-13T14:28:17.897278842Z" level=warning msg="container event discarded" container=0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8 type=CONTAINER_CREATED_EVENT May 13 14:28:17.932850 sshd[5664]: Accepted publickey for core from 172.24.4.1 port 33976 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:17.934252 sshd-session[5664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:17.943618 systemd-logind[1524]: New session 25 of user core. May 13 14:28:17.949836 containerd[1551]: time="2025-05-13T14:28:17.949628449Z" level=warning msg="container event discarded" container=0b513367cb17d252578a72b17262c447e3bc3e8501270e5f7cd7c7ff66313cf8 type=CONTAINER_STARTED_EVENT May 13 14:28:17.951897 systemd[1]: Started session-25.scope - Session 25 of User core. May 13 14:28:18.754548 sshd[5666]: Connection closed by 172.24.4.1 port 33976 May 13 14:28:18.755975 sshd-session[5664]: pam_unix(sshd:session): session closed for user core May 13 14:28:18.767424 systemd[1]: sshd@22-172.24.4.33:22-172.24.4.1:33976.service: Deactivated successfully. May 13 14:28:18.773828 systemd[1]: session-25.scope: Deactivated successfully. May 13 14:28:18.777067 systemd-logind[1524]: Session 25 logged out. Waiting for processes to exit. May 13 14:28:18.784074 systemd-logind[1524]: Removed session 25. 
May 13 14:28:21.809257 containerd[1551]: time="2025-05-13T14:28:21.808844368Z" level=warning msg="container event discarded" container=d4ba9713c0b367d8a4fa8e6c68c7aa3fb6718a4fa53788d961c51e72138891f2 type=CONTAINER_CREATED_EVENT May 13 14:28:21.809257 containerd[1551]: time="2025-05-13T14:28:21.809031921Z" level=warning msg="container event discarded" container=d4ba9713c0b367d8a4fa8e6c68c7aa3fb6718a4fa53788d961c51e72138891f2 type=CONTAINER_STARTED_EVENT May 13 14:28:21.925409 containerd[1551]: time="2025-05-13T14:28:21.923994396Z" level=warning msg="container event discarded" container=7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8 type=CONTAINER_CREATED_EVENT May 13 14:28:21.926627 containerd[1551]: time="2025-05-13T14:28:21.926429379Z" level=warning msg="container event discarded" container=7ff433576b9abc4725da3db9468ee947e5e055c6897eb3b8d18e97a7606cb0c8 type=CONTAINER_STARTED_EVENT May 13 14:28:22.625306 containerd[1551]: time="2025-05-13T14:28:22.625250197Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"f4043f97343f4875f45758001c851f6e507aad117f51dcca79cb9934a5f95906\" pid:5689 exited_at:{seconds:1747146502 nanos:624734197}" May 13 14:28:23.784895 systemd[1]: Started sshd@23-172.24.4.33:22-172.24.4.1:44216.service - OpenSSH per-connection server daemon (172.24.4.1:44216). May 13 14:28:24.911656 sshd[5702]: Accepted publickey for core from 172.24.4.1 port 44216 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:24.918564 sshd-session[5702]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:24.936533 systemd-logind[1524]: New session 26 of user core. May 13 14:28:24.945957 systemd[1]: Started session-26.scope - Session 26 of User core. May 13 14:28:25.714722 containerd[1551]: time="2025-05-13T14:28:25.713704660Z" level=warning msg="container event discarded" container=76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83 type=CONTAINER_CREATED_EVENT May 13 14:28:25.737060 sshd[5704]: Connection closed by 172.24.4.1 port 44216 May 13 14:28:25.738543 sshd-session[5702]: pam_unix(sshd:session): session closed for user core May 13 14:28:25.746313 systemd[1]: sshd@23-172.24.4.33:22-172.24.4.1:44216.service: Deactivated successfully. May 13 14:28:25.752818 systemd[1]: session-26.scope: Deactivated successfully. May 13 14:28:25.756086 systemd-logind[1524]: Session 26 logged out. Waiting for processes to exit. May 13 14:28:25.761873 systemd-logind[1524]: Removed session 26. 
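
From here on containerd emits "container event discarded" warnings in bursts; each names the container and the event type that was dropped. A short sketch for counting them per container and type, with the field layout copied from the warnings above and everything else assumed:

    # Sketch: count containerd "container event discarded" warnings per
    # (container, event type). Field layout matches the warnings above;
    # the 12-character id prefix is just for readable output.
    import re
    from collections import Counter

    DISCARDED = re.compile(
        r'msg="container event discarded" container=(?P<cid>[0-9a-f]+) type=(?P<type>\w+)'
    )

    def discarded_events(text: str) -> Counter:
        return Counter((m.group("cid")[:12], m.group("type")) for m in DISCARDED.finditer(text))

    demo = ('level=warning msg="container event discarded" '
            'container=15ba9ec39b0ca23c731dadcaef3ee56d58b93f398424e4d6cceaf02394963231 '
            'type=CONTAINER_CREATED_EVENT')
    print(discarded_events(demo))   # Counter({('15ba9ec39b0c', 'CONTAINER_CREATED_EVENT'): 1})
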
May 13 14:28:25.800022 containerd[1551]: time="2025-05-13T14:28:25.799876284Z" level=warning msg="container event discarded" container=76957ce00bd300e8cf9511b8f1494fda1450f49124f87d3b30f147848bc18b83 type=CONTAINER_STARTED_EVENT May 13 14:28:29.245350 containerd[1551]: time="2025-05-13T14:28:29.245182258Z" level=warning msg="container event discarded" container=b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0 type=CONTAINER_CREATED_EVENT May 13 14:28:29.316788 containerd[1551]: time="2025-05-13T14:28:29.316664543Z" level=warning msg="container event discarded" container=b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0 type=CONTAINER_STARTED_EVENT May 13 14:28:29.487436 containerd[1551]: time="2025-05-13T14:28:29.487227853Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"c09fc03fc2e572543741a79e0563c89ace4c36f9b20798f3f9a3081040aa3bc6\" pid:5727 exited_at:{seconds:1747146509 nanos:485673405}" May 13 14:28:30.232537 containerd[1551]: time="2025-05-13T14:28:30.232399154Z" level=warning msg="container event discarded" container=b24706f8599df61df3125c331ccd203623850b2d6662d45acbf34a93952d89b0 type=CONTAINER_STOPPED_EVENT May 13 14:28:30.767195 systemd[1]: Started sshd@24-172.24.4.33:22-172.24.4.1:44232.service - OpenSSH per-connection server daemon (172.24.4.1:44232). May 13 14:28:30.893616 containerd[1551]: time="2025-05-13T14:28:30.893555116Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"aec58de6699d59915f0b2feb5e29bc456d1dfa2a62e0d1fd1a65ffcdedc31222\" pid:5763 exited_at:{seconds:1747146510 nanos:893201463}" May 13 14:28:31.920461 sshd[5748]: Accepted publickey for core from 172.24.4.1 port 44232 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:31.923761 sshd-session[5748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:31.944153 systemd-logind[1524]: New session 27 of user core. May 13 14:28:31.957935 systemd[1]: Started session-27.scope - Session 27 of User core. May 13 14:28:32.752517 sshd[5772]: Connection closed by 172.24.4.1 port 44232 May 13 14:28:32.753950 sshd-session[5748]: pam_unix(sshd:session): session closed for user core May 13 14:28:32.766952 systemd[1]: sshd@24-172.24.4.33:22-172.24.4.1:44232.service: Deactivated successfully. May 13 14:28:32.774842 systemd[1]: session-27.scope: Deactivated successfully. May 13 14:28:32.778986 systemd-logind[1524]: Session 27 logged out. Waiting for processes to exit. May 13 14:28:32.782575 systemd-logind[1524]: Removed session 27. May 13 14:28:36.395565 containerd[1551]: time="2025-05-13T14:28:36.395433787Z" level=warning msg="container event discarded" container=35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e type=CONTAINER_CREATED_EVENT May 13 14:28:36.475139 containerd[1551]: time="2025-05-13T14:28:36.475019094Z" level=warning msg="container event discarded" container=35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e type=CONTAINER_STARTED_EVENT May 13 14:28:37.776895 systemd[1]: Started sshd@25-172.24.4.33:22-172.24.4.1:50128.service - OpenSSH per-connection server daemon (172.24.4.1:50128). 
May 13 14:28:38.651611 containerd[1551]: time="2025-05-13T14:28:38.651432160Z" level=warning msg="container event discarded" container=35d2399635b893c592567528a3e386d0bb06adf3944b009c91a34f1dd1e8f70e type=CONTAINER_STOPPED_EVENT May 13 14:28:39.209582 sshd[5789]: Accepted publickey for core from 172.24.4.1 port 50128 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:39.212263 sshd-session[5789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:39.229506 systemd-logind[1524]: New session 28 of user core. May 13 14:28:39.241911 systemd[1]: Started session-28.scope - Session 28 of User core. May 13 14:28:40.043773 sshd[5791]: Connection closed by 172.24.4.1 port 50128 May 13 14:28:40.043577 sshd-session[5789]: pam_unix(sshd:session): session closed for user core May 13 14:28:40.052634 systemd[1]: sshd@25-172.24.4.33:22-172.24.4.1:50128.service: Deactivated successfully. May 13 14:28:40.058908 systemd[1]: session-28.scope: Deactivated successfully. May 13 14:28:40.061544 systemd-logind[1524]: Session 28 logged out. Waiting for processes to exit. May 13 14:28:40.065925 systemd-logind[1524]: Removed session 28. May 13 14:28:45.065238 systemd[1]: Started sshd@26-172.24.4.33:22-172.24.4.1:54264.service - OpenSSH per-connection server daemon (172.24.4.1:54264). May 13 14:28:46.184911 sshd[5803]: Accepted publickey for core from 172.24.4.1 port 54264 ssh2: RSA SHA256:av4FHeEzAcUMWUhmqWCCXwdzRC6ovVk13UX6HXtHGh4 May 13 14:28:46.188779 sshd-session[5803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 14:28:46.200914 systemd-logind[1524]: New session 29 of user core. May 13 14:28:46.212717 systemd[1]: Started session-29.scope - Session 29 of User core. May 13 14:28:47.012885 sshd[5807]: Connection closed by 172.24.4.1 port 54264 May 13 14:28:47.014288 sshd-session[5803]: pam_unix(sshd:session): session closed for user core May 13 14:28:47.023677 systemd[1]: sshd@26-172.24.4.33:22-172.24.4.1:54264.service: Deactivated successfully. May 13 14:28:47.030016 systemd[1]: session-29.scope: Deactivated successfully. May 13 14:28:47.033584 systemd-logind[1524]: Session 29 logged out. Waiting for processes to exit. May 13 14:28:47.037192 systemd-logind[1524]: Removed session 29. 
May 13 14:28:47.885007 containerd[1551]: time="2025-05-13T14:28:47.884820075Z" level=warning msg="container event discarded" container=c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332 type=CONTAINER_CREATED_EVENT May 13 14:28:47.953454 containerd[1551]: time="2025-05-13T14:28:47.953275453Z" level=warning msg="container event discarded" container=c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332 type=CONTAINER_STARTED_EVENT May 13 14:28:51.999478 containerd[1551]: time="2025-05-13T14:28:51.999262797Z" level=warning msg="container event discarded" container=5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f type=CONTAINER_CREATED_EVENT May 13 14:28:51.999478 containerd[1551]: time="2025-05-13T14:28:51.999417828Z" level=warning msg="container event discarded" container=5ed6e7ba9a0d3a4085202948098c93fdaffcf7769aad27132a52638f9ddc229f type=CONTAINER_STARTED_EVENT May 13 14:28:52.385697 containerd[1551]: time="2025-05-13T14:28:52.385279162Z" level=warning msg="container event discarded" container=71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414 type=CONTAINER_CREATED_EVENT May 13 14:28:52.385697 containerd[1551]: time="2025-05-13T14:28:52.385562834Z" level=warning msg="container event discarded" container=71de3176c72d2e1e8b0e3ef8d57b596adff9ac49df2dfca5ef531f5a0fbd4414 type=CONTAINER_STARTED_EVENT May 13 14:28:52.611876 containerd[1551]: time="2025-05-13T14:28:52.611810618Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"2852faff3a842a51f0e41d6b741a35dc4e05d72a64f4c91b52c1c9d9c65e12e9\" pid:5830 exited_at:{seconds:1747146532 nanos:610313158}" May 13 14:28:53.435453 containerd[1551]: time="2025-05-13T14:28:53.435216087Z" level=warning msg="container event discarded" container=11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68 type=CONTAINER_CREATED_EVENT May 13 14:28:53.435453 containerd[1551]: time="2025-05-13T14:28:53.435423797Z" level=warning msg="container event discarded" container=11d018d9c2b9b738d5a278b904275ef51fb280787f39961a843e349e79e3ed68 type=CONTAINER_STARTED_EVENT May 13 14:28:53.539177 containerd[1551]: time="2025-05-13T14:28:53.538999073Z" level=warning msg="container event discarded" container=783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75 type=CONTAINER_CREATED_EVENT May 13 14:28:53.539177 containerd[1551]: time="2025-05-13T14:28:53.539127033Z" level=warning msg="container event discarded" container=783841b9f07d09e817264cb0dd14c23ff60d09406e97862d2520a5c9f3071f75 type=CONTAINER_STARTED_EVENT May 13 14:28:53.569672 containerd[1551]: time="2025-05-13T14:28:53.569548319Z" level=warning msg="container event discarded" container=10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e type=CONTAINER_CREATED_EVENT May 13 14:28:53.623114 containerd[1551]: time="2025-05-13T14:28:53.622919884Z" level=warning msg="container event discarded" container=10e112fd4415a297d6986fe9298e5d6f1e719ae9150db0fc5632e957faba1b7e type=CONTAINER_STARTED_EVENT May 13 14:28:54.504654 containerd[1551]: time="2025-05-13T14:28:54.504465847Z" level=warning msg="container event discarded" container=353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee type=CONTAINER_CREATED_EVENT May 13 14:28:54.504654 containerd[1551]: time="2025-05-13T14:28:54.504632419Z" level=warning msg="container event discarded" container=353823ee3093e4efdf9fd42d3a2ad357804783df913f297fdf312691cfde86ee type=CONTAINER_STARTED_EVENT May 
13 14:28:54.561922 containerd[1551]: time="2025-05-13T14:28:54.561767219Z" level=warning msg="container event discarded" container=ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc type=CONTAINER_CREATED_EVENT May 13 14:28:54.597236 containerd[1551]: time="2025-05-13T14:28:54.596288872Z" level=warning msg="container event discarded" container=21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf type=CONTAINER_CREATED_EVENT May 13 14:28:54.597236 containerd[1551]: time="2025-05-13T14:28:54.596771767Z" level=warning msg="container event discarded" container=21bf7e36f9644f81e5bc74fc509ddd53538803408e85cfa76fdcdb3b514167cf type=CONTAINER_STARTED_EVENT May 13 14:28:54.631621 containerd[1551]: time="2025-05-13T14:28:54.631536014Z" level=warning msg="container event discarded" container=ff33e0c71e5dc95590dd13996073e3b154a3570b21ca08f9c0ac7d8d6f60b8fc type=CONTAINER_STARTED_EVENT May 13 14:29:00.074539 containerd[1551]: time="2025-05-13T14:29:00.073631523Z" level=warning msg="container event discarded" container=01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198 type=CONTAINER_CREATED_EVENT May 13 14:29:00.176596 containerd[1551]: time="2025-05-13T14:29:00.176192273Z" level=warning msg="container event discarded" container=01072395419997700e8cafc332dbcaeb51c493b24fe581b740c849d1a5c85198 type=CONTAINER_STARTED_EVENT May 13 14:29:00.922908 containerd[1551]: time="2025-05-13T14:29:00.922857887Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"4f567cfc5cbef4c7f1d66bc0802087307d03e80da906a8e8dd8e5d593aec4568\" pid:5855 exited_at:{seconds:1747146540 nanos:922548977}" May 13 14:29:06.819919 containerd[1551]: time="2025-05-13T14:29:06.819680440Z" level=warning msg="container event discarded" container=c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb type=CONTAINER_CREATED_EVENT May 13 14:29:06.914494 containerd[1551]: time="2025-05-13T14:29:06.914319894Z" level=warning msg="container event discarded" container=c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb type=CONTAINER_STARTED_EVENT May 13 14:29:17.411100 containerd[1551]: time="2025-05-13T14:29:17.410512072Z" level=warning msg="container event discarded" container=05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc type=CONTAINER_CREATED_EVENT May 13 14:29:17.484978 containerd[1551]: time="2025-05-13T14:29:17.484732091Z" level=warning msg="container event discarded" container=05217c6a9fdaf46620ff46cf3e693559717595b9ffc4074fa0f92548eeb8b5dc type=CONTAINER_STARTED_EVENT May 13 14:29:18.011780 containerd[1551]: time="2025-05-13T14:29:18.009529769Z" level=warning msg="container event discarded" container=ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95 type=CONTAINER_CREATED_EVENT May 13 14:29:18.097327 containerd[1551]: time="2025-05-13T14:29:18.097166307Z" level=warning msg="container event discarded" container=ea9eb6e05144992a777c9658961d50ad05be0c0d73d3bd9cd2cd2c55562ecc95 type=CONTAINER_STARTED_EVENT May 13 14:29:22.694941 containerd[1551]: time="2025-05-13T14:29:22.694858274Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"4ed2c088b7e76c8178a0fa4c0985d269e2d7b2dab8dd6c6530e983e125b9e228\" pid:5889 exited_at:{seconds:1747146562 nanos:694504520}" May 13 14:29:29.488786 containerd[1551]: time="2025-05-13T14:29:29.486290644Z" level=info msg="TaskExit event in podsandbox 
handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"a2e636e9c74e6f4fbb4c2ff3d02ec6fec291f8ab848018df6df185271f124107\" pid:5914 exited_at:{seconds:1747146569 nanos:485233170}"
May 13 14:29:30.437057 containerd[1551]: time="2025-05-13T14:29:30.436868075Z" level=warning msg="container event discarded" container=7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870 type=CONTAINER_CREATED_EVENT
May 13 14:29:30.546671 containerd[1551]: time="2025-05-13T14:29:30.546498330Z" level=warning msg="container event discarded" container=7822df420b39b478c89349c119539b5d8f2b369bec052f4919c5e36493fae870 type=CONTAINER_STARTED_EVENT
May 13 14:29:30.909211 containerd[1551]: time="2025-05-13T14:29:30.908925179Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"ef2e7f284f050ea70604eff39749907d0706d3dee5838fa7ac8e6551122c0d2a\" pid:5936 exited_at:{seconds:1747146570 nanos:908768786}"
May 13 14:29:36.007425 update_engine[1525]: I20250513 14:29:36.006975 1525 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
May 13 14:29:36.007425 update_engine[1525]: I20250513 14:29:36.007297 1525 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
May 13 14:29:36.008900 update_engine[1525]: I20250513 14:29:36.008823 1525 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
May 13 14:29:36.013196 update_engine[1525]: I20250513 14:29:36.011576 1525 omaha_request_params.cc:62] Current group set to developer
May 13 14:29:36.013196 update_engine[1525]: I20250513 14:29:36.012745 1525 update_attempter.cc:499] Already updated boot flags. Skipping.
May 13 14:29:36.013196 update_engine[1525]: I20250513 14:29:36.012779 1525 update_attempter.cc:643] Scheduling an action processor start.
May 13 14:29:36.013196 update_engine[1525]: I20250513 14:29:36.012859 1525 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 13 14:29:36.013196 update_engine[1525]: I20250513 14:29:36.013098 1525 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
May 13 14:29:36.013764 update_engine[1525]: I20250513 14:29:36.013298 1525 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 13 14:29:36.013764 update_engine[1525]: I20250513 14:29:36.013326 1525 omaha_request_action.cc:272] Request:
May 13 14:29:36.013764 update_engine[1525]:
May 13 14:29:36.013764 update_engine[1525]:
May 13 14:29:36.013764 update_engine[1525]:
May 13 14:29:36.013764 update_engine[1525]:
May 13 14:29:36.013764 update_engine[1525]:
May 13 14:29:36.013764 update_engine[1525]:
May 13 14:29:36.013764 update_engine[1525]:
May 13 14:29:36.013764 update_engine[1525]:
May 13 14:29:36.014962 update_engine[1525]: I20250513 14:29:36.013348 1525 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 13 14:29:36.038485 update_engine[1525]: I20250513 14:29:36.031687 1525 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 13 14:29:36.038485 update_engine[1525]: I20250513 14:29:36.033138 1525 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 13 14:29:36.040510 locksmithd[1567]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
May 13 14:29:36.041853 update_engine[1525]: E20250513 14:29:36.041771 1525 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 13 14:29:36.042030 update_engine[1525]: I20250513 14:29:36.041987 1525 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
May 13 14:29:45.967618 update_engine[1525]: I20250513 14:29:45.966445 1525 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 13 14:29:45.967618 update_engine[1525]: I20250513 14:29:45.966897 1525 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 13 14:29:45.967618 update_engine[1525]: I20250513 14:29:45.967573 1525 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 13 14:29:45.983561 update_engine[1525]: E20250513 14:29:45.983451 1525 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 13 14:29:45.983841 update_engine[1525]: I20250513 14:29:45.983692 1525 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
May 13 14:29:52.679023 containerd[1551]: time="2025-05-13T14:29:52.678972863Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"f5b6ee254dbcf07737d5ec72a472267eb0002c1a7137dbcbc963581b9bdb6a03\" pid:5960 exited_at:{seconds:1747146592 nanos:678442048}"
May 13 14:29:55.964740 update_engine[1525]: I20250513 14:29:55.964530 1525 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 13 14:29:55.966150 update_engine[1525]: I20250513 14:29:55.965075 1525 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 13 14:29:55.966150 update_engine[1525]: I20250513 14:29:55.965970 1525 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 13 14:29:55.971644 update_engine[1525]: E20250513 14:29:55.971529 1525 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 13 14:29:55.971846 update_engine[1525]: I20250513 14:29:55.971689 1525 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
May 13 14:30:00.928942 containerd[1551]: time="2025-05-13T14:30:00.928888590Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c11508733a66007193a95338a9643122d32b47dfcfa35362b40895dec75471fb\" id:\"8f3646cd0d3e560e808b961c9395ad8bcb3932a2a26ef833af3d8e4cb16b4409\" pid:5986 exited_at:{seconds:1747146600 nanos:928500536}"
May 13 14:30:05.965612 update_engine[1525]: I20250513 14:30:05.965469 1525 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 13 14:30:05.966548 update_engine[1525]: I20250513 14:30:05.965929 1525 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 13 14:30:05.966694 update_engine[1525]: I20250513 14:30:05.966542 1525 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 13 14:30:05.972247 update_engine[1525]: E20250513 14:30:05.972139 1525 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 13 14:30:05.972247 update_engine[1525]: I20250513 14:30:05.972242 1525 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 13 14:30:05.972600 update_engine[1525]: I20250513 14:30:05.972261 1525 omaha_request_action.cc:617] Omaha request response:
May 13 14:30:05.972600 update_engine[1525]: E20250513 14:30:05.972502 1525 omaha_request_action.cc:636] Omaha request network transfer failed.
May 13 14:30:05.972762 update_engine[1525]: I20250513 14:30:05.972597 1525 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
May 13 14:30:05.972762 update_engine[1525]: I20250513 14:30:05.972612 1525 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 13 14:30:05.972762 update_engine[1525]: I20250513 14:30:05.972622 1525 update_attempter.cc:306] Processing Done.
May 13 14:30:05.972762 update_engine[1525]: E20250513 14:30:05.972680 1525 update_attempter.cc:619] Update failed.
May 13 14:30:05.972762 update_engine[1525]: I20250513 14:30:05.972708 1525 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
May 13 14:30:05.972762 update_engine[1525]: I20250513 14:30:05.972720 1525 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
May 13 14:30:05.972762 update_engine[1525]: I20250513 14:30:05.972731 1525 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
May 13 14:30:05.973290 update_engine[1525]: I20250513 14:30:05.972870 1525 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 13 14:30:05.973290 update_engine[1525]: I20250513 14:30:05.972915 1525 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 13 14:30:05.973290 update_engine[1525]: I20250513 14:30:05.972926 1525 omaha_request_action.cc:272] Request:
May 13 14:30:05.973290 update_engine[1525]:
May 13 14:30:05.973290 update_engine[1525]:
May 13 14:30:05.973290 update_engine[1525]:
May 13 14:30:05.973290 update_engine[1525]:
May 13 14:30:05.973290 update_engine[1525]:
May 13 14:30:05.973290 update_engine[1525]:
May 13 14:30:05.973290 update_engine[1525]: I20250513 14:30:05.972939 1525 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 13 14:30:05.973290 update_engine[1525]: I20250513 14:30:05.973234 1525 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 13 14:30:05.974706 update_engine[1525]: I20250513 14:30:05.973865 1525 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 13 14:30:05.974801 locksmithd[1567]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
May 13 14:30:05.979037 update_engine[1525]: E20250513 14:30:05.978940 1525 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 13 14:30:05.979037 update_engine[1525]: I20250513 14:30:05.979038 1525 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 13 14:30:05.979635 update_engine[1525]: I20250513 14:30:05.979056 1525 omaha_request_action.cc:617] Omaha request response:
May 13 14:30:05.979635 update_engine[1525]: I20250513 14:30:05.979068 1525 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 13 14:30:05.979635 update_engine[1525]: I20250513 14:30:05.979079 1525 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 13 14:30:05.979635 update_engine[1525]: I20250513 14:30:05.979090 1525 update_attempter.cc:306] Processing Done.
May 13 14:30:05.979635 update_engine[1525]: I20250513 14:30:05.979102 1525 update_attempter.cc:310] Error event sent.
May 13 14:30:05.979635 update_engine[1525]: I20250513 14:30:05.979133 1525 update_check_scheduler.cc:74] Next update check in 49m28s
May 13 14:30:05.980242 locksmithd[1567]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
May 13 14:30:22.664020 containerd[1551]: time="2025-05-13T14:30:22.663652472Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8a5f3858324e74db827812386aa83a4b5b57b49dc31a865187f4714b8134332\" id:\"93f0336c55006758ff596aa35a52e41d9641455d8e259936ca900b2dfc1cf34b\" pid:6028 exited_at:{seconds:1747146622 nanos:662683492}"