May 9 01:59:40.061962 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu May 8 22:15:16 -00 2025 May 9 01:59:40.061993 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=6dbb211661f4d09f7718fdc7eab00f1550a8baafb68f4d2efdaedafa102351ae May 9 01:59:40.062004 kernel: BIOS-provided physical RAM map: May 9 01:59:40.062012 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable May 9 01:59:40.062019 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved May 9 01:59:40.062029 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved May 9 01:59:40.062037 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable May 9 01:59:40.062045 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved May 9 01:59:40.062053 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved May 9 01:59:40.062060 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved May 9 01:59:40.062068 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable May 9 01:59:40.062076 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved May 9 01:59:40.062084 kernel: NX (Execute Disable) protection: active May 9 01:59:40.062092 kernel: APIC: Static calls initialized May 9 01:59:40.062103 kernel: SMBIOS 3.0.0 present. May 9 01:59:40.062111 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014 May 9 01:59:40.062119 kernel: Hypervisor detected: KVM May 9 01:59:40.062127 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 May 9 01:59:40.062135 kernel: kvm-clock: using sched offset of 3559840858 cycles May 9 01:59:40.062143 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns May 9 01:59:40.062154 kernel: tsc: Detected 1996.249 MHz processor May 9 01:59:40.062162 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 9 01:59:40.062171 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 9 01:59:40.062179 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000 May 9 01:59:40.062188 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs May 9 01:59:40.062196 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 9 01:59:40.062224 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000 May 9 01:59:40.062232 kernel: ACPI: Early table checksum verification disabled May 9 01:59:40.062243 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS ) May 9 01:59:40.062252 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 9 01:59:40.062260 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 9 01:59:40.062269 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 9 01:59:40.062277 kernel: ACPI: FACS 0x00000000BFFE0000 000040 May 9 01:59:40.062285 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 9 01:59:40.062293 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) 
May 9 01:59:40.062302 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc] May 9 01:59:40.062310 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48] May 9 01:59:40.062320 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] May 9 01:59:40.062328 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] May 9 01:59:40.062336 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] May 9 01:59:40.062347 kernel: No NUMA configuration found May 9 01:59:40.062356 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] May 9 01:59:40.062365 kernel: NODE_DATA(0) allocated [mem 0x13fff7000-0x13fffcfff] May 9 01:59:40.062373 kernel: Zone ranges: May 9 01:59:40.062384 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 9 01:59:40.062392 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 9 01:59:40.062401 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] May 9 01:59:40.062410 kernel: Movable zone start for each node May 9 01:59:40.062418 kernel: Early memory node ranges May 9 01:59:40.062427 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 9 01:59:40.062435 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] May 9 01:59:40.062444 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] May 9 01:59:40.062454 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] May 9 01:59:40.062463 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 9 01:59:40.062471 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 9 01:59:40.062480 kernel: On node 0, zone Normal: 35 pages in unavailable ranges May 9 01:59:40.062488 kernel: ACPI: PM-Timer IO Port: 0x608 May 9 01:59:40.062497 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 9 01:59:40.062506 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 9 01:59:40.062514 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 9 01:59:40.062523 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 9 01:59:40.062533 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 9 01:59:40.062542 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 9 01:59:40.062551 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 9 01:59:40.062559 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 9 01:59:40.062568 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs May 9 01:59:40.062577 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 9 01:59:40.062585 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices May 9 01:59:40.062594 kernel: Booting paravirtualized kernel on KVM May 9 01:59:40.062602 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 9 01:59:40.062613 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 9 01:59:40.062622 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576 May 9 01:59:40.062630 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152 May 9 01:59:40.062639 kernel: pcpu-alloc: [0] 0 1 May 9 01:59:40.062647 kernel: kvm-guest: PV spinlocks disabled, no host support May 9 01:59:40.062657 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro 
consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=6dbb211661f4d09f7718fdc7eab00f1550a8baafb68f4d2efdaedafa102351ae May 9 01:59:40.062667 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 9 01:59:40.062675 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 9 01:59:40.062686 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 9 01:59:40.062694 kernel: Fallback order for Node 0: 0 May 9 01:59:40.062703 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901 May 9 01:59:40.062712 kernel: Policy zone: Normal May 9 01:59:40.062720 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 9 01:59:40.062729 kernel: software IO TLB: area num 2. May 9 01:59:40.062738 kernel: Memory: 3962104K/4193772K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43604K init, 1468K bss, 231408K reserved, 0K cma-reserved) May 9 01:59:40.062747 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 9 01:59:40.062756 kernel: ftrace: allocating 37993 entries in 149 pages May 9 01:59:40.062766 kernel: ftrace: allocated 149 pages with 4 groups May 9 01:59:40.062775 kernel: Dynamic Preempt: voluntary May 9 01:59:40.062783 kernel: rcu: Preemptible hierarchical RCU implementation. May 9 01:59:40.062793 kernel: rcu: RCU event tracing is enabled. May 9 01:59:40.062802 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 9 01:59:40.062811 kernel: Trampoline variant of Tasks RCU enabled. May 9 01:59:40.062819 kernel: Rude variant of Tasks RCU enabled. May 9 01:59:40.062828 kernel: Tracing variant of Tasks RCU enabled. May 9 01:59:40.062837 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 9 01:59:40.062847 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 9 01:59:40.062856 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 May 9 01:59:40.062865 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 9 01:59:40.062873 kernel: Console: colour VGA+ 80x25 May 9 01:59:40.062882 kernel: printk: console [tty0] enabled May 9 01:59:40.062890 kernel: printk: console [ttyS0] enabled May 9 01:59:40.062899 kernel: ACPI: Core revision 20230628 May 9 01:59:40.062907 kernel: APIC: Switch to symmetric I/O mode setup May 9 01:59:40.062916 kernel: x2apic enabled May 9 01:59:40.062926 kernel: APIC: Switched APIC routing to: physical x2apic May 9 01:59:40.062935 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 9 01:59:40.062944 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized May 9 01:59:40.062952 kernel: Calibrating delay loop (skipped) preset value.. 
3992.49 BogoMIPS (lpj=1996249) May 9 01:59:40.062961 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 May 9 01:59:40.062970 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 May 9 01:59:40.062978 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 9 01:59:40.062987 kernel: Spectre V2 : Mitigation: Retpolines May 9 01:59:40.062995 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT May 9 01:59:40.063006 kernel: Speculative Store Bypass: Vulnerable May 9 01:59:40.063014 kernel: x86/fpu: x87 FPU will use FXSAVE May 9 01:59:40.063023 kernel: Freeing SMP alternatives memory: 32K May 9 01:59:40.063031 kernel: pid_max: default: 32768 minimum: 301 May 9 01:59:40.063046 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity May 9 01:59:40.063056 kernel: landlock: Up and running. May 9 01:59:40.063065 kernel: SELinux: Initializing. May 9 01:59:40.063074 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 9 01:59:40.063083 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 9 01:59:40.063092 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) May 9 01:59:40.063102 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 9 01:59:40.063111 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 9 01:59:40.063122 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 9 01:59:40.063131 kernel: Performance Events: AMD PMU driver. May 9 01:59:40.063140 kernel: ... version: 0 May 9 01:59:40.063149 kernel: ... bit width: 48 May 9 01:59:40.063158 kernel: ... generic registers: 4 May 9 01:59:40.063169 kernel: ... value mask: 0000ffffffffffff May 9 01:59:40.063178 kernel: ... max period: 00007fffffffffff May 9 01:59:40.063187 kernel: ... fixed-purpose events: 0 May 9 01:59:40.063196 kernel: ... event mask: 000000000000000f May 9 01:59:40.066229 kernel: signal: max sigframe size: 1440 May 9 01:59:40.066241 kernel: rcu: Hierarchical SRCU implementation. May 9 01:59:40.066251 kernel: rcu: Max phase no-delay instances is 400. May 9 01:59:40.066260 kernel: smp: Bringing up secondary CPUs ... May 9 01:59:40.066270 kernel: smpboot: x86: Booting SMP configuration: May 9 01:59:40.066283 kernel: .... 
node #0, CPUs: #1 May 9 01:59:40.066292 kernel: smp: Brought up 1 node, 2 CPUs May 9 01:59:40.066301 kernel: smpboot: Max logical packages: 2 May 9 01:59:40.066311 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) May 9 01:59:40.066320 kernel: devtmpfs: initialized May 9 01:59:40.066329 kernel: x86/mm: Memory block size: 128MB May 9 01:59:40.066339 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 9 01:59:40.066348 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 9 01:59:40.066357 kernel: pinctrl core: initialized pinctrl subsystem May 9 01:59:40.066369 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 9 01:59:40.066378 kernel: audit: initializing netlink subsys (disabled) May 9 01:59:40.066387 kernel: audit: type=2000 audit(1746755979.459:1): state=initialized audit_enabled=0 res=1 May 9 01:59:40.066396 kernel: thermal_sys: Registered thermal governor 'step_wise' May 9 01:59:40.066406 kernel: thermal_sys: Registered thermal governor 'user_space' May 9 01:59:40.066415 kernel: cpuidle: using governor menu May 9 01:59:40.066424 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 9 01:59:40.066433 kernel: dca service started, version 1.12.1 May 9 01:59:40.066442 kernel: PCI: Using configuration type 1 for base access May 9 01:59:40.066453 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. May 9 01:59:40.066463 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 9 01:59:40.066472 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 9 01:59:40.066481 kernel: ACPI: Added _OSI(Module Device) May 9 01:59:40.066490 kernel: ACPI: Added _OSI(Processor Device) May 9 01:59:40.066499 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 9 01:59:40.066509 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 9 01:59:40.066518 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 9 01:59:40.066527 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC May 9 01:59:40.066538 kernel: ACPI: Interpreter enabled May 9 01:59:40.066547 kernel: ACPI: PM: (supports S0 S3 S5) May 9 01:59:40.066556 kernel: ACPI: Using IOAPIC for interrupt routing May 9 01:59:40.066565 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 9 01:59:40.066575 kernel: PCI: Using E820 reservations for host bridge windows May 9 01:59:40.066584 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F May 9 01:59:40.066593 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 9 01:59:40.066744 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] May 9 01:59:40.066847 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] May 9 01:59:40.066940 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge May 9 01:59:40.066954 kernel: acpiphp: Slot [3] registered May 9 01:59:40.066964 kernel: acpiphp: Slot [4] registered May 9 01:59:40.066973 kernel: acpiphp: Slot [5] registered May 9 01:59:40.066982 kernel: acpiphp: Slot [6] registered May 9 01:59:40.066991 kernel: acpiphp: Slot [7] registered May 9 01:59:40.067000 kernel: acpiphp: Slot [8] registered May 9 01:59:40.067012 kernel: acpiphp: Slot [9] registered May 9 01:59:40.067021 kernel: acpiphp: Slot [10] registered May 9 01:59:40.067030 kernel: acpiphp: Slot [11] registered May 9 
01:59:40.067039 kernel: acpiphp: Slot [12] registered May 9 01:59:40.067048 kernel: acpiphp: Slot [13] registered May 9 01:59:40.067057 kernel: acpiphp: Slot [14] registered May 9 01:59:40.067066 kernel: acpiphp: Slot [15] registered May 9 01:59:40.067075 kernel: acpiphp: Slot [16] registered May 9 01:59:40.067084 kernel: acpiphp: Slot [17] registered May 9 01:59:40.067093 kernel: acpiphp: Slot [18] registered May 9 01:59:40.067104 kernel: acpiphp: Slot [19] registered May 9 01:59:40.067113 kernel: acpiphp: Slot [20] registered May 9 01:59:40.067122 kernel: acpiphp: Slot [21] registered May 9 01:59:40.067131 kernel: acpiphp: Slot [22] registered May 9 01:59:40.067140 kernel: acpiphp: Slot [23] registered May 9 01:59:40.067149 kernel: acpiphp: Slot [24] registered May 9 01:59:40.067158 kernel: acpiphp: Slot [25] registered May 9 01:59:40.067167 kernel: acpiphp: Slot [26] registered May 9 01:59:40.067176 kernel: acpiphp: Slot [27] registered May 9 01:59:40.067187 kernel: acpiphp: Slot [28] registered May 9 01:59:40.067196 kernel: acpiphp: Slot [29] registered May 9 01:59:40.067223 kernel: acpiphp: Slot [30] registered May 9 01:59:40.067232 kernel: acpiphp: Slot [31] registered May 9 01:59:40.067242 kernel: PCI host bridge to bus 0000:00 May 9 01:59:40.067341 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 9 01:59:40.067427 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] May 9 01:59:40.067512 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 9 01:59:40.067600 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 9 01:59:40.067683 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] May 9 01:59:40.067765 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 9 01:59:40.067894 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 May 9 01:59:40.068010 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 May 9 01:59:40.068120 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 May 9 01:59:40.070279 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f] May 9 01:59:40.070395 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] May 9 01:59:40.070498 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] May 9 01:59:40.070600 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] May 9 01:59:40.070697 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] May 9 01:59:40.070800 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 May 9 01:59:40.070896 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI May 9 01:59:40.070997 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB May 9 01:59:40.071102 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 May 9 01:59:40.071237 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref] May 9 01:59:40.071340 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref] May 9 01:59:40.071434 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff] May 9 01:59:40.071532 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref] May 9 01:59:40.071634 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 9 01:59:40.071751 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 May 9 01:59:40.071875 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf] May 9 01:59:40.071979 kernel: pci 0000:00:03.0: 
reg 0x14: [mem 0xfeb91000-0xfeb91fff] May 9 01:59:40.072082 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref] May 9 01:59:40.072184 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref] May 9 01:59:40.073344 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 May 9 01:59:40.073452 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f] May 9 01:59:40.073560 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff] May 9 01:59:40.073661 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref] May 9 01:59:40.073761 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 May 9 01:59:40.073865 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff] May 9 01:59:40.073967 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref] May 9 01:59:40.074077 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 May 9 01:59:40.074180 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f] May 9 01:59:40.074310 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff] May 9 01:59:40.074414 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref] May 9 01:59:40.074430 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 9 01:59:40.074440 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 9 01:59:40.074451 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 9 01:59:40.074461 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 9 01:59:40.074471 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 May 9 01:59:40.074481 kernel: iommu: Default domain type: Translated May 9 01:59:40.074495 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 9 01:59:40.074506 kernel: PCI: Using ACPI for IRQ routing May 9 01:59:40.074516 kernel: PCI: pci_cache_line_size set to 64 bytes May 9 01:59:40.074525 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 9 01:59:40.074535 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] May 9 01:59:40.074638 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device May 9 01:59:40.074740 kernel: pci 0000:00:02.0: vgaarb: bridge control possible May 9 01:59:40.074845 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 9 01:59:40.074860 kernel: vgaarb: loaded May 9 01:59:40.074874 kernel: clocksource: Switched to clocksource kvm-clock May 9 01:59:40.074884 kernel: VFS: Disk quotas dquot_6.6.0 May 9 01:59:40.074894 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 9 01:59:40.074905 kernel: pnp: PnP ACPI init May 9 01:59:40.075015 kernel: pnp 00:03: [dma 2] May 9 01:59:40.075032 kernel: pnp: PnP ACPI: found 5 devices May 9 01:59:40.075043 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 9 01:59:40.075053 kernel: NET: Registered PF_INET protocol family May 9 01:59:40.075066 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 9 01:59:40.075076 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 9 01:59:40.075087 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 9 01:59:40.075097 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 9 01:59:40.075107 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 9 01:59:40.075117 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 
9 01:59:40.075127 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 9 01:59:40.075138 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 9 01:59:40.075148 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 9 01:59:40.075160 kernel: NET: Registered PF_XDP protocol family May 9 01:59:40.078272 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 9 01:59:40.078360 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 9 01:59:40.078444 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 9 01:59:40.078525 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window] May 9 01:59:40.078607 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] May 9 01:59:40.078703 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release May 9 01:59:40.078799 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 9 01:59:40.078818 kernel: PCI: CLS 0 bytes, default 64 May 9 01:59:40.078827 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 9 01:59:40.078837 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) May 9 01:59:40.078846 kernel: Initialise system trusted keyrings May 9 01:59:40.078856 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 9 01:59:40.078865 kernel: Key type asymmetric registered May 9 01:59:40.078874 kernel: Asymmetric key parser 'x509' registered May 9 01:59:40.078883 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) May 9 01:59:40.078894 kernel: io scheduler mq-deadline registered May 9 01:59:40.078903 kernel: io scheduler kyber registered May 9 01:59:40.078913 kernel: io scheduler bfq registered May 9 01:59:40.078922 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 9 01:59:40.078932 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 May 9 01:59:40.078941 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 May 9 01:59:40.078951 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 May 9 01:59:40.078960 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 May 9 01:59:40.078969 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 9 01:59:40.078979 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 9 01:59:40.078990 kernel: random: crng init done May 9 01:59:40.078999 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 9 01:59:40.079008 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 9 01:59:40.079017 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 9 01:59:40.079109 kernel: rtc_cmos 00:04: RTC can wake from S4 May 9 01:59:40.079124 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 9 01:59:40.079227 kernel: rtc_cmos 00:04: registered as rtc0 May 9 01:59:40.079318 kernel: rtc_cmos 00:04: setting system clock to 2025-05-09T01:59:39 UTC (1746755979) May 9 01:59:40.079410 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram May 9 01:59:40.079425 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 9 01:59:40.079434 kernel: NET: Registered PF_INET6 protocol family May 9 01:59:40.079443 kernel: Segment Routing with IPv6 May 9 01:59:40.079452 kernel: In-situ OAM (IOAM) with IPv6 May 9 01:59:40.079462 kernel: NET: Registered PF_PACKET protocol family May 9 01:59:40.079471 kernel: Key type dns_resolver registered May 9 01:59:40.079480 kernel: IPI shorthand broadcast: enabled May 9 01:59:40.079489 kernel: 
sched_clock: Marking stable (989009071, 166639110)->(1185445583, -29797402) May 9 01:59:40.079502 kernel: registered taskstats version 1 May 9 01:59:40.079511 kernel: Loading compiled-in X.509 certificates May 9 01:59:40.079520 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 247aefc84589d8961003173d18a9b4daf28f7c9e' May 9 01:59:40.079529 kernel: Key type .fscrypt registered May 9 01:59:40.079538 kernel: Key type fscrypt-provisioning registered May 9 01:59:40.079547 kernel: ima: No TPM chip found, activating TPM-bypass! May 9 01:59:40.079557 kernel: ima: Allocated hash algorithm: sha1 May 9 01:59:40.079566 kernel: ima: No architecture policies found May 9 01:59:40.079576 kernel: clk: Disabling unused clocks May 9 01:59:40.079586 kernel: Freeing unused kernel image (initmem) memory: 43604K May 9 01:59:40.079595 kernel: Write protecting the kernel read-only data: 40960k May 9 01:59:40.079604 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K May 9 01:59:40.079613 kernel: Run /init as init process May 9 01:59:40.079622 kernel: with arguments: May 9 01:59:40.079631 kernel: /init May 9 01:59:40.079640 kernel: with environment: May 9 01:59:40.079649 kernel: HOME=/ May 9 01:59:40.079658 kernel: TERM=linux May 9 01:59:40.079668 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 9 01:59:40.079678 systemd[1]: Successfully made /usr/ read-only. May 9 01:59:40.079692 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 9 01:59:40.079702 systemd[1]: Detected virtualization kvm. May 9 01:59:40.079712 systemd[1]: Detected architecture x86-64. May 9 01:59:40.079722 systemd[1]: Running in initrd. May 9 01:59:40.079733 systemd[1]: No hostname configured, using default hostname. May 9 01:59:40.079743 systemd[1]: Hostname set to . May 9 01:59:40.079753 systemd[1]: Initializing machine ID from VM UUID. May 9 01:59:40.079763 systemd[1]: Queued start job for default target initrd.target. May 9 01:59:40.079772 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 9 01:59:40.079782 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 9 01:59:40.079793 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 9 01:59:40.079823 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 9 01:59:40.079835 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 9 01:59:40.079846 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 9 01:59:40.079857 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 9 01:59:40.079868 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 9 01:59:40.079881 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 9 01:59:40.079894 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 9 01:59:40.079905 systemd[1]: Reached target paths.target - Path Units. 
May 9 01:59:40.079915 systemd[1]: Reached target slices.target - Slice Units. May 9 01:59:40.079926 systemd[1]: Reached target swap.target - Swaps. May 9 01:59:40.079937 systemd[1]: Reached target timers.target - Timer Units. May 9 01:59:40.079948 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 9 01:59:40.079959 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 9 01:59:40.079970 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 9 01:59:40.079981 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 9 01:59:40.079993 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 9 01:59:40.080004 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 9 01:59:40.080015 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 9 01:59:40.080026 systemd[1]: Reached target sockets.target - Socket Units. May 9 01:59:40.080037 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 9 01:59:40.080048 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 9 01:59:40.080059 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 9 01:59:40.080071 systemd[1]: Starting systemd-fsck-usr.service... May 9 01:59:40.080084 systemd[1]: Starting systemd-journald.service - Journal Service... May 9 01:59:40.080095 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 9 01:59:40.080106 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 9 01:59:40.080118 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 9 01:59:40.080130 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 9 01:59:40.080148 systemd[1]: Finished systemd-fsck-usr.service. May 9 01:59:40.080168 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 9 01:59:40.085401 systemd-journald[185]: Collecting audit messages is disabled. May 9 01:59:40.085437 systemd-journald[185]: Journal started May 9 01:59:40.085461 systemd-journald[185]: Runtime Journal (/run/log/journal/19b35dc7344b4cdc879f60c33bdcb8e5) is 8M, max 78.2M, 70.2M free. May 9 01:59:40.056752 systemd-modules-load[186]: Inserted module 'overlay' May 9 01:59:40.107410 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 9 01:59:40.107431 kernel: Bridge firewalling registered May 9 01:59:40.098557 systemd-modules-load[186]: Inserted module 'br_netfilter' May 9 01:59:40.117223 systemd[1]: Started systemd-journald.service - Journal Service. May 9 01:59:40.117915 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 9 01:59:40.118636 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 9 01:59:40.120502 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 9 01:59:40.123791 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 9 01:59:40.126330 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 9 01:59:40.127431 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
May 9 01:59:40.137358 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 9 01:59:40.142051 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 9 01:59:40.143772 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 9 01:59:40.153420 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 9 01:59:40.156338 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 9 01:59:40.157868 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 9 01:59:40.162313 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 9 01:59:40.186560 dracut-cmdline[219]: dracut-dracut-053 May 9 01:59:40.189269 dracut-cmdline[219]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=6dbb211661f4d09f7718fdc7eab00f1550a8baafb68f4d2efdaedafa102351ae May 9 01:59:40.210546 systemd-resolved[221]: Positive Trust Anchors: May 9 01:59:40.211285 systemd-resolved[221]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 9 01:59:40.211328 systemd-resolved[221]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 9 01:59:40.217526 systemd-resolved[221]: Defaulting to hostname 'linux'. May 9 01:59:40.218513 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 9 01:59:40.220788 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 9 01:59:40.251250 kernel: SCSI subsystem initialized May 9 01:59:40.262247 kernel: Loading iSCSI transport class v2.0-870. May 9 01:59:40.274267 kernel: iscsi: registered transport (tcp) May 9 01:59:40.296298 kernel: iscsi: registered transport (qla4xxx) May 9 01:59:40.296370 kernel: QLogic iSCSI HBA Driver May 9 01:59:40.337087 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 9 01:59:40.339260 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 9 01:59:40.395992 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 9 01:59:40.396127 kernel: device-mapper: uevent: version 1.0.3 May 9 01:59:40.398932 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 9 01:59:40.459428 kernel: raid6: sse2x4 gen() 5460 MB/s May 9 01:59:40.477281 kernel: raid6: sse2x2 gen() 8025 MB/s May 9 01:59:40.496357 kernel: raid6: sse2x1 gen() 9725 MB/s May 9 01:59:40.496430 kernel: raid6: using algorithm sse2x1 gen() 9725 MB/s May 9 01:59:40.516286 kernel: raid6: .... 
xor() 4519 MB/s, rmw enabled May 9 01:59:40.516345 kernel: raid6: using ssse3x2 recovery algorithm May 9 01:59:40.542578 kernel: xor: measuring software checksum speed May 9 01:59:40.542661 kernel: prefetch64-sse : 17099 MB/sec May 9 01:59:40.543948 kernel: generic_sse : 15638 MB/sec May 9 01:59:40.544007 kernel: xor: using function: prefetch64-sse (17099 MB/sec) May 9 01:59:40.719275 kernel: Btrfs loaded, zoned=no, fsverity=no May 9 01:59:40.732363 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 9 01:59:40.737586 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 9 01:59:40.761180 systemd-udevd[404]: Using default interface naming scheme 'v255'. May 9 01:59:40.766162 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 9 01:59:40.772434 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 9 01:59:40.794841 dracut-pre-trigger[418]: rd.md=0: removing MD RAID activation May 9 01:59:40.830381 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 9 01:59:40.834681 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 9 01:59:40.897796 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 9 01:59:40.901717 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 9 01:59:40.939717 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 9 01:59:40.940791 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 9 01:59:40.943537 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 9 01:59:40.945601 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 9 01:59:40.947863 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 9 01:59:40.975462 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 9 01:59:41.013312 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues May 9 01:59:41.025291 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB) May 9 01:59:41.027027 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 9 01:59:41.027625 kernel: libata version 3.00 loaded. May 9 01:59:41.027790 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 9 01:59:41.030337 kernel: ata_piix 0000:00:01.1: version 2.13 May 9 01:59:41.030506 kernel: scsi host0: ata_piix May 9 01:59:41.030629 kernel: scsi host1: ata_piix May 9 01:59:41.030743 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 May 9 01:59:41.030757 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 May 9 01:59:41.029335 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 9 01:59:41.041440 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 9 01:59:41.041627 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 9 01:59:41.043044 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 9 01:59:41.045367 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 9 01:59:41.047919 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 9 01:59:41.060387 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. 
May 9 01:59:41.060417 kernel: GPT:17805311 != 20971519 May 9 01:59:41.060432 kernel: GPT:Alternate GPT header not at the end of the disk. May 9 01:59:41.060445 kernel: GPT:17805311 != 20971519 May 9 01:59:41.060456 kernel: GPT: Use GNU Parted to correct GPT errors. May 9 01:59:41.060468 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 9 01:59:41.114705 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 9 01:59:41.116607 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 9 01:59:41.141811 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 9 01:59:41.237781 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (451) May 9 01:59:41.246255 kernel: BTRFS: device fsid d4537cc2-bda5-4424-8730-1f8e8c76a79a devid 1 transid 42 /dev/vda3 scanned by (udev-worker) (453) May 9 01:59:41.279109 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 9 01:59:41.290303 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 9 01:59:41.306644 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 9 01:59:41.307240 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 9 01:59:41.318646 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 9 01:59:41.322316 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 9 01:59:41.343561 disk-uuid[514]: Primary Header is updated. May 9 01:59:41.343561 disk-uuid[514]: Secondary Entries is updated. May 9 01:59:41.343561 disk-uuid[514]: Secondary Header is updated. May 9 01:59:41.355225 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 9 01:59:42.370248 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 9 01:59:42.373242 disk-uuid[515]: The operation has completed successfully. May 9 01:59:42.425214 systemd[1]: disk-uuid.service: Deactivated successfully. May 9 01:59:42.425321 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 9 01:59:42.472419 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 9 01:59:42.497319 sh[527]: Success May 9 01:59:42.513224 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3" May 9 01:59:42.591469 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 9 01:59:42.597370 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 9 01:59:42.611094 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 9 01:59:42.641234 kernel: BTRFS info (device dm-0): first mount of filesystem d4537cc2-bda5-4424-8730-1f8e8c76a79a May 9 01:59:42.641323 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 9 01:59:42.643559 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 9 01:59:42.647138 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 9 01:59:42.647238 kernel: BTRFS info (device dm-0): using free space tree May 9 01:59:42.666886 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 9 01:59:42.669334 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. 
May 9 01:59:42.670753 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 9 01:59:42.673407 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 9 01:59:42.706140 kernel: BTRFS info (device vda6): first mount of filesystem 2d988641-706e-44d5-976c-175654fd684c May 9 01:59:42.706187 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 9 01:59:42.706213 kernel: BTRFS info (device vda6): using free space tree May 9 01:59:42.713244 kernel: BTRFS info (device vda6): auto enabling async discard May 9 01:59:42.726226 kernel: BTRFS info (device vda6): last unmount of filesystem 2d988641-706e-44d5-976c-175654fd684c May 9 01:59:42.740667 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 9 01:59:42.744406 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 9 01:59:42.841547 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 9 01:59:42.849731 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 9 01:59:42.886818 systemd-networkd[706]: lo: Link UP May 9 01:59:42.887295 systemd-networkd[706]: lo: Gained carrier May 9 01:59:42.888704 systemd-networkd[706]: Enumeration completed May 9 01:59:42.890423 systemd[1]: Started systemd-networkd.service - Network Configuration. May 9 01:59:42.891131 systemd[1]: Reached target network.target - Network. May 9 01:59:42.891470 systemd-networkd[706]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 9 01:59:42.891474 systemd-networkd[706]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 9 01:59:42.892436 systemd-networkd[706]: eth0: Link UP May 9 01:59:42.892440 systemd-networkd[706]: eth0: Gained carrier May 9 01:59:42.892450 systemd-networkd[706]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 9 01:59:42.914339 systemd-networkd[706]: eth0: DHCPv4 address 172.24.4.122/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 9 01:59:42.932104 ignition[639]: Ignition 2.20.0 May 9 01:59:42.932116 ignition[639]: Stage: fetch-offline May 9 01:59:42.932153 ignition[639]: no configs at "/usr/lib/ignition/base.d" May 9 01:59:42.934044 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 9 01:59:42.932165 ignition[639]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 9 01:59:42.932309 ignition[639]: parsed url from cmdline: "" May 9 01:59:42.936340 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
May 9 01:59:42.932313 ignition[639]: no config URL provided May 9 01:59:42.932320 ignition[639]: reading system config file "/usr/lib/ignition/user.ign" May 9 01:59:42.932330 ignition[639]: no config at "/usr/lib/ignition/user.ign" May 9 01:59:42.932335 ignition[639]: failed to fetch config: resource requires networking May 9 01:59:42.932533 ignition[639]: Ignition finished successfully May 9 01:59:42.972303 ignition[717]: Ignition 2.20.0 May 9 01:59:42.972323 ignition[717]: Stage: fetch May 9 01:59:42.972595 ignition[717]: no configs at "/usr/lib/ignition/base.d" May 9 01:59:42.972614 ignition[717]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 9 01:59:42.972780 ignition[717]: parsed url from cmdline: "" May 9 01:59:42.972787 ignition[717]: no config URL provided May 9 01:59:42.972798 ignition[717]: reading system config file "/usr/lib/ignition/user.ign" May 9 01:59:42.972813 ignition[717]: no config at "/usr/lib/ignition/user.ign" May 9 01:59:42.972896 ignition[717]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... May 9 01:59:42.972922 ignition[717]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... May 9 01:59:42.972991 ignition[717]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 May 9 01:59:43.245331 ignition[717]: GET result: OK May 9 01:59:43.245525 ignition[717]: parsing config with SHA512: 6437bbd5cc150150fc3255990dd7f2857542c582f479a409dbdd0ce39b6c9d099f34ac226a5506a5b52739a4efa7d1182876a3fa380af6f72b04a08db24f3d79 May 9 01:59:43.260811 unknown[717]: fetched base config from "system" May 9 01:59:43.260851 unknown[717]: fetched base config from "system" May 9 01:59:43.262744 ignition[717]: fetch: fetch complete May 9 01:59:43.260867 unknown[717]: fetched user config from "openstack" May 9 01:59:43.262769 ignition[717]: fetch: fetch passed May 9 01:59:43.266006 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 9 01:59:43.262867 ignition[717]: Ignition finished successfully May 9 01:59:43.271477 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 9 01:59:43.316956 ignition[724]: Ignition 2.20.0 May 9 01:59:43.316990 ignition[724]: Stage: kargs May 9 01:59:43.317404 ignition[724]: no configs at "/usr/lib/ignition/base.d" May 9 01:59:43.317431 ignition[724]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 9 01:59:43.321672 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 9 01:59:43.319736 ignition[724]: kargs: kargs passed May 9 01:59:43.319870 ignition[724]: Ignition finished successfully May 9 01:59:43.327507 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 9 01:59:43.366294 ignition[730]: Ignition 2.20.0 May 9 01:59:43.367404 ignition[730]: Stage: disks May 9 01:59:43.367822 ignition[730]: no configs at "/usr/lib/ignition/base.d" May 9 01:59:43.367852 ignition[730]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 9 01:59:43.374956 ignition[730]: disks: disks passed May 9 01:59:43.375055 ignition[730]: Ignition finished successfully May 9 01:59:43.377758 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 9 01:59:43.379507 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 9 01:59:43.381795 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 9 01:59:43.384761 systemd[1]: Reached target local-fs.target - Local File Systems. May 9 01:59:43.387692 systemd[1]: Reached target sysinit.target - System Initialization. 
May 9 01:59:43.390165 systemd[1]: Reached target basic.target - Basic System. May 9 01:59:43.394834 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 9 01:59:43.447549 systemd-fsck[739]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks May 9 01:59:43.457105 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 9 01:59:43.462177 systemd[1]: Mounting sysroot.mount - /sysroot... May 9 01:59:43.611230 kernel: EXT4-fs (vda9): mounted filesystem 0829e1d9-eacd-4a94-9591-6f579c115eeb r/w with ordered data mode. Quota mode: none. May 9 01:59:43.612236 systemd[1]: Mounted sysroot.mount - /sysroot. May 9 01:59:43.613605 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 9 01:59:43.617047 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 9 01:59:43.619293 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 9 01:59:43.619995 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 9 01:59:43.624479 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... May 9 01:59:43.625434 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 9 01:59:43.625468 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 9 01:59:43.633066 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 9 01:59:43.635322 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 9 01:59:43.650249 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (747) May 9 01:59:43.661657 kernel: BTRFS info (device vda6): first mount of filesystem 2d988641-706e-44d5-976c-175654fd684c May 9 01:59:43.661683 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 9 01:59:43.661695 kernel: BTRFS info (device vda6): using free space tree May 9 01:59:43.667222 kernel: BTRFS info (device vda6): auto enabling async discard May 9 01:59:43.669217 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 9 01:59:43.780854 initrd-setup-root[776]: cut: /sysroot/etc/passwd: No such file or directory May 9 01:59:43.789949 initrd-setup-root[783]: cut: /sysroot/etc/group: No such file or directory May 9 01:59:43.794693 initrd-setup-root[790]: cut: /sysroot/etc/shadow: No such file or directory May 9 01:59:43.804660 initrd-setup-root[797]: cut: /sysroot/etc/gshadow: No such file or directory May 9 01:59:43.931392 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 9 01:59:43.934579 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 9 01:59:43.938385 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 9 01:59:43.961397 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 9 01:59:43.965914 kernel: BTRFS info (device vda6): last unmount of filesystem 2d988641-706e-44d5-976c-175654fd684c May 9 01:59:43.998630 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
May 9 01:59:44.008789 ignition[865]: INFO : Ignition 2.20.0 May 9 01:59:44.008789 ignition[865]: INFO : Stage: mount May 9 01:59:44.010094 ignition[865]: INFO : no configs at "/usr/lib/ignition/base.d" May 9 01:59:44.010094 ignition[865]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 9 01:59:44.011630 ignition[865]: INFO : mount: mount passed May 9 01:59:44.011630 ignition[865]: INFO : Ignition finished successfully May 9 01:59:44.012335 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 9 01:59:44.588501 systemd-networkd[706]: eth0: Gained IPv6LL May 9 01:59:50.827522 coreos-metadata[749]: May 09 01:59:50.827 WARN failed to locate config-drive, using the metadata service API instead May 9 01:59:50.868839 coreos-metadata[749]: May 09 01:59:50.868 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 9 01:59:50.884257 coreos-metadata[749]: May 09 01:59:50.884 INFO Fetch successful May 9 01:59:50.885688 coreos-metadata[749]: May 09 01:59:50.885 INFO wrote hostname ci-4284-0-0-n-abffc5acbe.novalocal to /sysroot/etc/hostname May 9 01:59:50.888576 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. May 9 01:59:50.888785 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. May 9 01:59:50.896458 systemd[1]: Starting ignition-files.service - Ignition (files)... May 9 01:59:50.921635 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 9 01:59:50.952296 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (881) May 9 01:59:50.958265 kernel: BTRFS info (device vda6): first mount of filesystem 2d988641-706e-44d5-976c-175654fd684c May 9 01:59:50.967275 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 9 01:59:50.967366 kernel: BTRFS info (device vda6): using free space tree May 9 01:59:50.977340 kernel: BTRFS info (device vda6): auto enabling async discard May 9 01:59:50.983154 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 9 01:59:51.039433 ignition[899]: INFO : Ignition 2.20.0 May 9 01:59:51.039433 ignition[899]: INFO : Stage: files May 9 01:59:51.042190 ignition[899]: INFO : no configs at "/usr/lib/ignition/base.d" May 9 01:59:51.042190 ignition[899]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 9 01:59:51.042190 ignition[899]: DEBUG : files: compiled without relabeling support, skipping May 9 01:59:51.047549 ignition[899]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 9 01:59:51.047549 ignition[899]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 9 01:59:51.051227 ignition[899]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 9 01:59:51.051227 ignition[899]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 9 01:59:51.051227 ignition[899]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 9 01:59:51.051013 unknown[899]: wrote ssh authorized keys file for user: core May 9 01:59:51.058285 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 9 01:59:51.058285 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 9 01:59:51.119870 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 9 01:59:51.446559 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 9 01:59:51.446559 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 9 01:59:51.446559 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 9 01:59:51.446559 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 9 01:59:51.455472 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 May 9 01:59:52.215640 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 9 01:59:55.021236 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 9 01:59:55.021236 ignition[899]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 9 01:59:55.024302 ignition[899]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 9 01:59:55.027125 ignition[899]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 9 01:59:55.027125 ignition[899]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 9 01:59:55.027125 ignition[899]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 9 01:59:55.027125 ignition[899]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 9 01:59:55.027125 ignition[899]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 9 01:59:55.027125 ignition[899]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 9 01:59:55.027125 ignition[899]: INFO : files: files passed May 9 01:59:55.027125 ignition[899]: INFO : Ignition finished successfully May 9 01:59:55.025858 systemd[1]: Finished ignition-files.service - Ignition (files). May 9 01:59:55.031324 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 9 01:59:55.035304 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 9 01:59:55.044613 systemd[1]: ignition-quench.service: Deactivated successfully. May 9 01:59:55.044699 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 9 01:59:55.053857 initrd-setup-root-after-ignition[929]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 9 01:59:55.053857 initrd-setup-root-after-ignition[929]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 9 01:59:55.056614 initrd-setup-root-after-ignition[933]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 9 01:59:55.059687 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 9 01:59:55.060526 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 9 01:59:55.064322 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 9 01:59:55.112906 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 9 01:59:55.113110 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 9 01:59:55.115344 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
May 9 01:59:55.126530 systemd[1]: Reached target initrd.target - Initrd Default Target. May 9 01:59:55.127108 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 9 01:59:55.130456 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 9 01:59:55.172364 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 9 01:59:55.177543 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 9 01:59:55.219278 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 9 01:59:55.222558 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 9 01:59:55.224358 systemd[1]: Stopped target timers.target - Timer Units. May 9 01:59:55.227145 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 9 01:59:55.227479 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 9 01:59:55.230521 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 9 01:59:55.232360 systemd[1]: Stopped target basic.target - Basic System. May 9 01:59:55.235244 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 9 01:59:55.237779 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 9 01:59:55.240357 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 9 01:59:55.243258 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 9 01:59:55.246132 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 9 01:59:55.249241 systemd[1]: Stopped target sysinit.target - System Initialization. May 9 01:59:55.252047 systemd[1]: Stopped target local-fs.target - Local File Systems. May 9 01:59:55.254999 systemd[1]: Stopped target swap.target - Swaps. May 9 01:59:55.257621 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 9 01:59:55.257908 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 9 01:59:55.260964 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 9 01:59:55.262907 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 9 01:59:55.265373 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 9 01:59:55.265630 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 9 01:59:55.268368 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 9 01:59:55.268760 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 9 01:59:55.272430 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 9 01:59:55.272821 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 9 01:59:55.275795 systemd[1]: ignition-files.service: Deactivated successfully. May 9 01:59:55.276075 systemd[1]: Stopped ignition-files.service - Ignition (files). May 9 01:59:55.282652 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 9 01:59:55.284409 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 9 01:59:55.286417 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 9 01:59:55.293954 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 9 01:59:55.295337 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
May 9 01:59:55.296801 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 9 01:59:55.298620 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 9 01:59:55.299009 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 9 01:59:55.310844 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 9 01:59:55.311698 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 9 01:59:55.321491 ignition[953]: INFO : Ignition 2.20.0 May 9 01:59:55.321491 ignition[953]: INFO : Stage: umount May 9 01:59:55.323524 ignition[953]: INFO : no configs at "/usr/lib/ignition/base.d" May 9 01:59:55.323524 ignition[953]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 9 01:59:55.323524 ignition[953]: INFO : umount: umount passed May 9 01:59:55.323524 ignition[953]: INFO : Ignition finished successfully May 9 01:59:55.324269 systemd[1]: ignition-mount.service: Deactivated successfully. May 9 01:59:55.324359 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 9 01:59:55.325977 systemd[1]: ignition-disks.service: Deactivated successfully. May 9 01:59:55.326049 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 9 01:59:55.326643 systemd[1]: ignition-kargs.service: Deactivated successfully. May 9 01:59:55.326699 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 9 01:59:55.327651 systemd[1]: ignition-fetch.service: Deactivated successfully. May 9 01:59:55.327693 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 9 01:59:55.329524 systemd[1]: Stopped target network.target - Network. May 9 01:59:55.330452 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 9 01:59:55.330499 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 9 01:59:55.331484 systemd[1]: Stopped target paths.target - Path Units. May 9 01:59:55.333494 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 9 01:59:55.339264 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 9 01:59:55.339834 systemd[1]: Stopped target slices.target - Slice Units. May 9 01:59:55.341023 systemd[1]: Stopped target sockets.target - Socket Units. May 9 01:59:55.342382 systemd[1]: iscsid.socket: Deactivated successfully. May 9 01:59:55.342417 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 9 01:59:55.343407 systemd[1]: iscsiuio.socket: Deactivated successfully. May 9 01:59:55.343449 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 9 01:59:55.344768 systemd[1]: ignition-setup.service: Deactivated successfully. May 9 01:59:55.344812 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 9 01:59:55.345310 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 9 01:59:55.345351 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 9 01:59:55.347275 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 9 01:59:55.348452 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 9 01:59:55.351847 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 9 01:59:55.352428 systemd[1]: systemd-resolved.service: Deactivated successfully. May 9 01:59:55.352523 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. 
May 9 01:59:55.356265 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 9 01:59:55.356517 systemd[1]: systemd-networkd.service: Deactivated successfully. May 9 01:59:55.356621 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 9 01:59:55.358558 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 9 01:59:55.358768 systemd[1]: sysroot-boot.service: Deactivated successfully. May 9 01:59:55.358863 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 9 01:59:55.360979 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 9 01:59:55.361389 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 9 01:59:55.362518 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 9 01:59:55.362563 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 9 01:59:55.366267 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 9 01:59:55.369089 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 9 01:59:55.369144 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 9 01:59:55.369698 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 9 01:59:55.369740 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 9 01:59:55.370882 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 9 01:59:55.370924 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 9 01:59:55.371919 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 9 01:59:55.371959 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 9 01:59:55.373305 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 9 01:59:55.375085 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 9 01:59:55.375143 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 9 01:59:55.386915 systemd[1]: systemd-udevd.service: Deactivated successfully. May 9 01:59:55.387182 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 9 01:59:55.389402 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 9 01:59:55.389499 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 9 01:59:55.390389 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 9 01:59:55.390423 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 9 01:59:55.391577 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 9 01:59:55.391632 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 9 01:59:55.393437 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 9 01:59:55.393486 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 9 01:59:55.394626 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 9 01:59:55.394675 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 9 01:59:55.397318 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 9 01:59:55.397908 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. 
May 9 01:59:55.397958 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 9 01:59:55.400398 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 9 01:59:55.400457 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 9 01:59:55.404542 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 9 01:59:55.404608 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 9 01:59:55.409410 systemd[1]: network-cleanup.service: Deactivated successfully. May 9 01:59:55.409558 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 9 01:59:55.414686 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 9 01:59:55.414795 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 9 01:59:55.416258 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 9 01:59:55.418444 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 9 01:59:55.436946 systemd[1]: Switching root. May 9 01:59:55.470836 systemd-journald[185]: Journal stopped May 9 01:59:57.211048 systemd-journald[185]: Received SIGTERM from PID 1 (systemd). May 9 01:59:57.211121 kernel: SELinux: policy capability network_peer_controls=1 May 9 01:59:57.211141 kernel: SELinux: policy capability open_perms=1 May 9 01:59:57.211155 kernel: SELinux: policy capability extended_socket_class=1 May 9 01:59:57.211170 kernel: SELinux: policy capability always_check_network=0 May 9 01:59:57.211182 kernel: SELinux: policy capability cgroup_seclabel=1 May 9 01:59:57.211196 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 9 01:59:57.213036 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 9 01:59:57.213051 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 9 01:59:57.213064 kernel: audit: type=1403 audit(1746755996.097:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 9 01:59:57.213079 systemd[1]: Successfully loaded SELinux policy in 80.465ms. May 9 01:59:57.213106 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 27.399ms. May 9 01:59:57.213124 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 9 01:59:57.213138 systemd[1]: Detected virtualization kvm. May 9 01:59:57.213151 systemd[1]: Detected architecture x86-64. May 9 01:59:57.213167 systemd[1]: Detected first boot. May 9 01:59:57.213181 systemd[1]: Hostname set to . May 9 01:59:57.213194 systemd[1]: Initializing machine ID from VM UUID. May 9 01:59:57.213227 zram_generator::config[1000]: No configuration found. May 9 01:59:57.213242 kernel: Guest personality initialized and is inactive May 9 01:59:57.213256 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 9 01:59:57.213268 kernel: Initialized host personality May 9 01:59:57.213281 kernel: NET: Registered PF_VSOCK protocol family May 9 01:59:57.213297 systemd[1]: Populated /etc with preset unit settings. May 9 01:59:57.213312 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 9 01:59:57.213326 systemd[1]: initrd-switch-root.service: Deactivated successfully. 
May 9 01:59:57.213339 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 9 01:59:57.213352 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 9 01:59:57.213366 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 9 01:59:57.213380 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 9 01:59:57.213393 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 9 01:59:57.213411 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 9 01:59:57.213428 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 9 01:59:57.213441 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 9 01:59:57.213455 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 9 01:59:57.213468 systemd[1]: Created slice user.slice - User and Session Slice. May 9 01:59:57.213482 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 9 01:59:57.213496 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 9 01:59:57.213509 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 9 01:59:57.213522 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 9 01:59:57.213539 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 9 01:59:57.213553 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 9 01:59:57.213567 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 9 01:59:57.213580 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 9 01:59:57.213594 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 9 01:59:57.213607 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 9 01:59:57.213620 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 9 01:59:57.213636 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 9 01:59:57.213650 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 9 01:59:57.213664 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 9 01:59:57.213677 systemd[1]: Reached target slices.target - Slice Units. May 9 01:59:57.213691 systemd[1]: Reached target swap.target - Swaps. May 9 01:59:57.213704 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 9 01:59:57.213718 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 9 01:59:57.213732 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 9 01:59:57.213745 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 9 01:59:57.213761 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 9 01:59:57.213775 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 9 01:59:57.213788 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 9 01:59:57.213803 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 9 01:59:57.213817 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... 
May 9 01:59:57.213830 systemd[1]: Mounting media.mount - External Media Directory... May 9 01:59:57.213844 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 9 01:59:57.213857 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 9 01:59:57.213871 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 9 01:59:57.213886 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 9 01:59:57.213901 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 9 01:59:57.213914 systemd[1]: Reached target machines.target - Containers. May 9 01:59:57.213928 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 9 01:59:57.213941 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 9 01:59:57.213955 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 9 01:59:57.213968 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 9 01:59:57.213982 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 9 01:59:57.213997 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 9 01:59:57.214010 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 9 01:59:57.214024 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 9 01:59:57.214037 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 9 01:59:57.214051 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 9 01:59:57.214065 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 9 01:59:57.214078 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 9 01:59:57.214093 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 9 01:59:57.214106 systemd[1]: Stopped systemd-fsck-usr.service. May 9 01:59:57.214123 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 9 01:59:57.214136 kernel: loop: module loaded May 9 01:59:57.214149 systemd[1]: Starting systemd-journald.service - Journal Service... May 9 01:59:57.214162 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 9 01:59:57.214176 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 9 01:59:57.214190 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 9 01:59:57.218653 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 9 01:59:57.218674 kernel: fuse: init (API version 7.39) May 9 01:59:57.218692 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 9 01:59:57.218706 systemd[1]: verity-setup.service: Deactivated successfully. May 9 01:59:57.218720 systemd[1]: Stopped verity-setup.service. 
May 9 01:59:57.218734 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 9 01:59:57.218747 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 9 01:59:57.218764 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 9 01:59:57.218780 systemd[1]: Mounted media.mount - External Media Directory. May 9 01:59:57.218816 systemd-journald[1101]: Collecting audit messages is disabled. May 9 01:59:57.218848 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 9 01:59:57.218866 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 9 01:59:57.218880 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 9 01:59:57.218893 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 9 01:59:57.218909 systemd-journald[1101]: Journal started May 9 01:59:57.218936 systemd-journald[1101]: Runtime Journal (/run/log/journal/19b35dc7344b4cdc879f60c33bdcb8e5) is 8M, max 78.2M, 70.2M free. May 9 01:59:56.861848 systemd[1]: Queued start job for default target multi-user.target. May 9 01:59:57.221390 systemd[1]: Started systemd-journald.service - Journal Service. May 9 01:59:56.871446 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. May 9 01:59:56.871947 systemd[1]: systemd-journald.service: Deactivated successfully. May 9 01:59:57.224257 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 9 01:59:57.225009 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 9 01:59:57.225171 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 9 01:59:57.225925 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 9 01:59:57.226071 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 9 01:59:57.226818 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 9 01:59:57.226968 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 9 01:59:57.227725 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 9 01:59:57.227877 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 9 01:59:57.228635 systemd[1]: modprobe@loop.service: Deactivated successfully. May 9 01:59:57.228775 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 9 01:59:57.229870 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 9 01:59:57.230844 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 9 01:59:57.231767 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 9 01:59:57.232751 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 9 01:59:57.242304 systemd[1]: Reached target network-pre.target - Preparation for Network. May 9 01:59:57.245307 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 9 01:59:57.249640 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 9 01:59:57.252565 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 9 01:59:57.252599 systemd[1]: Reached target local-fs.target - Local File Systems. 
May 9 01:59:57.258084 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 9 01:59:57.261308 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 9 01:59:57.264605 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 9 01:59:57.265934 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 9 01:59:57.285252 kernel: ACPI: bus type drm_connector registered May 9 01:59:57.284959 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 9 01:59:57.289356 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 9 01:59:57.289986 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 9 01:59:57.292317 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 9 01:59:57.292977 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 9 01:59:57.294916 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 9 01:59:57.299589 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 9 01:59:57.305615 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 9 01:59:57.310045 systemd[1]: modprobe@drm.service: Deactivated successfully. May 9 01:59:57.310309 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 9 01:59:57.311091 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 9 01:59:57.311705 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 9 01:59:57.312561 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 9 01:59:57.328657 systemd-journald[1101]: Time spent on flushing to /var/log/journal/19b35dc7344b4cdc879f60c33bdcb8e5 is 39.125ms for 957 entries. May 9 01:59:57.328657 systemd-journald[1101]: System Journal (/var/log/journal/19b35dc7344b4cdc879f60c33bdcb8e5) is 8M, max 584.8M, 576.8M free. May 9 01:59:57.381481 systemd-journald[1101]: Received client request to flush runtime journal. May 9 01:59:57.381552 kernel: loop0: detected capacity change from 0 to 109808 May 9 01:59:57.364553 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 9 01:59:57.366152 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 9 01:59:57.374513 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 9 01:59:57.385852 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 9 01:59:57.387927 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 9 01:59:57.397045 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 9 01:59:57.405718 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... May 9 01:59:57.408714 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 9 01:59:57.415039 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 9 01:59:57.435059 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 9 01:59:57.438394 udevadm[1155]: systemd-udev-settle.service is deprecated. 
Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. May 9 01:59:57.454111 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 9 01:59:57.460471 kernel: loop1: detected capacity change from 0 to 151640 May 9 01:59:57.464679 systemd-tmpfiles[1158]: ACLs are not supported, ignoring. May 9 01:59:57.465014 systemd-tmpfiles[1158]: ACLs are not supported, ignoring. May 9 01:59:57.472125 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 9 01:59:57.515531 kernel: loop2: detected capacity change from 0 to 8 May 9 01:59:57.536257 kernel: loop3: detected capacity change from 0 to 210664 May 9 01:59:57.624807 kernel: loop4: detected capacity change from 0 to 109808 May 9 01:59:57.677920 kernel: loop5: detected capacity change from 0 to 151640 May 9 01:59:57.751227 kernel: loop6: detected capacity change from 0 to 8 May 9 01:59:57.756235 kernel: loop7: detected capacity change from 0 to 210664 May 9 01:59:57.821160 (sd-merge)[1166]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. May 9 01:59:57.821689 (sd-merge)[1166]: Merged extensions into '/usr'. May 9 01:59:57.826591 systemd[1]: Reload requested from client PID 1138 ('systemd-sysext') (unit systemd-sysext.service)... May 9 01:59:57.826696 systemd[1]: Reloading... May 9 01:59:57.959227 zram_generator::config[1190]: No configuration found. May 9 01:59:58.185477 ldconfig[1133]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 9 01:59:58.214811 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 9 01:59:58.295524 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 9 01:59:58.296040 systemd[1]: Reloading finished in 467 ms. May 9 01:59:58.316065 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 9 01:59:58.316980 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 9 01:59:58.317870 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 9 01:59:58.334631 systemd[1]: Starting ensure-sysext.service... May 9 01:59:58.336221 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 9 01:59:58.340534 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 9 01:59:58.369462 systemd[1]: Reload requested from client PID 1251 ('systemctl') (unit ensure-sysext.service)... May 9 01:59:58.369481 systemd[1]: Reloading... May 9 01:59:58.386863 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 9 01:59:58.388330 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 9 01:59:58.393715 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 9 01:59:58.395534 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. May 9 01:59:58.395691 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. May 9 01:59:58.416747 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot. 
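The "(sd-merge)" lines above show systemd-sysext overlaying the containerd-flatcar, docker-flatcar, kubernetes, and oem-openstack images onto /usr. As a minimal sketch of how one might enumerate the available images and confirm which extensions are active after a merge, assuming the commonly documented search directories and the extension-release metadata layout (none of these paths are taken from this host's configuration):

import os

# Directories systemd-sysext is documented to scan for *.raw images or
# extension directories; adjust for your distribution if needed.
SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

def list_extension_images():
    found = []
    for d in SEARCH_DIRS:
        if os.path.isdir(d):
            found.extend(os.path.join(d, name) for name in sorted(os.listdir(d)))
    return found

def merged_extensions():
    # After a merge, each active extension is expected to publish
    # /usr/lib/extension-release.d/extension-release.<NAME>.
    rel_dir = "/usr/lib/extension-release.d"
    if not os.path.isdir(rel_dir):
        return []
    prefix = "extension-release."
    return sorted(n[len(prefix):] for n in os.listdir(rel_dir) if n.startswith(prefix))

if __name__ == "__main__":
    print("images found:", list_extension_images())
    print("merged:", merged_extensions())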
May 9 01:59:58.417332 systemd-tmpfiles[1252]: Skipping /boot May 9 01:59:58.424784 systemd-udevd[1253]: Using default interface naming scheme 'v255'. May 9 01:59:58.434825 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot. May 9 01:59:58.434944 systemd-tmpfiles[1252]: Skipping /boot May 9 01:59:58.467236 zram_generator::config[1281]: No configuration found. May 9 01:59:58.604266 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 42 scanned by (udev-worker) (1315) May 9 01:59:58.652235 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 May 9 01:59:58.670295 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 May 9 01:59:58.696234 kernel: ACPI: button: Power Button [PWRF] May 9 01:59:58.700016 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 9 01:59:58.719277 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 9 01:59:58.772244 kernel: mousedev: PS/2 mouse device common for all mice May 9 01:59:58.794023 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 May 9 01:59:58.794090 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console May 9 01:59:58.798418 kernel: Console: switching to colour dummy device 80x25 May 9 01:59:58.800233 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 9 01:59:58.800274 kernel: [drm] features: -context_init May 9 01:59:58.801582 kernel: [drm] number of scanouts: 1 May 9 01:59:58.802219 kernel: [drm] number of cap sets: 0 May 9 01:59:58.806230 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0 May 9 01:59:58.810278 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device May 9 01:59:58.815755 kernel: Console: switching to colour frame buffer device 160x50 May 9 01:59:58.823245 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device May 9 01:59:58.833180 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 9 01:59:58.835914 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 9 01:59:58.836321 systemd[1]: Reloading finished in 466 ms. May 9 01:59:58.850903 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 9 01:59:58.863297 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 9 01:59:58.894142 systemd[1]: Finished ensure-sysext.service. May 9 01:59:58.920092 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 9 01:59:58.925535 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 9 01:59:58.927104 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 9 01:59:58.936410 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 9 01:59:58.939176 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 9 01:59:58.940655 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 9 01:59:58.941784 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
May 9 01:59:58.949476 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 9 01:59:58.956022 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 9 01:59:58.964900 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 9 01:59:58.965326 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 9 01:59:58.968465 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 9 01:59:58.970419 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 9 01:59:58.982062 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 9 01:59:58.988546 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 9 01:59:58.998510 lvm[1374]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 9 01:59:58.997713 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 9 01:59:59.013273 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 9 01:59:59.025317 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 9 01:59:59.030605 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 9 01:59:59.031588 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 9 01:59:59.032744 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 9 01:59:59.033084 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 9 01:59:59.035335 systemd[1]: modprobe@drm.service: Deactivated successfully. May 9 01:59:59.035663 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 9 01:59:59.036866 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 9 01:59:59.037067 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 9 01:59:59.040555 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 9 01:59:59.047947 systemd[1]: modprobe@loop.service: Deactivated successfully. May 9 01:59:59.048157 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 9 01:59:59.054306 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 9 01:59:59.054374 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 9 01:59:59.059339 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 9 01:59:59.060612 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 9 01:59:59.064226 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 9 01:59:59.072242 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... May 9 01:59:59.081068 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 9 01:59:59.103397 lvm[1409]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. 
May 9 01:59:59.104867 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 9 01:59:59.113171 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 9 01:59:59.118239 augenrules[1419]: No rules May 9 01:59:59.120563 systemd[1]: audit-rules.service: Deactivated successfully. May 9 01:59:59.120781 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 9 01:59:59.139262 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 9 01:59:59.150260 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 9 01:59:59.162364 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 9 01:59:59.166692 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 9 01:59:59.170991 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 9 01:59:59.220281 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 9 01:59:59.258706 systemd-networkd[1387]: lo: Link UP May 9 01:59:59.258714 systemd-networkd[1387]: lo: Gained carrier May 9 01:59:59.260028 systemd-networkd[1387]: Enumeration completed May 9 01:59:59.260130 systemd[1]: Started systemd-networkd.service - Network Configuration. May 9 01:59:59.260658 systemd-networkd[1387]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 9 01:59:59.260670 systemd-networkd[1387]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 9 01:59:59.261802 systemd-networkd[1387]: eth0: Link UP May 9 01:59:59.261805 systemd-networkd[1387]: eth0: Gained carrier May 9 01:59:59.261821 systemd-networkd[1387]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 9 01:59:59.265046 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 9 01:59:59.271503 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 9 01:59:59.289328 systemd-networkd[1387]: eth0: DHCPv4 address 172.24.4.122/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 9 01:59:59.310601 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 9 01:59:59.315438 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 9 01:59:59.319654 systemd[1]: Reached target time-set.target - System Time Set. May 9 01:59:59.331052 systemd-resolved[1388]: Positive Trust Anchors: May 9 01:59:59.331069 systemd-resolved[1388]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 9 01:59:59.331113 systemd-resolved[1388]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 9 01:59:59.336446 systemd-resolved[1388]: Using system hostname 'ci-4284-0-0-n-abffc5acbe.novalocal'. May 9 01:59:59.338138 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 9 01:59:59.339161 systemd[1]: Reached target network.target - Network. May 9 01:59:59.339814 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 9 01:59:59.341614 systemd[1]: Reached target sysinit.target - System Initialization. May 9 01:59:59.344475 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 9 01:59:59.347858 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 9 01:59:59.351570 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 9 01:59:59.353360 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 9 01:59:59.355451 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 9 01:59:59.356928 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 9 01:59:59.356963 systemd[1]: Reached target paths.target - Path Units. May 9 01:59:59.358235 systemd[1]: Reached target timers.target - Timer Units. May 9 01:59:59.361892 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 9 01:59:59.366290 systemd[1]: Starting docker.socket - Docker Socket for the API... May 9 01:59:59.372545 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 9 01:59:59.375894 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 9 01:59:59.378722 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 9 01:59:59.390047 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 9 01:59:59.391115 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 9 01:59:59.393521 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 9 01:59:59.397355 systemd[1]: Reached target sockets.target - Socket Units. May 9 01:59:59.401895 systemd[1]: Reached target basic.target - Basic System. May 9 01:59:59.403122 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 9 01:59:59.403160 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 9 01:59:59.406297 systemd[1]: Starting containerd.service - containerd container runtime... May 9 01:59:59.414424 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 9 01:59:59.419438 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
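systemd-networkd above configures eth0 with the DHCPv4 lease 172.24.4.122/24 and gateway 172.24.4.1. A quick sanity check of that lease with Python's ipaddress module; the address and gateway are copied from the log, nothing else is assumed:

import ipaddress

iface = ipaddress.ip_interface("172.24.4.122/24")   # address acquired by eth0
gateway = ipaddress.ip_address("172.24.4.1")        # gateway offered by DHCP

print(iface.network)                   # 172.24.4.0/24
print(gateway in iface.network)        # True: the gateway is on-link
print(iface.network.num_addresses)     # 256 addresses in the /24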
May 9 01:59:59.442310 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 9 01:59:59.447505 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 9 01:59:59.450955 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 9 01:59:59.456285 jq[1448]: false May 9 01:59:59.456364 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 9 01:59:59.460056 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 9 01:59:59.466463 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 9 01:59:59.474473 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 9 01:59:59.484775 systemd[1]: Starting systemd-logind.service - User Login Management... May 9 01:59:59.489750 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 9 01:59:59.491384 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 9 01:59:59.494349 systemd[1]: Starting update-engine.service - Update Engine... May 9 01:59:59.495807 dbus-daemon[1447]: [system] SELinux support is enabled May 9 01:59:59.499844 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 9 01:59:59.501362 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 9 01:59:59.515947 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 9 01:59:59.516679 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 9 01:59:59.522338 jq[1461]: true May 9 01:59:59.526552 extend-filesystems[1449]: Found loop4 May 9 01:59:59.546232 extend-filesystems[1449]: Found loop5 May 9 01:59:59.546232 extend-filesystems[1449]: Found loop6 May 9 01:59:59.546232 extend-filesystems[1449]: Found loop7 May 9 01:59:59.546232 extend-filesystems[1449]: Found vda May 9 01:59:59.546232 extend-filesystems[1449]: Found vda1 May 9 01:59:59.546232 extend-filesystems[1449]: Found vda2 May 9 01:59:59.546232 extend-filesystems[1449]: Found vda3 May 9 01:59:59.546232 extend-filesystems[1449]: Found usr May 9 01:59:59.546232 extend-filesystems[1449]: Found vda4 May 9 01:59:59.546232 extend-filesystems[1449]: Found vda6 May 9 01:59:59.546232 extend-filesystems[1449]: Found vda7 May 9 01:59:59.546232 extend-filesystems[1449]: Found vda9 May 9 01:59:59.546232 extend-filesystems[1449]: Checking size of /dev/vda9 May 9 02:00:00.550452 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks May 9 02:00:00.550516 kernel: EXT4-fs (vda9): resized filesystem to 2014203 May 9 02:00:00.550539 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 42 scanned by (udev-worker) (1311) May 9 01:59:59.535058 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
May 9 02:00:00.550700 extend-filesystems[1449]: Resized partition /dev/vda9 May 9 02:00:00.552265 update_engine[1459]: I20250509 01:59:59.550170 1459 main.cc:92] Flatcar Update Engine starting May 9 02:00:00.552265 update_engine[1459]: I20250509 01:59:59.566507 1459 update_check_scheduler.cc:74] Next update check in 3m58s May 9 01:59:59.535117 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 9 02:00:00.557157 extend-filesystems[1484]: resize2fs 1.47.2 (1-Jan-2025) May 9 01:59:59.553075 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 9 02:00:00.562378 extend-filesystems[1484]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 9 02:00:00.562378 extend-filesystems[1484]: old_desc_blocks = 1, new_desc_blocks = 1 May 9 02:00:00.562378 extend-filesystems[1484]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long. May 9 01:59:59.553104 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 9 02:00:00.598616 extend-filesystems[1449]: Resized filesystem in /dev/vda9 May 9 02:00:00.608496 tar[1470]: linux-amd64/helm May 9 01:59:59.567055 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 9 02:00:00.618249 jq[1474]: true May 9 01:59:59.568322 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 9 02:00:00.428034 systemd-timesyncd[1389]: Contacted time server 23.186.168.125:123 (0.flatcar.pool.ntp.org). May 9 02:00:00.428132 systemd-timesyncd[1389]: Initial clock synchronization to Fri 2025-05-09 02:00:00.425939 UTC. May 9 02:00:00.428212 systemd-resolved[1388]: Clock change detected. Flushing caches. May 9 02:00:00.447816 systemd[1]: motdgen.service: Deactivated successfully. May 9 02:00:00.448066 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 9 02:00:00.468001 systemd[1]: Started update-engine.service - Update Engine. May 9 02:00:00.476231 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 9 02:00:00.481196 (ntainerd)[1478]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 9 02:00:00.511557 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 9 02:00:00.562854 systemd[1]: extend-filesystems.service: Deactivated successfully. May 9 02:00:00.563091 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 9 02:00:00.584037 systemd-logind[1456]: New seat seat0. May 9 02:00:00.627312 systemd-logind[1456]: Watching system buttons on /dev/input/event1 (Power Button) May 9 02:00:00.627906 systemd-logind[1456]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 9 02:00:00.628083 systemd[1]: Started systemd-logind.service - User Login Management. May 9 02:00:00.710036 bash[1506]: Updated "/home/core/.ssh/authorized_keys" May 9 02:00:00.710273 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 9 02:00:00.717729 systemd[1]: Starting sshkeys.service... May 9 02:00:00.749297 locksmithd[1487]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 9 02:00:00.751160 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
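extend-filesystems grows /dev/vda9 online from 1617920 to 2014203 blocks of 4 KiB. The corresponding byte sizes, computed directly from the figures reported in the log:

# Block counts reported by EXT4/resize2fs above; block size is 4 KiB.
OLD_BLOCKS = 1617920
NEW_BLOCKS = 2014203
BLOCK_SIZE = 4096

old_bytes = OLD_BLOCKS * BLOCK_SIZE
new_bytes = NEW_BLOCKS * BLOCK_SIZE

gib = 1024 ** 3
print(f"before: {old_bytes / gib:.2f} GiB")                 # ~6.17 GiB
print(f"after:  {new_bytes / gib:.2f} GiB")                 # ~7.68 GiB
print(f"gained: {(new_bytes - old_bytes) / gib:.2f} GiB")   # ~1.51 GiB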
May 9 02:00:00.756990 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 9 02:00:00.905527 sshd_keygen[1463]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 9 02:00:00.970041 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 9 02:00:00.978093 systemd[1]: Starting issuegen.service - Generate /run/issue... May 9 02:00:00.983606 systemd[1]: Started sshd@0-172.24.4.122:22-172.24.4.1:38098.service - OpenSSH per-connection server daemon (172.24.4.1:38098). May 9 02:00:01.015355 systemd[1]: issuegen.service: Deactivated successfully. May 9 02:00:01.015551 systemd[1]: Finished issuegen.service - Generate /run/issue. May 9 02:00:01.022565 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 9 02:00:01.032819 containerd[1478]: time="2025-05-09T02:00:01Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 9 02:00:01.033431 containerd[1478]: time="2025-05-09T02:00:01.033397506Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 9 02:00:01.053158 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 9 02:00:01.056750 containerd[1478]: time="2025-05-09T02:00:01.056711662Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.977µs" May 9 02:00:01.056851 containerd[1478]: time="2025-05-09T02:00:01.056832679Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 9 02:00:01.056923 containerd[1478]: time="2025-05-09T02:00:01.056906838Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 9 02:00:01.057230 containerd[1478]: time="2025-05-09T02:00:01.057211750Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 9 02:00:01.057331 containerd[1478]: time="2025-05-09T02:00:01.057314292Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 9 02:00:01.057403 containerd[1478]: time="2025-05-09T02:00:01.057388942Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 9 02:00:01.057561 containerd[1478]: time="2025-05-09T02:00:01.057541989Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 9 02:00:01.057645 containerd[1478]: time="2025-05-09T02:00:01.057615046Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 9 02:00:01.057954 containerd[1478]: time="2025-05-09T02:00:01.057929717Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 9 02:00:01.058028 containerd[1478]: time="2025-05-09T02:00:01.058005409Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 9 02:00:01.058095 containerd[1478]: time="2025-05-09T02:00:01.058079478Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" 
id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 9 02:00:01.058152 containerd[1478]: time="2025-05-09T02:00:01.058137717Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 9 02:00:01.058298 containerd[1478]: time="2025-05-09T02:00:01.058280474Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 9 02:00:01.058563 containerd[1478]: time="2025-05-09T02:00:01.058545041Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 9 02:00:01.058676 containerd[1478]: time="2025-05-09T02:00:01.058657672Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 9 02:00:01.058776 containerd[1478]: time="2025-05-09T02:00:01.058726721Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 9 02:00:01.058875 containerd[1478]: time="2025-05-09T02:00:01.058853569Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 9 02:00:01.059272 containerd[1478]: time="2025-05-09T02:00:01.059254321Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 9 02:00:01.059391 containerd[1478]: time="2025-05-09T02:00:01.059374897Z" level=info msg="metadata content store policy set" policy=shared May 9 02:00:01.060341 systemd[1]: Started getty@tty1.service - Getty on tty1. May 9 02:00:01.066892 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 9 02:00:01.069549 systemd[1]: Reached target getty.target - Login Prompts. 
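sshd_keygen above reports generating fresh RSA, ECDSA and ED25519 host keys on first boot. As a hedged illustration (the paths are stock OpenSSH defaults, assumed rather than read from this host), the same keys can be produced manually with ssh-keygen:

    # generate any missing host keys of the default types into /etc/ssh
    sudo ssh-keygen -A
    # or create one key type explicitly, with an empty passphrase
    sudo ssh-keygen -t ed25519 -f /etc/ssh/ssh_host_ed25519_key -N ''
    # print the fingerprint clients will be shown on first connect
    ssh-keygen -lf /etc/ssh/ssh_host_ed25519_key.pub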
May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075047030Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075115368Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075133933Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075148651Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075165282Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075178938Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075193976Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075212210Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075224964Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075245913Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075260511Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 9 02:00:01.075363 containerd[1478]: time="2025-05-09T02:00:01.075278695Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.075895521Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.075930367Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.075946046Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.075959862Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.075973187Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.075984859Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.076007742Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.076021628Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 9 02:00:01.076651 
containerd[1478]: time="2025-05-09T02:00:01.076036185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.076048448Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.076061112Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.076131364Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.076147434Z" level=info msg="Start snapshots syncer" May 9 02:00:01.076651 containerd[1478]: time="2025-05-09T02:00:01.076172802Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 9 02:00:01.077112 containerd[1478]: time="2025-05-09T02:00:01.076437298Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 9 02:00:01.077112 containerd[1478]: time="2025-05-09T02:00:01.076490638Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 9 02:00:01.077267 containerd[1478]: time="2025-05-09T02:00:01.077023407Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 9 02:00:01.077267 containerd[1478]: time="2025-05-09T02:00:01.077119918Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 9 02:00:01.077267 containerd[1478]: time="2025-05-09T02:00:01.077156687Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes 
type=io.containerd.grpc.v1 May 9 02:00:01.077267 containerd[1478]: time="2025-05-09T02:00:01.077195369Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 9 02:00:01.077267 containerd[1478]: time="2025-05-09T02:00:01.077214896Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 9 02:00:01.077267 containerd[1478]: time="2025-05-09T02:00:01.077234302Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 9 02:00:01.077267 containerd[1478]: time="2025-05-09T02:00:01.077251455Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 9 02:00:01.077267 containerd[1478]: time="2025-05-09T02:00:01.077265611Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 9 02:00:01.077432 containerd[1478]: time="2025-05-09T02:00:01.077296920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 9 02:00:01.077432 containerd[1478]: time="2025-05-09T02:00:01.077318210Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 9 02:00:01.077432 containerd[1478]: time="2025-05-09T02:00:01.077333989Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 9 02:00:01.077432 containerd[1478]: time="2025-05-09T02:00:01.077372131Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 9 02:00:01.077432 containerd[1478]: time="2025-05-09T02:00:01.077388231Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 9 02:00:01.077432 containerd[1478]: time="2025-05-09T02:00:01.077403149Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 9 02:00:01.077432 containerd[1478]: time="2025-05-09T02:00:01.077418909Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 9 02:00:01.077432 containerd[1478]: time="2025-05-09T02:00:01.077433155Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 9 02:00:01.077598 containerd[1478]: time="2025-05-09T02:00:01.077447843Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 9 02:00:01.077598 containerd[1478]: time="2025-05-09T02:00:01.077464895Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 9 02:00:01.077598 containerd[1478]: time="2025-05-09T02:00:01.077487698Z" level=info msg="runtime interface created" May 9 02:00:01.077598 containerd[1478]: time="2025-05-09T02:00:01.077494981Z" level=info msg="created NRI interface" May 9 02:00:01.077598 containerd[1478]: time="2025-05-09T02:00:01.077509068Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 9 02:00:01.077598 containerd[1478]: time="2025-05-09T02:00:01.077522152Z" level=info msg="Connect containerd service" May 9 02:00:01.077598 containerd[1478]: time="2025-05-09T02:00:01.077555244Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 9 02:00:01.078886 containerd[1478]: 
time="2025-05-09T02:00:01.078856695Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 9 02:00:01.238322 containerd[1478]: time="2025-05-09T02:00:01.238195034Z" level=info msg="Start subscribing containerd event" May 9 02:00:01.238636 containerd[1478]: time="2025-05-09T02:00:01.238477885Z" level=info msg="Start recovering state" May 9 02:00:01.239064 containerd[1478]: time="2025-05-09T02:00:01.238962874Z" level=info msg="Start event monitor" May 9 02:00:01.239064 containerd[1478]: time="2025-05-09T02:00:01.239015172Z" level=info msg="Start cni network conf syncer for default" May 9 02:00:01.239064 containerd[1478]: time="2025-05-09T02:00:01.239026824Z" level=info msg="Start streaming server" May 9 02:00:01.239064 containerd[1478]: time="2025-05-09T02:00:01.239046962Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 9 02:00:01.239064 containerd[1478]: time="2025-05-09T02:00:01.239056830Z" level=info msg="runtime interface starting up..." May 9 02:00:01.239064 containerd[1478]: time="2025-05-09T02:00:01.239065477Z" level=info msg="starting plugins..." May 9 02:00:01.239227 containerd[1478]: time="2025-05-09T02:00:01.239086356Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 9 02:00:01.239540 containerd[1478]: time="2025-05-09T02:00:01.239323931Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 9 02:00:01.239540 containerd[1478]: time="2025-05-09T02:00:01.239406456Z" level=info msg=serving... address=/run/containerd/containerd.sock May 9 02:00:01.240720 containerd[1478]: time="2025-05-09T02:00:01.239672425Z" level=info msg="containerd successfully booted in 0.208821s" May 9 02:00:01.239821 systemd[1]: Started containerd.service - containerd container runtime. May 9 02:00:01.297827 tar[1470]: linux-amd64/LICENSE May 9 02:00:01.297989 tar[1470]: linux-amd64/README.md May 9 02:00:01.321612 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 9 02:00:01.886076 systemd-networkd[1387]: eth0: Gained IPv6LL May 9 02:00:01.891357 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 9 02:00:01.900829 systemd[1]: Reached target network-online.target - Network is Online. May 9 02:00:01.910114 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:00:01.920216 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 9 02:00:01.976950 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 9 02:00:02.112258 sshd[1530]: Accepted publickey for core from 172.24.4.1 port 38098 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:02.114246 sshd-session[1530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:02.148464 systemd-logind[1456]: New session 1 of user core. May 9 02:00:02.153488 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 9 02:00:02.159661 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 9 02:00:02.188273 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 9 02:00:02.195908 systemd[1]: Starting user@500.service - User Manager for UID 500... 
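The error entry above, "no network config found in /etc/cni/net.d", means containerd's CRI plugin has no CNI configuration yet, so pod networking stays unavailable until something (normally a CNI add-on installed later) writes one. Purely as a hypothetical sketch, and not the configuration this node ends up using, a minimal bridge network in the directory containerd watches could look like this (the file name, network name and subnet are invented):

    # hypothetical example only; containerd's confDir logged above is /etc/cni/net.d
    sudo tee /etc/cni/net.d/10-example.conflist >/dev/null <<'EOF'
    {
      "cniVersion": "1.0.0",
      "name": "example-net",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": {
            "type": "host-local",
            "ranges": [[{ "subnet": "10.88.0.0/16" }]],
            "routes": [{ "dst": "0.0.0.0/0" }]
          }
        },
        { "type": "portmap", "capabilities": { "portMappings": true } }
      ]
    }
    EOF

The "Start cni network conf syncer for default" entry logged above is the component that watches this directory and would pick such a file up.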
May 9 02:00:02.210231 (systemd)[1573]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 9 02:00:02.213385 systemd-logind[1456]: New session c1 of user core. May 9 02:00:02.390412 systemd[1573]: Queued start job for default target default.target. May 9 02:00:02.398973 systemd[1573]: Created slice app.slice - User Application Slice. May 9 02:00:02.399214 systemd[1573]: Reached target paths.target - Paths. May 9 02:00:02.399339 systemd[1573]: Reached target timers.target - Timers. May 9 02:00:02.400719 systemd[1573]: Starting dbus.socket - D-Bus User Message Bus Socket... May 9 02:00:02.440040 systemd[1573]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 9 02:00:02.440191 systemd[1573]: Reached target sockets.target - Sockets. May 9 02:00:02.441003 systemd[1573]: Reached target basic.target - Basic System. May 9 02:00:02.441128 systemd[1573]: Reached target default.target - Main User Target. May 9 02:00:02.441217 systemd[1573]: Startup finished in 218ms. May 9 02:00:02.441360 systemd[1]: Started user@500.service - User Manager for UID 500. May 9 02:00:02.455178 systemd[1]: Started session-1.scope - Session 1 of User core. May 9 02:00:02.962453 systemd[1]: Started sshd@1-172.24.4.122:22-172.24.4.1:38112.service - OpenSSH per-connection server daemon (172.24.4.1:38112). May 9 02:00:03.665062 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:00:03.686563 (kubelet)[1591]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 9 02:00:04.681380 sshd[1584]: Accepted publickey for core from 172.24.4.1 port 38112 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:04.683108 sshd-session[1584]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:04.694228 systemd-logind[1456]: New session 2 of user core. May 9 02:00:04.705022 systemd[1]: Started session-2.scope - Session 2 of User core. May 9 02:00:05.378758 sshd[1598]: Connection closed by 172.24.4.1 port 38112 May 9 02:00:05.380117 sshd-session[1584]: pam_unix(sshd:session): session closed for user core May 9 02:00:05.401585 systemd[1]: sshd@1-172.24.4.122:22-172.24.4.1:38112.service: Deactivated successfully. May 9 02:00:05.406169 systemd[1]: session-2.scope: Deactivated successfully. May 9 02:00:05.409050 systemd-logind[1456]: Session 2 logged out. Waiting for processes to exit. May 9 02:00:05.415753 systemd[1]: Started sshd@2-172.24.4.122:22-172.24.4.1:39672.service - OpenSSH per-connection server daemon (172.24.4.1:39672). May 9 02:00:05.428267 systemd-logind[1456]: Removed session 2. May 9 02:00:05.891160 kubelet[1591]: E0509 02:00:05.890990 1591 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 9 02:00:05.896030 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 9 02:00:05.896415 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 9 02:00:05.897799 systemd[1]: kubelet.service: Consumed 2.130s CPU time, 247M memory peak. 
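The kubelet above exits because /var/lib/kubelet/config.yaml does not exist yet. If this node is being provisioned with kubeadm (the KUBELET_KUBEADM_ARGS drop-in referenced earlier suggests it), that file is only written when kubeadm init or kubeadm join runs, so failures before that point are expected. A hedged sketch of the usual diagnosis and fix; the endpoint, token and hash are placeholders, not values from this log:

    # confirm what the unit is failing on
    systemctl status kubelet --no-pager
    ls -l /var/lib/kubelet/config.yaml
    # joining through kubeadm writes the missing config and brings the kubelet up
    sudo kubeadm join <control-plane-endpoint>:6443 \
        --token <token> \
        --discovery-token-ca-cert-hash sha256:<hash>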
May 9 02:00:06.351938 login[1538]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 9 02:00:06.361296 login[1540]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 9 02:00:06.365021 systemd-logind[1456]: New session 3 of user core. May 9 02:00:06.376399 systemd[1]: Started session-3.scope - Session 3 of User core. May 9 02:00:06.384724 systemd-logind[1456]: New session 4 of user core. May 9 02:00:06.392040 systemd[1]: Started session-4.scope - Session 4 of User core. May 9 02:00:06.765744 sshd[1604]: Accepted publickey for core from 172.24.4.1 port 39672 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:06.768802 sshd-session[1604]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:06.778862 systemd-logind[1456]: New session 5 of user core. May 9 02:00:06.792367 systemd[1]: Started session-5.scope - Session 5 of User core. May 9 02:00:07.372097 coreos-metadata[1446]: May 09 02:00:07.372 WARN failed to locate config-drive, using the metadata service API instead May 9 02:00:07.431670 sshd[1635]: Connection closed by 172.24.4.1 port 39672 May 9 02:00:07.432570 sshd-session[1604]: pam_unix(sshd:session): session closed for user core May 9 02:00:07.439235 systemd[1]: sshd@2-172.24.4.122:22-172.24.4.1:39672.service: Deactivated successfully. May 9 02:00:07.445778 systemd[1]: session-5.scope: Deactivated successfully. May 9 02:00:07.448251 systemd-logind[1456]: Session 5 logged out. Waiting for processes to exit. May 9 02:00:07.452761 systemd-logind[1456]: Removed session 5. May 9 02:00:07.471308 coreos-metadata[1446]: May 09 02:00:07.470 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 May 9 02:00:07.632712 coreos-metadata[1446]: May 09 02:00:07.632 INFO Fetch successful May 9 02:00:07.632712 coreos-metadata[1446]: May 09 02:00:07.632 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 9 02:00:07.645070 coreos-metadata[1446]: May 09 02:00:07.645 INFO Fetch successful May 9 02:00:07.645450 coreos-metadata[1446]: May 09 02:00:07.645 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 May 9 02:00:07.656671 coreos-metadata[1446]: May 09 02:00:07.656 INFO Fetch successful May 9 02:00:07.656671 coreos-metadata[1446]: May 09 02:00:07.656 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 May 9 02:00:07.668530 coreos-metadata[1446]: May 09 02:00:07.668 INFO Fetch successful May 9 02:00:07.668530 coreos-metadata[1446]: May 09 02:00:07.668 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 May 9 02:00:07.679927 coreos-metadata[1446]: May 09 02:00:07.679 INFO Fetch successful May 9 02:00:07.679927 coreos-metadata[1446]: May 09 02:00:07.679 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 May 9 02:00:07.692529 coreos-metadata[1446]: May 09 02:00:07.692 INFO Fetch successful May 9 02:00:07.748557 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 9 02:00:07.750991 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
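coreos-metadata above falls back from a config-drive to the metadata service and fetches the hostname, instance-id, instance-type and addresses. The same endpoints it logs can be queried by hand when debugging; a short sketch using only the URLs shown above:

    # OpenStack-format metadata document
    curl -s http://169.254.169.254/openstack/2012-08-10/meta_data.json
    # EC2-compatible paths used by the agent
    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/latest/meta-data/instance-id
    curl -s http://169.254.169.254/latest/meta-data/public-ipv4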
May 9 02:00:07.864043 coreos-metadata[1514]: May 09 02:00:07.863 WARN failed to locate config-drive, using the metadata service API instead May 9 02:00:07.907119 coreos-metadata[1514]: May 09 02:00:07.906 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 May 9 02:00:07.922968 coreos-metadata[1514]: May 09 02:00:07.922 INFO Fetch successful May 9 02:00:07.922968 coreos-metadata[1514]: May 09 02:00:07.922 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 May 9 02:00:07.937428 coreos-metadata[1514]: May 09 02:00:07.937 INFO Fetch successful May 9 02:00:07.953344 unknown[1514]: wrote ssh authorized keys file for user: core May 9 02:00:08.006186 update-ssh-keys[1650]: Updated "/home/core/.ssh/authorized_keys" May 9 02:00:08.008506 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 9 02:00:08.012177 systemd[1]: Finished sshkeys.service. May 9 02:00:08.017473 systemd[1]: Reached target multi-user.target - Multi-User System. May 9 02:00:08.018079 systemd[1]: Startup finished in 1.207s (kernel) + 16.275s (initrd) + 11.149s (userspace) = 28.633s. May 9 02:00:16.061585 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 9 02:00:16.064856 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:00:16.461834 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:00:16.477324 (kubelet)[1661]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 9 02:00:16.561588 kubelet[1661]: E0509 02:00:16.561491 1661 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 9 02:00:16.568689 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 9 02:00:16.568970 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 9 02:00:16.569862 systemd[1]: kubelet.service: Consumed 293ms CPU time, 96.1M memory peak. May 9 02:00:17.450181 systemd[1]: Started sshd@3-172.24.4.122:22-172.24.4.1:50194.service - OpenSSH per-connection server daemon (172.24.4.1:50194). May 9 02:00:18.635421 sshd[1671]: Accepted publickey for core from 172.24.4.1 port 50194 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:18.638293 sshd-session[1671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:18.648174 systemd-logind[1456]: New session 6 of user core. May 9 02:00:18.659903 systemd[1]: Started session-6.scope - Session 6 of User core. May 9 02:00:19.425687 sshd[1673]: Connection closed by 172.24.4.1 port 50194 May 9 02:00:19.427008 sshd-session[1671]: pam_unix(sshd:session): session closed for user core May 9 02:00:19.448088 systemd[1]: sshd@3-172.24.4.122:22-172.24.4.1:50194.service: Deactivated successfully. May 9 02:00:19.452266 systemd[1]: session-6.scope: Deactivated successfully. May 9 02:00:19.454956 systemd-logind[1456]: Session 6 logged out. Waiting for processes to exit. May 9 02:00:19.460299 systemd[1]: Started sshd@4-172.24.4.122:22-172.24.4.1:50206.service - OpenSSH per-connection server daemon (172.24.4.1:50206). May 9 02:00:19.465373 systemd-logind[1456]: Removed session 6. 
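The "Startup finished in 1.207s (kernel) + 16.275s (initrd) + 11.149s (userspace)" line above is the same summary systemd-analyze prints, and the kubelet has now entered its first scheduled restart. A short sketch of the standard commands for looking at both (nothing here is specific to this host):

    # overall boot time, matching the summary in the journal
    systemd-analyze
    # slowest units during this boot
    systemd-analyze blame | head
    # follow the failing unit's restart loop
    journalctl -u kubelet.service -b --no-pager | tail -n 50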
May 9 02:00:20.618257 sshd[1678]: Accepted publickey for core from 172.24.4.1 port 50206 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:20.621541 sshd-session[1678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:20.635778 systemd-logind[1456]: New session 7 of user core. May 9 02:00:20.641061 systemd[1]: Started session-7.scope - Session 7 of User core. May 9 02:00:21.323521 sshd[1681]: Connection closed by 172.24.4.1 port 50206 May 9 02:00:21.324513 sshd-session[1678]: pam_unix(sshd:session): session closed for user core May 9 02:00:21.339446 systemd[1]: sshd@4-172.24.4.122:22-172.24.4.1:50206.service: Deactivated successfully. May 9 02:00:21.343429 systemd[1]: session-7.scope: Deactivated successfully. May 9 02:00:21.347045 systemd-logind[1456]: Session 7 logged out. Waiting for processes to exit. May 9 02:00:21.349455 systemd[1]: Started sshd@5-172.24.4.122:22-172.24.4.1:50222.service - OpenSSH per-connection server daemon (172.24.4.1:50222). May 9 02:00:21.351605 systemd-logind[1456]: Removed session 7. May 9 02:00:22.472912 sshd[1686]: Accepted publickey for core from 172.24.4.1 port 50222 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:22.476143 sshd-session[1686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:22.489743 systemd-logind[1456]: New session 8 of user core. May 9 02:00:22.501543 systemd[1]: Started session-8.scope - Session 8 of User core. May 9 02:00:23.221744 sshd[1689]: Connection closed by 172.24.4.1 port 50222 May 9 02:00:23.224983 sshd-session[1686]: pam_unix(sshd:session): session closed for user core May 9 02:00:23.241771 systemd[1]: sshd@5-172.24.4.122:22-172.24.4.1:50222.service: Deactivated successfully. May 9 02:00:23.246117 systemd[1]: session-8.scope: Deactivated successfully. May 9 02:00:23.248346 systemd-logind[1456]: Session 8 logged out. Waiting for processes to exit. May 9 02:00:23.254711 systemd[1]: Started sshd@6-172.24.4.122:22-172.24.4.1:50230.service - OpenSSH per-connection server daemon (172.24.4.1:50230). May 9 02:00:23.257715 systemd-logind[1456]: Removed session 8. May 9 02:00:24.871972 sshd[1694]: Accepted publickey for core from 172.24.4.1 port 50230 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:24.875134 sshd-session[1694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:24.890536 systemd-logind[1456]: New session 9 of user core. May 9 02:00:24.897036 systemd[1]: Started session-9.scope - Session 9 of User core. May 9 02:00:25.467783 sudo[1698]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 9 02:00:25.468827 sudo[1698]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 9 02:00:25.491905 sudo[1698]: pam_unix(sudo:session): session closed for user root May 9 02:00:25.755203 sshd[1697]: Connection closed by 172.24.4.1 port 50230 May 9 02:00:25.752958 sshd-session[1694]: pam_unix(sshd:session): session closed for user core May 9 02:00:25.776441 systemd[1]: sshd@6-172.24.4.122:22-172.24.4.1:50230.service: Deactivated successfully. May 9 02:00:25.780339 systemd[1]: session-9.scope: Deactivated successfully. May 9 02:00:25.783923 systemd-logind[1456]: Session 9 logged out. Waiting for processes to exit. May 9 02:00:25.787176 systemd[1]: Started sshd@7-172.24.4.122:22-172.24.4.1:56664.service - OpenSSH per-connection server daemon (172.24.4.1:56664). 
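The sudo entry above records the core user running /usr/sbin/setenforce 1, switching SELinux to enforcing mode for the running system only. A minimal sketch of how that is checked, with the caveat that the config path is the common default and assumed rather than confirmed for this image:

    # current mode: Enforcing or Permissive
    getenforce
    # runtime toggle; it does not survive a reboot
    sudo setenforce 1
    # persistent mode is normally set in the SELinux config file instead
    grep ^SELINUX= /etc/selinux/config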
May 9 02:00:25.790890 systemd-logind[1456]: Removed session 9. May 9 02:00:26.770544 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 9 02:00:26.775146 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:00:27.062803 sshd[1703]: Accepted publickey for core from 172.24.4.1 port 56664 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:27.064490 sshd-session[1703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:27.087083 systemd-logind[1456]: New session 10 of user core. May 9 02:00:27.094790 systemd[1]: Started session-10.scope - Session 10 of User core. May 9 02:00:27.099512 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:00:27.110109 (kubelet)[1714]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 9 02:00:27.181037 kubelet[1714]: E0509 02:00:27.180947 1714 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 9 02:00:27.184972 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 9 02:00:27.185140 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 9 02:00:27.185458 systemd[1]: kubelet.service: Consumed 267ms CPU time, 95.4M memory peak. May 9 02:00:27.493289 sudo[1724]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 9 02:00:27.495275 sudo[1724]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 9 02:00:27.503489 sudo[1724]: pam_unix(sudo:session): session closed for user root May 9 02:00:27.514934 sudo[1723]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 9 02:00:27.515563 sudo[1723]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 9 02:00:27.536908 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 9 02:00:27.606984 augenrules[1746]: No rules May 9 02:00:27.609220 systemd[1]: audit-rules.service: Deactivated successfully. May 9 02:00:27.609939 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 9 02:00:27.611936 sudo[1723]: pam_unix(sudo:session): session closed for user root May 9 02:00:27.762415 sshd[1715]: Connection closed by 172.24.4.1 port 56664 May 9 02:00:27.765510 sshd-session[1703]: pam_unix(sshd:session): session closed for user core May 9 02:00:27.778950 systemd[1]: sshd@7-172.24.4.122:22-172.24.4.1:56664.service: Deactivated successfully. May 9 02:00:27.782512 systemd[1]: session-10.scope: Deactivated successfully. May 9 02:00:27.784469 systemd-logind[1456]: Session 10 logged out. Waiting for processes to exit. May 9 02:00:27.788263 systemd[1]: Started sshd@8-172.24.4.122:22-172.24.4.1:56672.service - OpenSSH per-connection server daemon (172.24.4.1:56672). May 9 02:00:27.791682 systemd-logind[1456]: Removed session 10. 
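Above, sudo removes the shipped rule files from /etc/audit/rules.d, audit-rules is restarted, and augenrules reports "No rules". A hedged sketch of inspecting and reloading audit rules with the standard auditd tooling; the watch rule is a made-up example, not something from this host:

    # rules currently loaded in the kernel ("No rules" matches the log)
    sudo auditctl -l
    # fragments under /etc/audit/rules.d are merged by augenrules
    ls /etc/audit/rules.d/
    # hypothetical fragment: watch writes and attribute changes on /etc/passwd
    echo '-w /etc/passwd -p wa -k passwd_changes' | sudo tee /etc/audit/rules.d/90-example.rules
    sudo augenrules --load
    sudo auditctl -l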
May 9 02:00:29.100300 sshd[1754]: Accepted publickey for core from 172.24.4.1 port 56672 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:00:29.102706 sshd-session[1754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:00:29.115271 systemd-logind[1456]: New session 11 of user core. May 9 02:00:29.124934 systemd[1]: Started session-11.scope - Session 11 of User core. May 9 02:00:29.575263 sudo[1758]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 9 02:00:29.576297 sudo[1758]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 9 02:00:30.284022 systemd[1]: Starting docker.service - Docker Application Container Engine... May 9 02:00:30.297666 (dockerd)[1776]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 9 02:00:30.754041 dockerd[1776]: time="2025-05-09T02:00:30.753677434Z" level=info msg="Starting up" May 9 02:00:30.754836 dockerd[1776]: time="2025-05-09T02:00:30.754796042Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 9 02:00:30.845319 dockerd[1776]: time="2025-05-09T02:00:30.845153938Z" level=info msg="Loading containers: start." May 9 02:00:31.148687 kernel: Initializing XFRM netlink socket May 9 02:00:31.428475 systemd-networkd[1387]: docker0: Link UP May 9 02:00:31.512124 dockerd[1776]: time="2025-05-09T02:00:31.512027048Z" level=info msg="Loading containers: done." May 9 02:00:31.547200 dockerd[1776]: time="2025-05-09T02:00:31.547108353Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 9 02:00:31.547441 dockerd[1776]: time="2025-05-09T02:00:31.547272672Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 9 02:00:31.547511 dockerd[1776]: time="2025-05-09T02:00:31.547466896Z" level=info msg="Daemon has completed initialization" May 9 02:00:31.552928 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2787486176-merged.mount: Deactivated successfully. May 9 02:00:31.620421 dockerd[1776]: time="2025-05-09T02:00:31.619483252Z" level=info msg="API listen on /run/docker.sock" May 9 02:00:31.619847 systemd[1]: Started docker.service - Docker Application Container Engine. May 9 02:00:33.421685 containerd[1478]: time="2025-05-09T02:00:33.421092968Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\"" May 9 02:00:34.179534 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2971768650.mount: Deactivated successfully. 
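dockerd above completes initialization and reports "API listen on /run/docker.sock". A quick sketch of verifying the daemon over that socket, either through the CLI or by calling the Engine API ping endpoint directly (the --format fields are standard docker info fields, not taken from this log):

    # daemon version and storage driver via the CLI
    docker info --format '{{.ServerVersion}} {{.Driver}}'
    # or hit the API socket directly; a healthy daemon answers OK
    curl -s --unix-socket /run/docker.sock http://localhost/_ping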
May 9 02:00:36.321780 containerd[1478]: time="2025-05-09T02:00:36.320963598Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:36.331792 containerd[1478]: time="2025-05-09T02:00:36.331595795Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.12: active requests=0, bytes read=32674881" May 9 02:00:36.342075 containerd[1478]: time="2025-05-09T02:00:36.341960030Z" level=info msg="ImageCreate event name:\"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:36.365677 containerd[1478]: time="2025-05-09T02:00:36.365452418Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:36.370085 containerd[1478]: time="2025-05-09T02:00:36.368140622Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.12\" with image id \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\", size \"32671673\" in 2.946979864s" May 9 02:00:36.370085 containerd[1478]: time="2025-05-09T02:00:36.368215936Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\" returns image reference \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\"" May 9 02:00:36.409520 containerd[1478]: time="2025-05-09T02:00:36.409333910Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\"" May 9 02:00:37.311281 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 9 02:00:37.319143 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:00:37.479712 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:00:37.487894 (kubelet)[2052]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 9 02:00:37.535538 kubelet[2052]: E0509 02:00:37.535497 2052 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 9 02:00:37.538909 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 9 02:00:37.539039 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 9 02:00:37.539318 systemd[1]: kubelet.service: Consumed 173ms CPU time, 95.6M memory peak. 
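containerd above records the CRI image pull for kube-apiserver v1.30.12, resolved to a repo digest under its k8s.io namespace. As a sketch, the same pull and listing can be done by hand with the CRI or containerd CLIs, assuming crictl is installed and pointed at the containerd socket shown earlier in the log:

    # CRI view, the same interface the pull above went through
    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock pull registry.k8s.io/kube-apiserver:v1.30.12
    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock images
    # containerd's own view of the kubernetes namespace
    sudo ctr -n k8s.io images ls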
May 9 02:00:39.157303 containerd[1478]: time="2025-05-09T02:00:39.157125591Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:39.161148 containerd[1478]: time="2025-05-09T02:00:39.161002339Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.12: active requests=0, bytes read=29617542" May 9 02:00:39.163741 containerd[1478]: time="2025-05-09T02:00:39.163614825Z" level=info msg="ImageCreate event name:\"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:39.171240 containerd[1478]: time="2025-05-09T02:00:39.171115501Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:39.174728 containerd[1478]: time="2025-05-09T02:00:39.174161721Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.12\" with image id \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\", size \"31105907\" in 2.764745715s" May 9 02:00:39.174728 containerd[1478]: time="2025-05-09T02:00:39.174291118Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\" returns image reference \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\"" May 9 02:00:39.216085 containerd[1478]: time="2025-05-09T02:00:39.215994103Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\"" May 9 02:00:40.932114 containerd[1478]: time="2025-05-09T02:00:40.932075628Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:40.937741 containerd[1478]: time="2025-05-09T02:00:40.937681103Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.12: active requests=0, bytes read=17903690" May 9 02:00:40.938764 containerd[1478]: time="2025-05-09T02:00:40.938741637Z" level=info msg="ImageCreate event name:\"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:40.941940 containerd[1478]: time="2025-05-09T02:00:40.941903031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:40.943095 containerd[1478]: time="2025-05-09T02:00:40.943068464Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.12\" with image id \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\", size \"19392073\" in 1.726735237s" May 9 02:00:40.943184 containerd[1478]: time="2025-05-09T02:00:40.943166231Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\" returns image reference \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\"" May 9 02:00:40.961363 
containerd[1478]: time="2025-05-09T02:00:40.961294037Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\"" May 9 02:00:42.564884 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount576284427.mount: Deactivated successfully. May 9 02:00:43.052900 containerd[1478]: time="2025-05-09T02:00:43.052749180Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:43.054431 containerd[1478]: time="2025-05-09T02:00:43.054381233Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.12: active requests=0, bytes read=29185825" May 9 02:00:43.055694 containerd[1478]: time="2025-05-09T02:00:43.055660908Z" level=info msg="ImageCreate event name:\"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:43.057736 containerd[1478]: time="2025-05-09T02:00:43.057702136Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:43.058536 containerd[1478]: time="2025-05-09T02:00:43.058368629Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.12\" with image id \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\", repo tag \"registry.k8s.io/kube-proxy:v1.30.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\", size \"29184836\" in 2.09694235s" May 9 02:00:43.058536 containerd[1478]: time="2025-05-09T02:00:43.058402994Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\" returns image reference \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\"" May 9 02:00:43.076539 containerd[1478]: time="2025-05-09T02:00:43.076499537Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 9 02:00:43.695066 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2030083104.mount: Deactivated successfully. 
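Each completed pull above reports both a repo tag and a repo digest, for kube-proxy the digest sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15. A short sketch of checking a local image against that digest, assuming crictl is configured as in the previous example:

    # digests containerd knows for the pulled image
    sudo crictl inspecti registry.k8s.io/kube-proxy:v1.30.12 | grep -A2 repoDigests
    # pulling by digest instead of tag pins the exact content
    sudo crictl pull registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15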
May 9 02:00:45.231130 containerd[1478]: time="2025-05-09T02:00:45.231086072Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:45.232891 containerd[1478]: time="2025-05-09T02:00:45.232849198Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" May 9 02:00:45.234372 containerd[1478]: time="2025-05-09T02:00:45.234346072Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:45.237900 containerd[1478]: time="2025-05-09T02:00:45.237852137Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:45.239175 containerd[1478]: time="2025-05-09T02:00:45.239139123Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.162604149s" May 9 02:00:45.239175 containerd[1478]: time="2025-05-09T02:00:45.239173087Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" May 9 02:00:45.258802 containerd[1478]: time="2025-05-09T02:00:45.258761530Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" May 9 02:00:45.398794 update_engine[1459]: I20250509 02:00:45.398742 1459 update_attempter.cc:509] Updating boot flags... May 9 02:00:45.448705 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 42 scanned by (udev-worker) (2155) May 9 02:00:46.038451 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2394166726.mount: Deactivated successfully. 
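update_engine above logs "Updating boot flags...", which on Flatcar relates to the GPT priority/successful attributes of the USR-A/USR-B slots used for A/B updates. As a hedged sketch only, since both utilities and their exact flags are assumed here rather than confirmed by this log:

    # update client status: last check time, current operation, new version if any
    update_engine_client -status
    # GPT attributes (priority, tries, successful) behind the boot flags
    cgpt show /dev/vda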
May 9 02:00:46.047789 containerd[1478]: time="2025-05-09T02:00:46.047709362Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:46.050346 containerd[1478]: time="2025-05-09T02:00:46.050130953Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298" May 9 02:00:46.052116 containerd[1478]: time="2025-05-09T02:00:46.052003405Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:46.056206 containerd[1478]: time="2025-05-09T02:00:46.056151301Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:46.058311 containerd[1478]: time="2025-05-09T02:00:46.058065441Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 799.057585ms" May 9 02:00:46.058311 containerd[1478]: time="2025-05-09T02:00:46.058133429Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" May 9 02:00:46.097840 containerd[1478]: time="2025-05-09T02:00:46.096910468Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" May 9 02:00:47.563073 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3577490476.mount: Deactivated successfully. May 9 02:00:47.567695 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 9 02:00:47.571192 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:00:47.964590 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:00:47.979323 (kubelet)[2188]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 9 02:00:48.061726 kubelet[2188]: E0509 02:00:48.061665 2188 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 9 02:00:48.066345 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 9 02:00:48.066615 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 9 02:00:48.067719 systemd[1]: kubelet.service: Consumed 253ms CPU time, 95.6M memory peak. 
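By this point the kubelet has reached "restart counter is at 4"; systemd keeps applying the unit's restart policy while the config file is still missing. A small sketch of watching that loop from systemd's side (the property names are standard systemd ones, not taken from this log):

    # restart policy, delay, and how many restarts systemd has done this boot
    systemctl show kubelet.service -p Restart -p RestartUSec -p NRestarts
    # clear the failed state and counter once the underlying problem is fixed
    sudo systemctl reset-failed kubelet.service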
May 9 02:00:50.991192 containerd[1478]: time="2025-05-09T02:00:50.990292572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:50.998013 containerd[1478]: time="2025-05-09T02:00:50.997864808Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579" May 9 02:00:51.003065 containerd[1478]: time="2025-05-09T02:00:51.002986024Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:51.149445 containerd[1478]: time="2025-05-09T02:00:51.149268242Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:00:51.152354 containerd[1478]: time="2025-05-09T02:00:51.152092842Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 5.055112212s" May 9 02:00:51.152354 containerd[1478]: time="2025-05-09T02:00:51.152164027Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" May 9 02:00:54.668995 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:00:54.669507 systemd[1]: kubelet.service: Consumed 253ms CPU time, 95.6M memory peak. May 9 02:00:54.674233 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:00:54.724406 systemd[1]: Reload requested from client PID 2318 ('systemctl') (unit session-11.scope)... May 9 02:00:54.724442 systemd[1]: Reloading... May 9 02:00:54.853670 zram_generator::config[2367]: No configuration found. May 9 02:00:55.013967 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 9 02:00:55.147097 systemd[1]: Reloading finished in 421 ms. May 9 02:00:55.190506 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 9 02:00:55.191016 systemd[1]: kubelet.service: Failed with result 'signal'. May 9 02:00:55.191737 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:00:55.191853 systemd[1]: kubelet.service: Consumed 105ms CPU time, 83.6M memory peak. May 9 02:00:55.195878 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:00:55.364967 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:00:55.374138 (kubelet)[2428]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 9 02:00:55.649826 kubelet[2428]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 9 02:00:55.649826 kubelet[2428]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. May 9 02:00:55.649826 kubelet[2428]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 9 02:00:55.650575 kubelet[2428]: I0509 02:00:55.650007 2428 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 9 02:00:56.892256 kubelet[2428]: I0509 02:00:56.892146 2428 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 9 02:00:56.892256 kubelet[2428]: I0509 02:00:56.892210 2428 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 9 02:00:56.893195 kubelet[2428]: I0509 02:00:56.892692 2428 server.go:927] "Client rotation is on, will bootstrap in background" May 9 02:00:57.561036 kubelet[2428]: I0509 02:00:57.560983 2428 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 9 02:00:57.576145 kubelet[2428]: E0509 02:00:57.576048 2428 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.122:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.665678 kubelet[2428]: I0509 02:00:57.664768 2428 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 9 02:00:57.665678 kubelet[2428]: I0509 02:00:57.665071 2428 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 9 02:00:57.665678 kubelet[2428]: I0509 02:00:57.665113 2428 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-abffc5acbe.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 9 02:00:57.666114 
kubelet[2428]: I0509 02:00:57.666096 2428 topology_manager.go:138] "Creating topology manager with none policy" May 9 02:00:57.666200 kubelet[2428]: I0509 02:00:57.666188 2428 container_manager_linux.go:301] "Creating device plugin manager" May 9 02:00:57.666510 kubelet[2428]: I0509 02:00:57.666495 2428 state_mem.go:36] "Initialized new in-memory state store" May 9 02:00:57.719973 kubelet[2428]: I0509 02:00:57.719911 2428 kubelet.go:400] "Attempting to sync node with API server" May 9 02:00:57.720839 kubelet[2428]: I0509 02:00:57.720812 2428 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 9 02:00:57.721422 kubelet[2428]: I0509 02:00:57.721375 2428 kubelet.go:312] "Adding apiserver pod source" May 9 02:00:57.721536 kubelet[2428]: I0509 02:00:57.721439 2428 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 9 02:00:57.722704 kubelet[2428]: W0509 02:00:57.722534 2428 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.122:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-abffc5acbe.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.722704 kubelet[2428]: E0509 02:00:57.722702 2428 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.122:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-abffc5acbe.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.806742 kubelet[2428]: W0509 02:00:57.806150 2428 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.122:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.806742 kubelet[2428]: E0509 02:00:57.806261 2428 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.122:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.807616 kubelet[2428]: I0509 02:00:57.807160 2428 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 9 02:00:57.830796 kubelet[2428]: I0509 02:00:57.829938 2428 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 9 02:00:57.830796 kubelet[2428]: W0509 02:00:57.830050 2428 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
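The kubelet above starts its informers, but every request to https://172.24.4.122:6443 fails with "connection refused", which is expected while no kube-apiserver is listening; the static pod path /etc/kubernetes/manifests it just registered is where that apiserver would come from. A short sketch of confirming the symptom from the node, using the address and port from the log:

    # is anything listening on the apiserver port yet?
    ss -tlnp | grep 6443
    # the health endpoint answers once the apiserver static pod is running
    curl -k https://172.24.4.122:6443/healthz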
May 9 02:00:57.832538 kubelet[2428]: I0509 02:00:57.831902 2428 server.go:1264] "Started kubelet" May 9 02:00:57.861008 kubelet[2428]: I0509 02:00:57.860939 2428 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 9 02:00:57.874497 kubelet[2428]: I0509 02:00:57.874387 2428 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 9 02:00:57.876795 kubelet[2428]: I0509 02:00:57.876740 2428 server.go:455] "Adding debug handlers to kubelet server" May 9 02:00:57.879181 kubelet[2428]: I0509 02:00:57.879068 2428 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 9 02:00:57.879564 kubelet[2428]: I0509 02:00:57.879502 2428 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 9 02:00:57.883093 kubelet[2428]: I0509 02:00:57.883017 2428 volume_manager.go:291] "Starting Kubelet Volume Manager" May 9 02:00:57.889726 kubelet[2428]: I0509 02:00:57.889299 2428 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 9 02:00:57.889726 kubelet[2428]: I0509 02:00:57.889431 2428 reconciler.go:26] "Reconciler: start to sync state" May 9 02:00:57.900125 kubelet[2428]: E0509 02:00:57.900016 2428 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.122:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-abffc5acbe.novalocal?timeout=10s\": dial tcp 172.24.4.122:6443: connect: connection refused" interval="200ms" May 9 02:00:57.905765 kubelet[2428]: E0509 02:00:57.904229 2428 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.122:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.122:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-n-abffc5acbe.novalocal.183db96140543fa1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-abffc5acbe.novalocal,UID:ci-4284-0-0-n-abffc5acbe.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-abffc5acbe.novalocal,},FirstTimestamp:2025-05-09 02:00:57.831849889 +0000 UTC m=+2.453140049,LastTimestamp:2025-05-09 02:00:57.831849889 +0000 UTC m=+2.453140049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-abffc5acbe.novalocal,}" May 9 02:00:57.907695 kubelet[2428]: W0509 02:00:57.907514 2428 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.122:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.908522 kubelet[2428]: E0509 02:00:57.907618 2428 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.122:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.911675 kubelet[2428]: I0509 02:00:57.908810 2428 factory.go:221] Registration of the systemd container factory successfully May 9 02:00:57.911675 kubelet[2428]: I0509 02:00:57.909713 2428 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: 
connect: no such file or directory May 9 02:00:57.914423 kubelet[2428]: I0509 02:00:57.914382 2428 factory.go:221] Registration of the containerd container factory successfully May 9 02:00:57.931210 kubelet[2428]: I0509 02:00:57.931118 2428 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 9 02:00:57.938791 kubelet[2428]: I0509 02:00:57.938725 2428 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 9 02:00:57.938924 kubelet[2428]: I0509 02:00:57.938834 2428 status_manager.go:217] "Starting to sync pod status with apiserver" May 9 02:00:57.938924 kubelet[2428]: I0509 02:00:57.938898 2428 kubelet.go:2337] "Starting kubelet main sync loop" May 9 02:00:57.939054 kubelet[2428]: E0509 02:00:57.938990 2428 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 9 02:00:57.943360 kubelet[2428]: W0509 02:00:57.943088 2428 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.122:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.943360 kubelet[2428]: E0509 02:00:57.943194 2428 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.122:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:57.962498 kubelet[2428]: I0509 02:00:57.962451 2428 cpu_manager.go:214] "Starting CPU manager" policy="none" May 9 02:00:57.962498 kubelet[2428]: I0509 02:00:57.962490 2428 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 9 02:00:57.962498 kubelet[2428]: I0509 02:00:57.962529 2428 state_mem.go:36] "Initialized new in-memory state store" May 9 02:00:57.970363 kubelet[2428]: I0509 02:00:57.970316 2428 policy_none.go:49] "None policy: Start" May 9 02:00:57.971690 kubelet[2428]: I0509 02:00:57.971393 2428 memory_manager.go:170] "Starting memorymanager" policy="None" May 9 02:00:57.971690 kubelet[2428]: I0509 02:00:57.971429 2428 state_mem.go:35] "Initializing new in-memory state store" May 9 02:00:57.985980 kubelet[2428]: I0509 02:00:57.985305 2428 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:57.986705 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 9 02:00:57.988157 kubelet[2428]: E0509 02:00:57.988095 2428 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.122:6443/api/v1/nodes\": dial tcp 172.24.4.122:6443: connect: connection refused" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.005846 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 9 02:00:58.010352 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
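The three slices created here form the kubelet's QoS cgroup hierarchy under the systemd driver: Guaranteed pods sit directly under kubepods.slice, Burstable pods under kubepods-burstable.slice, and BestEffort pods under kubepods-besteffort.slice, with one kubepods-*-pod<UID>.slice nested inside per pod (as the next entries show for the static control-plane pods). The class is derived from a pod's resource requests and limits; a hypothetical pod like the one below, with requests set but no matching limits, would be classed Burstable and land in kubepods-burstable.slice:

apiVersion: v1
kind: Pod
metadata:
  name: qos-example              # illustrative pod, not taken from this log
spec:
  containers:
  - name: app
    image: registry.k8s.io/pause:3.10
    resources:
      requests:
        cpu: 100m                # requests without equal limits -> Burstable QoS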
May 9 02:00:58.016158 kubelet[2428]: I0509 02:00:58.016040 2428 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 9 02:00:58.016439 kubelet[2428]: I0509 02:00:58.016361 2428 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 9 02:00:58.016590 kubelet[2428]: I0509 02:00:58.016564 2428 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 9 02:00:58.020430 kubelet[2428]: E0509 02:00:58.020337 2428 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-n-abffc5acbe.novalocal\" not found" May 9 02:00:58.039715 kubelet[2428]: I0509 02:00:58.039638 2428 topology_manager.go:215] "Topology Admit Handler" podUID="7c219790ed34d26d2192a0697cf28beb" podNamespace="kube-system" podName="kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.042293 kubelet[2428]: I0509 02:00:58.042129 2428 topology_manager.go:215] "Topology Admit Handler" podUID="22f1b64f13ac577a5c2c21597862b45c" podNamespace="kube-system" podName="kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.045071 kubelet[2428]: I0509 02:00:58.045051 2428 topology_manager.go:215] "Topology Admit Handler" podUID="f5f05c3d9503d25d590436808668265a" podNamespace="kube-system" podName="kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.064851 systemd[1]: Created slice kubepods-burstable-pod7c219790ed34d26d2192a0697cf28beb.slice - libcontainer container kubepods-burstable-pod7c219790ed34d26d2192a0697cf28beb.slice. May 9 02:00:58.085519 systemd[1]: Created slice kubepods-burstable-pod22f1b64f13ac577a5c2c21597862b45c.slice - libcontainer container kubepods-burstable-pod22f1b64f13ac577a5c2c21597862b45c.slice. 
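The three pods admitted here are static pods the kubelet reads from /etc/kubernetes/manifests rather than from the still-unreachable API server. Their real manifests are not part of this log; the sketch below only shows the general shape such a manifest takes, wired to the host-path volumes the reconciler attaches in the next entries (the image tag and mount paths are assumptions following common kubeadm layouts):

apiVersion: v1
kind: Pod
metadata:
  name: kube-apiserver
  namespace: kube-system
spec:
  hostNetwork: true
  priorityClassName: system-node-critical
  containers:
  - name: kube-apiserver
    image: registry.k8s.io/kube-apiserver:v1.30.1   # assumed to match the kubelet version logged above
    volumeMounts:
    - name: k8s-certs
      mountPath: /etc/kubernetes/pki
      readOnly: true
    - name: ca-certs
      mountPath: /etc/ssl/certs
      readOnly: true
  volumes:
  - name: k8s-certs
    hostPath:
      path: /etc/kubernetes/pki
      type: DirectoryOrCreate
  - name: ca-certs
    hostPath:
      path: /etc/ssl/certs
      type: DirectoryOrCreate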
May 9 02:00:58.099325 kubelet[2428]: I0509 02:00:58.099274 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7c219790ed34d26d2192a0697cf28beb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"7c219790ed34d26d2192a0697cf28beb\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.099762 kubelet[2428]: I0509 02:00:58.099674 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.100712 kubelet[2428]: I0509 02:00:58.100516 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.100712 kubelet[2428]: I0509 02:00:58.100651 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.101194 kubelet[2428]: E0509 02:00:58.100888 2428 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.122:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-abffc5acbe.novalocal?timeout=10s\": dial tcp 172.24.4.122:6443: connect: connection refused" interval="400ms" May 9 02:00:58.101194 kubelet[2428]: I0509 02:00:58.101035 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.102100 kubelet[2428]: I0509 02:00:58.101408 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7c219790ed34d26d2192a0697cf28beb-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"7c219790ed34d26d2192a0697cf28beb\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.102984 kubelet[2428]: I0509 02:00:58.102248 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7c219790ed34d26d2192a0697cf28beb-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"7c219790ed34d26d2192a0697cf28beb\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.102984 kubelet[2428]: I0509 
02:00:58.102340 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.102984 kubelet[2428]: I0509 02:00:58.102389 2428 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f5f05c3d9503d25d590436808668265a-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"f5f05c3d9503d25d590436808668265a\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.104398 systemd[1]: Created slice kubepods-burstable-podf5f05c3d9503d25d590436808668265a.slice - libcontainer container kubepods-burstable-podf5f05c3d9503d25d590436808668265a.slice. May 9 02:00:58.191922 kubelet[2428]: I0509 02:00:58.191807 2428 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.192898 kubelet[2428]: E0509 02:00:58.192818 2428 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.122:6443/api/v1/nodes\": dial tcp 172.24.4.122:6443: connect: connection refused" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.383343 containerd[1478]: time="2025-05-09T02:00:58.382946279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal,Uid:7c219790ed34d26d2192a0697cf28beb,Namespace:kube-system,Attempt:0,}" May 9 02:00:58.397347 containerd[1478]: time="2025-05-09T02:00:58.397065253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal,Uid:22f1b64f13ac577a5c2c21597862b45c,Namespace:kube-system,Attempt:0,}" May 9 02:00:58.410548 containerd[1478]: time="2025-05-09T02:00:58.410409148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal,Uid:f5f05c3d9503d25d590436808668265a,Namespace:kube-system,Attempt:0,}" May 9 02:00:58.501946 kubelet[2428]: E0509 02:00:58.501808 2428 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.122:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-abffc5acbe.novalocal?timeout=10s\": dial tcp 172.24.4.122:6443: connect: connection refused" interval="800ms" May 9 02:00:58.597094 kubelet[2428]: I0509 02:00:58.596997 2428 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.597888 kubelet[2428]: E0509 02:00:58.597802 2428 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.122:6443/api/v1/nodes\": dial tcp 172.24.4.122:6443: connect: connection refused" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:58.719787 kubelet[2428]: W0509 02:00:58.719401 2428 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.122:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-abffc5acbe.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:58.719787 kubelet[2428]: E0509 02:00:58.719560 2428 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get "https://172.24.4.122:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-abffc5acbe.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:58.960817 kubelet[2428]: W0509 02:00:58.960520 2428 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.122:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:58.960817 kubelet[2428]: E0509 02:00:58.960693 2428 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.122:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:58.991497 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3454030627.mount: Deactivated successfully. May 9 02:00:59.000874 containerd[1478]: time="2025-05-09T02:00:59.000793600Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 9 02:00:59.003979 containerd[1478]: time="2025-05-09T02:00:59.003886692Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 9 02:00:59.007168 containerd[1478]: time="2025-05-09T02:00:59.007050848Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 9 02:00:59.008525 containerd[1478]: time="2025-05-09T02:00:59.008440765Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 9 02:00:59.012615 containerd[1478]: time="2025-05-09T02:00:59.012499906Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 9 02:00:59.015541 containerd[1478]: time="2025-05-09T02:00:59.015171516Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 9 02:00:59.015541 containerd[1478]: time="2025-05-09T02:00:59.015391730Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 9 02:00:59.021144 containerd[1478]: time="2025-05-09T02:00:59.021025374Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 9 02:00:59.024015 kubelet[2428]: W0509 02:00:59.023917 2428 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.122:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:59.024990 containerd[1478]: time="2025-05-09T02:00:59.024828344Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest 
\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 603.734347ms" May 9 02:00:59.025116 kubelet[2428]: E0509 02:00:59.024948 2428 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.122:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:59.030705 containerd[1478]: time="2025-05-09T02:00:59.029870836Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 641.384355ms" May 9 02:00:59.031451 containerd[1478]: time="2025-05-09T02:00:59.031398992Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 629.426981ms" May 9 02:00:59.099268 containerd[1478]: time="2025-05-09T02:00:59.099177511Z" level=info msg="connecting to shim 2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0" address="unix:///run/containerd/s/35cf1c592c0a4a9ce6f25c63011c10c42f891f369a678f8cb5e6fa2e77dfe84a" namespace=k8s.io protocol=ttrpc version=3 May 9 02:00:59.133819 systemd[1]: Started cri-containerd-2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0.scope - libcontainer container 2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0. 
May 9 02:00:59.216078 containerd[1478]: time="2025-05-09T02:00:59.215995133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal,Uid:f5f05c3d9503d25d590436808668265a,Namespace:kube-system,Attempt:0,} returns sandbox id \"2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0\"" May 9 02:00:59.222788 containerd[1478]: time="2025-05-09T02:00:59.222734759Z" level=info msg="CreateContainer within sandbox \"2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 9 02:00:59.295747 kubelet[2428]: W0509 02:00:59.294431 2428 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.122:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:59.295747 kubelet[2428]: E0509 02:00:59.294564 2428 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.122:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:59.303452 kubelet[2428]: E0509 02:00:59.303357 2428 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.122:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-abffc5acbe.novalocal?timeout=10s\": dial tcp 172.24.4.122:6443: connect: connection refused" interval="1.6s" May 9 02:00:59.348325 containerd[1478]: time="2025-05-09T02:00:59.347746465Z" level=info msg="connecting to shim 860fd1fca0d8689c4fc5bba3eeb01cace53d2963f7e79afa7d6b2545dfdd77f5" address="unix:///run/containerd/s/b0e70f9be2c657faad33e9a63a4454821ef931c6c3d7646b279544ec48569d3a" namespace=k8s.io protocol=ttrpc version=3 May 9 02:00:59.382014 systemd[1]: Started cri-containerd-860fd1fca0d8689c4fc5bba3eeb01cace53d2963f7e79afa7d6b2545dfdd77f5.scope - libcontainer container 860fd1fca0d8689c4fc5bba3eeb01cace53d2963f7e79afa7d6b2545dfdd77f5. May 9 02:00:59.402571 kubelet[2428]: I0509 02:00:59.402471 2428 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:59.403791 kubelet[2428]: E0509 02:00:59.403403 2428 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.122:6443/api/v1/nodes\": dial tcp 172.24.4.122:6443: connect: connection refused" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:00:59.451278 containerd[1478]: time="2025-05-09T02:00:59.451150086Z" level=info msg="connecting to shim 062e4616d77d53c4de23c564bfaccfc5d071e7b63d36d1749445bc49aa18c3ba" address="unix:///run/containerd/s/7186b3d37acd1758c3d4558216789d30ee426993b1936aa786c8bb03e726ad44" namespace=k8s.io protocol=ttrpc version=3 May 9 02:00:59.478796 systemd[1]: Started cri-containerd-062e4616d77d53c4de23c564bfaccfc5d071e7b63d36d1749445bc49aa18c3ba.scope - libcontainer container 062e4616d77d53c4de23c564bfaccfc5d071e7b63d36d1749445bc49aa18c3ba. 
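The repeated "Failed to ensure lease exists" errors refer to the per-node Lease object in the kube-node-lease namespace that the kubelet renews as its heartbeat; it cannot be created until the API server this node is in the middle of starting becomes reachable, which is why the retry interval keeps backing off. Once the API server is up, the object the kubelet maintains looks roughly like this (field values illustrative, leaseDurationSeconds shown at the kubelet default of 40):

apiVersion: coordination.k8s.io/v1
kind: Lease
metadata:
  name: ci-4284-0-0-n-abffc5acbe.novalocal       # node leases are named after the node
  namespace: kube-node-lease
spec:
  holderIdentity: ci-4284-0-0-n-abffc5acbe.novalocal
  leaseDurationSeconds: 40
  renewTime: "2025-05-09T02:01:06.000000Z"       # renewed roughly every 10s by the kubelet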
May 9 02:00:59.594826 kubelet[2428]: E0509 02:00:59.594742 2428 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.122:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.122:6443: connect: connection refused May 9 02:00:59.616921 containerd[1478]: time="2025-05-09T02:00:59.616677079Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal,Uid:22f1b64f13ac577a5c2c21597862b45c,Namespace:kube-system,Attempt:0,} returns sandbox id \"860fd1fca0d8689c4fc5bba3eeb01cace53d2963f7e79afa7d6b2545dfdd77f5\"" May 9 02:00:59.623759 containerd[1478]: time="2025-05-09T02:00:59.622379473Z" level=info msg="CreateContainer within sandbox \"860fd1fca0d8689c4fc5bba3eeb01cace53d2963f7e79afa7d6b2545dfdd77f5\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 9 02:00:59.641992 containerd[1478]: time="2025-05-09T02:00:59.641928313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal,Uid:7c219790ed34d26d2192a0697cf28beb,Namespace:kube-system,Attempt:0,} returns sandbox id \"062e4616d77d53c4de23c564bfaccfc5d071e7b63d36d1749445bc49aa18c3ba\"" May 9 02:00:59.651403 containerd[1478]: time="2025-05-09T02:00:59.651330892Z" level=info msg="Container cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000: CDI devices from CRI Config.CDIDevices: []" May 9 02:00:59.652500 containerd[1478]: time="2025-05-09T02:00:59.652120779Z" level=info msg="CreateContainer within sandbox \"062e4616d77d53c4de23c564bfaccfc5d071e7b63d36d1749445bc49aa18c3ba\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 9 02:00:59.668377 containerd[1478]: time="2025-05-09T02:00:59.668315905Z" level=info msg="Container 4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265: CDI devices from CRI Config.CDIDevices: []" May 9 02:00:59.674369 containerd[1478]: time="2025-05-09T02:00:59.674305711Z" level=info msg="CreateContainer within sandbox \"2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000\"" May 9 02:00:59.676692 containerd[1478]: time="2025-05-09T02:00:59.675773814Z" level=info msg="StartContainer for \"cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000\"" May 9 02:00:59.689618 containerd[1478]: time="2025-05-09T02:00:59.689526213Z" level=info msg="connecting to shim cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000" address="unix:///run/containerd/s/35cf1c592c0a4a9ce6f25c63011c10c42f891f369a678f8cb5e6fa2e77dfe84a" protocol=ttrpc version=3 May 9 02:00:59.697709 containerd[1478]: time="2025-05-09T02:00:59.696886408Z" level=info msg="Container 99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e: CDI devices from CRI Config.CDIDevices: []" May 9 02:00:59.702289 containerd[1478]: time="2025-05-09T02:00:59.702205441Z" level=info msg="CreateContainer within sandbox \"860fd1fca0d8689c4fc5bba3eeb01cace53d2963f7e79afa7d6b2545dfdd77f5\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265\"" May 9 02:00:59.704086 containerd[1478]: time="2025-05-09T02:00:59.703897075Z" level=info msg="StartContainer 
for \"4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265\"" May 9 02:00:59.709945 containerd[1478]: time="2025-05-09T02:00:59.709871972Z" level=info msg="connecting to shim 4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265" address="unix:///run/containerd/s/b0e70f9be2c657faad33e9a63a4454821ef931c6c3d7646b279544ec48569d3a" protocol=ttrpc version=3 May 9 02:00:59.722994 containerd[1478]: time="2025-05-09T02:00:59.722820677Z" level=info msg="CreateContainer within sandbox \"062e4616d77d53c4de23c564bfaccfc5d071e7b63d36d1749445bc49aa18c3ba\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e\"" May 9 02:00:59.724120 containerd[1478]: time="2025-05-09T02:00:59.724058558Z" level=info msg="StartContainer for \"99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e\"" May 9 02:00:59.729111 containerd[1478]: time="2025-05-09T02:00:59.729037550Z" level=info msg="connecting to shim 99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e" address="unix:///run/containerd/s/7186b3d37acd1758c3d4558216789d30ee426993b1936aa786c8bb03e726ad44" protocol=ttrpc version=3 May 9 02:00:59.746852 systemd[1]: Started cri-containerd-cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000.scope - libcontainer container cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000. May 9 02:00:59.758738 systemd[1]: Started cri-containerd-4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265.scope - libcontainer container 4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265. May 9 02:00:59.775802 systemd[1]: Started cri-containerd-99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e.scope - libcontainer container 99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e. 
May 9 02:00:59.839219 containerd[1478]: time="2025-05-09T02:00:59.838824667Z" level=info msg="StartContainer for \"cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000\" returns successfully" May 9 02:00:59.870209 containerd[1478]: time="2025-05-09T02:00:59.869477369Z" level=info msg="StartContainer for \"4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265\" returns successfully" May 9 02:00:59.873595 containerd[1478]: time="2025-05-09T02:00:59.873424559Z" level=info msg="StartContainer for \"99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e\" returns successfully" May 9 02:01:01.005577 kubelet[2428]: I0509 02:01:01.005542 2428 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:01.665250 kubelet[2428]: E0509 02:01:01.665187 2428 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-n-abffc5acbe.novalocal\" not found" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:01.725687 kubelet[2428]: I0509 02:01:01.725658 2428 apiserver.go:52] "Watching apiserver" May 9 02:01:01.789917 kubelet[2428]: I0509 02:01:01.789851 2428 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 9 02:01:01.832084 kubelet[2428]: I0509 02:01:01.832055 2428 kubelet_node_status.go:76] "Successfully registered node" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:02.510412 kubelet[2428]: E0509 02:01:02.510350 2428 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:03.893502 kubelet[2428]: W0509 02:01:03.893073 2428 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 9 02:01:04.629048 systemd[1]: Reload requested from client PID 2710 ('systemctl') (unit session-11.scope)... May 9 02:01:04.629699 systemd[1]: Reloading... May 9 02:01:04.771666 zram_generator::config[2759]: No configuration found. May 9 02:01:04.932154 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 9 02:01:05.084155 systemd[1]: Reloading finished in 453 ms. May 9 02:01:05.116865 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:01:05.129144 systemd[1]: kubelet.service: Deactivated successfully. May 9 02:01:05.129516 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:01:05.129568 systemd[1]: kubelet.service: Consumed 1.860s CPU time, 115.7M memory peak. May 9 02:01:05.134061 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 9 02:01:05.267352 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 9 02:01:05.276129 (kubelet)[2820]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 9 02:01:05.349082 kubelet[2820]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
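The "no PriorityClass with name system-node-critical was found" failure logged a few entries back is transient: the static control-plane pods request that priority class, but it is one of the two built-in classes the API server creates for itself shortly after it starts serving, so the kubelet's first mirror-pod attempt can race that bootstrap. Reconstructed from the well-known defaults (not dumped from this cluster), the object looks like:

apiVersion: scheduling.k8s.io/v1
kind: PriorityClass
metadata:
  name: system-node-critical
value: 2000001000            # highest built-in priority
description: Used for system critical pods that must not be moved from their current node.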
May 9 02:01:05.349082 kubelet[2820]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 9 02:01:05.349082 kubelet[2820]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 9 02:01:05.349445 kubelet[2820]: I0509 02:01:05.349154 2820 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 9 02:01:05.357291 kubelet[2820]: I0509 02:01:05.355683 2820 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 9 02:01:05.357291 kubelet[2820]: I0509 02:01:05.355707 2820 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 9 02:01:05.357291 kubelet[2820]: I0509 02:01:05.355912 2820 server.go:927] "Client rotation is on, will bootstrap in background" May 9 02:01:05.357535 kubelet[2820]: I0509 02:01:05.357522 2820 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 9 02:01:05.358892 kubelet[2820]: I0509 02:01:05.358866 2820 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 9 02:01:05.364151 kubelet[2820]: I0509 02:01:05.364135 2820 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 9 02:01:05.364435 kubelet[2820]: I0509 02:01:05.364410 2820 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 9 02:01:05.364711 kubelet[2820]: I0509 02:01:05.364494 2820 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-abffc5acbe.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 9 02:01:05.364833 kubelet[2820]: I0509 02:01:05.364822 2820 topology_manager.go:138] "Creating topology manager with none 
policy" May 9 02:01:05.364910 kubelet[2820]: I0509 02:01:05.364901 2820 container_manager_linux.go:301] "Creating device plugin manager" May 9 02:01:05.365000 kubelet[2820]: I0509 02:01:05.364990 2820 state_mem.go:36] "Initialized new in-memory state store" May 9 02:01:05.365128 kubelet[2820]: I0509 02:01:05.365117 2820 kubelet.go:400] "Attempting to sync node with API server" May 9 02:01:05.365197 kubelet[2820]: I0509 02:01:05.365187 2820 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 9 02:01:05.365260 kubelet[2820]: I0509 02:01:05.365252 2820 kubelet.go:312] "Adding apiserver pod source" May 9 02:01:05.365324 kubelet[2820]: I0509 02:01:05.365315 2820 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 9 02:01:05.370639 kubelet[2820]: I0509 02:01:05.368976 2820 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 9 02:01:05.370908 kubelet[2820]: I0509 02:01:05.370895 2820 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 9 02:01:05.371330 kubelet[2820]: I0509 02:01:05.371317 2820 server.go:1264] "Started kubelet" May 9 02:01:05.371487 kubelet[2820]: I0509 02:01:05.371452 2820 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 9 02:01:05.371805 kubelet[2820]: I0509 02:01:05.371763 2820 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 9 02:01:05.372816 kubelet[2820]: I0509 02:01:05.372796 2820 server.go:455] "Adding debug handlers to kubelet server" May 9 02:01:05.373897 kubelet[2820]: I0509 02:01:05.373282 2820 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 9 02:01:05.376539 kubelet[2820]: I0509 02:01:05.376525 2820 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 9 02:01:05.387012 kubelet[2820]: I0509 02:01:05.386976 2820 volume_manager.go:291] "Starting Kubelet Volume Manager" May 9 02:01:05.387555 kubelet[2820]: I0509 02:01:05.387532 2820 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 9 02:01:05.387716 kubelet[2820]: I0509 02:01:05.387695 2820 reconciler.go:26] "Reconciler: start to sync state" May 9 02:01:05.395171 kubelet[2820]: I0509 02:01:05.395074 2820 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 9 02:01:05.396736 kubelet[2820]: I0509 02:01:05.396403 2820 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 9 02:01:05.396736 kubelet[2820]: I0509 02:01:05.396426 2820 status_manager.go:217] "Starting to sync pod status with apiserver" May 9 02:01:05.396736 kubelet[2820]: I0509 02:01:05.396445 2820 kubelet.go:2337] "Starting kubelet main sync loop" May 9 02:01:05.396736 kubelet[2820]: E0509 02:01:05.396479 2820 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 9 02:01:05.411675 kubelet[2820]: I0509 02:01:05.408193 2820 factory.go:221] Registration of the systemd container factory successfully May 9 02:01:05.411675 kubelet[2820]: I0509 02:01:05.408281 2820 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 9 02:01:05.411675 kubelet[2820]: I0509 02:01:05.410101 2820 factory.go:221] Registration of the containerd container factory successfully May 9 02:01:05.450879 kubelet[2820]: I0509 02:01:05.450854 2820 cpu_manager.go:214] "Starting CPU manager" policy="none" May 9 02:01:05.450879 kubelet[2820]: I0509 02:01:05.450872 2820 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 9 02:01:05.450879 kubelet[2820]: I0509 02:01:05.450888 2820 state_mem.go:36] "Initialized new in-memory state store" May 9 02:01:05.451061 kubelet[2820]: I0509 02:01:05.451040 2820 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 9 02:01:05.451061 kubelet[2820]: I0509 02:01:05.451052 2820 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 9 02:01:05.451123 kubelet[2820]: I0509 02:01:05.451069 2820 policy_none.go:49] "None policy: Start" May 9 02:01:05.451740 kubelet[2820]: I0509 02:01:05.451720 2820 memory_manager.go:170] "Starting memorymanager" policy="None" May 9 02:01:05.451809 kubelet[2820]: I0509 02:01:05.451742 2820 state_mem.go:35] "Initializing new in-memory state store" May 9 02:01:05.451874 kubelet[2820]: I0509 02:01:05.451855 2820 state_mem.go:75] "Updated machine memory state" May 9 02:01:05.456111 kubelet[2820]: I0509 02:01:05.456081 2820 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 9 02:01:05.456249 kubelet[2820]: I0509 02:01:05.456220 2820 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 9 02:01:05.456310 kubelet[2820]: I0509 02:01:05.456297 2820 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 9 02:01:05.489379 kubelet[2820]: I0509 02:01:05.489355 2820 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.497494 kubelet[2820]: I0509 02:01:05.497442 2820 topology_manager.go:215] "Topology Admit Handler" podUID="7c219790ed34d26d2192a0697cf28beb" podNamespace="kube-system" podName="kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.497579 kubelet[2820]: I0509 02:01:05.497558 2820 topology_manager.go:215] "Topology Admit Handler" podUID="22f1b64f13ac577a5c2c21597862b45c" podNamespace="kube-system" podName="kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.497609 kubelet[2820]: I0509 02:01:05.497592 2820 topology_manager.go:215] "Topology Admit Handler" podUID="f5f05c3d9503d25d590436808668265a" podNamespace="kube-system" podName="kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.589233 kubelet[2820]: I0509 02:01:05.589063 2820 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.589524 kubelet[2820]: I0509 02:01:05.589430 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.589694 kubelet[2820]: I0509 02:01:05.589585 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.589870 kubelet[2820]: I0509 02:01:05.589796 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f5f05c3d9503d25d590436808668265a-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"f5f05c3d9503d25d590436808668265a\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.590015 kubelet[2820]: I0509 02:01:05.589956 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7c219790ed34d26d2192a0697cf28beb-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"7c219790ed34d26d2192a0697cf28beb\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.590150 kubelet[2820]: I0509 02:01:05.590102 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7c219790ed34d26d2192a0697cf28beb-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"7c219790ed34d26d2192a0697cf28beb\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.590276 kubelet[2820]: I0509 02:01:05.590232 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.590346 kubelet[2820]: I0509 02:01:05.590301 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7c219790ed34d26d2192a0697cf28beb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"7c219790ed34d26d2192a0697cf28beb\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.590417 
kubelet[2820]: I0509 02:01:05.590386 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/22f1b64f13ac577a5c2c21597862b45c-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" (UID: \"22f1b64f13ac577a5c2c21597862b45c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:05.854779 kubelet[2820]: W0509 02:01:05.854133 2820 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 9 02:01:05.856564 kubelet[2820]: W0509 02:01:05.856504 2820 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 9 02:01:05.871830 kubelet[2820]: W0509 02:01:05.871787 2820 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 9 02:01:05.873533 kubelet[2820]: E0509 02:01:05.872271 2820 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal\" already exists" pod="kube-system/kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:06.262552 kubelet[2820]: I0509 02:01:06.260897 2820 kubelet_node_status.go:112] "Node was previously registered" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:06.262552 kubelet[2820]: I0509 02:01:06.261027 2820 kubelet_node_status.go:76] "Successfully registered node" node="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:06.365796 kubelet[2820]: I0509 02:01:06.365733 2820 apiserver.go:52] "Watching apiserver" May 9 02:01:06.388092 kubelet[2820]: I0509 02:01:06.388024 2820 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 9 02:01:06.544725 kubelet[2820]: W0509 02:01:06.543811 2820 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 9 02:01:06.544725 kubelet[2820]: E0509 02:01:06.543974 2820 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:06.551302 kubelet[2820]: W0509 02:01:06.550116 2820 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 9 02:01:06.551302 kubelet[2820]: E0509 02:01:06.550249 2820 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal\" already exists" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:01:06.913553 kubelet[2820]: I0509 02:01:06.913260 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-n-abffc5acbe.novalocal" podStartSLOduration=1.913226833 podStartE2EDuration="1.913226833s" podCreationTimestamp="2025-05-09 02:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-09 02:01:06.547885494 +0000 UTC m=+1.264814764" watchObservedRunningTime="2025-05-09 02:01:06.913226833 +0000 UTC m=+1.630156103" May 9 02:01:07.298317 
kubelet[2820]: I0509 02:01:07.297603 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-abffc5acbe.novalocal" podStartSLOduration=2.297569361 podStartE2EDuration="2.297569361s" podCreationTimestamp="2025-05-09 02:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-09 02:01:06.916137063 +0000 UTC m=+1.633066333" watchObservedRunningTime="2025-05-09 02:01:07.297569361 +0000 UTC m=+2.014498632" May 9 02:01:07.547178 kubelet[2820]: I0509 02:01:07.547050 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-n-abffc5acbe.novalocal" podStartSLOduration=4.547015769 podStartE2EDuration="4.547015769s" podCreationTimestamp="2025-05-09 02:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-09 02:01:07.302310362 +0000 UTC m=+2.019239632" watchObservedRunningTime="2025-05-09 02:01:07.547015769 +0000 UTC m=+2.263945039" May 9 02:01:13.315618 sudo[1758]: pam_unix(sudo:session): session closed for user root May 9 02:01:13.557717 sshd[1757]: Connection closed by 172.24.4.1 port 56672 May 9 02:01:13.558989 sshd-session[1754]: pam_unix(sshd:session): session closed for user core May 9 02:01:13.567382 systemd[1]: sshd@8-172.24.4.122:22-172.24.4.1:56672.service: Deactivated successfully. May 9 02:01:13.574148 systemd[1]: session-11.scope: Deactivated successfully. May 9 02:01:13.574990 systemd[1]: session-11.scope: Consumed 7.142s CPU time, 243.7M memory peak. May 9 02:01:13.578615 systemd-logind[1456]: Session 11 logged out. Waiting for processes to exit. May 9 02:01:13.582082 systemd-logind[1456]: Removed session 11. May 9 02:01:19.106045 kubelet[2820]: I0509 02:01:19.105828 2820 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 9 02:01:19.106808 containerd[1478]: time="2025-05-09T02:01:19.106741133Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 9 02:01:19.107448 kubelet[2820]: I0509 02:01:19.106973 2820 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 9 02:01:19.286368 kubelet[2820]: I0509 02:01:19.286316 2820 topology_manager.go:215] "Topology Admit Handler" podUID="71781194-1c59-489b-9e57-93d2d5b2284c" podNamespace="kube-system" podName="kube-proxy-zzckk" May 9 02:01:19.299897 systemd[1]: Created slice kubepods-besteffort-pod71781194_1c59_489b_9e57_93d2d5b2284c.slice - libcontainer container kubepods-besteffort-pod71781194_1c59_489b_9e57_93d2d5b2284c.slice. 
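The newly admitted kube-proxy-zzckk pod is managed by the kube-proxy DaemonSet in kube-system; the DaemonSet spec itself is not in this log. The volume names the reconciler attaches in the next entries map onto a volumes section of roughly this shape (a sketch following the usual kubeadm layout):

volumes:
- name: kube-proxy
  configMap:
    name: kube-proxy           # holds config.conf and kubeconfig.conf for kube-proxy
- name: xtables-lock
  hostPath:
    path: /run/xtables.lock
    type: FileOrCreate         # serialises iptables access with other writers on the host
- name: lib-modules
  hostPath:
    path: /lib/modules         # kernel modules, mounted read-only by the container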
May 9 02:01:19.484175 kubelet[2820]: I0509 02:01:19.483654 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/71781194-1c59-489b-9e57-93d2d5b2284c-kube-proxy\") pod \"kube-proxy-zzckk\" (UID: \"71781194-1c59-489b-9e57-93d2d5b2284c\") " pod="kube-system/kube-proxy-zzckk" May 9 02:01:19.484175 kubelet[2820]: I0509 02:01:19.483722 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/71781194-1c59-489b-9e57-93d2d5b2284c-xtables-lock\") pod \"kube-proxy-zzckk\" (UID: \"71781194-1c59-489b-9e57-93d2d5b2284c\") " pod="kube-system/kube-proxy-zzckk" May 9 02:01:19.484175 kubelet[2820]: I0509 02:01:19.483771 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4nh\" (UniqueName: \"kubernetes.io/projected/71781194-1c59-489b-9e57-93d2d5b2284c-kube-api-access-jb4nh\") pod \"kube-proxy-zzckk\" (UID: \"71781194-1c59-489b-9e57-93d2d5b2284c\") " pod="kube-system/kube-proxy-zzckk" May 9 02:01:19.484175 kubelet[2820]: I0509 02:01:19.483816 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71781194-1c59-489b-9e57-93d2d5b2284c-lib-modules\") pod \"kube-proxy-zzckk\" (UID: \"71781194-1c59-489b-9e57-93d2d5b2284c\") " pod="kube-system/kube-proxy-zzckk" May 9 02:01:19.773231 kubelet[2820]: I0509 02:01:19.772164 2820 topology_manager.go:215] "Topology Admit Handler" podUID="3e612bfb-d690-47e4-8ffc-cbebd4deca47" podNamespace="tigera-operator" podName="tigera-operator-797db67f8-nj42b" May 9 02:01:19.775827 kubelet[2820]: W0509 02:01:19.775746 2820 reflector.go:547] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4284-0-0-n-abffc5acbe.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-abffc5acbe.novalocal' and this object May 9 02:01:19.775827 kubelet[2820]: E0509 02:01:19.775814 2820 reflector.go:150] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4284-0-0-n-abffc5acbe.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-abffc5acbe.novalocal' and this object May 9 02:01:19.776381 kubelet[2820]: W0509 02:01:19.776228 2820 reflector.go:547] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-n-abffc5acbe.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-abffc5acbe.novalocal' and this object May 9 02:01:19.776381 kubelet[2820]: E0509 02:01:19.776361 2820 reflector.go:150] object-"tigera-operator"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-n-abffc5acbe.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 
'ci-4284-0-0-n-abffc5acbe.novalocal' and this object May 9 02:01:19.782184 systemd[1]: Created slice kubepods-besteffort-pod3e612bfb_d690_47e4_8ffc_cbebd4deca47.slice - libcontainer container kubepods-besteffort-pod3e612bfb_d690_47e4_8ffc_cbebd4deca47.slice. May 9 02:01:19.887486 kubelet[2820]: I0509 02:01:19.887338 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3e612bfb-d690-47e4-8ffc-cbebd4deca47-var-lib-calico\") pod \"tigera-operator-797db67f8-nj42b\" (UID: \"3e612bfb-d690-47e4-8ffc-cbebd4deca47\") " pod="tigera-operator/tigera-operator-797db67f8-nj42b" May 9 02:01:19.887486 kubelet[2820]: I0509 02:01:19.887448 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv7qh\" (UniqueName: \"kubernetes.io/projected/3e612bfb-d690-47e4-8ffc-cbebd4deca47-kube-api-access-rv7qh\") pod \"tigera-operator-797db67f8-nj42b\" (UID: \"3e612bfb-d690-47e4-8ffc-cbebd4deca47\") " pod="tigera-operator/tigera-operator-797db67f8-nj42b" May 9 02:01:19.910468 containerd[1478]: time="2025-05-09T02:01:19.910149938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zzckk,Uid:71781194-1c59-489b-9e57-93d2d5b2284c,Namespace:kube-system,Attempt:0,}" May 9 02:01:19.964163 containerd[1478]: time="2025-05-09T02:01:19.960001139Z" level=info msg="connecting to shim 1d032d3afc92079fd72862b41d05a76af54686b0800628500edb97f5acd96d86" address="unix:///run/containerd/s/46cb2a346ab88c4786769873902dc35cfd446c8268ff0344adc47f76727f9b3d" namespace=k8s.io protocol=ttrpc version=3 May 9 02:01:20.008800 systemd[1]: Started cri-containerd-1d032d3afc92079fd72862b41d05a76af54686b0800628500edb97f5acd96d86.scope - libcontainer container 1d032d3afc92079fd72862b41d05a76af54686b0800628500edb97f5acd96d86. 
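The two "forbidden ... no relationship found between node ... and this object" errors above come from the node authorizer: a kubelet may only read ConfigMaps referenced by pods already bound to its node, and these watches fired before the tigera-operator pod's binding was visible to the authorizer, so they normally succeed on retry. The kube-root-ca.crt ConfigMap is needed because it feeds the kube-api-access-rv7qh volume attached above, a projected volume the API server generates with roughly this content (reproduced here as a sketch):

- name: kube-api-access-rv7qh
  projected:
    defaultMode: 420
    sources:
    - serviceAccountToken:
        expirationSeconds: 3607
        path: token
    - configMap:
        name: kube-root-ca.crt          # the ConfigMap the kubelet was denied above
        items:
        - key: ca.crt
          path: ca.crt
    - downwardAPI:
        items:
        - path: namespace
          fieldRef:
            apiVersion: v1
            fieldPath: metadata.namespace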
May 9 02:01:20.045257 containerd[1478]: time="2025-05-09T02:01:20.045172606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zzckk,Uid:71781194-1c59-489b-9e57-93d2d5b2284c,Namespace:kube-system,Attempt:0,} returns sandbox id \"1d032d3afc92079fd72862b41d05a76af54686b0800628500edb97f5acd96d86\"" May 9 02:01:20.050779 containerd[1478]: time="2025-05-09T02:01:20.050448399Z" level=info msg="CreateContainer within sandbox \"1d032d3afc92079fd72862b41d05a76af54686b0800628500edb97f5acd96d86\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 9 02:01:20.065473 containerd[1478]: time="2025-05-09T02:01:20.065434182Z" level=info msg="Container a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca: CDI devices from CRI Config.CDIDevices: []" May 9 02:01:20.082122 containerd[1478]: time="2025-05-09T02:01:20.082064773Z" level=info msg="CreateContainer within sandbox \"1d032d3afc92079fd72862b41d05a76af54686b0800628500edb97f5acd96d86\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca\"" May 9 02:01:20.082908 containerd[1478]: time="2025-05-09T02:01:20.082706888Z" level=info msg="StartContainer for \"a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca\"" May 9 02:01:20.084246 containerd[1478]: time="2025-05-09T02:01:20.084219999Z" level=info msg="connecting to shim a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca" address="unix:///run/containerd/s/46cb2a346ab88c4786769873902dc35cfd446c8268ff0344adc47f76727f9b3d" protocol=ttrpc version=3 May 9 02:01:20.108793 systemd[1]: Started cri-containerd-a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca.scope - libcontainer container a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca. May 9 02:01:20.165088 containerd[1478]: time="2025-05-09T02:01:20.165047024Z" level=info msg="StartContainer for \"a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca\" returns successfully" May 9 02:01:20.988154 containerd[1478]: time="2025-05-09T02:01:20.988077670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-nj42b,Uid:3e612bfb-d690-47e4-8ffc-cbebd4deca47,Namespace:tigera-operator,Attempt:0,}" May 9 02:01:21.037713 containerd[1478]: time="2025-05-09T02:01:21.036865274Z" level=info msg="connecting to shim 773cf86dc08c123c107ed25ebf9c43125445091cb17df67e28915d67551d2902" address="unix:///run/containerd/s/91e1777269273a56d511bee5249ffeb2617f6a167f94df19de45be5ff37b640a" namespace=k8s.io protocol=ttrpc version=3 May 9 02:01:21.101786 systemd[1]: Started cri-containerd-773cf86dc08c123c107ed25ebf9c43125445091cb17df67e28915d67551d2902.scope - libcontainer container 773cf86dc08c123c107ed25ebf9c43125445091cb17df67e28915d67551d2902. May 9 02:01:21.155159 containerd[1478]: time="2025-05-09T02:01:21.155116361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-nj42b,Uid:3e612bfb-d690-47e4-8ffc-cbebd4deca47,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"773cf86dc08c123c107ed25ebf9c43125445091cb17df67e28915d67551d2902\"" May 9 02:01:21.158522 containerd[1478]: time="2025-05-09T02:01:21.158482320Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 9 02:01:22.966560 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2252484048.mount: Deactivated successfully. 
May 9 02:01:23.775077 containerd[1478]: time="2025-05-09T02:01:23.774983742Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:23.776844 containerd[1478]: time="2025-05-09T02:01:23.776796043Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 9 02:01:23.778824 containerd[1478]: time="2025-05-09T02:01:23.778776390Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:23.782877 containerd[1478]: time="2025-05-09T02:01:23.782831842Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:23.784852 containerd[1478]: time="2025-05-09T02:01:23.784495145Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.625963121s" May 9 02:01:23.784852 containerd[1478]: time="2025-05-09T02:01:23.784556239Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 9 02:01:23.788692 containerd[1478]: time="2025-05-09T02:01:23.788146688Z" level=info msg="CreateContainer within sandbox \"773cf86dc08c123c107ed25ebf9c43125445091cb17df67e28915d67551d2902\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 9 02:01:23.808611 containerd[1478]: time="2025-05-09T02:01:23.808551407Z" level=info msg="Container 12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7: CDI devices from CRI Config.CDIDevices: []" May 9 02:01:23.821045 containerd[1478]: time="2025-05-09T02:01:23.820935010Z" level=info msg="CreateContainer within sandbox \"773cf86dc08c123c107ed25ebf9c43125445091cb17df67e28915d67551d2902\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7\"" May 9 02:01:23.821964 containerd[1478]: time="2025-05-09T02:01:23.821849087Z" level=info msg="StartContainer for \"12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7\"" May 9 02:01:23.822084 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2335610148.mount: Deactivated successfully. May 9 02:01:23.828907 containerd[1478]: time="2025-05-09T02:01:23.828383461Z" level=info msg="connecting to shim 12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7" address="unix:///run/containerd/s/91e1777269273a56d511bee5249ffeb2617f6a167f94df19de45be5ff37b640a" protocol=ttrpc version=3 May 9 02:01:23.857765 systemd[1]: Started cri-containerd-12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7.scope - libcontainer container 12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7. 
May 9 02:01:23.894544 containerd[1478]: time="2025-05-09T02:01:23.894498988Z" level=info msg="StartContainer for \"12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7\" returns successfully" May 9 02:01:24.522246 kubelet[2820]: I0509 02:01:24.521405 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-zzckk" podStartSLOduration=5.521368829 podStartE2EDuration="5.521368829s" podCreationTimestamp="2025-05-09 02:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-09 02:01:20.49677842 +0000 UTC m=+15.213707650" watchObservedRunningTime="2025-05-09 02:01:24.521368829 +0000 UTC m=+19.238298099" May 9 02:01:25.427951 kubelet[2820]: I0509 02:01:25.427815 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-797db67f8-nj42b" podStartSLOduration=3.798860759 podStartE2EDuration="6.427777995s" podCreationTimestamp="2025-05-09 02:01:19 +0000 UTC" firstStartedPulling="2025-05-09 02:01:21.157168023 +0000 UTC m=+15.874097253" lastFinishedPulling="2025-05-09 02:01:23.786085259 +0000 UTC m=+18.503014489" observedRunningTime="2025-05-09 02:01:24.521798175 +0000 UTC m=+19.238727495" watchObservedRunningTime="2025-05-09 02:01:25.427777995 +0000 UTC m=+20.144707265" May 9 02:01:27.563831 kubelet[2820]: I0509 02:01:27.563784 2820 topology_manager.go:215] "Topology Admit Handler" podUID="f6ba9944-c85c-4203-8bf7-d1f1c65c5f18" podNamespace="calico-system" podName="calico-typha-8fc68b5d4-98kdh" May 9 02:01:27.575211 systemd[1]: Created slice kubepods-besteffort-podf6ba9944_c85c_4203_8bf7_d1f1c65c5f18.slice - libcontainer container kubepods-besteffort-podf6ba9944_c85c_4203_8bf7_d1f1c65c5f18.slice. May 9 02:01:27.745180 kubelet[2820]: I0509 02:01:27.745136 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6ba9944-c85c-4203-8bf7-d1f1c65c5f18-tigera-ca-bundle\") pod \"calico-typha-8fc68b5d4-98kdh\" (UID: \"f6ba9944-c85c-4203-8bf7-d1f1c65c5f18\") " pod="calico-system/calico-typha-8fc68b5d4-98kdh" May 9 02:01:27.745180 kubelet[2820]: I0509 02:01:27.745173 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f6ba9944-c85c-4203-8bf7-d1f1c65c5f18-typha-certs\") pod \"calico-typha-8fc68b5d4-98kdh\" (UID: \"f6ba9944-c85c-4203-8bf7-d1f1c65c5f18\") " pod="calico-system/calico-typha-8fc68b5d4-98kdh" May 9 02:01:27.745383 kubelet[2820]: I0509 02:01:27.745197 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hphll\" (UniqueName: \"kubernetes.io/projected/f6ba9944-c85c-4203-8bf7-d1f1c65c5f18-kube-api-access-hphll\") pod \"calico-typha-8fc68b5d4-98kdh\" (UID: \"f6ba9944-c85c-4203-8bf7-d1f1c65c5f18\") " pod="calico-system/calico-typha-8fc68b5d4-98kdh" May 9 02:01:28.051555 kubelet[2820]: I0509 02:01:28.050797 2820 topology_manager.go:215] "Topology Admit Handler" podUID="4d3e64aa-d9b1-49e7-82aa-b334fce41b09" podNamespace="calico-system" podName="calico-node-m9s6k" May 9 02:01:28.062544 systemd[1]: Created slice kubepods-besteffort-pod4d3e64aa_d9b1_49e7_82aa_b334fce41b09.slice - libcontainer container kubepods-besteffort-pod4d3e64aa_d9b1_49e7_82aa_b334fce41b09.slice. 
May 9 02:01:28.148449 kubelet[2820]: I0509 02:01:28.147922 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-policysync\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148449 kubelet[2820]: I0509 02:01:28.147971 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-cni-bin-dir\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148449 kubelet[2820]: I0509 02:01:28.148005 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-cni-log-dir\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148449 kubelet[2820]: I0509 02:01:28.148038 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-flexvol-driver-host\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148449 kubelet[2820]: I0509 02:01:28.148080 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-xtables-lock\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148761 kubelet[2820]: I0509 02:01:28.148117 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-var-lib-calico\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148761 kubelet[2820]: I0509 02:01:28.148146 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-cni-net-dir\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148761 kubelet[2820]: I0509 02:01:28.148176 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-var-run-calico\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148761 kubelet[2820]: I0509 02:01:28.148208 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-lib-modules\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148761 kubelet[2820]: I0509 02:01:28.148236 2820 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-tigera-ca-bundle\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148913 kubelet[2820]: I0509 02:01:28.148264 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rldxl\" (UniqueName: \"kubernetes.io/projected/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-kube-api-access-rldxl\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.148913 kubelet[2820]: I0509 02:01:28.148294 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4d3e64aa-d9b1-49e7-82aa-b334fce41b09-node-certs\") pod \"calico-node-m9s6k\" (UID: \"4d3e64aa-d9b1-49e7-82aa-b334fce41b09\") " pod="calico-system/calico-node-m9s6k" May 9 02:01:28.183676 containerd[1478]: time="2025-05-09T02:01:28.183596329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8fc68b5d4-98kdh,Uid:f6ba9944-c85c-4203-8bf7-d1f1c65c5f18,Namespace:calico-system,Attempt:0,}" May 9 02:01:28.189610 kubelet[2820]: I0509 02:01:28.189557 2820 topology_manager.go:215] "Topology Admit Handler" podUID="7c514110-0ced-4d72-9e74-278f80566401" podNamespace="calico-system" podName="csi-node-driver-5w5lh" May 9 02:01:28.190707 kubelet[2820]: E0509 02:01:28.189856 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:28.246262 containerd[1478]: time="2025-05-09T02:01:28.246189560Z" level=info msg="connecting to shim 2dfa7e9f3dbf457c8c4f1bd9053525e6d1ef1a93e5652fc346122531e24fc3f0" address="unix:///run/containerd/s/9ecd5adbb7a7ed3b68595bafcf408d2f103a15cf813c6fb1b70708fc04ec61f3" namespace=k8s.io protocol=ttrpc version=3 May 9 02:01:28.256148 kubelet[2820]: I0509 02:01:28.255347 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c514110-0ced-4d72-9e74-278f80566401-socket-dir\") pod \"csi-node-driver-5w5lh\" (UID: \"7c514110-0ced-4d72-9e74-278f80566401\") " pod="calico-system/csi-node-driver-5w5lh" May 9 02:01:28.256148 kubelet[2820]: I0509 02:01:28.255459 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c514110-0ced-4d72-9e74-278f80566401-kubelet-dir\") pod \"csi-node-driver-5w5lh\" (UID: \"7c514110-0ced-4d72-9e74-278f80566401\") " pod="calico-system/csi-node-driver-5w5lh" May 9 02:01:28.256148 kubelet[2820]: I0509 02:01:28.255490 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n9g8\" (UniqueName: \"kubernetes.io/projected/7c514110-0ced-4d72-9e74-278f80566401-kube-api-access-7n9g8\") pod \"csi-node-driver-5w5lh\" (UID: \"7c514110-0ced-4d72-9e74-278f80566401\") " pod="calico-system/csi-node-driver-5w5lh" May 9 02:01:28.257137 kubelet[2820]: I0509 02:01:28.256402 2820 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7c514110-0ced-4d72-9e74-278f80566401-varrun\") pod \"csi-node-driver-5w5lh\" (UID: \"7c514110-0ced-4d72-9e74-278f80566401\") " pod="calico-system/csi-node-driver-5w5lh" May 9 02:01:28.257137 kubelet[2820]: I0509 02:01:28.256470 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c514110-0ced-4d72-9e74-278f80566401-registration-dir\") pod \"csi-node-driver-5w5lh\" (UID: \"7c514110-0ced-4d72-9e74-278f80566401\") " pod="calico-system/csi-node-driver-5w5lh" May 9 02:01:28.311140 kubelet[2820]: E0509 02:01:28.311110 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.312452 kubelet[2820]: W0509 02:01:28.311380 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.312452 kubelet[2820]: E0509 02:01:28.311606 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.343890 systemd[1]: Started cri-containerd-2dfa7e9f3dbf457c8c4f1bd9053525e6d1ef1a93e5652fc346122531e24fc3f0.scope - libcontainer container 2dfa7e9f3dbf457c8c4f1bd9053525e6d1ef1a93e5652fc346122531e24fc3f0. May 9 02:01:28.361308 kubelet[2820]: E0509 02:01:28.361281 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.361659 kubelet[2820]: W0509 02:01:28.361603 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.361776 kubelet[2820]: E0509 02:01:28.361762 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.362188 kubelet[2820]: E0509 02:01:28.362176 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.362285 kubelet[2820]: W0509 02:01:28.362272 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.362385 kubelet[2820]: E0509 02:01:28.362373 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:28.362898 kubelet[2820]: E0509 02:01:28.362655 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.362898 kubelet[2820]: W0509 02:01:28.362667 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.362898 kubelet[2820]: E0509 02:01:28.362678 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.363562 kubelet[2820]: E0509 02:01:28.363549 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.363731 kubelet[2820]: W0509 02:01:28.363619 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.363917 kubelet[2820]: E0509 02:01:28.363800 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.364086 kubelet[2820]: E0509 02:01:28.364074 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.364238 kubelet[2820]: W0509 02:01:28.364169 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.364238 kubelet[2820]: E0509 02:01:28.364191 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.364469 kubelet[2820]: E0509 02:01:28.364443 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.364469 kubelet[2820]: W0509 02:01:28.364467 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.364730 kubelet[2820]: E0509 02:01:28.364508 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.364730 kubelet[2820]: E0509 02:01:28.364709 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.364857 kubelet[2820]: W0509 02:01:28.364719 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.364857 kubelet[2820]: E0509 02:01:28.364768 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:28.365061 kubelet[2820]: E0509 02:01:28.364914 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.365061 kubelet[2820]: W0509 02:01:28.364924 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.365061 kubelet[2820]: E0509 02:01:28.364974 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.365528 kubelet[2820]: E0509 02:01:28.365093 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.365528 kubelet[2820]: W0509 02:01:28.365102 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.365528 kubelet[2820]: E0509 02:01:28.365281 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.365528 kubelet[2820]: W0509 02:01:28.365289 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.365528 kubelet[2820]: E0509 02:01:28.365298 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.365528 kubelet[2820]: E0509 02:01:28.365376 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.365528 kubelet[2820]: E0509 02:01:28.365453 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.365528 kubelet[2820]: W0509 02:01:28.365474 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.365528 kubelet[2820]: E0509 02:01:28.365484 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.365780 kubelet[2820]: E0509 02:01:28.365667 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.365780 kubelet[2820]: W0509 02:01:28.365677 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.365780 kubelet[2820]: E0509 02:01:28.365686 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:28.366263 kubelet[2820]: E0509 02:01:28.365908 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.366263 kubelet[2820]: W0509 02:01:28.365922 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.366263 kubelet[2820]: E0509 02:01:28.365940 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.366552 kubelet[2820]: E0509 02:01:28.366516 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.366612 kubelet[2820]: W0509 02:01:28.366532 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.366831 kubelet[2820]: E0509 02:01:28.366753 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.366831 kubelet[2820]: W0509 02:01:28.366766 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.366831 kubelet[2820]: E0509 02:01:28.366790 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.366831 kubelet[2820]: E0509 02:01:28.366812 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.367197 kubelet[2820]: E0509 02:01:28.366978 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.367197 kubelet[2820]: W0509 02:01:28.366987 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.367261 containerd[1478]: time="2025-05-09T02:01:28.367083655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m9s6k,Uid:4d3e64aa-d9b1-49e7-82aa-b334fce41b09,Namespace:calico-system,Attempt:0,}" May 9 02:01:28.368175 kubelet[2820]: E0509 02:01:28.367019 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:28.368175 kubelet[2820]: E0509 02:01:28.367416 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.368175 kubelet[2820]: W0509 02:01:28.368005 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.375567 kubelet[2820]: E0509 02:01:28.370118 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.376383 kubelet[2820]: E0509 02:01:28.375762 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.376383 kubelet[2820]: W0509 02:01:28.375780 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.376383 kubelet[2820]: E0509 02:01:28.376234 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.377333 kubelet[2820]: E0509 02:01:28.377097 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.377333 kubelet[2820]: W0509 02:01:28.377110 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.378671 kubelet[2820]: E0509 02:01:28.377414 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.378671 kubelet[2820]: E0509 02:01:28.378029 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.378671 kubelet[2820]: W0509 02:01:28.378040 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.378671 kubelet[2820]: E0509 02:01:28.378109 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.378671 kubelet[2820]: E0509 02:01:28.378357 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.378671 kubelet[2820]: W0509 02:01:28.378366 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.378671 kubelet[2820]: E0509 02:01:28.378476 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:28.378960 kubelet[2820]: E0509 02:01:28.378759 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.378960 kubelet[2820]: W0509 02:01:28.378770 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.379785 kubelet[2820]: E0509 02:01:28.379703 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.379920 kubelet[2820]: E0509 02:01:28.379909 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.380134 kubelet[2820]: W0509 02:01:28.379997 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.380134 kubelet[2820]: E0509 02:01:28.380085 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.381291 kubelet[2820]: E0509 02:01:28.380842 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.381291 kubelet[2820]: W0509 02:01:28.380854 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.381584 kubelet[2820]: E0509 02:01:28.381545 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.381918 kubelet[2820]: E0509 02:01:28.381860 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.381918 kubelet[2820]: W0509 02:01:28.381872 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.381918 kubelet[2820]: E0509 02:01:28.381894 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:28.415827 kubelet[2820]: E0509 02:01:28.415551 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:28.415827 kubelet[2820]: W0509 02:01:28.415572 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:28.415827 kubelet[2820]: E0509 02:01:28.415616 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:28.443746 containerd[1478]: time="2025-05-09T02:01:28.443577121Z" level=info msg="connecting to shim 835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9" address="unix:///run/containerd/s/920d6f236e95077d59b16ca7f5fe20d131db92acf51d11ae4237c265a6cd0933" namespace=k8s.io protocol=ttrpc version=3 May 9 02:01:28.532837 systemd[1]: Started cri-containerd-835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9.scope - libcontainer container 835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9. May 9 02:01:28.549719 containerd[1478]: time="2025-05-09T02:01:28.549550147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8fc68b5d4-98kdh,Uid:f6ba9944-c85c-4203-8bf7-d1f1c65c5f18,Namespace:calico-system,Attempt:0,} returns sandbox id \"2dfa7e9f3dbf457c8c4f1bd9053525e6d1ef1a93e5652fc346122531e24fc3f0\"" May 9 02:01:28.553013 containerd[1478]: time="2025-05-09T02:01:28.552695238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 9 02:01:28.597555 containerd[1478]: time="2025-05-09T02:01:28.597388869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m9s6k,Uid:4d3e64aa-d9b1-49e7-82aa-b334fce41b09,Namespace:calico-system,Attempt:0,} returns sandbox id \"835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9\"" May 9 02:01:30.400846 kubelet[2820]: E0509 02:01:30.398519 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:32.276501 containerd[1478]: time="2025-05-09T02:01:32.276060902Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:32.277536 containerd[1478]: time="2025-05-09T02:01:32.276986348Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 9 02:01:32.281641 containerd[1478]: time="2025-05-09T02:01:32.281375013Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:32.291650 containerd[1478]: time="2025-05-09T02:01:32.291499750Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:32.293801 containerd[1478]: time="2025-05-09T02:01:32.293731618Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.740979823s" May 9 02:01:32.294049 containerd[1478]: time="2025-05-09T02:01:32.293919209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 9 02:01:32.298074 containerd[1478]: time="2025-05-09T02:01:32.297064061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 9 
02:01:32.326526 containerd[1478]: time="2025-05-09T02:01:32.326475245Z" level=info msg="CreateContainer within sandbox \"2dfa7e9f3dbf457c8c4f1bd9053525e6d1ef1a93e5652fc346122531e24fc3f0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 9 02:01:32.344534 containerd[1478]: time="2025-05-09T02:01:32.343192512Z" level=info msg="Container 97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8: CDI devices from CRI Config.CDIDevices: []" May 9 02:01:32.358994 containerd[1478]: time="2025-05-09T02:01:32.358959717Z" level=info msg="CreateContainer within sandbox \"2dfa7e9f3dbf457c8c4f1bd9053525e6d1ef1a93e5652fc346122531e24fc3f0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8\"" May 9 02:01:32.360634 containerd[1478]: time="2025-05-09T02:01:32.359727027Z" level=info msg="StartContainer for \"97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8\"" May 9 02:01:32.360795 containerd[1478]: time="2025-05-09T02:01:32.360751429Z" level=info msg="connecting to shim 97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8" address="unix:///run/containerd/s/9ecd5adbb7a7ed3b68595bafcf408d2f103a15cf813c6fb1b70708fc04ec61f3" protocol=ttrpc version=3 May 9 02:01:32.387294 systemd[1]: Started cri-containerd-97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8.scope - libcontainer container 97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8. May 9 02:01:32.397400 kubelet[2820]: E0509 02:01:32.397351 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:32.456779 containerd[1478]: time="2025-05-09T02:01:32.456658802Z" level=info msg="StartContainer for \"97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8\" returns successfully" May 9 02:01:32.591807 kubelet[2820]: E0509 02:01:32.591779 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.592078 kubelet[2820]: W0509 02:01:32.591936 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.592078 kubelet[2820]: E0509 02:01:32.591961 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.592478 kubelet[2820]: E0509 02:01:32.592358 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.592478 kubelet[2820]: W0509 02:01:32.592391 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.592478 kubelet[2820]: E0509 02:01:32.592406 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:32.592961 kubelet[2820]: E0509 02:01:32.592741 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.592961 kubelet[2820]: W0509 02:01:32.592764 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.592961 kubelet[2820]: E0509 02:01:32.592791 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.593223 kubelet[2820]: E0509 02:01:32.593211 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.593451 kubelet[2820]: W0509 02:01:32.593374 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.593451 kubelet[2820]: E0509 02:01:32.593395 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.593848 kubelet[2820]: E0509 02:01:32.593748 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.593848 kubelet[2820]: W0509 02:01:32.593760 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.593848 kubelet[2820]: E0509 02:01:32.593770 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.594110 kubelet[2820]: E0509 02:01:32.593979 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.594110 kubelet[2820]: W0509 02:01:32.593989 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.594110 kubelet[2820]: E0509 02:01:32.593999 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.594548 kubelet[2820]: E0509 02:01:32.594408 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.594548 kubelet[2820]: W0509 02:01:32.594420 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.594548 kubelet[2820]: E0509 02:01:32.594430 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:32.594942 kubelet[2820]: E0509 02:01:32.594822 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.594942 kubelet[2820]: W0509 02:01:32.594834 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.594942 kubelet[2820]: E0509 02:01:32.594855 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.595198 kubelet[2820]: E0509 02:01:32.595142 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.595198 kubelet[2820]: W0509 02:01:32.595155 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.595198 kubelet[2820]: E0509 02:01:32.595165 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.595652 kubelet[2820]: E0509 02:01:32.595488 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.595652 kubelet[2820]: W0509 02:01:32.595507 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.595652 kubelet[2820]: E0509 02:01:32.595517 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.596097 kubelet[2820]: E0509 02:01:32.595891 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.596097 kubelet[2820]: W0509 02:01:32.595932 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.596097 kubelet[2820]: E0509 02:01:32.595943 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.596899 kubelet[2820]: E0509 02:01:32.596759 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.596899 kubelet[2820]: W0509 02:01:32.596773 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.596899 kubelet[2820]: E0509 02:01:32.596786 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:32.597424 kubelet[2820]: E0509 02:01:32.597262 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.597424 kubelet[2820]: W0509 02:01:32.597275 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.597424 kubelet[2820]: E0509 02:01:32.597306 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.597772 kubelet[2820]: E0509 02:01:32.597542 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.597772 kubelet[2820]: W0509 02:01:32.597552 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.597772 kubelet[2820]: E0509 02:01:32.597563 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.598220 kubelet[2820]: E0509 02:01:32.598033 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.598220 kubelet[2820]: W0509 02:01:32.598045 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.598220 kubelet[2820]: E0509 02:01:32.598068 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.611247 kubelet[2820]: E0509 02:01:32.611192 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.611247 kubelet[2820]: W0509 02:01:32.611235 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.611532 kubelet[2820]: E0509 02:01:32.611256 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.611532 kubelet[2820]: E0509 02:01:32.611510 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.611532 kubelet[2820]: W0509 02:01:32.611520 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.611532 kubelet[2820]: E0509 02:01:32.611538 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:32.612101 kubelet[2820]: E0509 02:01:32.611959 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.612101 kubelet[2820]: W0509 02:01:32.611991 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.612101 kubelet[2820]: E0509 02:01:32.612014 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.612330 kubelet[2820]: E0509 02:01:32.612281 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.612330 kubelet[2820]: W0509 02:01:32.612293 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.612633 kubelet[2820]: E0509 02:01:32.612445 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.612705 kubelet[2820]: E0509 02:01:32.612636 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.612705 kubelet[2820]: W0509 02:01:32.612647 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.612705 kubelet[2820]: E0509 02:01:32.612665 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.612891 kubelet[2820]: E0509 02:01:32.612860 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.612891 kubelet[2820]: W0509 02:01:32.612869 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.613022 kubelet[2820]: E0509 02:01:32.612937 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.613185 kubelet[2820]: E0509 02:01:32.613169 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.613185 kubelet[2820]: W0509 02:01:32.613184 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.613377 kubelet[2820]: E0509 02:01:32.613252 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:32.613377 kubelet[2820]: E0509 02:01:32.613339 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.613377 kubelet[2820]: W0509 02:01:32.613348 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.613377 kubelet[2820]: E0509 02:01:32.613363 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.613683 kubelet[2820]: E0509 02:01:32.613542 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.613683 kubelet[2820]: W0509 02:01:32.613566 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.613683 kubelet[2820]: E0509 02:01:32.613580 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.614186 kubelet[2820]: E0509 02:01:32.614067 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.614186 kubelet[2820]: W0509 02:01:32.614081 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.614186 kubelet[2820]: E0509 02:01:32.614120 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.614560 kubelet[2820]: E0509 02:01:32.614450 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.614560 kubelet[2820]: W0509 02:01:32.614462 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.614560 kubelet[2820]: E0509 02:01:32.614491 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.614848 kubelet[2820]: E0509 02:01:32.614782 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.614848 kubelet[2820]: W0509 02:01:32.614794 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.614848 kubelet[2820]: E0509 02:01:32.614821 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:32.615312 kubelet[2820]: E0509 02:01:32.615175 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.615312 kubelet[2820]: W0509 02:01:32.615186 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.615312 kubelet[2820]: E0509 02:01:32.615204 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.615647 kubelet[2820]: E0509 02:01:32.615553 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.615647 kubelet[2820]: W0509 02:01:32.615566 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.615647 kubelet[2820]: E0509 02:01:32.615583 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.615871 kubelet[2820]: E0509 02:01:32.615805 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.615871 kubelet[2820]: W0509 02:01:32.615822 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.615871 kubelet[2820]: E0509 02:01:32.615832 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.616130 kubelet[2820]: E0509 02:01:32.616111 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.616130 kubelet[2820]: W0509 02:01:32.616127 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.616334 kubelet[2820]: E0509 02:01:32.616138 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:32.616495 kubelet[2820]: E0509 02:01:32.616393 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.616495 kubelet[2820]: W0509 02:01:32.616408 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.616495 kubelet[2820]: E0509 02:01:32.616417 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:32.616713 kubelet[2820]: E0509 02:01:32.616700 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:32.616811 kubelet[2820]: W0509 02:01:32.616774 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:32.616811 kubelet[2820]: E0509 02:01:32.616789 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.577657 kubelet[2820]: I0509 02:01:33.576857 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8fc68b5d4-98kdh" podStartSLOduration=2.831465526 podStartE2EDuration="6.576823154s" podCreationTimestamp="2025-05-09 02:01:27 +0000 UTC" firstStartedPulling="2025-05-09 02:01:28.551294831 +0000 UTC m=+23.268224061" lastFinishedPulling="2025-05-09 02:01:32.296652439 +0000 UTC m=+27.013581689" observedRunningTime="2025-05-09 02:01:32.548329205 +0000 UTC m=+27.265258435" watchObservedRunningTime="2025-05-09 02:01:33.576823154 +0000 UTC m=+28.293752424" May 9 02:01:33.609905 kubelet[2820]: E0509 02:01:33.609811 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.610825 kubelet[2820]: W0509 02:01:33.609855 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.610825 kubelet[2820]: E0509 02:01:33.610151 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.612115 kubelet[2820]: E0509 02:01:33.611400 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.612115 kubelet[2820]: W0509 02:01:33.611427 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.612115 kubelet[2820]: E0509 02:01:33.611453 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.613475 kubelet[2820]: E0509 02:01:33.613091 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.613475 kubelet[2820]: W0509 02:01:33.613154 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.613475 kubelet[2820]: E0509 02:01:33.613182 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:33.617076 kubelet[2820]: E0509 02:01:33.616148 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.617076 kubelet[2820]: W0509 02:01:33.616763 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.617076 kubelet[2820]: E0509 02:01:33.616822 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.620322 kubelet[2820]: E0509 02:01:33.618271 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.620322 kubelet[2820]: W0509 02:01:33.618300 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.620322 kubelet[2820]: E0509 02:01:33.618328 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.620322 kubelet[2820]: E0509 02:01:33.619670 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.620322 kubelet[2820]: W0509 02:01:33.619695 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.620322 kubelet[2820]: E0509 02:01:33.619722 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.623350 kubelet[2820]: E0509 02:01:33.622911 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.623350 kubelet[2820]: W0509 02:01:33.622947 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.623350 kubelet[2820]: E0509 02:01:33.622981 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.624103 kubelet[2820]: E0509 02:01:33.623764 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.624103 kubelet[2820]: W0509 02:01:33.623787 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.624103 kubelet[2820]: E0509 02:01:33.623810 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:33.627020 kubelet[2820]: E0509 02:01:33.626822 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.627020 kubelet[2820]: W0509 02:01:33.626854 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.627020 kubelet[2820]: E0509 02:01:33.626925 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.628558 kubelet[2820]: E0509 02:01:33.628184 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.628558 kubelet[2820]: W0509 02:01:33.628211 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.628558 kubelet[2820]: E0509 02:01:33.628238 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.630868 kubelet[2820]: E0509 02:01:33.628978 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.630868 kubelet[2820]: W0509 02:01:33.629000 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.630868 kubelet[2820]: E0509 02:01:33.629024 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.630868 kubelet[2820]: E0509 02:01:33.630004 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.630868 kubelet[2820]: W0509 02:01:33.630056 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.630868 kubelet[2820]: E0509 02:01:33.630080 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.630868 kubelet[2820]: E0509 02:01:33.630468 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.630868 kubelet[2820]: W0509 02:01:33.630488 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.630868 kubelet[2820]: E0509 02:01:33.630509 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:33.632215 kubelet[2820]: E0509 02:01:33.631980 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.632215 kubelet[2820]: W0509 02:01:33.632006 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.632215 kubelet[2820]: E0509 02:01:33.632029 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.633219 kubelet[2820]: E0509 02:01:33.632381 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.633219 kubelet[2820]: W0509 02:01:33.632401 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.633219 kubelet[2820]: E0509 02:01:33.632422 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.635172 kubelet[2820]: E0509 02:01:33.634926 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.635172 kubelet[2820]: W0509 02:01:33.634956 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.635172 kubelet[2820]: E0509 02:01:33.634981 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.635672 kubelet[2820]: E0509 02:01:33.635472 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.635672 kubelet[2820]: W0509 02:01:33.635491 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.635672 kubelet[2820]: E0509 02:01:33.635522 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.636849 kubelet[2820]: E0509 02:01:33.636603 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.636849 kubelet[2820]: W0509 02:01:33.636683 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.636849 kubelet[2820]: E0509 02:01:33.636714 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:33.637282 kubelet[2820]: E0509 02:01:33.637251 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.637282 kubelet[2820]: W0509 02:01:33.637272 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.637349 kubelet[2820]: E0509 02:01:33.637300 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.638076 kubelet[2820]: E0509 02:01:33.638057 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.638076 kubelet[2820]: W0509 02:01:33.638073 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.638225 kubelet[2820]: E0509 02:01:33.638136 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.638634 kubelet[2820]: E0509 02:01:33.638597 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.638634 kubelet[2820]: W0509 02:01:33.638612 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.639144 kubelet[2820]: E0509 02:01:33.638841 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.640850 kubelet[2820]: E0509 02:01:33.640731 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.640850 kubelet[2820]: W0509 02:01:33.640745 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.640850 kubelet[2820]: E0509 02:01:33.640811 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.641275 kubelet[2820]: E0509 02:01:33.641030 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.641275 kubelet[2820]: W0509 02:01:33.641045 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.641275 kubelet[2820]: E0509 02:01:33.641122 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:33.641543 kubelet[2820]: E0509 02:01:33.641320 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.641543 kubelet[2820]: W0509 02:01:33.641330 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.641857 kubelet[2820]: E0509 02:01:33.641690 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.641976 kubelet[2820]: E0509 02:01:33.641885 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.641976 kubelet[2820]: W0509 02:01:33.641895 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.641976 kubelet[2820]: E0509 02:01:33.641917 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.642479 kubelet[2820]: E0509 02:01:33.642148 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.642479 kubelet[2820]: W0509 02:01:33.642159 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.642479 kubelet[2820]: E0509 02:01:33.642175 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.642479 kubelet[2820]: E0509 02:01:33.642419 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.642479 kubelet[2820]: W0509 02:01:33.642428 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.642479 kubelet[2820]: E0509 02:01:33.642438 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.643310 kubelet[2820]: E0509 02:01:33.643074 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.643310 kubelet[2820]: W0509 02:01:33.643097 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.644255 kubelet[2820]: E0509 02:01:33.643177 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:33.644438 kubelet[2820]: E0509 02:01:33.644422 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.644611 kubelet[2820]: W0509 02:01:33.644595 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.645280 kubelet[2820]: E0509 02:01:33.644755 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.645511 kubelet[2820]: E0509 02:01:33.645483 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.645511 kubelet[2820]: W0509 02:01:33.645501 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.645511 kubelet[2820]: E0509 02:01:33.645522 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.646203 kubelet[2820]: E0509 02:01:33.646181 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.646203 kubelet[2820]: W0509 02:01:33.646194 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.646203 kubelet[2820]: E0509 02:01:33.646211 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.646866 kubelet[2820]: E0509 02:01:33.646844 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.646866 kubelet[2820]: W0509 02:01:33.646857 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.646998 kubelet[2820]: E0509 02:01:33.646878 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:33.647912 kubelet[2820]: E0509 02:01:33.647616 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:33.647912 kubelet[2820]: W0509 02:01:33.647893 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:33.647912 kubelet[2820]: E0509 02:01:33.647906 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:34.397668 kubelet[2820]: E0509 02:01:34.397592 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:34.543133 kubelet[2820]: E0509 02:01:34.543077 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.543713 kubelet[2820]: W0509 02:01:34.543443 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.543713 kubelet[2820]: E0509 02:01:34.543495 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.544265 kubelet[2820]: E0509 02:01:34.543925 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.544265 kubelet[2820]: W0509 02:01:34.543948 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.544265 kubelet[2820]: E0509 02:01:34.543973 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.544730 kubelet[2820]: E0509 02:01:34.544701 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.544899 kubelet[2820]: W0509 02:01:34.544869 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.545048 kubelet[2820]: E0509 02:01:34.545021 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.545706 kubelet[2820]: E0509 02:01:34.545588 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.546083 kubelet[2820]: W0509 02:01:34.545841 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.546083 kubelet[2820]: E0509 02:01:34.545885 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:34.546401 kubelet[2820]: E0509 02:01:34.546373 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.547731 kubelet[2820]: W0509 02:01:34.547456 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.547731 kubelet[2820]: E0509 02:01:34.547503 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.548289 kubelet[2820]: E0509 02:01:34.548204 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.548734 kubelet[2820]: W0509 02:01:34.548469 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.548734 kubelet[2820]: E0509 02:01:34.548514 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.550833 kubelet[2820]: E0509 02:01:34.550567 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.550833 kubelet[2820]: W0509 02:01:34.550598 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.550833 kubelet[2820]: E0509 02:01:34.550665 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.551535 kubelet[2820]: E0509 02:01:34.551309 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.551535 kubelet[2820]: W0509 02:01:34.551338 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.551535 kubelet[2820]: E0509 02:01:34.551361 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.552694 kubelet[2820]: E0509 02:01:34.552052 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.552694 kubelet[2820]: W0509 02:01:34.552079 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.552694 kubelet[2820]: E0509 02:01:34.552102 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:34.554818 kubelet[2820]: E0509 02:01:34.553157 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.554818 kubelet[2820]: W0509 02:01:34.553185 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.554818 kubelet[2820]: E0509 02:01:34.553209 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.555384 kubelet[2820]: E0509 02:01:34.555164 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.555384 kubelet[2820]: W0509 02:01:34.555197 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.555384 kubelet[2820]: E0509 02:01:34.555222 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.556138 kubelet[2820]: E0509 02:01:34.555877 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.556138 kubelet[2820]: W0509 02:01:34.555906 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.556138 kubelet[2820]: E0509 02:01:34.555930 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.558672 kubelet[2820]: E0509 02:01:34.556913 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.558672 kubelet[2820]: W0509 02:01:34.556941 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.558672 kubelet[2820]: E0509 02:01:34.556964 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.559484 kubelet[2820]: E0509 02:01:34.559139 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.559484 kubelet[2820]: W0509 02:01:34.559168 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.559484 kubelet[2820]: E0509 02:01:34.559194 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:34.560051 kubelet[2820]: E0509 02:01:34.559871 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.560051 kubelet[2820]: W0509 02:01:34.559900 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.560051 kubelet[2820]: E0509 02:01:34.559927 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.646654 kubelet[2820]: E0509 02:01:34.645972 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.646654 kubelet[2820]: W0509 02:01:34.646002 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.646654 kubelet[2820]: E0509 02:01:34.646026 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.646654 kubelet[2820]: E0509 02:01:34.646370 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.646654 kubelet[2820]: W0509 02:01:34.646380 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.646654 kubelet[2820]: E0509 02:01:34.646408 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.647322 kubelet[2820]: E0509 02:01:34.646680 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.647322 kubelet[2820]: W0509 02:01:34.646689 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.647322 kubelet[2820]: E0509 02:01:34.646703 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.647322 kubelet[2820]: E0509 02:01:34.646900 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.647322 kubelet[2820]: W0509 02:01:34.646910 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.647322 kubelet[2820]: E0509 02:01:34.646933 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:34.647322 kubelet[2820]: E0509 02:01:34.647108 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.647322 kubelet[2820]: W0509 02:01:34.647117 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.647322 kubelet[2820]: E0509 02:01:34.647142 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.647322 kubelet[2820]: E0509 02:01:34.647307 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.647582 kubelet[2820]: W0509 02:01:34.647316 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.647582 kubelet[2820]: E0509 02:01:34.647330 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.647582 kubelet[2820]: E0509 02:01:34.647543 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.647582 kubelet[2820]: W0509 02:01:34.647553 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.647753 kubelet[2820]: E0509 02:01:34.647645 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.652011 kubelet[2820]: E0509 02:01:34.649776 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.652011 kubelet[2820]: W0509 02:01:34.649801 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.652011 kubelet[2820]: E0509 02:01:34.649894 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.652520 kubelet[2820]: E0509 02:01:34.652071 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.652520 kubelet[2820]: W0509 02:01:34.652087 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.652520 kubelet[2820]: E0509 02:01:34.652244 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:34.652520 kubelet[2820]: E0509 02:01:34.652507 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.652520 kubelet[2820]: W0509 02:01:34.652517 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.652750 kubelet[2820]: E0509 02:01:34.652546 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.652979 kubelet[2820]: E0509 02:01:34.652961 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.652979 kubelet[2820]: W0509 02:01:34.652978 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.653097 kubelet[2820]: E0509 02:01:34.653003 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.653359 kubelet[2820]: E0509 02:01:34.653198 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.653359 kubelet[2820]: W0509 02:01:34.653210 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.654690 kubelet[2820]: E0509 02:01:34.654663 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.654873 kubelet[2820]: E0509 02:01:34.654857 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.654924 kubelet[2820]: W0509 02:01:34.654907 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.655067 kubelet[2820]: E0509 02:01:34.654932 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.655594 kubelet[2820]: E0509 02:01:34.655132 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.655594 kubelet[2820]: W0509 02:01:34.655142 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.655594 kubelet[2820]: E0509 02:01:34.655157 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:34.655594 kubelet[2820]: E0509 02:01:34.655318 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.655594 kubelet[2820]: W0509 02:01:34.655327 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.655594 kubelet[2820]: E0509 02:01:34.655343 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.655594 kubelet[2820]: E0509 02:01:34.655556 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.655594 kubelet[2820]: W0509 02:01:34.655565 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.656719 kubelet[2820]: E0509 02:01:34.655597 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.656719 kubelet[2820]: E0509 02:01:34.655954 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.656719 kubelet[2820]: W0509 02:01:34.655964 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.656719 kubelet[2820]: E0509 02:01:34.656000 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:34.656719 kubelet[2820]: E0509 02:01:34.656283 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:34.656719 kubelet[2820]: W0509 02:01:34.656292 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:34.656719 kubelet[2820]: E0509 02:01:34.656302 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:35.113753 containerd[1478]: time="2025-05-09T02:01:35.113275838Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:35.133350 containerd[1478]: time="2025-05-09T02:01:35.133214747Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 9 02:01:35.171466 containerd[1478]: time="2025-05-09T02:01:35.171325850Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:35.286563 containerd[1478]: time="2025-05-09T02:01:35.286120265Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:35.288905 containerd[1478]: time="2025-05-09T02:01:35.287617985Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.99049799s" May 9 02:01:35.288905 containerd[1478]: time="2025-05-09T02:01:35.287735255Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 9 02:01:35.295517 containerd[1478]: time="2025-05-09T02:01:35.295011607Z" level=info msg="CreateContainer within sandbox \"835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 9 02:01:35.427311 containerd[1478]: time="2025-05-09T02:01:35.425916508Z" level=info msg="Container be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659: CDI devices from CRI Config.CDIDevices: []" May 9 02:01:35.449997 containerd[1478]: time="2025-05-09T02:01:35.449809465Z" level=info msg="CreateContainer within sandbox \"835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659\"" May 9 02:01:35.451553 containerd[1478]: time="2025-05-09T02:01:35.451057838Z" level=info msg="StartContainer for \"be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659\"" May 9 02:01:35.454583 containerd[1478]: time="2025-05-09T02:01:35.454429393Z" level=info msg="connecting to shim be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659" address="unix:///run/containerd/s/920d6f236e95077d59b16ca7f5fe20d131db92acf51d11ae4237c265a6cd0933" protocol=ttrpc version=3 May 9 02:01:35.498780 systemd[1]: Started cri-containerd-be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659.scope - libcontainer container be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659. 
May 9 02:01:35.561160 containerd[1478]: time="2025-05-09T02:01:35.561096981Z" level=info msg="StartContainer for \"be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659\" returns successfully" May 9 02:01:35.572271 kubelet[2820]: E0509 02:01:35.572240 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.572271 kubelet[2820]: W0509 02:01:35.572260 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.572419 kubelet[2820]: E0509 02:01:35.572281 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.572685 kubelet[2820]: E0509 02:01:35.572470 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.572685 kubelet[2820]: W0509 02:01:35.572483 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.572685 kubelet[2820]: E0509 02:01:35.572492 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.572685 kubelet[2820]: E0509 02:01:35.572617 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.572685 kubelet[2820]: W0509 02:01:35.572651 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.572685 kubelet[2820]: E0509 02:01:35.572660 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.572856 kubelet[2820]: E0509 02:01:35.572788 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.572856 kubelet[2820]: W0509 02:01:35.572797 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.572856 kubelet[2820]: E0509 02:01:35.572805 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.573653 kubelet[2820]: E0509 02:01:35.572931 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.573653 kubelet[2820]: W0509 02:01:35.572946 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.573653 kubelet[2820]: E0509 02:01:35.572954 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:35.573653 kubelet[2820]: E0509 02:01:35.573073 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.573653 kubelet[2820]: W0509 02:01:35.573081 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.573653 kubelet[2820]: E0509 02:01:35.573089 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.574758 kubelet[2820]: E0509 02:01:35.574738 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.574758 kubelet[2820]: W0509 02:01:35.574752 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.575193 kubelet[2820]: E0509 02:01:35.574762 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.575193 kubelet[2820]: E0509 02:01:35.574894 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.575193 kubelet[2820]: W0509 02:01:35.574903 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.575193 kubelet[2820]: E0509 02:01:35.574911 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.575193 kubelet[2820]: E0509 02:01:35.575125 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.575193 kubelet[2820]: W0509 02:01:35.575133 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.575193 kubelet[2820]: E0509 02:01:35.575175 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.575583 kubelet[2820]: E0509 02:01:35.575326 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.575583 kubelet[2820]: W0509 02:01:35.575335 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.575583 kubelet[2820]: E0509 02:01:35.575345 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:35.575583 kubelet[2820]: E0509 02:01:35.575475 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.575583 kubelet[2820]: W0509 02:01:35.575483 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.575583 kubelet[2820]: E0509 02:01:35.575491 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.576426 kubelet[2820]: E0509 02:01:35.575727 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.576426 kubelet[2820]: W0509 02:01:35.575736 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.576426 kubelet[2820]: E0509 02:01:35.575744 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.576861 kubelet[2820]: E0509 02:01:35.576724 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.576861 kubelet[2820]: W0509 02:01:35.576733 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.576861 kubelet[2820]: E0509 02:01:35.576741 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.577165 kubelet[2820]: E0509 02:01:35.576865 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.577165 kubelet[2820]: W0509 02:01:35.576873 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.577165 kubelet[2820]: E0509 02:01:35.576882 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 9 02:01:35.577165 kubelet[2820]: E0509 02:01:35.576998 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 9 02:01:35.577165 kubelet[2820]: W0509 02:01:35.577007 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 9 02:01:35.577165 kubelet[2820]: E0509 02:01:35.577015 2820 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 9 02:01:35.584405 systemd[1]: cri-containerd-be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659.scope: Deactivated successfully. May 9 02:01:35.587321 containerd[1478]: time="2025-05-09T02:01:35.587281298Z" level=info msg="received exit event container_id:\"be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659\" id:\"be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659\" pid:3473 exited_at:{seconds:1746756095 nanos:586434939}" May 9 02:01:35.587686 containerd[1478]: time="2025-05-09T02:01:35.587604263Z" level=info msg="TaskExit event in podsandbox handler container_id:\"be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659\" id:\"be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659\" pid:3473 exited_at:{seconds:1746756095 nanos:586434939}" May 9 02:01:35.634547 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659-rootfs.mount: Deactivated successfully. May 9 02:01:36.397380 kubelet[2820]: E0509 02:01:36.397159 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:37.562452 containerd[1478]: time="2025-05-09T02:01:37.561748519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 9 02:01:38.397920 kubelet[2820]: E0509 02:01:38.397846 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:40.398647 kubelet[2820]: E0509 02:01:40.397213 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:42.397344 kubelet[2820]: E0509 02:01:42.397240 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:44.397199 kubelet[2820]: E0509 02:01:44.397144 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:44.711727 containerd[1478]: time="2025-05-09T02:01:44.710722465Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:44.712324 containerd[1478]: time="2025-05-09T02:01:44.712283713Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 9 02:01:44.713681 containerd[1478]: 
time="2025-05-09T02:01:44.713651619Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:44.716270 containerd[1478]: time="2025-05-09T02:01:44.716245318Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:01:44.717058 containerd[1478]: time="2025-05-09T02:01:44.717030907Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 7.155216445s" May 9 02:01:44.717108 containerd[1478]: time="2025-05-09T02:01:44.717065862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 9 02:01:44.721158 containerd[1478]: time="2025-05-09T02:01:44.721102568Z" level=info msg="CreateContainer within sandbox \"835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 9 02:01:44.735903 containerd[1478]: time="2025-05-09T02:01:44.735867008Z" level=info msg="Container 9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8: CDI devices from CRI Config.CDIDevices: []" May 9 02:01:44.740482 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3236069343.mount: Deactivated successfully. May 9 02:01:44.756291 containerd[1478]: time="2025-05-09T02:01:44.756247415Z" level=info msg="CreateContainer within sandbox \"835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8\"" May 9 02:01:44.758218 containerd[1478]: time="2025-05-09T02:01:44.758180178Z" level=info msg="StartContainer for \"9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8\"" May 9 02:01:44.760138 containerd[1478]: time="2025-05-09T02:01:44.760087223Z" level=info msg="connecting to shim 9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8" address="unix:///run/containerd/s/920d6f236e95077d59b16ca7f5fe20d131db92acf51d11ae4237c265a6cd0933" protocol=ttrpc version=3 May 9 02:01:44.795563 systemd[1]: Started cri-containerd-9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8.scope - libcontainer container 9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8. May 9 02:01:44.857497 containerd[1478]: time="2025-05-09T02:01:44.857456928Z" level=info msg="StartContainer for \"9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8\" returns successfully" May 9 02:01:46.398582 kubelet[2820]: E0509 02:01:46.397561 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:47.598375 systemd[1]: cri-containerd-9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8.scope: Deactivated successfully. 
May 9 02:01:47.600928 systemd[1]: cri-containerd-9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8.scope: Consumed 1.218s CPU time, 173.5M memory peak, 154M written to disk. May 9 02:01:47.605700 containerd[1478]: time="2025-05-09T02:01:47.604600711Z" level=info msg="received exit event container_id:\"9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8\" id:\"9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8\" pid:3545 exited_at:{seconds:1746756107 nanos:602158895}" May 9 02:01:47.605700 containerd[1478]: time="2025-05-09T02:01:47.605556808Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8\" id:\"9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8\" pid:3545 exited_at:{seconds:1746756107 nanos:602158895}" May 9 02:01:47.639220 kubelet[2820]: I0509 02:01:47.638703 2820 kubelet_node_status.go:497] "Fast updating node status as it just became ready" May 9 02:01:47.657753 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8-rootfs.mount: Deactivated successfully. May 9 02:01:48.020744 kubelet[2820]: I0509 02:01:48.020442 2820 topology_manager.go:215] "Topology Admit Handler" podUID="35b5d640-2550-4b8b-9939-50b387738597" podNamespace="kube-system" podName="coredns-7db6d8ff4d-blfcp" May 9 02:01:48.041516 systemd[1]: Created slice kubepods-burstable-pod35b5d640_2550_4b8b_9939_50b387738597.slice - libcontainer container kubepods-burstable-pod35b5d640_2550_4b8b_9939_50b387738597.slice. May 9 02:01:48.071837 kubelet[2820]: I0509 02:01:48.071738 2820 topology_manager.go:215] "Topology Admit Handler" podUID="130a24ff-32ab-4698-82a3-c00a27cc01a1" podNamespace="kube-system" podName="coredns-7db6d8ff4d-6m9ql" May 9 02:01:48.080814 kubelet[2820]: I0509 02:01:48.078279 2820 topology_manager.go:215] "Topology Admit Handler" podUID="ed3c5ab6-d900-4bbb-847c-cc6ac6245a17" podNamespace="calico-apiserver" podName="calico-apiserver-76d8dcc867-r899n" May 9 02:01:48.080814 kubelet[2820]: I0509 02:01:48.080083 2820 topology_manager.go:215] "Topology Admit Handler" podUID="33649465-2591-42c1-b37f-5c5ee7d9ef5e" podNamespace="calico-apiserver" podName="calico-apiserver-76d8dcc867-sctwt" May 9 02:01:48.085754 kubelet[2820]: I0509 02:01:48.085685 2820 topology_manager.go:215] "Topology Admit Handler" podUID="c5892730-f8e6-4ce6-a683-d0ddf4f4389e" podNamespace="calico-system" podName="calico-kube-controllers-86bd677f74-trx7j" May 9 02:01:48.105554 systemd[1]: Created slice kubepods-burstable-pod130a24ff_32ab_4698_82a3_c00a27cc01a1.slice - libcontainer container kubepods-burstable-pod130a24ff_32ab_4698_82a3_c00a27cc01a1.slice. May 9 02:01:48.128470 systemd[1]: Created slice kubepods-besteffort-poded3c5ab6_d900_4bbb_847c_cc6ac6245a17.slice - libcontainer container kubepods-besteffort-poded3c5ab6_d900_4bbb_847c_cc6ac6245a17.slice. May 9 02:01:48.136003 systemd[1]: Created slice kubepods-besteffort-pod33649465_2591_42c1_b37f_5c5ee7d9ef5e.slice - libcontainer container kubepods-besteffort-pod33649465_2591_42c1_b37f_5c5ee7d9ef5e.slice. May 9 02:01:48.143753 systemd[1]: Created slice kubepods-besteffort-podc5892730_f8e6_4ce6_a683_d0ddf4f4389e.slice - libcontainer container kubepods-besteffort-podc5892730_f8e6_4ce6_a683_d0ddf4f4389e.slice. 
May 9 02:01:48.164683 kubelet[2820]: I0509 02:01:48.164308 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhjnx\" (UniqueName: \"kubernetes.io/projected/35b5d640-2550-4b8b-9939-50b387738597-kube-api-access-zhjnx\") pod \"coredns-7db6d8ff4d-blfcp\" (UID: \"35b5d640-2550-4b8b-9939-50b387738597\") " pod="kube-system/coredns-7db6d8ff4d-blfcp" May 9 02:01:48.164683 kubelet[2820]: I0509 02:01:48.164400 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd2ch\" (UniqueName: \"kubernetes.io/projected/130a24ff-32ab-4698-82a3-c00a27cc01a1-kube-api-access-vd2ch\") pod \"coredns-7db6d8ff4d-6m9ql\" (UID: \"130a24ff-32ab-4698-82a3-c00a27cc01a1\") " pod="kube-system/coredns-7db6d8ff4d-6m9ql" May 9 02:01:48.164683 kubelet[2820]: I0509 02:01:48.164453 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35b5d640-2550-4b8b-9939-50b387738597-config-volume\") pod \"coredns-7db6d8ff4d-blfcp\" (UID: \"35b5d640-2550-4b8b-9939-50b387738597\") " pod="kube-system/coredns-7db6d8ff4d-blfcp" May 9 02:01:48.164683 kubelet[2820]: I0509 02:01:48.164505 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/130a24ff-32ab-4698-82a3-c00a27cc01a1-config-volume\") pod \"coredns-7db6d8ff4d-6m9ql\" (UID: \"130a24ff-32ab-4698-82a3-c00a27cc01a1\") " pod="kube-system/coredns-7db6d8ff4d-6m9ql" May 9 02:01:48.267380 kubelet[2820]: I0509 02:01:48.265283 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ed3c5ab6-d900-4bbb-847c-cc6ac6245a17-calico-apiserver-certs\") pod \"calico-apiserver-76d8dcc867-r899n\" (UID: \"ed3c5ab6-d900-4bbb-847c-cc6ac6245a17\") " pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" May 9 02:01:48.267380 kubelet[2820]: I0509 02:01:48.265395 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lddtr\" (UniqueName: \"kubernetes.io/projected/33649465-2591-42c1-b37f-5c5ee7d9ef5e-kube-api-access-lddtr\") pod \"calico-apiserver-76d8dcc867-sctwt\" (UID: \"33649465-2591-42c1-b37f-5c5ee7d9ef5e\") " pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" May 9 02:01:48.267380 kubelet[2820]: I0509 02:01:48.265436 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/33649465-2591-42c1-b37f-5c5ee7d9ef5e-calico-apiserver-certs\") pod \"calico-apiserver-76d8dcc867-sctwt\" (UID: \"33649465-2591-42c1-b37f-5c5ee7d9ef5e\") " pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" May 9 02:01:48.267380 kubelet[2820]: I0509 02:01:48.265509 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvdts\" (UniqueName: \"kubernetes.io/projected/ed3c5ab6-d900-4bbb-847c-cc6ac6245a17-kube-api-access-cvdts\") pod \"calico-apiserver-76d8dcc867-r899n\" (UID: \"ed3c5ab6-d900-4bbb-847c-cc6ac6245a17\") " pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" May 9 02:01:48.267380 kubelet[2820]: I0509 02:01:48.265549 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c5892730-f8e6-4ce6-a683-d0ddf4f4389e-tigera-ca-bundle\") pod \"calico-kube-controllers-86bd677f74-trx7j\" (UID: \"c5892730-f8e6-4ce6-a683-d0ddf4f4389e\") " pod="calico-system/calico-kube-controllers-86bd677f74-trx7j" May 9 02:01:48.267877 kubelet[2820]: I0509 02:01:48.265618 2820 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtvw\" (UniqueName: \"kubernetes.io/projected/c5892730-f8e6-4ce6-a683-d0ddf4f4389e-kube-api-access-frtvw\") pod \"calico-kube-controllers-86bd677f74-trx7j\" (UID: \"c5892730-f8e6-4ce6-a683-d0ddf4f4389e\") " pod="calico-system/calico-kube-controllers-86bd677f74-trx7j" May 9 02:01:48.451749 containerd[1478]: time="2025-05-09T02:01:48.450476399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bd677f74-trx7j,Uid:c5892730-f8e6-4ce6-a683-d0ddf4f4389e,Namespace:calico-system,Attempt:0,}" May 9 02:01:48.460330 systemd[1]: Created slice kubepods-besteffort-pod7c514110_0ced_4d72_9e74_278f80566401.slice - libcontainer container kubepods-besteffort-pod7c514110_0ced_4d72_9e74_278f80566401.slice. May 9 02:01:48.464233 containerd[1478]: time="2025-05-09T02:01:48.464182339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5w5lh,Uid:7c514110-0ced-4d72-9e74-278f80566401,Namespace:calico-system,Attempt:0,}" May 9 02:01:48.650996 containerd[1478]: time="2025-05-09T02:01:48.650922946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-blfcp,Uid:35b5d640-2550-4b8b-9939-50b387738597,Namespace:kube-system,Attempt:0,}" May 9 02:01:48.737513 containerd[1478]: time="2025-05-09T02:01:48.736810672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-6m9ql,Uid:130a24ff-32ab-4698-82a3-c00a27cc01a1,Namespace:kube-system,Attempt:0,}" May 9 02:01:48.737513 containerd[1478]: time="2025-05-09T02:01:48.737034992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-r899n,Uid:ed3c5ab6-d900-4bbb-847c-cc6ac6245a17,Namespace:calico-apiserver,Attempt:0,}" May 9 02:01:48.738909 containerd[1478]: time="2025-05-09T02:01:48.738889619Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 9 02:01:48.741697 containerd[1478]: time="2025-05-09T02:01:48.741672804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-sctwt,Uid:33649465-2591-42c1-b37f-5c5ee7d9ef5e,Namespace:calico-apiserver,Attempt:0,}" May 9 02:01:48.902413 containerd[1478]: time="2025-05-09T02:01:48.902349712Z" level=error msg="Failed to destroy network for sandbox \"0276d07654c274079f937becc0469c3cf4850fe67c189cd6ddfb341521528381\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:48.904924 containerd[1478]: time="2025-05-09T02:01:48.904830591Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bd677f74-trx7j,Uid:c5892730-f8e6-4ce6-a683-d0ddf4f4389e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0276d07654c274079f937becc0469c3cf4850fe67c189cd6ddfb341521528381\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:48.906122 kubelet[2820]: E0509 
02:01:48.905993 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0276d07654c274079f937becc0469c3cf4850fe67c189cd6ddfb341521528381\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:48.906122 kubelet[2820]: E0509 02:01:48.906106 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0276d07654c274079f937becc0469c3cf4850fe67c189cd6ddfb341521528381\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86bd677f74-trx7j" May 9 02:01:48.906929 kubelet[2820]: E0509 02:01:48.906137 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0276d07654c274079f937becc0469c3cf4850fe67c189cd6ddfb341521528381\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86bd677f74-trx7j" May 9 02:01:48.906929 kubelet[2820]: E0509 02:01:48.906200 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86bd677f74-trx7j_calico-system(c5892730-f8e6-4ce6-a683-d0ddf4f4389e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86bd677f74-trx7j_calico-system(c5892730-f8e6-4ce6-a683-d0ddf4f4389e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0276d07654c274079f937becc0469c3cf4850fe67c189cd6ddfb341521528381\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86bd677f74-trx7j" podUID="c5892730-f8e6-4ce6-a683-d0ddf4f4389e" May 9 02:01:48.999336 containerd[1478]: time="2025-05-09T02:01:48.999213343Z" level=error msg="Failed to destroy network for sandbox \"00372e7ec38a53fda0c6202f3004a2748dcab3439086a8fedf9a2297abc1f7e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.003264 containerd[1478]: time="2025-05-09T02:01:49.003064274Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5w5lh,Uid:7c514110-0ced-4d72-9e74-278f80566401,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"00372e7ec38a53fda0c6202f3004a2748dcab3439086a8fedf9a2297abc1f7e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.004135 kubelet[2820]: E0509 02:01:49.003711 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00372e7ec38a53fda0c6202f3004a2748dcab3439086a8fedf9a2297abc1f7e7\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.004135 kubelet[2820]: E0509 02:01:49.003827 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00372e7ec38a53fda0c6202f3004a2748dcab3439086a8fedf9a2297abc1f7e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5w5lh" May 9 02:01:49.004135 kubelet[2820]: E0509 02:01:49.003864 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00372e7ec38a53fda0c6202f3004a2748dcab3439086a8fedf9a2297abc1f7e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5w5lh" May 9 02:01:49.004302 kubelet[2820]: E0509 02:01:49.003941 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5w5lh_calico-system(7c514110-0ced-4d72-9e74-278f80566401)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-5w5lh_calico-system(7c514110-0ced-4d72-9e74-278f80566401)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"00372e7ec38a53fda0c6202f3004a2748dcab3439086a8fedf9a2297abc1f7e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5w5lh" podUID="7c514110-0ced-4d72-9e74-278f80566401" May 9 02:01:49.066096 containerd[1478]: time="2025-05-09T02:01:49.066029046Z" level=error msg="Failed to destroy network for sandbox \"b99310bdaf8c6e5cc265af4f0f48e57b5ad2e6f411bba3f8521a192d3d78bb5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.067442 containerd[1478]: time="2025-05-09T02:01:49.067403936Z" level=error msg="Failed to destroy network for sandbox \"5a08416ef7ca665323fb6627decd3dcae8b5789aee712724af123fd82e8e5ecf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.068197 containerd[1478]: time="2025-05-09T02:01:49.068143379Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-sctwt,Uid:33649465-2591-42c1-b37f-5c5ee7d9ef5e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b99310bdaf8c6e5cc265af4f0f48e57b5ad2e6f411bba3f8521a192d3d78bb5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.068794 kubelet[2820]: E0509 02:01:49.068437 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b99310bdaf8c6e5cc265af4f0f48e57b5ad2e6f411bba3f8521a192d3d78bb5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.068794 kubelet[2820]: E0509 02:01:49.068494 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b99310bdaf8c6e5cc265af4f0f48e57b5ad2e6f411bba3f8521a192d3d78bb5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" May 9 02:01:49.068794 kubelet[2820]: E0509 02:01:49.068517 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b99310bdaf8c6e5cc265af4f0f48e57b5ad2e6f411bba3f8521a192d3d78bb5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" May 9 02:01:49.069071 containerd[1478]: time="2025-05-09T02:01:49.068763449Z" level=error msg="Failed to destroy network for sandbox \"98d12b2350a94557a5c8c1fc7a6d376b3258fdfa992b46cf1e23b36c6679797c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.069108 kubelet[2820]: E0509 02:01:49.068747 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76d8dcc867-sctwt_calico-apiserver(33649465-2591-42c1-b37f-5c5ee7d9ef5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76d8dcc867-sctwt_calico-apiserver(33649465-2591-42c1-b37f-5c5ee7d9ef5e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b99310bdaf8c6e5cc265af4f0f48e57b5ad2e6f411bba3f8521a192d3d78bb5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" podUID="33649465-2591-42c1-b37f-5c5ee7d9ef5e" May 9 02:01:49.071779 containerd[1478]: time="2025-05-09T02:01:49.071721140Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-r899n,Uid:ed3c5ab6-d900-4bbb-847c-cc6ac6245a17,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a08416ef7ca665323fb6627decd3dcae8b5789aee712724af123fd82e8e5ecf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.072497 kubelet[2820]: E0509 02:01:49.072452 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a08416ef7ca665323fb6627decd3dcae8b5789aee712724af123fd82e8e5ecf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.072816 
kubelet[2820]: E0509 02:01:49.072778 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a08416ef7ca665323fb6627decd3dcae8b5789aee712724af123fd82e8e5ecf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" May 9 02:01:49.072897 kubelet[2820]: E0509 02:01:49.072820 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a08416ef7ca665323fb6627decd3dcae8b5789aee712724af123fd82e8e5ecf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" May 9 02:01:49.072938 kubelet[2820]: E0509 02:01:49.072889 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76d8dcc867-r899n_calico-apiserver(ed3c5ab6-d900-4bbb-847c-cc6ac6245a17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76d8dcc867-r899n_calico-apiserver(ed3c5ab6-d900-4bbb-847c-cc6ac6245a17)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a08416ef7ca665323fb6627decd3dcae8b5789aee712724af123fd82e8e5ecf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" podUID="ed3c5ab6-d900-4bbb-847c-cc6ac6245a17" May 9 02:01:49.073801 containerd[1478]: time="2025-05-09T02:01:49.073749844Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-6m9ql,Uid:130a24ff-32ab-4698-82a3-c00a27cc01a1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"98d12b2350a94557a5c8c1fc7a6d376b3258fdfa992b46cf1e23b36c6679797c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.073978 kubelet[2820]: E0509 02:01:49.073933 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98d12b2350a94557a5c8c1fc7a6d376b3258fdfa992b46cf1e23b36c6679797c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.074039 kubelet[2820]: E0509 02:01:49.073987 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98d12b2350a94557a5c8c1fc7a6d376b3258fdfa992b46cf1e23b36c6679797c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-6m9ql" May 9 02:01:49.074039 kubelet[2820]: E0509 02:01:49.074010 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"98d12b2350a94557a5c8c1fc7a6d376b3258fdfa992b46cf1e23b36c6679797c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-6m9ql" May 9 02:01:49.074154 kubelet[2820]: E0509 02:01:49.074048 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-6m9ql_kube-system(130a24ff-32ab-4698-82a3-c00a27cc01a1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-6m9ql_kube-system(130a24ff-32ab-4698-82a3-c00a27cc01a1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"98d12b2350a94557a5c8c1fc7a6d376b3258fdfa992b46cf1e23b36c6679797c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-6m9ql" podUID="130a24ff-32ab-4698-82a3-c00a27cc01a1" May 9 02:01:49.075193 containerd[1478]: time="2025-05-09T02:01:49.075128051Z" level=error msg="Failed to destroy network for sandbox \"f6187f9ba6a348edbcc8e28e4f3d858f1a95874e94fd3a3ec31d5ee77e0c0fc3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.078460 containerd[1478]: time="2025-05-09T02:01:49.078398077Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-blfcp,Uid:35b5d640-2550-4b8b-9939-50b387738597,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6187f9ba6a348edbcc8e28e4f3d858f1a95874e94fd3a3ec31d5ee77e0c0fc3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.078818 kubelet[2820]: E0509 02:01:49.078722 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6187f9ba6a348edbcc8e28e4f3d858f1a95874e94fd3a3ec31d5ee77e0c0fc3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:01:49.078896 kubelet[2820]: E0509 02:01:49.078846 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6187f9ba6a348edbcc8e28e4f3d858f1a95874e94fd3a3ec31d5ee77e0c0fc3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-blfcp" May 9 02:01:49.078936 kubelet[2820]: E0509 02:01:49.078898 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6187f9ba6a348edbcc8e28e4f3d858f1a95874e94fd3a3ec31d5ee77e0c0fc3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-blfcp" May 9 02:01:49.079011 kubelet[2820]: E0509 02:01:49.078978 2820 
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-blfcp_kube-system(35b5d640-2550-4b8b-9939-50b387738597)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-blfcp_kube-system(35b5d640-2550-4b8b-9939-50b387738597)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f6187f9ba6a348edbcc8e28e4f3d858f1a95874e94fd3a3ec31d5ee77e0c0fc3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-blfcp" podUID="35b5d640-2550-4b8b-9939-50b387738597" May 9 02:01:49.659908 systemd[1]: run-netns-cni\x2d77e7253a\x2d7052\x2dcc99\x2d5fca\x2d708799889339.mount: Deactivated successfully. May 9 02:01:49.660189 systemd[1]: run-netns-cni\x2d148dbf42\x2d7359\x2df6f7\x2dc73c\x2d152ed6694841.mount: Deactivated successfully. May 9 02:01:49.660365 systemd[1]: run-netns-cni\x2d07916ced\x2d286b\x2d6580\x2d8639\x2dd5514d5523a2.mount: Deactivated successfully. May 9 02:01:49.660534 systemd[1]: run-netns-cni\x2db259b63c\x2d233a\x2d88c4\x2d6f5c\x2dfda87b46d1d0.mount: Deactivated successfully. May 9 02:01:49.660758 systemd[1]: run-netns-cni\x2d1b6af99b\x2daaca\x2d6f6c\x2d15c2\x2dd308c41f7238.mount: Deactivated successfully. May 9 02:01:49.660929 systemd[1]: run-netns-cni\x2db9800faa\x2d3b8d\x2d5d06\x2d60a1\x2daedc494c5b3a.mount: Deactivated successfully. May 9 02:02:00.398669 containerd[1478]: time="2025-05-09T02:02:00.398483903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-sctwt,Uid:33649465-2591-42c1-b37f-5c5ee7d9ef5e,Namespace:calico-apiserver,Attempt:0,}" May 9 02:02:00.402612 containerd[1478]: time="2025-05-09T02:02:00.402166425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-blfcp,Uid:35b5d640-2550-4b8b-9939-50b387738597,Namespace:kube-system,Attempt:0,}" May 9 02:02:00.402612 containerd[1478]: time="2025-05-09T02:02:00.402352693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-r899n,Uid:ed3c5ab6-d900-4bbb-847c-cc6ac6245a17,Namespace:calico-apiserver,Attempt:0,}" May 9 02:02:00.767968 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount951150738.mount: Deactivated successfully. 
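
Every sandbox failure above trips over the same precondition: the Calico CNI plugin stats /var/lib/calico/nodename and gives up when it is absent, and the error text itself says to check that the calico/node container is running and has mounted /var/lib/calico/. Sandbox setup only starts succeeding after calico-node comes up at 02:02:01 below. A trivial check of that precondition, using only the path quoted in the error and nothing from Calico's own code:

```go
package main

import (
	"fmt"
	"os"
)

// Path taken verbatim from the CNI error messages in the log.
const nodenameFile = "/var/lib/calico/nodename"

func main() {
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		// Before calico-node has started this is the same condition the plugin reports:
		// stat /var/lib/calico/nodename: no such file or directory
		fmt.Printf("calico precondition not met: %v\n", err)
		os.Exit(1)
	}
	fmt.Printf("calico node name: %s\n", string(data))
}
```
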
May 9 02:02:00.851261 containerd[1478]: time="2025-05-09T02:02:00.851174009Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:00.853176 containerd[1478]: time="2025-05-09T02:02:00.853106175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 9 02:02:00.856690 containerd[1478]: time="2025-05-09T02:02:00.856335368Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:00.863324 containerd[1478]: time="2025-05-09T02:02:00.863275356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:00.865733 containerd[1478]: time="2025-05-09T02:02:00.865678603Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 12.126656576s" May 9 02:02:00.865876 containerd[1478]: time="2025-05-09T02:02:00.865856446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 9 02:02:00.915150 containerd[1478]: time="2025-05-09T02:02:00.915109732Z" level=info msg="CreateContainer within sandbox \"835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 9 02:02:00.939656 containerd[1478]: time="2025-05-09T02:02:00.938328654Z" level=info msg="Container 78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855: CDI devices from CRI Config.CDIDevices: []" May 9 02:02:00.943437 containerd[1478]: time="2025-05-09T02:02:00.943381511Z" level=error msg="Failed to destroy network for sandbox \"c56c0bb3e889e40fa11460e6462a6c5860a022799b5052ffc83f58d57bae7de0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:00.945417 containerd[1478]: time="2025-05-09T02:02:00.945375553Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-blfcp,Uid:35b5d640-2550-4b8b-9939-50b387738597,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c56c0bb3e889e40fa11460e6462a6c5860a022799b5052ffc83f58d57bae7de0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:00.946366 kubelet[2820]: E0509 02:02:00.946277 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c56c0bb3e889e40fa11460e6462a6c5860a022799b5052ffc83f58d57bae7de0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:00.947391 kubelet[2820]: 
E0509 02:02:00.946502 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c56c0bb3e889e40fa11460e6462a6c5860a022799b5052ffc83f58d57bae7de0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-blfcp" May 9 02:02:00.947391 kubelet[2820]: E0509 02:02:00.946713 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c56c0bb3e889e40fa11460e6462a6c5860a022799b5052ffc83f58d57bae7de0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-blfcp" May 9 02:02:00.948568 kubelet[2820]: E0509 02:02:00.946821 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-blfcp_kube-system(35b5d640-2550-4b8b-9939-50b387738597)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-blfcp_kube-system(35b5d640-2550-4b8b-9939-50b387738597)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c56c0bb3e889e40fa11460e6462a6c5860a022799b5052ffc83f58d57bae7de0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-blfcp" podUID="35b5d640-2550-4b8b-9939-50b387738597" May 9 02:02:00.964588 containerd[1478]: time="2025-05-09T02:02:00.964547291Z" level=info msg="CreateContainer within sandbox \"835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\"" May 9 02:02:00.967033 containerd[1478]: time="2025-05-09T02:02:00.966995984Z" level=info msg="StartContainer for \"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\"" May 9 02:02:00.974097 containerd[1478]: time="2025-05-09T02:02:00.974039114Z" level=error msg="Failed to destroy network for sandbox \"094c0c37ea28dd2e145ec14580a2b39094a88e101a2cba623fb52bbeb2bf4088\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:00.977027 containerd[1478]: time="2025-05-09T02:02:00.976951124Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-sctwt,Uid:33649465-2591-42c1-b37f-5c5ee7d9ef5e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"094c0c37ea28dd2e145ec14580a2b39094a88e101a2cba623fb52bbeb2bf4088\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:00.978228 kubelet[2820]: E0509 02:02:00.977933 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"094c0c37ea28dd2e145ec14580a2b39094a88e101a2cba623fb52bbeb2bf4088\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:00.978331 kubelet[2820]: E0509 02:02:00.978293 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"094c0c37ea28dd2e145ec14580a2b39094a88e101a2cba623fb52bbeb2bf4088\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" May 9 02:02:00.978331 kubelet[2820]: E0509 02:02:00.978322 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"094c0c37ea28dd2e145ec14580a2b39094a88e101a2cba623fb52bbeb2bf4088\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" May 9 02:02:00.980881 kubelet[2820]: E0509 02:02:00.978507 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76d8dcc867-sctwt_calico-apiserver(33649465-2591-42c1-b37f-5c5ee7d9ef5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76d8dcc867-sctwt_calico-apiserver(33649465-2591-42c1-b37f-5c5ee7d9ef5e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"094c0c37ea28dd2e145ec14580a2b39094a88e101a2cba623fb52bbeb2bf4088\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" podUID="33649465-2591-42c1-b37f-5c5ee7d9ef5e" May 9 02:02:00.983875 containerd[1478]: time="2025-05-09T02:02:00.983790083Z" level=info msg="connecting to shim 78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855" address="unix:///run/containerd/s/920d6f236e95077d59b16ca7f5fe20d131db92acf51d11ae4237c265a6cd0933" protocol=ttrpc version=3 May 9 02:02:00.998717 containerd[1478]: time="2025-05-09T02:02:00.998662255Z" level=error msg="Failed to destroy network for sandbox \"75b150413b99482dc0f0b70e92c8c7e2fd32c1563a2953a0f3b00e205e37bce2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:01.000451 containerd[1478]: time="2025-05-09T02:02:01.000404045Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-r899n,Uid:ed3c5ab6-d900-4bbb-847c-cc6ac6245a17,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"75b150413b99482dc0f0b70e92c8c7e2fd32c1563a2953a0f3b00e205e37bce2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:01.000759 kubelet[2820]: E0509 02:02:01.000665 2820 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"75b150413b99482dc0f0b70e92c8c7e2fd32c1563a2953a0f3b00e205e37bce2\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 9 02:02:01.000846 kubelet[2820]: E0509 02:02:01.000789 2820 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"75b150413b99482dc0f0b70e92c8c7e2fd32c1563a2953a0f3b00e205e37bce2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" May 9 02:02:01.000846 kubelet[2820]: E0509 02:02:01.000818 2820 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"75b150413b99482dc0f0b70e92c8c7e2fd32c1563a2953a0f3b00e205e37bce2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" May 9 02:02:01.000924 kubelet[2820]: E0509 02:02:01.000880 2820 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76d8dcc867-r899n_calico-apiserver(ed3c5ab6-d900-4bbb-847c-cc6ac6245a17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76d8dcc867-r899n_calico-apiserver(ed3c5ab6-d900-4bbb-847c-cc6ac6245a17)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"75b150413b99482dc0f0b70e92c8c7e2fd32c1563a2953a0f3b00e205e37bce2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" podUID="ed3c5ab6-d900-4bbb-847c-cc6ac6245a17" May 9 02:02:01.016833 systemd[1]: Started cri-containerd-78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855.scope - libcontainer container 78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855. May 9 02:02:01.072318 containerd[1478]: time="2025-05-09T02:02:01.072201858Z" level=info msg="StartContainer for \"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" returns successfully" May 9 02:02:01.211932 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 9 02:02:01.212037 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 9 02:02:01.749341 systemd[1]: run-netns-cni\x2dff28c856\x2d5290\x2dc86c\x2d611b\x2d0b5ac5a2def4.mount: Deactivated successfully. May 9 02:02:01.749578 systemd[1]: run-netns-cni\x2dadabef2e\x2d0cd7\x2d47da\x2ddb29\x2d316758d88f18.mount: Deactivated successfully. May 9 02:02:01.749816 systemd[1]: run-netns-cni\x2d16d4cd77\x2d2f83\x2df340\x2db6dd\x2dce34d903426a.mount: Deactivated successfully. 
May 9 02:02:02.037988 containerd[1478]: time="2025-05-09T02:02:02.037544076Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"2c16224f2a1508e04c45b91f215093b6097a846896775c5c191b92b4a373dcc4\" pid:3921 exit_status:1 exited_at:{seconds:1746756122 nanos:37138517}" May 9 02:02:02.407819 containerd[1478]: time="2025-05-09T02:02:02.407580002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5w5lh,Uid:7c514110-0ced-4d72-9e74-278f80566401,Namespace:calico-system,Attempt:0,}" May 9 02:02:02.684409 systemd-networkd[1387]: caliecc83a3b720: Link UP May 9 02:02:02.685121 systemd-networkd[1387]: caliecc83a3b720: Gained carrier May 9 02:02:02.707927 kubelet[2820]: I0509 02:02:02.707500 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-m9s6k" podStartSLOduration=2.423814975 podStartE2EDuration="34.707478556s" podCreationTimestamp="2025-05-09 02:01:28 +0000 UTC" firstStartedPulling="2025-05-09 02:01:28.600401573 +0000 UTC m=+23.317330803" lastFinishedPulling="2025-05-09 02:02:00.884065153 +0000 UTC m=+55.600994384" observedRunningTime="2025-05-09 02:02:02.029213544 +0000 UTC m=+56.746142785" watchObservedRunningTime="2025-05-09 02:02:02.707478556 +0000 UTC m=+57.424407776" May 9 02:02:02.717971 containerd[1478]: 2025-05-09 02:02:02.500 [INFO][3934] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 9 02:02:02.717971 containerd[1478]: 2025-05-09 02:02:02.542 [INFO][3934] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0 csi-node-driver- calico-system 7c514110-0ced-4d72-9e74-278f80566401 638 0 2025-05-09 02:01:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b7b4b9d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-n-abffc5acbe.novalocal csi-node-driver-5w5lh eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliecc83a3b720 [] []}} ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Namespace="calico-system" Pod="csi-node-driver-5w5lh" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-" May 9 02:02:02.717971 containerd[1478]: 2025-05-09 02:02:02.542 [INFO][3934] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Namespace="calico-system" Pod="csi-node-driver-5w5lh" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" May 9 02:02:02.717971 containerd[1478]: 2025-05-09 02:02:02.616 [INFO][3954] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" HandleID="k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.633 [INFO][3954] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" 
HandleID="k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ecbf0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-abffc5acbe.novalocal", "pod":"csi-node-driver-5w5lh", "timestamp":"2025-05-09 02:02:02.616105569 +0000 UTC"}, Hostname:"ci-4284-0-0-n-abffc5acbe.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.633 [INFO][3954] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.634 [INFO][3954] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.634 [INFO][3954] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-abffc5acbe.novalocal' May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.636 [INFO][3954] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.642 [INFO][3954] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.648 [INFO][3954] ipam/ipam.go 489: Trying affinity for 192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.650 [INFO][3954] ipam/ipam.go 155: Attempting to load block cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718182 containerd[1478]: 2025-05-09 02:02:02.654 [INFO][3954] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718440 containerd[1478]: 2025-05-09 02:02:02.654 [INFO][3954] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.60.192/26 handle="k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718440 containerd[1478]: 2025-05-09 02:02:02.656 [INFO][3954] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644 May 9 02:02:02.718440 containerd[1478]: 2025-05-09 02:02:02.663 [INFO][3954] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.60.192/26 handle="k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718440 containerd[1478]: 2025-05-09 02:02:02.671 [INFO][3954] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.60.193/26] block=192.168.60.192/26 handle="k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718440 containerd[1478]: 2025-05-09 02:02:02.671 [INFO][3954] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.60.193/26] handle="k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:02.718440 containerd[1478]: 
2025-05-09 02:02:02.671 [INFO][3954] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 9 02:02:02.718440 containerd[1478]: 2025-05-09 02:02:02.671 [INFO][3954] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.60.193/26] IPv6=[] ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" HandleID="k8s-pod-network.fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" May 9 02:02:02.718609 containerd[1478]: 2025-05-09 02:02:02.675 [INFO][3934] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Namespace="calico-system" Pod="csi-node-driver-5w5lh" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7c514110-0ced-4d72-9e74-278f80566401", ResourceVersion:"638", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"", Pod:"csi-node-driver-5w5lh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.60.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliecc83a3b720", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:02.719743 containerd[1478]: 2025-05-09 02:02:02.675 [INFO][3934] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.60.193/32] ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Namespace="calico-system" Pod="csi-node-driver-5w5lh" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" May 9 02:02:02.719743 containerd[1478]: 2025-05-09 02:02:02.675 [INFO][3934] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliecc83a3b720 ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Namespace="calico-system" Pod="csi-node-driver-5w5lh" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" May 9 02:02:02.719743 containerd[1478]: 2025-05-09 02:02:02.683 [INFO][3934] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Namespace="calico-system" Pod="csi-node-driver-5w5lh" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" May 9 02:02:02.719852 containerd[1478]: 2025-05-09 02:02:02.684 
[INFO][3934] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Namespace="calico-system" Pod="csi-node-driver-5w5lh" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7c514110-0ced-4d72-9e74-278f80566401", ResourceVersion:"638", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644", Pod:"csi-node-driver-5w5lh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.60.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliecc83a3b720", MAC:"8a:6a:10:8d:92:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:02.719918 containerd[1478]: 2025-05-09 02:02:02.707 [INFO][3934] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" Namespace="calico-system" Pod="csi-node-driver-5w5lh" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-csi--node--driver--5w5lh-eth0" May 9 02:02:02.783037 containerd[1478]: time="2025-05-09T02:02:02.782968967Z" level=info msg="connecting to shim fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644" address="unix:///run/containerd/s/4f25608a84bb4ff9cb1923aa62ad1a5519aeb43548c6c9677ec3d5d00431dc92" namespace=k8s.io protocol=ttrpc version=3 May 9 02:02:02.816863 systemd[1]: Started cri-containerd-fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644.scope - libcontainer container fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644. 
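
The block above is one complete Calico IPAM round for csi-node-driver-5w5lh: the plugin takes the host-wide IPAM lock, confirms this host's affinity for block 192.168.60.192/26, and claims 192.168.60.193/26 out of it. As a quick sanity check on those numbers (standard library only, with the CIDR and address copied from the log), the following prints the block's range and confirms the claimed address falls inside it:

package main

import (
	"fmt"
	"net"
)

func main() {
	// CIDR and address copied from the IPAM log lines above.
	_, block, err := net.ParseCIDR("192.168.60.192/26")
	if err != nil {
		panic(err)
	}
	claimed := net.ParseIP("192.168.60.193")

	// A /26 carries 6 host bits, so this block runs from .192 to .255.
	last := make(net.IP, len(block.IP))
	copy(last, block.IP)
	last[len(last)-1] |= 0x3f

	fmt.Printf("block %s spans %s-%s\n", block, block.IP, last)
	fmt.Printf("claimed %s is inside the block: %v\n", claimed, block.Contains(claimed))
}

The same walk repeats below for every pod scheduled on the node, which is why the later allocations come out as .194, .195 and .196 in order.
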
May 9 02:02:02.852028 containerd[1478]: time="2025-05-09T02:02:02.851986484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5w5lh,Uid:7c514110-0ced-4d72-9e74-278f80566401,Namespace:calico-system,Attempt:0,} returns sandbox id \"fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644\"" May 9 02:02:02.856037 containerd[1478]: time="2025-05-09T02:02:02.855770686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 9 02:02:02.993101 containerd[1478]: time="2025-05-09T02:02:02.991114752Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"e6e92d390020283cde1a1ac8d6f209009c55582cb9fee76c4216b2292bbb7044\" pid:4032 exit_status:1 exited_at:{seconds:1746756122 nanos:989869231}" May 9 02:02:03.400187 containerd[1478]: time="2025-05-09T02:02:03.399380369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-6m9ql,Uid:130a24ff-32ab-4698-82a3-c00a27cc01a1,Namespace:kube-system,Attempt:0,}" May 9 02:02:03.703171 systemd-networkd[1387]: calib241164b666: Link UP May 9 02:02:03.704530 systemd-networkd[1387]: calib241164b666: Gained carrier May 9 02:02:03.727700 containerd[1478]: 2025-05-09 02:02:03.491 [INFO][4045] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 9 02:02:03.727700 containerd[1478]: 2025-05-09 02:02:03.527 [INFO][4045] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0 coredns-7db6d8ff4d- kube-system 130a24ff-32ab-4698-82a3-c00a27cc01a1 753 0 2025-05-09 02:01:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-abffc5acbe.novalocal coredns-7db6d8ff4d-6m9ql eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib241164b666 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6m9ql" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-" May 9 02:02:03.727700 containerd[1478]: 2025-05-09 02:02:03.528 [INFO][4045] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6m9ql" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" May 9 02:02:03.727700 containerd[1478]: 2025-05-09 02:02:03.619 [INFO][4056] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" HandleID="k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.634 [INFO][4056] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" HandleID="k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0003be540), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-abffc5acbe.novalocal", "pod":"coredns-7db6d8ff4d-6m9ql", "timestamp":"2025-05-09 02:02:03.61966291 +0000 UTC"}, Hostname:"ci-4284-0-0-n-abffc5acbe.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.634 [INFO][4056] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.634 [INFO][4056] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.634 [INFO][4056] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-abffc5acbe.novalocal' May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.636 [INFO][4056] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.647 [INFO][4056] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.656 [INFO][4056] ipam/ipam.go 489: Trying affinity for 192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.660 [INFO][4056] ipam/ipam.go 155: Attempting to load block cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728060 containerd[1478]: 2025-05-09 02:02:03.663 [INFO][4056] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728414 containerd[1478]: 2025-05-09 02:02:03.664 [INFO][4056] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.60.192/26 handle="k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728414 containerd[1478]: 2025-05-09 02:02:03.666 [INFO][4056] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a May 9 02:02:03.728414 containerd[1478]: 2025-05-09 02:02:03.672 [INFO][4056] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.60.192/26 handle="k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728414 containerd[1478]: 2025-05-09 02:02:03.688 [INFO][4056] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.60.194/26] block=192.168.60.192/26 handle="k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728414 containerd[1478]: 2025-05-09 02:02:03.688 [INFO][4056] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.60.194/26] handle="k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:03.728414 containerd[1478]: 2025-05-09 02:02:03.688 [INFO][4056] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 9 02:02:03.728414 containerd[1478]: 2025-05-09 02:02:03.688 [INFO][4056] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.60.194/26] IPv6=[] ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" HandleID="k8s-pod-network.29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" May 9 02:02:03.729888 containerd[1478]: 2025-05-09 02:02:03.693 [INFO][4045] cni-plugin/k8s.go 386: Populated endpoint ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6m9ql" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"130a24ff-32ab-4698-82a3-c00a27cc01a1", ResourceVersion:"753", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-6m9ql", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.60.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib241164b666", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:03.729888 containerd[1478]: 2025-05-09 02:02:03.693 [INFO][4045] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.60.194/32] ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6m9ql" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" May 9 02:02:03.729888 containerd[1478]: 2025-05-09 02:02:03.693 [INFO][4045] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib241164b666 ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6m9ql" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" May 9 02:02:03.729888 containerd[1478]: 2025-05-09 02:02:03.706 [INFO][4045] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" 
Namespace="kube-system" Pod="coredns-7db6d8ff4d-6m9ql" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" May 9 02:02:03.729888 containerd[1478]: 2025-05-09 02:02:03.708 [INFO][4045] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6m9ql" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"130a24ff-32ab-4698-82a3-c00a27cc01a1", ResourceVersion:"753", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a", Pod:"coredns-7db6d8ff4d-6m9ql", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.60.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib241164b666", MAC:"3a:e6:58:b6:2b:61", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:03.729888 containerd[1478]: 2025-05-09 02:02:03.721 [INFO][4045] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6m9ql" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--6m9ql-eth0" May 9 02:02:03.805909 systemd-networkd[1387]: caliecc83a3b720: Gained IPv6LL May 9 02:02:03.882651 containerd[1478]: time="2025-05-09T02:02:03.882075688Z" level=info msg="connecting to shim 29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a" address="unix:///run/containerd/s/5ca1e7e4a263035a652c70d4b59739150c2ca184c182fd470a66e9a4f4eff566" namespace=k8s.io protocol=ttrpc version=3 May 9 02:02:04.016183 systemd[1]: Started cri-containerd-29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a.scope - libcontainer container 29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a. 
May 9 02:02:04.137055 containerd[1478]: time="2025-05-09T02:02:04.136804056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-6m9ql,Uid:130a24ff-32ab-4698-82a3-c00a27cc01a1,Namespace:kube-system,Attempt:0,} returns sandbox id \"29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a\"" May 9 02:02:04.148761 containerd[1478]: time="2025-05-09T02:02:04.148076837Z" level=info msg="CreateContainer within sandbox \"29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 9 02:02:04.178880 containerd[1478]: time="2025-05-09T02:02:04.178826876Z" level=info msg="Container bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b: CDI devices from CRI Config.CDIDevices: []" May 9 02:02:04.200281 containerd[1478]: time="2025-05-09T02:02:04.199868903Z" level=info msg="CreateContainer within sandbox \"29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b\"" May 9 02:02:04.202026 containerd[1478]: time="2025-05-09T02:02:04.202000582Z" level=info msg="StartContainer for \"bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b\"" May 9 02:02:04.204390 containerd[1478]: time="2025-05-09T02:02:04.203399000Z" level=info msg="connecting to shim bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b" address="unix:///run/containerd/s/5ca1e7e4a263035a652c70d4b59739150c2ca184c182fd470a66e9a4f4eff566" protocol=ttrpc version=3 May 9 02:02:04.240845 systemd[1]: Started cri-containerd-bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b.scope - libcontainer container bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b. 
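
Each "connecting to shim ... namespace=k8s.io protocol=ttrpc version=3" entry is the client side of starting one of these containers, and systemd then tracks the result as a cri-containerd-<id>.scope unit. A minimal, hypothetical sketch of inspecting those same containers through the containerd Go client is below; it assumes the stock socket path /run/containerd/containerd.sock and the k8s.io namespace shown in the log, and it is an illustration rather than part of the boot sequence.

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Default containerd socket; the shim sockets in the log live under /run/containerd/s/.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The CRI containers above are created in the k8s.io namespace (namespace=k8s.io).
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	containers, err := client.Containers(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range containers {
		task, err := c.Task(ctx, nil)
		if err != nil {
			fmt.Printf("%s: no running task\n", c.ID())
			continue
		}
		st, err := task.Status(ctx)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s: %s (pid %d)\n", c.ID(), st.Status, task.Pid())
	}
}
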
May 9 02:02:04.297156 containerd[1478]: time="2025-05-09T02:02:04.297053977Z" level=info msg="StartContainer for \"bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b\" returns successfully" May 9 02:02:04.398612 containerd[1478]: time="2025-05-09T02:02:04.397763258Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bd677f74-trx7j,Uid:c5892730-f8e6-4ce6-a683-d0ddf4f4389e,Namespace:calico-system,Attempt:0,}" May 9 02:02:04.479718 kernel: bpftool[4279]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 9 02:02:04.608972 systemd-networkd[1387]: calica3ef33c77e: Link UP May 9 02:02:04.610922 systemd-networkd[1387]: calica3ef33c77e: Gained carrier May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.475 [INFO][4249] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0 calico-kube-controllers-86bd677f74- calico-system c5892730-f8e6-4ce6-a683-d0ddf4f4389e 749 0 2025-05-09 02:01:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:86bd677f74 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-abffc5acbe.novalocal calico-kube-controllers-86bd677f74-trx7j eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calica3ef33c77e [] []}} ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Namespace="calico-system" Pod="calico-kube-controllers-86bd677f74-trx7j" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.476 [INFO][4249] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Namespace="calico-system" Pod="calico-kube-controllers-86bd677f74-trx7j" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.532 [INFO][4282] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" HandleID="k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.544 [INFO][4282] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" HandleID="k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318fc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-abffc5acbe.novalocal", "pod":"calico-kube-controllers-86bd677f74-trx7j", "timestamp":"2025-05-09 02:02:04.532320065 +0000 UTC"}, Hostname:"ci-4284-0-0-n-abffc5acbe.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.544 [INFO][4282] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.544 [INFO][4282] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.544 [INFO][4282] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-abffc5acbe.novalocal' May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.547 [INFO][4282] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.552 [INFO][4282] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.559 [INFO][4282] ipam/ipam.go 489: Trying affinity for 192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.561 [INFO][4282] ipam/ipam.go 155: Attempting to load block cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.564 [INFO][4282] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.564 [INFO][4282] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.60.192/26 handle="k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.572 [INFO][4282] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.579 [INFO][4282] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.60.192/26 handle="k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.592 [INFO][4282] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.60.195/26] block=192.168.60.192/26 handle="k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.593 [INFO][4282] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.60.195/26] handle="k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.593 [INFO][4282] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 9 02:02:04.646773 containerd[1478]: 2025-05-09 02:02:04.593 [INFO][4282] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.60.195/26] IPv6=[] ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" HandleID="k8s-pod-network.a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" May 9 02:02:04.648323 containerd[1478]: 2025-05-09 02:02:04.599 [INFO][4249] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Namespace="calico-system" Pod="calico-kube-controllers-86bd677f74-trx7j" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0", GenerateName:"calico-kube-controllers-86bd677f74-", Namespace:"calico-system", SelfLink:"", UID:"c5892730-f8e6-4ce6-a683-d0ddf4f4389e", ResourceVersion:"749", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86bd677f74", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"", Pod:"calico-kube-controllers-86bd677f74-trx7j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.60.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calica3ef33c77e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:04.648323 containerd[1478]: 2025-05-09 02:02:04.601 [INFO][4249] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.60.195/32] ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Namespace="calico-system" Pod="calico-kube-controllers-86bd677f74-trx7j" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" May 9 02:02:04.648323 containerd[1478]: 2025-05-09 02:02:04.601 [INFO][4249] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica3ef33c77e ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Namespace="calico-system" Pod="calico-kube-controllers-86bd677f74-trx7j" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" May 9 02:02:04.648323 containerd[1478]: 2025-05-09 02:02:04.611 [INFO][4249] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Namespace="calico-system" Pod="calico-kube-controllers-86bd677f74-trx7j" 
WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" May 9 02:02:04.648323 containerd[1478]: 2025-05-09 02:02:04.612 [INFO][4249] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Namespace="calico-system" Pod="calico-kube-controllers-86bd677f74-trx7j" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0", GenerateName:"calico-kube-controllers-86bd677f74-", Namespace:"calico-system", SelfLink:"", UID:"c5892730-f8e6-4ce6-a683-d0ddf4f4389e", ResourceVersion:"749", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86bd677f74", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c", Pod:"calico-kube-controllers-86bd677f74-trx7j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.60.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calica3ef33c77e", MAC:"6a:5b:a4:ce:5b:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:04.648323 containerd[1478]: 2025-05-09 02:02:04.642 [INFO][4249] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" Namespace="calico-system" Pod="calico-kube-controllers-86bd677f74-trx7j" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--kube--controllers--86bd677f74--trx7j-eth0" May 9 02:02:04.696213 containerd[1478]: time="2025-05-09T02:02:04.695685929Z" level=info msg="connecting to shim a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c" address="unix:///run/containerd/s/69aae255eb210832dd9a53055de96e0e5098f2300f8a2d36c39ea4e17fd5d81b" namespace=k8s.io protocol=ttrpc version=3 May 9 02:02:04.733817 systemd[1]: Started cri-containerd-a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c.scope - libcontainer container a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c. 
May 9 02:02:04.808182 containerd[1478]: time="2025-05-09T02:02:04.808122311Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bd677f74-trx7j,Uid:c5892730-f8e6-4ce6-a683-d0ddf4f4389e,Namespace:calico-system,Attempt:0,} returns sandbox id \"a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c\"" May 9 02:02:04.972680 systemd-networkd[1387]: vxlan.calico: Link UP May 9 02:02:04.972688 systemd-networkd[1387]: vxlan.calico: Gained carrier May 9 02:02:05.725950 systemd-networkd[1387]: calib241164b666: Gained IPv6LL May 9 02:02:06.018304 kubelet[2820]: I0509 02:02:06.017988 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-6m9ql" podStartSLOduration=47.017943593 podStartE2EDuration="47.017943593s" podCreationTimestamp="2025-05-09 02:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-09 02:02:04.929317688 +0000 UTC m=+59.646246968" watchObservedRunningTime="2025-05-09 02:02:06.017943593 +0000 UTC m=+60.734872824" May 9 02:02:06.332974 systemd-networkd[1387]: calica3ef33c77e: Gained IPv6LL May 9 02:02:06.487465 containerd[1478]: time="2025-05-09T02:02:06.486561899Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:06.488120 containerd[1478]: time="2025-05-09T02:02:06.488047240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 9 02:02:06.489606 containerd[1478]: time="2025-05-09T02:02:06.489554481Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:06.492241 containerd[1478]: time="2025-05-09T02:02:06.492203029Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:06.493774 containerd[1478]: time="2025-05-09T02:02:06.493743774Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 3.637935087s" May 9 02:02:06.493858 containerd[1478]: time="2025-05-09T02:02:06.493777888Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 9 02:02:06.495848 containerd[1478]: time="2025-05-09T02:02:06.495725844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 9 02:02:06.501433 containerd[1478]: time="2025-05-09T02:02:06.501373747Z" level=info msg="CreateContainer within sandbox \"fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 9 02:02:06.523543 containerd[1478]: time="2025-05-09T02:02:06.522064250Z" level=info msg="Container 6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2: CDI devices from CRI Config.CDIDevices: []" May 9 02:02:06.547212 containerd[1478]: time="2025-05-09T02:02:06.547103343Z" level=info 
msg="CreateContainer within sandbox \"fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2\"" May 9 02:02:06.550041 containerd[1478]: time="2025-05-09T02:02:06.549901832Z" level=info msg="StartContainer for \"6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2\"" May 9 02:02:06.565769 containerd[1478]: time="2025-05-09T02:02:06.565597835Z" level=info msg="connecting to shim 6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2" address="unix:///run/containerd/s/4f25608a84bb4ff9cb1923aa62ad1a5519aeb43548c6c9677ec3d5d00431dc92" protocol=ttrpc version=3 May 9 02:02:06.605878 systemd[1]: Started cri-containerd-6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2.scope - libcontainer container 6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2. May 9 02:02:06.686832 systemd-networkd[1387]: vxlan.calico: Gained IPv6LL May 9 02:02:06.719474 containerd[1478]: time="2025-05-09T02:02:06.719383933Z" level=info msg="StartContainer for \"6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2\" returns successfully" May 9 02:02:10.706675 containerd[1478]: time="2025-05-09T02:02:10.706032854Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:10.709675 containerd[1478]: time="2025-05-09T02:02:10.709435124Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 9 02:02:10.711342 containerd[1478]: time="2025-05-09T02:02:10.711140006Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:10.715931 containerd[1478]: time="2025-05-09T02:02:10.715857939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:10.716700 containerd[1478]: time="2025-05-09T02:02:10.716574561Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 4.220808762s" May 9 02:02:10.716700 containerd[1478]: time="2025-05-09T02:02:10.716663738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 9 02:02:10.728279 containerd[1478]: time="2025-05-09T02:02:10.726494986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 9 02:02:10.747078 containerd[1478]: time="2025-05-09T02:02:10.747033270Z" level=info msg="CreateContainer within sandbox \"a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 9 02:02:10.765343 containerd[1478]: time="2025-05-09T02:02:10.761099576Z" level=info msg="Container f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2: CDI devices 
from CRI Config.CDIDevices: []" May 9 02:02:10.795650 containerd[1478]: time="2025-05-09T02:02:10.795476421Z" level=info msg="CreateContainer within sandbox \"a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\"" May 9 02:02:10.798702 containerd[1478]: time="2025-05-09T02:02:10.798376882Z" level=info msg="StartContainer for \"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\"" May 9 02:02:10.800710 containerd[1478]: time="2025-05-09T02:02:10.800382857Z" level=info msg="connecting to shim f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2" address="unix:///run/containerd/s/69aae255eb210832dd9a53055de96e0e5098f2300f8a2d36c39ea4e17fd5d81b" protocol=ttrpc version=3 May 9 02:02:10.840816 systemd[1]: Started cri-containerd-f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2.scope - libcontainer container f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2. May 9 02:02:10.918384 containerd[1478]: time="2025-05-09T02:02:10.918308873Z" level=info msg="StartContainer for \"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" returns successfully" May 9 02:02:11.024780 kubelet[2820]: I0509 02:02:11.024564 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-86bd677f74-trx7j" podStartSLOduration=37.118926926 podStartE2EDuration="43.024489369s" podCreationTimestamp="2025-05-09 02:01:28 +0000 UTC" firstStartedPulling="2025-05-09 02:02:04.812437738 +0000 UTC m=+59.529366958" lastFinishedPulling="2025-05-09 02:02:10.718000161 +0000 UTC m=+65.434929401" observedRunningTime="2025-05-09 02:02:11.019799557 +0000 UTC m=+65.736728777" watchObservedRunningTime="2025-05-09 02:02:11.024489369 +0000 UTC m=+65.741418589" May 9 02:02:12.091653 containerd[1478]: time="2025-05-09T02:02:12.091574060Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"3a43354b4bcdb2644c04b90551891cb126b2d259df8e4ea7a8bdbd4f687638a2\" pid:4512 exited_at:{seconds:1746756132 nanos:88431927}" May 9 02:02:13.376803 containerd[1478]: time="2025-05-09T02:02:13.376743763Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:13.378937 containerd[1478]: time="2025-05-09T02:02:13.378875255Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 9 02:02:13.380814 containerd[1478]: time="2025-05-09T02:02:13.380772277Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:13.385350 containerd[1478]: time="2025-05-09T02:02:13.385261704Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:13.385968 containerd[1478]: time="2025-05-09T02:02:13.385926328Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.659372082s" May 9 02:02:13.386055 containerd[1478]: time="2025-05-09T02:02:13.385968177Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 9 02:02:13.390317 containerd[1478]: time="2025-05-09T02:02:13.390277636Z" level=info msg="CreateContainer within sandbox \"fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 9 02:02:13.409035 containerd[1478]: time="2025-05-09T02:02:13.408958968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-sctwt,Uid:33649465-2591-42c1-b37f-5c5ee7d9ef5e,Namespace:calico-apiserver,Attempt:0,}" May 9 02:02:13.416091 containerd[1478]: time="2025-05-09T02:02:13.413875704Z" level=info msg="Container 76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b: CDI devices from CRI Config.CDIDevices: []" May 9 02:02:13.443718 containerd[1478]: time="2025-05-09T02:02:13.443613450Z" level=info msg="CreateContainer within sandbox \"fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b\"" May 9 02:02:13.444838 containerd[1478]: time="2025-05-09T02:02:13.444808908Z" level=info msg="StartContainer for \"76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b\"" May 9 02:02:13.452326 containerd[1478]: time="2025-05-09T02:02:13.452244682Z" level=info msg="connecting to shim 76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b" address="unix:///run/containerd/s/4f25608a84bb4ff9cb1923aa62ad1a5519aeb43548c6c9677ec3d5d00431dc92" protocol=ttrpc version=3 May 9 02:02:13.496408 systemd[1]: Started cri-containerd-76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b.scope - libcontainer container 76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b. 
May 9 02:02:13.603444 containerd[1478]: time="2025-05-09T02:02:13.602755899Z" level=info msg="StartContainer for \"76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b\" returns successfully" May 9 02:02:13.667115 systemd-networkd[1387]: cali7bf4314f49d: Link UP May 9 02:02:13.667835 systemd-networkd[1387]: cali7bf4314f49d: Gained carrier May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.533 [INFO][4525] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0 calico-apiserver-76d8dcc867- calico-apiserver 33649465-2591-42c1-b37f-5c5ee7d9ef5e 756 0 2025-05-09 02:01:27 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76d8dcc867 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-abffc5acbe.novalocal calico-apiserver-76d8dcc867-sctwt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7bf4314f49d [] []}} ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-sctwt" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.534 [INFO][4525] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-sctwt" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.607 [INFO][4558] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" HandleID="k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.620 [INFO][4558] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" HandleID="k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004f2e00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-abffc5acbe.novalocal", "pod":"calico-apiserver-76d8dcc867-sctwt", "timestamp":"2025-05-09 02:02:13.606850366 +0000 UTC"}, Hostname:"ci-4284-0-0-n-abffc5acbe.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.620 [INFO][4558] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.621 [INFO][4558] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.621 [INFO][4558] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-abffc5acbe.novalocal' May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.623 [INFO][4558] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.629 [INFO][4558] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.635 [INFO][4558] ipam/ipam.go 489: Trying affinity for 192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.637 [INFO][4558] ipam/ipam.go 155: Attempting to load block cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.640 [INFO][4558] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.640 [INFO][4558] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.60.192/26 handle="k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.642 [INFO][4558] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.650 [INFO][4558] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.60.192/26 handle="k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.658 [INFO][4558] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.60.196/26] block=192.168.60.192/26 handle="k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.659 [INFO][4558] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.60.196/26] handle="k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.659 [INFO][4558] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
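The IPAM walk above confirms the node's affinity for the block 192.168.60.192/26 and then claims 192.168.60.196 from it. A standalone net/netip check (not Calico's IPAM code) that the claimed address does sit inside that /26:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.60.192/26") // block the node holds an affinity for
	ip := netip.MustParseAddr("192.168.60.196")         // address claimed for the apiserver pod
	// A /26 spans 64 addresses (192.168.60.192-192.168.60.255), so the claim
	// stays inside the node's affine block.
	fmt.Println(block.Contains(ip)) // true
}
```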
May 9 02:02:13.700171 containerd[1478]: 2025-05-09 02:02:13.659 [INFO][4558] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.60.196/26] IPv6=[] ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" HandleID="k8s-pod-network.f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" May 9 02:02:13.702091 containerd[1478]: 2025-05-09 02:02:13.661 [INFO][4525] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-sctwt" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0", GenerateName:"calico-apiserver-76d8dcc867-", Namespace:"calico-apiserver", SelfLink:"", UID:"33649465-2591-42c1-b37f-5c5ee7d9ef5e", ResourceVersion:"756", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76d8dcc867", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"", Pod:"calico-apiserver-76d8dcc867-sctwt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.60.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7bf4314f49d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:13.702091 containerd[1478]: 2025-05-09 02:02:13.662 [INFO][4525] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.60.196/32] ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-sctwt" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" May 9 02:02:13.702091 containerd[1478]: 2025-05-09 02:02:13.662 [INFO][4525] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7bf4314f49d ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-sctwt" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" May 9 02:02:13.702091 containerd[1478]: 2025-05-09 02:02:13.665 [INFO][4525] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-sctwt" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" May 9 02:02:13.702091 
containerd[1478]: 2025-05-09 02:02:13.669 [INFO][4525] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-sctwt" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0", GenerateName:"calico-apiserver-76d8dcc867-", Namespace:"calico-apiserver", SelfLink:"", UID:"33649465-2591-42c1-b37f-5c5ee7d9ef5e", ResourceVersion:"756", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76d8dcc867", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb", Pod:"calico-apiserver-76d8dcc867-sctwt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.60.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7bf4314f49d", MAC:"4a:24:9f:7a:6f:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:13.702091 containerd[1478]: 2025-05-09 02:02:13.688 [INFO][4525] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-sctwt" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--sctwt-eth0" May 9 02:02:13.702408 kubelet[2820]: I0509 02:02:13.701436 2820 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 9 02:02:13.702408 kubelet[2820]: I0509 02:02:13.701503 2820 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 9 02:02:13.773410 containerd[1478]: time="2025-05-09T02:02:13.773343241Z" level=info msg="connecting to shim f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb" address="unix:///run/containerd/s/32cde813791c5a9f399163f781047ab9045e4cb0342431188e30db193aef8c81" namespace=k8s.io protocol=ttrpc version=3 May 9 02:02:13.831847 systemd[1]: Started cri-containerd-f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb.scope - libcontainer container f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb. 
May 9 02:02:13.919971 containerd[1478]: time="2025-05-09T02:02:13.919804444Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"1b606679061f9853f782d3bcca78d45c2ff9bc81b5aed06f41cbebdd7579bf9f\" pid:4625 exited_at:{seconds:1746756133 nanos:919176297}" May 9 02:02:13.923918 containerd[1478]: time="2025-05-09T02:02:13.923881208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-sctwt,Uid:33649465-2591-42c1-b37f-5c5ee7d9ef5e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb\"" May 9 02:02:13.930787 containerd[1478]: time="2025-05-09T02:02:13.928957613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 9 02:02:14.105639 kubelet[2820]: I0509 02:02:14.103960 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5w5lh" podStartSLOduration=35.570270621 podStartE2EDuration="46.103942576s" podCreationTimestamp="2025-05-09 02:01:28 +0000 UTC" firstStartedPulling="2025-05-09 02:02:02.853804296 +0000 UTC m=+57.570733516" lastFinishedPulling="2025-05-09 02:02:13.387476251 +0000 UTC m=+68.104405471" observedRunningTime="2025-05-09 02:02:14.102761534 +0000 UTC m=+68.819690764" watchObservedRunningTime="2025-05-09 02:02:14.103942576 +0000 UTC m=+68.820871796" May 9 02:02:14.398478 containerd[1478]: time="2025-05-09T02:02:14.398355234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-r899n,Uid:ed3c5ab6-d900-4bbb-847c-cc6ac6245a17,Namespace:calico-apiserver,Attempt:0,}" May 9 02:02:14.582274 systemd-networkd[1387]: calid519b743b53: Link UP May 9 02:02:14.583918 systemd-networkd[1387]: calid519b743b53: Gained carrier May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.460 [INFO][4662] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0 calico-apiserver-76d8dcc867- calico-apiserver ed3c5ab6-d900-4bbb-847c-cc6ac6245a17 754 0 2025-05-09 02:01:27 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76d8dcc867 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-abffc5acbe.novalocal calico-apiserver-76d8dcc867-r899n eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid519b743b53 [] []}} ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-r899n" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.460 [INFO][4662] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-r899n" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.497 [INFO][4675] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" 
HandleID="k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.518 [INFO][4675] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" HandleID="k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000384ae0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-abffc5acbe.novalocal", "pod":"calico-apiserver-76d8dcc867-r899n", "timestamp":"2025-05-09 02:02:14.497864725 +0000 UTC"}, Hostname:"ci-4284-0-0-n-abffc5acbe.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.518 [INFO][4675] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.518 [INFO][4675] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.518 [INFO][4675] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-abffc5acbe.novalocal' May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.521 [INFO][4675] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.527 [INFO][4675] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.533 [INFO][4675] ipam/ipam.go 489: Trying affinity for 192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.537 [INFO][4675] ipam/ipam.go 155: Attempting to load block cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.540 [INFO][4675] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.540 [INFO][4675] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.60.192/26 handle="k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.543 [INFO][4675] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824 May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.552 [INFO][4675] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.60.192/26 handle="k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.567 [INFO][4675] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.60.197/26] block=192.168.60.192/26 
handle="k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.567 [INFO][4675] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.60.197/26] handle="k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.567 [INFO][4675] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 9 02:02:14.620492 containerd[1478]: 2025-05-09 02:02:14.567 [INFO][4675] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.60.197/26] IPv6=[] ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" HandleID="k8s-pod-network.e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" May 9 02:02:14.622275 containerd[1478]: 2025-05-09 02:02:14.570 [INFO][4662] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-r899n" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0", GenerateName:"calico-apiserver-76d8dcc867-", Namespace:"calico-apiserver", SelfLink:"", UID:"ed3c5ab6-d900-4bbb-847c-cc6ac6245a17", ResourceVersion:"754", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76d8dcc867", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"", Pod:"calico-apiserver-76d8dcc867-r899n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.60.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid519b743b53", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:14.622275 containerd[1478]: 2025-05-09 02:02:14.570 [INFO][4662] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.60.197/32] ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-r899n" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" May 9 02:02:14.622275 containerd[1478]: 2025-05-09 02:02:14.570 [INFO][4662] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid519b743b53 ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Namespace="calico-apiserver" 
Pod="calico-apiserver-76d8dcc867-r899n" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" May 9 02:02:14.622275 containerd[1478]: 2025-05-09 02:02:14.585 [INFO][4662] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-r899n" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" May 9 02:02:14.622275 containerd[1478]: 2025-05-09 02:02:14.586 [INFO][4662] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-r899n" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0", GenerateName:"calico-apiserver-76d8dcc867-", Namespace:"calico-apiserver", SelfLink:"", UID:"ed3c5ab6-d900-4bbb-847c-cc6ac6245a17", ResourceVersion:"754", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76d8dcc867", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824", Pod:"calico-apiserver-76d8dcc867-r899n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.60.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid519b743b53", MAC:"c6:d3:ec:7a:33:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:14.622275 containerd[1478]: 2025-05-09 02:02:14.617 [INFO][4662] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" Namespace="calico-apiserver" Pod="calico-apiserver-76d8dcc867-r899n" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-calico--apiserver--76d8dcc867--r899n-eth0" May 9 02:02:14.670646 containerd[1478]: time="2025-05-09T02:02:14.669100001Z" level=info msg="connecting to shim e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824" address="unix:///run/containerd/s/d4919014df855b835cbc3301976e6c0f9ac85926c2c3c5969f8f1816d01760d3" namespace=k8s.io protocol=ttrpc version=3 May 9 02:02:14.702898 systemd[1]: Started cri-containerd-e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824.scope - libcontainer container e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824. 
May 9 02:02:14.758274 containerd[1478]: time="2025-05-09T02:02:14.758219516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76d8dcc867-r899n,Uid:ed3c5ab6-d900-4bbb-847c-cc6ac6245a17,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824\"" May 9 02:02:14.877867 systemd-networkd[1387]: cali7bf4314f49d: Gained IPv6LL May 9 02:02:15.400306 containerd[1478]: time="2025-05-09T02:02:15.400095311Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-blfcp,Uid:35b5d640-2550-4b8b-9939-50b387738597,Namespace:kube-system,Attempt:0,}" May 9 02:02:15.629519 systemd-networkd[1387]: cali9d3fb0820c2: Link UP May 9 02:02:15.630272 systemd-networkd[1387]: cali9d3fb0820c2: Gained carrier May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.499 [INFO][4737] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0 coredns-7db6d8ff4d- kube-system 35b5d640-2550-4b8b-9939-50b387738597 747 0 2025-05-09 02:01:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-abffc5acbe.novalocal coredns-7db6d8ff4d-blfcp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9d3fb0820c2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-blfcp" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.499 [INFO][4737] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-blfcp" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.540 [INFO][4749] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" HandleID="k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.559 [INFO][4749] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" HandleID="k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004d6b30), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-abffc5acbe.novalocal", "pod":"coredns-7db6d8ff4d-blfcp", "timestamp":"2025-05-09 02:02:15.540294447 +0000 UTC"}, Hostname:"ci-4284-0-0-n-abffc5acbe.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.559 [INFO][4749] ipam/ipam_plugin.go 353: About to 
acquire host-wide IPAM lock. May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.559 [INFO][4749] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.559 [INFO][4749] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-abffc5acbe.novalocal' May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.562 [INFO][4749] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.570 [INFO][4749] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.579 [INFO][4749] ipam/ipam.go 489: Trying affinity for 192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.584 [INFO][4749] ipam/ipam.go 155: Attempting to load block cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.590 [INFO][4749] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.60.192/26 host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.590 [INFO][4749] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.60.192/26 handle="k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.594 [INFO][4749] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4 May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.602 [INFO][4749] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.60.192/26 handle="k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.620 [INFO][4749] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.60.198/26] block=192.168.60.192/26 handle="k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.620 [INFO][4749] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.60.198/26] handle="k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" host="ci-4284-0-0-n-abffc5acbe.novalocal" May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.620 [INFO][4749] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 9 02:02:15.653982 containerd[1478]: 2025-05-09 02:02:15.620 [INFO][4749] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.60.198/26] IPv6=[] ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" HandleID="k8s-pod-network.e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Workload="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" May 9 02:02:15.657581 containerd[1478]: 2025-05-09 02:02:15.623 [INFO][4737] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-blfcp" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"35b5d640-2550-4b8b-9939-50b387738597", ResourceVersion:"747", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-blfcp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.60.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d3fb0820c2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:15.657581 containerd[1478]: 2025-05-09 02:02:15.623 [INFO][4737] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.60.198/32] ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-blfcp" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" May 9 02:02:15.657581 containerd[1478]: 2025-05-09 02:02:15.623 [INFO][4737] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9d3fb0820c2 ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-blfcp" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" May 9 02:02:15.657581 containerd[1478]: 2025-05-09 02:02:15.630 [INFO][4737] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" 
Namespace="kube-system" Pod="coredns-7db6d8ff4d-blfcp" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" May 9 02:02:15.657581 containerd[1478]: 2025-05-09 02:02:15.632 [INFO][4737] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-blfcp" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"35b5d640-2550-4b8b-9939-50b387738597", ResourceVersion:"747", Generation:0, CreationTimestamp:time.Date(2025, time.May, 9, 2, 1, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-abffc5acbe.novalocal", ContainerID:"e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4", Pod:"coredns-7db6d8ff4d-blfcp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.60.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d3fb0820c2", MAC:"a6:f1:fb:23:f1:a7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 9 02:02:15.657581 containerd[1478]: 2025-05-09 02:02:15.647 [INFO][4737] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-blfcp" WorkloadEndpoint="ci--4284--0--0--n--abffc5acbe.novalocal-k8s-coredns--7db6d8ff4d--blfcp-eth0" May 9 02:02:15.692097 containerd[1478]: time="2025-05-09T02:02:15.692032561Z" level=info msg="connecting to shim e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4" address="unix:///run/containerd/s/d2012e7a4e7b70e65b112c89a85fb9e872275de9a61b71f5980e76e8e5217342" namespace=k8s.io protocol=ttrpc version=3 May 9 02:02:15.738935 systemd[1]: Started cri-containerd-e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4.scope - libcontainer container e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4. 
May 9 02:02:15.816427 containerd[1478]: time="2025-05-09T02:02:15.816121208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-blfcp,Uid:35b5d640-2550-4b8b-9939-50b387738597,Namespace:kube-system,Attempt:0,} returns sandbox id \"e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4\"" May 9 02:02:15.822560 containerd[1478]: time="2025-05-09T02:02:15.822524279Z" level=info msg="CreateContainer within sandbox \"e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 9 02:02:15.844213 containerd[1478]: time="2025-05-09T02:02:15.843302540Z" level=info msg="Container 15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5: CDI devices from CRI Config.CDIDevices: []" May 9 02:02:15.854480 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1487329561.mount: Deactivated successfully. May 9 02:02:15.866741 containerd[1478]: time="2025-05-09T02:02:15.866222271Z" level=info msg="CreateContainer within sandbox \"e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5\"" May 9 02:02:15.868985 containerd[1478]: time="2025-05-09T02:02:15.868952504Z" level=info msg="StartContainer for \"15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5\"" May 9 02:02:15.870385 containerd[1478]: time="2025-05-09T02:02:15.870328400Z" level=info msg="connecting to shim 15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5" address="unix:///run/containerd/s/d2012e7a4e7b70e65b112c89a85fb9e872275de9a61b71f5980e76e8e5217342" protocol=ttrpc version=3 May 9 02:02:15.909796 systemd[1]: Started cri-containerd-15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5.scope - libcontainer container 15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5. 
May 9 02:02:16.006297 containerd[1478]: time="2025-05-09T02:02:16.006173113Z" level=info msg="StartContainer for \"15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5\" returns successfully" May 9 02:02:16.029961 systemd-networkd[1387]: calid519b743b53: Gained IPv6LL May 9 02:02:16.083999 kubelet[2820]: I0509 02:02:16.083594 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-blfcp" podStartSLOduration=57.083572901 podStartE2EDuration="57.083572901s" podCreationTimestamp="2025-05-09 02:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-09 02:02:16.082659069 +0000 UTC m=+70.799588299" watchObservedRunningTime="2025-05-09 02:02:16.083572901 +0000 UTC m=+70.800502121" May 9 02:02:17.245899 systemd-networkd[1387]: cali9d3fb0820c2: Gained IPv6LL May 9 02:02:18.810658 containerd[1478]: time="2025-05-09T02:02:18.810513649Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"331d14a903055840b63dee7dce2d61d33da0377b09455609f64ab6df3b1e9e2a\" pid:4875 exited_at:{seconds:1746756138 nanos:808910346}" May 9 02:02:20.838418 containerd[1478]: time="2025-05-09T02:02:20.838344930Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:20.840518 containerd[1478]: time="2025-05-09T02:02:20.840433841Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 9 02:02:20.842533 containerd[1478]: time="2025-05-09T02:02:20.841858841Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:20.848198 containerd[1478]: time="2025-05-09T02:02:20.848145686Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:20.853150 containerd[1478]: time="2025-05-09T02:02:20.852601060Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 6.921777513s" May 9 02:02:20.853150 containerd[1478]: time="2025-05-09T02:02:20.852689666Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 9 02:02:20.856896 containerd[1478]: time="2025-05-09T02:02:20.855482247Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 9 02:02:20.859721 containerd[1478]: time="2025-05-09T02:02:20.859397059Z" level=info msg="CreateContainer within sandbox \"f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 9 02:02:20.890913 containerd[1478]: time="2025-05-09T02:02:20.890094154Z" level=info msg="Container 8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74: CDI devices from CRI Config.CDIDevices: []" May 9 
02:02:20.920335 containerd[1478]: time="2025-05-09T02:02:20.920257859Z" level=info msg="CreateContainer within sandbox \"f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74\"" May 9 02:02:20.922454 containerd[1478]: time="2025-05-09T02:02:20.922403037Z" level=info msg="StartContainer for \"8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74\"" May 9 02:02:20.924180 containerd[1478]: time="2025-05-09T02:02:20.924130722Z" level=info msg="connecting to shim 8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74" address="unix:///run/containerd/s/32cde813791c5a9f399163f781047ab9045e4cb0342431188e30db193aef8c81" protocol=ttrpc version=3 May 9 02:02:20.970315 systemd[1]: Started cri-containerd-8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74.scope - libcontainer container 8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74. May 9 02:02:21.151202 containerd[1478]: time="2025-05-09T02:02:21.150801246Z" level=info msg="StartContainer for \"8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74\" returns successfully" May 9 02:02:21.420915 containerd[1478]: time="2025-05-09T02:02:21.420746346Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 9 02:02:21.423433 containerd[1478]: time="2025-05-09T02:02:21.422884281Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 9 02:02:21.427679 containerd[1478]: time="2025-05-09T02:02:21.427604431Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 572.055099ms" May 9 02:02:21.427679 containerd[1478]: time="2025-05-09T02:02:21.427678410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 9 02:02:21.434234 containerd[1478]: time="2025-05-09T02:02:21.434182493Z" level=info msg="CreateContainer within sandbox \"e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 9 02:02:21.451175 containerd[1478]: time="2025-05-09T02:02:21.451116580Z" level=info msg="Container 8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d: CDI devices from CRI Config.CDIDevices: []" May 9 02:02:21.469499 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2554910724.mount: Deactivated successfully. 
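The same ghcr.io/flatcar/calico/apiserver:v3.29.3 image is pulled once per apiserver pod; the second pull, reported just above at 572.055099ms, completes far faster than the first (6.921777513s), presumably because the layers are already present locally and only the manifest is re-resolved. Parsing the two reported durations with Go's time.ParseDuration makes the gap explicit:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Durations exactly as containerd reports them for the two apiserver pulls above.
	first, err := time.ParseDuration("6.921777513s")
	if err != nil {
		panic(err)
	}
	second, err := time.ParseDuration("572.055099ms")
	if err != nil {
		panic(err)
	}
	fmt.Printf("first pull:  %v\nsecond pull: %v (about %.0fx faster)\n",
		first, second, first.Seconds()/second.Seconds())
}
```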
May 9 02:02:21.480469 containerd[1478]: time="2025-05-09T02:02:21.478130848Z" level=info msg="CreateContainer within sandbox \"e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d\"" May 9 02:02:21.481930 containerd[1478]: time="2025-05-09T02:02:21.481062730Z" level=info msg="StartContainer for \"8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d\"" May 9 02:02:21.484898 containerd[1478]: time="2025-05-09T02:02:21.484844753Z" level=info msg="connecting to shim 8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d" address="unix:///run/containerd/s/d4919014df855b835cbc3301976e6c0f9ac85926c2c3c5969f8f1816d01760d3" protocol=ttrpc version=3 May 9 02:02:21.555047 systemd[1]: Started cri-containerd-8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d.scope - libcontainer container 8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d. May 9 02:02:21.658498 containerd[1478]: time="2025-05-09T02:02:21.657287972Z" level=info msg="StartContainer for \"8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d\" returns successfully" May 9 02:02:22.134336 kubelet[2820]: I0509 02:02:22.133331 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-76d8dcc867-r899n" podStartSLOduration=48.464487672 podStartE2EDuration="55.133307369s" podCreationTimestamp="2025-05-09 02:01:27 +0000 UTC" firstStartedPulling="2025-05-09 02:02:14.760128842 +0000 UTC m=+69.477058062" lastFinishedPulling="2025-05-09 02:02:21.428948539 +0000 UTC m=+76.145877759" observedRunningTime="2025-05-09 02:02:22.130311638 +0000 UTC m=+76.847240858" watchObservedRunningTime="2025-05-09 02:02:22.133307369 +0000 UTC m=+76.850236599" May 9 02:02:23.659346 kubelet[2820]: I0509 02:02:23.658791 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-76d8dcc867-sctwt" podStartSLOduration=49.732085319 podStartE2EDuration="56.658767838s" podCreationTimestamp="2025-05-09 02:01:27 +0000 UTC" firstStartedPulling="2025-05-09 02:02:13.928661158 +0000 UTC m=+68.645590388" lastFinishedPulling="2025-05-09 02:02:20.855343687 +0000 UTC m=+75.572272907" observedRunningTime="2025-05-09 02:02:22.154263742 +0000 UTC m=+76.871192972" watchObservedRunningTime="2025-05-09 02:02:23.658767838 +0000 UTC m=+78.375697058" May 9 02:02:43.976750 containerd[1478]: time="2025-05-09T02:02:43.976423728Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"b60434f54767fce1a9436c68a73e0afae239afd7d04c85dd1354c6268d9ebad6\" pid:4991 exited_at:{seconds:1746756163 nanos:974293896}" May 9 02:02:46.455697 containerd[1478]: time="2025-05-09T02:02:46.455608022Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"7d297d95ec7b6d70cdfc3654faf00d267120ae742dff5442248b5384c4a71cbd\" pid:5022 exited_at:{seconds:1746756166 nanos:455060736}" May 9 02:02:48.561560 containerd[1478]: time="2025-05-09T02:02:48.561473079Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"1652d52150fe12ba9eff9b36de218518d01b75fef84d5f1f6c0a8729fe359b63\" pid:5043 exited_at:{seconds:1746756168 nanos:560717323}" May 9 
02:03:13.962469 containerd[1478]: time="2025-05-09T02:03:13.961914729Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"7dbabddc51cd135644b45408c5620ea9e37c516bd933659881a241f145e90202\" pid:5069 exited_at:{seconds:1746756193 nanos:961262968}" May 9 02:03:18.609198 containerd[1478]: time="2025-05-09T02:03:18.608795708Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"58938234024b345cb1b320f6b80ed4efe66286c74f6dd3618f062b30a447682f\" pid:5094 exited_at:{seconds:1746756198 nanos:607708299}" May 9 02:03:43.947375 containerd[1478]: time="2025-05-09T02:03:43.946802594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"8e4ce737a4230f6dd1daad3adf6e8409a26ce3aa3c2aa65189eb36707a7380a6\" pid:5144 exited_at:{seconds:1746756223 nanos:945487528}" May 9 02:03:46.440776 containerd[1478]: time="2025-05-09T02:03:46.440702498Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"e8edeae3ae80ebf0364f995f27b1ee87131e907e33c7b5bb220d6c5fc8246f7d\" pid:5168 exited_at:{seconds:1746756226 nanos:440308026}" May 9 02:03:48.554364 containerd[1478]: time="2025-05-09T02:03:48.554162454Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"0747b8b09c8587e2d34b443405cabfde390b9c89030b33c4cadf92f188dc8bb6\" pid:5190 exited_at:{seconds:1746756228 nanos:552355644}" May 9 02:03:58.482753 update_engine[1459]: I20250509 02:03:58.481690 1459 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 9 02:03:58.482753 update_engine[1459]: I20250509 02:03:58.481997 1459 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 9 02:03:58.484453 update_engine[1459]: I20250509 02:03:58.483531 1459 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 9 02:03:58.487743 update_engine[1459]: I20250509 02:03:58.486411 1459 omaha_request_params.cc:62] Current group set to alpha May 9 02:03:58.487743 update_engine[1459]: I20250509 02:03:58.487298 1459 update_attempter.cc:499] Already updated boot flags. Skipping. May 9 02:03:58.487743 update_engine[1459]: I20250509 02:03:58.487335 1459 update_attempter.cc:643] Scheduling an action processor start. 
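The pod_startup_latency_tracker entries above are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check using the values copied verbatim from the calico-apiserver-76d8dcc867-r899n line; the timestamp-parsing helper below is illustrative, not kubelet code:

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// parse drops the monotonic-clock suffix ("m=+...") that kubelet appends and
// parses the remaining wall-clock part of the timestamp.
func parse(ts string) time.Time {
	ts = strings.SplitN(ts, " m=", 2)[0]
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", ts)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the calico-apiserver-76d8dcc867-r899n startup line above.
	created := parse("2025-05-09 02:01:27 +0000 UTC")
	firstPull := parse("2025-05-09 02:02:14.760128842 +0000 UTC m=+69.477058062")
	lastPull := parse("2025-05-09 02:02:21.428948539 +0000 UTC m=+76.145877759")
	watched := parse("2025-05-09 02:02:22.133307369 +0000 UTC m=+76.850236599")

	e2e := watched.Sub(created)          // 55.133307369s, the reported podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 48.464487672s, the reported podStartSLOduration
	fmt.Println("E2E:", e2e, "SLO:", slo)
}
```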
May 9 02:03:58.487743 update_engine[1459]: I20250509 02:03:58.487460 1459 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 9 02:03:58.489104 update_engine[1459]: I20250509 02:03:58.489050 1459 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 9 02:03:58.489404 update_engine[1459]: I20250509 02:03:58.489344 1459 omaha_request_action.cc:271] Posting an Omaha request to disabled May 9 02:03:58.489404 update_engine[1459]: I20250509 02:03:58.489388 1459 omaha_request_action.cc:272] Request: May 9 02:03:58.489404 update_engine[1459]: May 9 02:03:58.489404 update_engine[1459]: May 9 02:03:58.489404 update_engine[1459]: May 9 02:03:58.489404 update_engine[1459]: May 9 02:03:58.489404 update_engine[1459]: May 9 02:03:58.489404 update_engine[1459]: May 9 02:03:58.489404 update_engine[1459]: May 9 02:03:58.489404 update_engine[1459]: May 9 02:03:58.497250 update_engine[1459]: I20250509 02:03:58.489418 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 9 02:03:58.505702 update_engine[1459]: I20250509 02:03:58.505322 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 9 02:03:58.506019 locksmithd[1487]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 9 02:03:58.506921 update_engine[1459]: I20250509 02:03:58.506830 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 9 02:03:58.514539 update_engine[1459]: E20250509 02:03:58.514427 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 9 02:03:58.514954 update_engine[1459]: I20250509 02:03:58.514780 1459 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 9 02:04:08.388180 update_engine[1459]: I20250509 02:04:08.387355 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 9 02:04:08.392599 update_engine[1459]: I20250509 02:04:08.389233 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 9 02:04:08.392599 update_engine[1459]: I20250509 02:04:08.390595 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 9 02:04:08.396010 update_engine[1459]: E20250509 02:04:08.395908 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 9 02:04:08.396365 update_engine[1459]: I20250509 02:04:08.396284 1459 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 9 02:04:13.963347 containerd[1478]: time="2025-05-09T02:04:13.963099098Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"2446d0183dae390a24b6218b8687acb5ddd61bca353778a58232a9eea0fe7c30\" pid:5217 exited_at:{seconds:1746756253 nanos:962353777}" May 9 02:04:18.386607 update_engine[1459]: I20250509 02:04:18.386452 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 9 02:04:18.387948 update_engine[1459]: I20250509 02:04:18.387044 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 9 02:04:18.387948 update_engine[1459]: I20250509 02:04:18.387588 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 9 02:04:18.393147 update_engine[1459]: E20250509 02:04:18.393059 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 9 02:04:18.393348 update_engine[1459]: I20250509 02:04:18.393203 1459 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 9 02:04:18.539806 containerd[1478]: time="2025-05-09T02:04:18.539748059Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"52ea39a30281d59342c7863d8df00b66555723e9ece63685d28d6aaf4994940e\" pid:5242 exited_at:{seconds:1746756258 nanos:539212993}" May 9 02:04:28.397610 update_engine[1459]: I20250509 02:04:28.394868 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 9 02:04:28.397610 update_engine[1459]: I20250509 02:04:28.397305 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 9 02:04:28.402820 update_engine[1459]: I20250509 02:04:28.401038 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 9 02:04:28.406074 update_engine[1459]: E20250509 02:04:28.405980 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 9 02:04:28.406413 update_engine[1459]: I20250509 02:04:28.406172 1459 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 9 02:04:28.406413 update_engine[1459]: I20250509 02:04:28.406225 1459 omaha_request_action.cc:617] Omaha request response: May 9 02:04:28.407092 update_engine[1459]: E20250509 02:04:28.406991 1459 omaha_request_action.cc:636] Omaha request network transfer failed. May 9 02:04:28.407657 update_engine[1459]: I20250509 02:04:28.407593 1459 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 9 02:04:28.407817 update_engine[1459]: I20250509 02:04:28.407662 1459 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 9 02:04:28.407817 update_engine[1459]: I20250509 02:04:28.407692 1459 update_attempter.cc:306] Processing Done. May 9 02:04:28.408027 update_engine[1459]: E20250509 02:04:28.407824 1459 update_attempter.cc:619] Update failed. May 9 02:04:28.408027 update_engine[1459]: I20250509 02:04:28.407857 1459 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 9 02:04:28.408027 update_engine[1459]: I20250509 02:04:28.407872 1459 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 9 02:04:28.408027 update_engine[1459]: I20250509 02:04:28.407887 1459 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
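The update check above never gets off the node: the configured Omaha server is literally "disabled", so every transfer fails with "Could not resolve host: disabled"; after three retries roughly ten seconds apart the agent reports the failure and falls back to the next scheduled check. A much-simplified Go sketch of that retry shape (the real update_engine is C++, and its policy here is only inferred from the log lines above):

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// checkForUpdate stands in for one Omaha request; the host name "disabled"
// can never resolve, matching the errors in the log.
func checkForUpdate() error {
	return errors.New("could not resolve host: disabled")
}

func main() {
	const maxRetries = 3                 // the log shows "retry 1".."retry 3" before giving up
	const retryDelay = 10 * time.Second  // the logged retries are spaced roughly 10 seconds apart
	var err error
	for attempt := 0; ; attempt++ {
		if err = checkForUpdate(); err == nil {
			fmt.Println("update check succeeded")
			return
		}
		if attempt >= maxRetries {
			break
		}
		fmt.Printf("no HTTP response, retry %d: %v\n", attempt+1, err)
		time.Sleep(retryDelay)
	}
	// After the retries are exhausted the real agent posts an error event and
	// waits for the next periodic check (42m16s later in the log).
	fmt.Println("update failed, reporting error event:", err)
}
```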
May 9 02:04:28.408734 update_engine[1459]: I20250509 02:04:28.408511 1459 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 9 02:04:28.408734 update_engine[1459]: I20250509 02:04:28.408707 1459 omaha_request_action.cc:271] Posting an Omaha request to disabled May 9 02:04:28.409032 update_engine[1459]: I20250509 02:04:28.408735 1459 omaha_request_action.cc:272] Request: May 9 02:04:28.409032 update_engine[1459]: May 9 02:04:28.409032 update_engine[1459]: May 9 02:04:28.409032 update_engine[1459]: May 9 02:04:28.409032 update_engine[1459]: May 9 02:04:28.409032 update_engine[1459]: May 9 02:04:28.409032 update_engine[1459]: May 9 02:04:28.409032 update_engine[1459]: I20250509 02:04:28.408763 1459 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 9 02:04:28.410953 update_engine[1459]: I20250509 02:04:28.409214 1459 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 9 02:04:28.410953 update_engine[1459]: I20250509 02:04:28.409797 1459 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 9 02:04:28.414553 locksmithd[1487]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 9 02:04:28.416079 update_engine[1459]: E20250509 02:04:28.415922 1459 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 9 02:04:28.416190 update_engine[1459]: I20250509 02:04:28.416105 1459 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 9 02:04:28.416190 update_engine[1459]: I20250509 02:04:28.416136 1459 omaha_request_action.cc:617] Omaha request response: May 9 02:04:28.416190 update_engine[1459]: I20250509 02:04:28.416152 1459 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 9 02:04:28.416190 update_engine[1459]: I20250509 02:04:28.416168 1459 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 9 02:04:28.416190 update_engine[1459]: I20250509 02:04:28.416181 1459 update_attempter.cc:306] Processing Done. May 9 02:04:28.416724 update_engine[1459]: I20250509 02:04:28.416195 1459 update_attempter.cc:310] Error event sent. May 9 02:04:28.416724 update_engine[1459]: I20250509 02:04:28.416237 1459 update_check_scheduler.cc:74] Next update check in 42m16s May 9 02:04:28.418408 locksmithd[1487]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 9 02:04:31.851751 systemd[1]: Started sshd@9-172.24.4.122:22-172.24.4.1:52638.service - OpenSSH per-connection server daemon (172.24.4.1:52638). May 9 02:04:32.995994 sshd[5257]: Accepted publickey for core from 172.24.4.1 port 52638 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:04:33.001871 sshd-session[5257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:04:33.041796 systemd-logind[1456]: New session 12 of user core. May 9 02:04:33.057086 systemd[1]: Started session-12.scope - Session 12 of User core. May 9 02:04:33.803427 sshd[5259]: Connection closed by 172.24.4.1 port 52638 May 9 02:04:33.803242 sshd-session[5257]: pam_unix(sshd:session): session closed for user core May 9 02:04:33.812451 systemd[1]: sshd@9-172.24.4.122:22-172.24.4.1:52638.service: Deactivated successfully. May 9 02:04:33.823450 systemd[1]: session-12.scope: Deactivated successfully. May 9 02:04:33.830213 systemd-logind[1456]: Session 12 logged out. 
Waiting for processes to exit. May 9 02:04:33.833432 systemd-logind[1456]: Removed session 12. May 9 02:04:38.829525 systemd[1]: Started sshd@10-172.24.4.122:22-172.24.4.1:49844.service - OpenSSH per-connection server daemon (172.24.4.1:49844). May 9 02:04:40.095987 sshd[5271]: Accepted publickey for core from 172.24.4.1 port 49844 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:04:40.103314 sshd-session[5271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:04:40.124745 systemd-logind[1456]: New session 13 of user core. May 9 02:04:40.151511 systemd[1]: Started session-13.scope - Session 13 of User core. May 9 02:04:40.874657 sshd[5273]: Connection closed by 172.24.4.1 port 49844 May 9 02:04:40.877115 sshd-session[5271]: pam_unix(sshd:session): session closed for user core May 9 02:04:40.890255 systemd[1]: sshd@10-172.24.4.122:22-172.24.4.1:49844.service: Deactivated successfully. May 9 02:04:40.899032 systemd[1]: session-13.scope: Deactivated successfully. May 9 02:04:40.901470 systemd-logind[1456]: Session 13 logged out. Waiting for processes to exit. May 9 02:04:40.905152 systemd-logind[1456]: Removed session 13. May 9 02:04:43.952127 containerd[1478]: time="2025-05-09T02:04:43.951914204Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"311921fc6987cc955d7b888a5d573220e744f4b5d0ff6e747c7a3463223dd9b8\" pid:5297 exited_at:{seconds:1746756283 nanos:950925246}" May 9 02:04:45.912005 systemd[1]: Started sshd@11-172.24.4.122:22-172.24.4.1:56788.service - OpenSSH per-connection server daemon (172.24.4.1:56788). May 9 02:04:46.450580 containerd[1478]: time="2025-05-09T02:04:46.449804278Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"18d9d029817ec09b68b739b82c70818e68b395e3091ce93358f5bfd174eb172d\" pid:5331 exited_at:{seconds:1746756286 nanos:448515598}" May 9 02:04:47.221915 sshd[5316]: Accepted publickey for core from 172.24.4.1 port 56788 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:04:47.227543 sshd-session[5316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:04:47.251228 systemd-logind[1456]: New session 14 of user core. May 9 02:04:47.262192 systemd[1]: Started session-14.scope - Session 14 of User core. May 9 02:04:47.992695 sshd[5340]: Connection closed by 172.24.4.1 port 56788 May 9 02:04:47.992047 sshd-session[5316]: pam_unix(sshd:session): session closed for user core May 9 02:04:48.001010 systemd[1]: sshd@11-172.24.4.122:22-172.24.4.1:56788.service: Deactivated successfully. May 9 02:04:48.007590 systemd[1]: session-14.scope: Deactivated successfully. May 9 02:04:48.012599 systemd-logind[1456]: Session 14 logged out. Waiting for processes to exit. May 9 02:04:48.016176 systemd-logind[1456]: Removed session 14. May 9 02:04:48.560419 containerd[1478]: time="2025-05-09T02:04:48.560338475Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"cc1b438284fe159469a651ddb3671660b879107a2e5e0feda5cab43d32e9e1fe\" pid:5364 exited_at:{seconds:1746756288 nanos:558712982}" May 9 02:04:53.016330 systemd[1]: Started sshd@12-172.24.4.122:22-172.24.4.1:56792.service - OpenSSH per-connection server daemon (172.24.4.1:56792). 
May 9 02:04:54.339143 sshd[5376]: Accepted publickey for core from 172.24.4.1 port 56792 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:04:54.343188 sshd-session[5376]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:04:54.353815 systemd-logind[1456]: New session 15 of user core. May 9 02:04:54.357898 systemd[1]: Started session-15.scope - Session 15 of User core. May 9 02:04:55.088228 sshd[5378]: Connection closed by 172.24.4.1 port 56792 May 9 02:04:55.089081 sshd-session[5376]: pam_unix(sshd:session): session closed for user core May 9 02:04:55.101335 systemd[1]: sshd@12-172.24.4.122:22-172.24.4.1:56792.service: Deactivated successfully. May 9 02:04:55.104237 systemd[1]: session-15.scope: Deactivated successfully. May 9 02:04:55.106819 systemd-logind[1456]: Session 15 logged out. Waiting for processes to exit. May 9 02:04:55.110512 systemd[1]: Started sshd@13-172.24.4.122:22-172.24.4.1:52546.service - OpenSSH per-connection server daemon (172.24.4.1:52546). May 9 02:04:55.113619 systemd-logind[1456]: Removed session 15. May 9 02:04:56.460891 sshd[5389]: Accepted publickey for core from 172.24.4.1 port 52546 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:04:56.465920 sshd-session[5389]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:04:56.477388 systemd-logind[1456]: New session 16 of user core. May 9 02:04:56.488224 systemd[1]: Started session-16.scope - Session 16 of User core. May 9 02:04:57.745787 sshd[5394]: Connection closed by 172.24.4.1 port 52546 May 9 02:04:57.751352 sshd-session[5389]: pam_unix(sshd:session): session closed for user core May 9 02:04:57.779289 systemd[1]: sshd@13-172.24.4.122:22-172.24.4.1:52546.service: Deactivated successfully. May 9 02:04:57.785218 systemd[1]: session-16.scope: Deactivated successfully. May 9 02:04:57.796860 systemd-logind[1456]: Session 16 logged out. Waiting for processes to exit. May 9 02:04:57.801097 systemd[1]: Started sshd@14-172.24.4.122:22-172.24.4.1:52560.service - OpenSSH per-connection server daemon (172.24.4.1:52560). May 9 02:04:57.813357 systemd-logind[1456]: Removed session 16. May 9 02:04:59.163849 sshd[5402]: Accepted publickey for core from 172.24.4.1 port 52560 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:04:59.166881 sshd-session[5402]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:04:59.184077 systemd-logind[1456]: New session 17 of user core. May 9 02:04:59.195208 systemd[1]: Started session-17.scope - Session 17 of User core. May 9 02:04:59.876106 sshd[5406]: Connection closed by 172.24.4.1 port 52560 May 9 02:04:59.878415 sshd-session[5402]: pam_unix(sshd:session): session closed for user core May 9 02:04:59.897243 systemd[1]: sshd@14-172.24.4.122:22-172.24.4.1:52560.service: Deactivated successfully. May 9 02:04:59.906970 systemd[1]: session-17.scope: Deactivated successfully. May 9 02:04:59.912910 systemd-logind[1456]: Session 17 logged out. Waiting for processes to exit. May 9 02:04:59.915989 systemd-logind[1456]: Removed session 17. May 9 02:05:04.891184 systemd[1]: Started sshd@15-172.24.4.122:22-172.24.4.1:36920.service - OpenSSH per-connection server daemon (172.24.4.1:36920). 
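[Editor's note] The SSH entries follow a fixed systemd pattern: a per-connection unit named sshd@<seq>-<local ip>:<port>-<client ip>:<port>.service is started, systemd-logind creates a session-<N>.scope for user core, and both are deactivated when the connection closes. The helper below is a hypothetical parser (not part of systemd or OpenSSH) that splits such a unit name into its parts, based only on the naming format visible in these lines.

import re

# Unit names in the log look like:
#   sshd@9-172.24.4.122:22-172.24.4.1:52638.service
UNIT_RE = re.compile(
    r"^sshd@(?P<seq>\d+)-(?P<local>[\d.]+:\d+)-(?P<remote>[\d.]+:\d+)\.service$"
)

def parse_sshd_unit(unit):
    """Return {'seq', 'local', 'remote'} for a per-connection sshd unit, else None."""
    m = UNIT_RE.match(unit)
    return m.groupdict() if m else None

print(parse_sshd_unit("sshd@9-172.24.4.122:22-172.24.4.1:52638.service"))
# -> {'seq': '9', 'local': '172.24.4.122:22', 'remote': '172.24.4.1:52638'}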
May 9 02:05:06.225617 sshd[5424]: Accepted publickey for core from 172.24.4.1 port 36920 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:06.227454 sshd-session[5424]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:06.241358 systemd-logind[1456]: New session 18 of user core. May 9 02:05:06.247878 systemd[1]: Started session-18.scope - Session 18 of User core. May 9 02:05:07.108663 sshd[5431]: Connection closed by 172.24.4.1 port 36920 May 9 02:05:07.109075 sshd-session[5424]: pam_unix(sshd:session): session closed for user core May 9 02:05:07.114500 systemd-logind[1456]: Session 18 logged out. Waiting for processes to exit. May 9 02:05:07.115094 systemd[1]: sshd@15-172.24.4.122:22-172.24.4.1:36920.service: Deactivated successfully. May 9 02:05:07.118845 systemd[1]: session-18.scope: Deactivated successfully. May 9 02:05:07.121763 systemd-logind[1456]: Removed session 18. May 9 02:05:12.147523 systemd[1]: Started sshd@16-172.24.4.122:22-172.24.4.1:36936.service - OpenSSH per-connection server daemon (172.24.4.1:36936). May 9 02:05:13.474795 sshd[5454]: Accepted publickey for core from 172.24.4.1 port 36936 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:13.479447 sshd-session[5454]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:13.497770 systemd-logind[1456]: New session 19 of user core. May 9 02:05:13.505270 systemd[1]: Started session-19.scope - Session 19 of User core. May 9 02:05:14.041526 containerd[1478]: time="2025-05-09T02:05:14.041243368Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"73e5f9d52e39569c33374947ffea7f79ed56a4a1766b5a25efaac112089a4747\" pid:5471 exited_at:{seconds:1746756314 nanos:25097448}" May 9 02:05:14.267691 sshd[5456]: Connection closed by 172.24.4.1 port 36936 May 9 02:05:14.267667 sshd-session[5454]: pam_unix(sshd:session): session closed for user core May 9 02:05:14.278427 systemd[1]: sshd@16-172.24.4.122:22-172.24.4.1:36936.service: Deactivated successfully. May 9 02:05:14.285670 systemd[1]: session-19.scope: Deactivated successfully. May 9 02:05:14.287252 systemd-logind[1456]: Session 19 logged out. Waiting for processes to exit. May 9 02:05:14.289441 systemd-logind[1456]: Removed session 19. May 9 02:05:18.563039 containerd[1478]: time="2025-05-09T02:05:18.562753065Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"5e191a62ecc6db322734d3212d81530f6375aad9eeda0a2b731fa52e4d082d89\" pid:5513 exited_at:{seconds:1746756318 nanos:560503342}" May 9 02:05:19.298254 systemd[1]: Started sshd@17-172.24.4.122:22-172.24.4.1:50900.service - OpenSSH per-connection server daemon (172.24.4.1:50900). May 9 02:05:20.586804 sshd[5523]: Accepted publickey for core from 172.24.4.1 port 50900 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:20.593118 sshd-session[5523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:20.610135 systemd-logind[1456]: New session 20 of user core. May 9 02:05:20.616013 systemd[1]: Started session-20.scope - Session 20 of User core. 
May 9 02:05:21.458638 sshd[5527]: Connection closed by 172.24.4.1 port 50900 May 9 02:05:21.459350 sshd-session[5523]: pam_unix(sshd:session): session closed for user core May 9 02:05:21.477216 systemd[1]: sshd@17-172.24.4.122:22-172.24.4.1:50900.service: Deactivated successfully. May 9 02:05:21.481352 systemd[1]: session-20.scope: Deactivated successfully. May 9 02:05:21.483099 systemd-logind[1456]: Session 20 logged out. Waiting for processes to exit. May 9 02:05:21.490424 systemd[1]: Started sshd@18-172.24.4.122:22-172.24.4.1:50914.service - OpenSSH per-connection server daemon (172.24.4.1:50914). May 9 02:05:21.496994 systemd-logind[1456]: Removed session 20. May 9 02:05:22.835122 sshd[5538]: Accepted publickey for core from 172.24.4.1 port 50914 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:22.839033 sshd-session[5538]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:22.857342 systemd-logind[1456]: New session 21 of user core. May 9 02:05:22.865180 systemd[1]: Started session-21.scope - Session 21 of User core. May 9 02:05:23.978323 sshd[5541]: Connection closed by 172.24.4.1 port 50914 May 9 02:05:23.980463 sshd-session[5538]: pam_unix(sshd:session): session closed for user core May 9 02:05:23.991048 systemd[1]: Started sshd@19-172.24.4.122:22-172.24.4.1:43868.service - OpenSSH per-connection server daemon (172.24.4.1:43868). May 9 02:05:23.993016 systemd[1]: sshd@18-172.24.4.122:22-172.24.4.1:50914.service: Deactivated successfully. May 9 02:05:24.001247 systemd[1]: session-21.scope: Deactivated successfully. May 9 02:05:24.006916 systemd-logind[1456]: Session 21 logged out. Waiting for processes to exit. May 9 02:05:24.013893 systemd-logind[1456]: Removed session 21. May 9 02:05:25.306161 sshd[5547]: Accepted publickey for core from 172.24.4.1 port 43868 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:25.310200 sshd-session[5547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:25.329156 systemd-logind[1456]: New session 22 of user core. May 9 02:05:25.340037 systemd[1]: Started session-22.scope - Session 22 of User core. May 9 02:05:29.073251 sshd[5552]: Connection closed by 172.24.4.1 port 43868 May 9 02:05:29.076291 sshd-session[5547]: pam_unix(sshd:session): session closed for user core May 9 02:05:29.086060 systemd[1]: Started sshd@20-172.24.4.122:22-172.24.4.1:43884.service - OpenSSH per-connection server daemon (172.24.4.1:43884). May 9 02:05:29.089481 systemd[1]: sshd@19-172.24.4.122:22-172.24.4.1:43868.service: Deactivated successfully. May 9 02:05:29.097890 systemd[1]: session-22.scope: Deactivated successfully. May 9 02:05:29.098216 systemd[1]: session-22.scope: Consumed 993ms CPU time, 64.1M memory peak. May 9 02:05:29.102778 systemd-logind[1456]: Session 22 logged out. Waiting for processes to exit. May 9 02:05:29.109615 systemd-logind[1456]: Removed session 22. May 9 02:05:30.248727 sshd[5566]: Accepted publickey for core from 172.24.4.1 port 43884 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:30.252310 sshd-session[5566]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:30.269789 systemd-logind[1456]: New session 23 of user core. May 9 02:05:30.276095 systemd[1]: Started session-23.scope - Session 23 of User core. 
May 9 02:05:31.235272 sshd[5571]: Connection closed by 172.24.4.1 port 43884 May 9 02:05:31.234123 sshd-session[5566]: pam_unix(sshd:session): session closed for user core May 9 02:05:31.245740 systemd[1]: sshd@20-172.24.4.122:22-172.24.4.1:43884.service: Deactivated successfully. May 9 02:05:31.249411 systemd[1]: session-23.scope: Deactivated successfully. May 9 02:05:31.251337 systemd-logind[1456]: Session 23 logged out. Waiting for processes to exit. May 9 02:05:31.257213 systemd[1]: Started sshd@21-172.24.4.122:22-172.24.4.1:43894.service - OpenSSH per-connection server daemon (172.24.4.1:43894). May 9 02:05:31.259394 systemd-logind[1456]: Removed session 23. May 9 02:05:32.583172 sshd[5580]: Accepted publickey for core from 172.24.4.1 port 43894 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:32.585000 sshd-session[5580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:32.591141 systemd-logind[1456]: New session 24 of user core. May 9 02:05:32.599853 systemd[1]: Started session-24.scope - Session 24 of User core. May 9 02:05:33.395756 sshd[5583]: Connection closed by 172.24.4.1 port 43894 May 9 02:05:33.397021 sshd-session[5580]: pam_unix(sshd:session): session closed for user core May 9 02:05:33.408440 systemd[1]: sshd@21-172.24.4.122:22-172.24.4.1:43894.service: Deactivated successfully. May 9 02:05:33.417305 systemd[1]: session-24.scope: Deactivated successfully. May 9 02:05:33.419699 systemd-logind[1456]: Session 24 logged out. Waiting for processes to exit. May 9 02:05:33.422724 systemd-logind[1456]: Removed session 24. May 9 02:05:38.418182 systemd[1]: Started sshd@22-172.24.4.122:22-172.24.4.1:47006.service - OpenSSH per-connection server daemon (172.24.4.1:47006). May 9 02:05:39.706985 sshd[5595]: Accepted publickey for core from 172.24.4.1 port 47006 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:39.712541 sshd-session[5595]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:39.735491 systemd-logind[1456]: New session 25 of user core. May 9 02:05:39.745124 systemd[1]: Started session-25.scope - Session 25 of User core. May 9 02:05:40.569786 sshd[5600]: Connection closed by 172.24.4.1 port 47006 May 9 02:05:40.570907 sshd-session[5595]: pam_unix(sshd:session): session closed for user core May 9 02:05:40.577488 systemd[1]: sshd@22-172.24.4.122:22-172.24.4.1:47006.service: Deactivated successfully. May 9 02:05:40.578198 systemd-logind[1456]: Session 25 logged out. Waiting for processes to exit. May 9 02:05:40.580553 systemd[1]: session-25.scope: Deactivated successfully. May 9 02:05:40.584331 systemd-logind[1456]: Removed session 25. May 9 02:05:43.922599 containerd[1478]: time="2025-05-09T02:05:43.922312745Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"727fde9d4b95980dceaf5af2baf09a6e912499a83575146c19624c763f946e40\" pid:5624 exited_at:{seconds:1746756343 nanos:918292570}" May 9 02:05:45.604062 systemd[1]: Started sshd@23-172.24.4.122:22-172.24.4.1:59686.service - OpenSSH per-connection server daemon (172.24.4.1:59686). 
May 9 02:05:46.445896 containerd[1478]: time="2025-05-09T02:05:46.445812333Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"14d6294f2a929876590bf3c8fb0a16fd885d44494cc778608e3837ff435c1abc\" pid:5653 exited_at:{seconds:1746756346 nanos:444360258}" May 9 02:05:46.962892 sshd[5638]: Accepted publickey for core from 172.24.4.1 port 59686 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:46.966716 sshd-session[5638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:46.983795 systemd-logind[1456]: New session 26 of user core. May 9 02:05:46.992311 systemd[1]: Started session-26.scope - Session 26 of User core. May 9 02:05:47.884451 sshd[5661]: Connection closed by 172.24.4.1 port 59686 May 9 02:05:47.885140 sshd-session[5638]: pam_unix(sshd:session): session closed for user core May 9 02:05:47.894536 systemd[1]: sshd@23-172.24.4.122:22-172.24.4.1:59686.service: Deactivated successfully. May 9 02:05:47.904081 systemd[1]: session-26.scope: Deactivated successfully. May 9 02:05:47.907210 systemd-logind[1456]: Session 26 logged out. Waiting for processes to exit. May 9 02:05:47.910679 systemd-logind[1456]: Removed session 26. May 9 02:05:48.499147 containerd[1478]: time="2025-05-09T02:05:48.499078254Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"55da42ceea99c7c820d01d7f5ade19f079fb670232d9168a0a02467bc60763a8\" pid:5683 exited_at:{seconds:1746756348 nanos:497688175}" May 9 02:05:52.909710 systemd[1]: Started sshd@24-172.24.4.122:22-172.24.4.1:59700.service - OpenSSH per-connection server daemon (172.24.4.1:59700). May 9 02:05:54.260695 sshd[5696]: Accepted publickey for core from 172.24.4.1 port 59700 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:05:54.263452 sshd-session[5696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:05:54.276339 systemd-logind[1456]: New session 27 of user core. May 9 02:05:54.288954 systemd[1]: Started session-27.scope - Session 27 of User core. May 9 02:05:55.008486 sshd[5698]: Connection closed by 172.24.4.1 port 59700 May 9 02:05:55.009604 sshd-session[5696]: pam_unix(sshd:session): session closed for user core May 9 02:05:55.016026 systemd[1]: sshd@24-172.24.4.122:22-172.24.4.1:59700.service: Deactivated successfully. May 9 02:05:55.020703 systemd[1]: session-27.scope: Deactivated successfully. May 9 02:05:55.024301 systemd-logind[1456]: Session 27 logged out. Waiting for processes to exit. May 9 02:05:55.026776 systemd-logind[1456]: Removed session 27. 
May 9 02:05:59.217511 containerd[1478]: time="2025-05-09T02:05:59.217108957Z" level=warning msg="container event discarded" container=2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0 type=CONTAINER_CREATED_EVENT May 9 02:05:59.229028 containerd[1478]: time="2025-05-09T02:05:59.228885324Z" level=warning msg="container event discarded" container=2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0 type=CONTAINER_STARTED_EVENT May 9 02:05:59.627022 containerd[1478]: time="2025-05-09T02:05:59.626792623Z" level=warning msg="container event discarded" container=860fd1fca0d8689c4fc5bba3eeb01cace53d2963f7e79afa7d6b2545dfdd77f5 type=CONTAINER_CREATED_EVENT May 9 02:05:59.627022 containerd[1478]: time="2025-05-09T02:05:59.626923448Z" level=warning msg="container event discarded" container=860fd1fca0d8689c4fc5bba3eeb01cace53d2963f7e79afa7d6b2545dfdd77f5 type=CONTAINER_STARTED_EVENT May 9 02:05:59.652439 containerd[1478]: time="2025-05-09T02:05:59.652289255Z" level=warning msg="container event discarded" container=062e4616d77d53c4de23c564bfaccfc5d071e7b63d36d1749445bc49aa18c3ba type=CONTAINER_CREATED_EVENT May 9 02:05:59.652439 containerd[1478]: time="2025-05-09T02:05:59.652409821Z" level=warning msg="container event discarded" container=062e4616d77d53c4de23c564bfaccfc5d071e7b63d36d1749445bc49aa18c3ba type=CONTAINER_STARTED_EVENT May 9 02:05:59.682339 containerd[1478]: time="2025-05-09T02:05:59.682135027Z" level=warning msg="container event discarded" container=cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000 type=CONTAINER_CREATED_EVENT May 9 02:05:59.707952 containerd[1478]: time="2025-05-09T02:05:59.707699075Z" level=warning msg="container event discarded" container=4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265 type=CONTAINER_CREATED_EVENT May 9 02:05:59.730817 containerd[1478]: time="2025-05-09T02:05:59.730563497Z" level=warning msg="container event discarded" container=99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e type=CONTAINER_CREATED_EVENT May 9 02:05:59.845270 containerd[1478]: time="2025-05-09T02:05:59.845100112Z" level=warning msg="container event discarded" container=cf80c3e6e3ada7fa1bd9b74b823513a8111c5da037542eaddb86b6c436586000 type=CONTAINER_STARTED_EVENT May 9 02:05:59.878418 containerd[1478]: time="2025-05-09T02:05:59.877984185Z" level=warning msg="container event discarded" container=4deedc1792f9759ba4c7818db61b16725b038005565f7778bc384f6846869265 type=CONTAINER_STARTED_EVENT May 9 02:05:59.878418 containerd[1478]: time="2025-05-09T02:05:59.878079986Z" level=warning msg="container event discarded" container=99793087f021c42b2aea6346cee4c8b586d5a5b99d09f6d5137433194cc3cf2e type=CONTAINER_STARTED_EVENT May 9 02:06:00.036452 systemd[1]: Started sshd@25-172.24.4.122:22-172.24.4.1:34800.service - OpenSSH per-connection server daemon (172.24.4.1:34800). May 9 02:06:01.351708 sshd[5709]: Accepted publickey for core from 172.24.4.1 port 34800 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:06:01.353809 sshd-session[5709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:06:01.368162 systemd-logind[1456]: New session 28 of user core. May 9 02:06:01.374968 systemd[1]: Started session-28.scope - Session 28 of User core. 
May 9 02:06:02.086809 sshd[5711]: Connection closed by 172.24.4.1 port 34800 May 9 02:06:02.087962 sshd-session[5709]: pam_unix(sshd:session): session closed for user core May 9 02:06:02.095300 systemd[1]: sshd@25-172.24.4.122:22-172.24.4.1:34800.service: Deactivated successfully. May 9 02:06:02.100980 systemd[1]: session-28.scope: Deactivated successfully. May 9 02:06:02.104711 systemd-logind[1456]: Session 28 logged out. Waiting for processes to exit. May 9 02:06:02.107903 systemd-logind[1456]: Removed session 28. May 9 02:06:07.121444 systemd[1]: Started sshd@26-172.24.4.122:22-172.24.4.1:47074.service - OpenSSH per-connection server daemon (172.24.4.1:47074). May 9 02:06:08.272433 sshd[5727]: Accepted publickey for core from 172.24.4.1 port 47074 ssh2: RSA SHA256:NEPSem0kL1hy9llaBF9CLFRCDHwio9wvc/afolXNBLk May 9 02:06:08.280547 sshd-session[5727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 9 02:06:08.303544 systemd-logind[1456]: New session 29 of user core. May 9 02:06:08.319082 systemd[1]: Started session-29.scope - Session 29 of User core. May 9 02:06:09.034687 sshd[5729]: Connection closed by 172.24.4.1 port 47074 May 9 02:06:09.033998 sshd-session[5727]: pam_unix(sshd:session): session closed for user core May 9 02:06:09.044665 systemd[1]: sshd@26-172.24.4.122:22-172.24.4.1:47074.service: Deactivated successfully. May 9 02:06:09.052320 systemd[1]: session-29.scope: Deactivated successfully. May 9 02:06:09.056112 systemd-logind[1456]: Session 29 logged out. Waiting for processes to exit. May 9 02:06:09.059097 systemd-logind[1456]: Removed session 29. May 9 02:06:13.940568 containerd[1478]: time="2025-05-09T02:06:13.940284217Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"2f44ecb46a337ed6ac65f9e33bc411f2e326bf7278d51b9472e64b4210b2035e\" pid:5751 exited_at:{seconds:1746756373 nanos:938841260}" May 9 02:06:18.559439 containerd[1478]: time="2025-05-09T02:06:18.559352607Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"64113b31cc1e8b9a4e08890fb5d6edbd880889a9081a5d160f1679350730c895\" pid:5777 exited_at:{seconds:1746756378 nanos:558844524}" May 9 02:06:20.056545 containerd[1478]: time="2025-05-09T02:06:20.056271817Z" level=warning msg="container event discarded" container=1d032d3afc92079fd72862b41d05a76af54686b0800628500edb97f5acd96d86 type=CONTAINER_CREATED_EVENT May 9 02:06:20.056545 containerd[1478]: time="2025-05-09T02:06:20.056500145Z" level=warning msg="container event discarded" container=1d032d3afc92079fd72862b41d05a76af54686b0800628500edb97f5acd96d86 type=CONTAINER_STARTED_EVENT May 9 02:06:20.091086 containerd[1478]: time="2025-05-09T02:06:20.090977661Z" level=warning msg="container event discarded" container=a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca type=CONTAINER_CREATED_EVENT May 9 02:06:20.174587 containerd[1478]: time="2025-05-09T02:06:20.174402999Z" level=warning msg="container event discarded" container=a9fd3d340bb5f7723426e733bec461949ce4e76683b53ebd390a64bffacbdeca type=CONTAINER_STARTED_EVENT May 9 02:06:21.165930 containerd[1478]: time="2025-05-09T02:06:21.165732828Z" level=warning msg="container event discarded" container=773cf86dc08c123c107ed25ebf9c43125445091cb17df67e28915d67551d2902 type=CONTAINER_CREATED_EVENT May 9 02:06:21.165930 containerd[1478]: time="2025-05-09T02:06:21.165837004Z" level=warning msg="container 
event discarded" container=773cf86dc08c123c107ed25ebf9c43125445091cb17df67e28915d67551d2902 type=CONTAINER_STARTED_EVENT May 9 02:06:23.827598 containerd[1478]: time="2025-05-09T02:06:23.827465489Z" level=warning msg="container event discarded" container=12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7 type=CONTAINER_CREATED_EVENT May 9 02:06:23.904199 containerd[1478]: time="2025-05-09T02:06:23.903976674Z" level=warning msg="container event discarded" container=12b9e547c6372b27b2a5004899c77f048dc7a1b69b3d4528987137b13d675df7 type=CONTAINER_STARTED_EVENT May 9 02:06:28.559892 containerd[1478]: time="2025-05-09T02:06:28.559746978Z" level=warning msg="container event discarded" container=2dfa7e9f3dbf457c8c4f1bd9053525e6d1ef1a93e5652fc346122531e24fc3f0 type=CONTAINER_CREATED_EVENT May 9 02:06:28.562443 containerd[1478]: time="2025-05-09T02:06:28.560052621Z" level=warning msg="container event discarded" container=2dfa7e9f3dbf457c8c4f1bd9053525e6d1ef1a93e5652fc346122531e24fc3f0 type=CONTAINER_STARTED_EVENT May 9 02:06:28.607880 containerd[1478]: time="2025-05-09T02:06:28.607709588Z" level=warning msg="container event discarded" container=835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9 type=CONTAINER_CREATED_EVENT May 9 02:06:28.607880 containerd[1478]: time="2025-05-09T02:06:28.607803055Z" level=warning msg="container event discarded" container=835f8bb95ed5f9f8ba36028371f8e59fe761942909e02f532bca43f8aa19a5f9 type=CONTAINER_STARTED_EVENT May 9 02:06:32.368024 containerd[1478]: time="2025-05-09T02:06:32.367956392Z" level=warning msg="container event discarded" container=97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8 type=CONTAINER_CREATED_EVENT May 9 02:06:32.466319 containerd[1478]: time="2025-05-09T02:06:32.466220289Z" level=warning msg="container event discarded" container=97a0f69e3a97c54ed2727589e847c86030a8bf8c8d38b783c842c2639fd368d8 type=CONTAINER_STARTED_EVENT May 9 02:06:35.457552 containerd[1478]: time="2025-05-09T02:06:35.457420762Z" level=warning msg="container event discarded" container=be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659 type=CONTAINER_CREATED_EVENT May 9 02:06:35.563092 containerd[1478]: time="2025-05-09T02:06:35.562948974Z" level=warning msg="container event discarded" container=be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659 type=CONTAINER_STARTED_EVENT May 9 02:06:37.179765 containerd[1478]: time="2025-05-09T02:06:37.179585264Z" level=warning msg="container event discarded" container=be75eef11ff24f466f6b89acdbe86e45fd732617aa9c95754255dad6e6ae8659 type=CONTAINER_STOPPED_EVENT May 9 02:06:43.971445 containerd[1478]: time="2025-05-09T02:06:43.971089127Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"b0b1efb7b90ee1f93d0c042401e21f76d973577015781799cec175c5649c927c\" pid:5811 exited_at:{seconds:1746756403 nanos:970444217}" May 9 02:06:44.765225 containerd[1478]: time="2025-05-09T02:06:44.764963580Z" level=warning msg="container event discarded" container=9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8 type=CONTAINER_CREATED_EVENT May 9 02:06:44.864937 containerd[1478]: time="2025-05-09T02:06:44.864565078Z" level=warning msg="container event discarded" container=9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8 type=CONTAINER_STARTED_EVENT May 9 02:06:46.449992 containerd[1478]: time="2025-05-09T02:06:46.449369635Z" level=info msg="TaskExit event in podsandbox 
handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"aac9a3436c79a8e66ca1ce4dd69dcdf551b800b27ee98a6b79b1175e2cfe8d8f\" pid:5834 exited_at:{seconds:1746756406 nanos:445438020}" May 9 02:06:48.565240 containerd[1478]: time="2025-05-09T02:06:48.565191178Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"7f14f01637ed205470d0326efec3d6c3fe21a78afd74b697951cda5a06fd9ffb\" pid:5862 exited_at:{seconds:1746756408 nanos:564351703}" May 9 02:06:48.709369 containerd[1478]: time="2025-05-09T02:06:48.709247017Z" level=warning msg="container event discarded" container=9b79fd63eda3c3c9985a798692bd8d4deaee06c93f6425fec7ec7448cb4c1da8 type=CONTAINER_STOPPED_EVENT May 9 02:07:00.973997 containerd[1478]: time="2025-05-09T02:07:00.973725293Z" level=warning msg="container event discarded" container=78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855 type=CONTAINER_CREATED_EVENT May 9 02:07:01.080952 containerd[1478]: time="2025-05-09T02:07:01.080828322Z" level=warning msg="container event discarded" container=78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855 type=CONTAINER_STARTED_EVENT May 9 02:07:02.862376 containerd[1478]: time="2025-05-09T02:07:02.862239037Z" level=warning msg="container event discarded" container=fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644 type=CONTAINER_CREATED_EVENT May 9 02:07:02.862376 containerd[1478]: time="2025-05-09T02:07:02.862347320Z" level=warning msg="container event discarded" container=fb151dfe640be5c47f7333b7b350a511b759f99b54ce73dcc520a0e38bcb7644 type=CONTAINER_STARTED_EVENT May 9 02:07:04.146887 containerd[1478]: time="2025-05-09T02:07:04.146617571Z" level=warning msg="container event discarded" container=29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a type=CONTAINER_CREATED_EVENT May 9 02:07:04.146887 containerd[1478]: time="2025-05-09T02:07:04.146867911Z" level=warning msg="container event discarded" container=29dd92f31f183f5303dfbe1d25e07dc3668af1ec78bef62a6b5597deb512957a type=CONTAINER_STARTED_EVENT May 9 02:07:04.209410 containerd[1478]: time="2025-05-09T02:07:04.209268163Z" level=warning msg="container event discarded" container=bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b type=CONTAINER_CREATED_EVENT May 9 02:07:04.307050 containerd[1478]: time="2025-05-09T02:07:04.306839692Z" level=warning msg="container event discarded" container=bbaa55a6fc7078546f29acb92e1a92da9c299938bf084055ce1756ea5206a44b type=CONTAINER_STARTED_EVENT May 9 02:07:04.819218 containerd[1478]: time="2025-05-09T02:07:04.819060587Z" level=warning msg="container event discarded" container=a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c type=CONTAINER_CREATED_EVENT May 9 02:07:04.819218 containerd[1478]: time="2025-05-09T02:07:04.819183998Z" level=warning msg="container event discarded" container=a0ef5a1028e41c53ff21764d051526c7c9df2d947c165305495d295902e9e63c type=CONTAINER_STARTED_EVENT May 9 02:07:06.555830 containerd[1478]: time="2025-05-09T02:07:06.555193043Z" level=warning msg="container event discarded" container=6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2 type=CONTAINER_CREATED_EVENT May 9 02:07:06.728199 containerd[1478]: time="2025-05-09T02:07:06.727965004Z" level=warning msg="container event discarded" container=6bde35bb97dae91ab2108a6779a7598499f1fc55f2d31c3e793de61a455129e2 type=CONTAINER_STARTED_EVENT May 9 02:07:10.805354 
containerd[1478]: time="2025-05-09T02:07:10.805141316Z" level=warning msg="container event discarded" container=f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2 type=CONTAINER_CREATED_EVENT May 9 02:07:10.926763 containerd[1478]: time="2025-05-09T02:07:10.926576415Z" level=warning msg="container event discarded" container=f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2 type=CONTAINER_STARTED_EVENT May 9 02:07:13.452717 containerd[1478]: time="2025-05-09T02:07:13.451744069Z" level=warning msg="container event discarded" container=76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b type=CONTAINER_CREATED_EVENT May 9 02:07:13.609301 containerd[1478]: time="2025-05-09T02:07:13.609147727Z" level=warning msg="container event discarded" container=76ef1b22e3bbf85a464137ad1d25e21e8518088f5546f990c4659e34c8f5137b type=CONTAINER_STARTED_EVENT May 9 02:07:13.935019 containerd[1478]: time="2025-05-09T02:07:13.934661956Z" level=warning msg="container event discarded" container=f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb type=CONTAINER_CREATED_EVENT May 9 02:07:13.935019 containerd[1478]: time="2025-05-09T02:07:13.934880626Z" level=warning msg="container event discarded" container=f9460feca0c1ec2a6bfff989ba1b37fb307f7f104a391a8ca29367de25333ebb type=CONTAINER_STARTED_EVENT May 9 02:07:13.957864 containerd[1478]: time="2025-05-09T02:07:13.957769211Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"0e62492bcb68c78d326facf1515cbec19937a2346fed5b4e0d30eaebec891dff\" pid:5887 exited_at:{seconds:1746756433 nanos:956310014}" May 9 02:07:14.769140 containerd[1478]: time="2025-05-09T02:07:14.768993602Z" level=warning msg="container event discarded" container=e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824 type=CONTAINER_CREATED_EVENT May 9 02:07:14.769140 containerd[1478]: time="2025-05-09T02:07:14.769101024Z" level=warning msg="container event discarded" container=e60e37d20174f5c60de586f54ea8e880a4f15ac5ba0cecab1a559dc6a290a824 type=CONTAINER_STARTED_EVENT May 9 02:07:15.827583 containerd[1478]: time="2025-05-09T02:07:15.827207002Z" level=warning msg="container event discarded" container=e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4 type=CONTAINER_CREATED_EVENT May 9 02:07:15.827583 containerd[1478]: time="2025-05-09T02:07:15.827367633Z" level=warning msg="container event discarded" container=e5b6ac799b4b03b42aa9ff8400fda87b1d5767dd5dc4fd1922c8031290c3bac4 type=CONTAINER_STARTED_EVENT May 9 02:07:15.877951 containerd[1478]: time="2025-05-09T02:07:15.877801677Z" level=warning msg="container event discarded" container=15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5 type=CONTAINER_CREATED_EVENT May 9 02:07:16.015686 containerd[1478]: time="2025-05-09T02:07:16.015012957Z" level=warning msg="container event discarded" container=15c99df04ebef533acf903641fbd27a0169809cbd430592bc552dd69102208e5 type=CONTAINER_STARTED_EVENT May 9 02:07:18.599948 containerd[1478]: time="2025-05-09T02:07:18.599655049Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"09d943f9936dc0d749cc883e1c7bf4c2479053728e79300c6c08c22a5d0a6b4e\" pid:5912 exited_at:{seconds:1746756438 nanos:592054652}" May 9 02:07:20.929607 containerd[1478]: time="2025-05-09T02:07:20.929337398Z" level=warning msg="container event discarded" 
container=8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74 type=CONTAINER_CREATED_EVENT May 9 02:07:21.151892 containerd[1478]: time="2025-05-09T02:07:21.151619830Z" level=warning msg="container event discarded" container=8f2a1b2f25d7ee0cd2e79f6d05f64eff706be3f443e03c8af42cdff8a2ed2d74 type=CONTAINER_STARTED_EVENT May 9 02:07:21.486501 containerd[1478]: time="2025-05-09T02:07:21.486312216Z" level=warning msg="container event discarded" container=8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d type=CONTAINER_CREATED_EVENT May 9 02:07:21.666954 containerd[1478]: time="2025-05-09T02:07:21.666815368Z" level=warning msg="container event discarded" container=8c2ccfbd241927610a8629971cf4affac189c278b8be6263ec42a873d2e8a92d type=CONTAINER_STARTED_EVENT May 9 02:07:43.974037 containerd[1478]: time="2025-05-09T02:07:43.973737323Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78e485d3df0ab8b17c510cfa33b5e8a9115eb9f128c818574154977fa2041855\" id:\"04f401ac2b83aa4bee6bc91867ee6f55b7fc4bf3c4e77cf23deb8ebc3dbee13d\" pid:5942 exited_at:{seconds:1746756463 nanos:972587917}" May 9 02:07:46.429080 containerd[1478]: time="2025-05-09T02:07:46.429012421Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"991528542010071c1988852d9968dcfbcecb30ed6983d5b1693251e0b787c72c\" pid:5966 exited_at:{seconds:1746756466 nanos:428535397}" May 9 02:07:48.571031 containerd[1478]: time="2025-05-09T02:07:48.570851134Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92d606d3e964f9fc41ea22bb7d5530320df7a67baa22b6c252eac02cb94ffa2\" id:\"9bbaf92eabf2f368d19397dcb45be1d0f11c84f3cefc036f9be1b3e2b4d7b0f2\" pid:5990 exited_at:{seconds:1746756468 nanos:569655862}"
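[Editor's note] The containerd entries in this section are of two kinds: info-level "TaskExit event in podsandbox handler" records carrying a container_id, exec id, pid, and exited_at timestamp, and warning-level "container event discarded" records carrying a container hash and an event type. The sketch below is an illustrative, standalone Python tally of the discarded-event warnings by type; the field layout is taken directly from the lines above, and the parser is not a containerd API.

import re
from collections import Counter

# Matches warning lines such as:
#   ... level=warning msg="container event discarded" container=<id> type=CONTAINER_CREATED_EVENT
DISCARDED_RE = re.compile(
    r'msg="container event discarded" container=(?P<container>\w+) type=(?P<type>\w+)'
)

def count_discarded(lines):
    """Tally discarded containerd events by event type."""
    counts = Counter()
    for line in lines:
        m = DISCARDED_RE.search(line)
        if m:
            counts[m.group("type")] += 1
    return counts

sample = [
    'containerd[1478]: time="2025-05-09T02:05:59.217108957Z" level=warning '
    'msg="container event discarded" container=2436a202bc4174c32efaea116f54fcc2d789fcdfdd5388329da8f36302910cf0 '
    'type=CONTAINER_CREATED_EVENT',
]
print(count_discarded(sample))  # Counter({'CONTAINER_CREATED_EVENT': 1})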