May 14 00:17:12.113626 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 13 22:08:35 -00 2025 May 14 00:17:12.113699 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130 May 14 00:17:12.113728 kernel: BIOS-provided physical RAM map: May 14 00:17:12.113750 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable May 14 00:17:12.113771 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved May 14 00:17:12.113797 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved May 14 00:17:12.113822 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable May 14 00:17:12.113844 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved May 14 00:17:12.113865 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved May 14 00:17:12.113886 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved May 14 00:17:12.113908 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable May 14 00:17:12.113929 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved May 14 00:17:12.113951 kernel: NX (Execute Disable) protection: active May 14 00:17:12.115196 kernel: APIC: Static calls initialized May 14 00:17:12.115235 kernel: SMBIOS 3.0.0 present. May 14 00:17:12.115258 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014 May 14 00:17:12.115284 kernel: Hypervisor detected: KVM May 14 00:17:12.115313 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 May 14 00:17:12.115337 kernel: kvm-clock: using sched offset of 3684840473 cycles May 14 00:17:12.115360 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns May 14 00:17:12.115390 kernel: tsc: Detected 1996.249 MHz processor May 14 00:17:12.115414 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 14 00:17:12.115459 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 14 00:17:12.115490 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000 May 14 00:17:12.115520 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs May 14 00:17:12.115551 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 14 00:17:12.115585 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000 May 14 00:17:12.115615 kernel: ACPI: Early table checksum verification disabled May 14 00:17:12.115657 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS ) May 14 00:17:12.115689 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 14 00:17:12.115713 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 14 00:17:12.115737 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 14 00:17:12.115760 kernel: ACPI: FACS 0x00000000BFFE0000 000040 May 14 00:17:12.115783 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 14 00:17:12.115807 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 
BOCHS BXPC 00000001 BXPC 00000001) May 14 00:17:12.115896 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc] May 14 00:17:12.115920 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48] May 14 00:17:12.115951 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] May 14 00:17:12.116020 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] May 14 00:17:12.116044 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] May 14 00:17:12.116077 kernel: No NUMA configuration found May 14 00:17:12.116101 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] May 14 00:17:12.116125 kernel: NODE_DATA(0) allocated [mem 0x13fff7000-0x13fffcfff] May 14 00:17:12.116150 kernel: Zone ranges: May 14 00:17:12.116179 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 14 00:17:12.116202 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 14 00:17:12.116226 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] May 14 00:17:12.116250 kernel: Movable zone start for each node May 14 00:17:12.116274 kernel: Early memory node ranges May 14 00:17:12.116298 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 14 00:17:12.116323 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] May 14 00:17:12.116354 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] May 14 00:17:12.116390 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] May 14 00:17:12.116416 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 14 00:17:12.116440 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 14 00:17:12.116465 kernel: On node 0, zone Normal: 35 pages in unavailable ranges May 14 00:17:12.116489 kernel: ACPI: PM-Timer IO Port: 0x608 May 14 00:17:12.116513 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 14 00:17:12.116537 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 14 00:17:12.116561 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 14 00:17:12.116586 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 14 00:17:12.116615 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 14 00:17:12.116639 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 14 00:17:12.116663 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 14 00:17:12.116687 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 14 00:17:12.116711 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs May 14 00:17:12.116735 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 14 00:17:12.116759 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices May 14 00:17:12.116783 kernel: Booting paravirtualized kernel on KVM May 14 00:17:12.116809 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 14 00:17:12.116837 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 14 00:17:12.116887 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576 May 14 00:17:12.116912 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152 May 14 00:17:12.116936 kernel: pcpu-alloc: [0] 0 1 May 14 00:17:12.119401 kernel: kvm-guest: PV spinlocks disabled, no host support May 14 00:17:12.119448 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr 
verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130 May 14 00:17:12.119473 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 14 00:17:12.119495 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 14 00:17:12.119526 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 14 00:17:12.119549 kernel: Fallback order for Node 0: 0 May 14 00:17:12.119570 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901 May 14 00:17:12.119593 kernel: Policy zone: Normal May 14 00:17:12.119615 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 14 00:17:12.119636 kernel: software IO TLB: area num 2. May 14 00:17:12.119660 kernel: Memory: 3962108K/4193772K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43604K init, 1468K bss, 231404K reserved, 0K cma-reserved) May 14 00:17:12.119682 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 14 00:17:12.119704 kernel: ftrace: allocating 37993 entries in 149 pages May 14 00:17:12.119730 kernel: ftrace: allocated 149 pages with 4 groups May 14 00:17:12.119752 kernel: Dynamic Preempt: voluntary May 14 00:17:12.119773 kernel: rcu: Preemptible hierarchical RCU implementation. May 14 00:17:12.119797 kernel: rcu: RCU event tracing is enabled. May 14 00:17:12.119820 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 14 00:17:12.119842 kernel: Trampoline variant of Tasks RCU enabled. May 14 00:17:12.119864 kernel: Rude variant of Tasks RCU enabled. May 14 00:17:12.119885 kernel: Tracing variant of Tasks RCU enabled. May 14 00:17:12.119907 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 14 00:17:12.119933 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 14 00:17:12.119955 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 May 14 00:17:12.120019 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 14 00:17:12.120041 kernel: Console: colour VGA+ 80x25 May 14 00:17:12.120062 kernel: printk: console [tty0] enabled May 14 00:17:12.120085 kernel: printk: console [ttyS0] enabled May 14 00:17:12.120106 kernel: ACPI: Core revision 20230628 May 14 00:17:12.120128 kernel: APIC: Switch to symmetric I/O mode setup May 14 00:17:12.120150 kernel: x2apic enabled May 14 00:17:12.120177 kernel: APIC: Switched APIC routing to: physical x2apic May 14 00:17:12.120198 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 14 00:17:12.120220 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized May 14 00:17:12.120242 kernel: Calibrating delay loop (skipped) preset value.. 
3992.49 BogoMIPS (lpj=1996249) May 14 00:17:12.120264 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 May 14 00:17:12.120286 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 May 14 00:17:12.120308 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 14 00:17:12.120330 kernel: Spectre V2 : Mitigation: Retpolines May 14 00:17:12.120352 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT May 14 00:17:12.120378 kernel: Speculative Store Bypass: Vulnerable May 14 00:17:12.120400 kernel: x86/fpu: x87 FPU will use FXSAVE May 14 00:17:12.120421 kernel: Freeing SMP alternatives memory: 32K May 14 00:17:12.120443 kernel: pid_max: default: 32768 minimum: 301 May 14 00:17:12.120478 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity May 14 00:17:12.120504 kernel: landlock: Up and running. May 14 00:17:12.120527 kernel: SELinux: Initializing. May 14 00:17:12.120550 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 14 00:17:12.120573 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 14 00:17:12.120596 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) May 14 00:17:12.120619 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 14 00:17:12.120643 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 14 00:17:12.120671 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 14 00:17:12.120694 kernel: Performance Events: AMD PMU driver. May 14 00:17:12.120717 kernel: ... version: 0 May 14 00:17:12.120740 kernel: ... bit width: 48 May 14 00:17:12.120762 kernel: ... generic registers: 4 May 14 00:17:12.120791 kernel: ... value mask: 0000ffffffffffff May 14 00:17:12.120824 kernel: ... max period: 00007fffffffffff May 14 00:17:12.120884 kernel: ... fixed-purpose events: 0 May 14 00:17:12.120922 kernel: ... event mask: 000000000000000f May 14 00:17:12.124993 kernel: signal: max sigframe size: 1440 May 14 00:17:12.125059 kernel: rcu: Hierarchical SRCU implementation. May 14 00:17:12.125075 kernel: rcu: Max phase no-delay instances is 400. May 14 00:17:12.125091 kernel: smp: Bringing up secondary CPUs ... May 14 00:17:12.125105 kernel: smpboot: x86: Booting SMP configuration: May 14 00:17:12.125128 kernel: .... 
node #0, CPUs: #1 May 14 00:17:12.125139 kernel: smp: Brought up 1 node, 2 CPUs May 14 00:17:12.125149 kernel: smpboot: Max logical packages: 2 May 14 00:17:12.125162 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) May 14 00:17:12.125176 kernel: devtmpfs: initialized May 14 00:17:12.125190 kernel: x86/mm: Memory block size: 128MB May 14 00:17:12.125205 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 14 00:17:12.125219 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 14 00:17:12.125230 kernel: pinctrl core: initialized pinctrl subsystem May 14 00:17:12.125248 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 14 00:17:12.125259 kernel: audit: initializing netlink subsys (disabled) May 14 00:17:12.125270 kernel: audit: type=2000 audit(1747181831.956:1): state=initialized audit_enabled=0 res=1 May 14 00:17:12.125281 kernel: thermal_sys: Registered thermal governor 'step_wise' May 14 00:17:12.125295 kernel: thermal_sys: Registered thermal governor 'user_space' May 14 00:17:12.125306 kernel: cpuidle: using governor menu May 14 00:17:12.125317 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 14 00:17:12.125327 kernel: dca service started, version 1.12.1 May 14 00:17:12.125338 kernel: PCI: Using configuration type 1 for base access May 14 00:17:12.125357 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. May 14 00:17:12.125372 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 14 00:17:12.125384 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 14 00:17:12.125397 kernel: ACPI: Added _OSI(Module Device) May 14 00:17:12.125409 kernel: ACPI: Added _OSI(Processor Device) May 14 00:17:12.125421 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 14 00:17:12.125434 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 14 00:17:12.125447 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 14 00:17:12.125458 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC May 14 00:17:12.125472 kernel: ACPI: Interpreter enabled May 14 00:17:12.125482 kernel: ACPI: PM: (supports S0 S3 S5) May 14 00:17:12.125493 kernel: ACPI: Using IOAPIC for interrupt routing May 14 00:17:12.125503 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 14 00:17:12.125514 kernel: PCI: Using E820 reservations for host bridge windows May 14 00:17:12.125524 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F May 14 00:17:12.125535 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 14 00:17:12.125738 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] May 14 00:17:12.125887 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] May 14 00:17:12.126060 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge May 14 00:17:12.126080 kernel: acpiphp: Slot [3] registered May 14 00:17:12.126091 kernel: acpiphp: Slot [4] registered May 14 00:17:12.126102 kernel: acpiphp: Slot [5] registered May 14 00:17:12.126116 kernel: acpiphp: Slot [6] registered May 14 00:17:12.126131 kernel: acpiphp: Slot [7] registered May 14 00:17:12.126145 kernel: acpiphp: Slot [8] registered May 14 00:17:12.126164 kernel: acpiphp: Slot [9] registered May 14 00:17:12.126178 kernel: acpiphp: Slot [10] registered May 14 00:17:12.126191 
kernel: acpiphp: Slot [11] registered May 14 00:17:12.126202 kernel: acpiphp: Slot [12] registered May 14 00:17:12.126216 kernel: acpiphp: Slot [13] registered May 14 00:17:12.126231 kernel: acpiphp: Slot [14] registered May 14 00:17:12.126245 kernel: acpiphp: Slot [15] registered May 14 00:17:12.126259 kernel: acpiphp: Slot [16] registered May 14 00:17:12.126273 kernel: acpiphp: Slot [17] registered May 14 00:17:12.126287 kernel: acpiphp: Slot [18] registered May 14 00:17:12.126304 kernel: acpiphp: Slot [19] registered May 14 00:17:12.126318 kernel: acpiphp: Slot [20] registered May 14 00:17:12.126332 kernel: acpiphp: Slot [21] registered May 14 00:17:12.126346 kernel: acpiphp: Slot [22] registered May 14 00:17:12.126360 kernel: acpiphp: Slot [23] registered May 14 00:17:12.126374 kernel: acpiphp: Slot [24] registered May 14 00:17:12.126388 kernel: acpiphp: Slot [25] registered May 14 00:17:12.126402 kernel: acpiphp: Slot [26] registered May 14 00:17:12.126416 kernel: acpiphp: Slot [27] registered May 14 00:17:12.126433 kernel: acpiphp: Slot [28] registered May 14 00:17:12.126447 kernel: acpiphp: Slot [29] registered May 14 00:17:12.126461 kernel: acpiphp: Slot [30] registered May 14 00:17:12.126474 kernel: acpiphp: Slot [31] registered May 14 00:17:12.126489 kernel: PCI host bridge to bus 0000:00 May 14 00:17:12.126646 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 14 00:17:12.126758 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] May 14 00:17:12.126853 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 14 00:17:12.126951 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 14 00:17:12.129156 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] May 14 00:17:12.129251 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 14 00:17:12.129380 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 May 14 00:17:12.129501 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 May 14 00:17:12.129623 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 May 14 00:17:12.129735 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f] May 14 00:17:12.129838 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] May 14 00:17:12.131001 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] May 14 00:17:12.131119 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] May 14 00:17:12.131219 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] May 14 00:17:12.131330 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 May 14 00:17:12.131430 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI May 14 00:17:12.131535 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB May 14 00:17:12.131647 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 May 14 00:17:12.131748 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref] May 14 00:17:12.131850 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref] May 14 00:17:12.131952 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff] May 14 00:17:12.133113 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref] May 14 00:17:12.133228 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 14 00:17:12.133360 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 May 14 00:17:12.133473 kernel: pci 
0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf] May 14 00:17:12.133584 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff] May 14 00:17:12.133696 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref] May 14 00:17:12.133808 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref] May 14 00:17:12.133932 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 May 14 00:17:12.135385 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f] May 14 00:17:12.135496 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff] May 14 00:17:12.135595 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref] May 14 00:17:12.135706 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 May 14 00:17:12.135807 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff] May 14 00:17:12.135929 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref] May 14 00:17:12.140511 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 May 14 00:17:12.140611 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f] May 14 00:17:12.140713 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff] May 14 00:17:12.140817 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref] May 14 00:17:12.140838 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 14 00:17:12.140850 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 14 00:17:12.140872 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 14 00:17:12.140884 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 14 00:17:12.140897 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 May 14 00:17:12.140911 kernel: iommu: Default domain type: Translated May 14 00:17:12.140929 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 14 00:17:12.140945 kernel: PCI: Using ACPI for IRQ routing May 14 00:17:12.140994 kernel: PCI: pci_cache_line_size set to 64 bytes May 14 00:17:12.141008 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 14 00:17:12.141021 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] May 14 00:17:12.141149 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device May 14 00:17:12.141262 kernel: pci 0000:00:02.0: vgaarb: bridge control possible May 14 00:17:12.141372 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 14 00:17:12.141388 kernel: vgaarb: loaded May 14 00:17:12.141405 kernel: clocksource: Switched to clocksource kvm-clock May 14 00:17:12.141416 kernel: VFS: Disk quotas dquot_6.6.0 May 14 00:17:12.141427 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 14 00:17:12.141438 kernel: pnp: PnP ACPI init May 14 00:17:12.141563 kernel: pnp 00:03: [dma 2] May 14 00:17:12.141582 kernel: pnp: PnP ACPI: found 5 devices May 14 00:17:12.141594 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 14 00:17:12.141605 kernel: NET: Registered PF_INET protocol family May 14 00:17:12.141620 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 14 00:17:12.141631 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 14 00:17:12.141643 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 14 00:17:12.141654 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 14 00:17:12.141665 kernel: TCP bind hash table entries: 
32768 (order: 8, 1048576 bytes, linear) May 14 00:17:12.141677 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 14 00:17:12.141688 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 14 00:17:12.141699 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 14 00:17:12.141711 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 14 00:17:12.141724 kernel: NET: Registered PF_XDP protocol family May 14 00:17:12.141831 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 14 00:17:12.141929 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 14 00:17:12.142084 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 14 00:17:12.142173 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window] May 14 00:17:12.142259 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] May 14 00:17:12.142363 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release May 14 00:17:12.142485 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 14 00:17:12.142512 kernel: PCI: CLS 0 bytes, default 64 May 14 00:17:12.142527 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 14 00:17:12.142539 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) May 14 00:17:12.142550 kernel: Initialise system trusted keyrings May 14 00:17:12.142564 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 14 00:17:12.142576 kernel: Key type asymmetric registered May 14 00:17:12.142586 kernel: Asymmetric key parser 'x509' registered May 14 00:17:12.142597 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) May 14 00:17:12.142611 kernel: io scheduler mq-deadline registered May 14 00:17:12.142625 kernel: io scheduler kyber registered May 14 00:17:12.142635 kernel: io scheduler bfq registered May 14 00:17:12.142646 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 14 00:17:12.142660 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 May 14 00:17:12.142673 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 May 14 00:17:12.142683 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 May 14 00:17:12.142694 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 May 14 00:17:12.142704 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 14 00:17:12.142715 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 14 00:17:12.142728 kernel: random: crng init done May 14 00:17:12.142738 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 14 00:17:12.142748 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 14 00:17:12.142759 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 14 00:17:12.142868 kernel: rtc_cmos 00:04: RTC can wake from S4 May 14 00:17:12.142885 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 14 00:17:12.143028 kernel: rtc_cmos 00:04: registered as rtc0 May 14 00:17:12.143148 kernel: rtc_cmos 00:04: setting system clock to 2025-05-14T00:17:11 UTC (1747181831) May 14 00:17:12.143261 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram May 14 00:17:12.143278 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 14 00:17:12.143289 kernel: NET: Registered PF_INET6 protocol family May 14 00:17:12.143299 kernel: Segment Routing with IPv6 May 14 00:17:12.143310 kernel: In-situ OAM (IOAM) with IPv6 May 14 00:17:12.143320 kernel: NET: Registered PF_PACKET 
protocol family May 14 00:17:12.143330 kernel: Key type dns_resolver registered May 14 00:17:12.143340 kernel: IPI shorthand broadcast: enabled May 14 00:17:12.143351 kernel: sched_clock: Marking stable (1013007148, 170397129)->(1223375401, -39971124) May 14 00:17:12.143368 kernel: registered taskstats version 1 May 14 00:17:12.143378 kernel: Loading compiled-in X.509 certificates May 14 00:17:12.143388 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 166efda032ca4d6e9037c569aca9b53585ee6f94' May 14 00:17:12.143399 kernel: Key type .fscrypt registered May 14 00:17:12.143409 kernel: Key type fscrypt-provisioning registered May 14 00:17:12.143419 kernel: ima: No TPM chip found, activating TPM-bypass! May 14 00:17:12.143429 kernel: ima: Allocated hash algorithm: sha1 May 14 00:17:12.143439 kernel: ima: No architecture policies found May 14 00:17:12.143451 kernel: clk: Disabling unused clocks May 14 00:17:12.143462 kernel: Freeing unused kernel image (initmem) memory: 43604K May 14 00:17:12.143472 kernel: Write protecting the kernel read-only data: 40960k May 14 00:17:12.143482 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K May 14 00:17:12.143493 kernel: Run /init as init process May 14 00:17:12.143503 kernel: with arguments: May 14 00:17:12.143513 kernel: /init May 14 00:17:12.143523 kernel: with environment: May 14 00:17:12.143533 kernel: HOME=/ May 14 00:17:12.143543 kernel: TERM=linux May 14 00:17:12.143555 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 14 00:17:12.143567 systemd[1]: Successfully made /usr/ read-only. May 14 00:17:12.143583 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 14 00:17:12.143596 systemd[1]: Detected virtualization kvm. May 14 00:17:12.143607 systemd[1]: Detected architecture x86-64. May 14 00:17:12.143618 systemd[1]: Running in initrd. May 14 00:17:12.143628 systemd[1]: No hostname configured, using default hostname. May 14 00:17:12.143642 systemd[1]: Hostname set to . May 14 00:17:12.143653 systemd[1]: Initializing machine ID from VM UUID. May 14 00:17:12.143664 systemd[1]: Queued start job for default target initrd.target. May 14 00:17:12.143676 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 14 00:17:12.143687 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 14 00:17:12.143699 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 14 00:17:12.143721 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 14 00:17:12.143734 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 14 00:17:12.143747 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 14 00:17:12.143760 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 14 00:17:12.143772 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... 
May 14 00:17:12.143783 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 14 00:17:12.143797 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 14 00:17:12.143808 systemd[1]: Reached target paths.target - Path Units. May 14 00:17:12.143820 systemd[1]: Reached target slices.target - Slice Units. May 14 00:17:12.143831 systemd[1]: Reached target swap.target - Swaps. May 14 00:17:12.143842 systemd[1]: Reached target timers.target - Timer Units. May 14 00:17:12.143854 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 14 00:17:12.143865 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 14 00:17:12.143877 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 14 00:17:12.143888 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 14 00:17:12.143903 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 14 00:17:12.143914 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 14 00:17:12.143926 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 14 00:17:12.143937 systemd[1]: Reached target sockets.target - Socket Units. May 14 00:17:12.143949 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 14 00:17:12.144020 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 14 00:17:12.144033 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 14 00:17:12.144045 systemd[1]: Starting systemd-fsck-usr.service... May 14 00:17:12.144060 systemd[1]: Starting systemd-journald.service - Journal Service... May 14 00:17:12.144071 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 14 00:17:12.144083 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 00:17:12.144094 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 14 00:17:12.144105 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 14 00:17:12.144152 systemd-journald[184]: Collecting audit messages is disabled. May 14 00:17:12.144188 systemd[1]: Finished systemd-fsck-usr.service. May 14 00:17:12.144201 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 14 00:17:12.144217 systemd-journald[184]: Journal started May 14 00:17:12.145037 systemd-journald[184]: Runtime Journal (/run/log/journal/28e76fc085494fe6ac31dbf7f1deefb2) is 8M, max 78.2M, 70.2M free. May 14 00:17:12.145092 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 14 00:17:12.109504 systemd-modules-load[186]: Inserted module 'overlay' May 14 00:17:12.181876 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 14 00:17:12.181902 kernel: Bridge firewalling registered May 14 00:17:12.158208 systemd-modules-load[186]: Inserted module 'br_netfilter' May 14 00:17:12.184987 systemd[1]: Started systemd-journald.service - Journal Service. May 14 00:17:12.185596 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 14 00:17:12.186338 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 14 00:17:12.190509 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
May 14 00:17:12.193153 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 14 00:17:12.194332 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 14 00:17:12.200756 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 14 00:17:12.214209 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 14 00:17:12.222405 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 14 00:17:12.227947 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 14 00:17:12.231400 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 14 00:17:12.232157 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 00:17:12.242806 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 14 00:17:12.259936 dracut-cmdline[221]: dracut-dracut-053 May 14 00:17:12.263017 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130 May 14 00:17:12.277375 systemd-resolved[220]: Positive Trust Anchors: May 14 00:17:12.278101 systemd-resolved[220]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 14 00:17:12.278853 systemd-resolved[220]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 14 00:17:12.284521 systemd-resolved[220]: Defaulting to hostname 'linux'. May 14 00:17:12.285494 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 14 00:17:12.286092 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 14 00:17:12.334022 kernel: SCSI subsystem initialized May 14 00:17:12.344025 kernel: Loading iSCSI transport class v2.0-870. May 14 00:17:12.356065 kernel: iscsi: registered transport (tcp) May 14 00:17:12.380091 kernel: iscsi: registered transport (qla4xxx) May 14 00:17:12.380174 kernel: QLogic iSCSI HBA Driver May 14 00:17:12.442225 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 14 00:17:12.446406 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 14 00:17:12.502773 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
May 14 00:17:12.502839 kernel: device-mapper: uevent: version 1.0.3 May 14 00:17:12.504909 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 14 00:17:12.566096 kernel: raid6: sse2x4 gen() 5206 MB/s May 14 00:17:12.584034 kernel: raid6: sse2x2 gen() 8176 MB/s May 14 00:17:12.602463 kernel: raid6: sse2x1 gen() 9720 MB/s May 14 00:17:12.602542 kernel: raid6: using algorithm sse2x1 gen() 9720 MB/s May 14 00:17:12.621363 kernel: raid6: .... xor() 7193 MB/s, rmw enabled May 14 00:17:12.621437 kernel: raid6: using ssse3x2 recovery algorithm May 14 00:17:12.644662 kernel: xor: measuring software checksum speed May 14 00:17:12.644767 kernel: prefetch64-sse : 18524 MB/sec May 14 00:17:12.644813 kernel: generic_sse : 16881 MB/sec May 14 00:17:12.646314 kernel: xor: using function: prefetch64-sse (18524 MB/sec) May 14 00:17:12.826639 kernel: Btrfs loaded, zoned=no, fsverity=no May 14 00:17:12.843643 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 14 00:17:12.849943 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 14 00:17:12.875164 systemd-udevd[404]: Using default interface naming scheme 'v255'. May 14 00:17:12.880081 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 14 00:17:12.888356 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 14 00:17:12.916013 dracut-pre-trigger[416]: rd.md=0: removing MD RAID activation May 14 00:17:12.966340 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 14 00:17:12.972132 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 14 00:17:13.027294 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 14 00:17:13.036194 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 14 00:17:13.095097 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 14 00:17:13.098507 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 14 00:17:13.100318 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 14 00:17:13.102024 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 14 00:17:13.106095 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 14 00:17:13.127135 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues May 14 00:17:13.127470 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 14 00:17:13.135515 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB) May 14 00:17:13.153526 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 14 00:17:13.153573 kernel: GPT:17805311 != 20971519 May 14 00:17:13.153587 kernel: GPT:Alternate GPT header not at the end of the disk. May 14 00:17:13.153601 kernel: GPT:17805311 != 20971519 May 14 00:17:13.153613 kernel: GPT: Use GNU Parted to correct GPT errors. May 14 00:17:13.153625 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 14 00:17:13.169874 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 14 00:17:13.170892 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 00:17:13.171590 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
May 14 00:17:13.173727 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 14 00:17:13.173865 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 14 00:17:13.175493 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 14 00:17:13.178270 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 00:17:13.181023 kernel: libata version 3.00 loaded. May 14 00:17:13.180487 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 14 00:17:13.195050 kernel: ata_piix 0000:00:01.1: version 2.13 May 14 00:17:13.200538 kernel: scsi host0: ata_piix May 14 00:17:13.200709 kernel: scsi host1: ata_piix May 14 00:17:13.204992 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 May 14 00:17:13.205021 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 May 14 00:17:13.220006 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (468) May 14 00:17:13.220986 kernel: BTRFS: device fsid d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (466) May 14 00:17:13.260333 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 14 00:17:13.272584 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 14 00:17:13.284700 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 14 00:17:13.294057 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 14 00:17:13.294668 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 14 00:17:13.307326 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 14 00:17:13.311076 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 14 00:17:13.313873 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 14 00:17:13.333844 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 00:17:13.337276 disk-uuid[512]: Primary Header is updated. May 14 00:17:13.337276 disk-uuid[512]: Secondary Entries is updated. May 14 00:17:13.337276 disk-uuid[512]: Secondary Header is updated. May 14 00:17:13.343996 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 14 00:17:14.360321 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 14 00:17:14.363631 disk-uuid[520]: The operation has completed successfully. May 14 00:17:14.442379 systemd[1]: disk-uuid.service: Deactivated successfully. May 14 00:17:14.442505 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 14 00:17:14.492020 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 14 00:17:14.513456 sh[531]: Success May 14 00:17:14.536063 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3" May 14 00:17:14.617125 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 14 00:17:14.629157 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 14 00:17:14.632115 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 14 00:17:14.661075 kernel: BTRFS info (device dm-0): first mount of filesystem d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8 May 14 00:17:14.661170 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 14 00:17:14.661222 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 14 00:17:14.663439 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 14 00:17:14.665157 kernel: BTRFS info (device dm-0): using free space tree May 14 00:17:14.684247 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 14 00:17:14.687106 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 14 00:17:14.688845 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 14 00:17:14.693188 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 14 00:17:14.717832 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 14 00:17:14.717887 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 14 00:17:14.722107 kernel: BTRFS info (device vda6): using free space tree May 14 00:17:14.732990 kernel: BTRFS info (device vda6): auto enabling async discard May 14 00:17:14.740011 kernel: BTRFS info (device vda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 14 00:17:14.752582 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 14 00:17:14.756105 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 14 00:17:14.830704 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 14 00:17:14.835074 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 14 00:17:14.884601 systemd-networkd[711]: lo: Link UP May 14 00:17:14.884610 systemd-networkd[711]: lo: Gained carrier May 14 00:17:14.889923 systemd-networkd[711]: Enumeration completed May 14 00:17:14.890506 systemd[1]: Started systemd-networkd.service - Network Configuration. May 14 00:17:14.891072 systemd[1]: Reached target network.target - Network. May 14 00:17:14.892012 systemd-networkd[711]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 00:17:14.892016 systemd-networkd[711]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 14 00:17:14.893937 systemd-networkd[711]: eth0: Link UP May 14 00:17:14.893941 systemd-networkd[711]: eth0: Gained carrier May 14 00:17:14.893951 systemd-networkd[711]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 00:17:14.907028 systemd-networkd[711]: eth0: DHCPv4 address 172.24.4.34/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 14 00:17:14.921831 ignition[639]: Ignition 2.20.0 May 14 00:17:14.921852 ignition[639]: Stage: fetch-offline May 14 00:17:14.921913 ignition[639]: no configs at "/usr/lib/ignition/base.d" May 14 00:17:14.921930 ignition[639]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 14 00:17:14.923978 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
May 14 00:17:14.922074 ignition[639]: parsed url from cmdline: "" May 14 00:17:14.922079 ignition[639]: no config URL provided May 14 00:17:14.922087 ignition[639]: reading system config file "/usr/lib/ignition/user.ign" May 14 00:17:14.922097 ignition[639]: no config at "/usr/lib/ignition/user.ign" May 14 00:17:14.922102 ignition[639]: failed to fetch config: resource requires networking May 14 00:17:14.922468 ignition[639]: Ignition finished successfully May 14 00:17:14.929102 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 14 00:17:14.952729 ignition[722]: Ignition 2.20.0 May 14 00:17:14.952749 ignition[722]: Stage: fetch May 14 00:17:14.953022 ignition[722]: no configs at "/usr/lib/ignition/base.d" May 14 00:17:14.953036 ignition[722]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 14 00:17:14.953135 ignition[722]: parsed url from cmdline: "" May 14 00:17:14.953140 ignition[722]: no config URL provided May 14 00:17:14.953146 ignition[722]: reading system config file "/usr/lib/ignition/user.ign" May 14 00:17:14.953157 ignition[722]: no config at "/usr/lib/ignition/user.ign" May 14 00:17:14.953284 ignition[722]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 May 14 00:17:14.953871 ignition[722]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... May 14 00:17:14.953909 ignition[722]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... May 14 00:17:15.191235 ignition[722]: GET result: OK May 14 00:17:15.191446 ignition[722]: parsing config with SHA512: 26b0eb4c3e622806d1bd4f0c26b7f3e672388ae7f823a9f0f4c3fc5999f2e1ad72792214f74253caee1f79144e2e18005b10f998e69e134d480dd12958410a2f May 14 00:17:15.202734 unknown[722]: fetched base config from "system" May 14 00:17:15.202756 unknown[722]: fetched base config from "system" May 14 00:17:15.203746 ignition[722]: fetch: fetch complete May 14 00:17:15.202774 unknown[722]: fetched user config from "openstack" May 14 00:17:15.203759 ignition[722]: fetch: fetch passed May 14 00:17:15.206320 systemd-resolved[220]: Detected conflict on linux IN A 172.24.4.34 May 14 00:17:15.203849 ignition[722]: Ignition finished successfully May 14 00:17:15.206338 systemd-resolved[220]: Hostname conflict, changing published hostname from 'linux' to 'linux2'. May 14 00:17:15.207387 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 14 00:17:15.214224 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 14 00:17:15.262904 ignition[728]: Ignition 2.20.0 May 14 00:17:15.262933 ignition[728]: Stage: kargs May 14 00:17:15.263387 ignition[728]: no configs at "/usr/lib/ignition/base.d" May 14 00:17:15.263416 ignition[728]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 14 00:17:15.268604 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 14 00:17:15.265774 ignition[728]: kargs: kargs passed May 14 00:17:15.265882 ignition[728]: Ignition finished successfully May 14 00:17:15.274255 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
May 14 00:17:15.314875 ignition[734]: Ignition 2.20.0 May 14 00:17:15.316626 ignition[734]: Stage: disks May 14 00:17:15.317107 ignition[734]: no configs at "/usr/lib/ignition/base.d" May 14 00:17:15.317135 ignition[734]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 14 00:17:15.323813 ignition[734]: disks: disks passed May 14 00:17:15.325126 ignition[734]: Ignition finished successfully May 14 00:17:15.327142 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 14 00:17:15.329419 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 14 00:17:15.331427 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 14 00:17:15.334421 systemd[1]: Reached target local-fs.target - Local File Systems. May 14 00:17:15.337383 systemd[1]: Reached target sysinit.target - System Initialization. May 14 00:17:15.339899 systemd[1]: Reached target basic.target - Basic System. May 14 00:17:15.344759 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 14 00:17:15.397723 systemd-fsck[743]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks May 14 00:17:15.410653 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 14 00:17:15.416525 systemd[1]: Mounting sysroot.mount - /sysroot... May 14 00:17:15.578988 kernel: EXT4-fs (vda9): mounted filesystem c413e98b-da35-46b1-9852-45706e1b1f52 r/w with ordered data mode. Quota mode: none. May 14 00:17:15.580936 systemd[1]: Mounted sysroot.mount - /sysroot. May 14 00:17:15.583193 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 14 00:17:15.587485 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 14 00:17:15.598864 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 14 00:17:15.600442 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 14 00:17:15.603081 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... May 14 00:17:15.604660 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 14 00:17:15.604691 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 14 00:17:15.636627 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (751) May 14 00:17:15.636678 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 14 00:17:15.636711 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 14 00:17:15.636741 kernel: BTRFS info (device vda6): using free space tree May 14 00:17:15.636771 kernel: BTRFS info (device vda6): auto enabling async discard May 14 00:17:15.635379 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 14 00:17:15.641120 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 14 00:17:15.652517 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 14 00:17:15.726789 initrd-setup-root[778]: cut: /sysroot/etc/passwd: No such file or directory May 14 00:17:15.733635 initrd-setup-root[786]: cut: /sysroot/etc/group: No such file or directory May 14 00:17:15.740155 initrd-setup-root[793]: cut: /sysroot/etc/shadow: No such file or directory May 14 00:17:15.746298 initrd-setup-root[800]: cut: /sysroot/etc/gshadow: No such file or directory May 14 00:17:15.851518 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 14 00:17:15.854342 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 14 00:17:15.868176 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 14 00:17:15.872811 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 14 00:17:15.875158 kernel: BTRFS info (device vda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 14 00:17:15.911152 ignition[867]: INFO : Ignition 2.20.0 May 14 00:17:15.911152 ignition[867]: INFO : Stage: mount May 14 00:17:15.911152 ignition[867]: INFO : no configs at "/usr/lib/ignition/base.d" May 14 00:17:15.911152 ignition[867]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 14 00:17:15.919050 ignition[867]: INFO : mount: mount passed May 14 00:17:15.919050 ignition[867]: INFO : Ignition finished successfully May 14 00:17:15.916357 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 14 00:17:15.927563 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 14 00:17:16.567673 systemd-networkd[711]: eth0: Gained IPv6LL May 14 00:17:22.797219 coreos-metadata[753]: May 14 00:17:22.796 WARN failed to locate config-drive, using the metadata service API instead May 14 00:17:22.837785 coreos-metadata[753]: May 14 00:17:22.837 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 14 00:17:22.852814 coreos-metadata[753]: May 14 00:17:22.852 INFO Fetch successful May 14 00:17:22.854304 coreos-metadata[753]: May 14 00:17:22.853 INFO wrote hostname ci-4284-0-0-n-4643e7afba.novalocal to /sysroot/etc/hostname May 14 00:17:22.858436 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. May 14 00:17:22.858706 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. May 14 00:17:22.866569 systemd[1]: Starting ignition-files.service - Ignition (files)... May 14 00:17:22.908097 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 14 00:17:22.940220 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (884) May 14 00:17:22.947757 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 14 00:17:22.947831 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 14 00:17:22.954297 kernel: BTRFS info (device vda6): using free space tree May 14 00:17:22.964223 kernel: BTRFS info (device vda6): auto enabling async discard May 14 00:17:22.969133 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 14 00:17:23.027700 ignition[902]: INFO : Ignition 2.20.0 May 14 00:17:23.027700 ignition[902]: INFO : Stage: files May 14 00:17:23.031017 ignition[902]: INFO : no configs at "/usr/lib/ignition/base.d" May 14 00:17:23.031017 ignition[902]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 14 00:17:23.031017 ignition[902]: DEBUG : files: compiled without relabeling support, skipping May 14 00:17:23.037002 ignition[902]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 14 00:17:23.037002 ignition[902]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 14 00:17:23.037002 ignition[902]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 14 00:17:23.037002 ignition[902]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 14 00:17:23.037002 ignition[902]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 14 00:17:23.036574 unknown[902]: wrote ssh authorized keys file for user: core May 14 00:17:23.048239 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 14 00:17:23.048239 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 14 00:17:23.117690 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 14 00:17:23.457226 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 14 00:17:23.457226 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 14 00:17:23.462409 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 May 14 00:17:24.226668 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 14 00:17:26.633310 ignition[902]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 14 00:17:26.633310 ignition[902]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 14 00:17:26.691246 ignition[902]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 14 00:17:26.693645 ignition[902]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 14 00:17:26.693645 ignition[902]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 14 00:17:26.693645 ignition[902]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 14 00:17:26.693645 ignition[902]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 14 00:17:26.693645 ignition[902]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 14 00:17:26.693645 ignition[902]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 14 00:17:26.693645 ignition[902]: INFO : files: files passed May 14 00:17:26.693645 ignition[902]: INFO : Ignition finished successfully May 14 00:17:26.695436 systemd[1]: Finished ignition-files.service - Ignition (files). May 14 00:17:26.704320 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 14 00:17:26.713201 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 14 00:17:26.743187 initrd-setup-root-after-ignition[929]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 14 00:17:26.745688 initrd-setup-root-after-ignition[933]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 14 00:17:26.748583 initrd-setup-root-after-ignition[929]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 14 00:17:26.748753 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 14 00:17:26.752733 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 14 00:17:26.758505 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 14 00:17:26.762287 systemd[1]: ignition-quench.service: Deactivated successfully. May 14 00:17:26.762486 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 14 00:17:26.827515 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 14 00:17:26.827739 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 14 00:17:26.840521 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
May 14 00:17:26.842808 systemd[1]: Reached target initrd.target - Initrd Default Target. May 14 00:17:26.845668 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 14 00:17:26.848223 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 14 00:17:26.893068 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 14 00:17:26.898269 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 14 00:17:26.938327 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 14 00:17:26.941806 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 14 00:17:26.943544 systemd[1]: Stopped target timers.target - Timer Units. May 14 00:17:26.946360 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 14 00:17:26.946662 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 14 00:17:26.950724 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 14 00:17:26.952624 systemd[1]: Stopped target basic.target - Basic System. May 14 00:17:26.955840 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 14 00:17:26.958351 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 14 00:17:26.960680 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 14 00:17:26.963581 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 14 00:17:26.966403 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 14 00:17:26.969337 systemd[1]: Stopped target sysinit.target - System Initialization. May 14 00:17:26.972066 systemd[1]: Stopped target local-fs.target - Local File Systems. May 14 00:17:26.974894 systemd[1]: Stopped target swap.target - Swaps. May 14 00:17:26.977505 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 14 00:17:26.977819 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 14 00:17:26.981214 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 14 00:17:26.984122 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 14 00:17:26.986801 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 14 00:17:26.987094 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 14 00:17:26.989763 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 14 00:17:26.990194 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 14 00:17:26.993462 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 14 00:17:26.993883 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 14 00:17:26.997366 systemd[1]: ignition-files.service: Deactivated successfully. May 14 00:17:26.997642 systemd[1]: Stopped ignition-files.service - Ignition (files). May 14 00:17:27.003428 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 14 00:17:27.012727 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 14 00:17:27.015557 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 14 00:17:27.017647 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 14 00:17:27.019829 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
May 14 00:17:27.020162 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 14 00:17:27.027931 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 14 00:17:27.028624 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 14 00:17:27.043450 ignition[955]: INFO : Ignition 2.20.0 May 14 00:17:27.043450 ignition[955]: INFO : Stage: umount May 14 00:17:27.043450 ignition[955]: INFO : no configs at "/usr/lib/ignition/base.d" May 14 00:17:27.043450 ignition[955]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 14 00:17:27.043450 ignition[955]: INFO : umount: umount passed May 14 00:17:27.043450 ignition[955]: INFO : Ignition finished successfully May 14 00:17:27.045011 systemd[1]: ignition-mount.service: Deactivated successfully. May 14 00:17:27.045120 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 14 00:17:27.046724 systemd[1]: ignition-disks.service: Deactivated successfully. May 14 00:17:27.046795 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 14 00:17:27.049049 systemd[1]: ignition-kargs.service: Deactivated successfully. May 14 00:17:27.049092 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 14 00:17:27.049585 systemd[1]: ignition-fetch.service: Deactivated successfully. May 14 00:17:27.049623 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 14 00:17:27.050164 systemd[1]: Stopped target network.target - Network. May 14 00:17:27.050630 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 14 00:17:27.050675 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 14 00:17:27.053037 systemd[1]: Stopped target paths.target - Path Units. May 14 00:17:27.053552 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 14 00:17:27.056998 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 14 00:17:27.057553 systemd[1]: Stopped target slices.target - Slice Units. May 14 00:17:27.058775 systemd[1]: Stopped target sockets.target - Socket Units. May 14 00:17:27.060030 systemd[1]: iscsid.socket: Deactivated successfully. May 14 00:17:27.060068 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 14 00:17:27.061104 systemd[1]: iscsiuio.socket: Deactivated successfully. May 14 00:17:27.061135 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 14 00:17:27.062294 systemd[1]: ignition-setup.service: Deactivated successfully. May 14 00:17:27.062336 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 14 00:17:27.063514 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 14 00:17:27.063558 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 14 00:17:27.064660 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 14 00:17:27.065854 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 14 00:17:27.068084 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 14 00:17:27.068680 systemd[1]: sysroot-boot.service: Deactivated successfully. May 14 00:17:27.068762 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 14 00:17:27.069951 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 14 00:17:27.070054 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 14 00:17:27.073276 systemd[1]: systemd-resolved.service: Deactivated successfully. 
May 14 00:17:27.073476 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 14 00:17:27.076678 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 14 00:17:27.076883 systemd[1]: systemd-networkd.service: Deactivated successfully. May 14 00:17:27.077004 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 14 00:17:27.078640 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 14 00:17:27.079318 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 14 00:17:27.079499 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 14 00:17:27.082054 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 14 00:17:27.084096 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 14 00:17:27.084153 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 14 00:17:27.089333 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 14 00:17:27.089381 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 14 00:17:27.091096 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 14 00:17:27.091208 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 14 00:17:27.094249 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 14 00:17:27.094295 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 14 00:17:27.095856 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 14 00:17:27.097686 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 14 00:17:27.097752 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 14 00:17:27.106280 systemd[1]: systemd-udevd.service: Deactivated successfully. May 14 00:17:27.106954 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 14 00:17:27.108408 systemd[1]: network-cleanup.service: Deactivated successfully. May 14 00:17:27.108506 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 14 00:17:27.109996 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 14 00:17:27.110059 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 14 00:17:27.111256 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 14 00:17:27.111287 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 14 00:17:27.112414 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 14 00:17:27.112457 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 14 00:17:27.114034 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 14 00:17:27.114077 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 14 00:17:27.115023 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 14 00:17:27.115067 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 00:17:27.118057 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 14 00:17:27.119185 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 14 00:17:27.119235 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
May 14 00:17:27.121435 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 14 00:17:27.121479 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 14 00:17:27.122549 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 14 00:17:27.122591 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 14 00:17:27.123756 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 14 00:17:27.123798 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 14 00:17:27.126595 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 14 00:17:27.126656 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 14 00:17:27.132365 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 14 00:17:27.132478 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 14 00:17:27.134084 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 14 00:17:27.136150 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 14 00:17:27.153787 systemd[1]: Switching root. May 14 00:17:27.190072 systemd-journald[184]: Journal stopped May 14 00:17:28.929926 systemd-journald[184]: Received SIGTERM from PID 1 (systemd). May 14 00:17:28.930000 kernel: SELinux: policy capability network_peer_controls=1 May 14 00:17:28.930021 kernel: SELinux: policy capability open_perms=1 May 14 00:17:28.930034 kernel: SELinux: policy capability extended_socket_class=1 May 14 00:17:28.930049 kernel: SELinux: policy capability always_check_network=0 May 14 00:17:28.930061 kernel: SELinux: policy capability cgroup_seclabel=1 May 14 00:17:28.930074 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 14 00:17:28.930086 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 14 00:17:28.930101 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 14 00:17:28.930119 systemd[1]: Successfully loaded SELinux policy in 75.908ms. May 14 00:17:28.930142 kernel: audit: type=1403 audit(1747181847.656:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 14 00:17:28.930155 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 26.518ms. May 14 00:17:28.930170 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 14 00:17:28.930184 systemd[1]: Detected virtualization kvm. May 14 00:17:28.930201 systemd[1]: Detected architecture x86-64. May 14 00:17:28.930214 systemd[1]: Detected first boot. May 14 00:17:28.930230 systemd[1]: Hostname set to . May 14 00:17:28.930244 systemd[1]: Initializing machine ID from VM UUID. May 14 00:17:28.930258 zram_generator::config[1000]: No configuration found. May 14 00:17:28.930273 kernel: Guest personality initialized and is inactive May 14 00:17:28.930285 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 14 00:17:28.930297 kernel: Initialized host personality May 14 00:17:28.930309 kernel: NET: Registered PF_VSOCK protocol family May 14 00:17:28.930321 systemd[1]: Populated /etc with preset unit settings. 
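The first-boot messages above (SELinux policy load, KVM detection, machine ID initialized from the VM UUID) can be cross-checked from a shell once the system is up, for example:

# Post-boot checks matching the messages above.
systemd-detect-virt                    # prints "kvm", matching "Detected virtualization kvm"
cat /etc/machine-id                    # the machine ID PID 1 initialized from the VM UUID
journalctl -b -k | grep -i selinux     # the SELinux policy-capability lines from this boot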
May 14 00:17:28.930341 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 14 00:17:28.930356 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 14 00:17:28.930369 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 14 00:17:28.930382 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 14 00:17:28.930395 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 14 00:17:28.930409 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 14 00:17:28.930422 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 14 00:17:28.930435 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 14 00:17:28.930448 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 14 00:17:28.930463 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 14 00:17:28.930478 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 14 00:17:28.930492 systemd[1]: Created slice user.slice - User and Session Slice. May 14 00:17:28.930505 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 14 00:17:28.930520 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 14 00:17:28.930533 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 14 00:17:28.930545 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 14 00:17:28.930562 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 14 00:17:28.930575 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 14 00:17:28.930591 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 14 00:17:28.930603 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 14 00:17:28.930615 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 14 00:17:28.930627 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 14 00:17:28.930641 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 14 00:17:28.930653 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 14 00:17:28.930668 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 14 00:17:28.930680 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 14 00:17:28.930692 systemd[1]: Reached target slices.target - Slice Units. May 14 00:17:28.930704 systemd[1]: Reached target swap.target - Swaps. May 14 00:17:28.930716 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 14 00:17:28.930728 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 14 00:17:28.930741 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 14 00:17:28.930753 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 14 00:17:28.930765 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 14 00:17:28.930780 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
May 14 00:17:28.930792 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 14 00:17:28.930804 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 14 00:17:28.930816 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 14 00:17:28.930828 systemd[1]: Mounting media.mount - External Media Directory... May 14 00:17:28.930841 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 00:17:28.930853 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 14 00:17:28.930866 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 14 00:17:28.930878 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 14 00:17:28.930893 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 14 00:17:28.930905 systemd[1]: Reached target machines.target - Containers. May 14 00:17:28.930917 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 14 00:17:28.930930 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 14 00:17:28.930942 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 14 00:17:28.930954 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 14 00:17:28.935049 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 14 00:17:28.935066 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 14 00:17:28.935086 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 14 00:17:28.935101 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 14 00:17:28.935115 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 14 00:17:28.935130 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 14 00:17:28.935142 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 14 00:17:28.935154 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 14 00:17:28.935167 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 14 00:17:28.935179 systemd[1]: Stopped systemd-fsck-usr.service. May 14 00:17:28.935192 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 00:17:28.935207 kernel: fuse: init (API version 7.39) May 14 00:17:28.935219 systemd[1]: Starting systemd-journald.service - Journal Service... May 14 00:17:28.935232 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 14 00:17:28.935244 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 14 00:17:28.935256 kernel: loop: module loaded May 14 00:17:28.935268 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 14 00:17:28.935281 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
May 14 00:17:28.935293 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 14 00:17:28.935308 systemd[1]: verity-setup.service: Deactivated successfully. May 14 00:17:28.935321 systemd[1]: Stopped verity-setup.service. May 14 00:17:28.935333 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 00:17:28.935348 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 14 00:17:28.935361 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 14 00:17:28.935373 systemd[1]: Mounted media.mount - External Media Directory. May 14 00:17:28.935385 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 14 00:17:28.935398 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 14 00:17:28.935410 kernel: ACPI: bus type drm_connector registered May 14 00:17:28.935421 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 14 00:17:28.935436 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 14 00:17:28.935448 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 14 00:17:28.935460 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 14 00:17:28.935472 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 14 00:17:28.935502 systemd-journald[1101]: Collecting audit messages is disabled. May 14 00:17:28.935527 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 14 00:17:28.935540 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 14 00:17:28.935553 systemd[1]: modprobe@drm.service: Deactivated successfully. May 14 00:17:28.935568 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 14 00:17:28.935580 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 14 00:17:28.935593 systemd-journald[1101]: Journal started May 14 00:17:28.935617 systemd-journald[1101]: Runtime Journal (/run/log/journal/28e76fc085494fe6ac31dbf7f1deefb2) is 8M, max 78.2M, 70.2M free. May 14 00:17:28.536647 systemd[1]: Queued start job for default target multi-user.target. May 14 00:17:28.549121 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. May 14 00:17:28.937996 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 14 00:17:28.549625 systemd[1]: systemd-journald.service: Deactivated successfully. May 14 00:17:28.942018 systemd[1]: Started systemd-journald.service - Journal Service. May 14 00:17:28.941942 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 14 00:17:28.942721 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 14 00:17:28.943613 systemd[1]: modprobe@loop.service: Deactivated successfully. May 14 00:17:28.943856 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 14 00:17:28.944700 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 14 00:17:28.945681 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 14 00:17:28.946545 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 14 00:17:28.947507 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 14 00:17:28.962059 systemd[1]: Reached target network-pre.target - Preparation for Network. 
May 14 00:17:28.965059 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 14 00:17:28.969015 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 14 00:17:28.970026 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 14 00:17:28.970060 systemd[1]: Reached target local-fs.target - Local File Systems. May 14 00:17:28.972027 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 14 00:17:28.978193 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 14 00:17:28.981070 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 14 00:17:28.981669 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 14 00:17:28.984127 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 14 00:17:28.987564 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 14 00:17:28.988169 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 14 00:17:28.992093 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 14 00:17:28.992843 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 14 00:17:28.994124 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 14 00:17:28.997215 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 14 00:17:28.999219 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 14 00:17:29.002534 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 14 00:17:29.007880 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 14 00:17:29.009255 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 14 00:17:29.011366 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 14 00:17:29.024598 systemd-journald[1101]: Time spent on flushing to /var/log/journal/28e76fc085494fe6ac31dbf7f1deefb2 is 42.752ms for 962 entries. May 14 00:17:29.024598 systemd-journald[1101]: System Journal (/var/log/journal/28e76fc085494fe6ac31dbf7f1deefb2) is 8M, max 584.8M, 576.8M free. May 14 00:17:29.084282 systemd-journald[1101]: Received client request to flush runtime journal. May 14 00:17:29.084319 kernel: loop0: detected capacity change from 0 to 210664 May 14 00:17:29.024119 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... May 14 00:17:29.038827 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 14 00:17:29.049552 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 14 00:17:29.050353 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 14 00:17:29.053165 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 14 00:17:29.073583 udevadm[1146]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. 
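The journald entries above report the runtime journal size and the time spent flushing to /var/log/journal; the same information can be queried after boot, e.g.:

journalctl --disk-usage                            # total size of journal files on disk
journalctl -b -u systemd-journal-flush.service     # replay the flush reported above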
May 14 00:17:29.086776 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 14 00:17:29.091421 systemd-tmpfiles[1140]: ACLs are not supported, ignoring. May 14 00:17:29.091438 systemd-tmpfiles[1140]: ACLs are not supported, ignoring. May 14 00:17:29.098494 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 14 00:17:29.100333 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 14 00:17:29.147146 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 14 00:17:29.152769 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 14 00:17:29.178378 kernel: loop1: detected capacity change from 0 to 8 May 14 00:17:29.188026 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 14 00:17:29.193129 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 14 00:17:29.205006 kernel: loop2: detected capacity change from 0 to 151640 May 14 00:17:29.227922 systemd-tmpfiles[1162]: ACLs are not supported, ignoring. May 14 00:17:29.227942 systemd-tmpfiles[1162]: ACLs are not supported, ignoring. May 14 00:17:29.235393 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 14 00:17:29.267119 kernel: loop3: detected capacity change from 0 to 109808 May 14 00:17:29.331996 kernel: loop4: detected capacity change from 0 to 210664 May 14 00:17:29.399189 kernel: loop5: detected capacity change from 0 to 8 May 14 00:17:29.402016 kernel: loop6: detected capacity change from 0 to 151640 May 14 00:17:29.453074 kernel: loop7: detected capacity change from 0 to 109808 May 14 00:17:29.490079 (sd-merge)[1168]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. May 14 00:17:29.490555 (sd-merge)[1168]: Merged extensions into '/usr'. May 14 00:17:29.500464 systemd[1]: Reload requested from client PID 1139 ('systemd-sysext') (unit systemd-sysext.service)... May 14 00:17:29.500573 systemd[1]: Reloading... May 14 00:17:29.628699 zram_generator::config[1193]: No configuration found. May 14 00:17:29.784759 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 00:17:29.868468 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 14 00:17:29.868780 systemd[1]: Reloading finished in 366 ms. May 14 00:17:29.888860 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 14 00:17:29.896328 systemd[1]: Starting ensure-sysext.service... May 14 00:17:29.900374 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 14 00:17:29.934366 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 14 00:17:29.939159 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 14 00:17:29.939913 systemd[1]: Reload requested from client PID 1252 ('systemctl') (unit ensure-sysext.service)... May 14 00:17:29.939931 systemd[1]: Reloading... May 14 00:17:29.951761 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 14 00:17:29.952046 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. 
May 14 00:17:29.952884 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 14 00:17:29.957972 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. May 14 00:17:29.958156 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. May 14 00:17:29.973008 systemd-tmpfiles[1253]: Detected autofs mount point /boot during canonicalization of boot. May 14 00:17:29.973018 systemd-tmpfiles[1253]: Skipping /boot May 14 00:17:29.992693 systemd-tmpfiles[1253]: Detected autofs mount point /boot during canonicalization of boot. May 14 00:17:29.992707 systemd-tmpfiles[1253]: Skipping /boot May 14 00:17:30.027816 systemd-udevd[1256]: Using default interface naming scheme 'v255'. May 14 00:17:30.031582 ldconfig[1134]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 14 00:17:30.047018 zram_generator::config[1284]: No configuration found. May 14 00:17:30.198979 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 May 14 00:17:30.203291 kernel: ACPI: button: Power Button [PWRF] May 14 00:17:30.227992 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1320) May 14 00:17:30.263056 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 00:17:30.264984 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 May 14 00:17:30.330124 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 14 00:17:30.362994 kernel: mousedev: PS/2 mouse device common for all mice May 14 00:17:30.373507 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 May 14 00:17:30.373573 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console May 14 00:17:30.378459 kernel: Console: switching to colour dummy device 80x25 May 14 00:17:30.381751 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 14 00:17:30.381811 kernel: [drm] features: -context_init May 14 00:17:30.388730 kernel: [drm] number of scanouts: 1 May 14 00:17:30.390085 kernel: [drm] number of cap sets: 0 May 14 00:17:30.396978 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0 May 14 00:17:30.397029 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device May 14 00:17:30.397051 kernel: Console: switching to colour frame buffer device 160x50 May 14 00:17:30.399569 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 14 00:17:30.401942 systemd[1]: Reloading finished in 461 ms. May 14 00:17:30.409021 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device May 14 00:17:30.409999 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 14 00:17:30.410406 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 14 00:17:30.416586 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 14 00:17:30.447454 systemd[1]: Finished ensure-sysext.service. May 14 00:17:30.453406 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 14 00:17:30.474316 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
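The (sd-merge) lines above are systemd-sysext overlaying the extension images Ignition placed on disk ('containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack') onto /usr and /opt. The merge can be inspected or redone with the standard systemd-sysext verbs:

systemd-sysext list       # extension images found under /etc/extensions and /var/lib/extensions
systemd-sysext status     # which hierarchies (/usr, /opt) currently carry an overlay
systemd-sysext refresh    # unmerge and re-merge after adding or removing an image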
May 14 00:17:30.479089 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 00:17:30.480272 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 14 00:17:30.490647 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 14 00:17:30.490884 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 14 00:17:30.494079 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 14 00:17:30.497071 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 14 00:17:30.500061 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 14 00:17:30.505238 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 14 00:17:30.508248 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 14 00:17:30.510171 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 14 00:17:30.513186 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 14 00:17:30.514560 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 00:17:30.518151 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 14 00:17:30.524082 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 14 00:17:30.524208 lvm[1377]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 14 00:17:30.528138 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 14 00:17:30.534085 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 14 00:17:30.539206 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 14 00:17:30.552608 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 00:17:30.552711 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 00:17:30.557533 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 14 00:17:30.558525 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 14 00:17:30.558857 systemd[1]: modprobe@drm.service: Deactivated successfully. May 14 00:17:30.559042 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 14 00:17:30.562013 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 14 00:17:30.568739 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 14 00:17:30.576097 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... May 14 00:17:30.584770 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 14 00:17:30.592114 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 14 00:17:30.592317 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 14 00:17:30.593821 systemd[1]: modprobe@loop.service: Deactivated successfully. 
May 14 00:17:30.595906 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 14 00:17:30.596878 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 14 00:17:30.602511 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 14 00:17:30.602591 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 14 00:17:30.618040 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 14 00:17:30.621249 lvm[1400]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 14 00:17:30.632459 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 14 00:17:30.638269 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 14 00:17:30.656614 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 14 00:17:30.660988 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 14 00:17:30.666784 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 14 00:17:30.669634 augenrules[1423]: No rules May 14 00:17:30.671520 systemd[1]: audit-rules.service: Deactivated successfully. May 14 00:17:30.672614 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 14 00:17:30.678551 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 14 00:17:30.684455 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 14 00:17:30.754476 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 14 00:17:30.776780 systemd-networkd[1390]: lo: Link UP May 14 00:17:30.776789 systemd-networkd[1390]: lo: Gained carrier May 14 00:17:30.778619 systemd-networkd[1390]: Enumeration completed May 14 00:17:30.778727 systemd[1]: Started systemd-networkd.service - Network Configuration. May 14 00:17:30.779120 systemd-networkd[1390]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 00:17:30.779181 systemd-networkd[1390]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 14 00:17:30.779815 systemd-networkd[1390]: eth0: Link UP May 14 00:17:30.779879 systemd-networkd[1390]: eth0: Gained carrier May 14 00:17:30.779945 systemd-networkd[1390]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 14 00:17:30.784147 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 14 00:17:30.788174 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 14 00:17:30.793023 systemd-networkd[1390]: eth0: DHCPv4 address 172.24.4.34/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 14 00:17:30.810176 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 14 00:17:30.820043 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 14 00:17:30.821830 systemd[1]: Reached target time-set.target - System Time Set. 
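Above, systemd-networkd matches eth0 against the stock zz-default.network profile and acquires 172.24.4.34/24 over DHCP. A minimal per-interface profile doing the same thing would look roughly like this (the file name is hypothetical; Flatcar ships only the zz-default profile by default):

# Hypothetical override equivalent to the zz-default.network match above.
cat > /etc/systemd/network/10-eth0.network <<'EOF'
[Match]
Name=eth0

[Network]
DHCP=ipv4
EOF
networkctl reload          # have systemd-networkd pick up the new profile
networkctl status eth0     # shows the 172.24.4.34/24 lease and gateway 172.24.4.1 seen in the log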
May 14 00:17:30.822849 systemd-resolved[1391]: Positive Trust Anchors: May 14 00:17:30.823143 systemd-resolved[1391]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 14 00:17:30.823235 systemd-resolved[1391]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 14 00:17:30.830530 systemd-resolved[1391]: Using system hostname 'ci-4284-0-0-n-4643e7afba.novalocal'. May 14 00:17:30.832156 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 14 00:17:30.833473 systemd[1]: Reached target network.target - Network. May 14 00:17:30.835156 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 14 00:17:30.836431 systemd[1]: Reached target sysinit.target - System Initialization. May 14 00:17:30.838009 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 14 00:17:30.839685 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 14 00:17:30.842289 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 14 00:17:30.842917 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 14 00:17:30.844416 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 14 00:17:30.845999 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 14 00:17:30.846105 systemd[1]: Reached target paths.target - Path Units. May 14 00:17:30.847552 systemd[1]: Reached target timers.target - Timer Units. May 14 00:17:30.850604 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 14 00:17:30.853498 systemd[1]: Starting docker.socket - Docker Socket for the API... May 14 00:17:30.860697 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 14 00:17:30.862371 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 14 00:17:30.864393 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 14 00:17:30.878571 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 14 00:17:30.879602 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 14 00:17:30.880905 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 14 00:17:30.884211 systemd[1]: Reached target sockets.target - Socket Units. May 14 00:17:30.884723 systemd[1]: Reached target basic.target - Basic System. May 14 00:17:30.885303 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 14 00:17:30.885336 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 14 00:17:30.888034 systemd[1]: Starting containerd.service - containerd container runtime... 
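The resolver state logged above (DNSSEC trust anchor, negative trust anchors, chosen system hostname) is visible through the usual query tools:

resolvectl status          # per-link DNS configuration and the trust anchors listed above
hostnamectl                # reports the ci-4284-0-0-n-4643e7afba.novalocal hostname selected above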
May 14 00:17:30.892796 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 14 00:17:30.901104 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 14 00:17:30.905140 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 14 00:17:30.910131 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 14 00:17:30.914083 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 14 00:17:30.917760 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 14 00:17:30.922103 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 14 00:17:30.933800 jq[1454]: false May 14 00:17:30.934282 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 14 00:17:30.938247 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 14 00:17:30.945752 systemd[1]: Starting systemd-logind.service - User Login Management... May 14 00:17:30.947909 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 14 00:17:30.951954 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 14 00:17:30.954989 extend-filesystems[1455]: Found loop4 May 14 00:17:30.954989 extend-filesystems[1455]: Found loop5 May 14 00:17:30.962239 extend-filesystems[1455]: Found loop6 May 14 00:17:30.962239 extend-filesystems[1455]: Found loop7 May 14 00:17:30.962239 extend-filesystems[1455]: Found vda May 14 00:17:30.962239 extend-filesystems[1455]: Found vda1 May 14 00:17:30.962239 extend-filesystems[1455]: Found vda2 May 14 00:17:30.962239 extend-filesystems[1455]: Found vda3 May 14 00:17:30.962239 extend-filesystems[1455]: Found usr May 14 00:17:30.962239 extend-filesystems[1455]: Found vda4 May 14 00:17:30.962239 extend-filesystems[1455]: Found vda6 May 14 00:17:30.962239 extend-filesystems[1455]: Found vda7 May 14 00:17:30.962239 extend-filesystems[1455]: Found vda9 May 14 00:17:30.962239 extend-filesystems[1455]: Checking size of /dev/vda9 May 14 00:17:30.960665 systemd[1]: Starting update-engine.service - Update Engine... May 14 00:17:31.006417 dbus-daemon[1451]: [system] SELinux support is enabled May 14 00:17:32.428198 extend-filesystems[1455]: Resized partition /dev/vda9 May 14 00:17:30.970317 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 14 00:17:30.990212 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 14 00:17:30.990408 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 14 00:17:32.443531 update_engine[1468]: I20250514 00:17:32.421937 1468 main.cc:92] Flatcar Update Engine starting May 14 00:17:32.443531 update_engine[1468]: I20250514 00:17:32.440718 1468 update_check_scheduler.cc:74] Next update check in 7m27s May 14 00:17:30.990759 systemd[1]: motdgen.service: Deactivated successfully. May 14 00:17:32.456770 jq[1470]: true May 14 00:17:30.990914 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 14 00:17:31.002800 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 14 00:17:31.003023 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
May 14 00:17:31.007192 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 14 00:17:32.396910 systemd-timesyncd[1392]: Contacted time server 44.190.5.123:123 (0.flatcar.pool.ntp.org). May 14 00:17:32.464713 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1319) May 14 00:17:32.396963 systemd-timesyncd[1392]: Initial clock synchronization to Wed 2025-05-14 00:17:32.396794 UTC. May 14 00:17:32.464818 extend-filesystems[1486]: resize2fs 1.47.2 (1-Jan-2025) May 14 00:17:32.397057 systemd-resolved[1391]: Clock change detected. Flushing caches. May 14 00:17:32.432062 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 14 00:17:32.432087 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 14 00:17:32.442496 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 14 00:17:32.442536 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 14 00:17:32.443273 systemd[1]: Started update-engine.service - Update Engine. May 14 00:17:32.446282 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 14 00:17:32.455476 (ntainerd)[1482]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 14 00:17:32.476042 jq[1481]: true May 14 00:17:32.558906 systemd-logind[1463]: New seat seat0. May 14 00:17:32.561262 tar[1475]: linux-amd64/helm May 14 00:17:32.564430 systemd-logind[1463]: Watching system buttons on /dev/input/event1 (Power Button) May 14 00:17:32.564453 systemd-logind[1463]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 14 00:17:32.564647 systemd[1]: Started systemd-logind.service - User Login Management. May 14 00:17:32.583316 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks May 14 00:17:32.587538 kernel: EXT4-fs (vda9): resized filesystem to 2014203 May 14 00:17:32.836433 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 14 00:17:32.844729 locksmithd[1487]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 14 00:17:32.904289 sshd_keygen[1478]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 14 00:17:32.911467 extend-filesystems[1486]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 14 00:17:32.911467 extend-filesystems[1486]: old_desc_blocks = 1, new_desc_blocks = 1 May 14 00:17:32.911467 extend-filesystems[1486]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long. May 14 00:17:32.961694 extend-filesystems[1455]: Resized filesystem in /dev/vda9 May 14 00:17:32.978528 bash[1506]: Updated "/home/core/.ssh/authorized_keys" May 14 00:17:32.912277 systemd[1]: extend-filesystems.service: Deactivated successfully. May 14 00:17:32.913850 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 14 00:17:32.947709 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 14 00:17:32.965016 systemd[1]: Starting sshkeys.service... 
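The extend-filesystems output above records an online grow of the ext4 root on /dev/vda9 from 1617920 to 2014203 4k blocks. Flatcar's own service drives this automatically; done by hand, the equivalent would be roughly the following sketch (growpart from cloud-utils is assumed to be available):

growpart /dev/vda 9        # grow partition 9 to the end of the disk (assumption: growpart is installed)
resize2fs /dev/vda9        # online-resize the mounted ext4 filesystem to fill the partition
df -h /                    # confirm the new size of /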
May 14 00:17:33.002623 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 14 00:17:33.008652 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 14 00:17:33.012689 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 14 00:17:33.025744 systemd[1]: Starting issuegen.service - Generate /run/issue... May 14 00:17:33.034873 systemd[1]: Started sshd@0-172.24.4.34:22-172.24.4.1:34946.service - OpenSSH per-connection server daemon (172.24.4.1:34946). May 14 00:17:33.057863 systemd[1]: issuegen.service: Deactivated successfully. May 14 00:17:33.058077 systemd[1]: Finished issuegen.service - Generate /run/issue. May 14 00:17:33.062864 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 14 00:17:33.105190 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 14 00:17:33.114072 systemd[1]: Started getty@tty1.service - Getty on tty1. May 14 00:17:33.121287 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 14 00:17:33.124008 systemd[1]: Reached target getty.target - Login Prompts. May 14 00:17:33.212996 containerd[1482]: time="2025-05-14T00:17:33Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 14 00:17:33.214077 containerd[1482]: time="2025-05-14T00:17:33.214048286Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 14 00:17:33.225370 containerd[1482]: time="2025-05-14T00:17:33.225329745Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.122µs" May 14 00:17:33.225487 containerd[1482]: time="2025-05-14T00:17:33.225470228Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 14 00:17:33.225569 containerd[1482]: time="2025-05-14T00:17:33.225553054Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 14 00:17:33.225801 containerd[1482]: time="2025-05-14T00:17:33.225782354Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 14 00:17:33.225874 containerd[1482]: time="2025-05-14T00:17:33.225858947Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 14 00:17:33.225983 containerd[1482]: time="2025-05-14T00:17:33.225966920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 14 00:17:33.226104 containerd[1482]: time="2025-05-14T00:17:33.226084260Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 14 00:17:33.226161 containerd[1482]: time="2025-05-14T00:17:33.226147799Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 14 00:17:33.226460 containerd[1482]: time="2025-05-14T00:17:33.226437663Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 14 00:17:33.226551 containerd[1482]: 
time="2025-05-14T00:17:33.226535176Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 14 00:17:33.226615 containerd[1482]: time="2025-05-14T00:17:33.226600318Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 14 00:17:33.226679 containerd[1482]: time="2025-05-14T00:17:33.226664548Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 14 00:17:33.227053 containerd[1482]: time="2025-05-14T00:17:33.227033119Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 14 00:17:33.227326 containerd[1482]: time="2025-05-14T00:17:33.227305651Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 14 00:17:33.227421 containerd[1482]: time="2025-05-14T00:17:33.227402282Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 14 00:17:33.227481 containerd[1482]: time="2025-05-14T00:17:33.227467103Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 14 00:17:33.227587 containerd[1482]: time="2025-05-14T00:17:33.227566850Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 14 00:17:33.227893 containerd[1482]: time="2025-05-14T00:17:33.227874147Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 14 00:17:33.228145 containerd[1482]: time="2025-05-14T00:17:33.228027545Z" level=info msg="metadata content store policy set" policy=shared May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238771094Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238812462Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238828662Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238841656Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238854390Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238867795Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238880459Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238893163Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238904324Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 14 
00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238915605Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238926045Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.238939129Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.239040770Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 14 00:17:33.240021 containerd[1482]: time="2025-05-14T00:17:33.239063452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239081867Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239100071Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239117854Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239130218Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239142621Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239153752Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239171766Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239184970Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239196101Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239250473Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239265601Z" level=info msg="Start snapshots syncer" May 14 00:17:33.240367 containerd[1482]: time="2025-05-14T00:17:33.239282904Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 14 00:17:33.240633 containerd[1482]: time="2025-05-14T00:17:33.239555225Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 14 00:17:33.240633 containerd[1482]: time="2025-05-14T00:17:33.239612071Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239680500Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239761862Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239784725Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239796788Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239808850Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239825101Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239836422Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239848034Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239869935Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 14 00:17:33.240762 containerd[1482]: 
time="2025-05-14T00:17:33.239882779Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 14 00:17:33.240762 containerd[1482]: time="2025-05-14T00:17:33.239896154Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 14 00:17:33.241645 containerd[1482]: time="2025-05-14T00:17:33.241625958Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 14 00:17:33.241756 containerd[1482]: time="2025-05-14T00:17:33.241736345Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 14 00:17:33.241830 containerd[1482]: time="2025-05-14T00:17:33.241816015Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 14 00:17:33.241907 containerd[1482]: time="2025-05-14T00:17:33.241875506Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 14 00:17:33.241972 containerd[1482]: time="2025-05-14T00:17:33.241948944Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 14 00:17:33.242093 containerd[1482]: time="2025-05-14T00:17:33.242021811Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 14 00:17:33.242093 containerd[1482]: time="2025-05-14T00:17:33.242040606Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 14 00:17:33.242093 containerd[1482]: time="2025-05-14T00:17:33.242058560Z" level=info msg="runtime interface created" May 14 00:17:33.242093 containerd[1482]: time="2025-05-14T00:17:33.242064541Z" level=info msg="created NRI interface" May 14 00:17:33.242093 containerd[1482]: time="2025-05-14T00:17:33.242073037Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 14 00:17:33.242423 containerd[1482]: time="2025-05-14T00:17:33.242230492Z" level=info msg="Connect containerd service" May 14 00:17:33.242423 containerd[1482]: time="2025-05-14T00:17:33.242271379Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 14 00:17:33.243321 containerd[1482]: time="2025-05-14T00:17:33.243301711Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 14 00:17:33.427086 containerd[1482]: time="2025-05-14T00:17:33.427030474Z" level=info msg="Start subscribing containerd event" May 14 00:17:33.427223 containerd[1482]: time="2025-05-14T00:17:33.427099384Z" level=info msg="Start recovering state" May 14 00:17:33.427223 containerd[1482]: time="2025-05-14T00:17:33.427211394Z" level=info msg="Start event monitor" May 14 00:17:33.427283 containerd[1482]: time="2025-05-14T00:17:33.427231381Z" level=info msg="Start cni network conf syncer for default" May 14 00:17:33.427283 containerd[1482]: time="2025-05-14T00:17:33.427245738Z" level=info msg="Start streaming server" May 14 00:17:33.427283 containerd[1482]: time="2025-05-14T00:17:33.427265325Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 14 00:17:33.427283 containerd[1482]: 
time="2025-05-14T00:17:33.427273971Z" level=info msg="runtime interface starting up..." May 14 00:17:33.427283 containerd[1482]: time="2025-05-14T00:17:33.427281135Z" level=info msg="starting plugins..." May 14 00:17:33.427389 containerd[1482]: time="2025-05-14T00:17:33.427295902Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 14 00:17:33.428572 containerd[1482]: time="2025-05-14T00:17:33.427526204Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 14 00:17:33.428572 containerd[1482]: time="2025-05-14T00:17:33.427579584Z" level=info msg=serving... address=/run/containerd/containerd.sock May 14 00:17:33.427728 systemd[1]: Started containerd.service - containerd container runtime. May 14 00:17:33.430448 containerd[1482]: time="2025-05-14T00:17:33.430421815Z" level=info msg="containerd successfully booted in 0.218186s" May 14 00:17:33.506918 tar[1475]: linux-amd64/LICENSE May 14 00:17:33.506918 tar[1475]: linux-amd64/README.md May 14 00:17:33.523648 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 14 00:17:34.140847 systemd-networkd[1390]: eth0: Gained IPv6LL May 14 00:17:34.145011 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 14 00:17:34.151645 systemd[1]: Reached target network-online.target - Network is Online. May 14 00:17:34.160167 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:17:34.169892 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 14 00:17:34.229684 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 14 00:17:34.241310 sshd[1533]: Accepted publickey for core from 172.24.4.1 port 34946 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:17:34.246163 sshd-session[1533]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:17:34.265930 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 14 00:17:34.272829 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 14 00:17:34.286936 systemd-logind[1463]: New session 1 of user core. May 14 00:17:34.300065 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 14 00:17:34.309397 systemd[1]: Starting user@500.service - User Manager for UID 500... May 14 00:17:34.325192 (systemd)[1575]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 14 00:17:34.329386 systemd-logind[1463]: New session c1 of user core. May 14 00:17:34.492099 systemd[1575]: Queued start job for default target default.target. May 14 00:17:34.498432 systemd[1575]: Created slice app.slice - User Application Slice. May 14 00:17:34.498460 systemd[1575]: Reached target paths.target - Paths. May 14 00:17:34.498597 systemd[1575]: Reached target timers.target - Timers. May 14 00:17:34.499844 systemd[1575]: Starting dbus.socket - D-Bus User Message Bus Socket... May 14 00:17:34.511591 systemd[1575]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 14 00:17:34.511768 systemd[1575]: Reached target sockets.target - Sockets. May 14 00:17:34.511808 systemd[1575]: Reached target basic.target - Basic System. May 14 00:17:34.511848 systemd[1575]: Reached target default.target - Main User Target. May 14 00:17:34.511873 systemd[1575]: Startup finished in 175ms. May 14 00:17:34.512224 systemd[1]: Started user@500.service - User Manager for UID 500. 
May 14 00:17:34.521785 systemd[1]: Started session-1.scope - Session 1 of User core. May 14 00:17:35.017315 systemd[1]: Started sshd@1-172.24.4.34:22-172.24.4.1:49438.service - OpenSSH per-connection server daemon (172.24.4.1:49438). May 14 00:17:35.756079 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:17:35.772112 (kubelet)[1594]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 00:17:36.235647 sshd[1586]: Accepted publickey for core from 172.24.4.1 port 49438 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:17:36.238174 sshd-session[1586]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:17:36.253764 systemd-logind[1463]: New session 2 of user core. May 14 00:17:36.265928 systemd[1]: Started session-2.scope - Session 2 of User core. May 14 00:17:36.878502 sshd[1600]: Connection closed by 172.24.4.1 port 49438 May 14 00:17:36.879777 sshd-session[1586]: pam_unix(sshd:session): session closed for user core May 14 00:17:36.896296 systemd[1]: sshd@1-172.24.4.34:22-172.24.4.1:49438.service: Deactivated successfully. May 14 00:17:36.900200 systemd[1]: session-2.scope: Deactivated successfully. May 14 00:17:36.902270 systemd-logind[1463]: Session 2 logged out. Waiting for processes to exit. May 14 00:17:36.905649 systemd[1]: Started sshd@2-172.24.4.34:22-172.24.4.1:49454.service - OpenSSH per-connection server daemon (172.24.4.1:49454). May 14 00:17:36.911310 systemd-logind[1463]: Removed session 2. May 14 00:17:37.192008 kubelet[1594]: E0514 00:17:37.191722 1594 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 00:17:37.195405 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 00:17:37.195710 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 00:17:37.196233 systemd[1]: kubelet.service: Consumed 1.945s CPU time, 243.5M memory peak. May 14 00:17:38.179112 sshd[1606]: Accepted publickey for core from 172.24.4.1 port 49454 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:17:38.181690 sshd-session[1606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:17:38.195603 systemd-logind[1463]: New session 3 of user core. May 14 00:17:38.205935 systemd[1]: Started session-3.scope - Session 3 of User core. May 14 00:17:38.278216 login[1539]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 14 00:17:38.287782 login[1540]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 14 00:17:38.292812 systemd-logind[1463]: New session 5 of user core. May 14 00:17:38.308445 systemd[1]: Started session-5.scope - Session 5 of User core. May 14 00:17:38.315497 systemd-logind[1463]: New session 4 of user core. May 14 00:17:38.323224 systemd[1]: Started session-4.scope - Session 4 of User core. May 14 00:17:38.821626 sshd[1613]: Connection closed by 172.24.4.1 port 49454 May 14 00:17:38.822622 sshd-session[1606]: pam_unix(sshd:session): session closed for user core May 14 00:17:38.829762 systemd[1]: sshd@2-172.24.4.34:22-172.24.4.1:49454.service: Deactivated successfully. 
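The kubelet failure just above (and its repeats later in this log) is the expected first-boot pattern: the unit keeps restarting until something, normally `kubeadm init` or `kubeadm join`, writes /var/lib/kubelet/config.yaml. As a hedged sketch only, a hand-written minimal KubeletConfiguration would look roughly like the following; the field values are illustrative assumptions, not what kubeadm would generate for this node.

    mkdir -p /var/lib/kubelet
    cat <<'EOF' > /var/lib/kubelet/config.yaml
    # Minimal illustrative KubeletConfiguration; kubeadm normally generates this file.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    failSwapOn: false
    EOF
    systemctl restart kubelet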
May 14 00:17:38.833243 systemd[1]: session-3.scope: Deactivated successfully. May 14 00:17:38.835072 systemd-logind[1463]: Session 3 logged out. Waiting for processes to exit. May 14 00:17:38.837879 systemd-logind[1463]: Removed session 3. May 14 00:17:39.360803 coreos-metadata[1450]: May 14 00:17:39.360 WARN failed to locate config-drive, using the metadata service API instead May 14 00:17:39.410329 coreos-metadata[1450]: May 14 00:17:39.410 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 May 14 00:17:39.595668 coreos-metadata[1450]: May 14 00:17:39.595 INFO Fetch successful May 14 00:17:39.595668 coreos-metadata[1450]: May 14 00:17:39.595 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 14 00:17:39.609857 coreos-metadata[1450]: May 14 00:17:39.609 INFO Fetch successful May 14 00:17:39.609857 coreos-metadata[1450]: May 14 00:17:39.609 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 May 14 00:17:39.622992 coreos-metadata[1450]: May 14 00:17:39.622 INFO Fetch successful May 14 00:17:39.622992 coreos-metadata[1450]: May 14 00:17:39.622 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 May 14 00:17:39.636016 coreos-metadata[1450]: May 14 00:17:39.635 INFO Fetch successful May 14 00:17:39.636016 coreos-metadata[1450]: May 14 00:17:39.635 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 May 14 00:17:39.652083 coreos-metadata[1450]: May 14 00:17:39.651 INFO Fetch successful May 14 00:17:39.652083 coreos-metadata[1450]: May 14 00:17:39.652 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 May 14 00:17:39.664949 coreos-metadata[1450]: May 14 00:17:39.664 INFO Fetch successful May 14 00:17:39.713493 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 14 00:17:39.717854 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 14 00:17:40.115090 coreos-metadata[1525]: May 14 00:17:40.114 WARN failed to locate config-drive, using the metadata service API instead May 14 00:17:40.156780 coreos-metadata[1525]: May 14 00:17:40.156 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 May 14 00:17:40.172382 coreos-metadata[1525]: May 14 00:17:40.172 INFO Fetch successful May 14 00:17:40.172382 coreos-metadata[1525]: May 14 00:17:40.172 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 May 14 00:17:40.186773 coreos-metadata[1525]: May 14 00:17:40.186 INFO Fetch successful May 14 00:17:40.194982 unknown[1525]: wrote ssh authorized keys file for user: core May 14 00:17:40.241630 update-ssh-keys[1651]: Updated "/home/core/.ssh/authorized_keys" May 14 00:17:40.242792 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 14 00:17:40.246168 systemd[1]: Finished sshkeys.service. May 14 00:17:40.251923 systemd[1]: Reached target multi-user.target - Multi-User System. May 14 00:17:40.252746 systemd[1]: Startup finished in 1.255s (kernel) + 15.814s (initrd) + 11.289s (userspace) = 28.359s. May 14 00:17:47.447107 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 14 00:17:47.451316 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:17:47.799730 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
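The coreos-metadata fetches earlier in this section fall back to the link-local metadata service because no config drive is attached. The same endpoints can be queried by hand when debugging instance metadata; the URLs below are the ones already shown in the log, and plain curl is assumed to be available on the host.

    curl -s http://169.254.169.254/openstack/2012-08-10/meta_data.json
    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key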
May 14 00:17:47.812379 (kubelet)[1663]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 00:17:47.906260 kubelet[1663]: E0514 00:17:47.906219 1663 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 00:17:47.913281 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 00:17:47.913657 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 00:17:47.914355 systemd[1]: kubelet.service: Consumed 319ms CPU time, 96.2M memory peak. May 14 00:17:48.842593 systemd[1]: Started sshd@3-172.24.4.34:22-172.24.4.1:43128.service - OpenSSH per-connection server daemon (172.24.4.1:43128). May 14 00:17:50.300049 sshd[1672]: Accepted publickey for core from 172.24.4.1 port 43128 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:17:50.302684 sshd-session[1672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:17:50.314418 systemd-logind[1463]: New session 6 of user core. May 14 00:17:50.323821 systemd[1]: Started session-6.scope - Session 6 of User core. May 14 00:17:50.943795 sshd[1674]: Connection closed by 172.24.4.1 port 43128 May 14 00:17:50.946198 sshd-session[1672]: pam_unix(sshd:session): session closed for user core May 14 00:17:50.961810 systemd[1]: sshd@3-172.24.4.34:22-172.24.4.1:43128.service: Deactivated successfully. May 14 00:17:50.964924 systemd[1]: session-6.scope: Deactivated successfully. May 14 00:17:50.968788 systemd-logind[1463]: Session 6 logged out. Waiting for processes to exit. May 14 00:17:50.971125 systemd[1]: Started sshd@4-172.24.4.34:22-172.24.4.1:43130.service - OpenSSH per-connection server daemon (172.24.4.1:43130). May 14 00:17:50.974906 systemd-logind[1463]: Removed session 6. May 14 00:17:52.296223 sshd[1679]: Accepted publickey for core from 172.24.4.1 port 43130 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:17:52.299808 sshd-session[1679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:17:52.312617 systemd-logind[1463]: New session 7 of user core. May 14 00:17:52.318845 systemd[1]: Started session-7.scope - Session 7 of User core. May 14 00:17:53.081217 sshd[1682]: Connection closed by 172.24.4.1 port 43130 May 14 00:17:53.080007 sshd-session[1679]: pam_unix(sshd:session): session closed for user core May 14 00:17:53.097913 systemd[1]: sshd@4-172.24.4.34:22-172.24.4.1:43130.service: Deactivated successfully. May 14 00:17:53.102066 systemd[1]: session-7.scope: Deactivated successfully. May 14 00:17:53.105905 systemd-logind[1463]: Session 7 logged out. Waiting for processes to exit. May 14 00:17:53.110280 systemd[1]: Started sshd@5-172.24.4.34:22-172.24.4.1:43132.service - OpenSSH per-connection server daemon (172.24.4.1:43132). May 14 00:17:53.114405 systemd-logind[1463]: Removed session 7. May 14 00:17:54.432448 sshd[1687]: Accepted publickey for core from 172.24.4.1 port 43132 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:17:54.435557 sshd-session[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:17:54.447209 systemd-logind[1463]: New session 8 of user core. 
May 14 00:17:54.460910 systemd[1]: Started session-8.scope - Session 8 of User core. May 14 00:17:55.074565 sshd[1690]: Connection closed by 172.24.4.1 port 43132 May 14 00:17:55.075383 sshd-session[1687]: pam_unix(sshd:session): session closed for user core May 14 00:17:55.091369 systemd[1]: sshd@5-172.24.4.34:22-172.24.4.1:43132.service: Deactivated successfully. May 14 00:17:55.095488 systemd[1]: session-8.scope: Deactivated successfully. May 14 00:17:55.098120 systemd-logind[1463]: Session 8 logged out. Waiting for processes to exit. May 14 00:17:55.102866 systemd[1]: Started sshd@6-172.24.4.34:22-172.24.4.1:56580.service - OpenSSH per-connection server daemon (172.24.4.1:56580). May 14 00:17:55.105815 systemd-logind[1463]: Removed session 8. May 14 00:17:56.391478 sshd[1695]: Accepted publickey for core from 172.24.4.1 port 56580 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:17:56.394707 sshd-session[1695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:17:56.408645 systemd-logind[1463]: New session 9 of user core. May 14 00:17:56.415843 systemd[1]: Started session-9.scope - Session 9 of User core. May 14 00:17:56.890733 sudo[1699]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 14 00:17:56.891365 sudo[1699]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 00:17:56.914564 sudo[1699]: pam_unix(sudo:session): session closed for user root May 14 00:17:57.166668 sshd[1698]: Connection closed by 172.24.4.1 port 56580 May 14 00:17:57.167917 sshd-session[1695]: pam_unix(sshd:session): session closed for user core May 14 00:17:57.184653 systemd[1]: sshd@6-172.24.4.34:22-172.24.4.1:56580.service: Deactivated successfully. May 14 00:17:57.188811 systemd[1]: session-9.scope: Deactivated successfully. May 14 00:17:57.192900 systemd-logind[1463]: Session 9 logged out. Waiting for processes to exit. May 14 00:17:57.196308 systemd[1]: Started sshd@7-172.24.4.34:22-172.24.4.1:56590.service - OpenSSH per-connection server daemon (172.24.4.1:56590). May 14 00:17:57.199621 systemd-logind[1463]: Removed session 9. May 14 00:17:58.165029 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 14 00:17:58.169846 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:17:58.501970 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:17:58.517046 (kubelet)[1715]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 00:17:58.561046 sshd[1704]: Accepted publickey for core from 172.24.4.1 port 56590 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:17:58.563803 sshd-session[1704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:17:58.578652 systemd-logind[1463]: New session 10 of user core. May 14 00:17:58.584924 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 14 00:17:58.615409 kubelet[1715]: E0514 00:17:58.615373 1715 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 00:17:58.619251 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 00:17:58.619570 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 00:17:58.620081 systemd[1]: kubelet.service: Consumed 301ms CPU time, 96.2M memory peak. May 14 00:17:59.067786 sudo[1725]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 14 00:17:59.068407 sudo[1725]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 00:17:59.075393 sudo[1725]: pam_unix(sudo:session): session closed for user root May 14 00:17:59.086558 sudo[1724]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 14 00:17:59.087181 sudo[1724]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 00:17:59.108121 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 14 00:17:59.181269 augenrules[1747]: No rules May 14 00:17:59.182772 systemd[1]: audit-rules.service: Deactivated successfully. May 14 00:17:59.183207 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 14 00:17:59.185624 sudo[1724]: pam_unix(sudo:session): session closed for user root May 14 00:17:59.374263 sshd[1722]: Connection closed by 172.24.4.1 port 56590 May 14 00:17:59.375280 sshd-session[1704]: pam_unix(sshd:session): session closed for user core May 14 00:17:59.391336 systemd[1]: sshd@7-172.24.4.34:22-172.24.4.1:56590.service: Deactivated successfully. May 14 00:17:59.394480 systemd[1]: session-10.scope: Deactivated successfully. May 14 00:17:59.397799 systemd-logind[1463]: Session 10 logged out. Waiting for processes to exit. May 14 00:17:59.402045 systemd[1]: Started sshd@8-172.24.4.34:22-172.24.4.1:56600.service - OpenSSH per-connection server daemon (172.24.4.1:56600). May 14 00:17:59.405273 systemd-logind[1463]: Removed session 10. May 14 00:18:00.633698 sshd[1755]: Accepted publickey for core from 172.24.4.1 port 56600 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:18:00.636803 sshd-session[1755]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:18:00.649631 systemd-logind[1463]: New session 11 of user core. May 14 00:18:00.660879 systemd[1]: Started session-11.scope - Session 11 of User core. May 14 00:18:01.060391 sudo[1759]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 14 00:18:01.061138 sudo[1759]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 00:18:01.696267 systemd[1]: Starting docker.service - Docker Application Container Engine... 
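The audit-rules sequence above deletes the default rule fragments and reloads an empty rule set, which is why augenrules reports "No rules". Restoring rules later would follow the same augenrules flow; the file name and watch rule below are purely illustrative, not the content that was removed.

    cat <<'EOF' > /etc/audit/rules.d/99-default.rules
    -D
    -b 8192
    -w /etc/kubernetes/ -p wa -k kubernetes-config
    EOF
    augenrules --load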
May 14 00:18:01.710897 (dockerd)[1778]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 14 00:18:02.285632 dockerd[1778]: time="2025-05-14T00:18:02.285488635Z" level=info msg="Starting up" May 14 00:18:02.288817 dockerd[1778]: time="2025-05-14T00:18:02.288769839Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 14 00:18:02.362134 systemd[1]: var-lib-docker-metacopy\x2dcheck2202972260-merged.mount: Deactivated successfully. May 14 00:18:02.420386 dockerd[1778]: time="2025-05-14T00:18:02.420311556Z" level=info msg="Loading containers: start." May 14 00:18:02.623559 kernel: Initializing XFRM netlink socket May 14 00:18:02.708667 systemd-networkd[1390]: docker0: Link UP May 14 00:18:02.858183 dockerd[1778]: time="2025-05-14T00:18:02.858064371Z" level=info msg="Loading containers: done." May 14 00:18:02.891578 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck949775960-merged.mount: Deactivated successfully. May 14 00:18:03.103762 dockerd[1778]: time="2025-05-14T00:18:03.103548887Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 14 00:18:03.103762 dockerd[1778]: time="2025-05-14T00:18:03.103716421Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 14 00:18:03.104016 dockerd[1778]: time="2025-05-14T00:18:03.103930062Z" level=info msg="Daemon has completed initialization" May 14 00:18:03.189213 dockerd[1778]: time="2025-05-14T00:18:03.189060955Z" level=info msg="API listen on /run/docker.sock" May 14 00:18:03.189804 systemd[1]: Started docker.service - Docker Application Container Engine. May 14 00:18:05.066288 containerd[1482]: time="2025-05-14T00:18:05.066164808Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\"" May 14 00:18:05.781658 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1072724213.mount: Deactivated successfully. 
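The kube-apiserver image pull above is driven by containerd's CRI plugin rather than by the Docker daemon that just started. The same pulls can be inspected or pre-seeded by hand over the CRI socket logged earlier; the crictl invocations are standard, but treat the exact endpoint as an assumption for this host.

    crictl --runtime-endpoint unix:///run/containerd/containerd.sock images
    crictl --runtime-endpoint unix:///run/containerd/containerd.sock pull registry.k8s.io/kube-apiserver:v1.30.12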
May 14 00:18:07.968966 containerd[1482]: time="2025-05-14T00:18:07.968920882Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:07.970964 containerd[1482]: time="2025-05-14T00:18:07.970886884Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.12: active requests=0, bytes read=32674881" May 14 00:18:07.971714 containerd[1482]: time="2025-05-14T00:18:07.971670340Z" level=info msg="ImageCreate event name:\"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:07.975051 containerd[1482]: time="2025-05-14T00:18:07.975009809Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:07.976435 containerd[1482]: time="2025-05-14T00:18:07.976250556Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.12\" with image id \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\", size \"32671673\" in 2.910027659s" May 14 00:18:07.976435 containerd[1482]: time="2025-05-14T00:18:07.976303376Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\" returns image reference \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\"" May 14 00:18:07.995613 containerd[1482]: time="2025-05-14T00:18:07.995352885Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\"" May 14 00:18:08.741933 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 14 00:18:08.746890 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:18:08.902751 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:18:08.911782 (kubelet)[2053]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 00:18:09.035826 kubelet[2053]: E0514 00:18:09.035717 2053 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 00:18:09.038974 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 00:18:09.039122 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 00:18:09.039422 systemd[1]: kubelet.service: Consumed 186ms CPU time, 93.6M memory peak. 
May 14 00:18:10.224835 containerd[1482]: time="2025-05-14T00:18:10.224617543Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:10.226542 containerd[1482]: time="2025-05-14T00:18:10.226175715Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.12: active requests=0, bytes read=29617542" May 14 00:18:10.228084 containerd[1482]: time="2025-05-14T00:18:10.227941238Z" level=info msg="ImageCreate event name:\"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:10.231994 containerd[1482]: time="2025-05-14T00:18:10.231940625Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:10.233970 containerd[1482]: time="2025-05-14T00:18:10.233699616Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.12\" with image id \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\", size \"31105907\" in 2.238310673s" May 14 00:18:10.233970 containerd[1482]: time="2025-05-14T00:18:10.233728260Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\" returns image reference \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\"" May 14 00:18:10.254014 containerd[1482]: time="2025-05-14T00:18:10.253974935Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\"" May 14 00:18:12.247210 containerd[1482]: time="2025-05-14T00:18:12.247142319Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:12.248553 containerd[1482]: time="2025-05-14T00:18:12.248315847Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.12: active requests=0, bytes read=17903690" May 14 00:18:12.249543 containerd[1482]: time="2025-05-14T00:18:12.249480518Z" level=info msg="ImageCreate event name:\"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:12.252592 containerd[1482]: time="2025-05-14T00:18:12.252520206Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:12.253914 containerd[1482]: time="2025-05-14T00:18:12.253530076Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.12\" with image id \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\", size \"19392073\" in 1.999354323s" May 14 00:18:12.253914 containerd[1482]: time="2025-05-14T00:18:12.253560533Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\" returns image reference \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\"" May 14 00:18:12.272770 
containerd[1482]: time="2025-05-14T00:18:12.272739554Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\"" May 14 00:18:13.678659 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount279252457.mount: Deactivated successfully. May 14 00:18:14.163714 containerd[1482]: time="2025-05-14T00:18:14.163669100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:14.164823 containerd[1482]: time="2025-05-14T00:18:14.164781692Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.12: active requests=0, bytes read=29185825" May 14 00:18:14.165834 containerd[1482]: time="2025-05-14T00:18:14.165786793Z" level=info msg="ImageCreate event name:\"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:14.167891 containerd[1482]: time="2025-05-14T00:18:14.167870000Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:14.168540 containerd[1482]: time="2025-05-14T00:18:14.168435684Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.12\" with image id \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\", repo tag \"registry.k8s.io/kube-proxy:v1.30.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\", size \"29184836\" in 1.895656656s" May 14 00:18:14.168540 containerd[1482]: time="2025-05-14T00:18:14.168479145Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\" returns image reference \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\"" May 14 00:18:14.188430 containerd[1482]: time="2025-05-14T00:18:14.188396732Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 14 00:18:14.831772 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1672812588.mount: Deactivated successfully. 
May 14 00:18:16.290126 containerd[1482]: time="2025-05-14T00:18:16.289978383Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:16.292981 containerd[1482]: time="2025-05-14T00:18:16.292813724Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" May 14 00:18:16.294353 containerd[1482]: time="2025-05-14T00:18:16.294183319Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:16.300727 containerd[1482]: time="2025-05-14T00:18:16.300610701Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:16.304400 containerd[1482]: time="2025-05-14T00:18:16.304164973Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.115718227s" May 14 00:18:16.304400 containerd[1482]: time="2025-05-14T00:18:16.304237109Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" May 14 00:18:16.345750 containerd[1482]: time="2025-05-14T00:18:16.345452281Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" May 14 00:18:17.077216 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1202992240.mount: Deactivated successfully. 
May 14 00:18:17.088567 containerd[1482]: time="2025-05-14T00:18:17.088454115Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:17.090699 containerd[1482]: time="2025-05-14T00:18:17.090504339Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298" May 14 00:18:17.092363 containerd[1482]: time="2025-05-14T00:18:17.092249519Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:17.098884 containerd[1482]: time="2025-05-14T00:18:17.098747152Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:17.100775 containerd[1482]: time="2025-05-14T00:18:17.100451806Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 754.841117ms" May 14 00:18:17.100775 containerd[1482]: time="2025-05-14T00:18:17.100593884Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" May 14 00:18:17.144103 containerd[1482]: time="2025-05-14T00:18:17.143641108Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" May 14 00:18:17.901799 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3008564612.mount: Deactivated successfully. May 14 00:18:17.904468 update_engine[1468]: I20250514 00:18:17.902882 1468 update_attempter.cc:509] Updating boot flags... May 14 00:18:17.944764 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2170) May 14 00:18:18.017615 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2171) May 14 00:18:19.241651 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 14 00:18:19.247197 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:18:19.526264 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:18:19.535919 (kubelet)[2220]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 00:18:19.583151 kubelet[2220]: E0514 00:18:19.583108 2220 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 00:18:19.586245 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 00:18:19.586666 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 00:18:19.587644 systemd[1]: kubelet.service: Consumed 225ms CPU time, 95.8M memory peak. 
May 14 00:18:21.168389 containerd[1482]: time="2025-05-14T00:18:21.168245346Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:21.174555 containerd[1482]: time="2025-05-14T00:18:21.172453894Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579" May 14 00:18:21.176331 containerd[1482]: time="2025-05-14T00:18:21.176243733Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:21.190736 containerd[1482]: time="2025-05-14T00:18:21.190654587Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:21.194045 containerd[1482]: time="2025-05-14T00:18:21.193956420Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 4.050236284s" May 14 00:18:21.194247 containerd[1482]: time="2025-05-14T00:18:21.194208795Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" May 14 00:18:26.523836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:18:26.524390 systemd[1]: kubelet.service: Consumed 225ms CPU time, 95.8M memory peak. May 14 00:18:26.528884 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:18:26.564971 systemd[1]: Reload requested from client PID 2317 ('systemctl') (unit session-11.scope)... May 14 00:18:26.565155 systemd[1]: Reloading... May 14 00:18:26.674654 zram_generator::config[2362]: No configuration found. May 14 00:18:26.857821 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 00:18:26.988121 systemd[1]: Reloading finished in 422 ms. May 14 00:18:27.405879 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 14 00:18:27.406437 systemd[1]: kubelet.service: Failed with result 'signal'. May 14 00:18:27.407354 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:18:27.407484 systemd[1]: kubelet.service: Consumed 96ms CPU time, 74.3M memory peak. May 14 00:18:27.416196 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:18:28.017387 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:18:28.026955 (kubelet)[2427]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 14 00:18:28.077091 kubelet[2427]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 00:18:28.077457 kubelet[2427]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. May 14 00:18:28.077540 kubelet[2427]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 00:18:28.077673 kubelet[2427]: I0514 00:18:28.077646 2427 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 14 00:18:28.476741 kubelet[2427]: I0514 00:18:28.476653 2427 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 14 00:18:28.476872 kubelet[2427]: I0514 00:18:28.476859 2427 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 14 00:18:28.477156 kubelet[2427]: I0514 00:18:28.477139 2427 server.go:927] "Client rotation is on, will bootstrap in background" May 14 00:18:28.493871 kubelet[2427]: I0514 00:18:28.493822 2427 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 00:18:28.496400 kubelet[2427]: E0514 00:18:28.496335 2427 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.34:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.517445 kubelet[2427]: I0514 00:18:28.517392 2427 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 14 00:18:28.518705 kubelet[2427]: I0514 00:18:28.518029 2427 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 14 00:18:28.519071 kubelet[2427]: I0514 00:18:28.518113 2427 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-4643e7afba.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 14 00:18:28.519675 
kubelet[2427]: I0514 00:18:28.519634 2427 topology_manager.go:138] "Creating topology manager with none policy" May 14 00:18:28.519675 kubelet[2427]: I0514 00:18:28.519657 2427 container_manager_linux.go:301] "Creating device plugin manager" May 14 00:18:28.519853 kubelet[2427]: I0514 00:18:28.519778 2427 state_mem.go:36] "Initialized new in-memory state store" May 14 00:18:28.521408 kubelet[2427]: I0514 00:18:28.521183 2427 kubelet.go:400] "Attempting to sync node with API server" May 14 00:18:28.521408 kubelet[2427]: I0514 00:18:28.521224 2427 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 14 00:18:28.521408 kubelet[2427]: I0514 00:18:28.521249 2427 kubelet.go:312] "Adding apiserver pod source" May 14 00:18:28.521408 kubelet[2427]: I0514 00:18:28.521264 2427 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 14 00:18:28.527402 kubelet[2427]: W0514 00:18:28.526940 2427 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.34:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.527402 kubelet[2427]: E0514 00:18:28.526996 2427 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.34:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.527402 kubelet[2427]: W0514 00:18:28.527057 2427 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-4643e7afba.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.527402 kubelet[2427]: E0514 00:18:28.527092 2427 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-4643e7afba.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.527697 kubelet[2427]: I0514 00:18:28.527681 2427 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 14 00:18:28.529790 kubelet[2427]: I0514 00:18:28.529772 2427 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 14 00:18:28.530528 kubelet[2427]: W0514 00:18:28.529892 2427 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 14 00:18:28.530879 kubelet[2427]: I0514 00:18:28.530863 2427 server.go:1264] "Started kubelet" May 14 00:18:28.543830 kubelet[2427]: I0514 00:18:28.543806 2427 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 14 00:18:28.544332 kubelet[2427]: E0514 00:18:28.544124 2427 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.34:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.34:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-n-4643e7afba.novalocal.183f3cb057f85222 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-4643e7afba.novalocal,UID:ci-4284-0-0-n-4643e7afba.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-4643e7afba.novalocal,},FirstTimestamp:2025-05-14 00:18:28.530844194 +0000 UTC m=+0.500361706,LastTimestamp:2025-05-14 00:18:28.530844194 +0000 UTC m=+0.500361706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-4643e7afba.novalocal,}" May 14 00:18:28.551115 kubelet[2427]: I0514 00:18:28.550936 2427 volume_manager.go:291] "Starting Kubelet Volume Manager" May 14 00:18:28.551354 kubelet[2427]: I0514 00:18:28.551309 2427 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 14 00:18:28.553566 kubelet[2427]: I0514 00:18:28.553289 2427 server.go:455] "Adding debug handlers to kubelet server" May 14 00:18:28.554057 kubelet[2427]: I0514 00:18:28.554044 2427 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 14 00:18:28.554166 kubelet[2427]: I0514 00:18:28.554155 2427 reconciler.go:26] "Reconciler: start to sync state" May 14 00:18:28.555475 kubelet[2427]: I0514 00:18:28.555366 2427 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 14 00:18:28.555857 kubelet[2427]: I0514 00:18:28.555759 2427 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 14 00:18:28.560184 kubelet[2427]: E0514 00:18:28.559653 2427 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-4643e7afba.novalocal?timeout=10s\": dial tcp 172.24.4.34:6443: connect: connection refused" interval="200ms" May 14 00:18:28.560714 kubelet[2427]: I0514 00:18:28.560481 2427 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 14 00:18:28.563532 kubelet[2427]: E0514 00:18:28.563404 2427 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 14 00:18:28.564006 kubelet[2427]: W0514 00:18:28.563968 2427 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.564117 kubelet[2427]: E0514 00:18:28.564085 2427 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.565557 kubelet[2427]: I0514 00:18:28.564650 2427 factory.go:221] Registration of the containerd container factory successfully May 14 00:18:28.565557 kubelet[2427]: I0514 00:18:28.564663 2427 factory.go:221] Registration of the systemd container factory successfully May 14 00:18:28.580653 kubelet[2427]: I0514 00:18:28.580598 2427 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 14 00:18:28.582014 kubelet[2427]: I0514 00:18:28.581977 2427 cpu_manager.go:214] "Starting CPU manager" policy="none" May 14 00:18:28.582116 kubelet[2427]: I0514 00:18:28.582105 2427 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 14 00:18:28.582201 kubelet[2427]: I0514 00:18:28.582192 2427 state_mem.go:36] "Initialized new in-memory state store" May 14 00:18:28.582890 kubelet[2427]: I0514 00:18:28.582859 2427 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 14 00:18:28.582959 kubelet[2427]: I0514 00:18:28.582893 2427 status_manager.go:217] "Starting to sync pod status with apiserver" May 14 00:18:28.582959 kubelet[2427]: I0514 00:18:28.582918 2427 kubelet.go:2337] "Starting kubelet main sync loop" May 14 00:18:28.583014 kubelet[2427]: E0514 00:18:28.582979 2427 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 14 00:18:28.586885 kubelet[2427]: W0514 00:18:28.586790 2427 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.586885 kubelet[2427]: E0514 00:18:28.586846 2427 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:28.591135 kubelet[2427]: I0514 00:18:28.591071 2427 policy_none.go:49] "None policy: Start" May 14 00:18:28.591968 kubelet[2427]: I0514 00:18:28.591950 2427 memory_manager.go:170] "Starting memorymanager" policy="None" May 14 00:18:28.592093 kubelet[2427]: I0514 00:18:28.592078 2427 state_mem.go:35] "Initializing new in-memory state store" May 14 00:18:28.603568 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 14 00:18:28.616052 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 14 00:18:28.619885 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
May 14 00:18:28.630737 kubelet[2427]: I0514 00:18:28.630479 2427 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 14 00:18:28.630737 kubelet[2427]: I0514 00:18:28.630692 2427 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 14 00:18:28.630839 kubelet[2427]: I0514 00:18:28.630805 2427 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 14 00:18:28.633673 kubelet[2427]: E0514 00:18:28.633504 2427 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-n-4643e7afba.novalocal\" not found" May 14 00:18:28.654231 kubelet[2427]: I0514 00:18:28.654013 2427 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.654858 kubelet[2427]: E0514 00:18:28.654796 2427 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.34:6443/api/v1/nodes\": dial tcp 172.24.4.34:6443: connect: connection refused" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.684033 kubelet[2427]: I0514 00:18:28.683961 2427 topology_manager.go:215] "Topology Admit Handler" podUID="c57792f722ea2a1749272ba752d0b50e" podNamespace="kube-system" podName="kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.687437 kubelet[2427]: I0514 00:18:28.687341 2427 topology_manager.go:215] "Topology Admit Handler" podUID="7653ec825afeb202f28eec235b5bb085" podNamespace="kube-system" podName="kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.691306 kubelet[2427]: I0514 00:18:28.690988 2427 topology_manager.go:215] "Topology Admit Handler" podUID="05e121befe145ff052f2e4e0945e63a9" podNamespace="kube-system" podName="kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.711386 systemd[1]: Created slice kubepods-burstable-podc57792f722ea2a1749272ba752d0b50e.slice - libcontainer container kubepods-burstable-podc57792f722ea2a1749272ba752d0b50e.slice. May 14 00:18:28.729057 systemd[1]: Created slice kubepods-burstable-pod7653ec825afeb202f28eec235b5bb085.slice - libcontainer container kubepods-burstable-pod7653ec825afeb202f28eec235b5bb085.slice. May 14 00:18:28.750833 systemd[1]: Created slice kubepods-burstable-pod05e121befe145ff052f2e4e0945e63a9.slice - libcontainer container kubepods-burstable-pod05e121befe145ff052f2e4e0945e63a9.slice. 
May 14 00:18:28.755612 kubelet[2427]: I0514 00:18:28.754839 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.755612 kubelet[2427]: I0514 00:18:28.755271 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/05e121befe145ff052f2e4e0945e63a9-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"05e121befe145ff052f2e4e0945e63a9\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.755612 kubelet[2427]: I0514 00:18:28.755386 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c57792f722ea2a1749272ba752d0b50e-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"c57792f722ea2a1749272ba752d0b50e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.755612 kubelet[2427]: I0514 00:18:28.755477 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c57792f722ea2a1749272ba752d0b50e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"c57792f722ea2a1749272ba752d0b50e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.755987 kubelet[2427]: I0514 00:18:28.755618 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.755987 kubelet[2427]: I0514 00:18:28.755721 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.755987 kubelet[2427]: I0514 00:18:28.755815 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c57792f722ea2a1749272ba752d0b50e-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"c57792f722ea2a1749272ba752d0b50e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.755987 kubelet[2427]: I0514 00:18:28.755907 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " 
pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.756257 kubelet[2427]: I0514 00:18:28.755998 2427 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.761098 kubelet[2427]: E0514 00:18:28.760987 2427 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-4643e7afba.novalocal?timeout=10s\": dial tcp 172.24.4.34:6443: connect: connection refused" interval="400ms" May 14 00:18:28.860285 kubelet[2427]: I0514 00:18:28.859629 2427 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:28.860804 kubelet[2427]: E0514 00:18:28.860634 2427 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.34:6443/api/v1/nodes\": dial tcp 172.24.4.34:6443: connect: connection refused" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:29.027605 containerd[1482]: time="2025-05-14T00:18:29.027415712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal,Uid:c57792f722ea2a1749272ba752d0b50e,Namespace:kube-system,Attempt:0,}" May 14 00:18:29.045341 containerd[1482]: time="2025-05-14T00:18:29.045250535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal,Uid:7653ec825afeb202f28eec235b5bb085,Namespace:kube-system,Attempt:0,}" May 14 00:18:29.057825 containerd[1482]: time="2025-05-14T00:18:29.057765451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal,Uid:05e121befe145ff052f2e4e0945e63a9,Namespace:kube-system,Attempt:0,}" May 14 00:18:29.162009 kubelet[2427]: E0514 00:18:29.161896 2427 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-4643e7afba.novalocal?timeout=10s\": dial tcp 172.24.4.34:6443: connect: connection refused" interval="800ms" May 14 00:18:29.265577 kubelet[2427]: I0514 00:18:29.265188 2427 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:29.265977 kubelet[2427]: E0514 00:18:29.265887 2427 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.34:6443/api/v1/nodes\": dial tcp 172.24.4.34:6443: connect: connection refused" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:29.638946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1653635946.mount: Deactivated successfully. 
May 14 00:18:29.656253 containerd[1482]: time="2025-05-14T00:18:29.656119376Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 00:18:29.663290 containerd[1482]: time="2025-05-14T00:18:29.663187856Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 00:18:29.667718 containerd[1482]: time="2025-05-14T00:18:29.667595812Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 14 00:18:29.669122 containerd[1482]: time="2025-05-14T00:18:29.669000940Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 14 00:18:29.672233 containerd[1482]: time="2025-05-14T00:18:29.672134433Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 00:18:29.675573 containerd[1482]: time="2025-05-14T00:18:29.675391738Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 00:18:29.676200 containerd[1482]: time="2025-05-14T00:18:29.675901345Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 14 00:18:29.677149 containerd[1482]: time="2025-05-14T00:18:29.677005648Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 00:18:29.679788 containerd[1482]: time="2025-05-14T00:18:29.679627420Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 630.754884ms" May 14 00:18:29.684844 containerd[1482]: time="2025-05-14T00:18:29.684780975Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 650.953265ms" May 14 00:18:29.688987 containerd[1482]: time="2025-05-14T00:18:29.688457948Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 622.690623ms" May 14 00:18:29.702708 kubelet[2427]: W0514 00:18:29.702581 2427 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 
00:18:29.704235 kubelet[2427]: E0514 00:18:29.702881 2427 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:29.754496 containerd[1482]: time="2025-05-14T00:18:29.754422582Z" level=info msg="connecting to shim c689a5e24e8a6b32759f0be8654db09159c2f75d74fcc9e03e54edc5d0cdbf6e" address="unix:///run/containerd/s/538f68eed575942c31deb5395d221bb13ecafbb2ab369fda4c7688009daaca8f" namespace=k8s.io protocol=ttrpc version=3 May 14 00:18:29.757174 containerd[1482]: time="2025-05-14T00:18:29.756821205Z" level=info msg="connecting to shim 5ef32ffc2c038b0586a76ecbac4a90de87b81ab2a28a56e7497a3e41e79304c1" address="unix:///run/containerd/s/015167821c9991bad5bc8593974bb16b4a8195878756cb308469a8ee05d2b5da" namespace=k8s.io protocol=ttrpc version=3 May 14 00:18:29.766159 containerd[1482]: time="2025-05-14T00:18:29.766099565Z" level=info msg="connecting to shim d137d3fdd493a5ad8a0692d7de2c59e4470b224671e16ab1541c884ae3373b78" address="unix:///run/containerd/s/41a1603405a822acc2d6f2f0f3dcdc519369aed54bece876239c1f4bfe532c2c" namespace=k8s.io protocol=ttrpc version=3 May 14 00:18:29.794729 systemd[1]: Started cri-containerd-c689a5e24e8a6b32759f0be8654db09159c2f75d74fcc9e03e54edc5d0cdbf6e.scope - libcontainer container c689a5e24e8a6b32759f0be8654db09159c2f75d74fcc9e03e54edc5d0cdbf6e. May 14 00:18:29.800310 systemd[1]: Started cri-containerd-d137d3fdd493a5ad8a0692d7de2c59e4470b224671e16ab1541c884ae3373b78.scope - libcontainer container d137d3fdd493a5ad8a0692d7de2c59e4470b224671e16ab1541c884ae3373b78. May 14 00:18:29.806298 systemd[1]: Started cri-containerd-5ef32ffc2c038b0586a76ecbac4a90de87b81ab2a28a56e7497a3e41e79304c1.scope - libcontainer container 5ef32ffc2c038b0586a76ecbac4a90de87b81ab2a28a56e7497a3e41e79304c1. 
May 14 00:18:29.880629 containerd[1482]: time="2025-05-14T00:18:29.880463113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal,Uid:05e121befe145ff052f2e4e0945e63a9,Namespace:kube-system,Attempt:0,} returns sandbox id \"5ef32ffc2c038b0586a76ecbac4a90de87b81ab2a28a56e7497a3e41e79304c1\"" May 14 00:18:29.891032 containerd[1482]: time="2025-05-14T00:18:29.890423093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal,Uid:c57792f722ea2a1749272ba752d0b50e,Namespace:kube-system,Attempt:0,} returns sandbox id \"d137d3fdd493a5ad8a0692d7de2c59e4470b224671e16ab1541c884ae3373b78\"" May 14 00:18:29.891032 containerd[1482]: time="2025-05-14T00:18:29.890656491Z" level=info msg="CreateContainer within sandbox \"5ef32ffc2c038b0586a76ecbac4a90de87b81ab2a28a56e7497a3e41e79304c1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 14 00:18:29.891959 containerd[1482]: time="2025-05-14T00:18:29.891877343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal,Uid:7653ec825afeb202f28eec235b5bb085,Namespace:kube-system,Attempt:0,} returns sandbox id \"c689a5e24e8a6b32759f0be8654db09159c2f75d74fcc9e03e54edc5d0cdbf6e\"" May 14 00:18:29.904757 containerd[1482]: time="2025-05-14T00:18:29.904206430Z" level=info msg="Container 6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb: CDI devices from CRI Config.CDIDevices: []" May 14 00:18:29.908575 containerd[1482]: time="2025-05-14T00:18:29.908537121Z" level=info msg="CreateContainer within sandbox \"c689a5e24e8a6b32759f0be8654db09159c2f75d74fcc9e03e54edc5d0cdbf6e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 14 00:18:29.909267 containerd[1482]: time="2025-05-14T00:18:29.908974191Z" level=info msg="CreateContainer within sandbox \"d137d3fdd493a5ad8a0692d7de2c59e4470b224671e16ab1541c884ae3373b78\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 14 00:18:29.923780 containerd[1482]: time="2025-05-14T00:18:29.923736196Z" level=info msg="CreateContainer within sandbox \"5ef32ffc2c038b0586a76ecbac4a90de87b81ab2a28a56e7497a3e41e79304c1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb\"" May 14 00:18:29.924589 containerd[1482]: time="2025-05-14T00:18:29.924542540Z" level=info msg="StartContainer for \"6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb\"" May 14 00:18:29.926105 containerd[1482]: time="2025-05-14T00:18:29.926053195Z" level=info msg="connecting to shim 6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb" address="unix:///run/containerd/s/015167821c9991bad5bc8593974bb16b4a8195878756cb308469a8ee05d2b5da" protocol=ttrpc version=3 May 14 00:18:29.931558 containerd[1482]: time="2025-05-14T00:18:29.930020043Z" level=info msg="Container 5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a: CDI devices from CRI Config.CDIDevices: []" May 14 00:18:29.934605 containerd[1482]: time="2025-05-14T00:18:29.934475238Z" level=info msg="Container b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607: CDI devices from CRI Config.CDIDevices: []" May 14 00:18:29.944016 containerd[1482]: time="2025-05-14T00:18:29.943909289Z" level=info msg="CreateContainer within sandbox \"c689a5e24e8a6b32759f0be8654db09159c2f75d74fcc9e03e54edc5d0cdbf6e\" for 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a\"" May 14 00:18:29.946626 containerd[1482]: time="2025-05-14T00:18:29.944831311Z" level=info msg="StartContainer for \"5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a\"" May 14 00:18:29.946768 containerd[1482]: time="2025-05-14T00:18:29.946648682Z" level=info msg="connecting to shim 5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a" address="unix:///run/containerd/s/538f68eed575942c31deb5395d221bb13ecafbb2ab369fda4c7688009daaca8f" protocol=ttrpc version=3 May 14 00:18:29.958288 containerd[1482]: time="2025-05-14T00:18:29.958219645Z" level=info msg="CreateContainer within sandbox \"d137d3fdd493a5ad8a0692d7de2c59e4470b224671e16ab1541c884ae3373b78\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607\"" May 14 00:18:29.958654 containerd[1482]: time="2025-05-14T00:18:29.958620909Z" level=info msg="StartContainer for \"b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607\"" May 14 00:18:29.959791 containerd[1482]: time="2025-05-14T00:18:29.959731724Z" level=info msg="connecting to shim b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607" address="unix:///run/containerd/s/41a1603405a822acc2d6f2f0f3dcdc519369aed54bece876239c1f4bfe532c2c" protocol=ttrpc version=3 May 14 00:18:29.960862 systemd[1]: Started cri-containerd-6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb.scope - libcontainer container 6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb. May 14 00:18:29.962539 kubelet[2427]: E0514 00:18:29.962460 2427 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-4643e7afba.novalocal?timeout=10s\": dial tcp 172.24.4.34:6443: connect: connection refused" interval="1.6s" May 14 00:18:29.963847 kubelet[2427]: W0514 00:18:29.963782 2427 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-4643e7afba.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:29.963954 kubelet[2427]: E0514 00:18:29.963855 2427 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-4643e7afba.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:29.996663 systemd[1]: Started cri-containerd-5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a.scope - libcontainer container 5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a. May 14 00:18:29.998674 systemd[1]: Started cri-containerd-b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607.scope - libcontainer container b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607. 
May 14 00:18:30.031096 kubelet[2427]: W0514 00:18:30.030964 2427 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.34:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:30.031096 kubelet[2427]: E0514 00:18:30.031024 2427 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.34:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:30.038986 containerd[1482]: time="2025-05-14T00:18:30.038856661Z" level=info msg="StartContainer for \"6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb\" returns successfully" May 14 00:18:30.069150 kubelet[2427]: I0514 00:18:30.068671 2427 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:30.069150 kubelet[2427]: E0514 00:18:30.068998 2427 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.34:6443/api/v1/nodes\": dial tcp 172.24.4.34:6443: connect: connection refused" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:30.078997 containerd[1482]: time="2025-05-14T00:18:30.078902620Z" level=info msg="StartContainer for \"b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607\" returns successfully" May 14 00:18:30.118145 kubelet[2427]: W0514 00:18:30.118002 2427 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:30.118145 kubelet[2427]: E0514 00:18:30.118119 2427 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.34:6443: connect: connection refused May 14 00:18:30.126011 containerd[1482]: time="2025-05-14T00:18:30.125906761Z" level=info msg="StartContainer for \"5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a\" returns successfully" May 14 00:18:31.674531 kubelet[2427]: I0514 00:18:31.671995 2427 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:32.748380 kubelet[2427]: E0514 00:18:32.748312 2427 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-n-4643e7afba.novalocal\" not found" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:32.856891 kubelet[2427]: I0514 00:18:32.855368 2427 kubelet_node_status.go:76] "Successfully registered node" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:32.991555 kubelet[2427]: E0514 00:18:32.990614 2427 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:33.531421 kubelet[2427]: I0514 00:18:33.530111 2427 apiserver.go:52] "Watching apiserver" May 14 00:18:33.554943 kubelet[2427]: I0514 00:18:33.554851 2427 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 14 00:18:35.316092 systemd[1]: Reload requested from client PID 2698 
('systemctl') (unit session-11.scope)... May 14 00:18:35.316225 systemd[1]: Reloading... May 14 00:18:35.462617 zram_generator::config[2747]: No configuration found. May 14 00:18:35.643410 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 00:18:35.815294 systemd[1]: Reloading finished in 498 ms. May 14 00:18:35.853843 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:18:35.857525 kubelet[2427]: E0514 00:18:35.855419 2427 event.go:319] "Unable to write event (broadcaster is shut down)" event="&Event{ObjectMeta:{ci-4284-0-0-n-4643e7afba.novalocal.183f3cb057f85222 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-4643e7afba.novalocal,UID:ci-4284-0-0-n-4643e7afba.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-4643e7afba.novalocal,},FirstTimestamp:2025-05-14 00:18:28.530844194 +0000 UTC m=+0.500361706,LastTimestamp:2025-05-14 00:18:28.530844194 +0000 UTC m=+0.500361706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-4643e7afba.novalocal,}" May 14 00:18:35.875732 systemd[1]: kubelet.service: Deactivated successfully. May 14 00:18:35.876015 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:18:35.876101 systemd[1]: kubelet.service: Consumed 1.120s CPU time, 113.7M memory peak. May 14 00:18:35.882236 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 00:18:36.257327 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 00:18:36.272418 (kubelet)[2808]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 14 00:18:36.390526 kubelet[2808]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 00:18:36.390526 kubelet[2808]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 14 00:18:36.390526 kubelet[2808]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 14 00:18:36.391107 kubelet[2808]: I0514 00:18:36.390599 2808 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 14 00:18:36.398669 kubelet[2808]: I0514 00:18:36.398607 2808 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 14 00:18:36.398669 kubelet[2808]: I0514 00:18:36.398647 2808 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 14 00:18:36.398942 kubelet[2808]: I0514 00:18:36.398908 2808 server.go:927] "Client rotation is on, will bootstrap in background" May 14 00:18:36.404548 kubelet[2808]: I0514 00:18:36.403059 2808 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 14 00:18:36.405871 kubelet[2808]: I0514 00:18:36.405837 2808 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 00:18:36.423341 kubelet[2808]: I0514 00:18:36.423272 2808 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 14 00:18:36.424105 kubelet[2808]: I0514 00:18:36.424057 2808 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 14 00:18:36.424425 kubelet[2808]: I0514 00:18:36.424099 2808 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-4643e7afba.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 14 00:18:36.424425 kubelet[2808]: I0514 00:18:36.424459 2808 topology_manager.go:138] "Creating topology manager with none policy" May 14 00:18:36.424425 kubelet[2808]: I0514 00:18:36.424485 2808 container_manager_linux.go:301] "Creating device plugin manager" May 14 00:18:36.424915 kubelet[2808]: I0514 00:18:36.424663 2808 state_mem.go:36] "Initialized new in-memory state store" May 14 00:18:36.424915 kubelet[2808]: I0514 00:18:36.424855 2808 kubelet.go:400] "Attempting to sync node with API server" May 14 00:18:36.424915 kubelet[2808]: I0514 00:18:36.424880 2808 kubelet.go:301] 
"Adding static pod path" path="/etc/kubernetes/manifests" May 14 00:18:36.425058 kubelet[2808]: I0514 00:18:36.424949 2808 kubelet.go:312] "Adding apiserver pod source" May 14 00:18:36.425058 kubelet[2808]: I0514 00:18:36.424989 2808 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 14 00:18:36.430603 kubelet[2808]: I0514 00:18:36.429851 2808 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 14 00:18:36.430603 kubelet[2808]: I0514 00:18:36.430152 2808 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 14 00:18:36.431232 kubelet[2808]: I0514 00:18:36.431204 2808 server.go:1264] "Started kubelet" May 14 00:18:36.435049 kubelet[2808]: I0514 00:18:36.435021 2808 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 14 00:18:36.447218 kubelet[2808]: I0514 00:18:36.447153 2808 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 14 00:18:36.449300 kubelet[2808]: I0514 00:18:36.449189 2808 server.go:455] "Adding debug handlers to kubelet server" May 14 00:18:36.452342 kubelet[2808]: I0514 00:18:36.452259 2808 volume_manager.go:291] "Starting Kubelet Volume Manager" May 14 00:18:36.453613 kubelet[2808]: I0514 00:18:36.447437 2808 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 14 00:18:36.453921 kubelet[2808]: I0514 00:18:36.453902 2808 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 14 00:18:36.454143 kubelet[2808]: I0514 00:18:36.454104 2808 reconciler.go:26] "Reconciler: start to sync state" May 14 00:18:36.456535 kubelet[2808]: I0514 00:18:36.454851 2808 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 14 00:18:36.470100 kubelet[2808]: I0514 00:18:36.470062 2808 factory.go:221] Registration of the systemd container factory successfully May 14 00:18:36.472321 kubelet[2808]: I0514 00:18:36.472282 2808 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 14 00:18:36.486599 kubelet[2808]: I0514 00:18:36.485725 2808 factory.go:221] Registration of the containerd container factory successfully May 14 00:18:36.489547 kubelet[2808]: E0514 00:18:36.487863 2808 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 14 00:18:36.496164 kubelet[2808]: I0514 00:18:36.495843 2808 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 14 00:18:36.498459 kubelet[2808]: I0514 00:18:36.497912 2808 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 14 00:18:36.498459 kubelet[2808]: I0514 00:18:36.497968 2808 status_manager.go:217] "Starting to sync pod status with apiserver" May 14 00:18:36.498459 kubelet[2808]: I0514 00:18:36.497994 2808 kubelet.go:2337] "Starting kubelet main sync loop" May 14 00:18:36.498459 kubelet[2808]: E0514 00:18:36.498070 2808 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 14 00:18:36.558133 kubelet[2808]: I0514 00:18:36.557097 2808 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.577289 kubelet[2808]: I0514 00:18:36.576723 2808 kubelet_node_status.go:112] "Node was previously registered" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.577289 kubelet[2808]: I0514 00:18:36.576854 2808 kubelet_node_status.go:76] "Successfully registered node" node="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.580587 kubelet[2808]: I0514 00:18:36.579661 2808 cpu_manager.go:214] "Starting CPU manager" policy="none" May 14 00:18:36.580587 kubelet[2808]: I0514 00:18:36.579681 2808 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 14 00:18:36.580587 kubelet[2808]: I0514 00:18:36.579711 2808 state_mem.go:36] "Initialized new in-memory state store" May 14 00:18:36.580587 kubelet[2808]: I0514 00:18:36.579990 2808 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 14 00:18:36.580587 kubelet[2808]: I0514 00:18:36.580086 2808 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 14 00:18:36.580587 kubelet[2808]: I0514 00:18:36.580205 2808 policy_none.go:49] "None policy: Start" May 14 00:18:36.582065 kubelet[2808]: I0514 00:18:36.582034 2808 memory_manager.go:170] "Starting memorymanager" policy="None" May 14 00:18:36.582295 kubelet[2808]: I0514 00:18:36.582273 2808 state_mem.go:35] "Initializing new in-memory state store" May 14 00:18:36.583051 kubelet[2808]: I0514 00:18:36.582884 2808 state_mem.go:75] "Updated machine memory state" May 14 00:18:36.595088 kubelet[2808]: I0514 00:18:36.593453 2808 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 14 00:18:36.595983 kubelet[2808]: I0514 00:18:36.595355 2808 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 14 00:18:36.595983 kubelet[2808]: I0514 00:18:36.595468 2808 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 14 00:18:36.600871 kubelet[2808]: I0514 00:18:36.600661 2808 topology_manager.go:215] "Topology Admit Handler" podUID="05e121befe145ff052f2e4e0945e63a9" podNamespace="kube-system" podName="kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.600871 kubelet[2808]: I0514 00:18:36.600799 2808 topology_manager.go:215] "Topology Admit Handler" podUID="c57792f722ea2a1749272ba752d0b50e" podNamespace="kube-system" podName="kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.601083 kubelet[2808]: I0514 00:18:36.600905 2808 topology_manager.go:215] "Topology Admit Handler" podUID="7653ec825afeb202f28eec235b5bb085" podNamespace="kube-system" podName="kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.631490 kubelet[2808]: W0514 00:18:36.631449 2808 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:18:36.635775 kubelet[2808]: 
W0514 00:18:36.635744 2808 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:18:36.636906 kubelet[2808]: W0514 00:18:36.636594 2808 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:18:36.636906 kubelet[2808]: E0514 00:18:36.636765 2808 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.655826 kubelet[2808]: I0514 00:18:36.654916 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/05e121befe145ff052f2e4e0945e63a9-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"05e121befe145ff052f2e4e0945e63a9\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.755480 kubelet[2808]: I0514 00:18:36.755415 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.755480 kubelet[2808]: I0514 00:18:36.755463 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.755480 kubelet[2808]: I0514 00:18:36.755494 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c57792f722ea2a1749272ba752d0b50e-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"c57792f722ea2a1749272ba752d0b50e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.755793 kubelet[2808]: I0514 00:18:36.755550 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c57792f722ea2a1749272ba752d0b50e-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"c57792f722ea2a1749272ba752d0b50e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.755793 kubelet[2808]: I0514 00:18:36.755577 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c57792f722ea2a1749272ba752d0b50e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"c57792f722ea2a1749272ba752d0b50e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.755793 kubelet[2808]: I0514 00:18:36.755598 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.755793 kubelet[2808]: I0514 00:18:36.755618 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:36.755944 kubelet[2808]: I0514 00:18:36.755648 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7653ec825afeb202f28eec235b5bb085-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal\" (UID: \"7653ec825afeb202f28eec235b5bb085\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:37.426720 kubelet[2808]: I0514 00:18:37.426640 2808 apiserver.go:52] "Watching apiserver" May 14 00:18:37.455339 kubelet[2808]: I0514 00:18:37.455043 2808 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 14 00:18:37.526384 kubelet[2808]: I0514 00:18:37.525332 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" podStartSLOduration=2.525296595 podStartE2EDuration="2.525296595s" podCreationTimestamp="2025-05-14 00:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:18:37.504060353 +0000 UTC m=+1.225754256" watchObservedRunningTime="2025-05-14 00:18:37.525296595 +0000 UTC m=+1.246990428" May 14 00:18:37.543428 kubelet[2808]: W0514 00:18:37.543354 2808 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 00:18:37.543428 kubelet[2808]: E0514 00:18:37.543430 2808 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:18:37.548420 kubelet[2808]: I0514 00:18:37.548217 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4643e7afba.novalocal" podStartSLOduration=1.548200929 podStartE2EDuration="1.548200929s" podCreationTimestamp="2025-05-14 00:18:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:18:37.525900148 +0000 UTC m=+1.247594031" watchObservedRunningTime="2025-05-14 00:18:37.548200929 +0000 UTC m=+1.269894782" May 14 00:18:37.548420 kubelet[2808]: I0514 00:18:37.548438 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-n-4643e7afba.novalocal" podStartSLOduration=1.548405182 podStartE2EDuration="1.548405182s" podCreationTimestamp="2025-05-14 00:18:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-05-14 00:18:37.547647881 +0000 UTC m=+1.269341714" watchObservedRunningTime="2025-05-14 00:18:37.548405182 +0000 UTC m=+1.270099025" May 14 00:18:42.579111 sudo[1759]: pam_unix(sudo:session): session closed for user root May 14 00:18:42.859793 sshd[1758]: Connection closed by 172.24.4.1 port 56600 May 14 00:18:42.862085 sshd-session[1755]: pam_unix(sshd:session): session closed for user core May 14 00:18:42.873629 systemd[1]: sshd@8-172.24.4.34:22-172.24.4.1:56600.service: Deactivated successfully. May 14 00:18:42.881787 systemd[1]: session-11.scope: Deactivated successfully. May 14 00:18:42.882730 systemd[1]: session-11.scope: Consumed 8.489s CPU time, 250.9M memory peak. May 14 00:18:42.891066 systemd-logind[1463]: Session 11 logged out. Waiting for processes to exit. May 14 00:18:42.901161 systemd-logind[1463]: Removed session 11. May 14 00:18:51.395632 kubelet[2808]: I0514 00:18:51.395354 2808 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 14 00:18:51.399539 containerd[1482]: time="2025-05-14T00:18:51.399199359Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 14 00:18:51.399966 kubelet[2808]: I0514 00:18:51.399665 2808 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 14 00:18:51.459598 kubelet[2808]: I0514 00:18:51.459537 2808 topology_manager.go:215] "Topology Admit Handler" podUID="0c7082a2-a4b7-4445-a62a-d6ee18a3f67d" podNamespace="tigera-operator" podName="tigera-operator-797db67f8-8nq2j" May 14 00:18:51.471728 systemd[1]: Created slice kubepods-besteffort-pod0c7082a2_a4b7_4445_a62a_d6ee18a3f67d.slice - libcontainer container kubepods-besteffort-pod0c7082a2_a4b7_4445_a62a_d6ee18a3f67d.slice. 
May 14 00:18:51.482249 kubelet[2808]: W0514 00:18:51.482196 2808 reflector.go:547] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-n-4643e7afba.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-4643e7afba.novalocal' and this object May 14 00:18:51.482395 kubelet[2808]: E0514 00:18:51.482261 2808 reflector.go:150] object-"tigera-operator"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-n-4643e7afba.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-4643e7afba.novalocal' and this object May 14 00:18:51.486273 kubelet[2808]: W0514 00:18:51.486233 2808 reflector.go:547] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4284-0-0-n-4643e7afba.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-4643e7afba.novalocal' and this object May 14 00:18:51.486273 kubelet[2808]: E0514 00:18:51.486273 2808 reflector.go:150] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4284-0-0-n-4643e7afba.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-4643e7afba.novalocal' and this object May 14 00:18:51.531557 kubelet[2808]: I0514 00:18:51.529221 2808 topology_manager.go:215] "Topology Admit Handler" podUID="8c066be8-974b-4795-b73f-d49c9220134a" podNamespace="kube-system" podName="kube-proxy-6mbl5" May 14 00:18:51.540130 systemd[1]: Created slice kubepods-besteffort-pod8c066be8_974b_4795_b73f_d49c9220134a.slice - libcontainer container kubepods-besteffort-pod8c066be8_974b_4795_b73f_d49c9220134a.slice. 
May 14 00:18:51.655429 kubelet[2808]: I0514 00:18:51.655112 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0c7082a2-a4b7-4445-a62a-d6ee18a3f67d-var-lib-calico\") pod \"tigera-operator-797db67f8-8nq2j\" (UID: \"0c7082a2-a4b7-4445-a62a-d6ee18a3f67d\") " pod="tigera-operator/tigera-operator-797db67f8-8nq2j" May 14 00:18:51.655429 kubelet[2808]: I0514 00:18:51.655160 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8c066be8-974b-4795-b73f-d49c9220134a-xtables-lock\") pod \"kube-proxy-6mbl5\" (UID: \"8c066be8-974b-4795-b73f-d49c9220134a\") " pod="kube-system/kube-proxy-6mbl5" May 14 00:18:51.655429 kubelet[2808]: I0514 00:18:51.655190 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8c066be8-974b-4795-b73f-d49c9220134a-lib-modules\") pod \"kube-proxy-6mbl5\" (UID: \"8c066be8-974b-4795-b73f-d49c9220134a\") " pod="kube-system/kube-proxy-6mbl5" May 14 00:18:51.655429 kubelet[2808]: I0514 00:18:51.655214 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85g2w\" (UniqueName: \"kubernetes.io/projected/8c066be8-974b-4795-b73f-d49c9220134a-kube-api-access-85g2w\") pod \"kube-proxy-6mbl5\" (UID: \"8c066be8-974b-4795-b73f-d49c9220134a\") " pod="kube-system/kube-proxy-6mbl5" May 14 00:18:51.655429 kubelet[2808]: I0514 00:18:51.655236 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqr6\" (UniqueName: \"kubernetes.io/projected/0c7082a2-a4b7-4445-a62a-d6ee18a3f67d-kube-api-access-9fqr6\") pod \"tigera-operator-797db67f8-8nq2j\" (UID: \"0c7082a2-a4b7-4445-a62a-d6ee18a3f67d\") " pod="tigera-operator/tigera-operator-797db67f8-8nq2j" May 14 00:18:51.655749 kubelet[2808]: I0514 00:18:51.655254 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8c066be8-974b-4795-b73f-d49c9220134a-kube-proxy\") pod \"kube-proxy-6mbl5\" (UID: \"8c066be8-974b-4795-b73f-d49c9220134a\") " pod="kube-system/kube-proxy-6mbl5" May 14 00:18:51.846852 containerd[1482]: time="2025-05-14T00:18:51.846689334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6mbl5,Uid:8c066be8-974b-4795-b73f-d49c9220134a,Namespace:kube-system,Attempt:0,}" May 14 00:18:51.916871 containerd[1482]: time="2025-05-14T00:18:51.916625211Z" level=info msg="connecting to shim fcc92c2fc5b4808de8f243e8051e55099bd3e50db58bea7f392a9e873b1159ca" address="unix:///run/containerd/s/5c47dabef5da1dea13c01a191ded0bda873af55d81563968fd1c81dd521c8e03" namespace=k8s.io protocol=ttrpc version=3 May 14 00:18:51.965667 systemd[1]: Started cri-containerd-fcc92c2fc5b4808de8f243e8051e55099bd3e50db58bea7f392a9e873b1159ca.scope - libcontainer container fcc92c2fc5b4808de8f243e8051e55099bd3e50db58bea7f392a9e873b1159ca. 
May 14 00:18:51.994153 containerd[1482]: time="2025-05-14T00:18:51.994101304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6mbl5,Uid:8c066be8-974b-4795-b73f-d49c9220134a,Namespace:kube-system,Attempt:0,} returns sandbox id \"fcc92c2fc5b4808de8f243e8051e55099bd3e50db58bea7f392a9e873b1159ca\"" May 14 00:18:51.999115 containerd[1482]: time="2025-05-14T00:18:51.999075925Z" level=info msg="CreateContainer within sandbox \"fcc92c2fc5b4808de8f243e8051e55099bd3e50db58bea7f392a9e873b1159ca\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 14 00:18:52.019204 containerd[1482]: time="2025-05-14T00:18:52.017587752Z" level=info msg="Container bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1: CDI devices from CRI Config.CDIDevices: []" May 14 00:18:52.027080 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1091383538.mount: Deactivated successfully. May 14 00:18:52.038703 containerd[1482]: time="2025-05-14T00:18:52.037946825Z" level=info msg="CreateContainer within sandbox \"fcc92c2fc5b4808de8f243e8051e55099bd3e50db58bea7f392a9e873b1159ca\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1\"" May 14 00:18:52.039064 containerd[1482]: time="2025-05-14T00:18:52.039034846Z" level=info msg="StartContainer for \"bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1\"" May 14 00:18:52.040852 containerd[1482]: time="2025-05-14T00:18:52.040823241Z" level=info msg="connecting to shim bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1" address="unix:///run/containerd/s/5c47dabef5da1dea13c01a191ded0bda873af55d81563968fd1c81dd521c8e03" protocol=ttrpc version=3 May 14 00:18:52.063734 systemd[1]: Started cri-containerd-bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1.scope - libcontainer container bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1. May 14 00:18:52.115350 containerd[1482]: time="2025-05-14T00:18:52.115294705Z" level=info msg="StartContainer for \"bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1\" returns successfully" May 14 00:18:52.383723 containerd[1482]: time="2025-05-14T00:18:52.383663377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-8nq2j,Uid:0c7082a2-a4b7-4445-a62a-d6ee18a3f67d,Namespace:tigera-operator,Attempt:0,}" May 14 00:18:52.416120 containerd[1482]: time="2025-05-14T00:18:52.415565303Z" level=info msg="connecting to shim bcd4432f7d28a8b29344cb2d703d443bf39153b67f82f86bbcc7f550aacf8e40" address="unix:///run/containerd/s/5f7051920a1ec30df1551a3997ec89116adbf2f305ba69b0d6c8679dd87474ba" namespace=k8s.io protocol=ttrpc version=3 May 14 00:18:52.454815 systemd[1]: Started cri-containerd-bcd4432f7d28a8b29344cb2d703d443bf39153b67f82f86bbcc7f550aacf8e40.scope - libcontainer container bcd4432f7d28a8b29344cb2d703d443bf39153b67f82f86bbcc7f550aacf8e40. 
May 14 00:18:52.519123 containerd[1482]: time="2025-05-14T00:18:52.518892629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-8nq2j,Uid:0c7082a2-a4b7-4445-a62a-d6ee18a3f67d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"bcd4432f7d28a8b29344cb2d703d443bf39153b67f82f86bbcc7f550aacf8e40\"" May 14 00:18:52.521916 containerd[1482]: time="2025-05-14T00:18:52.520866461Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 14 00:18:53.884343 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3838649823.mount: Deactivated successfully. May 14 00:18:54.597609 containerd[1482]: time="2025-05-14T00:18:54.597537955Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:54.600012 containerd[1482]: time="2025-05-14T00:18:54.599180456Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 14 00:18:54.601298 containerd[1482]: time="2025-05-14T00:18:54.601261560Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:54.604741 containerd[1482]: time="2025-05-14T00:18:54.604703096Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:18:54.605491 containerd[1482]: time="2025-05-14T00:18:54.605175573Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.084254108s" May 14 00:18:54.605491 containerd[1482]: time="2025-05-14T00:18:54.605206901Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 14 00:18:54.609027 containerd[1482]: time="2025-05-14T00:18:54.608920357Z" level=info msg="CreateContainer within sandbox \"bcd4432f7d28a8b29344cb2d703d443bf39153b67f82f86bbcc7f550aacf8e40\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 14 00:18:54.624097 containerd[1482]: time="2025-05-14T00:18:54.623280976Z" level=info msg="Container 82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe: CDI devices from CRI Config.CDIDevices: []" May 14 00:18:54.632329 containerd[1482]: time="2025-05-14T00:18:54.632030559Z" level=info msg="CreateContainer within sandbox \"bcd4432f7d28a8b29344cb2d703d443bf39153b67f82f86bbcc7f550aacf8e40\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe\"" May 14 00:18:54.633364 containerd[1482]: time="2025-05-14T00:18:54.633341688Z" level=info msg="StartContainer for \"82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe\"" May 14 00:18:54.637730 containerd[1482]: time="2025-05-14T00:18:54.637685677Z" level=info msg="connecting to shim 82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe" address="unix:///run/containerd/s/5f7051920a1ec30df1551a3997ec89116adbf2f305ba69b0d6c8679dd87474ba" protocol=ttrpc version=3 May 14 00:18:54.667720 systemd[1]: 
Started cri-containerd-82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe.scope - libcontainer container 82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe. May 14 00:18:54.708352 containerd[1482]: time="2025-05-14T00:18:54.708307966Z" level=info msg="StartContainer for \"82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe\" returns successfully" May 14 00:18:55.627559 kubelet[2808]: I0514 00:18:55.627368 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-6mbl5" podStartSLOduration=4.627292679 podStartE2EDuration="4.627292679s" podCreationTimestamp="2025-05-14 00:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:18:52.615193183 +0000 UTC m=+16.336887046" watchObservedRunningTime="2025-05-14 00:18:55.627292679 +0000 UTC m=+19.348986562" May 14 00:18:56.528906 kubelet[2808]: I0514 00:18:56.528325 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-797db67f8-8nq2j" podStartSLOduration=3.442160976 podStartE2EDuration="5.52828397s" podCreationTimestamp="2025-05-14 00:18:51 +0000 UTC" firstStartedPulling="2025-05-14 00:18:52.52032753 +0000 UTC m=+16.242021363" lastFinishedPulling="2025-05-14 00:18:54.606450514 +0000 UTC m=+18.328144357" observedRunningTime="2025-05-14 00:18:55.628255606 +0000 UTC m=+19.349949489" watchObservedRunningTime="2025-05-14 00:18:56.52828397 +0000 UTC m=+20.249977863" May 14 00:18:57.992544 kubelet[2808]: I0514 00:18:57.992460 2808 topology_manager.go:215] "Topology Admit Handler" podUID="12ad7f1a-7cda-408c-8521-96302e9ca6b7" podNamespace="calico-system" podName="calico-typha-577966c9b9-rr77z" May 14 00:18:57.997228 kubelet[2808]: I0514 00:18:57.997126 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12ad7f1a-7cda-408c-8521-96302e9ca6b7-tigera-ca-bundle\") pod \"calico-typha-577966c9b9-rr77z\" (UID: \"12ad7f1a-7cda-408c-8521-96302e9ca6b7\") " pod="calico-system/calico-typha-577966c9b9-rr77z" May 14 00:18:57.997228 kubelet[2808]: I0514 00:18:57.997164 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm5mt\" (UniqueName: \"kubernetes.io/projected/12ad7f1a-7cda-408c-8521-96302e9ca6b7-kube-api-access-lm5mt\") pod \"calico-typha-577966c9b9-rr77z\" (UID: \"12ad7f1a-7cda-408c-8521-96302e9ca6b7\") " pod="calico-system/calico-typha-577966c9b9-rr77z" May 14 00:18:57.997228 kubelet[2808]: I0514 00:18:57.997188 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/12ad7f1a-7cda-408c-8521-96302e9ca6b7-typha-certs\") pod \"calico-typha-577966c9b9-rr77z\" (UID: \"12ad7f1a-7cda-408c-8521-96302e9ca6b7\") " pod="calico-system/calico-typha-577966c9b9-rr77z" May 14 00:18:58.005159 systemd[1]: Created slice kubepods-besteffort-pod12ad7f1a_7cda_408c_8521_96302e9ca6b7.slice - libcontainer container kubepods-besteffort-pod12ad7f1a_7cda_408c_8521_96302e9ca6b7.slice. 
May 14 00:18:58.164295 kubelet[2808]: I0514 00:18:58.164221 2808 topology_manager.go:215] "Topology Admit Handler" podUID="8e0c8e6d-a05d-4278-8bfb-2249deac605e" podNamespace="calico-system" podName="calico-node-v8ssp" May 14 00:18:58.176068 systemd[1]: Created slice kubepods-besteffort-pod8e0c8e6d_a05d_4278_8bfb_2249deac605e.slice - libcontainer container kubepods-besteffort-pod8e0c8e6d_a05d_4278_8bfb_2249deac605e.slice. May 14 00:18:58.299770 kubelet[2808]: I0514 00:18:58.299256 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-var-run-calico\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.299770 kubelet[2808]: I0514 00:18:58.299299 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e0c8e6d-a05d-4278-8bfb-2249deac605e-tigera-ca-bundle\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.299770 kubelet[2808]: I0514 00:18:58.299320 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8e0c8e6d-a05d-4278-8bfb-2249deac605e-node-certs\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.299770 kubelet[2808]: I0514 00:18:58.299340 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-cni-net-dir\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.299770 kubelet[2808]: I0514 00:18:58.299360 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-flexvol-driver-host\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.300806 kubelet[2808]: I0514 00:18:58.299382 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-lib-modules\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.300806 kubelet[2808]: I0514 00:18:58.299441 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-var-lib-calico\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.300806 kubelet[2808]: I0514 00:18:58.299563 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvnbj\" (UniqueName: \"kubernetes.io/projected/8e0c8e6d-a05d-4278-8bfb-2249deac605e-kube-api-access-lvnbj\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.300806 
kubelet[2808]: I0514 00:18:58.299593 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-cni-log-dir\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.300806 kubelet[2808]: I0514 00:18:58.299617 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-xtables-lock\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.300990 kubelet[2808]: I0514 00:18:58.299634 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-policysync\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.300990 kubelet[2808]: I0514 00:18:58.299652 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8e0c8e6d-a05d-4278-8bfb-2249deac605e-cni-bin-dir\") pod \"calico-node-v8ssp\" (UID: \"8e0c8e6d-a05d-4278-8bfb-2249deac605e\") " pod="calico-system/calico-node-v8ssp" May 14 00:18:58.306072 kubelet[2808]: I0514 00:18:58.305134 2808 topology_manager.go:215] "Topology Admit Handler" podUID="2846908e-da51-4861-b641-78bf7c75d90f" podNamespace="calico-system" podName="csi-node-driver-6svwf" May 14 00:18:58.306072 kubelet[2808]: E0514 00:18:58.305489 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:18:58.313548 containerd[1482]: time="2025-05-14T00:18:58.313249710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-577966c9b9-rr77z,Uid:12ad7f1a-7cda-408c-8521-96302e9ca6b7,Namespace:calico-system,Attempt:0,}" May 14 00:18:58.359012 containerd[1482]: time="2025-05-14T00:18:58.357904544Z" level=info msg="connecting to shim 32be841826eb153d1878e517e2b90a42e3516e5df05402eb4e6e1879a20788c6" address="unix:///run/containerd/s/a42c5dcb0fb11292429a3712dd18b38f1c82175c89683d049cdda972b5023a05" namespace=k8s.io protocol=ttrpc version=3 May 14 00:18:58.400905 kubelet[2808]: I0514 00:18:58.400699 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2846908e-da51-4861-b641-78bf7c75d90f-varrun\") pod \"csi-node-driver-6svwf\" (UID: \"2846908e-da51-4861-b641-78bf7c75d90f\") " pod="calico-system/csi-node-driver-6svwf" May 14 00:18:58.400905 kubelet[2808]: I0514 00:18:58.400760 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2846908e-da51-4861-b641-78bf7c75d90f-socket-dir\") pod \"csi-node-driver-6svwf\" (UID: \"2846908e-da51-4861-b641-78bf7c75d90f\") " pod="calico-system/csi-node-driver-6svwf" May 14 00:18:58.401560 kubelet[2808]: I0514 00:18:58.401131 2808 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm57n\" (UniqueName: \"kubernetes.io/projected/2846908e-da51-4861-b641-78bf7c75d90f-kube-api-access-jm57n\") pod \"csi-node-driver-6svwf\" (UID: \"2846908e-da51-4861-b641-78bf7c75d90f\") " pod="calico-system/csi-node-driver-6svwf" May 14 00:18:58.401798 kubelet[2808]: I0514 00:18:58.401759 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2846908e-da51-4861-b641-78bf7c75d90f-registration-dir\") pod \"csi-node-driver-6svwf\" (UID: \"2846908e-da51-4861-b641-78bf7c75d90f\") " pod="calico-system/csi-node-driver-6svwf" May 14 00:18:58.401861 kubelet[2808]: I0514 00:18:58.401835 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2846908e-da51-4861-b641-78bf7c75d90f-kubelet-dir\") pod \"csi-node-driver-6svwf\" (UID: \"2846908e-da51-4861-b641-78bf7c75d90f\") " pod="calico-system/csi-node-driver-6svwf" May 14 00:18:58.405439 kubelet[2808]: E0514 00:18:58.405261 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.405439 kubelet[2808]: W0514 00:18:58.405290 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.405439 kubelet[2808]: E0514 00:18:58.405328 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.407750 systemd[1]: Started cri-containerd-32be841826eb153d1878e517e2b90a42e3516e5df05402eb4e6e1879a20788c6.scope - libcontainer container 32be841826eb153d1878e517e2b90a42e3516e5df05402eb4e6e1879a20788c6. May 14 00:18:58.409708 kubelet[2808]: E0514 00:18:58.409167 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.409708 kubelet[2808]: W0514 00:18:58.409194 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.409708 kubelet[2808]: E0514 00:18:58.409217 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.410243 kubelet[2808]: E0514 00:18:58.410099 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.410243 kubelet[2808]: W0514 00:18:58.410115 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.410243 kubelet[2808]: E0514 00:18:58.410127 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:18:58.419572 kubelet[2808]: E0514 00:18:58.418627 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.419572 kubelet[2808]: W0514 00:18:58.418659 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.419572 kubelet[2808]: E0514 00:18:58.418683 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.433721 kubelet[2808]: E0514 00:18:58.433661 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.433721 kubelet[2808]: W0514 00:18:58.433690 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.433721 kubelet[2808]: E0514 00:18:58.433713 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.481526 containerd[1482]: time="2025-05-14T00:18:58.481454759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v8ssp,Uid:8e0c8e6d-a05d-4278-8bfb-2249deac605e,Namespace:calico-system,Attempt:0,}" May 14 00:18:58.502548 kubelet[2808]: E0514 00:18:58.502491 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.502707 kubelet[2808]: W0514 00:18:58.502602 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.502707 kubelet[2808]: E0514 00:18:58.502637 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.503635 kubelet[2808]: E0514 00:18:58.503214 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.503635 kubelet[2808]: W0514 00:18:58.503231 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.503963 kubelet[2808]: E0514 00:18:58.503726 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:18:58.504816 kubelet[2808]: E0514 00:18:58.504672 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.504816 kubelet[2808]: W0514 00:18:58.504687 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.504816 kubelet[2808]: E0514 00:18:58.504704 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.505304 kubelet[2808]: E0514 00:18:58.505051 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.505304 kubelet[2808]: W0514 00:18:58.505063 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.505304 kubelet[2808]: E0514 00:18:58.505127 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.506061 kubelet[2808]: E0514 00:18:58.505673 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.506061 kubelet[2808]: W0514 00:18:58.505690 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.506448 kubelet[2808]: E0514 00:18:58.506172 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.506498 kubelet[2808]: E0514 00:18:58.506480 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.506498 kubelet[2808]: W0514 00:18:58.506491 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.506736 kubelet[2808]: E0514 00:18:58.506623 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.507025 kubelet[2808]: E0514 00:18:58.506995 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.507025 kubelet[2808]: W0514 00:18:58.507009 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.507562 kubelet[2808]: E0514 00:18:58.507171 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:18:58.509365 kubelet[2808]: E0514 00:18:58.509339 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.509365 kubelet[2808]: W0514 00:18:58.509357 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.509480 kubelet[2808]: E0514 00:18:58.509376 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.509821 kubelet[2808]: E0514 00:18:58.509803 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.509821 kubelet[2808]: W0514 00:18:58.509818 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.509915 kubelet[2808]: E0514 00:18:58.509905 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.510302 kubelet[2808]: E0514 00:18:58.510238 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.510371 kubelet[2808]: W0514 00:18:58.510296 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.510561 kubelet[2808]: E0514 00:18:58.510468 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.510682 kubelet[2808]: E0514 00:18:58.510662 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.510682 kubelet[2808]: W0514 00:18:58.510677 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.510832 kubelet[2808]: E0514 00:18:58.510713 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.511615 kubelet[2808]: E0514 00:18:58.511592 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.511615 kubelet[2808]: W0514 00:18:58.511608 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.511967 kubelet[2808]: E0514 00:18:58.511849 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:18:58.512219 kubelet[2808]: E0514 00:18:58.512182 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.512219 kubelet[2808]: W0514 00:18:58.512196 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.512954 kubelet[2808]: E0514 00:18:58.512545 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.513209 kubelet[2808]: E0514 00:18:58.513187 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.513209 kubelet[2808]: W0514 00:18:58.513201 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.513881 kubelet[2808]: E0514 00:18:58.513854 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.514143 kubelet[2808]: E0514 00:18:58.514104 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.514143 kubelet[2808]: W0514 00:18:58.514122 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.514263 kubelet[2808]: E0514 00:18:58.514234 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.516218 kubelet[2808]: E0514 00:18:58.516154 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.516218 kubelet[2808]: W0514 00:18:58.516209 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.516842 kubelet[2808]: E0514 00:18:58.516247 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.516842 kubelet[2808]: E0514 00:18:58.516668 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.516842 kubelet[2808]: W0514 00:18:58.516679 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.517053 kubelet[2808]: E0514 00:18:58.516912 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:18:58.517569 kubelet[2808]: E0514 00:18:58.517495 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.517569 kubelet[2808]: W0514 00:18:58.517541 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.517726 kubelet[2808]: E0514 00:18:58.517645 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.519462 kubelet[2808]: E0514 00:18:58.519442 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.519462 kubelet[2808]: W0514 00:18:58.519460 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.519660 kubelet[2808]: E0514 00:18:58.519555 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.519903 kubelet[2808]: E0514 00:18:58.519883 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.520097 kubelet[2808]: W0514 00:18:58.519913 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.520097 kubelet[2808]: E0514 00:18:58.520027 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.520781 kubelet[2808]: E0514 00:18:58.520673 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.520781 kubelet[2808]: W0514 00:18:58.520690 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.521041 kubelet[2808]: E0514 00:18:58.520862 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:18:58.521346 kubelet[2808]: E0514 00:18:58.521248 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.521346 kubelet[2808]: W0514 00:18:58.521261 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.521843 kubelet[2808]: E0514 00:18:58.521714 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.521843 kubelet[2808]: W0514 00:18:58.521738 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.521843 kubelet[2808]: E0514 00:18:58.521775 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.521843 kubelet[2808]: E0514 00:18:58.521814 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.522559 kubelet[2808]: E0514 00:18:58.522121 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.522559 kubelet[2808]: W0514 00:18:58.522134 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.522559 kubelet[2808]: E0514 00:18:58.522168 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.522996 kubelet[2808]: E0514 00:18:58.522877 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.522996 kubelet[2808]: W0514 00:18:58.522890 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.522996 kubelet[2808]: E0514 00:18:58.522901 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:18:58.524184 containerd[1482]: time="2025-05-14T00:18:58.524042455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-577966c9b9-rr77z,Uid:12ad7f1a-7cda-408c-8521-96302e9ca6b7,Namespace:calico-system,Attempt:0,} returns sandbox id \"32be841826eb153d1878e517e2b90a42e3516e5df05402eb4e6e1879a20788c6\"" May 14 00:18:58.529169 containerd[1482]: time="2025-05-14T00:18:58.529122364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 14 00:18:58.540927 containerd[1482]: time="2025-05-14T00:18:58.540740447Z" level=info msg="connecting to shim 53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652" address="unix:///run/containerd/s/f00e8545773c12780e5715b7dfbb658d032560c8fa9cf61fdf4550d9aa368bbf" namespace=k8s.io protocol=ttrpc version=3 May 14 00:18:58.550609 kubelet[2808]: E0514 00:18:58.550443 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:18:58.550609 kubelet[2808]: W0514 00:18:58.550474 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:18:58.550609 kubelet[2808]: E0514 00:18:58.550585 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:18:58.582368 systemd[1]: Started cri-containerd-53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652.scope - libcontainer container 53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652. May 14 00:18:58.644067 containerd[1482]: time="2025-05-14T00:18:58.644014759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v8ssp,Uid:8e0c8e6d-a05d-4278-8bfb-2249deac605e,Namespace:calico-system,Attempt:0,} returns sandbox id \"53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652\"" May 14 00:19:00.500747 kubelet[2808]: E0514 00:19:00.499040 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:19:01.758594 containerd[1482]: time="2025-05-14T00:19:01.758535489Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:01.759702 containerd[1482]: time="2025-05-14T00:19:01.759641373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 14 00:19:01.762541 containerd[1482]: time="2025-05-14T00:19:01.761214805Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:01.764316 containerd[1482]: time="2025-05-14T00:19:01.764281748Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:01.765077 containerd[1482]: time="2025-05-14T00:19:01.765029861Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id 
\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.235686953s" May 14 00:19:01.765165 containerd[1482]: time="2025-05-14T00:19:01.765147973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 14 00:19:01.766597 containerd[1482]: time="2025-05-14T00:19:01.766563347Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 14 00:19:01.784682 containerd[1482]: time="2025-05-14T00:19:01.784614868Z" level=info msg="CreateContainer within sandbox \"32be841826eb153d1878e517e2b90a42e3516e5df05402eb4e6e1879a20788c6\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 14 00:19:01.796486 containerd[1482]: time="2025-05-14T00:19:01.796451530Z" level=info msg="Container 172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:01.813468 containerd[1482]: time="2025-05-14T00:19:01.813420540Z" level=info msg="CreateContainer within sandbox \"32be841826eb153d1878e517e2b90a42e3516e5df05402eb4e6e1879a20788c6\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7\"" May 14 00:19:01.814244 containerd[1482]: time="2025-05-14T00:19:01.814001610Z" level=info msg="StartContainer for \"172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7\"" May 14 00:19:01.817545 containerd[1482]: time="2025-05-14T00:19:01.816987750Z" level=info msg="connecting to shim 172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7" address="unix:///run/containerd/s/a42c5dcb0fb11292429a3712dd18b38f1c82175c89683d049cdda972b5023a05" protocol=ttrpc version=3 May 14 00:19:01.850145 systemd[1]: Started cri-containerd-172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7.scope - libcontainer container 172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7. May 14 00:19:01.917192 containerd[1482]: time="2025-05-14T00:19:01.917033221Z" level=info msg="StartContainer for \"172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7\" returns successfully" May 14 00:19:02.500649 kubelet[2808]: E0514 00:19:02.500587 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:19:02.734552 kubelet[2808]: E0514 00:19:02.734445 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.734552 kubelet[2808]: W0514 00:19:02.734549 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.734950 kubelet[2808]: E0514 00:19:02.734599 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:02.735236 kubelet[2808]: E0514 00:19:02.735196 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.735625 kubelet[2808]: W0514 00:19:02.735271 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.735625 kubelet[2808]: E0514 00:19:02.735300 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.736074 kubelet[2808]: E0514 00:19:02.735963 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.736074 kubelet[2808]: W0514 00:19:02.736030 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.736074 kubelet[2808]: E0514 00:19:02.736056 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.736782 kubelet[2808]: E0514 00:19:02.736708 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.736782 kubelet[2808]: W0514 00:19:02.736781 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.737027 kubelet[2808]: E0514 00:19:02.736805 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.737391 kubelet[2808]: E0514 00:19:02.737334 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.737391 kubelet[2808]: W0514 00:19:02.737372 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.737687 kubelet[2808]: E0514 00:19:02.737433 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.738083 kubelet[2808]: E0514 00:19:02.738032 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.738318 kubelet[2808]: W0514 00:19:02.738091 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.738318 kubelet[2808]: E0514 00:19:02.738119 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:02.738749 kubelet[2808]: E0514 00:19:02.738694 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.738749 kubelet[2808]: W0514 00:19:02.738719 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.738749 kubelet[2808]: E0514 00:19:02.738741 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.739305 kubelet[2808]: E0514 00:19:02.739153 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.739305 kubelet[2808]: W0514 00:19:02.739176 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.739305 kubelet[2808]: E0514 00:19:02.739198 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.739887 kubelet[2808]: E0514 00:19:02.739840 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.739887 kubelet[2808]: W0514 00:19:02.739873 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.740110 kubelet[2808]: E0514 00:19:02.739896 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.740335 kubelet[2808]: E0514 00:19:02.740288 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.740335 kubelet[2808]: W0514 00:19:02.740320 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.740629 kubelet[2808]: E0514 00:19:02.740342 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.740777 kubelet[2808]: E0514 00:19:02.740737 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.740777 kubelet[2808]: W0514 00:19:02.740762 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.740988 kubelet[2808]: E0514 00:19:02.740783 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:02.741172 kubelet[2808]: E0514 00:19:02.741131 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.741172 kubelet[2808]: W0514 00:19:02.741161 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.741362 kubelet[2808]: E0514 00:19:02.741182 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.741690 kubelet[2808]: E0514 00:19:02.741648 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.741690 kubelet[2808]: W0514 00:19:02.741680 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.741943 kubelet[2808]: E0514 00:19:02.741703 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.742120 kubelet[2808]: E0514 00:19:02.742075 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.742120 kubelet[2808]: W0514 00:19:02.742109 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.742379 kubelet[2808]: E0514 00:19:02.742133 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.742573 kubelet[2808]: E0514 00:19:02.742494 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.742573 kubelet[2808]: W0514 00:19:02.742565 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.742843 kubelet[2808]: E0514 00:19:02.742591 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.840989 kubelet[2808]: E0514 00:19:02.840771 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.840989 kubelet[2808]: W0514 00:19:02.840823 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.840989 kubelet[2808]: E0514 00:19:02.840862 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:02.843491 kubelet[2808]: E0514 00:19:02.843407 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.843491 kubelet[2808]: W0514 00:19:02.843444 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.843491 kubelet[2808]: E0514 00:19:02.843481 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.844255 kubelet[2808]: E0514 00:19:02.843978 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.844255 kubelet[2808]: W0514 00:19:02.844002 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.845890 kubelet[2808]: E0514 00:19:02.845819 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.846392 kubelet[2808]: E0514 00:19:02.846369 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.846625 kubelet[2808]: W0514 00:19:02.846396 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.846625 kubelet[2808]: E0514 00:19:02.846437 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.846889 kubelet[2808]: E0514 00:19:02.846858 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.846889 kubelet[2808]: W0514 00:19:02.846886 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.847321 kubelet[2808]: E0514 00:19:02.847026 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.847609 kubelet[2808]: E0514 00:19:02.847387 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.847609 kubelet[2808]: W0514 00:19:02.847412 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.847961 kubelet[2808]: E0514 00:19:02.847639 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:02.847961 kubelet[2808]: E0514 00:19:02.847914 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.847961 kubelet[2808]: W0514 00:19:02.847937 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.848948 kubelet[2808]: E0514 00:19:02.848040 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.848948 kubelet[2808]: E0514 00:19:02.848291 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.848948 kubelet[2808]: W0514 00:19:02.848579 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.848948 kubelet[2808]: E0514 00:19:02.848641 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.849678 kubelet[2808]: E0514 00:19:02.849644 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.849883 kubelet[2808]: W0514 00:19:02.849850 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.850393 kubelet[2808]: E0514 00:19:02.850097 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.850931 kubelet[2808]: E0514 00:19:02.850706 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.850931 kubelet[2808]: W0514 00:19:02.850741 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.850931 kubelet[2808]: E0514 00:19:02.850832 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.851575 kubelet[2808]: E0514 00:19:02.851474 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.851943 kubelet[2808]: W0514 00:19:02.851504 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.851943 kubelet[2808]: E0514 00:19:02.851778 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:02.852711 kubelet[2808]: E0514 00:19:02.852342 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.852711 kubelet[2808]: W0514 00:19:02.852371 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.852711 kubelet[2808]: E0514 00:19:02.852417 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.853181 kubelet[2808]: E0514 00:19:02.853149 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.853364 kubelet[2808]: W0514 00:19:02.853333 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.853622 kubelet[2808]: E0514 00:19:02.853565 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.853999 kubelet[2808]: E0514 00:19:02.853956 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.853999 kubelet[2808]: W0514 00:19:02.853995 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.854198 kubelet[2808]: E0514 00:19:02.854055 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.854663 kubelet[2808]: E0514 00:19:02.854619 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.854663 kubelet[2808]: W0514 00:19:02.854655 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.854877 kubelet[2808]: E0514 00:19:02.854691 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.855582 kubelet[2808]: E0514 00:19:02.855464 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.855582 kubelet[2808]: W0514 00:19:02.855497 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.856103 kubelet[2808]: E0514 00:19:02.855854 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:02.856871 kubelet[2808]: E0514 00:19:02.856328 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.856871 kubelet[2808]: W0514 00:19:02.856367 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.856871 kubelet[2808]: E0514 00:19:02.856429 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:02.857340 kubelet[2808]: E0514 00:19:02.857279 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:02.857561 kubelet[2808]: W0514 00:19:02.857495 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:02.857867 kubelet[2808]: E0514 00:19:02.857833 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.643456 kubelet[2808]: I0514 00:19:03.643366 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 00:19:03.649891 kubelet[2808]: E0514 00:19:03.649821 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.649891 kubelet[2808]: W0514 00:19:03.649869 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.650127 kubelet[2808]: E0514 00:19:03.649901 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.650456 kubelet[2808]: E0514 00:19:03.650394 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.650456 kubelet[2808]: W0514 00:19:03.650434 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.650699 kubelet[2808]: E0514 00:19:03.650459 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.651058 kubelet[2808]: E0514 00:19:03.650993 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.651058 kubelet[2808]: W0514 00:19:03.651034 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.651058 kubelet[2808]: E0514 00:19:03.651059 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:03.651574 kubelet[2808]: E0514 00:19:03.651477 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.651574 kubelet[2808]: W0514 00:19:03.651556 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.651797 kubelet[2808]: E0514 00:19:03.651580 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.652042 kubelet[2808]: E0514 00:19:03.652007 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.652042 kubelet[2808]: W0514 00:19:03.652037 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.652249 kubelet[2808]: E0514 00:19:03.652060 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.652443 kubelet[2808]: E0514 00:19:03.652410 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.652443 kubelet[2808]: W0514 00:19:03.652439 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.652707 kubelet[2808]: E0514 00:19:03.652461 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.653058 kubelet[2808]: E0514 00:19:03.653021 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.653058 kubelet[2808]: W0514 00:19:03.653055 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.653239 kubelet[2808]: E0514 00:19:03.653080 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.653484 kubelet[2808]: E0514 00:19:03.653452 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.653484 kubelet[2808]: W0514 00:19:03.653482 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.653758 kubelet[2808]: E0514 00:19:03.653505 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:03.654075 kubelet[2808]: E0514 00:19:03.653970 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.654206 kubelet[2808]: W0514 00:19:03.654079 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.654206 kubelet[2808]: E0514 00:19:03.654108 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.654571 kubelet[2808]: E0514 00:19:03.654498 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.654702 kubelet[2808]: W0514 00:19:03.654594 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.654702 kubelet[2808]: E0514 00:19:03.654621 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.655063 kubelet[2808]: E0514 00:19:03.654980 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.655063 kubelet[2808]: W0514 00:19:03.655011 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.655063 kubelet[2808]: E0514 00:19:03.655032 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.655601 kubelet[2808]: E0514 00:19:03.655504 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.655601 kubelet[2808]: W0514 00:19:03.655599 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.655796 kubelet[2808]: E0514 00:19:03.655625 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.656329 kubelet[2808]: E0514 00:19:03.656291 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.656329 kubelet[2808]: W0514 00:19:03.656325 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.656615 kubelet[2808]: E0514 00:19:03.656350 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:03.656853 kubelet[2808]: E0514 00:19:03.656818 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.656853 kubelet[2808]: W0514 00:19:03.656851 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.657059 kubelet[2808]: E0514 00:19:03.656875 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.657405 kubelet[2808]: E0514 00:19:03.657369 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.657405 kubelet[2808]: W0514 00:19:03.657401 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.657635 kubelet[2808]: E0514 00:19:03.657427 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.658086 kubelet[2808]: E0514 00:19:03.658022 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.658086 kubelet[2808]: W0514 00:19:03.658062 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.658316 kubelet[2808]: E0514 00:19:03.658091 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.658724 kubelet[2808]: E0514 00:19:03.658686 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.658724 kubelet[2808]: W0514 00:19:03.658724 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.658960 kubelet[2808]: E0514 00:19:03.658783 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.659389 kubelet[2808]: E0514 00:19:03.659327 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.659389 kubelet[2808]: W0514 00:19:03.659367 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.659655 kubelet[2808]: E0514 00:19:03.659404 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:03.659960 kubelet[2808]: E0514 00:19:03.659894 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.660075 kubelet[2808]: W0514 00:19:03.660005 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.660138 kubelet[2808]: E0514 00:19:03.660074 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.660668 kubelet[2808]: E0514 00:19:03.660629 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.660668 kubelet[2808]: W0514 00:19:03.660663 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.660893 kubelet[2808]: E0514 00:19:03.660796 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.661308 kubelet[2808]: E0514 00:19:03.661272 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.661308 kubelet[2808]: W0514 00:19:03.661305 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.661580 kubelet[2808]: E0514 00:19:03.661481 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.661927 kubelet[2808]: E0514 00:19:03.661865 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.661927 kubelet[2808]: W0514 00:19:03.661905 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.662286 kubelet[2808]: E0514 00:19:03.662164 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.662449 kubelet[2808]: E0514 00:19:03.662378 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.662449 kubelet[2808]: W0514 00:19:03.662403 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.662623 kubelet[2808]: E0514 00:19:03.662588 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:03.663442 kubelet[2808]: E0514 00:19:03.663402 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.663442 kubelet[2808]: W0514 00:19:03.663435 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.663737 kubelet[2808]: E0514 00:19:03.663473 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.664086 kubelet[2808]: E0514 00:19:03.664009 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.664086 kubelet[2808]: W0514 00:19:03.664047 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.664425 kubelet[2808]: E0514 00:19:03.664304 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.664643 kubelet[2808]: E0514 00:19:03.664495 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.664643 kubelet[2808]: W0514 00:19:03.664578 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.664875 kubelet[2808]: E0514 00:19:03.664714 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.665096 kubelet[2808]: E0514 00:19:03.665061 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.665096 kubelet[2808]: W0514 00:19:03.665090 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.665257 kubelet[2808]: E0514 00:19:03.665225 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.665832 kubelet[2808]: E0514 00:19:03.665794 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.665832 kubelet[2808]: W0514 00:19:03.665826 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.666045 kubelet[2808]: E0514 00:19:03.665861 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:03.666565 kubelet[2808]: E0514 00:19:03.666434 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.666565 kubelet[2808]: W0514 00:19:03.666474 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.667368 kubelet[2808]: E0514 00:19:03.666812 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.667368 kubelet[2808]: E0514 00:19:03.666910 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.667368 kubelet[2808]: W0514 00:19:03.666950 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.667368 kubelet[2808]: E0514 00:19:03.666981 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.667765 kubelet[2808]: E0514 00:19:03.667576 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.667765 kubelet[2808]: W0514 00:19:03.667604 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.667765 kubelet[2808]: E0514 00:19:03.667663 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.668831 kubelet[2808]: E0514 00:19:03.668764 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.668831 kubelet[2808]: W0514 00:19:03.668806 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.668831 kubelet[2808]: E0514 00:19:03.668843 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 00:19:03.669288 kubelet[2808]: E0514 00:19:03.669254 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 00:19:03.669288 kubelet[2808]: W0514 00:19:03.669284 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 00:19:03.669456 kubelet[2808]: E0514 00:19:03.669308 2808 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 00:19:04.271658 containerd[1482]: time="2025-05-14T00:19:04.271605278Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:04.272956 containerd[1482]: time="2025-05-14T00:19:04.272904304Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 14 00:19:04.274557 containerd[1482]: time="2025-05-14T00:19:04.274491601Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:04.277073 containerd[1482]: time="2025-05-14T00:19:04.277004295Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:04.277646 containerd[1482]: time="2025-05-14T00:19:04.277610191Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.511010295s" May 14 00:19:04.277710 containerd[1482]: time="2025-05-14T00:19:04.277645407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 14 00:19:04.280624 containerd[1482]: time="2025-05-14T00:19:04.280077809Z" level=info msg="CreateContainer within sandbox \"53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 14 00:19:04.298734 containerd[1482]: time="2025-05-14T00:19:04.298688828Z" level=info msg="Container 4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:04.313938 containerd[1482]: time="2025-05-14T00:19:04.313905139Z" level=info msg="CreateContainer within sandbox \"53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189\"" May 14 00:19:04.314787 containerd[1482]: time="2025-05-14T00:19:04.314755975Z" level=info msg="StartContainer for \"4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189\"" May 14 00:19:04.317239 containerd[1482]: time="2025-05-14T00:19:04.317180253Z" level=info msg="connecting to shim 4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189" address="unix:///run/containerd/s/f00e8545773c12780e5715b7dfbb658d032560c8fa9cf61fdf4550d9aa368bbf" protocol=ttrpc version=3 May 14 00:19:04.347679 systemd[1]: Started cri-containerd-4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189.scope - libcontainer container 4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189. 
May 14 00:19:04.400985 containerd[1482]: time="2025-05-14T00:19:04.400097842Z" level=info msg="StartContainer for \"4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189\" returns successfully" May 14 00:19:04.411954 systemd[1]: cri-containerd-4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189.scope: Deactivated successfully. May 14 00:19:04.417143 containerd[1482]: time="2025-05-14T00:19:04.417094512Z" level=info msg="received exit event container_id:\"4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189\" id:\"4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189\" pid:3433 exited_at:{seconds:1747181944 nanos:416483506}" May 14 00:19:04.417446 containerd[1482]: time="2025-05-14T00:19:04.417410054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189\" id:\"4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189\" pid:3433 exited_at:{seconds:1747181944 nanos:416483506}" May 14 00:19:04.445185 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189-rootfs.mount: Deactivated successfully. May 14 00:19:04.499909 kubelet[2808]: E0514 00:19:04.498762 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:19:04.810905 kubelet[2808]: I0514 00:19:04.810774 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-577966c9b9-rr77z" podStartSLOduration=4.572889301 podStartE2EDuration="7.810719094s" podCreationTimestamp="2025-05-14 00:18:57 +0000 UTC" firstStartedPulling="2025-05-14 00:18:58.52847529 +0000 UTC m=+22.250169133" lastFinishedPulling="2025-05-14 00:19:01.766305083 +0000 UTC m=+25.487998926" observedRunningTime="2025-05-14 00:19:02.668962877 +0000 UTC m=+26.390656760" watchObservedRunningTime="2025-05-14 00:19:04.810719094 +0000 UTC m=+28.532412977" May 14 00:19:05.667049 containerd[1482]: time="2025-05-14T00:19:05.666771994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 14 00:19:06.500988 kubelet[2808]: E0514 00:19:06.500881 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:19:08.501297 kubelet[2808]: E0514 00:19:08.498904 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:19:10.498738 kubelet[2808]: E0514 00:19:10.498442 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:19:12.317948 containerd[1482]: 
time="2025-05-14T00:19:12.317591162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:12.320791 containerd[1482]: time="2025-05-14T00:19:12.320282321Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 14 00:19:12.321560 containerd[1482]: time="2025-05-14T00:19:12.320885672Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:12.324245 containerd[1482]: time="2025-05-14T00:19:12.324182918Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:12.325442 containerd[1482]: time="2025-05-14T00:19:12.324905734Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 6.657918626s" May 14 00:19:12.325442 containerd[1482]: time="2025-05-14T00:19:12.324983400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 14 00:19:12.330018 containerd[1482]: time="2025-05-14T00:19:12.329956849Z" level=info msg="CreateContainer within sandbox \"53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 14 00:19:12.350387 containerd[1482]: time="2025-05-14T00:19:12.347860333Z" level=info msg="Container 85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:12.365783 containerd[1482]: time="2025-05-14T00:19:12.365718654Z" level=info msg="CreateContainer within sandbox \"53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5\"" May 14 00:19:12.366697 containerd[1482]: time="2025-05-14T00:19:12.366618531Z" level=info msg="StartContainer for \"85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5\"" May 14 00:19:12.380597 containerd[1482]: time="2025-05-14T00:19:12.379956562Z" level=info msg="connecting to shim 85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5" address="unix:///run/containerd/s/f00e8545773c12780e5715b7dfbb658d032560c8fa9cf61fdf4550d9aa368bbf" protocol=ttrpc version=3 May 14 00:19:12.420739 systemd[1]: Started cri-containerd-85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5.scope - libcontainer container 85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5. 
May 14 00:19:12.480724 containerd[1482]: time="2025-05-14T00:19:12.479656114Z" level=info msg="StartContainer for \"85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5\" returns successfully" May 14 00:19:12.501412 kubelet[2808]: E0514 00:19:12.500821 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:19:13.699113 containerd[1482]: time="2025-05-14T00:19:13.698578266Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 14 00:19:13.708095 systemd[1]: cri-containerd-85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5.scope: Deactivated successfully. May 14 00:19:13.710026 systemd[1]: cri-containerd-85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5.scope: Consumed 676ms CPU time, 173.8M memory peak, 154M written to disk. May 14 00:19:13.714561 containerd[1482]: time="2025-05-14T00:19:13.712843735Z" level=info msg="TaskExit event in podsandbox handler container_id:\"85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5\" id:\"85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5\" pid:3489 exited_at:{seconds:1747181953 nanos:709961960}" May 14 00:19:13.714561 containerd[1482]: time="2025-05-14T00:19:13.712969612Z" level=info msg="received exit event container_id:\"85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5\" id:\"85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5\" pid:3489 exited_at:{seconds:1747181953 nanos:709961960}" May 14 00:19:13.765560 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5-rootfs.mount: Deactivated successfully. May 14 00:19:13.770422 kubelet[2808]: I0514 00:19:13.770340 2808 kubelet_node_status.go:497] "Fast updating node status as it just became ready" May 14 00:19:14.054096 kubelet[2808]: I0514 00:19:14.022679 2808 topology_manager.go:215] "Topology Admit Handler" podUID="ad63d8da-8e9c-4dad-bf11-f8f68208946c" podNamespace="kube-system" podName="coredns-7db6d8ff4d-vlxwd" May 14 00:19:14.054096 kubelet[2808]: I0514 00:19:14.038399 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad63d8da-8e9c-4dad-bf11-f8f68208946c-config-volume\") pod \"coredns-7db6d8ff4d-vlxwd\" (UID: \"ad63d8da-8e9c-4dad-bf11-f8f68208946c\") " pod="kube-system/coredns-7db6d8ff4d-vlxwd" May 14 00:19:14.054096 kubelet[2808]: I0514 00:19:14.038577 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2tp\" (UniqueName: \"kubernetes.io/projected/ad63d8da-8e9c-4dad-bf11-f8f68208946c-kube-api-access-dh2tp\") pod \"coredns-7db6d8ff4d-vlxwd\" (UID: \"ad63d8da-8e9c-4dad-bf11-f8f68208946c\") " pod="kube-system/coredns-7db6d8ff4d-vlxwd" May 14 00:19:14.049593 systemd[1]: Created slice kubepods-burstable-podad63d8da_8e9c_4dad_bf11_f8f68208946c.slice - libcontainer container kubepods-burstable-podad63d8da_8e9c_4dad_bf11_f8f68208946c.slice. 
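The reload error just above fires because the fs watch on /etc/cni/net.d reacted to calico-kubeconfig being written before an actual network config was in place, so the loader still found nothing usable; once install-cni finishes writing its config and exits (its scope is deactivated after roughly 676ms of CPU), the kubelet reports the node as just became ready and starts admitting the pods that had been waiting on the network. A minimal sketch of that directory check, assuming nothing beyond the /etc/cni/net.d path quoted in the log (a simplified stand-in, not the real containerd loader):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// findCNIConfigs looks for loadable network configs the way the log's
// "no network config found in /etc/cni/net.d" message implies: any
// *.conf, *.conflist or *.json file in the CNI conf dir.
func findCNIConfigs(confDir string) ([]string, error) {
	var configs []string
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(confDir, pattern))
		if err != nil {
			return nil, err
		}
		configs = append(configs, matches...)
	}
	return configs, nil
}

func main() {
	confDir := "/etc/cni/net.d"
	configs, err := findCNIConfigs(confDir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if len(configs) == 0 {
		// The state at the time of the reload error: calico-kubeconfig has
		// been written, but no *.conf/*.conflist exists yet.
		fmt.Printf("no network config found in %s\n", confDir)
		return
	}
	fmt.Println("network configs:", configs)
}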
May 14 00:19:14.115479 kubelet[2808]: I0514 00:19:14.115376 2808 topology_manager.go:215] "Topology Admit Handler" podUID="46692647-8abf-4b93-b4e7-539aa57f3e65" podNamespace="calico-system" podName="calico-kube-controllers-84c5894965-x5l79" May 14 00:19:14.121825 kubelet[2808]: I0514 00:19:14.120577 2808 topology_manager.go:215] "Topology Admit Handler" podUID="119d07b0-5223-402f-be8b-608f7867493d" podNamespace="calico-apiserver" podName="calico-apiserver-85c77fb996-7pqlj" May 14 00:19:14.126395 kubelet[2808]: I0514 00:19:14.126299 2808 topology_manager.go:215] "Topology Admit Handler" podUID="668fd9d0-96d4-4c00-9f43-69d0f9229bcd" podNamespace="kube-system" podName="coredns-7db6d8ff4d-7q68r" May 14 00:19:14.136482 kubelet[2808]: I0514 00:19:14.134648 2808 topology_manager.go:215] "Topology Admit Handler" podUID="53393edc-5a1c-4194-8089-62b99b6d22c3" podNamespace="calico-apiserver" podName="calico-apiserver-85c77fb996-vjqrg" May 14 00:19:14.142802 kubelet[2808]: I0514 00:19:14.142748 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/668fd9d0-96d4-4c00-9f43-69d0f9229bcd-config-volume\") pod \"coredns-7db6d8ff4d-7q68r\" (UID: \"668fd9d0-96d4-4c00-9f43-69d0f9229bcd\") " pod="kube-system/coredns-7db6d8ff4d-7q68r" May 14 00:19:14.145558 kubelet[2808]: I0514 00:19:14.143643 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46692647-8abf-4b93-b4e7-539aa57f3e65-tigera-ca-bundle\") pod \"calico-kube-controllers-84c5894965-x5l79\" (UID: \"46692647-8abf-4b93-b4e7-539aa57f3e65\") " pod="calico-system/calico-kube-controllers-84c5894965-x5l79" May 14 00:19:14.145558 kubelet[2808]: I0514 00:19:14.143717 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5vl\" (UniqueName: \"kubernetes.io/projected/53393edc-5a1c-4194-8089-62b99b6d22c3-kube-api-access-zh5vl\") pod \"calico-apiserver-85c77fb996-vjqrg\" (UID: \"53393edc-5a1c-4194-8089-62b99b6d22c3\") " pod="calico-apiserver/calico-apiserver-85c77fb996-vjqrg" May 14 00:19:14.145558 kubelet[2808]: I0514 00:19:14.143810 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn4dq\" (UniqueName: \"kubernetes.io/projected/119d07b0-5223-402f-be8b-608f7867493d-kube-api-access-pn4dq\") pod \"calico-apiserver-85c77fb996-7pqlj\" (UID: \"119d07b0-5223-402f-be8b-608f7867493d\") " pod="calico-apiserver/calico-apiserver-85c77fb996-7pqlj" May 14 00:19:14.145558 kubelet[2808]: I0514 00:19:14.143909 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/53393edc-5a1c-4194-8089-62b99b6d22c3-calico-apiserver-certs\") pod \"calico-apiserver-85c77fb996-vjqrg\" (UID: \"53393edc-5a1c-4194-8089-62b99b6d22c3\") " pod="calico-apiserver/calico-apiserver-85c77fb996-vjqrg" May 14 00:19:14.145558 kubelet[2808]: I0514 00:19:14.145371 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf82m\" (UniqueName: \"kubernetes.io/projected/46692647-8abf-4b93-b4e7-539aa57f3e65-kube-api-access-kf82m\") pod \"calico-kube-controllers-84c5894965-x5l79\" (UID: \"46692647-8abf-4b93-b4e7-539aa57f3e65\") " pod="calico-system/calico-kube-controllers-84c5894965-x5l79" May 14 00:19:14.146313 
kubelet[2808]: I0514 00:19:14.145470 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8wcs\" (UniqueName: \"kubernetes.io/projected/668fd9d0-96d4-4c00-9f43-69d0f9229bcd-kube-api-access-t8wcs\") pod \"coredns-7db6d8ff4d-7q68r\" (UID: \"668fd9d0-96d4-4c00-9f43-69d0f9229bcd\") " pod="kube-system/coredns-7db6d8ff4d-7q68r" May 14 00:19:14.146313 kubelet[2808]: I0514 00:19:14.145582 2808 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/119d07b0-5223-402f-be8b-608f7867493d-calico-apiserver-certs\") pod \"calico-apiserver-85c77fb996-7pqlj\" (UID: \"119d07b0-5223-402f-be8b-608f7867493d\") " pod="calico-apiserver/calico-apiserver-85c77fb996-7pqlj" May 14 00:19:14.146867 systemd[1]: Created slice kubepods-besteffort-pod46692647_8abf_4b93_b4e7_539aa57f3e65.slice - libcontainer container kubepods-besteffort-pod46692647_8abf_4b93_b4e7_539aa57f3e65.slice. May 14 00:19:14.164668 systemd[1]: Created slice kubepods-besteffort-pod119d07b0_5223_402f_be8b_608f7867493d.slice - libcontainer container kubepods-besteffort-pod119d07b0_5223_402f_be8b_608f7867493d.slice. May 14 00:19:14.172093 systemd[1]: Created slice kubepods-burstable-pod668fd9d0_96d4_4c00_9f43_69d0f9229bcd.slice - libcontainer container kubepods-burstable-pod668fd9d0_96d4_4c00_9f43_69d0f9229bcd.slice. May 14 00:19:14.177092 systemd[1]: Created slice kubepods-besteffort-pod53393edc_5a1c_4194_8089_62b99b6d22c3.slice - libcontainer container kubepods-besteffort-pod53393edc_5a1c_4194_8089_62b99b6d22c3.slice. May 14 00:19:14.460113 containerd[1482]: time="2025-05-14T00:19:14.459687810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84c5894965-x5l79,Uid:46692647-8abf-4b93-b4e7-539aa57f3e65,Namespace:calico-system,Attempt:0,}" May 14 00:19:14.469273 containerd[1482]: time="2025-05-14T00:19:14.469144789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85c77fb996-7pqlj,Uid:119d07b0-5223-402f-be8b-608f7867493d,Namespace:calico-apiserver,Attempt:0,}" May 14 00:19:14.476201 containerd[1482]: time="2025-05-14T00:19:14.476098794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7q68r,Uid:668fd9d0-96d4-4c00-9f43-69d0f9229bcd,Namespace:kube-system,Attempt:0,}" May 14 00:19:14.481257 containerd[1482]: time="2025-05-14T00:19:14.480948742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85c77fb996-vjqrg,Uid:53393edc-5a1c-4194-8089-62b99b6d22c3,Namespace:calico-apiserver,Attempt:0,}" May 14 00:19:14.518708 systemd[1]: Created slice kubepods-besteffort-pod2846908e_da51_4861_b641_78bf7c75d90f.slice - libcontainer container kubepods-besteffort-pod2846908e_da51_4861_b641_78bf7c75d90f.slice. 
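Each admitted pod gets a cgroup under a QoS-tier parent, and the slice names above encode the pod UID with dashes escaped to underscores (ad63d8da-8e9c-4dad-bf11-f8f68208946c becomes kubepods-burstable-podad63d8da_8e9c_4dad_bf11_f8f68208946c.slice). The coredns pods land in the burstable tier and the Calico controller/apiserver pods in besteffort, consistent with coredns declaring resource requests while the others declare none. A sketch of that name construction, assuming the escaping really is just '-' to '_' as the names in the log suggest:

package main

import (
	"fmt"
	"strings"
)

// sliceName builds the systemd slice name the kubelet appears to use for a
// pod cgroup: kubepods-<qos>-pod<uid-with-dashes-escaped>.slice. The exact
// escaping rule is inferred from the names in the log, not from kubelet source.
func sliceName(qosClass, podUID string) string {
	escaped := strings.ReplaceAll(podUID, "-", "_")
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, escaped)
}

func main() {
	fmt.Println(sliceName("burstable", "ad63d8da-8e9c-4dad-bf11-f8f68208946c"))
	// -> kubepods-burstable-podad63d8da_8e9c_4dad_bf11_f8f68208946c.slice
	fmt.Println(sliceName("besteffort", "46692647-8abf-4b93-b4e7-539aa57f3e65"))
	// -> kubepods-besteffort-pod46692647_8abf_4b93_b4e7_539aa57f3e65.slice
}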
May 14 00:19:14.523686 containerd[1482]: time="2025-05-14T00:19:14.523600129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6svwf,Uid:2846908e-da51-4861-b641-78bf7c75d90f,Namespace:calico-system,Attempt:0,}" May 14 00:19:14.685574 containerd[1482]: time="2025-05-14T00:19:14.685108142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vlxwd,Uid:ad63d8da-8e9c-4dad-bf11-f8f68208946c,Namespace:kube-system,Attempt:0,}" May 14 00:19:14.745558 containerd[1482]: time="2025-05-14T00:19:14.745489168Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 14 00:19:14.884674 containerd[1482]: time="2025-05-14T00:19:14.884594798Z" level=error msg="Failed to destroy network for sandbox \"e4965ddfa95a96b23c441666b0dafe37edd06986340e38f11c0b6d93645aff65\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.889913 systemd[1]: run-netns-cni\x2d2f931d97\x2d9467\x2d21b9\x2de960\x2d4edb4d59b079.mount: Deactivated successfully. May 14 00:19:14.892586 containerd[1482]: time="2025-05-14T00:19:14.892021569Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vlxwd,Uid:ad63d8da-8e9c-4dad-bf11-f8f68208946c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4965ddfa95a96b23c441666b0dafe37edd06986340e38f11c0b6d93645aff65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.893390 kubelet[2808]: E0514 00:19:14.893091 2808 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4965ddfa95a96b23c441666b0dafe37edd06986340e38f11c0b6d93645aff65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.893390 kubelet[2808]: E0514 00:19:14.893226 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4965ddfa95a96b23c441666b0dafe37edd06986340e38f11c0b6d93645aff65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-vlxwd" May 14 00:19:14.893390 kubelet[2808]: E0514 00:19:14.893265 2808 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4965ddfa95a96b23c441666b0dafe37edd06986340e38f11c0b6d93645aff65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-vlxwd" May 14 00:19:14.893951 kubelet[2808]: E0514 00:19:14.893329 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-vlxwd_kube-system(ad63d8da-8e9c-4dad-bf11-f8f68208946c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-vlxwd_kube-system(ad63d8da-8e9c-4dad-bf11-f8f68208946c)\\\": rpc error: 
code = Unknown desc = failed to setup network for sandbox \\\"e4965ddfa95a96b23c441666b0dafe37edd06986340e38f11c0b6d93645aff65\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-vlxwd" podUID="ad63d8da-8e9c-4dad-bf11-f8f68208946c" May 14 00:19:14.945094 containerd[1482]: time="2025-05-14T00:19:14.944126109Z" level=error msg="Failed to destroy network for sandbox \"78a02bb3ff7544fbcb5a03e3af62afcf569685c9060f8c69f9410b12c502a4db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.948459 systemd[1]: run-netns-cni\x2d258d04d3\x2dbcf9\x2d841e\x2d8b72\x2d226e0c01bfae.mount: Deactivated successfully. May 14 00:19:14.952592 containerd[1482]: time="2025-05-14T00:19:14.951311968Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84c5894965-x5l79,Uid:46692647-8abf-4b93-b4e7-539aa57f3e65,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"78a02bb3ff7544fbcb5a03e3af62afcf569685c9060f8c69f9410b12c502a4db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.952741 kubelet[2808]: E0514 00:19:14.952447 2808 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78a02bb3ff7544fbcb5a03e3af62afcf569685c9060f8c69f9410b12c502a4db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.952741 kubelet[2808]: E0514 00:19:14.952503 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78a02bb3ff7544fbcb5a03e3af62afcf569685c9060f8c69f9410b12c502a4db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-84c5894965-x5l79" May 14 00:19:14.952741 kubelet[2808]: E0514 00:19:14.952558 2808 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78a02bb3ff7544fbcb5a03e3af62afcf569685c9060f8c69f9410b12c502a4db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-84c5894965-x5l79" May 14 00:19:14.952864 kubelet[2808]: E0514 00:19:14.952653 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-84c5894965-x5l79_calico-system(46692647-8abf-4b93-b4e7-539aa57f3e65)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-84c5894965-x5l79_calico-system(46692647-8abf-4b93-b4e7-539aa57f3e65)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"78a02bb3ff7544fbcb5a03e3af62afcf569685c9060f8c69f9410b12c502a4db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-84c5894965-x5l79" podUID="46692647-8abf-4b93-b4e7-539aa57f3e65" May 14 00:19:14.955883 containerd[1482]: time="2025-05-14T00:19:14.955037959Z" level=error msg="Failed to destroy network for sandbox \"7a34164403d204f7c90ae635dd8bfeecf0d61b80220940229b03fd974c021d86\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.959590 containerd[1482]: time="2025-05-14T00:19:14.958965305Z" level=error msg="Failed to destroy network for sandbox \"e6b6e49ea2ae2977f763ada660d1cabd53d9cf63bcdf272e7239b90b0b4b6f39\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.960690 systemd[1]: run-netns-cni\x2d33aa4c62\x2d6a80\x2d5290\x2d3ed8\x2d2e40a8a25d8a.mount: Deactivated successfully. May 14 00:19:14.964045 containerd[1482]: time="2025-05-14T00:19:14.963000394Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7q68r,Uid:668fd9d0-96d4-4c00-9f43-69d0f9229bcd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a34164403d204f7c90ae635dd8bfeecf0d61b80220940229b03fd974c021d86\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.966126 systemd[1]: run-netns-cni\x2d03ebf3d4\x2d92eb\x2db96d\x2d87c3\x2d4985382045cf.mount: Deactivated successfully. 
May 14 00:19:14.966855 kubelet[2808]: E0514 00:19:14.966804 2808 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a34164403d204f7c90ae635dd8bfeecf0d61b80220940229b03fd974c021d86\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.966930 kubelet[2808]: E0514 00:19:14.966864 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a34164403d204f7c90ae635dd8bfeecf0d61b80220940229b03fd974c021d86\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7q68r" May 14 00:19:14.966930 kubelet[2808]: E0514 00:19:14.966905 2808 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a34164403d204f7c90ae635dd8bfeecf0d61b80220940229b03fd974c021d86\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7q68r" May 14 00:19:14.967467 containerd[1482]: time="2025-05-14T00:19:14.966696889Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85c77fb996-vjqrg,Uid:53393edc-5a1c-4194-8089-62b99b6d22c3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6b6e49ea2ae2977f763ada660d1cabd53d9cf63bcdf272e7239b90b0b4b6f39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.969668 kubelet[2808]: E0514 00:19:14.967328 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-7q68r_kube-system(668fd9d0-96d4-4c00-9f43-69d0f9229bcd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-7q68r_kube-system(668fd9d0-96d4-4c00-9f43-69d0f9229bcd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a34164403d204f7c90ae635dd8bfeecf0d61b80220940229b03fd974c021d86\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-7q68r" podUID="668fd9d0-96d4-4c00-9f43-69d0f9229bcd" May 14 00:19:14.969769 kubelet[2808]: E0514 00:19:14.969686 2808 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6b6e49ea2ae2977f763ada660d1cabd53d9cf63bcdf272e7239b90b0b4b6f39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.969864 kubelet[2808]: E0514 00:19:14.969833 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6b6e49ea2ae2977f763ada660d1cabd53d9cf63bcdf272e7239b90b0b4b6f39\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85c77fb996-vjqrg" May 14 00:19:14.969906 kubelet[2808]: E0514 00:19:14.969862 2808 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6b6e49ea2ae2977f763ada660d1cabd53d9cf63bcdf272e7239b90b0b4b6f39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85c77fb996-vjqrg" May 14 00:19:14.971750 kubelet[2808]: E0514 00:19:14.970015 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85c77fb996-vjqrg_calico-apiserver(53393edc-5a1c-4194-8089-62b99b6d22c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85c77fb996-vjqrg_calico-apiserver(53393edc-5a1c-4194-8089-62b99b6d22c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e6b6e49ea2ae2977f763ada660d1cabd53d9cf63bcdf272e7239b90b0b4b6f39\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85c77fb996-vjqrg" podUID="53393edc-5a1c-4194-8089-62b99b6d22c3" May 14 00:19:14.976021 containerd[1482]: time="2025-05-14T00:19:14.975971166Z" level=error msg="Failed to destroy network for sandbox \"46f0404cf39466abf0180b3c02fcb8cdd9401332095d3ee7253907d2e0a017dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.979199 containerd[1482]: time="2025-05-14T00:19:14.979148947Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85c77fb996-7pqlj,Uid:119d07b0-5223-402f-be8b-608f7867493d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"46f0404cf39466abf0180b3c02fcb8cdd9401332095d3ee7253907d2e0a017dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.979655 kubelet[2808]: E0514 00:19:14.979442 2808 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46f0404cf39466abf0180b3c02fcb8cdd9401332095d3ee7253907d2e0a017dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.979655 kubelet[2808]: E0514 00:19:14.979536 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46f0404cf39466abf0180b3c02fcb8cdd9401332095d3ee7253907d2e0a017dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85c77fb996-7pqlj" May 14 00:19:14.979655 kubelet[2808]: 
E0514 00:19:14.979564 2808 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46f0404cf39466abf0180b3c02fcb8cdd9401332095d3ee7253907d2e0a017dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85c77fb996-7pqlj" May 14 00:19:14.979837 kubelet[2808]: E0514 00:19:14.979616 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85c77fb996-7pqlj_calico-apiserver(119d07b0-5223-402f-be8b-608f7867493d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85c77fb996-7pqlj_calico-apiserver(119d07b0-5223-402f-be8b-608f7867493d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"46f0404cf39466abf0180b3c02fcb8cdd9401332095d3ee7253907d2e0a017dd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85c77fb996-7pqlj" podUID="119d07b0-5223-402f-be8b-608f7867493d" May 14 00:19:14.980384 containerd[1482]: time="2025-05-14T00:19:14.980306508Z" level=error msg="Failed to destroy network for sandbox \"cb9cdab2ccbbc5812f19bc3111bd437ad501650bd3130939ab50f8c6e080ca4b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.982288 containerd[1482]: time="2025-05-14T00:19:14.982120501Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6svwf,Uid:2846908e-da51-4861-b641-78bf7c75d90f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb9cdab2ccbbc5812f19bc3111bd437ad501650bd3130939ab50f8c6e080ca4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.982710 kubelet[2808]: E0514 00:19:14.982501 2808 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb9cdab2ccbbc5812f19bc3111bd437ad501650bd3130939ab50f8c6e080ca4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 00:19:14.982710 kubelet[2808]: E0514 00:19:14.982588 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb9cdab2ccbbc5812f19bc3111bd437ad501650bd3130939ab50f8c6e080ca4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6svwf" May 14 00:19:14.982710 kubelet[2808]: E0514 00:19:14.982619 2808 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb9cdab2ccbbc5812f19bc3111bd437ad501650bd3130939ab50f8c6e080ca4b\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6svwf" May 14 00:19:14.982899 kubelet[2808]: E0514 00:19:14.982670 2808 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-6svwf_calico-system(2846908e-da51-4861-b641-78bf7c75d90f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-6svwf_calico-system(2846908e-da51-4861-b641-78bf7c75d90f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cb9cdab2ccbbc5812f19bc3111bd437ad501650bd3130939ab50f8c6e080ca4b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-6svwf" podUID="2846908e-da51-4861-b641-78bf7c75d90f" May 14 00:19:15.766033 systemd[1]: run-netns-cni\x2d966ecf4a\x2ded4c\x2dc95f\x2d53da\x2d25f9b48cfafa.mount: Deactivated successfully. May 14 00:19:15.766359 systemd[1]: run-netns-cni\x2dd5cc74ec\x2d63e8\x2d091f\x2d4745\x2d08ab5584f4bd.mount: Deactivated successfully. May 14 00:19:19.572537 kubelet[2808]: I0514 00:19:19.572004 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 00:19:24.216830 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1424652983.mount: Deactivated successfully. May 14 00:19:24.256236 containerd[1482]: time="2025-05-14T00:19:24.256115180Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:24.258350 containerd[1482]: time="2025-05-14T00:19:24.258072682Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 14 00:19:24.260788 containerd[1482]: time="2025-05-14T00:19:24.259736663Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:24.264239 containerd[1482]: time="2025-05-14T00:19:24.264170310Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:24.265358 containerd[1482]: time="2025-05-14T00:19:24.265312031Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 9.519442119s" May 14 00:19:24.266077 containerd[1482]: time="2025-05-14T00:19:24.266049454Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 14 00:19:24.296204 containerd[1482]: time="2025-05-14T00:19:24.296146840Z" level=info msg="CreateContainer within sandbox \"53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 14 00:19:24.336803 containerd[1482]: time="2025-05-14T00:19:24.336756925Z" level=info msg="Container 
5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:24.360408 containerd[1482]: time="2025-05-14T00:19:24.360217321Z" level=info msg="CreateContainer within sandbox \"53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\"" May 14 00:19:24.364538 containerd[1482]: time="2025-05-14T00:19:24.362876649Z" level=info msg="StartContainer for \"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\"" May 14 00:19:24.365214 containerd[1482]: time="2025-05-14T00:19:24.365190379Z" level=info msg="connecting to shim 5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592" address="unix:///run/containerd/s/f00e8545773c12780e5715b7dfbb658d032560c8fa9cf61fdf4550d9aa368bbf" protocol=ttrpc version=3 May 14 00:19:24.421739 systemd[1]: Started cri-containerd-5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592.scope - libcontainer container 5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592. May 14 00:19:24.508529 containerd[1482]: time="2025-05-14T00:19:24.508467944Z" level=info msg="StartContainer for \"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" returns successfully" May 14 00:19:24.621619 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 14 00:19:24.621878 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 14 00:19:24.810370 kubelet[2808]: I0514 00:19:24.810136 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-v8ssp" podStartSLOduration=1.195950598 podStartE2EDuration="26.810015856s" podCreationTimestamp="2025-05-14 00:18:58 +0000 UTC" firstStartedPulling="2025-05-14 00:18:58.653230054 +0000 UTC m=+22.374923887" lastFinishedPulling="2025-05-14 00:19:24.267295312 +0000 UTC m=+47.988989145" observedRunningTime="2025-05-14 00:19:24.805841105 +0000 UTC m=+48.527534958" watchObservedRunningTime="2025-05-14 00:19:24.810015856 +0000 UTC m=+48.531709689" May 14 00:19:24.949645 containerd[1482]: time="2025-05-14T00:19:24.948916189Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"230b881e15b297216b95f53f1df92c5aca56063fbf5901fdd797dfd3d03445a8\" pid:3773 exit_status:1 exited_at:{seconds:1747181964 nanos:947815094}" May 14 00:19:25.501045 containerd[1482]: time="2025-05-14T00:19:25.500837628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6svwf,Uid:2846908e-da51-4861-b641-78bf7c75d90f,Namespace:calico-system,Attempt:0,}" May 14 00:19:25.502275 containerd[1482]: time="2025-05-14T00:19:25.501107565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84c5894965-x5l79,Uid:46692647-8abf-4b93-b4e7-539aa57f3e65,Namespace:calico-system,Attempt:0,}" May 14 00:19:25.503113 containerd[1482]: time="2025-05-14T00:19:25.503038957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85c77fb996-7pqlj,Uid:119d07b0-5223-402f-be8b-608f7867493d,Namespace:calico-apiserver,Attempt:0,}" May 14 00:19:25.989645 systemd-networkd[1390]: cali8aea1ff0a96: Link UP May 14 00:19:25.989996 systemd-networkd[1390]: cali8aea1ff0a96: Gained carrier May 14 00:19:26.023541 containerd[1482]: 2025-05-14 00:19:25.628 [INFO][3800] cni-plugin/utils.go 100: File 
/var/lib/calico/mtu does not exist May 14 00:19:26.023541 containerd[1482]: 2025-05-14 00:19:25.672 [INFO][3800] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0 csi-node-driver- calico-system 2846908e-da51-4861-b641-78bf7c75d90f 609 0 2025-05-14 00:18:58 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b7b4b9d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-n-4643e7afba.novalocal csi-node-driver-6svwf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8aea1ff0a96 [] []}} ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Namespace="calico-system" Pod="csi-node-driver-6svwf" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-" May 14 00:19:26.023541 containerd[1482]: 2025-05-14 00:19:25.672 [INFO][3800] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Namespace="calico-system" Pod="csi-node-driver-6svwf" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" May 14 00:19:26.023541 containerd[1482]: 2025-05-14 00:19:25.806 [INFO][3832] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" HandleID="k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.864 [INFO][3832] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" HandleID="k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003194b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-4643e7afba.novalocal", "pod":"csi-node-driver-6svwf", "timestamp":"2025-05-14 00:19:25.806441626 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4643e7afba.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.864 [INFO][3832] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.864 [INFO][3832] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
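Note on the pod_startup_latency_tracker entry above for calico-node-v8ssp: the figures are internally consistent. podStartE2EDuration=26.810015856s is the watch-observed running time minus the pod creation timestamp (00:19:24.810015856 − 00:18:58), and podStartSLOduration=1.195950598s is that same span with the image-pull window removed: lastFinishedPulling − firstStartedPulling = 00:19:24.267295312 − 00:18:58.653230054 = 25.614065258s, and 26.810015856 − 25.614065258 = 1.195950598s, exactly the value kubelet reports.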
May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.864 [INFO][3832] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4643e7afba.novalocal' May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.873 [INFO][3832] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.898 [INFO][3832] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.919 [INFO][3832] ipam/ipam.go 489: Trying affinity for 192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.922 [INFO][3832] ipam/ipam.go 155: Attempting to load block cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.023901 containerd[1482]: 2025-05-14 00:19:25.926 [INFO][3832] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.024344 containerd[1482]: 2025-05-14 00:19:25.927 [INFO][3832] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.64/26 handle="k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.024344 containerd[1482]: 2025-05-14 00:19:25.929 [INFO][3832] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586 May 14 00:19:26.024344 containerd[1482]: 2025-05-14 00:19:25.937 [INFO][3832] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.13.64/26 handle="k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.024344 containerd[1482]: 2025-05-14 00:19:25.946 [INFO][3832] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.13.65/26] block=192.168.13.64/26 handle="k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.024344 containerd[1482]: 2025-05-14 00:19:25.946 [INFO][3832] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.65/26] handle="k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.024344 containerd[1482]: 2025-05-14 00:19:25.946 [INFO][3832] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 00:19:26.024344 containerd[1482]: 2025-05-14 00:19:25.946 [INFO][3832] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.13.65/26] IPv6=[] ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" HandleID="k8s-pod-network.124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" May 14 00:19:26.024606 containerd[1482]: 2025-05-14 00:19:25.953 [INFO][3800] cni-plugin/k8s.go 386: Populated endpoint ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Namespace="calico-system" Pod="csi-node-driver-6svwf" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2846908e-da51-4861-b641-78bf7c75d90f", ResourceVersion:"609", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"", Pod:"csi-node-driver-6svwf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.13.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8aea1ff0a96", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:26.024697 containerd[1482]: 2025-05-14 00:19:25.954 [INFO][3800] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.13.65/32] ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Namespace="calico-system" Pod="csi-node-driver-6svwf" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" May 14 00:19:26.024697 containerd[1482]: 2025-05-14 00:19:25.954 [INFO][3800] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8aea1ff0a96 ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Namespace="calico-system" Pod="csi-node-driver-6svwf" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" May 14 00:19:26.024697 containerd[1482]: 2025-05-14 00:19:25.985 [INFO][3800] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Namespace="calico-system" Pod="csi-node-driver-6svwf" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" May 14 00:19:26.025353 containerd[1482]: 2025-05-14 00:19:25.990 [INFO][3800] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Namespace="calico-system" Pod="csi-node-driver-6svwf" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2846908e-da51-4861-b641-78bf7c75d90f", ResourceVersion:"609", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586", Pod:"csi-node-driver-6svwf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.13.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8aea1ff0a96", MAC:"fe:7a:a6:7b:3d:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:26.025486 containerd[1482]: 2025-05-14 00:19:26.013 [INFO][3800] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" Namespace="calico-system" Pod="csi-node-driver-6svwf" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-csi--node--driver--6svwf-eth0" May 14 00:19:26.063563 systemd-networkd[1390]: cali6122ab79569: Link UP May 14 00:19:26.073924 systemd-networkd[1390]: cali6122ab79569: Gained carrier May 14 00:19:26.123130 containerd[1482]: 2025-05-14 00:19:25.624 [INFO][3798] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 14 00:19:26.123130 containerd[1482]: 2025-05-14 00:19:25.672 [INFO][3798] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0 calico-kube-controllers-84c5894965- calico-system 46692647-8abf-4b93-b4e7-539aa57f3e65 704 0 2025-05-14 00:18:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:84c5894965 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-4643e7afba.novalocal calico-kube-controllers-84c5894965-x5l79 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6122ab79569 [] []}} ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Namespace="calico-system" Pod="calico-kube-controllers-84c5894965-x5l79" 
WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-" May 14 00:19:26.123130 containerd[1482]: 2025-05-14 00:19:25.672 [INFO][3798] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Namespace="calico-system" Pod="calico-kube-controllers-84c5894965-x5l79" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" May 14 00:19:26.123130 containerd[1482]: 2025-05-14 00:19:25.845 [INFO][3837] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" HandleID="k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.908 [INFO][3837] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" HandleID="k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000289360), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-4643e7afba.novalocal", "pod":"calico-kube-controllers-84c5894965-x5l79", "timestamp":"2025-05-14 00:19:25.844958033 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4643e7afba.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.911 [INFO][3837] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.946 [INFO][3837] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.946 [INFO][3837] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4643e7afba.novalocal' May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.951 [INFO][3837] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.960 [INFO][3837] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.967 [INFO][3837] ipam/ipam.go 489: Trying affinity for 192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.971 [INFO][3837] ipam/ipam.go 155: Attempting to load block cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123476 containerd[1482]: 2025-05-14 00:19:25.975 [INFO][3837] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123934 containerd[1482]: 2025-05-14 00:19:25.975 [INFO][3837] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.64/26 handle="k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123934 containerd[1482]: 2025-05-14 00:19:25.981 [INFO][3837] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60 May 14 00:19:26.123934 containerd[1482]: 2025-05-14 00:19:25.995 [INFO][3837] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.13.64/26 handle="k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123934 containerd[1482]: 2025-05-14 00:19:26.012 [INFO][3837] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.13.66/26] block=192.168.13.64/26 handle="k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123934 containerd[1482]: 2025-05-14 00:19:26.012 [INFO][3837] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.66/26] handle="k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.123934 containerd[1482]: 2025-05-14 00:19:26.012 [INFO][3837] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
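Each Populated endpoint entry attaches Profiles of the form kns.<namespace> and ksa.<namespace>.<service account>, e.g. kns.calico-system and ksa.calico-system.csi-node-driver on the endpoint above. A trivial sketch of that naming pattern as it appears in these entries, meant as a reading aid rather than Calico's implementation:

package main

import "fmt"

// profilesFor mirrors the profile names visible in the WorkloadEndpoint
// entries: one per namespace, one per service account within it.
func profilesFor(namespace, serviceAccount string) []string {
	return []string{
		"kns." + namespace,
		"ksa." + namespace + "." + serviceAccount,
	}
}

func main() {
	fmt.Println(profilesFor("calico-system", "csi-node-driver"))
	// [kns.calico-system ksa.calico-system.csi-node-driver]
}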
May 14 00:19:26.123934 containerd[1482]: 2025-05-14 00:19:26.012 [INFO][3837] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.13.66/26] IPv6=[] ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" HandleID="k8s-pod-network.a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" May 14 00:19:26.124225 containerd[1482]: 2025-05-14 00:19:26.035 [INFO][3798] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Namespace="calico-system" Pod="calico-kube-controllers-84c5894965-x5l79" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0", GenerateName:"calico-kube-controllers-84c5894965-", Namespace:"calico-system", SelfLink:"", UID:"46692647-8abf-4b93-b4e7-539aa57f3e65", ResourceVersion:"704", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"84c5894965", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"", Pod:"calico-kube-controllers-84c5894965-x5l79", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.13.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6122ab79569", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:26.124321 containerd[1482]: 2025-05-14 00:19:26.038 [INFO][3798] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.13.66/32] ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Namespace="calico-system" Pod="calico-kube-controllers-84c5894965-x5l79" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" May 14 00:19:26.124321 containerd[1482]: 2025-05-14 00:19:26.038 [INFO][3798] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6122ab79569 ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Namespace="calico-system" Pod="calico-kube-controllers-84c5894965-x5l79" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" May 14 00:19:26.124321 containerd[1482]: 2025-05-14 00:19:26.076 [INFO][3798] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Namespace="calico-system" Pod="calico-kube-controllers-84c5894965-x5l79" 
WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" May 14 00:19:26.124436 containerd[1482]: 2025-05-14 00:19:26.077 [INFO][3798] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Namespace="calico-system" Pod="calico-kube-controllers-84c5894965-x5l79" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0", GenerateName:"calico-kube-controllers-84c5894965-", Namespace:"calico-system", SelfLink:"", UID:"46692647-8abf-4b93-b4e7-539aa57f3e65", ResourceVersion:"704", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"84c5894965", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60", Pod:"calico-kube-controllers-84c5894965-x5l79", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.13.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6122ab79569", MAC:"5e:ca:06:45:1e:fb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:26.125954 containerd[1482]: 2025-05-14 00:19:26.115 [INFO][3798] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" Namespace="calico-system" Pod="calico-kube-controllers-84c5894965-x5l79" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--kube--controllers--84c5894965--x5l79-eth0" May 14 00:19:26.147884 systemd-networkd[1390]: cali98d5a11582b: Link UP May 14 00:19:26.150594 systemd-networkd[1390]: cali98d5a11582b: Gained carrier May 14 00:19:26.183636 containerd[1482]: 2025-05-14 00:19:25.631 [INFO][3806] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 14 00:19:26.183636 containerd[1482]: 2025-05-14 00:19:25.681 [INFO][3806] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0 calico-apiserver-85c77fb996- calico-apiserver 119d07b0-5223-402f-be8b-608f7867493d 703 0 2025-05-14 00:18:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85c77fb996 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-4643e7afba.novalocal 
calico-apiserver-85c77fb996-7pqlj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali98d5a11582b [] []}} ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-7pqlj" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-" May 14 00:19:26.183636 containerd[1482]: 2025-05-14 00:19:25.682 [INFO][3806] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-7pqlj" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" May 14 00:19:26.183636 containerd[1482]: 2025-05-14 00:19:25.897 [INFO][3839] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" HandleID="k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:25.923 [INFO][3839] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" HandleID="k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000517e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-4643e7afba.novalocal", "pod":"calico-apiserver-85c77fb996-7pqlj", "timestamp":"2025-05-14 00:19:25.897909026 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4643e7afba.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:25.924 [INFO][3839] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:26.026 [INFO][3839] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:26.026 [INFO][3839] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4643e7afba.novalocal' May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:26.035 [INFO][3839] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:26.047 [INFO][3839] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:26.055 [INFO][3839] ipam/ipam.go 489: Trying affinity for 192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:26.060 [INFO][3839] ipam/ipam.go 155: Attempting to load block cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.183980 containerd[1482]: 2025-05-14 00:19:26.066 [INFO][3839] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.184252 containerd[1482]: 2025-05-14 00:19:26.066 [INFO][3839] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.64/26 handle="k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.184252 containerd[1482]: 2025-05-14 00:19:26.069 [INFO][3839] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd May 14 00:19:26.184252 containerd[1482]: 2025-05-14 00:19:26.089 [INFO][3839] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.13.64/26 handle="k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.184252 containerd[1482]: 2025-05-14 00:19:26.117 [INFO][3839] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.13.67/26] block=192.168.13.64/26 handle="k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.184252 containerd[1482]: 2025-05-14 00:19:26.119 [INFO][3839] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.67/26] handle="k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:26.184252 containerd[1482]: 2025-05-14 00:19:26.119 [INFO][3839] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
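The dataplane_linux.go entries name the host side of each pod's veth pair (cali8aea1ff0a96, cali6122ab79569, cali98d5a11582b) and record that IPv4 forwarding is disabled on it before the endpoint is written to the datastore. A rough illustration of that host-side plumbing using the third-party github.com/vishvananda/netlink package; the interface name is taken from the log, everything else is an assumption rather than the plugin's actual code:

package main

import (
	"fmt"
	"os"

	"github.com/vishvananda/netlink"
)

// setupHostVeth creates the host end of a pod veth pair, brings it up, and
// disables IPv4 forwarding on it, roughly what the "Setting the host side
// veth name" and "Disabling IPv4 forwarding" entries describe.
func setupHostVeth(hostName string) error {
	veth := &netlink.Veth{
		LinkAttrs: netlink.LinkAttrs{Name: hostName},
		PeerName:  "eth0", // moved into the pod's network namespace in the real flow
	}
	if err := netlink.LinkAdd(veth); err != nil {
		return fmt.Errorf("create %s: %w", hostName, err)
	}
	if err := netlink.LinkSetUp(veth); err != nil {
		return fmt.Errorf("bring up %s: %w", hostName, err)
	}
	sysctl := fmt.Sprintf("/proc/sys/net/ipv4/conf/%s/forwarding", hostName)
	return os.WriteFile(sysctl, []byte("0\n"), 0o644)
}

func main() {
	if err := setupHostVeth("cali8aea1ff0a96"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}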
May 14 00:19:26.184252 containerd[1482]: 2025-05-14 00:19:26.120 [INFO][3839] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.13.67/26] IPv6=[] ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" HandleID="k8s-pod-network.5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" May 14 00:19:26.184619 containerd[1482]: 2025-05-14 00:19:26.127 [INFO][3806] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-7pqlj" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0", GenerateName:"calico-apiserver-85c77fb996-", Namespace:"calico-apiserver", SelfLink:"", UID:"119d07b0-5223-402f-be8b-608f7867493d", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85c77fb996", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"", Pod:"calico-apiserver-85c77fb996-7pqlj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.13.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali98d5a11582b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:26.184696 containerd[1482]: 2025-05-14 00:19:26.127 [INFO][3806] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.13.67/32] ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-7pqlj" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" May 14 00:19:26.184696 containerd[1482]: 2025-05-14 00:19:26.128 [INFO][3806] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali98d5a11582b ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-7pqlj" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" May 14 00:19:26.184696 containerd[1482]: 2025-05-14 00:19:26.151 [INFO][3806] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-7pqlj" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" May 14 00:19:26.184787 
containerd[1482]: 2025-05-14 00:19:26.151 [INFO][3806] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-7pqlj" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0", GenerateName:"calico-apiserver-85c77fb996-", Namespace:"calico-apiserver", SelfLink:"", UID:"119d07b0-5223-402f-be8b-608f7867493d", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85c77fb996", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd", Pod:"calico-apiserver-85c77fb996-7pqlj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.13.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali98d5a11582b", MAC:"c2:71:5e:dd:cf:9c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:26.184864 containerd[1482]: 2025-05-14 00:19:26.174 [INFO][3806] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-7pqlj" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--7pqlj-eth0" May 14 00:19:26.219001 containerd[1482]: time="2025-05-14T00:19:26.216723563Z" level=info msg="connecting to shim 124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586" address="unix:///run/containerd/s/25813dbf66bf0cc386be329c4ce1076889d5bb85c7bb29df24b1a477f90071f3" namespace=k8s.io protocol=ttrpc version=3 May 14 00:19:26.238271 containerd[1482]: time="2025-05-14T00:19:26.237844290Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"15302ecabd45bbd2ba1ea061068b30bd466e7562c1db289cea6091ef639eb58f\" pid:3865 exit_status:1 exited_at:{seconds:1747181966 nanos:237085677}" May 14 00:19:26.350568 containerd[1482]: time="2025-05-14T00:19:26.349730120Z" level=info msg="connecting to shim a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60" address="unix:///run/containerd/s/4ec3e6d53b067352607d9725aee6df450abaef87664ac21f006dc1e19ae65689" namespace=k8s.io protocol=ttrpc version=3 May 14 00:19:26.369934 containerd[1482]: time="2025-05-14T00:19:26.369826876Z" level=info msg="connecting to shim 5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd" 
address="unix:///run/containerd/s/ce26ba5e6cf1f4dc6f8de5123de1885be2e90a5ce1ec69662926fd43f3fdfc57" namespace=k8s.io protocol=ttrpc version=3 May 14 00:19:26.396022 systemd[1]: Started cri-containerd-124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586.scope - libcontainer container 124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586. May 14 00:19:26.568838 systemd[1]: Started cri-containerd-a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60.scope - libcontainer container a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60. May 14 00:19:26.591847 systemd[1]: Started cri-containerd-5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd.scope - libcontainer container 5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd. May 14 00:19:26.703628 containerd[1482]: time="2025-05-14T00:19:26.703355030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6svwf,Uid:2846908e-da51-4861-b641-78bf7c75d90f,Namespace:calico-system,Attempt:0,} returns sandbox id \"124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586\"" May 14 00:19:26.724810 containerd[1482]: time="2025-05-14T00:19:26.724753358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 14 00:19:26.755840 containerd[1482]: time="2025-05-14T00:19:26.755774706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84c5894965-x5l79,Uid:46692647-8abf-4b93-b4e7-539aa57f3e65,Namespace:calico-system,Attempt:0,} returns sandbox id \"a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60\"" May 14 00:19:26.804854 containerd[1482]: time="2025-05-14T00:19:26.804787632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85c77fb996-7pqlj,Uid:119d07b0-5223-402f-be8b-608f7867493d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd\"" May 14 00:19:26.966606 kernel: bpftool[4146]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 14 00:19:27.384671 systemd-networkd[1390]: vxlan.calico: Link UP May 14 00:19:27.384684 systemd-networkd[1390]: vxlan.calico: Gained carrier May 14 00:19:27.484690 systemd-networkd[1390]: cali6122ab79569: Gained IPv6LL May 14 00:19:27.740887 systemd-networkd[1390]: cali8aea1ff0a96: Gained IPv6LL May 14 00:19:27.814130 systemd-networkd[1390]: cali98d5a11582b: Gained IPv6LL May 14 00:19:28.444811 systemd-networkd[1390]: vxlan.calico: Gained IPv6LL May 14 00:19:28.526602 containerd[1482]: time="2025-05-14T00:19:28.524387213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vlxwd,Uid:ad63d8da-8e9c-4dad-bf11-f8f68208946c,Namespace:kube-system,Attempt:0,}" May 14 00:19:28.533076 containerd[1482]: time="2025-05-14T00:19:28.524443970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7q68r,Uid:668fd9d0-96d4-4c00-9f43-69d0f9229bcd,Namespace:kube-system,Attempt:0,}" May 14 00:19:28.849436 systemd-networkd[1390]: calic4e09703030: Link UP May 14 00:19:28.850447 systemd-networkd[1390]: calic4e09703030: Gained carrier May 14 00:19:28.877175 containerd[1482]: 2025-05-14 00:19:28.680 [INFO][4239] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0 coredns-7db6d8ff4d- kube-system ad63d8da-8e9c-4dad-bf11-f8f68208946c 699 0 2025-05-14 00:18:51 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-4643e7afba.novalocal coredns-7db6d8ff4d-vlxwd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic4e09703030 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vlxwd" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-" May 14 00:19:28.877175 containerd[1482]: 2025-05-14 00:19:28.681 [INFO][4239] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vlxwd" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" May 14 00:19:28.877175 containerd[1482]: 2025-05-14 00:19:28.759 [INFO][4260] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" HandleID="k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.771 [INFO][4260] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" HandleID="k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ed630), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-4643e7afba.novalocal", "pod":"coredns-7db6d8ff4d-vlxwd", "timestamp":"2025-05-14 00:19:28.759048225 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4643e7afba.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.771 [INFO][4260] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.771 [INFO][4260] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.771 [INFO][4260] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4643e7afba.novalocal' May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.773 [INFO][4260] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.783 [INFO][4260] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.789 [INFO][4260] ipam/ipam.go 489: Trying affinity for 192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.791 [INFO][4260] ipam/ipam.go 155: Attempting to load block cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.878098 containerd[1482]: 2025-05-14 00:19:28.794 [INFO][4260] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.880386 containerd[1482]: 2025-05-14 00:19:28.795 [INFO][4260] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.64/26 handle="k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.880386 containerd[1482]: 2025-05-14 00:19:28.796 [INFO][4260] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868 May 14 00:19:28.880386 containerd[1482]: 2025-05-14 00:19:28.830 [INFO][4260] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.13.64/26 handle="k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.880386 containerd[1482]: 2025-05-14 00:19:28.842 [INFO][4260] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.13.68/26] block=192.168.13.64/26 handle="k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.880386 containerd[1482]: 2025-05-14 00:19:28.842 [INFO][4260] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.68/26] handle="k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.880386 containerd[1482]: 2025-05-14 00:19:28.842 [INFO][4260] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 00:19:28.880386 containerd[1482]: 2025-05-14 00:19:28.843 [INFO][4260] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.13.68/26] IPv6=[] ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" HandleID="k8s-pod-network.a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" May 14 00:19:28.880714 containerd[1482]: 2025-05-14 00:19:28.845 [INFO][4239] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vlxwd" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"ad63d8da-8e9c-4dad-bf11-f8f68208946c", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-vlxwd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic4e09703030", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:28.880714 containerd[1482]: 2025-05-14 00:19:28.846 [INFO][4239] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.13.68/32] ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vlxwd" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" May 14 00:19:28.880714 containerd[1482]: 2025-05-14 00:19:28.846 [INFO][4239] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic4e09703030 ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vlxwd" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" May 14 00:19:28.880714 containerd[1482]: 2025-05-14 00:19:28.849 [INFO][4239] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" 
Namespace="kube-system" Pod="coredns-7db6d8ff4d-vlxwd" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" May 14 00:19:28.880714 containerd[1482]: 2025-05-14 00:19:28.849 [INFO][4239] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vlxwd" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"ad63d8da-8e9c-4dad-bf11-f8f68208946c", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868", Pod:"coredns-7db6d8ff4d-vlxwd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic4e09703030", MAC:"06:7e:bb:ac:c7:c3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:28.880714 containerd[1482]: 2025-05-14 00:19:28.873 [INFO][4239] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vlxwd" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--vlxwd-eth0" May 14 00:19:28.949390 systemd-networkd[1390]: calia7ac2929dd9: Link UP May 14 00:19:28.951658 systemd-networkd[1390]: calia7ac2929dd9: Gained carrier May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.693 [INFO][4237] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0 coredns-7db6d8ff4d- kube-system 668fd9d0-96d4-4c00-9f43-69d0f9229bcd 706 0 2025-05-14 00:18:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-4643e7afba.novalocal coredns-7db6d8ff4d-7q68r eth0 coredns [] [] 
[kns.kube-system ksa.kube-system.coredns] calia7ac2929dd9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7q68r" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.694 [INFO][4237] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7q68r" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.775 [INFO][4265] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" HandleID="k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.788 [INFO][4265] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" HandleID="k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003c8c70), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-4643e7afba.novalocal", "pod":"coredns-7db6d8ff4d-7q68r", "timestamp":"2025-05-14 00:19:28.775685622 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4643e7afba.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.788 [INFO][4265] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.843 [INFO][4265] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.843 [INFO][4265] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4643e7afba.novalocal' May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.853 [INFO][4265] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.883 [INFO][4265] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.896 [INFO][4265] ipam/ipam.go 489: Trying affinity for 192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.901 [INFO][4265] ipam/ipam.go 155: Attempting to load block cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.907 [INFO][4265] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.907 [INFO][4265] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.64/26 handle="k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.910 [INFO][4265] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.915 [INFO][4265] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.13.64/26 handle="k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.927 [INFO][4265] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.13.69/26] block=192.168.13.64/26 handle="k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.927 [INFO][4265] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.69/26] handle="k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.927 [INFO][4265] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 00:19:28.966621 containerd[1482]: 2025-05-14 00:19:28.927 [INFO][4265] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.13.69/26] IPv6=[] ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" HandleID="k8s-pod-network.13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" May 14 00:19:28.967449 containerd[1482]: 2025-05-14 00:19:28.936 [INFO][4237] cni-plugin/k8s.go 386: Populated endpoint ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7q68r" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"668fd9d0-96d4-4c00-9f43-69d0f9229bcd", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-7q68r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia7ac2929dd9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:28.967449 containerd[1482]: 2025-05-14 00:19:28.936 [INFO][4237] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.13.69/32] ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7q68r" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" May 14 00:19:28.967449 containerd[1482]: 2025-05-14 00:19:28.936 [INFO][4237] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia7ac2929dd9 ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7q68r" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" May 14 00:19:28.967449 containerd[1482]: 2025-05-14 00:19:28.939 [INFO][4237] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" 
Namespace="kube-system" Pod="coredns-7db6d8ff4d-7q68r" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" May 14 00:19:28.967449 containerd[1482]: 2025-05-14 00:19:28.941 [INFO][4237] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7q68r" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"668fd9d0-96d4-4c00-9f43-69d0f9229bcd", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc", Pod:"coredns-7db6d8ff4d-7q68r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia7ac2929dd9", MAC:"ce:c8:e0:7c:44:5a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:28.967449 containerd[1482]: 2025-05-14 00:19:28.959 [INFO][4237] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7q68r" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-coredns--7db6d8ff4d--7q68r-eth0" May 14 00:19:28.973154 containerd[1482]: time="2025-05-14T00:19:28.973036440Z" level=info msg="connecting to shim a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868" address="unix:///run/containerd/s/89e10894f349fa7303054ec17b93f8b487829275ec7e17f2378a6784c5347316" namespace=k8s.io protocol=ttrpc version=3 May 14 00:19:29.044206 containerd[1482]: time="2025-05-14T00:19:29.043030249Z" level=info msg="connecting to shim 13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc" address="unix:///run/containerd/s/219bc8a19f95cdd10fd9a9f152cf6d9d6d3a8f5c3da17b23eafcf0b4772b16f2" namespace=k8s.io protocol=ttrpc version=3 May 14 00:19:29.043919 systemd[1]: Started cri-containerd-a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868.scope - 
libcontainer container a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868. May 14 00:19:29.085655 systemd[1]: Started cri-containerd-13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc.scope - libcontainer container 13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc. May 14 00:19:29.185986 containerd[1482]: time="2025-05-14T00:19:29.184192331Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vlxwd,Uid:ad63d8da-8e9c-4dad-bf11-f8f68208946c,Namespace:kube-system,Attempt:0,} returns sandbox id \"a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868\"" May 14 00:19:29.200185 containerd[1482]: time="2025-05-14T00:19:29.200133423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7q68r,Uid:668fd9d0-96d4-4c00-9f43-69d0f9229bcd,Namespace:kube-system,Attempt:0,} returns sandbox id \"13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc\"" May 14 00:19:29.209846 containerd[1482]: time="2025-05-14T00:19:29.209788212Z" level=info msg="CreateContainer within sandbox \"a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 14 00:19:29.212470 containerd[1482]: time="2025-05-14T00:19:29.212422854Z" level=info msg="CreateContainer within sandbox \"13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 14 00:19:29.236372 containerd[1482]: time="2025-05-14T00:19:29.236058769Z" level=info msg="Container a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:29.241251 containerd[1482]: time="2025-05-14T00:19:29.241199872Z" level=info msg="Container 0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:29.251638 containerd[1482]: time="2025-05-14T00:19:29.251539366Z" level=info msg="CreateContainer within sandbox \"13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98\"" May 14 00:19:29.254026 containerd[1482]: time="2025-05-14T00:19:29.253803303Z" level=info msg="StartContainer for \"a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98\"" May 14 00:19:29.256303 containerd[1482]: time="2025-05-14T00:19:29.256184539Z" level=info msg="connecting to shim a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98" address="unix:///run/containerd/s/219bc8a19f95cdd10fd9a9f152cf6d9d6d3a8f5c3da17b23eafcf0b4772b16f2" protocol=ttrpc version=3 May 14 00:19:29.260462 containerd[1482]: time="2025-05-14T00:19:29.260329093Z" level=info msg="CreateContainer within sandbox \"a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542\"" May 14 00:19:29.263346 containerd[1482]: time="2025-05-14T00:19:29.262705781Z" level=info msg="StartContainer for \"0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542\"" May 14 00:19:29.266346 containerd[1482]: time="2025-05-14T00:19:29.266271941Z" level=info msg="connecting to shim 0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542" address="unix:///run/containerd/s/89e10894f349fa7303054ec17b93f8b487829275ec7e17f2378a6784c5347316" protocol=ttrpc version=3 May 14 
00:19:29.290707 systemd[1]: Started cri-containerd-a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98.scope - libcontainer container a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98. May 14 00:19:29.325921 systemd[1]: Started cri-containerd-0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542.scope - libcontainer container 0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542. May 14 00:19:29.360951 containerd[1482]: time="2025-05-14T00:19:29.360909054Z" level=info msg="StartContainer for \"a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98\" returns successfully" May 14 00:19:29.381955 containerd[1482]: time="2025-05-14T00:19:29.381548187Z" level=info msg="StartContainer for \"0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542\" returns successfully" May 14 00:19:29.500158 containerd[1482]: time="2025-05-14T00:19:29.500113664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85c77fb996-vjqrg,Uid:53393edc-5a1c-4194-8089-62b99b6d22c3,Namespace:calico-apiserver,Attempt:0,}" May 14 00:19:29.666855 systemd-networkd[1390]: caliefbde26820a: Link UP May 14 00:19:29.667714 systemd-networkd[1390]: caliefbde26820a: Gained carrier May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.569 [INFO][4460] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0 calico-apiserver-85c77fb996- calico-apiserver 53393edc-5a1c-4194-8089-62b99b6d22c3 707 0 2025-05-14 00:18:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85c77fb996 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-4643e7afba.novalocal calico-apiserver-85c77fb996-vjqrg eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliefbde26820a [] []}} ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-vjqrg" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.570 [INFO][4460] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-vjqrg" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.612 [INFO][4472] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" HandleID="k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.625 [INFO][4472] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" HandleID="k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000384ae0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-4643e7afba.novalocal", "pod":"calico-apiserver-85c77fb996-vjqrg", "timestamp":"2025-05-14 00:19:29.612939004 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4643e7afba.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.625 [INFO][4472] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.625 [INFO][4472] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.625 [INFO][4472] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4643e7afba.novalocal' May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.628 [INFO][4472] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.632 [INFO][4472] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.638 [INFO][4472] ipam/ipam.go 489: Trying affinity for 192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.640 [INFO][4472] ipam/ipam.go 155: Attempting to load block cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.643 [INFO][4472] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.64/26 host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.643 [INFO][4472] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.64/26 handle="k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.645 [INFO][4472] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34 May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.651 [INFO][4472] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.13.64/26 handle="k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.660 [INFO][4472] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.13.70/26] block=192.168.13.64/26 handle="k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.661 [INFO][4472] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.70/26] handle="k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" host="ci-4284-0-0-n-4643e7afba.novalocal" May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.661 [INFO][4472] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 00:19:29.687675 containerd[1482]: 2025-05-14 00:19:29.661 [INFO][4472] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.13.70/26] IPv6=[] ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" HandleID="k8s-pod-network.3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Workload="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" May 14 00:19:29.689493 containerd[1482]: 2025-05-14 00:19:29.663 [INFO][4460] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-vjqrg" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0", GenerateName:"calico-apiserver-85c77fb996-", Namespace:"calico-apiserver", SelfLink:"", UID:"53393edc-5a1c-4194-8089-62b99b6d22c3", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85c77fb996", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"", Pod:"calico-apiserver-85c77fb996-vjqrg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.13.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliefbde26820a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:29.689493 containerd[1482]: 2025-05-14 00:19:29.663 [INFO][4460] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.13.70/32] ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-vjqrg" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" May 14 00:19:29.689493 containerd[1482]: 2025-05-14 00:19:29.663 [INFO][4460] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliefbde26820a ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-vjqrg" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" May 14 00:19:29.689493 containerd[1482]: 2025-05-14 00:19:29.668 [INFO][4460] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-vjqrg" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" May 14 00:19:29.689493 
containerd[1482]: 2025-05-14 00:19:29.668 [INFO][4460] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-vjqrg" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0", GenerateName:"calico-apiserver-85c77fb996-", Namespace:"calico-apiserver", SelfLink:"", UID:"53393edc-5a1c-4194-8089-62b99b6d22c3", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 0, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85c77fb996", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4643e7afba.novalocal", ContainerID:"3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34", Pod:"calico-apiserver-85c77fb996-vjqrg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.13.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliefbde26820a", MAC:"46:32:84:06:d5:bd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 00:19:29.689493 containerd[1482]: 2025-05-14 00:19:29.682 [INFO][4460] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" Namespace="calico-apiserver" Pod="calico-apiserver-85c77fb996-vjqrg" WorkloadEndpoint="ci--4284--0--0--n--4643e7afba.novalocal-k8s-calico--apiserver--85c77fb996--vjqrg-eth0" May 14 00:19:29.732950 containerd[1482]: time="2025-05-14T00:19:29.732906190Z" level=info msg="connecting to shim 3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34" address="unix:///run/containerd/s/5f9cbf6d08c153fcbf440cdefcec6dc978f84ca339b67f5306267dbbd3638e6f" namespace=k8s.io protocol=ttrpc version=3 May 14 00:19:29.806928 systemd[1]: Started cri-containerd-3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34.scope - libcontainer container 3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34. 
May 14 00:19:29.939505 kubelet[2808]: I0514 00:19:29.939286 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-7q68r" podStartSLOduration=38.937102096 podStartE2EDuration="38.937102096s" podCreationTimestamp="2025-05-14 00:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:19:29.935294014 +0000 UTC m=+53.656987847" watchObservedRunningTime="2025-05-14 00:19:29.937102096 +0000 UTC m=+53.658795929" May 14 00:19:29.950169 containerd[1482]: time="2025-05-14T00:19:29.949993858Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85c77fb996-vjqrg,Uid:53393edc-5a1c-4194-8089-62b99b6d22c3,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34\"" May 14 00:19:29.980387 kubelet[2808]: I0514 00:19:29.980047 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-vlxwd" podStartSLOduration=38.980011011 podStartE2EDuration="38.980011011s" podCreationTimestamp="2025-05-14 00:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 00:19:29.978620934 +0000 UTC m=+53.700314787" watchObservedRunningTime="2025-05-14 00:19:29.980011011 +0000 UTC m=+53.701704844" May 14 00:19:30.173927 systemd-networkd[1390]: calic4e09703030: Gained IPv6LL May 14 00:19:30.492805 systemd-networkd[1390]: calia7ac2929dd9: Gained IPv6LL May 14 00:19:30.812847 systemd-networkd[1390]: caliefbde26820a: Gained IPv6LL May 14 00:19:31.183747 containerd[1482]: time="2025-05-14T00:19:31.182715069Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:31.184274 containerd[1482]: time="2025-05-14T00:19:31.184210325Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 14 00:19:31.185123 containerd[1482]: time="2025-05-14T00:19:31.185094583Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:31.187885 containerd[1482]: time="2025-05-14T00:19:31.187858757Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:31.188753 containerd[1482]: time="2025-05-14T00:19:31.188702832Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 4.463890212s" May 14 00:19:31.188878 containerd[1482]: time="2025-05-14T00:19:31.188857552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 14 00:19:31.192569 containerd[1482]: time="2025-05-14T00:19:31.192489413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 14 00:19:31.195052 containerd[1482]: 
time="2025-05-14T00:19:31.194069478Z" level=info msg="CreateContainer within sandbox \"124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 14 00:19:31.211814 containerd[1482]: time="2025-05-14T00:19:31.211752456Z" level=info msg="Container 7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:31.228025 containerd[1482]: time="2025-05-14T00:19:31.227988630Z" level=info msg="CreateContainer within sandbox \"124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7\"" May 14 00:19:31.230153 containerd[1482]: time="2025-05-14T00:19:31.229033269Z" level=info msg="StartContainer for \"7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7\"" May 14 00:19:31.232268 containerd[1482]: time="2025-05-14T00:19:31.232242028Z" level=info msg="connecting to shim 7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7" address="unix:///run/containerd/s/25813dbf66bf0cc386be329c4ce1076889d5bb85c7bb29df24b1a477f90071f3" protocol=ttrpc version=3 May 14 00:19:31.265698 systemd[1]: Started cri-containerd-7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7.scope - libcontainer container 7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7. May 14 00:19:31.353006 containerd[1482]: time="2025-05-14T00:19:31.352963357Z" level=info msg="StartContainer for \"7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7\" returns successfully" May 14 00:19:35.594671 containerd[1482]: time="2025-05-14T00:19:35.594037287Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:35.604659 containerd[1482]: time="2025-05-14T00:19:35.603679644Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 14 00:19:35.605937 containerd[1482]: time="2025-05-14T00:19:35.605641253Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:35.609632 containerd[1482]: time="2025-05-14T00:19:35.608675865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:35.609869 containerd[1482]: time="2025-05-14T00:19:35.609806886Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 4.41721477s" May 14 00:19:35.611240 containerd[1482]: time="2025-05-14T00:19:35.611046952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 14 00:19:35.619920 containerd[1482]: time="2025-05-14T00:19:35.618565094Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 14 
00:19:35.647102 containerd[1482]: time="2025-05-14T00:19:35.647036287Z" level=info msg="CreateContainer within sandbox \"a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 14 00:19:35.663299 containerd[1482]: time="2025-05-14T00:19:35.663237456Z" level=info msg="Container 64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:35.680612 containerd[1482]: time="2025-05-14T00:19:35.680565628Z" level=info msg="CreateContainer within sandbox \"a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\"" May 14 00:19:35.682373 containerd[1482]: time="2025-05-14T00:19:35.682348282Z" level=info msg="StartContainer for \"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\"" May 14 00:19:35.685326 containerd[1482]: time="2025-05-14T00:19:35.685301131Z" level=info msg="connecting to shim 64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd" address="unix:///run/containerd/s/4ec3e6d53b067352607d9725aee6df450abaef87664ac21f006dc1e19ae65689" protocol=ttrpc version=3 May 14 00:19:35.746783 systemd[1]: Started cri-containerd-64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd.scope - libcontainer container 64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd. May 14 00:19:35.824379 containerd[1482]: time="2025-05-14T00:19:35.824076801Z" level=info msg="StartContainer for \"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" returns successfully" May 14 00:19:35.984036 kubelet[2808]: I0514 00:19:35.983741 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-84c5894965-x5l79" podStartSLOduration=29.125541207 podStartE2EDuration="37.983622091s" podCreationTimestamp="2025-05-14 00:18:58 +0000 UTC" firstStartedPulling="2025-05-14 00:19:26.758699422 +0000 UTC m=+50.480393255" lastFinishedPulling="2025-05-14 00:19:35.616780296 +0000 UTC m=+59.338474139" observedRunningTime="2025-05-14 00:19:35.977870863 +0000 UTC m=+59.699564716" watchObservedRunningTime="2025-05-14 00:19:35.983622091 +0000 UTC m=+59.705315924" May 14 00:19:36.091838 containerd[1482]: time="2025-05-14T00:19:36.091793707Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"888c1826a8a3efda467d092327741472f16c6a4314855507955d6c0ad07e0799\" pid:4637 exited_at:{seconds:1747181976 nanos:91221855}" May 14 00:19:36.187320 containerd[1482]: time="2025-05-14T00:19:36.186987782Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"d6e2d847cc6a58ad0e65c179553cd0e02a8abd4dabfce3e79305403ba47c7965\" pid:4658 exited_at:{seconds:1747181976 nanos:186596167}" May 14 00:19:40.242981 containerd[1482]: time="2025-05-14T00:19:40.242870405Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:40.244310 containerd[1482]: time="2025-05-14T00:19:40.244145708Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 14 00:19:40.247145 containerd[1482]: 
time="2025-05-14T00:19:40.247105960Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:40.253325 containerd[1482]: time="2025-05-14T00:19:40.253239504Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:40.255758 containerd[1482]: time="2025-05-14T00:19:40.255396410Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 4.636752798s" May 14 00:19:40.255758 containerd[1482]: time="2025-05-14T00:19:40.255480228Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 14 00:19:40.261394 containerd[1482]: time="2025-05-14T00:19:40.261181983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 14 00:19:40.263664 containerd[1482]: time="2025-05-14T00:19:40.263626317Z" level=info msg="CreateContainer within sandbox \"5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 14 00:19:40.281990 containerd[1482]: time="2025-05-14T00:19:40.281908969Z" level=info msg="Container 8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:40.305565 containerd[1482]: time="2025-05-14T00:19:40.305454853Z" level=info msg="CreateContainer within sandbox \"5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a\"" May 14 00:19:40.307915 containerd[1482]: time="2025-05-14T00:19:40.307673465Z" level=info msg="StartContainer for \"8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a\"" May 14 00:19:40.311072 containerd[1482]: time="2025-05-14T00:19:40.310919553Z" level=info msg="connecting to shim 8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a" address="unix:///run/containerd/s/ce26ba5e6cf1f4dc6f8de5123de1885be2e90a5ce1ec69662926fd43f3fdfc57" protocol=ttrpc version=3 May 14 00:19:40.366207 systemd[1]: Started cri-containerd-8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a.scope - libcontainer container 8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a. 
May 14 00:19:40.737457 containerd[1482]: time="2025-05-14T00:19:40.737238780Z" level=info msg="StartContainer for \"8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a\" returns successfully" May 14 00:19:40.766098 containerd[1482]: time="2025-05-14T00:19:40.766005176Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 00:19:40.768338 containerd[1482]: time="2025-05-14T00:19:40.768169295Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 14 00:19:40.775852 containerd[1482]: time="2025-05-14T00:19:40.775814545Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 514.598769ms" May 14 00:19:40.775973 containerd[1482]: time="2025-05-14T00:19:40.775923108Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 14 00:19:40.779318 containerd[1482]: time="2025-05-14T00:19:40.778589881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 14 00:19:40.784443 containerd[1482]: time="2025-05-14T00:19:40.784341940Z" level=info msg="CreateContainer within sandbox \"3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 14 00:19:40.804878 containerd[1482]: time="2025-05-14T00:19:40.802802536Z" level=info msg="Container e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23: CDI devices from CRI Config.CDIDevices: []" May 14 00:19:40.829775 containerd[1482]: time="2025-05-14T00:19:40.829714985Z" level=info msg="CreateContainer within sandbox \"3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23\"" May 14 00:19:40.832696 containerd[1482]: time="2025-05-14T00:19:40.831260203Z" level=info msg="StartContainer for \"e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23\"" May 14 00:19:40.834315 containerd[1482]: time="2025-05-14T00:19:40.834283504Z" level=info msg="connecting to shim e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23" address="unix:///run/containerd/s/5f9cbf6d08c153fcbf440cdefcec6dc978f84ca339b67f5306267dbbd3638e6f" protocol=ttrpc version=3 May 14 00:19:40.897057 systemd[1]: Started cri-containerd-e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23.scope - libcontainer container e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23. 
May 14 00:19:41.086727 containerd[1482]: time="2025-05-14T00:19:41.086676139Z" level=info msg="StartContainer for \"e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23\" returns successfully"
May 14 00:19:41.986450 kubelet[2808]: I0514 00:19:41.985765 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 14 00:19:42.003086 kubelet[2808]: I0514 00:19:42.002455 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-85c77fb996-7pqlj" podStartSLOduration=31.551880333 podStartE2EDuration="45.00160851s" podCreationTimestamp="2025-05-14 00:18:57 +0000 UTC" firstStartedPulling="2025-05-14 00:19:26.808853438 +0000 UTC m=+50.530547281" lastFinishedPulling="2025-05-14 00:19:40.258581615 +0000 UTC m=+63.980275458" observedRunningTime="2025-05-14 00:19:40.99889429 +0000 UTC m=+64.720588133" watchObservedRunningTime="2025-05-14 00:19:42.00160851 +0000 UTC m=+65.723302354"
May 14 00:19:42.003086 kubelet[2808]: I0514 00:19:42.002820 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-85c77fb996-vjqrg" podStartSLOduration=34.188310682 podStartE2EDuration="45.002811798s" podCreationTimestamp="2025-05-14 00:18:57 +0000 UTC" firstStartedPulling="2025-05-14 00:19:29.962957814 +0000 UTC m=+53.684651657" lastFinishedPulling="2025-05-14 00:19:40.77745893 +0000 UTC m=+64.499152773" observedRunningTime="2025-05-14 00:19:41.999462285 +0000 UTC m=+65.721156118" watchObservedRunningTime="2025-05-14 00:19:42.002811798 +0000 UTC m=+65.724505631"
May 14 00:19:42.989093 kubelet[2808]: I0514 00:19:42.989057 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 14 00:19:43.456651 containerd[1482]: time="2025-05-14T00:19:43.456577425Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:19:43.458461 containerd[1482]: time="2025-05-14T00:19:43.458198565Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773"
May 14 00:19:43.459828 containerd[1482]: time="2025-05-14T00:19:43.459792315Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:19:43.462862 containerd[1482]: time="2025-05-14T00:19:43.462781281Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 00:19:43.463628 containerd[1482]: time="2025-05-14T00:19:43.463580590Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.684951336s"
May 14 00:19:43.463677 containerd[1482]: time="2025-05-14T00:19:43.463618551Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\""
May 14 00:19:43.467774 containerd[1482]: time="2025-05-14T00:19:43.467666374Z" level=info msg="CreateContainer within sandbox \"124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 14 00:19:43.482284 containerd[1482]: time="2025-05-14T00:19:43.481845559Z" level=info msg="Container 53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6: CDI devices from CRI Config.CDIDevices: []"
May 14 00:19:43.491130 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount762512370.mount: Deactivated successfully.
May 14 00:19:43.510390 containerd[1482]: time="2025-05-14T00:19:43.510182149Z" level=info msg="CreateContainer within sandbox \"124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6\""
May 14 00:19:43.511342 containerd[1482]: time="2025-05-14T00:19:43.511289476Z" level=info msg="StartContainer for \"53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6\""
May 14 00:19:43.514823 containerd[1482]: time="2025-05-14T00:19:43.514658416Z" level=info msg="connecting to shim 53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6" address="unix:///run/containerd/s/25813dbf66bf0cc386be329c4ce1076889d5bb85c7bb29df24b1a477f90071f3" protocol=ttrpc version=3
May 14 00:19:43.570866 systemd[1]: Started cri-containerd-53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6.scope - libcontainer container 53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6.
May 14 00:19:43.656393 containerd[1482]: time="2025-05-14T00:19:43.655874059Z" level=info msg="StartContainer for \"53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6\" returns successfully"
May 14 00:19:44.024398 kubelet[2808]: I0514 00:19:44.024282 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-6svwf" podStartSLOduration=29.28351749 podStartE2EDuration="46.024259023s" podCreationTimestamp="2025-05-14 00:18:58 +0000 UTC" firstStartedPulling="2025-05-14 00:19:26.724352075 +0000 UTC m=+50.446045918" lastFinishedPulling="2025-05-14 00:19:43.465093618 +0000 UTC m=+67.186787451" observedRunningTime="2025-05-14 00:19:44.021501532 +0000 UTC m=+67.743195375" watchObservedRunningTime="2025-05-14 00:19:44.024259023 +0000 UTC m=+67.745952866"
May 14 00:19:44.072600 kubelet[2808]: I0514 00:19:44.070799 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 14 00:19:44.580788 containerd[1482]: time="2025-05-14T00:19:44.580666003Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"9812c5babb342f399058ea9d1254a044edde29b8e08909873f0fadb23b072d6e\" pid:4810 exited_at:{seconds:1747181984 nanos:580268067}"
May 14 00:19:44.650289 kubelet[2808]: I0514 00:19:44.650218 2808 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
May 14 00:19:44.650289 kubelet[2808]: I0514 00:19:44.650301 2808 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
May 14 00:19:47.242395 kubelet[2808]: I0514 00:19:47.241909 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 14 00:20:06.276309 containerd[1482]: time="2025-05-14T00:20:06.276166658Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"c7e4748d8797c22365683370da8c64a6f99634d2ee2fa45637db663f6d73fba0\" pid:4848 exited_at:{seconds:1747182006 nanos:275270407}"
May 14 00:20:14.568942 containerd[1482]: time="2025-05-14T00:20:14.568874390Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"9441abd8b30837e8763f7a2efad613a671be204a86ef97ae2ae3870466b05802\" pid:4880 exited_at:{seconds:1747182014 nanos:567915602}"
May 14 00:20:34.896280 containerd[1482]: time="2025-05-14T00:20:34.895706078Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"72c07d4b8fbd411cd7cb7a2c450af40a80465d8a3965b638d274e6aa147e6133\" pid:4909 exited_at:{seconds:1747182034 nanos:888844237}"
May 14 00:20:36.287268 containerd[1482]: time="2025-05-14T00:20:36.287172293Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"500c32b8d32f5a074b4a0590ae8fdd3ed89dbc699c1be33af10f6648344147d7\" pid:4930 exited_at:{seconds:1747182036 nanos:286751093}"
May 14 00:20:44.614226 containerd[1482]: time="2025-05-14T00:20:44.614172034Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"93a0b0d7f3abf5ae0c135fb6a38887a41ddabf3cb430ecc1b2455d8868622c9a\" pid:4956 exited_at:{seconds:1747182044 nanos:613697243}"
May 14 00:21:06.261758 containerd[1482]: time="2025-05-14T00:21:06.261456582Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"7896d90e664fc8c1d8d2a5b7ba0d56c5241b7c71a5de318d3027dc94da321683\" pid:5006 exited_at:{seconds:1747182066 nanos:260872968}"
May 14 00:21:14.573069 containerd[1482]: time="2025-05-14T00:21:14.572971403Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"6724e0f2010216f34a979d2b934a24016b2b02b53ceb365443d994f80959b4f0\" pid:5030 exited_at:{seconds:1747182074 nanos:571769920}"
May 14 00:21:34.805896 containerd[1482]: time="2025-05-14T00:21:34.805822321Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"1d1dd2e87f6fcb7b3ce5c9b67827f8300378568e5d3428e80e2528301344f027\" pid:5052 exited_at:{seconds:1747182094 nanos:805348342}"
May 14 00:21:36.246446 containerd[1482]: time="2025-05-14T00:21:36.246372100Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"7a19ba30a399e887e444377e07e495d02e176329103981a27a54f7c0bb43e348\" pid:5075 exited_at:{seconds:1747182096 nanos:245852085}"
May 14 00:21:44.547358 containerd[1482]: time="2025-05-14T00:21:44.547170285Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"91b330f00ad2b57f9d8828fe6fdaa693c0b073af755bcbba55a4be14b48f882d\" pid:5099 exited_at:{seconds:1747182104 nanos:546772599}"
May 14 00:22:06.255768 containerd[1482]: time="2025-05-14T00:22:06.255587711Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"7a5bbba8a6bb8ad6673b7496c2a8fed463c657fc3f8fc8b6e4ddd4986e40a3a4\" pid:5124 exited_at:{seconds:1747182126 nanos:255035976}"
May 14 00:22:14.560493 containerd[1482]: time="2025-05-14T00:22:14.560432016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"75b2cd6943d3a57cffee7c6fcbccf031c5a55d483061ed7059e113a1ed91ec87\" pid:5157 exited_at:{seconds:1747182134 nanos:559603502}"
May 14 00:22:34.898153 containerd[1482]: time="2025-05-14T00:22:34.897195562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"97e037db29753048d1852b35863092ae261a14761182468cffa46590893a3f61\" pid:5185 exited_at:{seconds:1747182154 nanos:896075612}"
May 14 00:22:36.295737 containerd[1482]: time="2025-05-14T00:22:36.295581778Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"ad3fb49b74935990cb3e7099f35a1960d78659c7b43aef0f2f2d16b28499cef7\" pid:5218 exited_at:{seconds:1747182156 nanos:294998374}"
May 14 00:22:44.592258 containerd[1482]: time="2025-05-14T00:22:44.591882806Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"a47a22e1987a342da48299ef163a7645f7d48c2becc7394707ec39d01ad90c38\" pid:5253 exited_at:{seconds:1747182164 nanos:590746004}"
May 14 00:23:06.381479 containerd[1482]: time="2025-05-14T00:23:06.380895288Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"96df6c90179ea66b8d58658cc5a394edbf94afd0bf92e26a5d3576e2f30d32f9\" pid:5275 exited_at:{seconds:1747182186 nanos:379410573}"
May 14 00:23:06.470934 systemd[1]: Started sshd@9-172.24.4.34:22-172.24.4.1:36308.service - OpenSSH per-connection server daemon (172.24.4.1:36308).
May 14 00:23:07.774203 sshd[5293]: Accepted publickey for core from 172.24.4.1 port 36308 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA
May 14 00:23:07.779678 sshd-session[5293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:23:07.799128 systemd-logind[1463]: New session 12 of user core.
May 14 00:23:07.804387 systemd[1]: Started session-12.scope - Session 12 of User core.
May 14 00:23:08.672206 sshd[5295]: Connection closed by 172.24.4.1 port 36308
May 14 00:23:08.675673 sshd-session[5293]: pam_unix(sshd:session): session closed for user core
May 14 00:23:08.682980 systemd[1]: sshd@9-172.24.4.34:22-172.24.4.1:36308.service: Deactivated successfully.
May 14 00:23:08.683582 systemd-logind[1463]: Session 12 logged out. Waiting for processes to exit.
May 14 00:23:08.695482 systemd[1]: session-12.scope: Deactivated successfully.
May 14 00:23:08.698154 systemd-logind[1463]: Removed session 12.
May 14 00:23:13.702275 systemd[1]: Started sshd@10-172.24.4.34:22-172.24.4.1:41488.service - OpenSSH per-connection server daemon (172.24.4.1:41488).
May 14 00:23:14.606169 containerd[1482]: time="2025-05-14T00:23:14.606107863Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"b46ea670152fd642539cd54d3636521081b558842c830a09ad8d743b91b93c2b\" pid:5322 exited_at:{seconds:1747182194 nanos:605027186}" May 14 00:23:14.930444 sshd[5310]: Accepted publickey for core from 172.24.4.1 port 41488 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:23:14.932982 sshd-session[5310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:23:14.947767 systemd-logind[1463]: New session 13 of user core. May 14 00:23:14.954912 systemd[1]: Started session-13.scope - Session 13 of User core. May 14 00:23:15.637275 sshd[5332]: Connection closed by 172.24.4.1 port 41488 May 14 00:23:15.638891 sshd-session[5310]: pam_unix(sshd:session): session closed for user core May 14 00:23:15.648326 systemd[1]: sshd@10-172.24.4.34:22-172.24.4.1:41488.service: Deactivated successfully. May 14 00:23:15.655771 systemd[1]: session-13.scope: Deactivated successfully. May 14 00:23:15.658199 systemd-logind[1463]: Session 13 logged out. Waiting for processes to exit. May 14 00:23:15.662001 systemd-logind[1463]: Removed session 13. May 14 00:23:20.690467 systemd[1]: Started sshd@11-172.24.4.34:22-172.24.4.1:41500.service - OpenSSH per-connection server daemon (172.24.4.1:41500). May 14 00:23:21.772482 sshd[5345]: Accepted publickey for core from 172.24.4.1 port 41500 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:23:21.780241 sshd-session[5345]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:23:21.810044 systemd-logind[1463]: New session 14 of user core. May 14 00:23:21.827051 systemd[1]: Started session-14.scope - Session 14 of User core. May 14 00:23:22.490590 sshd[5347]: Connection closed by 172.24.4.1 port 41500 May 14 00:23:22.491444 sshd-session[5345]: pam_unix(sshd:session): session closed for user core May 14 00:23:22.523110 systemd[1]: sshd@11-172.24.4.34:22-172.24.4.1:41500.service: Deactivated successfully. May 14 00:23:22.529482 systemd[1]: session-14.scope: Deactivated successfully. May 14 00:23:22.532207 systemd-logind[1463]: Session 14 logged out. Waiting for processes to exit. May 14 00:23:22.537756 systemd[1]: Started sshd@12-172.24.4.34:22-172.24.4.1:41516.service - OpenSSH per-connection server daemon (172.24.4.1:41516). May 14 00:23:22.545229 systemd-logind[1463]: Removed session 14. May 14 00:23:23.674453 sshd[5360]: Accepted publickey for core from 172.24.4.1 port 41516 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:23:23.677973 sshd-session[5360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:23:23.690927 systemd-logind[1463]: New session 15 of user core. May 14 00:23:23.697928 systemd[1]: Started session-15.scope - Session 15 of User core. May 14 00:23:24.508593 sshd[5363]: Connection closed by 172.24.4.1 port 41516 May 14 00:23:24.509915 sshd-session[5360]: pam_unix(sshd:session): session closed for user core May 14 00:23:24.525634 systemd[1]: Started sshd@13-172.24.4.34:22-172.24.4.1:58308.service - OpenSSH per-connection server daemon (172.24.4.1:58308). May 14 00:23:24.526203 systemd[1]: sshd@12-172.24.4.34:22-172.24.4.1:41516.service: Deactivated successfully. May 14 00:23:24.530399 systemd[1]: session-15.scope: Deactivated successfully. 
May 14 00:23:24.531657 systemd-logind[1463]: Session 15 logged out. Waiting for processes to exit.
May 14 00:23:24.534993 systemd-logind[1463]: Removed session 15.
May 14 00:23:25.705605 sshd[5370]: Accepted publickey for core from 172.24.4.1 port 58308 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA
May 14 00:23:25.710016 sshd-session[5370]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:23:25.726199 systemd-logind[1463]: New session 16 of user core.
May 14 00:23:25.734929 systemd[1]: Started session-16.scope - Session 16 of User core.
May 14 00:23:26.369867 sshd[5375]: Connection closed by 172.24.4.1 port 58308
May 14 00:23:26.371335 sshd-session[5370]: pam_unix(sshd:session): session closed for user core
May 14 00:23:26.382134 systemd[1]: sshd@13-172.24.4.34:22-172.24.4.1:58308.service: Deactivated successfully.
May 14 00:23:26.391099 systemd[1]: session-16.scope: Deactivated successfully.
May 14 00:23:26.394742 systemd-logind[1463]: Session 16 logged out. Waiting for processes to exit.
May 14 00:23:26.400495 systemd-logind[1463]: Removed session 16.
May 14 00:23:29.881366 containerd[1482]: time="2025-05-14T00:23:29.880922850Z" level=warning msg="container event discarded" container=5ef32ffc2c038b0586a76ecbac4a90de87b81ab2a28a56e7497a3e41e79304c1 type=CONTAINER_CREATED_EVENT
May 14 00:23:29.881366 containerd[1482]: time="2025-05-14T00:23:29.881284999Z" level=warning msg="container event discarded" container=5ef32ffc2c038b0586a76ecbac4a90de87b81ab2a28a56e7497a3e41e79304c1 type=CONTAINER_STARTED_EVENT
May 14 00:23:29.900653 containerd[1482]: time="2025-05-14T00:23:29.900545704Z" level=warning msg="container event discarded" container=d137d3fdd493a5ad8a0692d7de2c59e4470b224671e16ab1541c884ae3373b78 type=CONTAINER_CREATED_EVENT
May 14 00:23:29.900653 containerd[1482]: time="2025-05-14T00:23:29.900623079Z" level=warning msg="container event discarded" container=d137d3fdd493a5ad8a0692d7de2c59e4470b224671e16ab1541c884ae3373b78 type=CONTAINER_STARTED_EVENT
May 14 00:23:29.900653 containerd[1482]: time="2025-05-14T00:23:29.900646373Z" level=warning msg="container event discarded" container=c689a5e24e8a6b32759f0be8654db09159c2f75d74fcc9e03e54edc5d0cdbf6e type=CONTAINER_CREATED_EVENT
May 14 00:23:29.901109 containerd[1482]: time="2025-05-14T00:23:29.900667993Z" level=warning msg="container event discarded" container=c689a5e24e8a6b32759f0be8654db09159c2f75d74fcc9e03e54edc5d0cdbf6e type=CONTAINER_STARTED_EVENT
May 14 00:23:29.932487 containerd[1482]: time="2025-05-14T00:23:29.932379229Z" level=warning msg="container event discarded" container=6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb type=CONTAINER_CREATED_EVENT
May 14 00:23:29.952815 containerd[1482]: time="2025-05-14T00:23:29.952603180Z" level=warning msg="container event discarded" container=5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a type=CONTAINER_CREATED_EVENT
May 14 00:23:29.965983 containerd[1482]: time="2025-05-14T00:23:29.965872918Z" level=warning msg="container event discarded" container=b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607 type=CONTAINER_CREATED_EVENT
May 14 00:23:30.048669 containerd[1482]: time="2025-05-14T00:23:30.048547127Z" level=warning msg="container event discarded" container=6c7b5bfd18d9036bdcb5662bd9fc8ac730b7eaa4060202a9a6307c97f2547bdb type=CONTAINER_STARTED_EVENT
May 14 00:23:30.087105 containerd[1482]: time="2025-05-14T00:23:30.086992082Z" level=warning msg="container event discarded" container=b0c11c2899ba70d82f7a991bb7b62e4f3d1dbbc77ad172a20d4e55be5b0a4607 type=CONTAINER_STARTED_EVENT
May 14 00:23:30.130770 containerd[1482]: time="2025-05-14T00:23:30.130642761Z" level=warning msg="container event discarded" container=5f9bf61673da583125b74245319c5047ad3c9150bfa6937fa24071225d40c11a type=CONTAINER_STARTED_EVENT
May 14 00:23:31.395012 systemd[1]: Started sshd@14-172.24.4.34:22-172.24.4.1:58314.service - OpenSSH per-connection server daemon (172.24.4.1:58314).
May 14 00:23:32.500841 sshd[5388]: Accepted publickey for core from 172.24.4.1 port 58314 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA
May 14 00:23:32.505849 sshd-session[5388]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:23:32.516791 systemd-logind[1463]: New session 17 of user core.
May 14 00:23:32.526844 systemd[1]: Started session-17.scope - Session 17 of User core.
May 14 00:23:33.131561 sshd[5390]: Connection closed by 172.24.4.1 port 58314
May 14 00:23:33.132269 sshd-session[5388]: pam_unix(sshd:session): session closed for user core
May 14 00:23:33.135305 systemd-logind[1463]: Session 17 logged out. Waiting for processes to exit.
May 14 00:23:33.135659 systemd[1]: sshd@14-172.24.4.34:22-172.24.4.1:58314.service: Deactivated successfully.
May 14 00:23:33.137866 systemd[1]: session-17.scope: Deactivated successfully.
May 14 00:23:33.140104 systemd-logind[1463]: Removed session 17.
May 14 00:23:34.870366 containerd[1482]: time="2025-05-14T00:23:34.870250633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"6c1fdc399b3e87b0ba7c1ea742520fed8264aca5801e49dc79650524561c1be7\" pid:5420 exited_at:{seconds:1747182214 nanos:868650542}"
May 14 00:23:36.235310 containerd[1482]: time="2025-05-14T00:23:36.234839429Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"6c512235eb726b9b6f05d91c916f3d83eed78d8d8c322be3442b775165a34e21\" pid:5441 exit_status:1 exited_at:{seconds:1747182216 nanos:234075827}"
May 14 00:23:38.177986 systemd[1]: Started sshd@15-172.24.4.34:22-172.24.4.1:32938.service - OpenSSH per-connection server daemon (172.24.4.1:32938).
May 14 00:23:39.682552 sshd[5456]: Accepted publickey for core from 172.24.4.1 port 32938 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA
May 14 00:23:39.685065 sshd-session[5456]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 00:23:39.695668 systemd-logind[1463]: New session 18 of user core.
May 14 00:23:39.701694 systemd[1]: Started session-18.scope - Session 18 of User core.
May 14 00:23:40.412477 sshd[5458]: Connection closed by 172.24.4.1 port 32938
May 14 00:23:40.414409 sshd-session[5456]: pam_unix(sshd:session): session closed for user core
May 14 00:23:40.419973 systemd-logind[1463]: Session 18 logged out. Waiting for processes to exit.
May 14 00:23:40.421222 systemd[1]: sshd@15-172.24.4.34:22-172.24.4.1:32938.service: Deactivated successfully.
May 14 00:23:40.425137 systemd[1]: session-18.scope: Deactivated successfully.
May 14 00:23:40.429481 systemd-logind[1463]: Removed session 18.
May 14 00:23:44.610038 containerd[1482]: time="2025-05-14T00:23:44.609856519Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"a0380a3bf6418adff9e12e873c8549f498003823e1a45ccba2162cf35ff9681c\" pid:5481 exited_at:{seconds:1747182224 nanos:608357849}" May 14 00:23:45.443671 systemd[1]: Started sshd@16-172.24.4.34:22-172.24.4.1:58084.service - OpenSSH per-connection server daemon (172.24.4.1:58084). May 14 00:23:46.660197 sshd[5491]: Accepted publickey for core from 172.24.4.1 port 58084 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:23:46.664983 sshd-session[5491]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:23:46.680484 systemd-logind[1463]: New session 19 of user core. May 14 00:23:46.688849 systemd[1]: Started session-19.scope - Session 19 of User core. May 14 00:23:47.583735 sshd[5493]: Connection closed by 172.24.4.1 port 58084 May 14 00:23:47.583444 sshd-session[5491]: pam_unix(sshd:session): session closed for user core May 14 00:23:47.601307 systemd[1]: sshd@16-172.24.4.34:22-172.24.4.1:58084.service: Deactivated successfully. May 14 00:23:47.608965 systemd[1]: session-19.scope: Deactivated successfully. May 14 00:23:47.612648 systemd-logind[1463]: Session 19 logged out. Waiting for processes to exit. May 14 00:23:47.624054 systemd[1]: Started sshd@17-172.24.4.34:22-172.24.4.1:58090.service - OpenSSH per-connection server daemon (172.24.4.1:58090). May 14 00:23:47.631619 systemd-logind[1463]: Removed session 19. May 14 00:23:48.785412 sshd[5504]: Accepted publickey for core from 172.24.4.1 port 58090 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:23:48.788785 sshd-session[5504]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:23:48.803196 systemd-logind[1463]: New session 20 of user core. May 14 00:23:48.811957 systemd[1]: Started session-20.scope - Session 20 of User core. May 14 00:23:49.760988 sshd[5507]: Connection closed by 172.24.4.1 port 58090 May 14 00:23:49.762912 sshd-session[5504]: pam_unix(sshd:session): session closed for user core May 14 00:23:49.776857 systemd[1]: Started sshd@18-172.24.4.34:22-172.24.4.1:58096.service - OpenSSH per-connection server daemon (172.24.4.1:58096). May 14 00:23:49.777408 systemd[1]: sshd@17-172.24.4.34:22-172.24.4.1:58090.service: Deactivated successfully. May 14 00:23:49.782325 systemd[1]: session-20.scope: Deactivated successfully. May 14 00:23:49.785805 systemd-logind[1463]: Session 20 logged out. Waiting for processes to exit. May 14 00:23:49.797188 systemd-logind[1463]: Removed session 20. May 14 00:23:51.083793 sshd[5513]: Accepted publickey for core from 172.24.4.1 port 58096 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:23:51.087155 sshd-session[5513]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:23:51.102974 systemd-logind[1463]: New session 21 of user core. May 14 00:23:51.110874 systemd[1]: Started session-21.scope - Session 21 of User core. 
May 14 00:23:52.005784 containerd[1482]: time="2025-05-14T00:23:52.005214808Z" level=warning msg="container event discarded" container=fcc92c2fc5b4808de8f243e8051e55099bd3e50db58bea7f392a9e873b1159ca type=CONTAINER_CREATED_EVENT May 14 00:23:52.005784 containerd[1482]: time="2025-05-14T00:23:52.005749341Z" level=warning msg="container event discarded" container=fcc92c2fc5b4808de8f243e8051e55099bd3e50db58bea7f392a9e873b1159ca type=CONTAINER_STARTED_EVENT May 14 00:23:52.048293 containerd[1482]: time="2025-05-14T00:23:52.047044451Z" level=warning msg="container event discarded" container=bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1 type=CONTAINER_CREATED_EVENT May 14 00:23:52.123925 containerd[1482]: time="2025-05-14T00:23:52.123762790Z" level=warning msg="container event discarded" container=bcbb4790852df321486a34495336ab23f98c913d78d22a33fe2338ccc248dbb1 type=CONTAINER_STARTED_EVENT May 14 00:23:52.530039 containerd[1482]: time="2025-05-14T00:23:52.529938771Z" level=warning msg="container event discarded" container=bcd4432f7d28a8b29344cb2d703d443bf39153b67f82f86bbcc7f550aacf8e40 type=CONTAINER_CREATED_EVENT May 14 00:23:52.530039 containerd[1482]: time="2025-05-14T00:23:52.530015034Z" level=warning msg="container event discarded" container=bcd4432f7d28a8b29344cb2d703d443bf39153b67f82f86bbcc7f550aacf8e40 type=CONTAINER_STARTED_EVENT May 14 00:23:54.641616 containerd[1482]: time="2025-05-14T00:23:54.641369721Z" level=warning msg="container event discarded" container=82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe type=CONTAINER_CREATED_EVENT May 14 00:23:54.718060 containerd[1482]: time="2025-05-14T00:23:54.717877044Z" level=warning msg="container event discarded" container=82d9413ef90dc8b30330bed0ac4d362be720ae7de10f75d77a2c096f9da37bfe type=CONTAINER_STARTED_EVENT May 14 00:23:54.779568 sshd[5518]: Connection closed by 172.24.4.1 port 58096 May 14 00:23:54.783198 sshd-session[5513]: pam_unix(sshd:session): session closed for user core May 14 00:23:54.811479 systemd[1]: sshd@18-172.24.4.34:22-172.24.4.1:58096.service: Deactivated successfully. May 14 00:23:54.818263 systemd[1]: session-21.scope: Deactivated successfully. May 14 00:23:54.819077 systemd[1]: session-21.scope: Consumed 992ms CPU time, 69.3M memory peak. May 14 00:23:54.821802 systemd-logind[1463]: Session 21 logged out. Waiting for processes to exit. May 14 00:23:54.830736 systemd[1]: Started sshd@19-172.24.4.34:22-172.24.4.1:40028.service - OpenSSH per-connection server daemon (172.24.4.1:40028). May 14 00:23:54.836814 systemd-logind[1463]: Removed session 21. May 14 00:23:56.039618 sshd[5539]: Accepted publickey for core from 172.24.4.1 port 40028 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:23:56.042626 sshd-session[5539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:23:56.059704 systemd-logind[1463]: New session 22 of user core. May 14 00:23:56.073996 systemd[1]: Started session-22.scope - Session 22 of User core. May 14 00:23:57.052168 sshd[5542]: Connection closed by 172.24.4.1 port 40028 May 14 00:23:57.052679 sshd-session[5539]: pam_unix(sshd:session): session closed for user core May 14 00:23:57.070986 systemd[1]: sshd@19-172.24.4.34:22-172.24.4.1:40028.service: Deactivated successfully. May 14 00:23:57.079654 systemd[1]: session-22.scope: Deactivated successfully. May 14 00:23:57.082126 systemd-logind[1463]: Session 22 logged out. Waiting for processes to exit. 
May 14 00:23:57.088927 systemd[1]: Started sshd@20-172.24.4.34:22-172.24.4.1:40030.service - OpenSSH per-connection server daemon (172.24.4.1:40030). May 14 00:23:57.090890 systemd-logind[1463]: Removed session 22. May 14 00:23:58.446615 sshd[5551]: Accepted publickey for core from 172.24.4.1 port 40030 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:23:58.449013 sshd-session[5551]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:23:58.458994 systemd-logind[1463]: New session 23 of user core. May 14 00:23:58.470797 systemd[1]: Started session-23.scope - Session 23 of User core. May 14 00:23:58.534949 containerd[1482]: time="2025-05-14T00:23:58.534719881Z" level=warning msg="container event discarded" container=32be841826eb153d1878e517e2b90a42e3516e5df05402eb4e6e1879a20788c6 type=CONTAINER_CREATED_EVENT May 14 00:23:58.534949 containerd[1482]: time="2025-05-14T00:23:58.534934443Z" level=warning msg="container event discarded" container=32be841826eb153d1878e517e2b90a42e3516e5df05402eb4e6e1879a20788c6 type=CONTAINER_STARTED_EVENT May 14 00:23:58.654329 containerd[1482]: time="2025-05-14T00:23:58.654189752Z" level=warning msg="container event discarded" container=53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652 type=CONTAINER_CREATED_EVENT May 14 00:23:58.654329 containerd[1482]: time="2025-05-14T00:23:58.654291162Z" level=warning msg="container event discarded" container=53ad9a70a8b6f18a293296056507aa0599d47c312c5640cc35b9d5cbf27a9652 type=CONTAINER_STARTED_EVENT May 14 00:23:59.190569 sshd[5554]: Connection closed by 172.24.4.1 port 40030 May 14 00:23:59.191797 sshd-session[5551]: pam_unix(sshd:session): session closed for user core May 14 00:23:59.199751 systemd[1]: sshd@20-172.24.4.34:22-172.24.4.1:40030.service: Deactivated successfully. May 14 00:23:59.206544 systemd[1]: session-23.scope: Deactivated successfully. May 14 00:23:59.211241 systemd-logind[1463]: Session 23 logged out. Waiting for processes to exit. May 14 00:23:59.214947 systemd-logind[1463]: Removed session 23. May 14 00:24:01.822639 containerd[1482]: time="2025-05-14T00:24:01.822322974Z" level=warning msg="container event discarded" container=172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7 type=CONTAINER_CREATED_EVENT May 14 00:24:01.925713 containerd[1482]: time="2025-05-14T00:24:01.925405761Z" level=warning msg="container event discarded" container=172384fd33c62643b6ccc1bc431bfdbad27ddc7b71336154bf1710dfbd7c2be7 type=CONTAINER_STARTED_EVENT May 14 00:24:04.216055 systemd[1]: Started sshd@21-172.24.4.34:22-172.24.4.1:56022.service - OpenSSH per-connection server daemon (172.24.4.1:56022). 
May 14 00:24:04.323049 containerd[1482]: time="2025-05-14T00:24:04.322847658Z" level=warning msg="container event discarded" container=4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189 type=CONTAINER_CREATED_EVENT May 14 00:24:04.409125 containerd[1482]: time="2025-05-14T00:24:04.408930149Z" level=warning msg="container event discarded" container=4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189 type=CONTAINER_STARTED_EVENT May 14 00:24:05.141070 containerd[1482]: time="2025-05-14T00:24:05.140668566Z" level=warning msg="container event discarded" container=4d369f0029a1af6a3e3c16ab49b086bc2de0c25cfa3ee86643fc8da89a4f7189 type=CONTAINER_STOPPED_EVENT May 14 00:24:05.509592 sshd[5568]: Accepted publickey for core from 172.24.4.1 port 56022 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:24:05.512845 sshd-session[5568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:24:05.526300 systemd-logind[1463]: New session 24 of user core. May 14 00:24:05.536853 systemd[1]: Started session-24.scope - Session 24 of User core. May 14 00:24:06.233573 containerd[1482]: time="2025-05-14T00:24:06.233412687Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"2e223bd01efa5f2f6c181f01585490d580673be03d19ab93a187f48213a4e315\" pid:5591 exited_at:{seconds:1747182246 nanos:232007662}" May 14 00:24:06.425577 sshd[5570]: Connection closed by 172.24.4.1 port 56022 May 14 00:24:06.427199 sshd-session[5568]: pam_unix(sshd:session): session closed for user core May 14 00:24:06.443870 systemd[1]: sshd@21-172.24.4.34:22-172.24.4.1:56022.service: Deactivated successfully. May 14 00:24:06.451442 systemd[1]: session-24.scope: Deactivated successfully. May 14 00:24:06.456610 systemd-logind[1463]: Session 24 logged out. Waiting for processes to exit. May 14 00:24:06.459453 systemd-logind[1463]: Removed session 24. May 14 00:24:11.449408 systemd[1]: Started sshd@22-172.24.4.34:22-172.24.4.1:56028.service - OpenSSH per-connection server daemon (172.24.4.1:56028). May 14 00:24:12.375431 containerd[1482]: time="2025-05-14T00:24:12.374598860Z" level=warning msg="container event discarded" container=85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5 type=CONTAINER_CREATED_EVENT May 14 00:24:12.487903 containerd[1482]: time="2025-05-14T00:24:12.487641206Z" level=warning msg="container event discarded" container=85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5 type=CONTAINER_STARTED_EVENT May 14 00:24:12.667416 sshd[5619]: Accepted publickey for core from 172.24.4.1 port 56028 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:24:12.672094 sshd-session[5619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:24:12.693961 systemd-logind[1463]: New session 25 of user core. May 14 00:24:12.703050 systemd[1]: Started session-25.scope - Session 25 of User core. May 14 00:24:13.494486 sshd[5621]: Connection closed by 172.24.4.1 port 56028 May 14 00:24:13.495742 sshd-session[5619]: pam_unix(sshd:session): session closed for user core May 14 00:24:13.502765 systemd-logind[1463]: Session 25 logged out. Waiting for processes to exit. May 14 00:24:13.502846 systemd[1]: sshd@22-172.24.4.34:22-172.24.4.1:56028.service: Deactivated successfully. May 14 00:24:13.506299 systemd[1]: session-25.scope: Deactivated successfully. 
May 14 00:24:13.510261 systemd-logind[1463]: Removed session 25. May 14 00:24:14.564104 containerd[1482]: time="2025-05-14T00:24:14.564001030Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"ed264388795111d39499ed0f3105f638c2849ea72c2397e928469b7b58ca1e76\" pid:5650 exited_at:{seconds:1747182254 nanos:562983412}" May 14 00:24:14.677230 containerd[1482]: time="2025-05-14T00:24:14.677134217Z" level=warning msg="container event discarded" container=85c95d4caeb96464c156e031f08ea3454b6cf0f83aff6e7ce6cf406c973f2cd5 type=CONTAINER_STOPPED_EVENT May 14 00:24:18.524139 systemd[1]: Started sshd@23-172.24.4.34:22-172.24.4.1:47628.service - OpenSSH per-connection server daemon (172.24.4.1:47628). May 14 00:24:19.653575 sshd[5659]: Accepted publickey for core from 172.24.4.1 port 47628 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:24:19.661044 sshd-session[5659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:24:19.685275 systemd-logind[1463]: New session 26 of user core. May 14 00:24:19.696017 systemd[1]: Started session-26.scope - Session 26 of User core. May 14 00:24:20.411601 sshd[5661]: Connection closed by 172.24.4.1 port 47628 May 14 00:24:20.413255 sshd-session[5659]: pam_unix(sshd:session): session closed for user core May 14 00:24:20.424687 systemd[1]: sshd@23-172.24.4.34:22-172.24.4.1:47628.service: Deactivated successfully. May 14 00:24:20.433681 systemd[1]: session-26.scope: Deactivated successfully. May 14 00:24:20.442486 systemd-logind[1463]: Session 26 logged out. Waiting for processes to exit. May 14 00:24:20.446235 systemd-logind[1463]: Removed session 26. May 14 00:24:24.370442 containerd[1482]: time="2025-05-14T00:24:24.369977720Z" level=warning msg="container event discarded" container=5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592 type=CONTAINER_CREATED_EVENT May 14 00:24:24.517666 containerd[1482]: time="2025-05-14T00:24:24.517461613Z" level=warning msg="container event discarded" container=5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592 type=CONTAINER_STARTED_EVENT May 14 00:24:25.440149 systemd[1]: Started sshd@24-172.24.4.34:22-172.24.4.1:59716.service - OpenSSH per-connection server daemon (172.24.4.1:59716). 
May 14 00:24:26.712859 containerd[1482]: time="2025-05-14T00:24:26.712695112Z" level=warning msg="container event discarded" container=124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586 type=CONTAINER_CREATED_EVENT May 14 00:24:26.715349 containerd[1482]: time="2025-05-14T00:24:26.713622461Z" level=warning msg="container event discarded" container=124446416a83e899b3ac46a6a2e444f3ee9de0735b319e06eb2e39c4ff942586 type=CONTAINER_STARTED_EVENT May 14 00:24:26.766325 containerd[1482]: time="2025-05-14T00:24:26.766192328Z" level=warning msg="container event discarded" container=a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60 type=CONTAINER_CREATED_EVENT May 14 00:24:26.766325 containerd[1482]: time="2025-05-14T00:24:26.766293788Z" level=warning msg="container event discarded" container=a7d0395816dfa7487ca855962cc6e938fc2f95689c87efd457f8b8980a2bde60 type=CONTAINER_STARTED_EVENT May 14 00:24:26.813494 sshd[5675]: Accepted publickey for core from 172.24.4.1 port 59716 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:24:26.815915 containerd[1482]: time="2025-05-14T00:24:26.815824737Z" level=warning msg="container event discarded" container=5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd type=CONTAINER_CREATED_EVENT May 14 00:24:26.815915 containerd[1482]: time="2025-05-14T00:24:26.815898575Z" level=warning msg="container event discarded" container=5b38ba3b1d8c6a33933f53edf3611ef858d02bc1bae4d5b4e337330a30c35bdd type=CONTAINER_STARTED_EVENT May 14 00:24:26.818911 sshd-session[5675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:24:26.832635 systemd-logind[1463]: New session 27 of user core. May 14 00:24:26.848938 systemd[1]: Started session-27.scope - Session 27 of User core. May 14 00:24:27.721553 sshd[5677]: Connection closed by 172.24.4.1 port 59716 May 14 00:24:27.722799 sshd-session[5675]: pam_unix(sshd:session): session closed for user core May 14 00:24:27.730135 systemd[1]: sshd@24-172.24.4.34:22-172.24.4.1:59716.service: Deactivated successfully. May 14 00:24:27.736771 systemd[1]: session-27.scope: Deactivated successfully. May 14 00:24:27.739308 systemd-logind[1463]: Session 27 logged out. Waiting for processes to exit. May 14 00:24:27.741638 systemd-logind[1463]: Removed session 27. 
May 14 00:24:29.194637 containerd[1482]: time="2025-05-14T00:24:29.194300231Z" level=warning msg="container event discarded" container=a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868 type=CONTAINER_CREATED_EVENT May 14 00:24:29.194637 containerd[1482]: time="2025-05-14T00:24:29.194414556Z" level=warning msg="container event discarded" container=a66e9389da68f8f17e13b65fe03823b2245868dcea10054a3ad05ec15c8d6868 type=CONTAINER_STARTED_EVENT May 14 00:24:29.210969 containerd[1482]: time="2025-05-14T00:24:29.210801039Z" level=warning msg="container event discarded" container=13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc type=CONTAINER_CREATED_EVENT May 14 00:24:29.211415 containerd[1482]: time="2025-05-14T00:24:29.211298892Z" level=warning msg="container event discarded" container=13c2b94dcd8d387b474cc4573fac05db6fd4f724060f6f71520a47c1786bc3cc type=CONTAINER_STARTED_EVENT May 14 00:24:29.258386 containerd[1482]: time="2025-05-14T00:24:29.258239633Z" level=warning msg="container event discarded" container=a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98 type=CONTAINER_CREATED_EVENT May 14 00:24:29.269951 containerd[1482]: time="2025-05-14T00:24:29.269759568Z" level=warning msg="container event discarded" container=0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542 type=CONTAINER_CREATED_EVENT May 14 00:24:29.370383 containerd[1482]: time="2025-05-14T00:24:29.370219194Z" level=warning msg="container event discarded" container=a884d24dd6313877a5f16d308548443419b90f6880d934948e18062abc2f7a98 type=CONTAINER_STARTED_EVENT May 14 00:24:29.390809 containerd[1482]: time="2025-05-14T00:24:29.390643411Z" level=warning msg="container event discarded" container=0d79a8d01de40af014df38a3f853c89cd3a816b3682d6d6e894648371add7542 type=CONTAINER_STARTED_EVENT May 14 00:24:29.961354 containerd[1482]: time="2025-05-14T00:24:29.960981715Z" level=warning msg="container event discarded" container=3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34 type=CONTAINER_CREATED_EVENT May 14 00:24:29.961354 containerd[1482]: time="2025-05-14T00:24:29.961088055Z" level=warning msg="container event discarded" container=3d75cb6ead6e3952ae837ab614b8d5ef85fd2445d820d3f90c6ecfabeba88b34 type=CONTAINER_STARTED_EVENT May 14 00:24:31.237290 containerd[1482]: time="2025-05-14T00:24:31.237150505Z" level=warning msg="container event discarded" container=7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7 type=CONTAINER_CREATED_EVENT May 14 00:24:31.362932 containerd[1482]: time="2025-05-14T00:24:31.362775631Z" level=warning msg="container event discarded" container=7821ee943508a3958fdc37694f67b2d5dff099643d7f8c5387168b79f8eb55a7 type=CONTAINER_STARTED_EVENT May 14 00:24:32.751323 systemd[1]: Started sshd@25-172.24.4.34:22-172.24.4.1:59726.service - OpenSSH per-connection server daemon (172.24.4.1:59726). May 14 00:24:33.947392 sshd[5690]: Accepted publickey for core from 172.24.4.1 port 59726 ssh2: RSA SHA256:i8IoyOvRZbmvyVLGRx2Lp3PnNlvImQe2fwjN6PCd1wA May 14 00:24:33.951060 sshd-session[5690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 00:24:33.964701 systemd-logind[1463]: New session 28 of user core. May 14 00:24:33.972873 systemd[1]: Started session-28.scope - Session 28 of User core. 
May 14 00:24:34.830186 sshd[5692]: Connection closed by 172.24.4.1 port 59726 May 14 00:24:34.871032 sshd-session[5690]: pam_unix(sshd:session): session closed for user core May 14 00:24:34.880598 systemd[1]: sshd@25-172.24.4.34:22-172.24.4.1:59726.service: Deactivated successfully. May 14 00:24:34.887678 systemd[1]: session-28.scope: Deactivated successfully. May 14 00:24:34.890484 systemd-logind[1463]: Session 28 logged out. Waiting for processes to exit. May 14 00:24:34.893925 containerd[1482]: time="2025-05-14T00:24:34.893849377Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"818c4c73dad89d378b81e9b3f3038ae2f45a4b0b6d2c8d5e9a83c54bc16d7cda\" pid:5713 exited_at:{seconds:1747182274 nanos:891989469}" May 14 00:24:34.895185 systemd-logind[1463]: Removed session 28. May 14 00:24:35.689328 containerd[1482]: time="2025-05-14T00:24:35.689155640Z" level=warning msg="container event discarded" container=64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd type=CONTAINER_CREATED_EVENT May 14 00:24:35.833362 containerd[1482]: time="2025-05-14T00:24:35.831596293Z" level=warning msg="container event discarded" container=64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd type=CONTAINER_STARTED_EVENT May 14 00:24:36.223856 containerd[1482]: time="2025-05-14T00:24:36.223788347Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"054e6aba7236c723f956e7be90afb0b9031e239340389c0d7651116241ed385f\" pid:5737 exited_at:{seconds:1747182276 nanos:222229953}" May 14 00:24:40.313190 containerd[1482]: time="2025-05-14T00:24:40.312945061Z" level=warning msg="container event discarded" container=8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a type=CONTAINER_CREATED_EVENT May 14 00:24:40.742825 containerd[1482]: time="2025-05-14T00:24:40.742671802Z" level=warning msg="container event discarded" container=8b612aa6386f1e80029021b70a626712510b285fa5a1e87c36b520b9f768495a type=CONTAINER_STARTED_EVENT May 14 00:24:40.838136 containerd[1482]: time="2025-05-14T00:24:40.838028187Z" level=warning msg="container event discarded" container=e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23 type=CONTAINER_CREATED_EVENT May 14 00:24:41.089243 containerd[1482]: time="2025-05-14T00:24:41.088976672Z" level=warning msg="container event discarded" container=e90a2ee08dcaa3c82fc4f1ddb5de6fe8eb2502bcbe4657423b371d4657d98d23 type=CONTAINER_STARTED_EVENT May 14 00:24:43.519501 containerd[1482]: time="2025-05-14T00:24:43.519365514Z" level=warning msg="container event discarded" container=53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6 type=CONTAINER_CREATED_EVENT May 14 00:24:43.663752 containerd[1482]: time="2025-05-14T00:24:43.663397672Z" level=warning msg="container event discarded" container=53cfeb07f06447060a453e3299147b46a064c81e272d5d7cb133c1e676d96aa6 type=CONTAINER_STARTED_EVENT May 14 00:24:44.578498 containerd[1482]: time="2025-05-14T00:24:44.578395921Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"0e615f089790594063a169f2fed916ad30bafb4b45238a6bfb2c44e02022098b\" pid:5764 exited_at:{seconds:1747182284 nanos:577885984}" May 14 00:24:59.919672 update_engine[1468]: I20250514 00:24:59.918807 1468 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs 
May 14 00:24:59.919672 update_engine[1468]: I20250514 00:24:59.919275 1468 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
May 14 00:24:59.924818 update_engine[1468]: I20250514 00:24:59.921993 1468 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
May 14 00:24:59.927751 update_engine[1468]: I20250514 00:24:59.927668 1468 omaha_request_params.cc:62] Current group set to alpha
May 14 00:24:59.931319 update_engine[1468]: I20250514 00:24:59.930949 1468 update_attempter.cc:499] Already updated boot flags. Skipping.
May 14 00:24:59.931319 update_engine[1468]: I20250514 00:24:59.930996 1468 update_attempter.cc:643] Scheduling an action processor start.
May 14 00:24:59.931319 update_engine[1468]: I20250514 00:24:59.931075 1468 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 14 00:24:59.931319 update_engine[1468]: I20250514 00:24:59.931294 1468 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
May 14 00:24:59.931827 update_engine[1468]: I20250514 00:24:59.931491 1468 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 14 00:24:59.931827 update_engine[1468]: I20250514 00:24:59.931562 1468 omaha_request_action.cc:272] Request:
May 14 00:24:59.931827 update_engine[1468]:
May 14 00:24:59.931827 update_engine[1468]:
May 14 00:24:59.931827 update_engine[1468]:
May 14 00:24:59.931827 update_engine[1468]:
May 14 00:24:59.931827 update_engine[1468]:
May 14 00:24:59.931827 update_engine[1468]:
May 14 00:24:59.931827 update_engine[1468]:
May 14 00:24:59.931827 update_engine[1468]:
May 14 00:24:59.931827 update_engine[1468]: I20250514 00:24:59.931591 1468 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 00:24:59.938022 locksmithd[1487]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
May 14 00:24:59.943208 update_engine[1468]: I20250514 00:24:59.943118 1468 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 00:24:59.944655 update_engine[1468]: I20250514 00:24:59.944502 1468 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 00:24:59.952129 update_engine[1468]: E20250514 00:24:59.952019 1468 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 00:24:59.952354 update_engine[1468]: I20250514 00:24:59.952265 1468 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
May 14 00:25:06.248585 containerd[1482]: time="2025-05-14T00:25:06.247844048Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"1ec3c0dc95dcbd8e23aab97766a294f34d27e648b1bf52d6c7c84fcd5eb670d6\" pid:5796 exited_at:{seconds:1747182306 nanos:246339757}"
May 14 00:25:09.905375 update_engine[1468]: I20250514 00:25:09.903641 1468 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 00:25:09.905375 update_engine[1468]: I20250514 00:25:09.904474 1468 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 00:25:09.905375 update_engine[1468]: I20250514 00:25:09.905200 1468 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 00:25:09.910858 update_engine[1468]: E20250514 00:25:09.910599 1468 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 00:25:09.910858 update_engine[1468]: I20250514 00:25:09.910782 1468 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
May 14 00:25:14.582452 containerd[1482]: time="2025-05-14T00:25:14.582322128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"2a274c01a962c06781f9c45bd06c96efa72a12edf13337a5af353125f1a61845\" pid:5820 exited_at:{seconds:1747182314 nanos:581700132}"
May 14 00:25:19.903735 update_engine[1468]: I20250514 00:25:19.903451 1468 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 00:25:19.906300 update_engine[1468]: I20250514 00:25:19.905472 1468 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 00:25:19.906300 update_engine[1468]: I20250514 00:25:19.906141 1468 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 00:25:19.911688 update_engine[1468]: E20250514 00:25:19.911404 1468 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 00:25:19.911688 update_engine[1468]: I20250514 00:25:19.911614 1468 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
May 14 00:25:29.900861 update_engine[1468]: I20250514 00:25:29.899500 1468 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 00:25:29.900861 update_engine[1468]: I20250514 00:25:29.900107 1468 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 00:25:29.900861 update_engine[1468]: I20250514 00:25:29.900763 1468 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 00:25:29.907220 update_engine[1468]: E20250514 00:25:29.906059 1468 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 00:25:29.907220 update_engine[1468]: I20250514 00:25:29.906227 1468 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 14 00:25:29.907220 update_engine[1468]: I20250514 00:25:29.906287 1468 omaha_request_action.cc:617] Omaha request response:
May 14 00:25:29.907220 update_engine[1468]: E20250514 00:25:29.906561 1468 omaha_request_action.cc:636] Omaha request network transfer failed.
May 14 00:25:29.907220 update_engine[1468]: I20250514 00:25:29.906857 1468 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
May 14 00:25:29.907220 update_engine[1468]: I20250514 00:25:29.906876 1468 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 00:25:29.907220 update_engine[1468]: I20250514 00:25:29.906896 1468 update_attempter.cc:306] Processing Done.
May 14 00:25:29.907220 update_engine[1468]: E20250514 00:25:29.906981 1468 update_attempter.cc:619] Update failed.
May 14 00:25:29.907220 update_engine[1468]: I20250514 00:25:29.907011 1468 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
May 14 00:25:29.907220 update_engine[1468]: I20250514 00:25:29.907027 1468 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
May 14 00:25:29.907220 update_engine[1468]: I20250514 00:25:29.907039 1468 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
May 14 00:25:29.909821 update_engine[1468]: I20250514 00:25:29.908484 1468 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 14 00:25:29.909821 update_engine[1468]: I20250514 00:25:29.908653 1468 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 14 00:25:29.909821 update_engine[1468]: I20250514 00:25:29.908674 1468 omaha_request_action.cc:272] Request:
May 14 00:25:29.909821 update_engine[1468]:
May 14 00:25:29.909821 update_engine[1468]:
May 14 00:25:29.909821 update_engine[1468]:
May 14 00:25:29.909821 update_engine[1468]:
May 14 00:25:29.909821 update_engine[1468]:
May 14 00:25:29.909821 update_engine[1468]:
May 14 00:25:29.909821 update_engine[1468]: I20250514 00:25:29.908689 1468 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 00:25:29.909821 update_engine[1468]: I20250514 00:25:29.909022 1468 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 00:25:29.911067 update_engine[1468]: I20250514 00:25:29.910975 1468 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 00:25:29.912981 locksmithd[1487]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
May 14 00:25:29.916612 update_engine[1468]: E20250514 00:25:29.916167 1468 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 00:25:29.916612 update_engine[1468]: I20250514 00:25:29.916322 1468 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 14 00:25:29.916612 update_engine[1468]: I20250514 00:25:29.916349 1468 omaha_request_action.cc:617] Omaha request response:
May 14 00:25:29.916612 update_engine[1468]: I20250514 00:25:29.916365 1468 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 00:25:29.916612 update_engine[1468]: I20250514 00:25:29.916378 1468 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 00:25:29.916612 update_engine[1468]: I20250514 00:25:29.916389 1468 update_attempter.cc:306] Processing Done.
May 14 00:25:29.916612 update_engine[1468]: I20250514 00:25:29.916404 1468 update_attempter.cc:310] Error event sent.
May 14 00:25:29.916612 update_engine[1468]: I20250514 00:25:29.916447 1468 update_check_scheduler.cc:74] Next update check in 43m50s
May 14 00:25:29.918546 locksmithd[1487]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
May 14 00:25:34.862091 containerd[1482]: time="2025-05-14T00:25:34.862022788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"3689cbfd1e32afaabf9064949bb387129ae51bac1dc6fd87c455ce5f7ff9417c\" pid:5845 exited_at:{seconds:1747182334 nanos:861017880}"
May 14 00:25:36.248304 containerd[1482]: time="2025-05-14T00:25:36.248231377Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"fbc54a5a7287fb9c0f63ff93e1284b599991306b39c4ee158511598f0fe59033\" pid:5865 exited_at:{seconds:1747182336 nanos:247715894}"
May 14 00:25:44.562154 containerd[1482]: time="2025-05-14T00:25:44.562051426Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"eb739944a531a39827c4766c854d866c22b9bdd7aef75f9fad0848b2c5692896\" pid:5902 exited_at:{seconds:1747182344 nanos:561275781}"
May 14 00:26:06.249883 containerd[1482]: time="2025-05-14T00:26:06.249649220Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"dfcf367dbe1e8d39edf4f85ffbc1f04397d8cfe8600d72036c692e4e9b061194\" pid:5930 exited_at:{seconds:1747182366 nanos:248864470}"
May 14 00:26:14.593705 containerd[1482]: time="2025-05-14T00:26:14.592667777Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"4cf7e750202dd2b040925a007f5b9b1756d83f9d5339d76ee372449885da1a10\" pid:5953 exited_at:{seconds:1747182374 nanos:591435162}"
May 14 00:26:34.882317 containerd[1482]: time="2025-05-14T00:26:34.881895918Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64dc99f3d89b8a6829e440c51970b624b8248bacd531cb6234c3b3bc9b3ec9fd\" id:\"d3d565bf24182b226fba09b556542f4f518462fb3c56eac57b3d8f1b42e4d1aa\" pid:5976 exited_at:{seconds:1747182394 nanos:878592923}"
May 14 00:26:36.267050 containerd[1482]: time="2025-05-14T00:26:36.266982765Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d9b88ea50487bd6e0ef532b1ffc777fc18c98e313e9bb94f64cdfb673956592\" id:\"9485187a18b6b3d4d6196a4cd48e031023beb551b5856295c1cb3ebeda0242b9\" pid:5999 exited_at:{seconds:1747182396 nanos:266571862}"