May 8 08:40:16.086068 kernel: Linux version 6.6.88-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu May 8 02:00:00 -00 2025
May 8 08:40:16.086097 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=c6e5fdf442cda427c9e51c476b5f39b01a7265d89660ab7c7d9178b52b2cc04b
May 8 08:40:16.086111 kernel: BIOS-provided physical RAM map:
May 8 08:40:16.086120 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
May 8 08:40:16.086128 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
May 8 08:40:16.086136 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
May 8 08:40:16.086146 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable
May 8 08:40:16.086154 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved
May 8 08:40:16.086162 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 8 08:40:16.086171 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
May 8 08:40:16.086179 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable
May 8 08:40:16.086189 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 8 08:40:16.086197 kernel: NX (Execute Disable) protection: active
May 8 08:40:16.086205 kernel: APIC: Static calls initialized
May 8 08:40:16.086215 kernel: SMBIOS 3.0.0 present.
May 8 08:40:16.086224 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014
May 8 08:40:16.086235 kernel: Hypervisor detected: KVM
May 8 08:40:16.086243 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 8 08:40:16.086252 kernel: kvm-clock: using sched offset of 4788924321 cycles
May 8 08:40:16.086261 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 8 08:40:16.086270 kernel: tsc: Detected 1996.249 MHz processor
May 8 08:40:16.086280 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 8 08:40:16.086290 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 8 08:40:16.086299 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000
May 8 08:40:16.086308 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
May 8 08:40:16.086317 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 8 08:40:16.086328 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000
May 8 08:40:16.086336 kernel: ACPI: Early table checksum verification disabled
May 8 08:40:16.086345 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS )
May 8 08:40:16.086355 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 8 08:40:16.086363 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 8 08:40:16.086372 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 8 08:40:16.086381 kernel: ACPI: FACS 0x00000000BFFE0000 000040
May 8 08:40:16.086390 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 8 08:40:16.086399 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 8 08:40:16.086410 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc] May 8 08:40:16.086419 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48] May 8 08:40:16.086428 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] May 8 08:40:16.086436 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] May 8 08:40:16.086445 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] May 8 08:40:16.086458 kernel: No NUMA configuration found May 8 08:40:16.086470 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] May 8 08:40:16.086479 kernel: NODE_DATA(0) allocated [mem 0x13fff5000-0x13fffcfff] May 8 08:40:16.086488 kernel: Zone ranges: May 8 08:40:16.086497 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 8 08:40:16.086507 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 8 08:40:16.086516 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] May 8 08:40:16.086525 kernel: Device empty May 8 08:40:16.086534 kernel: Movable zone start for each node May 8 08:40:16.086546 kernel: Early memory node ranges May 8 08:40:16.086555 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 8 08:40:16.086564 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] May 8 08:40:16.086573 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] May 8 08:40:16.086582 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] May 8 08:40:16.086591 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 8 08:40:16.086601 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 8 08:40:16.086610 kernel: On node 0, zone Normal: 35 pages in unavailable ranges May 8 08:40:16.086619 kernel: ACPI: PM-Timer IO Port: 0x608 May 8 08:40:16.086630 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 8 08:40:16.086639 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 8 08:40:16.086648 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 8 08:40:16.086658 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 8 08:40:16.086667 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 8 08:40:16.086676 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 8 08:40:16.086685 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 8 08:40:16.086694 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 8 08:40:16.086704 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs May 8 08:40:16.086715 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 8 08:40:16.086724 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices May 8 08:40:16.086734 kernel: Booting paravirtualized kernel on KVM May 8 08:40:16.086744 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 8 08:40:16.086753 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 8 08:40:16.086763 kernel: percpu: Embedded 58 pages/cpu s197480 r8192 d31896 u1048576 May 8 08:40:16.086772 kernel: pcpu-alloc: s197480 r8192 d31896 u1048576 alloc=1*2097152 May 8 08:40:16.086781 kernel: pcpu-alloc: [0] 0 1 May 8 08:40:16.086790 kernel: kvm-guest: PV spinlocks disabled, no host support May 8 08:40:16.086802 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr 
verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=c6e5fdf442cda427c9e51c476b5f39b01a7265d89660ab7c7d9178b52b2cc04b May 8 08:40:16.086812 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 8 08:40:16.086821 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 8 08:40:16.086831 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 8 08:40:16.086840 kernel: Fallback order for Node 0: 0 May 8 08:40:16.086849 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901 May 8 08:40:16.086858 kernel: Policy zone: Normal May 8 08:40:16.086868 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 8 08:40:16.086879 kernel: software IO TLB: area num 2. May 8 08:40:16.086888 kernel: Memory: 3968244K/4193772K available (14336K kernel code, 2309K rwdata, 9040K rodata, 53684K init, 1592K bss, 225268K reserved, 0K cma-reserved) May 8 08:40:16.086898 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 8 08:40:16.086907 kernel: ftrace: allocating 38190 entries in 150 pages May 8 08:40:16.086916 kernel: ftrace: allocated 150 pages with 4 groups May 8 08:40:16.086925 kernel: Dynamic Preempt: voluntary May 8 08:40:16.086934 kernel: rcu: Preemptible hierarchical RCU implementation. May 8 08:40:16.086945 kernel: rcu: RCU event tracing is enabled. May 8 08:40:16.086954 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 8 08:40:16.086965 kernel: Trampoline variant of Tasks RCU enabled. May 8 08:40:16.086975 kernel: Rude variant of Tasks RCU enabled. May 8 08:40:16.089849 kernel: Tracing variant of Tasks RCU enabled. May 8 08:40:16.089860 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 8 08:40:16.089870 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 8 08:40:16.089879 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 May 8 08:40:16.089888 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 8 08:40:16.089897 kernel: Console: colour VGA+ 80x25 May 8 08:40:16.089906 kernel: printk: console [tty0] enabled May 8 08:40:16.089916 kernel: printk: console [ttyS0] enabled May 8 08:40:16.089930 kernel: ACPI: Core revision 20230628 May 8 08:40:16.089940 kernel: APIC: Switch to symmetric I/O mode setup May 8 08:40:16.089949 kernel: x2apic enabled May 8 08:40:16.089958 kernel: APIC: Switched APIC routing to: physical x2apic May 8 08:40:16.089967 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 8 08:40:16.089977 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized May 8 08:40:16.090000 kernel: Calibrating delay loop (skipped) preset value.. 
3992.49 BogoMIPS (lpj=1996249) May 8 08:40:16.090010 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 May 8 08:40:16.090019 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 May 8 08:40:16.090032 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 8 08:40:16.090042 kernel: Spectre V2 : Mitigation: Retpolines May 8 08:40:16.090051 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch May 8 08:40:16.090060 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT May 8 08:40:16.090069 kernel: Speculative Store Bypass: Vulnerable May 8 08:40:16.090079 kernel: x86/fpu: x87 FPU will use FXSAVE May 8 08:40:16.090096 kernel: Freeing SMP alternatives memory: 32K May 8 08:40:16.090106 kernel: pid_max: default: 32768 minimum: 301 May 8 08:40:16.090116 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity May 8 08:40:16.090125 kernel: landlock: Up and running. May 8 08:40:16.090135 kernel: SELinux: Initializing. May 8 08:40:16.090145 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 8 08:40:16.090156 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 8 08:40:16.090166 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) May 8 08:40:16.090176 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 8 08:40:16.090186 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 8 08:40:16.090198 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 8 08:40:16.090208 kernel: Performance Events: AMD PMU driver. May 8 08:40:16.090217 kernel: ... version: 0 May 8 08:40:16.090227 kernel: ... bit width: 48 May 8 08:40:16.090237 kernel: ... generic registers: 4 May 8 08:40:16.090247 kernel: ... value mask: 0000ffffffffffff May 8 08:40:16.090256 kernel: ... max period: 00007fffffffffff May 8 08:40:16.090266 kernel: ... fixed-purpose events: 0 May 8 08:40:16.090275 kernel: ... event mask: 000000000000000f May 8 08:40:16.090287 kernel: signal: max sigframe size: 1440 May 8 08:40:16.090296 kernel: rcu: Hierarchical SRCU implementation. May 8 08:40:16.090306 kernel: rcu: Max phase no-delay instances is 400. May 8 08:40:16.090317 kernel: smp: Bringing up secondary CPUs ... May 8 08:40:16.090327 kernel: smpboot: x86: Booting SMP configuration: May 8 08:40:16.090336 kernel: .... 
node #0, CPUs: #1 May 8 08:40:16.090346 kernel: smp: Brought up 1 node, 2 CPUs May 8 08:40:16.090356 kernel: smpboot: Max logical packages: 2 May 8 08:40:16.090365 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) May 8 08:40:16.090375 kernel: devtmpfs: initialized May 8 08:40:16.090386 kernel: x86/mm: Memory block size: 128MB May 8 08:40:16.090396 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 8 08:40:16.090406 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 8 08:40:16.090416 kernel: pinctrl core: initialized pinctrl subsystem May 8 08:40:16.090425 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 8 08:40:16.090435 kernel: audit: initializing netlink subsys (disabled) May 8 08:40:16.090445 kernel: audit: type=2000 audit(1746693612.170:1): state=initialized audit_enabled=0 res=1 May 8 08:40:16.090454 kernel: thermal_sys: Registered thermal governor 'step_wise' May 8 08:40:16.090466 kernel: thermal_sys: Registered thermal governor 'user_space' May 8 08:40:16.090475 kernel: cpuidle: using governor menu May 8 08:40:16.090485 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 8 08:40:16.090495 kernel: dca service started, version 1.12.1 May 8 08:40:16.090505 kernel: PCI: Using configuration type 1 for base access May 8 08:40:16.090515 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. May 8 08:40:16.090525 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 8 08:40:16.090535 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 8 08:40:16.090544 kernel: ACPI: Added _OSI(Module Device) May 8 08:40:16.090554 kernel: ACPI: Added _OSI(Processor Device) May 8 08:40:16.090565 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 8 08:40:16.090575 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 8 08:40:16.090585 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 8 08:40:16.090595 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC May 8 08:40:16.090604 kernel: ACPI: Interpreter enabled May 8 08:40:16.090614 kernel: ACPI: PM: (supports S0 S3 S5) May 8 08:40:16.090624 kernel: ACPI: Using IOAPIC for interrupt routing May 8 08:40:16.090634 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 8 08:40:16.090644 kernel: PCI: Using E820 reservations for host bridge windows May 8 08:40:16.090656 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F May 8 08:40:16.090666 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 8 08:40:16.090815 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] May 8 08:40:16.090919 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] May 8 08:40:16.091055 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge May 8 08:40:16.091071 kernel: acpiphp: Slot [3] registered May 8 08:40:16.091081 kernel: acpiphp: Slot [4] registered May 8 08:40:16.091095 kernel: acpiphp: Slot [5] registered May 8 08:40:16.091104 kernel: acpiphp: Slot [6] registered May 8 08:40:16.091114 kernel: acpiphp: Slot [7] registered May 8 08:40:16.091123 kernel: acpiphp: Slot [8] registered May 8 08:40:16.091133 kernel: acpiphp: Slot [9] registered May 8 08:40:16.091142 kernel: acpiphp: Slot [10] registered May 8 08:40:16.091152 kernel: acpiphp: Slot [11] registered May 8 
08:40:16.091162 kernel: acpiphp: Slot [12] registered May 8 08:40:16.091172 kernel: acpiphp: Slot [13] registered May 8 08:40:16.091183 kernel: acpiphp: Slot [14] registered May 8 08:40:16.091192 kernel: acpiphp: Slot [15] registered May 8 08:40:16.091202 kernel: acpiphp: Slot [16] registered May 8 08:40:16.091211 kernel: acpiphp: Slot [17] registered May 8 08:40:16.091221 kernel: acpiphp: Slot [18] registered May 8 08:40:16.091231 kernel: acpiphp: Slot [19] registered May 8 08:40:16.091240 kernel: acpiphp: Slot [20] registered May 8 08:40:16.091250 kernel: acpiphp: Slot [21] registered May 8 08:40:16.091259 kernel: acpiphp: Slot [22] registered May 8 08:40:16.091269 kernel: acpiphp: Slot [23] registered May 8 08:40:16.091280 kernel: acpiphp: Slot [24] registered May 8 08:40:16.091290 kernel: acpiphp: Slot [25] registered May 8 08:40:16.091299 kernel: acpiphp: Slot [26] registered May 8 08:40:16.091309 kernel: acpiphp: Slot [27] registered May 8 08:40:16.091318 kernel: acpiphp: Slot [28] registered May 8 08:40:16.091328 kernel: acpiphp: Slot [29] registered May 8 08:40:16.091337 kernel: acpiphp: Slot [30] registered May 8 08:40:16.091347 kernel: acpiphp: Slot [31] registered May 8 08:40:16.091357 kernel: PCI host bridge to bus 0000:00 May 8 08:40:16.091458 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 8 08:40:16.091546 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] May 8 08:40:16.091629 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 8 08:40:16.091715 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 8 08:40:16.091798 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] May 8 08:40:16.091879 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 8 08:40:16.092005 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 May 8 08:40:16.092122 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 May 8 08:40:16.092227 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 May 8 08:40:16.092322 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f] May 8 08:40:16.092419 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] May 8 08:40:16.092512 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] May 8 08:40:16.092604 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] May 8 08:40:16.092700 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] May 8 08:40:16.092799 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 May 8 08:40:16.092893 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI May 8 08:40:16.095181 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB May 8 08:40:16.095305 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 May 8 08:40:16.095402 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref] May 8 08:40:16.095501 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref] May 8 08:40:16.095594 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff] May 8 08:40:16.095685 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref] May 8 08:40:16.095778 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 8 08:40:16.095882 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 May 8 08:40:16.095993 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf] May 8 08:40:16.096092 kernel: pci 0000:00:03.0: 
reg 0x14: [mem 0xfeb91000-0xfeb91fff] May 8 08:40:16.096189 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref] May 8 08:40:16.096282 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref] May 8 08:40:16.096380 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 May 8 08:40:16.096472 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f] May 8 08:40:16.096565 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff] May 8 08:40:16.096655 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref] May 8 08:40:16.096753 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 May 8 08:40:16.096878 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff] May 8 08:40:16.100032 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref] May 8 08:40:16.100241 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 May 8 08:40:16.100344 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f] May 8 08:40:16.100443 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff] May 8 08:40:16.100540 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref] May 8 08:40:16.100554 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 8 08:40:16.100564 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 8 08:40:16.100578 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 8 08:40:16.100587 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 8 08:40:16.100597 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 May 8 08:40:16.100606 kernel: iommu: Default domain type: Translated May 8 08:40:16.100616 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 8 08:40:16.100629 kernel: PCI: Using ACPI for IRQ routing May 8 08:40:16.100643 kernel: PCI: pci_cache_line_size set to 64 bytes May 8 08:40:16.100656 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 8 08:40:16.100669 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] May 8 08:40:16.100784 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device May 8 08:40:16.100881 kernel: pci 0000:00:02.0: vgaarb: bridge control possible May 8 08:40:16.101157 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 8 08:40:16.101174 kernel: vgaarb: loaded May 8 08:40:16.101183 kernel: clocksource: Switched to clocksource kvm-clock May 8 08:40:16.101193 kernel: VFS: Disk quotas dquot_6.6.0 May 8 08:40:16.101202 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 8 08:40:16.101211 kernel: pnp: PnP ACPI init May 8 08:40:16.101307 kernel: pnp 00:03: [dma 2] May 8 08:40:16.101323 kernel: pnp: PnP ACPI: found 5 devices May 8 08:40:16.101332 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 8 08:40:16.101342 kernel: NET: Registered PF_INET protocol family May 8 08:40:16.101351 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 8 08:40:16.101361 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 8 08:40:16.101370 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 8 08:40:16.101379 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 8 08:40:16.101389 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 8 08:40:16.101401 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 
8 08:40:16.101410 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 8 08:40:16.101419 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 8 08:40:16.101429 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 8 08:40:16.101438 kernel: NET: Registered PF_XDP protocol family May 8 08:40:16.101523 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 8 08:40:16.101620 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 8 08:40:16.101702 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 8 08:40:16.101787 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window] May 8 08:40:16.101865 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] May 8 08:40:16.105385 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release May 8 08:40:16.105496 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 8 08:40:16.105511 kernel: PCI: CLS 0 bytes, default 64 May 8 08:40:16.105521 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 8 08:40:16.105531 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) May 8 08:40:16.105540 kernel: Initialise system trusted keyrings May 8 08:40:16.105554 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 8 08:40:16.105563 kernel: Key type asymmetric registered May 8 08:40:16.105572 kernel: Asymmetric key parser 'x509' registered May 8 08:40:16.105582 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 8 08:40:16.105607 kernel: io scheduler mq-deadline registered May 8 08:40:16.105616 kernel: io scheduler kyber registered May 8 08:40:16.105625 kernel: io scheduler bfq registered May 8 08:40:16.105634 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 8 08:40:16.105644 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 May 8 08:40:16.105656 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 May 8 08:40:16.105665 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 May 8 08:40:16.105674 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 May 8 08:40:16.105683 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 8 08:40:16.105693 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 8 08:40:16.105702 kernel: random: crng init done May 8 08:40:16.105711 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 8 08:40:16.105720 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 8 08:40:16.105729 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 8 08:40:16.105826 kernel: rtc_cmos 00:04: RTC can wake from S4 May 8 08:40:16.105841 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 8 08:40:16.105921 kernel: rtc_cmos 00:04: registered as rtc0 May 8 08:40:16.106039 kernel: rtc_cmos 00:04: setting system clock to 2025-05-08T08:40:15 UTC (1746693615) May 8 08:40:16.106126 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram May 8 08:40:16.106139 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 8 08:40:16.106149 kernel: NET: Registered PF_INET6 protocol family May 8 08:40:16.106158 kernel: Segment Routing with IPv6 May 8 08:40:16.106172 kernel: In-situ OAM (IOAM) with IPv6 May 8 08:40:16.106181 kernel: NET: Registered PF_PACKET protocol family May 8 08:40:16.106190 kernel: Key type dns_resolver registered May 8 08:40:16.106199 kernel: IPI shorthand broadcast: enabled May 8 08:40:16.106208 kernel: 
sched_clock: Marking stable (3650007946, 167337579)->(3858190636, -40845111) May 8 08:40:16.106217 kernel: registered taskstats version 1 May 8 08:40:16.106226 kernel: Loading compiled-in X.509 certificates May 8 08:40:16.106235 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.88-flatcar: 1d65a3ec8f410b8df3c709e9a079b9ae6c6b7ac3' May 8 08:40:16.106244 kernel: Key type .fscrypt registered May 8 08:40:16.106255 kernel: Key type fscrypt-provisioning registered May 8 08:40:16.106264 kernel: ima: No TPM chip found, activating TPM-bypass! May 8 08:40:16.106274 kernel: ima: Allocated hash algorithm: sha1 May 8 08:40:16.106283 kernel: ima: No architecture policies found May 8 08:40:16.106292 kernel: clk: Disabling unused clocks May 8 08:40:16.106300 kernel: Warning: unable to open an initial console. May 8 08:40:16.106310 kernel: Freeing unused kernel image (initmem) memory: 53684K May 8 08:40:16.106331 kernel: Write protecting the kernel read-only data: 24576k May 8 08:40:16.106340 kernel: Freeing unused kernel image (rodata/data gap) memory: 1200K May 8 08:40:16.106351 kernel: Run /init as init process May 8 08:40:16.106360 kernel: with arguments: May 8 08:40:16.106369 kernel: /init May 8 08:40:16.106378 kernel: with environment: May 8 08:40:16.106387 kernel: HOME=/ May 8 08:40:16.106395 kernel: TERM=linux May 8 08:40:16.106404 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 8 08:40:16.106415 systemd[1]: Successfully made /usr/ read-only. May 8 08:40:16.106429 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 8 08:40:16.106440 systemd[1]: Detected virtualization kvm. May 8 08:40:16.106449 systemd[1]: Detected architecture x86-64. May 8 08:40:16.106459 systemd[1]: Running in initrd. May 8 08:40:16.106468 systemd[1]: No hostname configured, using default hostname. May 8 08:40:16.106487 systemd[1]: Hostname set to . May 8 08:40:16.106497 systemd[1]: Initializing machine ID from VM UUID. May 8 08:40:16.106511 systemd[1]: Queued start job for default target initrd.target. May 8 08:40:16.106530 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 8 08:40:16.106542 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 8 08:40:16.106554 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 8 08:40:16.106565 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 8 08:40:16.106576 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 8 08:40:16.106590 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 8 08:40:16.106602 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 8 08:40:16.106613 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 8 08:40:16.106624 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 8 08:40:16.106635 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
May 8 08:40:16.106646 systemd[1]: Reached target paths.target - Path Units. May 8 08:40:16.106667 systemd[1]: Reached target slices.target - Slice Units. May 8 08:40:16.106679 systemd[1]: Reached target swap.target - Swaps. May 8 08:40:16.109037 systemd[1]: Reached target timers.target - Timer Units. May 8 08:40:16.109052 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 8 08:40:16.109062 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 8 08:40:16.109073 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 8 08:40:16.109083 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 8 08:40:16.109094 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 8 08:40:16.109104 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 8 08:40:16.109114 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 8 08:40:16.109128 systemd[1]: Reached target sockets.target - Socket Units. May 8 08:40:16.109138 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 8 08:40:16.109148 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 8 08:40:16.109158 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 8 08:40:16.109169 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 8 08:40:16.109179 systemd[1]: Starting systemd-fsck-usr.service... May 8 08:40:16.109189 systemd[1]: Starting systemd-journald.service - Journal Service... May 8 08:40:16.109199 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 8 08:40:16.109211 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 8 08:40:16.109222 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 8 08:40:16.109234 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 8 08:40:16.109271 systemd-journald[185]: Collecting audit messages is disabled. May 8 08:40:16.109298 systemd[1]: Finished systemd-fsck-usr.service. May 8 08:40:16.109311 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 8 08:40:16.109323 systemd-journald[185]: Journal started May 8 08:40:16.109346 systemd-journald[185]: Runtime Journal (/run/log/journal/1aa48beeb2004471a665a4100cf02bf1) is 8M, max 78.5M, 70.5M free. May 8 08:40:16.098825 systemd-modules-load[187]: Inserted module 'overlay' May 8 08:40:16.170843 systemd[1]: Started systemd-journald.service - Journal Service. May 8 08:40:16.170890 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 8 08:40:16.170927 kernel: Bridge firewalling registered May 8 08:40:16.145073 systemd-modules-load[187]: Inserted module 'br_netfilter' May 8 08:40:16.172851 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 8 08:40:16.173622 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 8 08:40:16.177265 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 8 08:40:16.181143 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
May 8 08:40:16.185472 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 8 08:40:16.194368 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 8 08:40:16.201279 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 8 08:40:16.210017 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 8 08:40:16.213095 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 8 08:40:16.216238 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 8 08:40:16.221954 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 8 08:40:16.236297 systemd-tmpfiles[210]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 8 08:40:16.239458 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 8 08:40:16.242092 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 8 08:40:16.244776 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=c6e5fdf442cda427c9e51c476b5f39b01a7265d89660ab7c7d9178b52b2cc04b May 8 08:40:16.284009 systemd-resolved[234]: Positive Trust Anchors: May 8 08:40:16.284021 systemd-resolved[234]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 8 08:40:16.284061 systemd-resolved[234]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 8 08:40:16.291305 systemd-resolved[234]: Defaulting to hostname 'linux'. May 8 08:40:16.292714 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 8 08:40:16.293278 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 8 08:40:16.314012 kernel: SCSI subsystem initialized May 8 08:40:16.325025 kernel: Loading iSCSI transport class v2.0-870. May 8 08:40:16.339022 kernel: iscsi: registered transport (tcp) May 8 08:40:16.386285 kernel: iscsi: registered transport (qla4xxx) May 8 08:40:16.386373 kernel: QLogic iSCSI HBA Driver May 8 08:40:16.409047 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 8 08:40:16.429384 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 8 08:40:16.431548 systemd[1]: Reached target network-pre.target - Preparation for Network. May 8 08:40:16.474196 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 8 08:40:16.477086 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
May 8 08:40:16.530070 kernel: raid6: sse2x4 gen() 12736 MB/s
May 8 08:40:16.548068 kernel: raid6: sse2x2 gen() 14403 MB/s
May 8 08:40:16.566447 kernel: raid6: sse2x1 gen() 9536 MB/s
May 8 08:40:16.566517 kernel: raid6: using algorithm sse2x2 gen() 14403 MB/s
May 8 08:40:16.585399 kernel: raid6: .... xor() 9347 MB/s, rmw enabled
May 8 08:40:16.585462 kernel: raid6: using ssse3x2 recovery algorithm
May 8 08:40:16.607198 kernel: xor: measuring software checksum speed
May 8 08:40:16.607261 kernel: prefetch64-sse : 16362 MB/sec
May 8 08:40:16.609621 kernel: generic_sse : 16250 MB/sec
May 8 08:40:16.609661 kernel: xor: using function: prefetch64-sse (16362 MB/sec)
May 8 08:40:16.783475 kernel: Btrfs loaded, zoned=no, fsverity=no
May 8 08:40:16.789423 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 8 08:40:16.794401 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 8 08:40:16.822095 systemd-udevd[435]: Using default interface naming scheme 'v255'.
May 8 08:40:16.827484 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 8 08:40:16.834021 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 8 08:40:16.862839 dracut-pre-trigger[449]: rd.md=0: removing MD RAID activation
May 8 08:40:16.890867 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 8 08:40:16.894424 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 8 08:40:16.951256 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 8 08:40:16.955501 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 8 08:40:17.027011 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues
May 8 08:40:17.087326 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB)
May 8 08:40:17.087476 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2
May 8 08:40:17.087492 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
May 8 08:40:17.087505 kernel: GPT:17805311 != 20971519
May 8 08:40:17.087517 kernel: GPT:Alternate GPT header not at the end of the disk.
May 8 08:40:17.087529 kernel: GPT:17805311 != 20971519
May 8 08:40:17.087541 kernel: GPT: Use GNU Parted to correct GPT errors.
May 8 08:40:17.087553 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 8 08:40:17.078434 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 8 08:40:17.078566 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 8 08:40:17.079182 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 8 08:40:17.080623 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 8 08:40:17.082464 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 8 08:40:17.123004 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (497)
May 8 08:40:17.123060 kernel: BTRFS: device fsid 4222d142-b650-43a9-9f25-c45bb16dfbfe devid 1 transid 43 /dev/vda3 scanned by (udev-worker) (484)
May 8 08:40:17.152007 kernel: libata version 3.00 loaded.
May 8 08:40:17.156070 kernel: ata_piix 0000:00:01.1: version 2.13 May 8 08:40:17.169396 kernel: scsi host0: ata_piix May 8 08:40:17.169515 kernel: scsi host1: ata_piix May 8 08:40:17.169658 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 May 8 08:40:17.169682 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 May 8 08:40:17.176164 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 8 08:40:17.187213 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 8 08:40:17.196723 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 8 08:40:17.197314 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 8 08:40:17.209307 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 8 08:40:17.220561 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 8 08:40:17.224090 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 8 08:40:17.256006 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 8 08:40:17.256125 disk-uuid[534]: Primary Header is updated. May 8 08:40:17.256125 disk-uuid[534]: Secondary Entries is updated. May 8 08:40:17.256125 disk-uuid[534]: Secondary Header is updated. May 8 08:40:17.405269 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 8 08:40:17.422142 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 8 08:40:17.423564 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 8 08:40:17.424765 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 8 08:40:17.426796 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 8 08:40:17.447696 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 8 08:40:18.282067 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 8 08:40:18.283099 disk-uuid[535]: The operation has completed successfully. May 8 08:40:18.365292 systemd[1]: disk-uuid.service: Deactivated successfully. May 8 08:40:18.365405 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 8 08:40:18.413826 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 8 08:40:18.432448 sh[560]: Success May 8 08:40:18.458520 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 8 08:40:18.458618 kernel: device-mapper: uevent: version 1.0.3 May 8 08:40:18.459490 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 8 08:40:18.484032 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3" May 8 08:40:18.559455 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 8 08:40:18.561100 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 8 08:40:18.582200 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 8 08:40:18.605039 kernel: BTRFS info (device dm-0): first mount of filesystem 4222d142-b650-43a9-9f25-c45bb16dfbfe May 8 08:40:18.605121 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 8 08:40:18.605153 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 8 08:40:18.605577 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 8 08:40:18.607214 kernel: BTRFS info (device dm-0): using free space tree May 8 08:40:18.628701 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 8 08:40:18.630798 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 8 08:40:18.632808 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 8 08:40:18.636210 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 8 08:40:18.641239 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 8 08:40:18.688299 kernel: BTRFS info (device vda6): first mount of filesystem cf994bdc-a9fa-440e-8b4d-4cf650650e1d May 8 08:40:18.688388 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 8 08:40:18.692495 kernel: BTRFS info (device vda6): using free space tree May 8 08:40:18.710073 kernel: BTRFS info (device vda6): auto enabling async discard May 8 08:40:18.723210 kernel: BTRFS info (device vda6): last unmount of filesystem cf994bdc-a9fa-440e-8b4d-4cf650650e1d May 8 08:40:18.739816 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 8 08:40:18.747238 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 8 08:40:18.791425 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 8 08:40:18.795405 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 8 08:40:18.832707 systemd-networkd[742]: lo: Link UP May 8 08:40:18.832720 systemd-networkd[742]: lo: Gained carrier May 8 08:40:18.834290 systemd-networkd[742]: Enumeration completed May 8 08:40:18.834357 systemd[1]: Started systemd-networkd.service - Network Configuration. May 8 08:40:18.834952 systemd[1]: Reached target network.target - Network. May 8 08:40:18.835498 systemd-networkd[742]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 8 08:40:18.835502 systemd-networkd[742]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 8 08:40:18.836285 systemd-networkd[742]: eth0: Link UP May 8 08:40:18.836289 systemd-networkd[742]: eth0: Gained carrier May 8 08:40:18.836298 systemd-networkd[742]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 8 08:40:18.850312 systemd-networkd[742]: eth0: DHCPv4 address 172.24.4.129/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 8 08:40:18.971891 ignition[681]: Ignition 2.21.0 May 8 08:40:18.971910 ignition[681]: Stage: fetch-offline May 8 08:40:18.975881 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
May 8 08:40:18.971950 ignition[681]: no configs at "/usr/lib/ignition/base.d" May 8 08:40:18.971960 ignition[681]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 8 08:40:18.972087 ignition[681]: parsed url from cmdline: "" May 8 08:40:18.980090 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 8 08:40:18.972093 ignition[681]: no config URL provided May 8 08:40:18.972099 ignition[681]: reading system config file "/usr/lib/ignition/user.ign" May 8 08:40:18.972108 ignition[681]: no config at "/usr/lib/ignition/user.ign" May 8 08:40:18.972113 ignition[681]: failed to fetch config: resource requires networking May 8 08:40:18.972271 ignition[681]: Ignition finished successfully May 8 08:40:19.009187 ignition[753]: Ignition 2.21.0 May 8 08:40:19.009207 ignition[753]: Stage: fetch May 8 08:40:19.009435 ignition[753]: no configs at "/usr/lib/ignition/base.d" May 8 08:40:19.009454 ignition[753]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 8 08:40:19.009720 ignition[753]: parsed url from cmdline: "" May 8 08:40:19.009728 ignition[753]: no config URL provided May 8 08:40:19.009738 ignition[753]: reading system config file "/usr/lib/ignition/user.ign" May 8 08:40:19.009755 ignition[753]: no config at "/usr/lib/ignition/user.ign" May 8 08:40:19.009895 ignition[753]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... May 8 08:40:19.009921 ignition[753]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 May 8 08:40:19.009942 ignition[753]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... May 8 08:40:19.246836 ignition[753]: GET result: OK May 8 08:40:19.246947 ignition[753]: parsing config with SHA512: 6eeca4523fb38933f756870fa78614d18d96b4ffd5afcc94e37b1de0cc03689d20337b748c136ae32ef76f0ffb71e38dfca7429dbb47dbc6dcae22b949c210ea May 8 08:40:19.255271 unknown[753]: fetched base config from "system" May 8 08:40:19.255285 unknown[753]: fetched base config from "system" May 8 08:40:19.255628 ignition[753]: fetch: fetch complete May 8 08:40:19.255291 unknown[753]: fetched user config from "openstack" May 8 08:40:19.255635 ignition[753]: fetch: fetch passed May 8 08:40:19.257845 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 8 08:40:19.255674 ignition[753]: Ignition finished successfully May 8 08:40:19.262101 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 8 08:40:19.308468 ignition[760]: Ignition 2.21.0 May 8 08:40:19.308509 ignition[760]: Stage: kargs May 8 08:40:19.308868 ignition[760]: no configs at "/usr/lib/ignition/base.d" May 8 08:40:19.308894 ignition[760]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 8 08:40:19.312276 ignition[760]: kargs: kargs passed May 8 08:40:19.317289 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 8 08:40:19.312450 ignition[760]: Ignition finished successfully May 8 08:40:19.322933 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 8 08:40:19.372028 ignition[766]: Ignition 2.21.0 May 8 08:40:19.372054 ignition[766]: Stage: disks May 8 08:40:19.372371 ignition[766]: no configs at "/usr/lib/ignition/base.d" May 8 08:40:19.372397 ignition[766]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 8 08:40:19.380451 ignition[766]: disks: disks passed May 8 08:40:19.380591 ignition[766]: Ignition finished successfully May 8 08:40:19.383124 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
May 8 08:40:19.385356 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 8 08:40:19.387462 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 8 08:40:19.390334 systemd[1]: Reached target local-fs.target - Local File Systems. May 8 08:40:19.393160 systemd[1]: Reached target sysinit.target - System Initialization. May 8 08:40:19.395585 systemd[1]: Reached target basic.target - Basic System. May 8 08:40:19.400171 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 8 08:40:19.445328 systemd-fsck[774]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks May 8 08:40:19.456962 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 8 08:40:19.461862 systemd[1]: Mounting sysroot.mount - /sysroot... May 8 08:40:19.661023 kernel: EXT4-fs (vda9): mounted filesystem a930b30e-9e26-4ff0-8cba-060d78a760a1 r/w with ordered data mode. Quota mode: none. May 8 08:40:19.661977 systemd[1]: Mounted sysroot.mount - /sysroot. May 8 08:40:19.663531 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 8 08:40:19.666618 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 8 08:40:19.669090 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 8 08:40:19.670434 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 8 08:40:19.675296 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... May 8 08:40:19.676771 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 8 08:40:19.676866 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 8 08:40:19.689028 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 8 08:40:19.697334 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 8 08:40:19.706032 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (782) May 8 08:40:19.716260 kernel: BTRFS info (device vda6): first mount of filesystem cf994bdc-a9fa-440e-8b4d-4cf650650e1d May 8 08:40:19.716329 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 8 08:40:19.722202 kernel: BTRFS info (device vda6): using free space tree May 8 08:40:19.733142 kernel: BTRFS info (device vda6): auto enabling async discard May 8 08:40:19.740496 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 8 08:40:19.800642 initrd-setup-root[810]: cut: /sysroot/etc/passwd: No such file or directory May 8 08:40:19.805020 initrd-setup-root[817]: cut: /sysroot/etc/group: No such file or directory May 8 08:40:19.811127 initrd-setup-root[824]: cut: /sysroot/etc/shadow: No such file or directory May 8 08:40:19.816902 initrd-setup-root[831]: cut: /sysroot/etc/gshadow: No such file or directory May 8 08:40:19.910577 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 8 08:40:19.912452 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 8 08:40:19.915115 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 8 08:40:19.927266 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
May 8 08:40:19.930087 kernel: BTRFS info (device vda6): last unmount of filesystem cf994bdc-a9fa-440e-8b4d-4cf650650e1d
May 8 08:40:19.950682 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 8 08:40:19.959945 ignition[899]: INFO : Ignition 2.21.0
May 8 08:40:19.959945 ignition[899]: INFO : Stage: mount
May 8 08:40:19.962498 ignition[899]: INFO : no configs at "/usr/lib/ignition/base.d"
May 8 08:40:19.962498 ignition[899]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
May 8 08:40:19.962498 ignition[899]: INFO : mount: mount passed
May 8 08:40:19.962498 ignition[899]: INFO : Ignition finished successfully
May 8 08:40:19.963146 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 8 08:40:20.311348 systemd-networkd[742]: eth0: Gained IPv6LL
May 8 08:40:26.853492 coreos-metadata[784]: May 08 08:40:26.853 WARN failed to locate config-drive, using the metadata service API instead
May 8 08:40:26.894933 coreos-metadata[784]: May 08 08:40:26.894 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
May 8 08:40:26.911626 coreos-metadata[784]: May 08 08:40:26.911 INFO Fetch successful
May 8 08:40:26.913204 coreos-metadata[784]: May 08 08:40:26.912 INFO wrote hostname ci-4327-0-0-w-78bcb828ec.novalocal to /sysroot/etc/hostname
May 8 08:40:26.915944 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
May 8 08:40:26.916232 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
May 8 08:40:26.923845 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 8 08:40:26.949669 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 8 08:40:26.980114 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (915)
May 8 08:40:26.987583 kernel: BTRFS info (device vda6): first mount of filesystem cf994bdc-a9fa-440e-8b4d-4cf650650e1d
May 8 08:40:26.987669 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 8 08:40:26.991788 kernel: BTRFS info (device vda6): using free space tree
May 8 08:40:27.003107 kernel: BTRFS info (device vda6): auto enabling async discard
May 8 08:40:27.007805 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 8 08:40:27.064120 ignition[933]: INFO : Ignition 2.21.0 May 8 08:40:27.064120 ignition[933]: INFO : Stage: files May 8 08:40:27.067632 ignition[933]: INFO : no configs at "/usr/lib/ignition/base.d" May 8 08:40:27.067632 ignition[933]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 8 08:40:27.073829 ignition[933]: DEBUG : files: compiled without relabeling support, skipping May 8 08:40:27.080068 ignition[933]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 8 08:40:27.080068 ignition[933]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 8 08:40:27.084533 ignition[933]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 8 08:40:27.086666 ignition[933]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 8 08:40:27.089187 ignition[933]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 8 08:40:27.088851 unknown[933]: wrote ssh authorized keys file for user: core May 8 08:40:27.094296 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 8 08:40:27.096783 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 8 08:40:28.141040 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 8 08:40:28.454487 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 8 08:40:28.454487 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 8 08:40:28.459454 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 May 8 08:40:29.175282 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 8 08:40:30.749043 ignition[933]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 8 08:40:30.749043 ignition[933]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 8 08:40:30.754422 ignition[933]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 8 08:40:30.754422 ignition[933]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 8 08:40:30.754422 ignition[933]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 8 08:40:30.754422 ignition[933]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 8 08:40:30.754422 ignition[933]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 8 08:40:30.754422 ignition[933]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 8 08:40:30.754422 ignition[933]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 8 08:40:30.754422 ignition[933]: INFO : files: files passed May 8 08:40:30.754422 ignition[933]: INFO : Ignition finished successfully May 8 08:40:30.752315 systemd[1]: Finished ignition-files.service - Ignition (files). May 8 08:40:30.759134 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 8 08:40:30.761950 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 8 08:40:30.773634 systemd[1]: ignition-quench.service: Deactivated successfully. May 8 08:40:30.773722 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 8 08:40:30.784334 initrd-setup-root-after-ignition[962]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 8 08:40:30.784334 initrd-setup-root-after-ignition[962]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 8 08:40:30.787125 initrd-setup-root-after-ignition[966]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 8 08:40:30.790466 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 8 08:40:30.793308 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 8 08:40:30.798210 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 8 08:40:30.855726 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 8 08:40:30.855927 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 8 08:40:30.858447 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
May 8 08:40:30.859406 systemd[1]: Reached target initrd.target - Initrd Default Target. May 8 08:40:30.861752 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 8 08:40:30.864092 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 8 08:40:30.886947 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 8 08:40:30.889192 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 8 08:40:30.920483 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 8 08:40:30.923265 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 8 08:40:30.924222 systemd[1]: Stopped target timers.target - Timer Units. May 8 08:40:30.926310 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 8 08:40:30.926438 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 8 08:40:30.928769 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 8 08:40:30.929792 systemd[1]: Stopped target basic.target - Basic System. May 8 08:40:30.931869 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 8 08:40:30.933658 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 8 08:40:30.935439 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 8 08:40:30.937480 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 8 08:40:30.939588 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 8 08:40:30.941717 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 8 08:40:30.943906 systemd[1]: Stopped target sysinit.target - System Initialization. May 8 08:40:30.945955 systemd[1]: Stopped target local-fs.target - Local File Systems. May 8 08:40:30.948157 systemd[1]: Stopped target swap.target - Swaps. May 8 08:40:30.950131 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 8 08:40:30.950245 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 8 08:40:30.952313 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 8 08:40:30.953158 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 8 08:40:30.954103 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 8 08:40:30.954210 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 8 08:40:30.955301 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 8 08:40:30.955458 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 8 08:40:30.956942 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 8 08:40:30.957088 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 8 08:40:30.958312 systemd[1]: ignition-files.service: Deactivated successfully. May 8 08:40:30.958464 systemd[1]: Stopped ignition-files.service - Ignition (files). May 8 08:40:30.962195 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 8 08:40:30.964173 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 8 08:40:30.964686 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 8 08:40:30.965121 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
May 8 08:40:30.966676 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 8 08:40:30.966831 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 8 08:40:30.973758 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 8 08:40:30.975034 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 8 08:40:30.993411 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 8 08:40:30.995142 ignition[986]: INFO : Ignition 2.21.0 May 8 08:40:30.996785 ignition[986]: INFO : Stage: umount May 8 08:40:30.996785 ignition[986]: INFO : no configs at "/usr/lib/ignition/base.d" May 8 08:40:30.996785 ignition[986]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 8 08:40:30.998202 systemd[1]: ignition-mount.service: Deactivated successfully. May 8 08:40:31.001580 ignition[986]: INFO : umount: umount passed May 8 08:40:31.001580 ignition[986]: INFO : Ignition finished successfully May 8 08:40:30.998304 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 8 08:40:30.999186 systemd[1]: ignition-disks.service: Deactivated successfully. May 8 08:40:30.999254 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 8 08:40:30.999755 systemd[1]: ignition-kargs.service: Deactivated successfully. May 8 08:40:30.999794 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 8 08:40:31.001131 systemd[1]: ignition-fetch.service: Deactivated successfully. May 8 08:40:31.001189 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 8 08:40:31.002184 systemd[1]: Stopped target network.target - Network. May 8 08:40:31.003195 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 8 08:40:31.003246 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 8 08:40:31.004279 systemd[1]: Stopped target paths.target - Path Units. May 8 08:40:31.005187 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 8 08:40:31.009280 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 8 08:40:31.010068 systemd[1]: Stopped target slices.target - Slice Units. May 8 08:40:31.011165 systemd[1]: Stopped target sockets.target - Socket Units. May 8 08:40:31.012160 systemd[1]: iscsid.socket: Deactivated successfully. May 8 08:40:31.012199 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 8 08:40:31.013360 systemd[1]: iscsiuio.socket: Deactivated successfully. May 8 08:40:31.013398 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 8 08:40:31.014639 systemd[1]: ignition-setup.service: Deactivated successfully. May 8 08:40:31.014680 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 8 08:40:31.015560 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 8 08:40:31.015600 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 8 08:40:31.016609 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 8 08:40:31.017772 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 8 08:40:31.019945 systemd[1]: sysroot-boot.service: Deactivated successfully. May 8 08:40:31.020054 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 8 08:40:31.021047 systemd[1]: systemd-resolved.service: Deactivated successfully. May 8 08:40:31.021150 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. 
May 8 08:40:31.024411 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 8 08:40:31.025281 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 8 08:40:31.025338 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 8 08:40:31.027053 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 8 08:40:31.027099 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 8 08:40:31.029403 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 8 08:40:31.029691 systemd[1]: systemd-networkd.service: Deactivated successfully. May 8 08:40:31.029776 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 8 08:40:31.031281 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 8 08:40:31.031671 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 8 08:40:31.036529 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 8 08:40:31.036564 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 8 08:40:31.040072 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 8 08:40:31.040609 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 8 08:40:31.040658 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 8 08:40:31.041898 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 8 08:40:31.041942 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 8 08:40:31.043966 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 8 08:40:31.044026 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 8 08:40:31.044925 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 8 08:40:31.047060 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 8 08:40:31.053363 systemd[1]: systemd-udevd.service: Deactivated successfully. May 8 08:40:31.053696 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 8 08:40:31.056513 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 8 08:40:31.056584 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 8 08:40:31.057743 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 8 08:40:31.057776 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 8 08:40:31.058916 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 8 08:40:31.058961 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 8 08:40:31.061188 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 8 08:40:31.061230 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 8 08:40:31.062218 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 8 08:40:31.062261 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 8 08:40:31.066092 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 8 08:40:31.066652 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 8 08:40:31.066701 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. 
May 8 08:40:31.067953 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 8 08:40:31.068020 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 8 08:40:31.069608 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 8 08:40:31.069653 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 8 08:40:31.070472 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 8 08:40:31.070514 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 8 08:40:31.072345 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 8 08:40:31.072386 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 8 08:40:31.076747 systemd[1]: network-cleanup.service: Deactivated successfully. May 8 08:40:31.076827 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 8 08:40:31.080743 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 8 08:40:31.080846 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 8 08:40:31.082136 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 8 08:40:31.084200 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 8 08:40:31.101334 systemd[1]: Switching root. May 8 08:40:31.126724 systemd-journald[185]: Journal stopped May 8 08:40:32.955526 systemd-journald[185]: Received SIGTERM from PID 1 (systemd). May 8 08:40:32.955603 kernel: SELinux: policy capability network_peer_controls=1 May 8 08:40:32.955635 kernel: SELinux: policy capability open_perms=1 May 8 08:40:32.955654 kernel: SELinux: policy capability extended_socket_class=1 May 8 08:40:32.955669 kernel: SELinux: policy capability always_check_network=0 May 8 08:40:32.955687 kernel: SELinux: policy capability cgroup_seclabel=1 May 8 08:40:32.955705 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 8 08:40:32.955722 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 8 08:40:32.955736 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 8 08:40:32.955749 kernel: audit: type=1403 audit(1746693631.799:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 8 08:40:32.955765 systemd[1]: Successfully loaded SELinux policy in 73.572ms. May 8 08:40:32.955788 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 12.715ms. May 8 08:40:32.955806 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 8 08:40:32.955822 systemd[1]: Detected virtualization kvm. May 8 08:40:32.955839 systemd[1]: Detected architecture x86-64. May 8 08:40:32.955854 systemd[1]: Detected first boot. May 8 08:40:32.955869 systemd[1]: Hostname set to . May 8 08:40:32.955883 systemd[1]: Initializing machine ID from VM UUID. May 8 08:40:32.955898 zram_generator::config[1032]: No configuration found. 
May 8 08:40:32.955914 kernel: Guest personality initialized and is inactive May 8 08:40:32.955928 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 8 08:40:32.955941 kernel: Initialized host personality May 8 08:40:32.955957 kernel: NET: Registered PF_VSOCK protocol family May 8 08:40:32.955972 systemd[1]: Populated /etc with preset unit settings. May 8 08:40:32.959928 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 8 08:40:32.959952 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 8 08:40:32.959972 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 8 08:40:32.960005 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 8 08:40:32.960022 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 8 08:40:32.960037 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 8 08:40:32.960052 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 8 08:40:32.960070 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 8 08:40:32.960086 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 8 08:40:32.960102 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 8 08:40:32.960118 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 8 08:40:32.960132 systemd[1]: Created slice user.slice - User and Session Slice. May 8 08:40:32.960147 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 8 08:40:32.960162 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 8 08:40:32.960176 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 8 08:40:32.960191 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 8 08:40:32.960209 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 8 08:40:32.960224 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 8 08:40:32.960238 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 8 08:40:32.960254 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 8 08:40:32.960268 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 8 08:40:32.960283 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 8 08:40:32.960301 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 8 08:40:32.960315 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 8 08:40:32.960330 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 8 08:40:32.960345 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 8 08:40:32.960360 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 8 08:40:32.960374 systemd[1]: Reached target slices.target - Slice Units. May 8 08:40:32.960388 systemd[1]: Reached target swap.target - Swaps. May 8 08:40:32.960404 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 8 08:40:32.960418 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. 
May 8 08:40:32.960435 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 8 08:40:32.960449 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 8 08:40:32.960464 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 8 08:40:32.960478 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 8 08:40:32.960493 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 8 08:40:32.960512 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 8 08:40:32.960528 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 8 08:40:32.960542 systemd[1]: Mounting media.mount - External Media Directory... May 8 08:40:32.960558 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 8 08:40:32.960574 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 8 08:40:32.960589 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 8 08:40:32.960604 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 8 08:40:32.960620 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 8 08:40:32.960634 systemd[1]: Reached target machines.target - Containers. May 8 08:40:32.960650 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 8 08:40:32.960665 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 8 08:40:32.960680 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 8 08:40:32.960696 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 8 08:40:32.960711 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 8 08:40:32.960727 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 8 08:40:32.960741 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 8 08:40:32.960756 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 8 08:40:32.960771 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 8 08:40:32.960786 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 8 08:40:32.960801 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 8 08:40:32.960815 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 8 08:40:32.960832 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 8 08:40:32.960847 systemd[1]: Stopped systemd-fsck-usr.service. May 8 08:40:32.960863 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 8 08:40:32.960878 systemd[1]: Starting systemd-journald.service - Journal Service... May 8 08:40:32.960893 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 8 08:40:32.960908 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
May 8 08:40:32.960923 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 8 08:40:32.960938 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 8 08:40:32.960958 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 8 08:40:32.960973 systemd[1]: verity-setup.service: Deactivated successfully. May 8 08:40:32.961008 systemd[1]: Stopped verity-setup.service. May 8 08:40:32.961024 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 8 08:40:32.961039 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 8 08:40:32.961080 systemd-journald[1121]: Collecting audit messages is disabled. May 8 08:40:32.961109 kernel: loop: module loaded May 8 08:40:32.961124 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 8 08:40:32.961140 systemd[1]: Mounted media.mount - External Media Directory. May 8 08:40:32.961158 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 8 08:40:32.961173 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 8 08:40:32.961188 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 8 08:40:32.961205 systemd-journald[1121]: Journal started May 8 08:40:32.961235 systemd-journald[1121]: Runtime Journal (/run/log/journal/1aa48beeb2004471a665a4100cf02bf1) is 8M, max 78.5M, 70.5M free. May 8 08:40:32.622187 systemd[1]: Queued start job for default target multi-user.target. May 8 08:40:32.630975 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. May 8 08:40:32.631421 systemd[1]: systemd-journald.service: Deactivated successfully. May 8 08:40:32.963171 systemd[1]: Started systemd-journald.service - Journal Service. May 8 08:40:32.968089 kernel: fuse: init (API version 7.39) May 8 08:40:32.967140 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 8 08:40:32.967937 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 8 08:40:32.968166 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 8 08:40:32.968946 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 8 08:40:32.969180 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 8 08:40:32.970153 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 8 08:40:32.970335 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 8 08:40:32.972326 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 8 08:40:32.972502 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 8 08:40:32.973276 systemd[1]: modprobe@loop.service: Deactivated successfully. May 8 08:40:32.973439 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 8 08:40:32.974405 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 8 08:40:32.975942 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 8 08:40:32.976812 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 8 08:40:32.998786 systemd[1]: Reached target network-pre.target - Preparation for Network. May 8 08:40:33.003302 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... 
May 8 08:40:33.025866 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 8 08:40:33.026548 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 8 08:40:33.026597 systemd[1]: Reached target local-fs.target - Local File Systems. May 8 08:40:33.029002 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 8 08:40:33.035815 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 8 08:40:33.036517 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 8 08:40:33.039216 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 8 08:40:33.048192 kernel: ACPI: bus type drm_connector registered May 8 08:40:33.052998 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 8 08:40:33.054084 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 8 08:40:33.057129 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 8 08:40:33.058623 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 8 08:40:33.069021 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 8 08:40:33.075134 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 8 08:40:33.078155 systemd-journald[1121]: Time spent on flushing to /var/log/journal/1aa48beeb2004471a665a4100cf02bf1 is 64.608ms for 957 entries. May 8 08:40:33.078155 systemd-journald[1121]: System Journal (/var/log/journal/1aa48beeb2004471a665a4100cf02bf1) is 8M, max 584.8M, 576.8M free. May 8 08:40:33.176188 systemd-journald[1121]: Received client request to flush runtime journal. May 8 08:40:33.176246 kernel: loop0: detected capacity change from 0 to 146240 May 8 08:40:33.089121 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 8 08:40:33.093031 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 8 08:40:33.093816 systemd[1]: modprobe@drm.service: Deactivated successfully. May 8 08:40:33.094030 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 8 08:40:33.095370 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 8 08:40:33.097290 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 8 08:40:33.098077 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 8 08:40:33.098695 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 8 08:40:33.102036 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 8 08:40:33.106967 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 8 08:40:33.112087 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 8 08:40:33.174172 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 8 08:40:33.179902 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 8 08:40:33.205896 systemd-tmpfiles[1168]: ACLs are not supported, ignoring. 
May 8 08:40:33.205914 systemd-tmpfiles[1168]: ACLs are not supported, ignoring. May 8 08:40:33.213908 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 8 08:40:33.216430 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 8 08:40:33.221310 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 8 08:40:33.267022 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 8 08:40:33.294012 kernel: loop1: detected capacity change from 0 to 210664 May 8 08:40:33.306325 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 8 08:40:33.309162 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 8 08:40:33.344515 systemd-tmpfiles[1191]: ACLs are not supported, ignoring. May 8 08:40:33.344534 systemd-tmpfiles[1191]: ACLs are not supported, ignoring. May 8 08:40:33.351712 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 8 08:40:33.361018 kernel: loop2: detected capacity change from 0 to 8 May 8 08:40:33.385033 kernel: loop3: detected capacity change from 0 to 113872 May 8 08:40:33.447023 kernel: loop4: detected capacity change from 0 to 146240 May 8 08:40:33.505024 kernel: loop5: detected capacity change from 0 to 210664 May 8 08:40:33.564626 kernel: loop6: detected capacity change from 0 to 8 May 8 08:40:33.568016 kernel: loop7: detected capacity change from 0 to 113872 May 8 08:40:33.620357 (sd-merge)[1198]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. May 8 08:40:33.620801 (sd-merge)[1198]: Merged extensions into '/usr'. May 8 08:40:33.630388 systemd[1]: Reload requested from client PID 1167 ('systemd-sysext') (unit systemd-sysext.service)... May 8 08:40:33.630505 systemd[1]: Reloading... May 8 08:40:33.696004 zram_generator::config[1221]: No configuration found. May 8 08:40:33.914460 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 8 08:40:34.021320 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 8 08:40:34.021659 systemd[1]: Reloading finished in 390 ms. May 8 08:40:34.043926 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 8 08:40:34.044784 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 8 08:40:34.053263 systemd[1]: Starting ensure-sysext.service... May 8 08:40:34.057106 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 8 08:40:34.060360 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 8 08:40:34.081619 systemd-tmpfiles[1281]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 8 08:40:34.081661 systemd-tmpfiles[1281]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 8 08:40:34.081926 systemd-tmpfiles[1281]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 8 08:40:34.082208 systemd-tmpfiles[1281]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 8 08:40:34.082679 systemd[1]: Reload requested from client PID 1280 ('systemctl') (unit ensure-sysext.service)... 
May 8 08:40:34.082695 systemd[1]: Reloading... May 8 08:40:34.083077 systemd-tmpfiles[1281]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 8 08:40:34.083373 systemd-tmpfiles[1281]: ACLs are not supported, ignoring. May 8 08:40:34.083434 systemd-tmpfiles[1281]: ACLs are not supported, ignoring. May 8 08:40:34.098052 systemd-tmpfiles[1281]: Detected autofs mount point /boot during canonicalization of boot. May 8 08:40:34.098063 systemd-tmpfiles[1281]: Skipping /boot May 8 08:40:34.109253 systemd-tmpfiles[1281]: Detected autofs mount point /boot during canonicalization of boot. May 8 08:40:34.109267 systemd-tmpfiles[1281]: Skipping /boot May 8 08:40:34.162223 systemd-udevd[1282]: Using default interface naming scheme 'v255'. May 8 08:40:34.185822 zram_generator::config[1310]: No configuration found. May 8 08:40:34.186959 ldconfig[1162]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 8 08:40:34.390688 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 8 08:40:34.449009 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 43 scanned by (udev-worker) (1390) May 8 08:40:34.452065 kernel: mousedev: PS/2 mouse device common for all mice May 8 08:40:34.524857 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 8 08:40:34.525353 systemd[1]: Reloading finished in 442 ms. May 8 08:40:34.529012 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 May 8 08:40:34.533159 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 8 08:40:34.534183 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 8 08:40:34.542074 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 May 8 08:40:34.542830 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 8 08:40:34.577002 kernel: ACPI: button: Power Button [PWRF] May 8 08:40:34.600734 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 8 08:40:34.602413 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 8 08:40:34.608208 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 8 08:40:34.609096 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 8 08:40:34.610786 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 8 08:40:34.614642 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 8 08:40:34.625214 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 8 08:40:34.632652 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 8 08:40:34.633527 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 8 08:40:34.633616 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
May 8 08:40:34.636802 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 8 08:40:34.641159 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 8 08:40:34.646202 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 8 08:40:34.650565 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 8 08:40:34.651151 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 8 08:40:34.652095 systemd[1]: Finished ensure-sysext.service. May 8 08:40:34.665226 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 8 08:40:34.675212 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 8 08:40:34.685227 systemd[1]: modprobe@drm.service: Deactivated successfully. May 8 08:40:34.685421 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 8 08:40:34.689084 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 8 08:40:34.689267 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 8 08:40:34.717901 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 8 08:40:34.724453 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 8 08:40:34.731063 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 8 08:40:34.731252 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 8 08:40:34.732449 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 8 08:40:34.733848 systemd[1]: modprobe@loop.service: Deactivated successfully. May 8 08:40:34.734559 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 8 08:40:34.738065 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 8 08:40:34.739654 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 8 08:40:34.762184 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 8 08:40:34.771583 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 8 08:40:34.779745 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 8 08:40:34.783681 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 8 08:40:34.794934 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 8 08:40:34.819214 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 May 8 08:40:34.819284 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console May 8 08:40:34.817396 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 8 08:40:34.823381 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 8 08:40:34.824207 augenrules[1461]: No rules May 8 08:40:34.825631 kernel: Console: switching to colour dummy device 80x25 May 8 08:40:34.826719 systemd[1]: audit-rules.service: Deactivated successfully. 
May 8 08:40:34.826908 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 8 08:40:34.831623 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 8 08:40:34.831672 kernel: [drm] features: -context_init May 8 08:40:34.834235 kernel: [drm] number of scanouts: 1 May 8 08:40:34.834286 kernel: [drm] number of cap sets: 0 May 8 08:40:34.838021 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0 May 8 08:40:34.848630 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device May 8 08:40:34.848696 kernel: Console: switching to colour frame buffer device 160x50 May 8 08:40:34.860512 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device May 8 08:40:34.862218 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 8 08:40:34.863052 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 8 08:40:34.866803 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 8 08:40:34.873197 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 8 08:40:34.922590 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 8 08:40:35.016123 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 8 08:40:35.029058 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 8 08:40:35.029263 systemd[1]: Reached target time-set.target - System Time Set. May 8 08:40:35.046350 systemd-resolved[1425]: Positive Trust Anchors: May 8 08:40:35.046669 systemd-resolved[1425]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 8 08:40:35.046771 systemd-resolved[1425]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 8 08:40:35.051610 systemd-resolved[1425]: Using system hostname 'ci-4327-0-0-w-78bcb828ec.novalocal'. May 8 08:40:35.051887 systemd-networkd[1424]: lo: Link UP May 8 08:40:35.051899 systemd-networkd[1424]: lo: Gained carrier May 8 08:40:35.053216 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 8 08:40:35.053359 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 8 08:40:35.053439 systemd[1]: Reached target sysinit.target - System Initialization. May 8 08:40:35.053593 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 8 08:40:35.053696 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 8 08:40:35.053782 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 8 08:40:35.054003 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 8 08:40:35.054157 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 8 08:40:35.054236 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
May 8 08:40:35.054309 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 8 08:40:35.054342 systemd[1]: Reached target paths.target - Path Units. May 8 08:40:35.054405 systemd[1]: Reached target timers.target - Timer Units. May 8 08:40:35.055108 systemd-networkd[1424]: Enumeration completed May 8 08:40:35.055469 systemd-networkd[1424]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 8 08:40:35.055473 systemd-networkd[1424]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 8 08:40:35.056603 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 8 08:40:35.059454 systemd[1]: Starting docker.socket - Docker Socket for the API... May 8 08:40:35.059575 systemd-networkd[1424]: eth0: Link UP May 8 08:40:35.059579 systemd-networkd[1424]: eth0: Gained carrier May 8 08:40:35.059598 systemd-networkd[1424]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 8 08:40:35.063662 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 8 08:40:35.066374 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 8 08:40:35.068753 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 8 08:40:35.075042 systemd-networkd[1424]: eth0: DHCPv4 address 172.24.4.129/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 8 08:40:35.075635 systemd-timesyncd[1433]: Network configuration changed, trying to establish connection. May 8 08:40:35.077560 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 8 08:40:35.078757 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 8 08:40:35.079838 systemd[1]: Started systemd-networkd.service - Network Configuration. May 8 08:40:35.082166 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 8 08:40:35.084276 systemd[1]: Reached target network.target - Network. May 8 08:40:35.085641 systemd[1]: Reached target sockets.target - Socket Units. May 8 08:40:35.087688 systemd[1]: Reached target basic.target - Basic System. May 8 08:40:35.089725 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 8 08:40:35.089840 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 8 08:40:35.091101 systemd[1]: Starting containerd.service - containerd container runtime... May 8 08:40:35.099547 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 8 08:40:35.103129 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 8 08:40:35.110121 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 8 08:40:35.113426 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 8 08:40:35.119431 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 8 08:40:35.121069 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 8 08:40:35.137292 jq[1497]: false May 8 08:40:35.127820 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... 
May 8 08:40:35.133182 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 8 08:40:35.137520 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 8 08:40:35.142869 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 8 08:40:35.149722 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 8 08:40:35.160170 systemd[1]: Starting systemd-logind.service - User Login Management... May 8 08:40:35.168206 google_oslogin_nss_cache[1499]: oslogin_cache_refresh[1499]: Refreshing passwd entry cache May 8 08:40:35.168362 oslogin_cache_refresh[1499]: Refreshing passwd entry cache May 8 08:40:35.169267 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 8 08:40:35.175230 extend-filesystems[1498]: Found loop4 May 8 08:40:35.182353 extend-filesystems[1498]: Found loop5 May 8 08:40:35.182353 extend-filesystems[1498]: Found loop6 May 8 08:40:35.182353 extend-filesystems[1498]: Found loop7 May 8 08:40:35.182353 extend-filesystems[1498]: Found vda May 8 08:40:35.182353 extend-filesystems[1498]: Found vda1 May 8 08:40:35.182353 extend-filesystems[1498]: Found vda2 May 8 08:40:35.182353 extend-filesystems[1498]: Found vda3 May 8 08:40:35.182353 extend-filesystems[1498]: Found usr May 8 08:40:35.182353 extend-filesystems[1498]: Found vda4 May 8 08:40:35.182353 extend-filesystems[1498]: Found vda6 May 8 08:40:35.182353 extend-filesystems[1498]: Found vda7 May 8 08:40:35.182353 extend-filesystems[1498]: Found vda9 May 8 08:40:35.182353 extend-filesystems[1498]: Checking size of /dev/vda9 May 8 08:40:35.182215 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 8 08:40:35.258364 google_oslogin_nss_cache[1499]: oslogin_cache_refresh[1499]: Failure getting users, quitting May 8 08:40:35.258364 google_oslogin_nss_cache[1499]: oslogin_cache_refresh[1499]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 8 08:40:35.258364 google_oslogin_nss_cache[1499]: oslogin_cache_refresh[1499]: Refreshing group entry cache May 8 08:40:35.258364 google_oslogin_nss_cache[1499]: oslogin_cache_refresh[1499]: Failure getting groups, quitting May 8 08:40:35.258364 google_oslogin_nss_cache[1499]: oslogin_cache_refresh[1499]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 8 08:40:35.181793 oslogin_cache_refresh[1499]: Failure getting users, quitting May 8 08:40:35.258570 extend-filesystems[1498]: Resized partition /dev/vda9 May 8 08:40:35.289203 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks May 8 08:40:35.187596 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 8 08:40:35.181813 oslogin_cache_refresh[1499]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 8 08:40:35.289530 extend-filesystems[1522]: resize2fs 1.47.2 (1-Jan-2025) May 8 08:40:35.303098 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 43 scanned by (udev-worker) (1388) May 8 08:40:35.339625 kernel: EXT4-fs (vda9): resized filesystem to 2014203 May 8 08:40:35.191263 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. 
May 8 08:40:35.181862 oslogin_cache_refresh[1499]: Refreshing group entry cache May 8 08:40:35.339834 update_engine[1517]: I20250508 08:40:35.244238 1517 main.cc:92] Flatcar Update Engine starting May 8 08:40:35.201640 systemd[1]: Starting update-engine.service - Update Engine... May 8 08:40:35.193529 oslogin_cache_refresh[1499]: Failure getting groups, quitting May 8 08:40:35.221519 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 8 08:40:35.193563 oslogin_cache_refresh[1499]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 8 08:40:35.340864 tar[1525]: linux-amd64/helm May 8 08:40:35.234039 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 8 08:40:35.259794 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 8 08:40:35.341316 jq[1521]: true May 8 08:40:35.260031 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 8 08:40:35.260286 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 8 08:40:35.260459 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 8 08:40:35.264107 systemd[1]: motdgen.service: Deactivated successfully. May 8 08:40:35.264291 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 8 08:40:35.270734 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 8 08:40:35.270939 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 8 08:40:35.299677 systemd-logind[1511]: New seat seat0. May 8 08:40:35.337530 (ntainerd)[1535]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 8 08:40:35.338211 systemd-logind[1511]: Watching system buttons on /dev/input/event2 (Power Button) May 8 08:40:35.338230 systemd-logind[1511]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 8 08:40:35.338749 systemd[1]: Started systemd-logind.service - User Login Management. May 8 08:40:35.350673 jq[1534]: true May 8 08:40:35.360171 extend-filesystems[1522]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 8 08:40:35.360171 extend-filesystems[1522]: old_desc_blocks = 1, new_desc_blocks = 1 May 8 08:40:35.360171 extend-filesystems[1522]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long. May 8 08:40:35.374556 extend-filesystems[1498]: Resized filesystem in /dev/vda9 May 8 08:40:35.367571 systemd[1]: extend-filesystems.service: Deactivated successfully. May 8 08:40:35.367766 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 8 08:40:35.374055 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 8 08:40:35.388999 dbus-daemon[1493]: [system] SELinux support is enabled May 8 08:40:35.393172 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 8 08:40:35.400220 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 8 08:40:35.400254 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
May 8 08:40:35.405416 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 8 08:40:35.405442 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 8 08:40:35.410257 dbus-daemon[1493]: [system] Successfully activated service 'org.freedesktop.systemd1' May 8 08:40:35.412302 update_engine[1517]: I20250508 08:40:35.412153 1517 update_check_scheduler.cc:74] Next update check in 7m1s May 8 08:40:35.416962 systemd[1]: Started update-engine.service - Update Engine. May 8 08:40:35.434838 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 8 08:40:35.522428 bash[1558]: Updated "/home/core/.ssh/authorized_keys" May 8 08:40:35.522195 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 8 08:40:35.527472 systemd[1]: Starting sshkeys.service... May 8 08:40:35.573830 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 8 08:40:35.579282 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 8 08:40:35.747285 locksmithd[1545]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 8 08:40:35.795498 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 8 08:40:35.856966 containerd[1535]: time="2025-05-08T08:40:35Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 8 08:40:35.857498 containerd[1535]: time="2025-05-08T08:40:35.857424444Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 8 08:40:35.871021 containerd[1535]: time="2025-05-08T08:40:35.869390276Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.949µs" May 8 08:40:35.875042 containerd[1535]: time="2025-05-08T08:40:35.875005888Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 8 08:40:35.875098 containerd[1535]: time="2025-05-08T08:40:35.875049329Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 8 08:40:35.875227 containerd[1535]: time="2025-05-08T08:40:35.875206124Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 8 08:40:35.875258 containerd[1535]: time="2025-05-08T08:40:35.875230018Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 8 08:40:35.875280 containerd[1535]: time="2025-05-08T08:40:35.875257290Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 8 08:40:35.875348 containerd[1535]: time="2025-05-08T08:40:35.875323103Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 8 08:40:35.875348 containerd[1535]: time="2025-05-08T08:40:35.875343211Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 8 08:40:35.875586 containerd[1535]: time="2025-05-08T08:40:35.875554517Z" level=info msg="skip loading plugin" error="path 
/var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 8 08:40:35.875586 containerd[1535]: time="2025-05-08T08:40:35.875575576Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 8 08:40:35.875641 containerd[1535]: time="2025-05-08T08:40:35.875587719Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 8 08:40:35.875641 containerd[1535]: time="2025-05-08T08:40:35.875598099Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 8 08:40:35.875689 containerd[1535]: time="2025-05-08T08:40:35.875672889Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 8 08:40:35.875913 containerd[1535]: time="2025-05-08T08:40:35.875891669Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 8 08:40:35.875945 containerd[1535]: time="2025-05-08T08:40:35.875928017Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 8 08:40:35.875945 containerd[1535]: time="2025-05-08T08:40:35.875940811Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 8 08:40:35.876041 containerd[1535]: time="2025-05-08T08:40:35.875963033Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 8 08:40:35.876238 containerd[1535]: time="2025-05-08T08:40:35.876217571Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 8 08:40:35.876299 containerd[1535]: time="2025-05-08T08:40:35.876280739Z" level=info msg="metadata content store policy set" policy=shared May 8 08:40:35.890649 containerd[1535]: time="2025-05-08T08:40:35.890598312Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 8 08:40:35.890720 containerd[1535]: time="2025-05-08T08:40:35.890654367Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 8 08:40:35.890720 containerd[1535]: time="2025-05-08T08:40:35.890671379Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 8 08:40:35.890720 containerd[1535]: time="2025-05-08T08:40:35.890684904Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 8 08:40:35.890720 containerd[1535]: time="2025-05-08T08:40:35.890700824Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 8 08:40:35.890720 containerd[1535]: time="2025-05-08T08:40:35.890713037Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 8 08:40:35.890836 containerd[1535]: time="2025-05-08T08:40:35.890725320Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 8 08:40:35.890836 containerd[1535]: time="2025-05-08T08:40:35.890738324Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 8 08:40:35.890836 containerd[1535]: time="2025-05-08T08:40:35.890752010Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 8 08:40:35.890836 containerd[1535]: time="2025-05-08T08:40:35.890763912Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 8 08:40:35.890836 containerd[1535]: time="2025-05-08T08:40:35.890776997Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 8 08:40:35.890836 containerd[1535]: time="2025-05-08T08:40:35.890790492Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 8 08:40:35.890972 containerd[1535]: time="2025-05-08T08:40:35.890888015Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 8 08:40:35.890972 containerd[1535]: time="2025-05-08T08:40:35.890909345Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 8 08:40:35.890972 containerd[1535]: time="2025-05-08T08:40:35.890923411Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 8 08:40:35.890972 containerd[1535]: time="2025-05-08T08:40:35.890934192Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 8 08:40:35.890972 containerd[1535]: time="2025-05-08T08:40:35.890945042Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 8 08:40:35.890972 containerd[1535]: time="2025-05-08T08:40:35.890958687Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 8 08:40:35.890972 containerd[1535]: time="2025-05-08T08:40:35.890970049Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 8 08:40:35.891213 containerd[1535]: time="2025-05-08T08:40:35.891002259Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 8 08:40:35.891213 containerd[1535]: time="2025-05-08T08:40:35.891017518Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 8 08:40:35.891213 containerd[1535]: time="2025-05-08T08:40:35.891037165Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 8 08:40:35.891213 containerd[1535]: time="2025-05-08T08:40:35.891050800Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 8 08:40:35.891213 containerd[1535]: time="2025-05-08T08:40:35.891104982Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 8 08:40:35.891213 containerd[1535]: time="2025-05-08T08:40:35.891119469Z" level=info msg="Start snapshots syncer" May 8 08:40:35.891213 containerd[1535]: time="2025-05-08T08:40:35.891138515Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 8 08:40:35.891404 containerd[1535]: time="2025-05-08T08:40:35.891360922Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 8 08:40:35.891558 containerd[1535]: time="2025-05-08T08:40:35.891420383Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 8 08:40:35.891558 containerd[1535]: time="2025-05-08T08:40:35.891490224Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 8 08:40:35.891643 containerd[1535]: time="2025-05-08T08:40:35.891581526Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 8 08:40:35.891643 containerd[1535]: time="2025-05-08T08:40:35.891615299Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 8 08:40:35.891643 containerd[1535]: time="2025-05-08T08:40:35.891627702Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 8 08:40:35.891643 containerd[1535]: time="2025-05-08T08:40:35.891638402Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 8 08:40:35.891756 containerd[1535]: time="2025-05-08T08:40:35.891650665Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 8 08:40:35.891756 containerd[1535]: time="2025-05-08T08:40:35.891661766Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 8 08:40:35.891756 containerd[1535]: time="2025-05-08T08:40:35.891672376Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 8 08:40:35.891756 containerd[1535]: time="2025-05-08T08:40:35.891693866Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 8 08:40:35.891756 containerd[1535]: 
time="2025-05-08T08:40:35.891705899Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 8 08:40:35.891756 containerd[1535]: time="2025-05-08T08:40:35.891718182Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 8 08:40:35.891756 containerd[1535]: time="2025-05-08T08:40:35.891744211Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 8 08:40:35.891756 containerd[1535]: time="2025-05-08T08:40:35.891757265Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 8 08:40:35.891756 containerd[1535]: time="2025-05-08T08:40:35.891766412Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891776752Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891786169Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891796298Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891806568Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891824972Z" level=info msg="runtime interface created" May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891831114Z" level=info msg="created NRI interface" May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891843938Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891856071Z" level=info msg="Connect containerd service" May 8 08:40:35.892036 containerd[1535]: time="2025-05-08T08:40:35.891882240Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 8 08:40:35.893653 containerd[1535]: time="2025-05-08T08:40:35.893274060Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 8 08:40:36.071310 sshd_keygen[1524]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 8 08:40:36.101407 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 8 08:40:36.116930 systemd[1]: Starting issuegen.service - Generate /run/issue... May 8 08:40:36.127164 systemd[1]: Started sshd@0-172.24.4.129:22-172.24.4.1:43070.service - OpenSSH per-connection server daemon (172.24.4.1:43070). May 8 08:40:36.159291 systemd[1]: issuegen.service: Deactivated successfully. May 8 08:40:36.159633 systemd[1]: Finished issuegen.service - Generate /run/issue. May 8 08:40:36.168316 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 8 08:40:36.196533 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
May 8 08:40:36.203874 systemd[1]: Started getty@tty1.service - Getty on tty1. May 8 08:40:36.212356 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 8 08:40:36.213169 systemd[1]: Reached target getty.target - Login Prompts. May 8 08:40:36.221372 containerd[1535]: time="2025-05-08T08:40:36.221334267Z" level=info msg="Start subscribing containerd event" May 8 08:40:36.222088 containerd[1535]: time="2025-05-08T08:40:36.222041824Z" level=info msg="Start recovering state" May 8 08:40:36.222171 containerd[1535]: time="2025-05-08T08:40:36.222152902Z" level=info msg="Start event monitor" May 8 08:40:36.222204 containerd[1535]: time="2025-05-08T08:40:36.222173952Z" level=info msg="Start cni network conf syncer for default" May 8 08:40:36.222204 containerd[1535]: time="2025-05-08T08:40:36.222182367Z" level=info msg="Start streaming server" May 8 08:40:36.222204 containerd[1535]: time="2025-05-08T08:40:36.222190723Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 8 08:40:36.222204 containerd[1535]: time="2025-05-08T08:40:36.222198207Z" level=info msg="runtime interface starting up..." May 8 08:40:36.222303 containerd[1535]: time="2025-05-08T08:40:36.222204559Z" level=info msg="starting plugins..." May 8 08:40:36.222303 containerd[1535]: time="2025-05-08T08:40:36.222223805Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 8 08:40:36.222303 containerd[1535]: time="2025-05-08T08:40:36.221475201Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 8 08:40:36.222371 containerd[1535]: time="2025-05-08T08:40:36.222341526Z" level=info msg=serving... address=/run/containerd/containerd.sock May 8 08:40:36.225734 containerd[1535]: time="2025-05-08T08:40:36.222394034Z" level=info msg="containerd successfully booted in 0.366264s" May 8 08:40:36.222462 systemd[1]: Started containerd.service - containerd container runtime. May 8 08:40:36.260869 tar[1525]: linux-amd64/LICENSE May 8 08:40:36.261000 tar[1525]: linux-amd64/README.md May 8 08:40:36.276352 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 8 08:40:36.375141 systemd-networkd[1424]: eth0: Gained IPv6LL May 8 08:40:36.375619 systemd-timesyncd[1433]: Network configuration changed, trying to establish connection. May 8 08:40:36.378296 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 8 08:40:36.381775 systemd[1]: Reached target network-online.target - Network is Online. May 8 08:40:36.391863 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:40:36.404602 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 8 08:40:36.466421 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 8 08:40:37.612512 sshd[1593]: Accepted publickey for core from 172.24.4.1 port 43070 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:40:37.615330 sshd-session[1593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:40:37.635183 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 8 08:40:37.641747 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 8 08:40:37.669677 systemd-logind[1511]: New session 1 of user core. May 8 08:40:37.680952 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 8 08:40:37.693765 systemd[1]: Starting user@500.service - User Manager for UID 500... 
May 8 08:40:37.718471 (systemd)[1625]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 8 08:40:37.722852 systemd-logind[1511]: New session c1 of user core. May 8 08:40:37.882714 systemd[1625]: Queued start job for default target default.target. May 8 08:40:37.892117 systemd[1625]: Created slice app.slice - User Application Slice. May 8 08:40:37.892146 systemd[1625]: Reached target paths.target - Paths. May 8 08:40:37.892188 systemd[1625]: Reached target timers.target - Timers. May 8 08:40:37.897117 systemd[1625]: Starting dbus.socket - D-Bus User Message Bus Socket... May 8 08:40:37.912828 systemd[1625]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 8 08:40:37.913784 systemd[1625]: Reached target sockets.target - Sockets. May 8 08:40:37.913924 systemd[1625]: Reached target basic.target - Basic System. May 8 08:40:37.914024 systemd[1]: Started user@500.service - User Manager for UID 500. May 8 08:40:37.914257 systemd[1625]: Reached target default.target - Main User Target. May 8 08:40:37.914286 systemd[1625]: Startup finished in 183ms. May 8 08:40:37.921210 systemd[1]: Started session-1.scope - Session 1 of User core. May 8 08:40:38.071805 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:40:38.088734 (kubelet)[1639]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 8 08:40:39.030685 systemd[1]: Started sshd@1-172.24.4.129:22-172.24.4.1:48398.service - OpenSSH per-connection server daemon (172.24.4.1:48398). May 8 08:40:39.462347 kubelet[1639]: E0508 08:40:39.462085 1639 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 8 08:40:39.464350 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 8 08:40:39.464584 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 8 08:40:39.465126 systemd[1]: kubelet.service: Consumed 1.985s CPU time, 247.5M memory peak. May 8 08:40:40.282429 sshd[1649]: Accepted publickey for core from 172.24.4.1 port 48398 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:40:40.285045 sshd-session[1649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:40:40.295809 systemd-logind[1511]: New session 2 of user core. May 8 08:40:40.308367 systemd[1]: Started session-2.scope - Session 2 of User core. May 8 08:40:40.925621 sshd[1654]: Connection closed by 172.24.4.1 port 48398 May 8 08:40:40.924274 sshd-session[1649]: pam_unix(sshd:session): session closed for user core May 8 08:40:40.944822 systemd[1]: sshd@1-172.24.4.129:22-172.24.4.1:48398.service: Deactivated successfully. May 8 08:40:40.948838 systemd[1]: session-2.scope: Deactivated successfully. May 8 08:40:40.952134 systemd-logind[1511]: Session 2 logged out. Waiting for processes to exit. May 8 08:40:40.956191 systemd[1]: Started sshd@2-172.24.4.129:22-172.24.4.1:48406.service - OpenSSH per-connection server daemon (172.24.4.1:48406). May 8 08:40:40.965599 systemd-logind[1511]: Removed session 2. May 8 08:40:41.285920 login[1605]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 8 08:40:41.301366 systemd-logind[1511]: New session 3 of user core. 
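The kubelet exit above (status=1/FAILURE from run.go:74) happens because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-managed node that file is only written during kubeadm init/join, so systemd keeps restarting the unit until then. A minimal sketch of the same precondition check, as an illustration of why the unit keeps failing rather than kubelet's actual code:

# Sketch: reproduce the precondition behind the repeated kubelet.service
# failures -- the config file it is told to load is simply absent.
import sys
from pathlib import Path

CONFIG = Path("/var/lib/kubelet/config.yaml")  # path from the log

if not CONFIG.is_file():
    print(f'failed to load Kubelet config file "{CONFIG}": no such file or directory',
          file=sys.stderr)
    sys.exit(1)  # systemd records status=1/FAILURE and schedules the next restart
print("kubelet config present, continuing startup")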
May 8 08:40:41.310061 login[1606]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 8 08:40:41.312414 systemd[1]: Started session-3.scope - Session 3 of User core. May 8 08:40:41.329235 systemd-logind[1511]: New session 4 of user core. May 8 08:40:41.337284 systemd[1]: Started session-4.scope - Session 4 of User core. May 8 08:40:42.192461 coreos-metadata[1492]: May 08 08:40:42.192 WARN failed to locate config-drive, using the metadata service API instead May 8 08:40:42.242593 coreos-metadata[1492]: May 08 08:40:42.242 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 May 8 08:40:42.272956 sshd[1659]: Accepted publickey for core from 172.24.4.1 port 48406 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:40:42.275851 sshd-session[1659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:40:42.288228 systemd-logind[1511]: New session 5 of user core. May 8 08:40:42.296382 systemd[1]: Started session-5.scope - Session 5 of User core. May 8 08:40:42.399248 coreos-metadata[1492]: May 08 08:40:42.399 INFO Fetch successful May 8 08:40:42.399248 coreos-metadata[1492]: May 08 08:40:42.399 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 8 08:40:42.412804 coreos-metadata[1492]: May 08 08:40:42.412 INFO Fetch successful May 8 08:40:42.412804 coreos-metadata[1492]: May 08 08:40:42.412 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 May 8 08:40:42.426366 coreos-metadata[1492]: May 08 08:40:42.426 INFO Fetch successful May 8 08:40:42.426366 coreos-metadata[1492]: May 08 08:40:42.426 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 May 8 08:40:42.440624 coreos-metadata[1492]: May 08 08:40:42.440 INFO Fetch successful May 8 08:40:42.440624 coreos-metadata[1492]: May 08 08:40:42.440 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 May 8 08:40:42.452957 coreos-metadata[1492]: May 08 08:40:42.452 INFO Fetch successful May 8 08:40:42.452957 coreos-metadata[1492]: May 08 08:40:42.452 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 May 8 08:40:42.466363 coreos-metadata[1492]: May 08 08:40:42.466 INFO Fetch successful May 8 08:40:42.516370 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 8 08:40:42.517769 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 8 08:40:42.700373 coreos-metadata[1564]: May 08 08:40:42.700 WARN failed to locate config-drive, using the metadata service API instead May 8 08:40:42.742247 coreos-metadata[1564]: May 08 08:40:42.742 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 May 8 08:40:42.756114 coreos-metadata[1564]: May 08 08:40:42.756 INFO Fetch successful May 8 08:40:42.756114 coreos-metadata[1564]: May 08 08:40:42.756 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 May 8 08:40:42.772073 coreos-metadata[1564]: May 08 08:40:42.771 INFO Fetch successful May 8 08:40:42.776135 unknown[1564]: wrote ssh authorized keys file for user: core May 8 08:40:42.828513 update-ssh-keys[1697]: Updated "/home/core/.ssh/authorized_keys" May 8 08:40:42.831428 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 8 08:40:42.834934 systemd[1]: Finished sshkeys.service. 
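The coreos-metadata entries above fall back from the config drive to the link-local metadata service at 169.254.169.254 and fetch a handful of endpoints. A small sketch that requests the same URLs the log names, assuming the metadata service is reachable from inside the instance:

# Sketch: fetch the same OpenStack/EC2-style metadata endpoints that
# coreos-metadata reports fetching in the log above.
import json
import urllib.request

BASE = "http://169.254.169.254"
ENDPOINTS = [
    "/openstack/2012-08-10/meta_data.json",
    "/latest/meta-data/hostname",
    "/latest/meta-data/instance-id",
    "/latest/meta-data/instance-type",
    "/latest/meta-data/local-ipv4",
    "/latest/meta-data/public-ipv4",
    "/latest/meta-data/public-keys/0/openssh-key",
]

for path in ENDPOINTS:
    with urllib.request.urlopen(BASE + path, timeout=5) as resp:
        body = resp.read().decode()
    if path.endswith(".json"):
        body = json.dumps(json.loads(body), indent=2)
    print(f"{path}:\n{body}\n")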
May 8 08:40:42.839948 systemd[1]: Reached target multi-user.target - Multi-User System. May 8 08:40:42.840371 systemd[1]: Startup finished in 3.886s (kernel) + 15.967s (initrd) + 11.112s (userspace) = 30.966s. May 8 08:40:43.002758 sshd[1688]: Connection closed by 172.24.4.1 port 48406 May 8 08:40:43.002519 sshd-session[1659]: pam_unix(sshd:session): session closed for user core May 8 08:40:43.009931 systemd[1]: sshd@2-172.24.4.129:22-172.24.4.1:48406.service: Deactivated successfully. May 8 08:40:43.013896 systemd[1]: session-5.scope: Deactivated successfully. May 8 08:40:43.017166 systemd-logind[1511]: Session 5 logged out. Waiting for processes to exit. May 8 08:40:43.019898 systemd-logind[1511]: Removed session 5. May 8 08:40:49.596393 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 8 08:40:49.601562 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:40:49.989491 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:40:50.002369 (kubelet)[1711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 8 08:40:50.195456 kubelet[1711]: E0508 08:40:50.195355 1711 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 8 08:40:50.203580 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 8 08:40:50.203852 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 8 08:40:50.204420 systemd[1]: kubelet.service: Consumed 307ms CPU time, 95.8M memory peak. May 8 08:40:53.023403 systemd[1]: Started sshd@3-172.24.4.129:22-172.24.4.1:36068.service - OpenSSH per-connection server daemon (172.24.4.1:36068). May 8 08:40:54.306384 sshd[1721]: Accepted publickey for core from 172.24.4.1 port 36068 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:40:54.308894 sshd-session[1721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:40:54.320557 systemd-logind[1511]: New session 6 of user core. May 8 08:40:54.337391 systemd[1]: Started session-6.scope - Session 6 of User core. May 8 08:40:55.032038 sshd[1723]: Connection closed by 172.24.4.1 port 36068 May 8 08:40:55.030773 sshd-session[1721]: pam_unix(sshd:session): session closed for user core May 8 08:40:55.046531 systemd[1]: sshd@3-172.24.4.129:22-172.24.4.1:36068.service: Deactivated successfully. May 8 08:40:55.049857 systemd[1]: session-6.scope: Deactivated successfully. May 8 08:40:55.052056 systemd-logind[1511]: Session 6 logged out. Waiting for processes to exit. May 8 08:40:55.056714 systemd[1]: Started sshd@4-172.24.4.129:22-172.24.4.1:45722.service - OpenSSH per-connection server daemon (172.24.4.1:45722). May 8 08:40:55.060132 systemd-logind[1511]: Removed session 6. May 8 08:40:56.478597 sshd[1728]: Accepted publickey for core from 172.24.4.1 port 45722 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:40:56.481722 sshd-session[1728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:40:56.495519 systemd-logind[1511]: New session 7 of user core. May 8 08:40:56.505388 systemd[1]: Started session-7.scope - Session 7 of User core. 
May 8 08:40:57.209648 sshd[1731]: Connection closed by 172.24.4.1 port 45722 May 8 08:40:57.210528 sshd-session[1728]: pam_unix(sshd:session): session closed for user core May 8 08:40:57.226097 systemd[1]: sshd@4-172.24.4.129:22-172.24.4.1:45722.service: Deactivated successfully. May 8 08:40:57.229087 systemd[1]: session-7.scope: Deactivated successfully. May 8 08:40:57.230912 systemd-logind[1511]: Session 7 logged out. Waiting for processes to exit. May 8 08:40:57.235160 systemd[1]: Started sshd@5-172.24.4.129:22-172.24.4.1:45728.service - OpenSSH per-connection server daemon (172.24.4.1:45728). May 8 08:40:57.238798 systemd-logind[1511]: Removed session 7. May 8 08:40:58.713371 sshd[1736]: Accepted publickey for core from 172.24.4.1 port 45728 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:40:58.716055 sshd-session[1736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:40:58.727649 systemd-logind[1511]: New session 8 of user core. May 8 08:40:58.737298 systemd[1]: Started session-8.scope - Session 8 of User core. May 8 08:40:59.444335 sshd[1739]: Connection closed by 172.24.4.1 port 45728 May 8 08:40:59.445313 sshd-session[1736]: pam_unix(sshd:session): session closed for user core May 8 08:40:59.460833 systemd[1]: sshd@5-172.24.4.129:22-172.24.4.1:45728.service: Deactivated successfully. May 8 08:40:59.464119 systemd[1]: session-8.scope: Deactivated successfully. May 8 08:40:59.467481 systemd-logind[1511]: Session 8 logged out. Waiting for processes to exit. May 8 08:40:59.470520 systemd[1]: Started sshd@6-172.24.4.129:22-172.24.4.1:45736.service - OpenSSH per-connection server daemon (172.24.4.1:45736). May 8 08:40:59.474152 systemd-logind[1511]: Removed session 8. May 8 08:41:00.346470 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 8 08:41:00.350431 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:41:00.681639 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:41:00.694364 (kubelet)[1754]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 8 08:41:00.822114 kubelet[1754]: E0508 08:41:00.821921 1754 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 8 08:41:00.827471 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 8 08:41:00.827813 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 8 08:41:00.828496 systemd[1]: kubelet.service: Consumed 279ms CPU time, 96.2M memory peak. May 8 08:41:00.986062 sshd[1744]: Accepted publickey for core from 172.24.4.1 port 45736 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:41:00.988669 sshd-session[1744]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:41:01.001089 systemd-logind[1511]: New session 9 of user core. May 8 08:41:01.011306 systemd[1]: Started session-9.scope - Session 9 of User core. 
May 8 08:41:01.343622 sudo[1765]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 8 08:41:01.344583 sudo[1765]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 8 08:41:01.367290 sudo[1765]: pam_unix(sudo:session): session closed for user root May 8 08:41:01.583147 sshd[1764]: Connection closed by 172.24.4.1 port 45736 May 8 08:41:01.585134 sshd-session[1744]: pam_unix(sshd:session): session closed for user core May 8 08:41:01.598952 systemd[1]: sshd@6-172.24.4.129:22-172.24.4.1:45736.service: Deactivated successfully. May 8 08:41:01.602890 systemd[1]: session-9.scope: Deactivated successfully. May 8 08:41:01.604941 systemd-logind[1511]: Session 9 logged out. Waiting for processes to exit. May 8 08:41:01.610386 systemd[1]: Started sshd@7-172.24.4.129:22-172.24.4.1:45746.service - OpenSSH per-connection server daemon (172.24.4.1:45746). May 8 08:41:01.612855 systemd-logind[1511]: Removed session 9. May 8 08:41:02.974702 sshd[1770]: Accepted publickey for core from 172.24.4.1 port 45746 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:41:02.977954 sshd-session[1770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:41:02.989091 systemd-logind[1511]: New session 10 of user core. May 8 08:41:02.998319 systemd[1]: Started session-10.scope - Session 10 of User core. May 8 08:41:03.501276 sudo[1775]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 8 08:41:03.502738 sudo[1775]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 8 08:41:03.510514 sudo[1775]: pam_unix(sudo:session): session closed for user root May 8 08:41:03.522219 sudo[1774]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 8 08:41:03.522843 sudo[1774]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 8 08:41:03.544226 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 8 08:41:03.622348 augenrules[1797]: No rules May 8 08:41:03.624550 systemd[1]: audit-rules.service: Deactivated successfully. May 8 08:41:03.625144 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 8 08:41:03.627858 sudo[1774]: pam_unix(sudo:session): session closed for user root May 8 08:41:03.834137 sshd[1773]: Connection closed by 172.24.4.1 port 45746 May 8 08:41:03.834785 sshd-session[1770]: pam_unix(sshd:session): session closed for user core May 8 08:41:03.847184 systemd[1]: sshd@7-172.24.4.129:22-172.24.4.1:45746.service: Deactivated successfully. May 8 08:41:03.850354 systemd[1]: session-10.scope: Deactivated successfully. May 8 08:41:03.852197 systemd-logind[1511]: Session 10 logged out. Waiting for processes to exit. May 8 08:41:03.856819 systemd[1]: Started sshd@8-172.24.4.129:22-172.24.4.1:48680.service - OpenSSH per-connection server daemon (172.24.4.1:48680). May 8 08:41:03.859570 systemd-logind[1511]: Removed session 10. May 8 08:41:05.199516 sshd[1805]: Accepted publickey for core from 172.24.4.1 port 48680 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:41:05.202206 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:41:05.214088 systemd-logind[1511]: New session 11 of user core. May 8 08:41:05.218273 systemd[1]: Started session-11.scope - Session 11 of User core. 
May 8 08:41:05.664932 sudo[1809]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 8 08:41:05.665627 sudo[1809]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 8 08:41:06.383345 systemd[1]: Starting docker.service - Docker Application Container Engine... May 8 08:41:06.398559 (dockerd)[1826]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 8 08:41:06.709145 systemd-timesyncd[1433]: Contacted time server 50.218.103.254:123 (2.flatcar.pool.ntp.org). May 8 08:41:06.709270 systemd-timesyncd[1433]: Initial clock synchronization to Thu 2025-05-08 08:41:06.928098 UTC. May 8 08:41:06.927196 dockerd[1826]: time="2025-05-08T08:41:06.927059837Z" level=info msg="Starting up" May 8 08:41:06.930556 dockerd[1826]: time="2025-05-08T08:41:06.930459694Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 8 08:41:07.014100 systemd[1]: var-lib-docker-metacopy\x2dcheck1988949598-merged.mount: Deactivated successfully. May 8 08:41:07.059510 dockerd[1826]: time="2025-05-08T08:41:07.059348183Z" level=info msg="Loading containers: start." May 8 08:41:07.074349 kernel: Initializing XFRM netlink socket May 8 08:41:07.351606 systemd-networkd[1424]: docker0: Link UP May 8 08:41:07.357790 dockerd[1826]: time="2025-05-08T08:41:07.357749791Z" level=info msg="Loading containers: done." May 8 08:41:07.377615 dockerd[1826]: time="2025-05-08T08:41:07.377566189Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 8 08:41:07.377777 dockerd[1826]: time="2025-05-08T08:41:07.377646488Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 8 08:41:07.377777 dockerd[1826]: time="2025-05-08T08:41:07.377745512Z" level=info msg="Initializing buildkit" May 8 08:41:07.410513 dockerd[1826]: time="2025-05-08T08:41:07.410468315Z" level=info msg="Completed buildkit initialization" May 8 08:41:07.420132 dockerd[1826]: time="2025-05-08T08:41:07.420076566Z" level=info msg="Daemon has completed initialization" May 8 08:41:07.421353 dockerd[1826]: time="2025-05-08T08:41:07.420400853Z" level=info msg="API listen on /run/docker.sock" May 8 08:41:07.420315 systemd[1]: Started docker.service - Docker Application Container Engine. May 8 08:41:09.190096 containerd[1535]: time="2025-05-08T08:41:09.189695956Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\"" May 8 08:41:09.991954 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3838189720.mount: Deactivated successfully. May 8 08:41:10.845393 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 8 08:41:10.851377 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:41:11.009086 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
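dockerd reports "API listen on /run/docker.sock"; the Engine API behind that UNIX socket is plain HTTP, so it can be probed without a Docker SDK. A minimal standard-library sketch using the documented GET /_ping health endpoint:

# Sketch: ping the Docker Engine API over the UNIX socket the daemon
# says it is listening on ("API listen on /run/docker.sock").
import socket

SOCKET_PATH = "/run/docker.sock"

with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
    sock.connect(SOCKET_PATH)
    # /_ping returns "OK" with a 200 status when the daemon is healthy.
    sock.sendall(b"GET /_ping HTTP/1.1\r\nHost: docker\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.decode(errors="replace"))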
May 8 08:41:11.016303 (kubelet)[2094]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 8 08:41:11.071218 kubelet[2094]: E0508 08:41:11.071143 2094 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 8 08:41:11.073583 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 8 08:41:11.073821 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 8 08:41:11.074402 systemd[1]: kubelet.service: Consumed 185ms CPU time, 95.7M memory peak. May 8 08:41:12.277507 containerd[1535]: time="2025-05-08T08:41:12.277465572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:12.280820 containerd[1535]: time="2025-05-08T08:41:12.280543971Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.12: active requests=0, bytes read=32674881" May 8 08:41:12.283809 containerd[1535]: time="2025-05-08T08:41:12.283773839Z" level=info msg="ImageCreate event name:\"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:12.289932 containerd[1535]: time="2025-05-08T08:41:12.289510675Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:12.290565 containerd[1535]: time="2025-05-08T08:41:12.290532903Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.12\" with image id \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\", size \"32671673\" in 3.10079259s" May 8 08:41:12.290620 containerd[1535]: time="2025-05-08T08:41:12.290566329Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\" returns image reference \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\"" May 8 08:41:12.308328 containerd[1535]: time="2025-05-08T08:41:12.308278459Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\"" May 8 08:41:15.106541 containerd[1535]: time="2025-05-08T08:41:15.106503736Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:15.107855 containerd[1535]: time="2025-05-08T08:41:15.107681386Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.12: active requests=0, bytes read=29617542" May 8 08:41:15.108757 containerd[1535]: time="2025-05-08T08:41:15.108723236Z" level=info msg="ImageCreate event name:\"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:15.112399 containerd[1535]: time="2025-05-08T08:41:15.112365976Z" level=info msg="ImageCreate event 
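Using the numbers containerd reports for the kube-apiserver pull above (stored size 32671673 bytes, completed in 3.10079259 s), the effective rate is roughly 10.5 MB/s; a quick check of that arithmetic is below. Note the stored image size is not exactly the number of bytes transferred, so this is only an approximation.

# Rough throughput for the kube-apiserver:v1.30.12 pull, from the log's own numbers.
size_bytes = 32_671_673        # size "32671673" in the "Pulled image" entry
duration_s = 3.10079259        # "in 3.10079259s"
print(f"{size_bytes / duration_s / 1e6:.1f} MB/s")   # ~10.5 MB/s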
name:\"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:15.113917 containerd[1535]: time="2025-05-08T08:41:15.113712921Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.12\" with image id \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\", size \"31105907\" in 2.805207273s" May 8 08:41:15.113917 containerd[1535]: time="2025-05-08T08:41:15.113746153Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\" returns image reference \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\"" May 8 08:41:15.137233 containerd[1535]: time="2025-05-08T08:41:15.137145134Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\"" May 8 08:41:16.745966 containerd[1535]: time="2025-05-08T08:41:16.745922906Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:16.748525 containerd[1535]: time="2025-05-08T08:41:16.748505973Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.12: active requests=0, bytes read=17903690" May 8 08:41:16.749611 containerd[1535]: time="2025-05-08T08:41:16.749588791Z" level=info msg="ImageCreate event name:\"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:16.753135 containerd[1535]: time="2025-05-08T08:41:16.753113458Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:16.754082 containerd[1535]: time="2025-05-08T08:41:16.754044483Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.12\" with image id \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\", size \"19392073\" in 1.616866928s" May 8 08:41:16.754132 containerd[1535]: time="2025-05-08T08:41:16.754084070Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\" returns image reference \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\"" May 8 08:41:16.774395 containerd[1535]: time="2025-05-08T08:41:16.774270884Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\"" May 8 08:41:18.419466 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2802077334.mount: Deactivated successfully. 
May 8 08:41:18.948761 containerd[1535]: time="2025-05-08T08:41:18.948722554Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:18.950636 containerd[1535]: time="2025-05-08T08:41:18.950614476Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.12: active requests=0, bytes read=29185825" May 8 08:41:18.952082 containerd[1535]: time="2025-05-08T08:41:18.951771450Z" level=info msg="ImageCreate event name:\"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:18.957919 containerd[1535]: time="2025-05-08T08:41:18.957849451Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:18.958675 containerd[1535]: time="2025-05-08T08:41:18.958338390Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.12\" with image id \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\", repo tag \"registry.k8s.io/kube-proxy:v1.30.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\", size \"29184836\" in 2.183762219s" May 8 08:41:18.958675 containerd[1535]: time="2025-05-08T08:41:18.958369895Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\" returns image reference \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\"" May 8 08:41:18.979843 containerd[1535]: time="2025-05-08T08:41:18.979176490Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 8 08:41:19.935094 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3974678970.mount: Deactivated successfully. May 8 08:41:20.888272 update_engine[1517]: I20250508 08:41:20.887115 1517 update_attempter.cc:509] Updating boot flags... May 8 08:41:20.957265 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 43 scanned by (udev-worker) (2200) May 8 08:41:21.060041 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 43 scanned by (udev-worker) (2201) May 8 08:41:21.083350 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 8 08:41:21.091136 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:41:21.170019 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 43 scanned by (udev-worker) (2201) May 8 08:41:21.518295 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:41:21.528267 (kubelet)[2216]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 8 08:41:21.593412 kubelet[2216]: E0508 08:41:21.593342 2216 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 8 08:41:21.595875 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 8 08:41:21.596046 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 8 08:41:21.597396 systemd[1]: kubelet.service: Consumed 165ms CPU time, 97.7M memory peak. 
May 8 08:41:21.650875 containerd[1535]: time="2025-05-08T08:41:21.650791530Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:21.652235 containerd[1535]: time="2025-05-08T08:41:21.651993466Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" May 8 08:41:21.653442 containerd[1535]: time="2025-05-08T08:41:21.653376150Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:21.656457 containerd[1535]: time="2025-05-08T08:41:21.656412395Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:21.657551 containerd[1535]: time="2025-05-08T08:41:21.657415483Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.678201525s" May 8 08:41:21.657551 containerd[1535]: time="2025-05-08T08:41:21.657458464Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" May 8 08:41:21.676689 containerd[1535]: time="2025-05-08T08:41:21.676641756Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" May 8 08:41:22.288650 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount133395814.mount: Deactivated successfully. 
May 8 08:41:22.299827 containerd[1535]: time="2025-05-08T08:41:22.299671328Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:22.302060 containerd[1535]: time="2025-05-08T08:41:22.301730869Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298" May 8 08:41:22.304344 containerd[1535]: time="2025-05-08T08:41:22.304245789Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:22.310049 containerd[1535]: time="2025-05-08T08:41:22.309326985Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:22.311425 containerd[1535]: time="2025-05-08T08:41:22.311188365Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 634.447234ms" May 8 08:41:22.311425 containerd[1535]: time="2025-05-08T08:41:22.311258886Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" May 8 08:41:22.352343 containerd[1535]: time="2025-05-08T08:41:22.352266962Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" May 8 08:41:23.021213 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3390820657.mount: Deactivated successfully. May 8 08:41:26.626940 containerd[1535]: time="2025-05-08T08:41:26.625860244Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:26.630128 containerd[1535]: time="2025-05-08T08:41:26.630033978Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579" May 8 08:41:26.671193 containerd[1535]: time="2025-05-08T08:41:26.671126148Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:26.808065 containerd[1535]: time="2025-05-08T08:41:26.807911246Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:41:26.811313 containerd[1535]: time="2025-05-08T08:41:26.810973554Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 4.458624911s" May 8 08:41:26.811313 containerd[1535]: time="2025-05-08T08:41:26.811097363Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" May 8 08:41:31.312849 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
May 8 08:41:31.313041 systemd[1]: kubelet.service: Consumed 165ms CPU time, 97.7M memory peak. May 8 08:41:31.315483 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:41:31.351926 systemd[1]: Reload requested from client PID 2373 ('systemctl') (unit session-11.scope)... May 8 08:41:31.352086 systemd[1]: Reloading... May 8 08:41:31.468077 zram_generator::config[2421]: No configuration found. May 8 08:41:31.588022 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 8 08:41:31.725894 systemd[1]: Reloading finished in 373 ms. May 8 08:41:31.776874 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:41:31.780764 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:41:31.782457 systemd[1]: kubelet.service: Deactivated successfully. May 8 08:41:31.782663 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:41:31.782699 systemd[1]: kubelet.service: Consumed 113ms CPU time, 83.6M memory peak. May 8 08:41:31.784441 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:41:33.820755 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:41:33.830470 (kubelet)[2486]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 8 08:41:34.584524 kubelet[2486]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 8 08:41:34.584524 kubelet[2486]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 8 08:41:34.584524 kubelet[2486]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 8 08:41:34.584524 kubelet[2486]: I0508 08:41:34.583572 2486 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 8 08:41:35.300360 kubelet[2486]: I0508 08:41:35.300291 2486 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 8 08:41:35.300360 kubelet[2486]: I0508 08:41:35.300329 2486 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 8 08:41:35.300623 kubelet[2486]: I0508 08:41:35.300566 2486 server.go:927] "Client rotation is on, will bootstrap in background" May 8 08:41:35.322226 kubelet[2486]: I0508 08:41:35.321905 2486 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 8 08:41:35.325156 kubelet[2486]: E0508 08:41:35.325069 2486 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.129:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.346677 kubelet[2486]: I0508 08:41:35.346528 2486 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 8 08:41:35.347143 kubelet[2486]: I0508 08:41:35.346904 2486 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 8 08:41:35.347502 kubelet[2486]: I0508 08:41:35.346965 2486 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4327-0-0-w-78bcb828ec.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 8 08:41:35.347502 kubelet[2486]: I0508 08:41:35.347492 2486 topology_manager.go:138] "Creating topology manager with none policy" May 8 08:41:35.347822 kubelet[2486]: I0508 08:41:35.347521 2486 container_manager_linux.go:301] "Creating device plugin manager" May 8 08:41:35.347822 kubelet[2486]: I0508 08:41:35.347742 2486 state_mem.go:36] "Initialized new in-memory 
state store" May 8 08:41:35.351714 kubelet[2486]: I0508 08:41:35.351233 2486 kubelet.go:400] "Attempting to sync node with API server" May 8 08:41:35.351714 kubelet[2486]: I0508 08:41:35.351281 2486 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 8 08:41:35.351714 kubelet[2486]: I0508 08:41:35.351329 2486 kubelet.go:312] "Adding apiserver pod source" May 8 08:41:35.351714 kubelet[2486]: I0508 08:41:35.351361 2486 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 8 08:41:35.360509 kubelet[2486]: W0508 08:41:35.360409 2486 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.129:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4327-0-0-w-78bcb828ec.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.361040 kubelet[2486]: E0508 08:41:35.360751 2486 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.129:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4327-0-0-w-78bcb828ec.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.361663 kubelet[2486]: W0508 08:41:35.361572 2486 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.129:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.362149 kubelet[2486]: E0508 08:41:35.361887 2486 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.129:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.362519 kubelet[2486]: I0508 08:41:35.362483 2486 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 8 08:41:35.368466 kubelet[2486]: I0508 08:41:35.366747 2486 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 8 08:41:35.368466 kubelet[2486]: W0508 08:41:35.366846 2486 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 8 08:41:35.368466 kubelet[2486]: I0508 08:41:35.368051 2486 server.go:1264] "Started kubelet" May 8 08:41:35.382775 kubelet[2486]: I0508 08:41:35.382730 2486 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 8 08:41:35.391711 kubelet[2486]: E0508 08:41:35.391428 2486 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.129:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.129:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4327-0-0-w-78bcb828ec.novalocal.183d80a95bd24f12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4327-0-0-w-78bcb828ec.novalocal,UID:ci-4327-0-0-w-78bcb828ec.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4327-0-0-w-78bcb828ec.novalocal,},FirstTimestamp:2025-05-08 08:41:35.36795829 +0000 UTC m=+1.533516332,LastTimestamp:2025-05-08 08:41:35.36795829 +0000 UTC m=+1.533516332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4327-0-0-w-78bcb828ec.novalocal,}" May 8 08:41:35.397934 kubelet[2486]: I0508 08:41:35.397830 2486 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 8 08:41:35.399819 kubelet[2486]: I0508 08:41:35.399791 2486 volume_manager.go:291] "Starting Kubelet Volume Manager" May 8 08:41:35.400808 kubelet[2486]: I0508 08:41:35.400758 2486 server.go:455] "Adding debug handlers to kubelet server" May 8 08:41:35.403055 kubelet[2486]: I0508 08:41:35.402947 2486 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 8 08:41:35.403355 kubelet[2486]: I0508 08:41:35.403324 2486 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 8 08:41:35.403738 kubelet[2486]: I0508 08:41:35.403715 2486 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 8 08:41:35.404914 kubelet[2486]: I0508 08:41:35.403854 2486 reconciler.go:26] "Reconciler: start to sync state" May 8 08:41:35.405716 kubelet[2486]: E0508 08:41:35.405671 2486 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4327-0-0-w-78bcb828ec.novalocal?timeout=10s\": dial tcp 172.24.4.129:6443: connect: connection refused" interval="200ms" May 8 08:41:35.405806 kubelet[2486]: W0508 08:41:35.405779 2486 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.129:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.405841 kubelet[2486]: E0508 08:41:35.405812 2486 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.129:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.406878 kubelet[2486]: I0508 08:41:35.406009 2486 factory.go:221] Registration of the systemd container factory successfully May 8 08:41:35.406878 kubelet[2486]: I0508 08:41:35.406085 2486 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: 
connect: no such file or directory May 8 08:41:35.407674 kubelet[2486]: E0508 08:41:35.407649 2486 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 8 08:41:35.408071 kubelet[2486]: I0508 08:41:35.408048 2486 factory.go:221] Registration of the containerd container factory successfully May 8 08:41:35.414343 kubelet[2486]: I0508 08:41:35.414310 2486 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 8 08:41:35.415370 kubelet[2486]: I0508 08:41:35.415355 2486 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 8 08:41:35.415447 kubelet[2486]: I0508 08:41:35.415438 2486 status_manager.go:217] "Starting to sync pod status with apiserver" May 8 08:41:35.415512 kubelet[2486]: I0508 08:41:35.415504 2486 kubelet.go:2337] "Starting kubelet main sync loop" May 8 08:41:35.415610 kubelet[2486]: E0508 08:41:35.415591 2486 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 8 08:41:35.422635 kubelet[2486]: W0508 08:41:35.422579 2486 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.129:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.422635 kubelet[2486]: E0508 08:41:35.422643 2486 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.129:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:35.437528 kubelet[2486]: I0508 08:41:35.437502 2486 cpu_manager.go:214] "Starting CPU manager" policy="none" May 8 08:41:35.437528 kubelet[2486]: I0508 08:41:35.437522 2486 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 8 08:41:35.437528 kubelet[2486]: I0508 08:41:35.437540 2486 state_mem.go:36] "Initialized new in-memory state store" May 8 08:41:35.442897 kubelet[2486]: I0508 08:41:35.442866 2486 policy_none.go:49] "None policy: Start" May 8 08:41:35.443448 kubelet[2486]: I0508 08:41:35.443418 2486 memory_manager.go:170] "Starting memorymanager" policy="None" May 8 08:41:35.444093 kubelet[2486]: I0508 08:41:35.443515 2486 state_mem.go:35] "Initializing new in-memory state store" May 8 08:41:35.453057 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 8 08:41:35.465904 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 8 08:41:35.479678 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
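Editor's note, not part of the journal: the "Created slice kubepods.slice / kubepods-besteffort.slice / kubepods-burstable.slice" entries above are systemd creating the kubelet's pod QoS cgroup hierarchy, which follows from the CgroupDriver "systemd" setting in the container-manager config logged earlier. A minimal sketch of inspecting that hierarchy from the node, assuming a cgroup v2 unified hierarchy mounted at /sys/fs/cgroup (the mount point is an assumption, it is not shown in the log):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// List the QoS slices the kubelet asks systemd to create
// (kubepods-besteffort.slice, kubepods-burstable.slice, ...)
// and count the per-pod child cgroups underneath each one.
func main() {
	root := "/sys/fs/cgroup/kubepods.slice" // assumed cgroup v2 mount point
	entries, err := os.ReadDir(root)
	if err != nil {
		fmt.Fprintln(os.Stderr, "cannot read", root, ":", err)
		os.Exit(1)
	}
	for _, e := range entries {
		if !e.IsDir() {
			continue
		}
		children, _ := os.ReadDir(filepath.Join(root, e.Name()))
		fmt.Printf("%s (%d child cgroups)\n", e.Name(), len(children))
	}
}
```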
May 8 08:41:35.483052 kubelet[2486]: I0508 08:41:35.482620 2486 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 8 08:41:35.483052 kubelet[2486]: I0508 08:41:35.482824 2486 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 8 08:41:35.483052 kubelet[2486]: I0508 08:41:35.482944 2486 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 8 08:41:35.486247 kubelet[2486]: E0508 08:41:35.486214 2486 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4327-0-0-w-78bcb828ec.novalocal\" not found" May 8 08:41:35.502644 kubelet[2486]: I0508 08:41:35.502592 2486 kubelet_node_status.go:73] "Attempting to register node" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.503040 kubelet[2486]: E0508 08:41:35.502960 2486 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.129:6443/api/v1/nodes\": dial tcp 172.24.4.129:6443: connect: connection refused" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.516634 kubelet[2486]: I0508 08:41:35.516577 2486 topology_manager.go:215] "Topology Admit Handler" podUID="f1e1dc146ed23c241ef58db7d11f58b4" podNamespace="kube-system" podName="kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.518564 kubelet[2486]: I0508 08:41:35.518475 2486 topology_manager.go:215] "Topology Admit Handler" podUID="cbd8bd9be2208f566102c35b46840afe" podNamespace="kube-system" podName="kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.520796 kubelet[2486]: I0508 08:41:35.520609 2486 topology_manager.go:215] "Topology Admit Handler" podUID="ec6bcf07f6156dec7ab3aee37a6798a4" podNamespace="kube-system" podName="kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.531900 systemd[1]: Created slice kubepods-burstable-podf1e1dc146ed23c241ef58db7d11f58b4.slice - libcontainer container kubepods-burstable-podf1e1dc146ed23c241ef58db7d11f58b4.slice. May 8 08:41:35.564183 systemd[1]: Created slice kubepods-burstable-podcbd8bd9be2208f566102c35b46840afe.slice - libcontainer container kubepods-burstable-podcbd8bd9be2208f566102c35b46840afe.slice. May 8 08:41:35.585926 systemd[1]: Created slice kubepods-burstable-podec6bcf07f6156dec7ab3aee37a6798a4.slice - libcontainer container kubepods-burstable-podec6bcf07f6156dec7ab3aee37a6798a4.slice. 
May 8 08:41:35.606827 kubelet[2486]: E0508 08:41:35.606687 2486 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4327-0-0-w-78bcb828ec.novalocal?timeout=10s\": dial tcp 172.24.4.129:6443: connect: connection refused" interval="400ms" May 8 08:41:35.706461 kubelet[2486]: I0508 08:41:35.706353 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ec6bcf07f6156dec7ab3aee37a6798a4-kubeconfig\") pod \"kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"ec6bcf07f6156dec7ab3aee37a6798a4\") " pod="kube-system/kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.706702 kubelet[2486]: I0508 08:41:35.706545 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f1e1dc146ed23c241ef58db7d11f58b4-ca-certs\") pod \"kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"f1e1dc146ed23c241ef58db7d11f58b4\") " pod="kube-system/kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.706702 kubelet[2486]: I0508 08:41:35.706627 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f1e1dc146ed23c241ef58db7d11f58b4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"f1e1dc146ed23c241ef58db7d11f58b4\") " pod="kube-system/kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.706702 kubelet[2486]: I0508 08:41:35.706676 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-ca-certs\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.707447 kubelet[2486]: I0508 08:41:35.706726 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-flexvolume-dir\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.707447 kubelet[2486]: I0508 08:41:35.706770 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-k8s-certs\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.707447 kubelet[2486]: I0508 08:41:35.706817 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.707447 kubelet[2486]: I0508 
08:41:35.706864 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f1e1dc146ed23c241ef58db7d11f58b4-k8s-certs\") pod \"kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"f1e1dc146ed23c241ef58db7d11f58b4\") " pod="kube-system/kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.707720 kubelet[2486]: I0508 08:41:35.706910 2486 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-kubeconfig\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.707720 kubelet[2486]: I0508 08:41:35.707271 2486 kubelet_node_status.go:73] "Attempting to register node" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.708505 kubelet[2486]: E0508 08:41:35.708446 2486 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.129:6443/api/v1/nodes\": dial tcp 172.24.4.129:6443: connect: connection refused" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:35.857295 containerd[1535]: time="2025-05-08T08:41:35.857167384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal,Uid:f1e1dc146ed23c241ef58db7d11f58b4,Namespace:kube-system,Attempt:0,}" May 8 08:41:35.882826 containerd[1535]: time="2025-05-08T08:41:35.882691948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal,Uid:cbd8bd9be2208f566102c35b46840afe,Namespace:kube-system,Attempt:0,}" May 8 08:41:35.894489 containerd[1535]: time="2025-05-08T08:41:35.894185579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal,Uid:ec6bcf07f6156dec7ab3aee37a6798a4,Namespace:kube-system,Attempt:0,}" May 8 08:41:36.007778 kubelet[2486]: E0508 08:41:36.007669 2486 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4327-0-0-w-78bcb828ec.novalocal?timeout=10s\": dial tcp 172.24.4.129:6443: connect: connection refused" interval="800ms" May 8 08:41:36.112118 kubelet[2486]: I0508 08:41:36.111653 2486 kubelet_node_status.go:73] "Attempting to register node" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:36.112648 kubelet[2486]: E0508 08:41:36.112557 2486 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.129:6443/api/v1/nodes\": dial tcp 172.24.4.129:6443: connect: connection refused" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:36.187700 kubelet[2486]: W0508 08:41:36.187542 2486 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.129:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4327-0-0-w-78bcb828ec.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:36.187700 kubelet[2486]: E0508 08:41:36.187661 2486 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.129:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4327-0-0-w-78bcb828ec.novalocal&limit=500&resourceVersion=0": dial tcp 
172.24.4.129:6443: connect: connection refused May 8 08:41:36.385493 kubelet[2486]: W0508 08:41:36.385279 2486 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.129:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:36.385493 kubelet[2486]: E0508 08:41:36.385377 2486 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.129:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:36.477964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount710548289.mount: Deactivated successfully. May 8 08:41:36.488941 containerd[1535]: time="2025-05-08T08:41:36.488853417Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 8 08:41:36.492080 containerd[1535]: time="2025-05-08T08:41:36.491919193Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 8 08:41:36.494906 containerd[1535]: time="2025-05-08T08:41:36.494850688Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 8 08:41:36.496379 containerd[1535]: time="2025-05-08T08:41:36.496319319Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 8 08:41:36.500278 containerd[1535]: time="2025-05-08T08:41:36.500082760Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 8 08:41:36.503199 containerd[1535]: time="2025-05-08T08:41:36.503039167Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 8 08:41:36.504072 containerd[1535]: time="2025-05-08T08:41:36.503767127Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 8 08:41:36.512752 containerd[1535]: time="2025-05-08T08:41:36.512686481Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 8 08:41:36.515171 containerd[1535]: time="2025-05-08T08:41:36.515107651Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 627.93637ms" May 8 08:41:36.517478 containerd[1535]: time="2025-05-08T08:41:36.517413348Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest 
\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 654.87127ms" May 8 08:41:36.521025 containerd[1535]: time="2025-05-08T08:41:36.520934324Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 617.265879ms" May 8 08:41:36.590177 containerd[1535]: time="2025-05-08T08:41:36.589340215Z" level=info msg="connecting to shim 0654f7847ee51992150abc605270bd62592ed820bfac188a081eac71cb19489e" address="unix:///run/containerd/s/321da4eb83d6ee4dbbb7f6a36e8d75f5cadb13952a21ead6ceabb5fcbf7fea70" namespace=k8s.io protocol=ttrpc version=3 May 8 08:41:36.596967 containerd[1535]: time="2025-05-08T08:41:36.596907335Z" level=info msg="connecting to shim 9ec8b6a358655c55a8b547723c8de72d66d9a9f3ded45da2e97477f71240100b" address="unix:///run/containerd/s/d6cf72baa3f663aac55c14baac99f7b1d00d5043894ec0f004a64fa1ca83d0ae" namespace=k8s.io protocol=ttrpc version=3 May 8 08:41:36.597893 containerd[1535]: time="2025-05-08T08:41:36.597859763Z" level=info msg="connecting to shim 9d9bb516d9b493040459030c5d4d40896eaa2d18bef7d9e297ad6660beb9726e" address="unix:///run/containerd/s/1c21cf33551a0725c7236952d459337364f0bc74de6bf5d5fb19d3431ac61f52" namespace=k8s.io protocol=ttrpc version=3 May 8 08:41:36.632176 systemd[1]: Started cri-containerd-0654f7847ee51992150abc605270bd62592ed820bfac188a081eac71cb19489e.scope - libcontainer container 0654f7847ee51992150abc605270bd62592ed820bfac188a081eac71cb19489e. May 8 08:41:36.638645 systemd[1]: Started cri-containerd-9d9bb516d9b493040459030c5d4d40896eaa2d18bef7d9e297ad6660beb9726e.scope - libcontainer container 9d9bb516d9b493040459030c5d4d40896eaa2d18bef7d9e297ad6660beb9726e. May 8 08:41:36.640639 systemd[1]: Started cri-containerd-9ec8b6a358655c55a8b547723c8de72d66d9a9f3ded45da2e97477f71240100b.scope - libcontainer container 9ec8b6a358655c55a8b547723c8de72d66d9a9f3ded45da2e97477f71240100b. 
May 8 08:41:36.719237 containerd[1535]: time="2025-05-08T08:41:36.719082383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal,Uid:f1e1dc146ed23c241ef58db7d11f58b4,Namespace:kube-system,Attempt:0,} returns sandbox id \"9d9bb516d9b493040459030c5d4d40896eaa2d18bef7d9e297ad6660beb9726e\"" May 8 08:41:36.726523 containerd[1535]: time="2025-05-08T08:41:36.726492338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal,Uid:cbd8bd9be2208f566102c35b46840afe,Namespace:kube-system,Attempt:0,} returns sandbox id \"0654f7847ee51992150abc605270bd62592ed820bfac188a081eac71cb19489e\"" May 8 08:41:36.739004 containerd[1535]: time="2025-05-08T08:41:36.737389788Z" level=info msg="CreateContainer within sandbox \"9d9bb516d9b493040459030c5d4d40896eaa2d18bef7d9e297ad6660beb9726e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 8 08:41:36.744875 containerd[1535]: time="2025-05-08T08:41:36.744845545Z" level=info msg="CreateContainer within sandbox \"0654f7847ee51992150abc605270bd62592ed820bfac188a081eac71cb19489e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 8 08:41:36.746144 containerd[1535]: time="2025-05-08T08:41:36.746096915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal,Uid:ec6bcf07f6156dec7ab3aee37a6798a4,Namespace:kube-system,Attempt:0,} returns sandbox id \"9ec8b6a358655c55a8b547723c8de72d66d9a9f3ded45da2e97477f71240100b\"" May 8 08:41:36.748601 containerd[1535]: time="2025-05-08T08:41:36.748576335Z" level=info msg="CreateContainer within sandbox \"9ec8b6a358655c55a8b547723c8de72d66d9a9f3ded45da2e97477f71240100b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 8 08:41:36.763826 containerd[1535]: time="2025-05-08T08:41:36.763782180Z" level=info msg="Container 247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea: CDI devices from CRI Config.CDIDevices: []" May 8 08:41:36.767782 containerd[1535]: time="2025-05-08T08:41:36.767752054Z" level=info msg="Container e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165: CDI devices from CRI Config.CDIDevices: []" May 8 08:41:36.776187 containerd[1535]: time="2025-05-08T08:41:36.776153814Z" level=info msg="CreateContainer within sandbox \"0654f7847ee51992150abc605270bd62592ed820bfac188a081eac71cb19489e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea\"" May 8 08:41:36.776830 containerd[1535]: time="2025-05-08T08:41:36.776806018Z" level=info msg="StartContainer for \"247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea\"" May 8 08:41:36.777340 containerd[1535]: time="2025-05-08T08:41:36.777211399Z" level=info msg="Container cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428: CDI devices from CRI Config.CDIDevices: []" May 8 08:41:36.778644 containerd[1535]: time="2025-05-08T08:41:36.778620494Z" level=info msg="connecting to shim 247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea" address="unix:///run/containerd/s/321da4eb83d6ee4dbbb7f6a36e8d75f5cadb13952a21ead6ceabb5fcbf7fea70" protocol=ttrpc version=3 May 8 08:41:36.788059 containerd[1535]: time="2025-05-08T08:41:36.787964557Z" level=info msg="CreateContainer within sandbox \"9d9bb516d9b493040459030c5d4d40896eaa2d18bef7d9e297ad6660beb9726e\" for 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165\"" May 8 08:41:36.788915 containerd[1535]: time="2025-05-08T08:41:36.788889368Z" level=info msg="StartContainer for \"e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165\"" May 8 08:41:36.790083 containerd[1535]: time="2025-05-08T08:41:36.790056682Z" level=info msg="connecting to shim e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165" address="unix:///run/containerd/s/1c21cf33551a0725c7236952d459337364f0bc74de6bf5d5fb19d3431ac61f52" protocol=ttrpc version=3 May 8 08:41:36.799615 containerd[1535]: time="2025-05-08T08:41:36.798914667Z" level=info msg="CreateContainer within sandbox \"9ec8b6a358655c55a8b547723c8de72d66d9a9f3ded45da2e97477f71240100b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428\"" May 8 08:41:36.799940 containerd[1535]: time="2025-05-08T08:41:36.799914201Z" level=info msg="StartContainer for \"cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428\"" May 8 08:41:36.802132 containerd[1535]: time="2025-05-08T08:41:36.802108334Z" level=info msg="connecting to shim cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428" address="unix:///run/containerd/s/d6cf72baa3f663aac55c14baac99f7b1d00d5043894ec0f004a64fa1ca83d0ae" protocol=ttrpc version=3 May 8 08:41:36.805197 systemd[1]: Started cri-containerd-247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea.scope - libcontainer container 247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea. May 8 08:41:36.809257 kubelet[2486]: E0508 08:41:36.809199 2486 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.129:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4327-0-0-w-78bcb828ec.novalocal?timeout=10s\": dial tcp 172.24.4.129:6443: connect: connection refused" interval="1.6s" May 8 08:41:36.839170 systemd[1]: Started cri-containerd-cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428.scope - libcontainer container cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428. May 8 08:41:36.840569 systemd[1]: Started cri-containerd-e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165.scope - libcontainer container e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165. 
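Editor's note, not part of the journal: the repeated "dial tcp 172.24.4.129:6443: connect: connection refused" errors and the lease controller's growing retry interval above (200ms, 400ms, 800ms, 1.6s) show the kubelet backing off while the static control-plane pods it has just launched come up. The sketch below is not the kubelet's own code, only the same doubling-backoff wait against the endpoint taken from the log; the timeout and cap values are illustrative.

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// Wait for a TCP endpoint with doubling backoff, the retry pattern
// visible in the kubelet log while the API server is still starting.
func waitForEndpoint(addr string, start, max time.Duration) {
	interval := start
	for {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			return
		}
		fmt.Printf("dial %s failed (%v), retrying in %s\n", addr, err, interval)
		time.Sleep(interval)
		if interval < max {
			interval *= 2 // 200ms -> 400ms -> 800ms -> 1.6s, as in the log
		}
	}
}

func main() {
	waitForEndpoint("172.24.4.129:6443", 200*time.Millisecond, 7*time.Second)
}
```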
May 8 08:41:36.897801 kubelet[2486]: W0508 08:41:36.897654 2486 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.129:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:36.897801 kubelet[2486]: E0508 08:41:36.897724 2486 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.129:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:36.912965 containerd[1535]: time="2025-05-08T08:41:36.912829842Z" level=info msg="StartContainer for \"247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea\" returns successfully" May 8 08:41:36.916267 kubelet[2486]: I0508 08:41:36.916246 2486 kubelet_node_status.go:73] "Attempting to register node" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:36.917741 kubelet[2486]: E0508 08:41:36.917178 2486 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.129:6443/api/v1/nodes\": dial tcp 172.24.4.129:6443: connect: connection refused" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:36.948438 containerd[1535]: time="2025-05-08T08:41:36.948396141Z" level=info msg="StartContainer for \"cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428\" returns successfully" May 8 08:41:36.948817 containerd[1535]: time="2025-05-08T08:41:36.948785382Z" level=info msg="StartContainer for \"e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165\" returns successfully" May 8 08:41:36.997088 kubelet[2486]: W0508 08:41:36.997026 2486 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.129:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:36.997269 kubelet[2486]: E0508 08:41:36.997256 2486 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.129:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.129:6443: connect: connection refused May 8 08:41:38.521408 kubelet[2486]: I0508 08:41:38.520062 2486 kubelet_node_status.go:73] "Attempting to register node" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:38.949162 kubelet[2486]: E0508 08:41:38.949120 2486 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4327-0-0-w-78bcb828ec.novalocal\" not found" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:39.084961 kubelet[2486]: I0508 08:41:39.084760 2486 kubelet_node_status.go:76] "Successfully registered node" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:39.112773 kubelet[2486]: E0508 08:41:39.112734 2486 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4327-0-0-w-78bcb828ec.novalocal\" not found" May 8 08:41:39.213444 kubelet[2486]: E0508 08:41:39.213340 2486 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4327-0-0-w-78bcb828ec.novalocal\" not found" May 8 08:41:39.355142 kubelet[2486]: I0508 08:41:39.355104 2486 apiserver.go:52] "Watching apiserver" May 8 08:41:39.405213 kubelet[2486]: I0508 08:41:39.405168 2486 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 8 
08:41:39.469053 kubelet[2486]: E0508 08:41:39.468673 2486 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:40.232507 kubelet[2486]: W0508 08:41:40.232427 2486 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 8 08:41:41.547893 systemd[1]: Reload requested from client PID 2755 ('systemctl') (unit session-11.scope)... May 8 08:41:41.547928 systemd[1]: Reloading... May 8 08:41:41.672030 zram_generator::config[2798]: No configuration found. May 8 08:41:41.805429 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 8 08:41:41.969036 systemd[1]: Reloading finished in 420 ms. May 8 08:41:41.999094 kubelet[2486]: I0508 08:41:41.999030 2486 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 8 08:41:41.999142 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:41:42.010453 systemd[1]: kubelet.service: Deactivated successfully. May 8 08:41:42.010644 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:41:42.010692 systemd[1]: kubelet.service: Consumed 2.116s CPU time, 116M memory peak. May 8 08:41:42.012851 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 8 08:41:42.232422 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 8 08:41:42.242250 (kubelet)[2864]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 8 08:41:42.296445 kubelet[2864]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 8 08:41:42.296445 kubelet[2864]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 8 08:41:42.296445 kubelet[2864]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 8 08:41:42.297273 kubelet[2864]: I0508 08:41:42.296497 2864 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 8 08:41:42.301224 kubelet[2864]: I0508 08:41:42.301165 2864 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 8 08:41:42.301224 kubelet[2864]: I0508 08:41:42.301192 2864 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 8 08:41:42.301505 kubelet[2864]: I0508 08:41:42.301407 2864 server.go:927] "Client rotation is on, will bootstrap in background" May 8 08:41:42.306231 kubelet[2864]: I0508 08:41:42.303815 2864 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
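Editor's note, not part of the journal: the restarted kubelet (PID 2864) reports client rotation enabled and loads its credentials from /var/lib/kubelet/pki/kubelet-client-current.pem, as logged just above. A small sketch that reads that combined cert/key bundle and prints the client certificate's validity window; only the path comes from the log, the rest is illustrative.

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
)

// Print subject and validity of the first certificate in the kubelet's
// combined cert/key PEM bundle.
func main() {
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue // skip the private key block stored in the same file
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
		fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
			cert.Subject, cert.NotBefore, cert.NotAfter)
		return
	}
	fmt.Fprintln(os.Stderr, "no certificate block found")
}
```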
May 8 08:41:42.306231 kubelet[2864]: I0508 08:41:42.305184 2864 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 8 08:41:42.311942 kubelet[2864]: I0508 08:41:42.311895 2864 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 8 08:41:42.312173 kubelet[2864]: I0508 08:41:42.312091 2864 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 8 08:41:42.313023 kubelet[2864]: I0508 08:41:42.312116 2864 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4327-0-0-w-78bcb828ec.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 8 08:41:42.313023 kubelet[2864]: I0508 08:41:42.312296 2864 topology_manager.go:138] "Creating topology manager with none policy" May 8 08:41:42.313023 kubelet[2864]: I0508 08:41:42.312308 2864 container_manager_linux.go:301] "Creating device plugin manager" May 8 08:41:42.313023 kubelet[2864]: I0508 08:41:42.312341 2864 state_mem.go:36] "Initialized new in-memory state store" May 8 08:41:42.313023 kubelet[2864]: I0508 08:41:42.312409 2864 kubelet.go:400] "Attempting to sync node with API server" May 8 08:41:42.313701 kubelet[2864]: I0508 08:41:42.312421 2864 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 8 08:41:42.313701 kubelet[2864]: I0508 08:41:42.312440 2864 kubelet.go:312] "Adding apiserver pod source" May 8 08:41:42.313701 kubelet[2864]: I0508 08:41:42.312455 2864 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 8 08:41:42.313701 kubelet[2864]: I0508 08:41:42.313448 2864 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 8 08:41:42.313701 kubelet[2864]: I0508 08:41:42.313585 2864 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 8 08:41:42.316056 kubelet[2864]: I0508 08:41:42.313964 2864 server.go:1264] "Started kubelet" May 8 08:41:42.324669 kubelet[2864]: 
I0508 08:41:42.324618 2864 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 8 08:41:42.339026 kubelet[2864]: I0508 08:41:42.336377 2864 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 8 08:41:42.339026 kubelet[2864]: I0508 08:41:42.337427 2864 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 8 08:41:42.339026 kubelet[2864]: I0508 08:41:42.337449 2864 status_manager.go:217] "Starting to sync pod status with apiserver" May 8 08:41:42.339026 kubelet[2864]: I0508 08:41:42.337466 2864 kubelet.go:2337] "Starting kubelet main sync loop" May 8 08:41:42.339026 kubelet[2864]: E0508 08:41:42.337503 2864 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 8 08:41:42.345019 kubelet[2864]: I0508 08:41:42.344372 2864 volume_manager.go:291] "Starting Kubelet Volume Manager" May 8 08:41:42.347015 kubelet[2864]: I0508 08:41:42.329095 2864 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 8 08:41:42.347015 kubelet[2864]: I0508 08:41:42.345891 2864 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 8 08:41:42.347015 kubelet[2864]: I0508 08:41:42.346188 2864 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 8 08:41:42.347116 kubelet[2864]: I0508 08:41:42.347019 2864 server.go:455] "Adding debug handlers to kubelet server" May 8 08:41:42.347234 kubelet[2864]: I0508 08:41:42.347220 2864 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 8 08:41:42.347451 kubelet[2864]: I0508 08:41:42.347440 2864 reconciler.go:26] "Reconciler: start to sync state" May 8 08:41:42.366266 kubelet[2864]: I0508 08:41:42.365570 2864 factory.go:221] Registration of the systemd container factory successfully May 8 08:41:42.366567 kubelet[2864]: I0508 08:41:42.366545 2864 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 8 08:41:42.376863 kubelet[2864]: I0508 08:41:42.376829 2864 factory.go:221] Registration of the containerd container factory successfully May 8 08:41:42.381468 kubelet[2864]: E0508 08:41:42.376875 2864 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 8 08:41:42.430951 kubelet[2864]: I0508 08:41:42.430920 2864 cpu_manager.go:214] "Starting CPU manager" policy="none" May 8 08:41:42.430951 kubelet[2864]: I0508 08:41:42.430941 2864 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 8 08:41:42.430951 kubelet[2864]: I0508 08:41:42.430958 2864 state_mem.go:36] "Initialized new in-memory state store" May 8 08:41:42.431195 kubelet[2864]: I0508 08:41:42.431123 2864 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 8 08:41:42.431195 kubelet[2864]: I0508 08:41:42.431135 2864 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 8 08:41:42.431195 kubelet[2864]: I0508 08:41:42.431153 2864 policy_none.go:49] "None policy: Start" May 8 08:41:42.431893 kubelet[2864]: I0508 08:41:42.431875 2864 memory_manager.go:170] "Starting memorymanager" policy="None" May 8 08:41:42.431893 kubelet[2864]: I0508 08:41:42.431896 2864 state_mem.go:35] "Initializing new in-memory state store" May 8 08:41:42.432081 kubelet[2864]: I0508 08:41:42.432061 2864 state_mem.go:75] "Updated machine memory state" May 8 08:41:42.436538 kubelet[2864]: I0508 08:41:42.436505 2864 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 8 08:41:42.436692 kubelet[2864]: I0508 08:41:42.436644 2864 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 8 08:41:42.436756 kubelet[2864]: I0508 08:41:42.436737 2864 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 8 08:41:42.438074 kubelet[2864]: I0508 08:41:42.437895 2864 topology_manager.go:215] "Topology Admit Handler" podUID="ec6bcf07f6156dec7ab3aee37a6798a4" podNamespace="kube-system" podName="kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.438074 kubelet[2864]: I0508 08:41:42.437972 2864 topology_manager.go:215] "Topology Admit Handler" podUID="f1e1dc146ed23c241ef58db7d11f58b4" podNamespace="kube-system" podName="kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.438179 kubelet[2864]: I0508 08:41:42.438076 2864 topology_manager.go:215] "Topology Admit Handler" podUID="cbd8bd9be2208f566102c35b46840afe" podNamespace="kube-system" podName="kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.454805 kubelet[2864]: I0508 08:41:42.454205 2864 kubelet_node_status.go:73] "Attempting to register node" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.457520 kubelet[2864]: W0508 08:41:42.457502 2864 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 8 08:41:42.459081 kubelet[2864]: W0508 08:41:42.459067 2864 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 8 08:41:42.462904 kubelet[2864]: W0508 08:41:42.462797 2864 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 8 08:41:42.462904 kubelet[2864]: E0508 08:41:42.462855 2864 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal\" already exists" pod="kube-system/kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.481286 kubelet[2864]: I0508 08:41:42.481076 2864 
kubelet_node_status.go:112] "Node was previously registered" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.481286 kubelet[2864]: I0508 08:41:42.481178 2864 kubelet_node_status.go:76] "Successfully registered node" node="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549206 kubelet[2864]: I0508 08:41:42.549026 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-flexvolume-dir\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549206 kubelet[2864]: I0508 08:41:42.549066 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-k8s-certs\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549206 kubelet[2864]: I0508 08:41:42.549090 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-kubeconfig\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549206 kubelet[2864]: I0508 08:41:42.549128 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549438 kubelet[2864]: I0508 08:41:42.549154 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f1e1dc146ed23c241ef58db7d11f58b4-ca-certs\") pod \"kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"f1e1dc146ed23c241ef58db7d11f58b4\") " pod="kube-system/kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549438 kubelet[2864]: I0508 08:41:42.549173 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f1e1dc146ed23c241ef58db7d11f58b4-k8s-certs\") pod \"kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"f1e1dc146ed23c241ef58db7d11f58b4\") " pod="kube-system/kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549438 kubelet[2864]: I0508 08:41:42.549193 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f1e1dc146ed23c241ef58db7d11f58b4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"f1e1dc146ed23c241ef58db7d11f58b4\") " pod="kube-system/kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549438 kubelet[2864]: I0508 08:41:42.549220 2864 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cbd8bd9be2208f566102c35b46840afe-ca-certs\") pod \"kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"cbd8bd9be2208f566102c35b46840afe\") " pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:42.549438 kubelet[2864]: I0508 08:41:42.549241 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ec6bcf07f6156dec7ab3aee37a6798a4-kubeconfig\") pod \"kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal\" (UID: \"ec6bcf07f6156dec7ab3aee37a6798a4\") " pod="kube-system/kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:41:43.317905 kubelet[2864]: I0508 08:41:43.317273 2864 apiserver.go:52] "Watching apiserver" May 8 08:41:43.349313 kubelet[2864]: I0508 08:41:43.349252 2864 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 8 08:41:43.448591 kubelet[2864]: I0508 08:41:43.448533 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4327-0-0-w-78bcb828ec.novalocal" podStartSLOduration=3.448514881 podStartE2EDuration="3.448514881s" podCreationTimestamp="2025-05-08 08:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-08 08:41:43.425816348 +0000 UTC m=+1.178133682" watchObservedRunningTime="2025-05-08 08:41:43.448514881 +0000 UTC m=+1.200832215" May 8 08:41:43.466305 kubelet[2864]: I0508 08:41:43.465778 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4327-0-0-w-78bcb828ec.novalocal" podStartSLOduration=1.4657602650000001 podStartE2EDuration="1.465760265s" podCreationTimestamp="2025-05-08 08:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-08 08:41:43.450540837 +0000 UTC m=+1.202858161" watchObservedRunningTime="2025-05-08 08:41:43.465760265 +0000 UTC m=+1.218077589" May 8 08:41:43.485271 kubelet[2864]: I0508 08:41:43.485221 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4327-0-0-w-78bcb828ec.novalocal" podStartSLOduration=1.485201677 podStartE2EDuration="1.485201677s" podCreationTimestamp="2025-05-08 08:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-08 08:41:43.466436712 +0000 UTC m=+1.218754047" watchObservedRunningTime="2025-05-08 08:41:43.485201677 +0000 UTC m=+1.237519001" May 8 08:41:49.174031 sudo[1809]: pam_unix(sudo:session): session closed for user root May 8 08:41:49.327769 sshd[1808]: Connection closed by 172.24.4.1 port 48680 May 8 08:41:49.328861 sshd-session[1805]: pam_unix(sshd:session): session closed for user core May 8 08:41:49.337598 systemd-logind[1511]: Session 11 logged out. Waiting for processes to exit. May 8 08:41:49.338543 systemd[1]: sshd@8-172.24.4.129:22-172.24.4.1:48680.service: Deactivated successfully. May 8 08:41:49.343459 systemd[1]: session-11.scope: Deactivated successfully. May 8 08:41:49.345308 systemd[1]: session-11.scope: Consumed 7.735s CPU time, 256.4M memory peak. May 8 08:41:49.351903 systemd-logind[1511]: Removed session 11. 
May 8 08:41:56.669101 kubelet[2864]: I0508 08:41:56.668891 2864 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 8 08:41:56.669537 kubelet[2864]: I0508 08:41:56.669495 2864 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 8 08:41:56.669581 containerd[1535]: time="2025-05-08T08:41:56.669313835Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 8 08:41:56.727310 kubelet[2864]: I0508 08:41:56.727227 2864 topology_manager.go:215] "Topology Admit Handler" podUID="d905e7fa-0038-413f-a65e-05db56bc9c9f" podNamespace="kube-system" podName="kube-proxy-nx7ll" May 8 08:41:56.736763 systemd[1]: Created slice kubepods-besteffort-podd905e7fa_0038_413f_a65e_05db56bc9c9f.slice - libcontainer container kubepods-besteffort-podd905e7fa_0038_413f_a65e_05db56bc9c9f.slice. May 8 08:41:56.833084 kubelet[2864]: I0508 08:41:56.832695 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d905e7fa-0038-413f-a65e-05db56bc9c9f-lib-modules\") pod \"kube-proxy-nx7ll\" (UID: \"d905e7fa-0038-413f-a65e-05db56bc9c9f\") " pod="kube-system/kube-proxy-nx7ll" May 8 08:41:56.833084 kubelet[2864]: I0508 08:41:56.832781 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d905e7fa-0038-413f-a65e-05db56bc9c9f-kube-proxy\") pod \"kube-proxy-nx7ll\" (UID: \"d905e7fa-0038-413f-a65e-05db56bc9c9f\") " pod="kube-system/kube-proxy-nx7ll" May 8 08:41:56.833084 kubelet[2864]: I0508 08:41:56.832828 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d905e7fa-0038-413f-a65e-05db56bc9c9f-xtables-lock\") pod \"kube-proxy-nx7ll\" (UID: \"d905e7fa-0038-413f-a65e-05db56bc9c9f\") " pod="kube-system/kube-proxy-nx7ll" May 8 08:41:56.833084 kubelet[2864]: I0508 08:41:56.832879 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkcgs\" (UniqueName: \"kubernetes.io/projected/d905e7fa-0038-413f-a65e-05db56bc9c9f-kube-api-access-jkcgs\") pod \"kube-proxy-nx7ll\" (UID: \"d905e7fa-0038-413f-a65e-05db56bc9c9f\") " pod="kube-system/kube-proxy-nx7ll" May 8 08:41:56.946177 kubelet[2864]: E0508 08:41:56.945875 2864 projected.go:294] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found May 8 08:41:56.946177 kubelet[2864]: E0508 08:41:56.945932 2864 projected.go:200] Error preparing data for projected volume kube-api-access-jkcgs for pod kube-system/kube-proxy-nx7ll: configmap "kube-root-ca.crt" not found May 8 08:41:56.946177 kubelet[2864]: E0508 08:41:56.946101 2864 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d905e7fa-0038-413f-a65e-05db56bc9c9f-kube-api-access-jkcgs podName:d905e7fa-0038-413f-a65e-05db56bc9c9f nodeName:}" failed. No retries permitted until 2025-05-08 08:41:57.446061794 +0000 UTC m=+15.198379168 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jkcgs" (UniqueName: "kubernetes.io/projected/d905e7fa-0038-413f-a65e-05db56bc9c9f-kube-api-access-jkcgs") pod "kube-proxy-nx7ll" (UID: "d905e7fa-0038-413f-a65e-05db56bc9c9f") : configmap "kube-root-ca.crt" not found May 8 08:41:57.649550 containerd[1535]: time="2025-05-08T08:41:57.648355286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nx7ll,Uid:d905e7fa-0038-413f-a65e-05db56bc9c9f,Namespace:kube-system,Attempt:0,}" May 8 08:41:57.703005 containerd[1535]: time="2025-05-08T08:41:57.700171783Z" level=info msg="connecting to shim 0a6dce8666649044275e88a14e7bdf2610a1322593752b7c92b2e9daddf1628b" address="unix:///run/containerd/s/8f22b4157a8833ceef33c4fe9d6eda7ec6f1d697414925d98d91effb11e81ddc" namespace=k8s.io protocol=ttrpc version=3 May 8 08:41:57.745206 systemd[1]: Started cri-containerd-0a6dce8666649044275e88a14e7bdf2610a1322593752b7c92b2e9daddf1628b.scope - libcontainer container 0a6dce8666649044275e88a14e7bdf2610a1322593752b7c92b2e9daddf1628b. May 8 08:41:57.801122 kubelet[2864]: I0508 08:41:57.801063 2864 topology_manager.go:215] "Topology Admit Handler" podUID="fa2fe4bd-d0ce-4f8d-92cc-41c5524d33fa" podNamespace="tigera-operator" podName="tigera-operator-797db67f8-tbpnw" May 8 08:41:57.809398 systemd[1]: Created slice kubepods-besteffort-podfa2fe4bd_d0ce_4f8d_92cc_41c5524d33fa.slice - libcontainer container kubepods-besteffort-podfa2fe4bd_d0ce_4f8d_92cc_41c5524d33fa.slice. May 8 08:41:57.838892 kubelet[2864]: I0508 08:41:57.838672 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fa2fe4bd-d0ce-4f8d-92cc-41c5524d33fa-var-lib-calico\") pod \"tigera-operator-797db67f8-tbpnw\" (UID: \"fa2fe4bd-d0ce-4f8d-92cc-41c5524d33fa\") " pod="tigera-operator/tigera-operator-797db67f8-tbpnw" May 8 08:41:57.838892 kubelet[2864]: I0508 08:41:57.838719 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh6jj\" (UniqueName: \"kubernetes.io/projected/fa2fe4bd-d0ce-4f8d-92cc-41c5524d33fa-kube-api-access-wh6jj\") pod \"tigera-operator-797db67f8-tbpnw\" (UID: \"fa2fe4bd-d0ce-4f8d-92cc-41c5524d33fa\") " pod="tigera-operator/tigera-operator-797db67f8-tbpnw" May 8 08:41:57.893945 containerd[1535]: time="2025-05-08T08:41:57.893903497Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nx7ll,Uid:d905e7fa-0038-413f-a65e-05db56bc9c9f,Namespace:kube-system,Attempt:0,} returns sandbox id \"0a6dce8666649044275e88a14e7bdf2610a1322593752b7c92b2e9daddf1628b\"" May 8 08:41:57.897855 containerd[1535]: time="2025-05-08T08:41:57.897826059Z" level=info msg="CreateContainer within sandbox \"0a6dce8666649044275e88a14e7bdf2610a1322593752b7c92b2e9daddf1628b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 8 08:41:57.918592 containerd[1535]: time="2025-05-08T08:41:57.918483296Z" level=info msg="Container 769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c: CDI devices from CRI Config.CDIDevices: []" May 8 08:41:57.934483 containerd[1535]: time="2025-05-08T08:41:57.934375781Z" level=info msg="CreateContainer within sandbox \"0a6dce8666649044275e88a14e7bdf2610a1322593752b7c92b2e9daddf1628b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c\"" May 8 08:41:57.936340 containerd[1535]: time="2025-05-08T08:41:57.935204883Z" level=info 
msg="StartContainer for \"769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c\"" May 8 08:41:57.937042 containerd[1535]: time="2025-05-08T08:41:57.937018745Z" level=info msg="connecting to shim 769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c" address="unix:///run/containerd/s/8f22b4157a8833ceef33c4fe9d6eda7ec6f1d697414925d98d91effb11e81ddc" protocol=ttrpc version=3 May 8 08:41:57.962133 systemd[1]: Started cri-containerd-769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c.scope - libcontainer container 769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c. May 8 08:41:58.018306 containerd[1535]: time="2025-05-08T08:41:58.018277440Z" level=info msg="StartContainer for \"769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c\" returns successfully" May 8 08:41:58.116929 containerd[1535]: time="2025-05-08T08:41:58.116892085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-tbpnw,Uid:fa2fe4bd-d0ce-4f8d-92cc-41c5524d33fa,Namespace:tigera-operator,Attempt:0,}" May 8 08:41:58.155283 containerd[1535]: time="2025-05-08T08:41:58.155244476Z" level=info msg="connecting to shim cb3023a8ccd79dc08236fadf977953f91bb371344adaf3203109efa951845732" address="unix:///run/containerd/s/06f70fe599ea6da7692438c098ceae9fb8a615cf9e2a250eb85be6fc08aa3ebf" namespace=k8s.io protocol=ttrpc version=3 May 8 08:41:58.194507 systemd[1]: Started cri-containerd-cb3023a8ccd79dc08236fadf977953f91bb371344adaf3203109efa951845732.scope - libcontainer container cb3023a8ccd79dc08236fadf977953f91bb371344adaf3203109efa951845732. May 8 08:41:58.279682 containerd[1535]: time="2025-05-08T08:41:58.279632973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-tbpnw,Uid:fa2fe4bd-d0ce-4f8d-92cc-41c5524d33fa,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"cb3023a8ccd79dc08236fadf977953f91bb371344adaf3203109efa951845732\"" May 8 08:41:58.283695 containerd[1535]: time="2025-05-08T08:41:58.283449280Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 8 08:41:58.457926 kubelet[2864]: I0508 08:41:58.457644 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-nx7ll" podStartSLOduration=2.457619851 podStartE2EDuration="2.457619851s" podCreationTimestamp="2025-05-08 08:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-08 08:41:58.456732387 +0000 UTC m=+16.209049721" watchObservedRunningTime="2025-05-08 08:41:58.457619851 +0000 UTC m=+16.209937195" May 8 08:42:00.135708 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1869679549.mount: Deactivated successfully. 
May 8 08:42:00.748003 containerd[1535]: time="2025-05-08T08:42:00.747894729Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:00.749762 containerd[1535]: time="2025-05-08T08:42:00.749397098Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 8 08:42:00.753018 containerd[1535]: time="2025-05-08T08:42:00.752177960Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:00.756850 containerd[1535]: time="2025-05-08T08:42:00.756809283Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:00.757455 containerd[1535]: time="2025-05-08T08:42:00.757424698Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.473936241s" May 8 08:42:00.757544 containerd[1535]: time="2025-05-08T08:42:00.757526506Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 8 08:42:00.760564 containerd[1535]: time="2025-05-08T08:42:00.760527119Z" level=info msg="CreateContainer within sandbox \"cb3023a8ccd79dc08236fadf977953f91bb371344adaf3203109efa951845732\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 8 08:42:00.774178 containerd[1535]: time="2025-05-08T08:42:00.774142684Z" level=info msg="Container d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:00.786333 containerd[1535]: time="2025-05-08T08:42:00.786284614Z" level=info msg="CreateContainer within sandbox \"cb3023a8ccd79dc08236fadf977953f91bb371344adaf3203109efa951845732\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490\"" May 8 08:42:00.787140 containerd[1535]: time="2025-05-08T08:42:00.786877714Z" level=info msg="StartContainer for \"d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490\"" May 8 08:42:00.787780 containerd[1535]: time="2025-05-08T08:42:00.787752205Z" level=info msg="connecting to shim d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490" address="unix:///run/containerd/s/06f70fe599ea6da7692438c098ceae9fb8a615cf9e2a250eb85be6fc08aa3ebf" protocol=ttrpc version=3 May 8 08:42:00.818306 systemd[1]: Started cri-containerd-d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490.scope - libcontainer container d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490. 
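For scale, the pull above reports 21,998,657 bytes ("size \"21998657\"") fetched in 2.473936241 s, with 22,002,662 bytes read during the pull. A quick Go calculation of the transfer rate implied by the two logged figures:

```go
package main

import "fmt"

func main() {
	// Both constants are copied from the containerd log lines above.
	const pulledBytes = 21998657.0  // repo "size" reported for quay.io/tigera/operator:v1.36.7
	const pullSeconds = 2.473936241 // "in 2.473936241s"

	mib := pulledBytes / (1024 * 1024)
	fmt.Printf("%.1f MiB in %.3f s ~ %.2f MiB/s\n", mib, pullSeconds, mib/pullSeconds)
}
```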
May 8 08:42:00.865038 containerd[1535]: time="2025-05-08T08:42:00.864912391Z" level=info msg="StartContainer for \"d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490\" returns successfully" May 8 08:42:04.171182 kubelet[2864]: I0508 08:42:04.171057 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-797db67f8-tbpnw" podStartSLOduration=4.6938559810000005 podStartE2EDuration="7.171038558s" podCreationTimestamp="2025-05-08 08:41:57 +0000 UTC" firstStartedPulling="2025-05-08 08:41:58.281849887 +0000 UTC m=+16.034167211" lastFinishedPulling="2025-05-08 08:42:00.759032454 +0000 UTC m=+18.511349788" observedRunningTime="2025-05-08 08:42:01.475477785 +0000 UTC m=+19.227795159" watchObservedRunningTime="2025-05-08 08:42:04.171038558 +0000 UTC m=+21.923355882" May 8 08:42:04.171667 kubelet[2864]: I0508 08:42:04.171200 2864 topology_manager.go:215] "Topology Admit Handler" podUID="6840bd30-2c02-4b96-bff5-95b83081dad7" podNamespace="calico-system" podName="calico-typha-5f94b5b785-f825v" May 8 08:42:04.180381 systemd[1]: Created slice kubepods-besteffort-pod6840bd30_2c02_4b96_bff5_95b83081dad7.slice - libcontainer container kubepods-besteffort-pod6840bd30_2c02_4b96_bff5_95b83081dad7.slice. May 8 08:42:04.183088 kubelet[2864]: I0508 08:42:04.182253 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmwtm\" (UniqueName: \"kubernetes.io/projected/6840bd30-2c02-4b96-bff5-95b83081dad7-kube-api-access-lmwtm\") pod \"calico-typha-5f94b5b785-f825v\" (UID: \"6840bd30-2c02-4b96-bff5-95b83081dad7\") " pod="calico-system/calico-typha-5f94b5b785-f825v" May 8 08:42:04.183088 kubelet[2864]: I0508 08:42:04.182292 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6840bd30-2c02-4b96-bff5-95b83081dad7-typha-certs\") pod \"calico-typha-5f94b5b785-f825v\" (UID: \"6840bd30-2c02-4b96-bff5-95b83081dad7\") " pod="calico-system/calico-typha-5f94b5b785-f825v" May 8 08:42:04.183088 kubelet[2864]: I0508 08:42:04.182321 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6840bd30-2c02-4b96-bff5-95b83081dad7-tigera-ca-bundle\") pod \"calico-typha-5f94b5b785-f825v\" (UID: \"6840bd30-2c02-4b96-bff5-95b83081dad7\") " pod="calico-system/calico-typha-5f94b5b785-f825v" May 8 08:42:04.383716 kubelet[2864]: I0508 08:42:04.382884 2864 topology_manager.go:215] "Topology Admit Handler" podUID="440d3895-87f6-4e5c-90ff-ff7cdd99b071" podNamespace="calico-system" podName="calico-node-t5jrg" May 8 08:42:04.392432 systemd[1]: Created slice kubepods-besteffort-pod440d3895_87f6_4e5c_90ff_ff7cdd99b071.slice - libcontainer container kubepods-besteffort-pod440d3895_87f6_4e5c_90ff_ff7cdd99b071.slice. 
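The startup-latency entry at 08:42:04 is consistent with podStartE2EDuration running from podCreationTimestamp to watchObservedRunningTime, and podStartSLOduration being that span minus the image-pull window (firstStartedPulling to lastFinishedPulling); this reading is inferred from the numbers in the entry rather than taken from kubelet documentation. A short Go check using the timestamps copied from the log:

```go
package main

import (
	"fmt"
	"time"
)

// Timestamps copied verbatim from the pod_startup_latency_tracker entry for
// tigera-operator-797db67f8-tbpnw; the field interpretation is an inference.
const layout = "2006-01-02 15:04:05 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-05-08 08:41:57 +0000 UTC")
	firstPull := mustParse("2025-05-08 08:41:58.281849887 +0000 UTC")
	lastPull := mustParse("2025-05-08 08:42:00.759032454 +0000 UTC")
	observed := mustParse("2025-05-08 08:42:04.171038558 +0000 UTC")

	e2e := observed.Sub(created)    // 7.171038558s = podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 2.477182567s spent pulling the image
	slo := e2e - pull               // ~4.693855991s, podStartSLOduration up to float rounding
	fmt.Println("e2e:", e2e, "pull:", pull, "slo:", slo)
}
```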
May 8 08:42:04.484073 kubelet[2864]: I0508 08:42:04.483147 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/440d3895-87f6-4e5c-90ff-ff7cdd99b071-tigera-ca-bundle\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484073 kubelet[2864]: I0508 08:42:04.483195 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-lib-modules\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484073 kubelet[2864]: I0508 08:42:04.483215 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-policysync\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484073 kubelet[2864]: I0508 08:42:04.483237 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-xtables-lock\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484073 kubelet[2864]: I0508 08:42:04.483260 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-cni-log-dir\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484356 kubelet[2864]: I0508 08:42:04.483283 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-cni-bin-dir\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484356 kubelet[2864]: I0508 08:42:04.483304 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtz2x\" (UniqueName: \"kubernetes.io/projected/440d3895-87f6-4e5c-90ff-ff7cdd99b071-kube-api-access-vtz2x\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484356 kubelet[2864]: I0508 08:42:04.483329 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/440d3895-87f6-4e5c-90ff-ff7cdd99b071-node-certs\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484356 kubelet[2864]: I0508 08:42:04.483346 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-var-run-calico\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484356 kubelet[2864]: I0508 08:42:04.483367 2864 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-var-lib-calico\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484579 kubelet[2864]: I0508 08:42:04.483384 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-cni-net-dir\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.484579 kubelet[2864]: I0508 08:42:04.483407 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/440d3895-87f6-4e5c-90ff-ff7cdd99b071-flexvol-driver-host\") pod \"calico-node-t5jrg\" (UID: \"440d3895-87f6-4e5c-90ff-ff7cdd99b071\") " pod="calico-system/calico-node-t5jrg" May 8 08:42:04.486601 containerd[1535]: time="2025-05-08T08:42:04.486400088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5f94b5b785-f825v,Uid:6840bd30-2c02-4b96-bff5-95b83081dad7,Namespace:calico-system,Attempt:0,}" May 8 08:42:04.518670 kubelet[2864]: I0508 08:42:04.516931 2864 topology_manager.go:215] "Topology Admit Handler" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" podNamespace="calico-system" podName="csi-node-driver-hfrk7" May 8 08:42:04.518670 kubelet[2864]: E0508 08:42:04.517251 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:04.538295 containerd[1535]: time="2025-05-08T08:42:04.538239576Z" level=info msg="connecting to shim 3ef9f70c56203e4522d049a3b382b54c3af9b2fd10eb666f50cda61e8136216c" address="unix:///run/containerd/s/882520e6e6a5f976a9ceb702161becd6d476d18e0735fc70c68657259823b077" namespace=k8s.io protocol=ttrpc version=3 May 8 08:42:04.584078 kubelet[2864]: I0508 08:42:04.583679 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/55738467-875a-4dbf-84b4-c2a4f16cd05d-socket-dir\") pod \"csi-node-driver-hfrk7\" (UID: \"55738467-875a-4dbf-84b4-c2a4f16cd05d\") " pod="calico-system/csi-node-driver-hfrk7" May 8 08:42:04.584078 kubelet[2864]: I0508 08:42:04.583735 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55738467-875a-4dbf-84b4-c2a4f16cd05d-registration-dir\") pod \"csi-node-driver-hfrk7\" (UID: \"55738467-875a-4dbf-84b4-c2a4f16cd05d\") " pod="calico-system/csi-node-driver-hfrk7" May 8 08:42:04.584078 kubelet[2864]: I0508 08:42:04.583790 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/55738467-875a-4dbf-84b4-c2a4f16cd05d-varrun\") pod \"csi-node-driver-hfrk7\" (UID: \"55738467-875a-4dbf-84b4-c2a4f16cd05d\") " pod="calico-system/csi-node-driver-hfrk7" May 8 08:42:04.584078 kubelet[2864]: I0508 08:42:04.583851 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-62zvb\" (UniqueName: \"kubernetes.io/projected/55738467-875a-4dbf-84b4-c2a4f16cd05d-kube-api-access-62zvb\") pod \"csi-node-driver-hfrk7\" (UID: \"55738467-875a-4dbf-84b4-c2a4f16cd05d\") " pod="calico-system/csi-node-driver-hfrk7" May 8 08:42:04.584078 kubelet[2864]: I0508 08:42:04.583925 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55738467-875a-4dbf-84b4-c2a4f16cd05d-kubelet-dir\") pod \"csi-node-driver-hfrk7\" (UID: \"55738467-875a-4dbf-84b4-c2a4f16cd05d\") " pod="calico-system/csi-node-driver-hfrk7" May 8 08:42:04.586020 systemd[1]: Started cri-containerd-3ef9f70c56203e4522d049a3b382b54c3af9b2fd10eb666f50cda61e8136216c.scope - libcontainer container 3ef9f70c56203e4522d049a3b382b54c3af9b2fd10eb666f50cda61e8136216c. May 8 08:42:04.592460 kubelet[2864]: E0508 08:42:04.592397 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.592460 kubelet[2864]: W0508 08:42:04.592422 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.592460 kubelet[2864]: E0508 08:42:04.592460 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.594107 kubelet[2864]: E0508 08:42:04.594074 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.594184 kubelet[2864]: W0508 08:42:04.594109 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.594184 kubelet[2864]: E0508 08:42:04.594138 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.596833 kubelet[2864]: E0508 08:42:04.595931 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.596833 kubelet[2864]: W0508 08:42:04.595947 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.596833 kubelet[2864]: E0508 08:42:04.596188 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.596833 kubelet[2864]: W0508 08:42:04.596197 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.596833 kubelet[2864]: E0508 08:42:04.596211 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.596833 kubelet[2864]: E0508 08:42:04.596242 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:04.600492 kubelet[2864]: E0508 08:42:04.597099 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.600492 kubelet[2864]: W0508 08:42:04.597116 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.600492 kubelet[2864]: E0508 08:42:04.597128 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.600492 kubelet[2864]: E0508 08:42:04.598115 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.600492 kubelet[2864]: W0508 08:42:04.598126 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.600492 kubelet[2864]: E0508 08:42:04.598136 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.630241 kubelet[2864]: E0508 08:42:04.629776 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.630444 kubelet[2864]: W0508 08:42:04.630425 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.630536 kubelet[2864]: E0508 08:42:04.630523 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.685089 kubelet[2864]: E0508 08:42:04.684950 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.685089 kubelet[2864]: W0508 08:42:04.684977 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.685089 kubelet[2864]: E0508 08:42:04.685023 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.685524 kubelet[2864]: E0508 08:42:04.685358 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.685524 kubelet[2864]: W0508 08:42:04.685371 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.685524 kubelet[2864]: E0508 08:42:04.685394 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:04.686036 kubelet[2864]: E0508 08:42:04.685624 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.686036 kubelet[2864]: W0508 08:42:04.685638 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.686036 kubelet[2864]: E0508 08:42:04.685673 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.686173 kubelet[2864]: E0508 08:42:04.686106 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.686173 kubelet[2864]: W0508 08:42:04.686125 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.686173 kubelet[2864]: E0508 08:42:04.686150 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.686524 kubelet[2864]: E0508 08:42:04.686373 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.686524 kubelet[2864]: W0508 08:42:04.686394 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.686524 kubelet[2864]: E0508 08:42:04.686421 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.686893 kubelet[2864]: E0508 08:42:04.686821 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.686893 kubelet[2864]: W0508 08:42:04.686832 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.686893 kubelet[2864]: E0508 08:42:04.686860 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.687155 kubelet[2864]: E0508 08:42:04.687134 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.687155 kubelet[2864]: W0508 08:42:04.687151 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.687387 kubelet[2864]: E0508 08:42:04.687252 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:04.687589 kubelet[2864]: E0508 08:42:04.687554 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.687589 kubelet[2864]: W0508 08:42:04.687569 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.687846 kubelet[2864]: E0508 08:42:04.687690 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.688060 kubelet[2864]: E0508 08:42:04.688041 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.688060 kubelet[2864]: W0508 08:42:04.688057 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.688337 kubelet[2864]: E0508 08:42:04.688118 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.688337 kubelet[2864]: E0508 08:42:04.688258 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.688337 kubelet[2864]: W0508 08:42:04.688269 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.688509 kubelet[2864]: E0508 08:42:04.688481 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.688633 kubelet[2864]: E0508 08:42:04.688615 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.688633 kubelet[2864]: W0508 08:42:04.688631 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.688920 kubelet[2864]: E0508 08:42:04.688883 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.689361 kubelet[2864]: E0508 08:42:04.689218 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.689361 kubelet[2864]: W0508 08:42:04.689292 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.689361 kubelet[2864]: E0508 08:42:04.689320 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:04.690084 kubelet[2864]: E0508 08:42:04.690060 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.690084 kubelet[2864]: W0508 08:42:04.690081 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.690166 kubelet[2864]: E0508 08:42:04.690122 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.690390 kubelet[2864]: E0508 08:42:04.690370 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.690390 kubelet[2864]: W0508 08:42:04.690385 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.690580 kubelet[2864]: E0508 08:42:04.690483 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.690753 kubelet[2864]: E0508 08:42:04.690590 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.690753 kubelet[2864]: W0508 08:42:04.690601 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.690753 kubelet[2864]: E0508 08:42:04.690676 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.692119 kubelet[2864]: E0508 08:42:04.692071 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.692119 kubelet[2864]: W0508 08:42:04.692086 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.692205 kubelet[2864]: E0508 08:42:04.692192 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.692450 kubelet[2864]: E0508 08:42:04.692430 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.692450 kubelet[2864]: W0508 08:42:04.692443 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.692631 kubelet[2864]: E0508 08:42:04.692602 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:04.692696 kubelet[2864]: E0508 08:42:04.692658 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.692696 kubelet[2864]: W0508 08:42:04.692672 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.692865 kubelet[2864]: E0508 08:42:04.692787 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.692865 kubelet[2864]: E0508 08:42:04.692804 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.692865 kubelet[2864]: W0508 08:42:04.692813 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.693065 kubelet[2864]: E0508 08:42:04.692902 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.693153 kubelet[2864]: E0508 08:42:04.693107 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.693153 kubelet[2864]: W0508 08:42:04.693116 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.693153 kubelet[2864]: E0508 08:42:04.693133 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.693321 kubelet[2864]: E0508 08:42:04.693289 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.693321 kubelet[2864]: W0508 08:42:04.693298 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.693424 kubelet[2864]: E0508 08:42:04.693321 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.693500 kubelet[2864]: E0508 08:42:04.693480 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.693500 kubelet[2864]: W0508 08:42:04.693493 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.693600 kubelet[2864]: E0508 08:42:04.693506 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:04.694055 kubelet[2864]: E0508 08:42:04.694028 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.694055 kubelet[2864]: W0508 08:42:04.694043 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.694055 kubelet[2864]: E0508 08:42:04.694060 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.695210 kubelet[2864]: E0508 08:42:04.695129 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.695210 kubelet[2864]: W0508 08:42:04.695146 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.695210 kubelet[2864]: E0508 08:42:04.695168 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.695413 kubelet[2864]: E0508 08:42:04.695400 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.695413 kubelet[2864]: W0508 08:42:04.695411 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.695468 kubelet[2864]: E0508 08:42:04.695422 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:04.698946 containerd[1535]: time="2025-05-08T08:42:04.698832649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-t5jrg,Uid:440d3895-87f6-4e5c-90ff-ff7cdd99b071,Namespace:calico-system,Attempt:0,}" May 8 08:42:04.716346 kubelet[2864]: E0508 08:42:04.716166 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:04.716346 kubelet[2864]: W0508 08:42:04.716193 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:04.716346 kubelet[2864]: E0508 08:42:04.716213 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:04.748337 containerd[1535]: time="2025-05-08T08:42:04.748209649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5f94b5b785-f825v,Uid:6840bd30-2c02-4b96-bff5-95b83081dad7,Namespace:calico-system,Attempt:0,} returns sandbox id \"3ef9f70c56203e4522d049a3b382b54c3af9b2fd10eb666f50cda61e8136216c\"" May 8 08:42:04.752713 containerd[1535]: time="2025-05-08T08:42:04.752125492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 8 08:42:04.752713 containerd[1535]: time="2025-05-08T08:42:04.752668659Z" level=info msg="connecting to shim 5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172" address="unix:///run/containerd/s/1082b432953f868a99114a80f333a3a520c2e13784f7bcdb704fd0b9b8440e96" namespace=k8s.io protocol=ttrpc version=3 May 8 08:42:04.793235 systemd[1]: Started cri-containerd-5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172.scope - libcontainer container 5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172. May 8 08:42:04.839953 containerd[1535]: time="2025-05-08T08:42:04.839896938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-t5jrg,Uid:440d3895-87f6-4e5c-90ff-ff7cdd99b071,Namespace:calico-system,Attempt:0,} returns sandbox id \"5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172\"" May 8 08:42:06.340039 kubelet[2864]: E0508 08:42:06.338464 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:08.059406 containerd[1535]: time="2025-05-08T08:42:08.059366778Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:08.061022 containerd[1535]: time="2025-05-08T08:42:08.060992293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 8 08:42:08.063219 containerd[1535]: time="2025-05-08T08:42:08.062842543Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:08.065835 containerd[1535]: time="2025-05-08T08:42:08.065800086Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:08.066338 containerd[1535]: time="2025-05-08T08:42:08.066291256Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.314129081s" May 8 08:42:08.066338 containerd[1535]: time="2025-05-08T08:42:08.066321785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 8 08:42:08.068060 containerd[1535]: time="2025-05-08T08:42:08.068040130Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 
8 08:42:08.083931 containerd[1535]: time="2025-05-08T08:42:08.082094392Z" level=info msg="CreateContainer within sandbox \"3ef9f70c56203e4522d049a3b382b54c3af9b2fd10eb666f50cda61e8136216c\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 8 08:42:08.100533 containerd[1535]: time="2025-05-08T08:42:08.099418180Z" level=info msg="Container fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:08.111830 containerd[1535]: time="2025-05-08T08:42:08.111787592Z" level=info msg="CreateContainer within sandbox \"3ef9f70c56203e4522d049a3b382b54c3af9b2fd10eb666f50cda61e8136216c\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417\"" May 8 08:42:08.113541 containerd[1535]: time="2025-05-08T08:42:08.112401960Z" level=info msg="StartContainer for \"fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417\"" May 8 08:42:08.113541 containerd[1535]: time="2025-05-08T08:42:08.113383038Z" level=info msg="connecting to shim fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417" address="unix:///run/containerd/s/882520e6e6a5f976a9ceb702161becd6d476d18e0735fc70c68657259823b077" protocol=ttrpc version=3 May 8 08:42:08.138130 systemd[1]: Started cri-containerd-fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417.scope - libcontainer container fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417. May 8 08:42:08.200311 containerd[1535]: time="2025-05-08T08:42:08.200171446Z" level=info msg="StartContainer for \"fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417\" returns successfully" May 8 08:42:08.338857 kubelet[2864]: E0508 08:42:08.338718 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:08.506002 kubelet[2864]: I0508 08:42:08.505923 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5f94b5b785-f825v" podStartSLOduration=1.190095782 podStartE2EDuration="4.505908396s" podCreationTimestamp="2025-05-08 08:42:04 +0000 UTC" firstStartedPulling="2025-05-08 08:42:04.751594631 +0000 UTC m=+22.503911955" lastFinishedPulling="2025-05-08 08:42:08.067407245 +0000 UTC m=+25.819724569" observedRunningTime="2025-05-08 08:42:08.505669533 +0000 UTC m=+26.257986867" watchObservedRunningTime="2025-05-08 08:42:08.505908396 +0000 UTC m=+26.258225720" May 8 08:42:08.509197 kubelet[2864]: E0508 08:42:08.509164 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.509197 kubelet[2864]: W0508 08:42:08.509183 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.509324 kubelet[2864]: E0508 08:42:08.509198 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:08.509472 kubelet[2864]: E0508 08:42:08.509450 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.509472 kubelet[2864]: W0508 08:42:08.509466 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.509545 kubelet[2864]: E0508 08:42:08.509476 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.509679 kubelet[2864]: E0508 08:42:08.509653 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.509679 kubelet[2864]: W0508 08:42:08.509668 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.509679 kubelet[2864]: E0508 08:42:08.509677 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.509855 kubelet[2864]: E0508 08:42:08.509830 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.509855 kubelet[2864]: W0508 08:42:08.509844 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.509855 kubelet[2864]: E0508 08:42:08.509853 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.510066 kubelet[2864]: E0508 08:42:08.510046 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.510066 kubelet[2864]: W0508 08:42:08.510060 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.510147 kubelet[2864]: E0508 08:42:08.510069 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.510242 kubelet[2864]: E0508 08:42:08.510222 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.510242 kubelet[2864]: W0508 08:42:08.510235 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.510420 kubelet[2864]: E0508 08:42:08.510244 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:08.510457 kubelet[2864]: E0508 08:42:08.510438 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.510457 kubelet[2864]: W0508 08:42:08.510447 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.510457 kubelet[2864]: E0508 08:42:08.510456 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.510667 kubelet[2864]: E0508 08:42:08.510648 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.510667 kubelet[2864]: W0508 08:42:08.510662 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.510744 kubelet[2864]: E0508 08:42:08.510672 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.510845 kubelet[2864]: E0508 08:42:08.510826 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.510845 kubelet[2864]: W0508 08:42:08.510841 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.510918 kubelet[2864]: E0508 08:42:08.510850 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.511033 kubelet[2864]: E0508 08:42:08.511014 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.511033 kubelet[2864]: W0508 08:42:08.511027 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.511117 kubelet[2864]: E0508 08:42:08.511037 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.511204 kubelet[2864]: E0508 08:42:08.511176 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.511204 kubelet[2864]: W0508 08:42:08.511190 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.511204 kubelet[2864]: E0508 08:42:08.511198 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:08.511688 kubelet[2864]: E0508 08:42:08.511380 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.511688 kubelet[2864]: W0508 08:42:08.511388 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.511688 kubelet[2864]: E0508 08:42:08.511397 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.511688 kubelet[2864]: E0508 08:42:08.511547 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.511688 kubelet[2864]: W0508 08:42:08.511555 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.511688 kubelet[2864]: E0508 08:42:08.511564 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.512233 kubelet[2864]: E0508 08:42:08.512210 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.512233 kubelet[2864]: W0508 08:42:08.512226 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.512309 kubelet[2864]: E0508 08:42:08.512236 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.512410 kubelet[2864]: E0508 08:42:08.512389 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.512410 kubelet[2864]: W0508 08:42:08.512403 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.512410 kubelet[2864]: E0508 08:42:08.512412 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.515336 kubelet[2864]: E0508 08:42:08.514741 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.515336 kubelet[2864]: W0508 08:42:08.515198 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.515336 kubelet[2864]: E0508 08:42:08.515216 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:08.515594 kubelet[2864]: E0508 08:42:08.515572 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.515653 kubelet[2864]: W0508 08:42:08.515600 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.515653 kubelet[2864]: E0508 08:42:08.515616 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.516461 kubelet[2864]: E0508 08:42:08.516429 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.516461 kubelet[2864]: W0508 08:42:08.516450 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.516530 kubelet[2864]: E0508 08:42:08.516472 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.516756 kubelet[2864]: E0508 08:42:08.516727 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.516756 kubelet[2864]: W0508 08:42:08.516744 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.516853 kubelet[2864]: E0508 08:42:08.516833 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.517046 kubelet[2864]: E0508 08:42:08.517001 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.517046 kubelet[2864]: W0508 08:42:08.517014 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.517437 kubelet[2864]: E0508 08:42:08.517404 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.517639 kubelet[2864]: E0508 08:42:08.517611 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.517639 kubelet[2864]: W0508 08:42:08.517628 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.517711 kubelet[2864]: E0508 08:42:08.517641 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:08.517918 kubelet[2864]: E0508 08:42:08.517890 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.517918 kubelet[2864]: W0508 08:42:08.517906 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.517918 kubelet[2864]: E0508 08:42:08.517919 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.518464 kubelet[2864]: E0508 08:42:08.518351 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.518464 kubelet[2864]: W0508 08:42:08.518366 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.518541 kubelet[2864]: E0508 08:42:08.518467 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.519216 kubelet[2864]: E0508 08:42:08.519177 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.519216 kubelet[2864]: W0508 08:42:08.519193 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.519290 kubelet[2864]: E0508 08:42:08.519278 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.520014 kubelet[2864]: E0508 08:42:08.519524 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.520014 kubelet[2864]: W0508 08:42:08.519540 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.520014 kubelet[2864]: E0508 08:42:08.519624 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.520014 kubelet[2864]: E0508 08:42:08.519759 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.520014 kubelet[2864]: W0508 08:42:08.519767 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.520014 kubelet[2864]: E0508 08:42:08.519827 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:08.520187 kubelet[2864]: E0508 08:42:08.520163 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.520187 kubelet[2864]: W0508 08:42:08.520173 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.520187 kubelet[2864]: E0508 08:42:08.520185 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.520384 kubelet[2864]: E0508 08:42:08.520362 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.520384 kubelet[2864]: W0508 08:42:08.520377 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.520451 kubelet[2864]: E0508 08:42:08.520386 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.520658 kubelet[2864]: E0508 08:42:08.520634 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.520658 kubelet[2864]: W0508 08:42:08.520649 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.520739 kubelet[2864]: E0508 08:42:08.520658 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.520841 kubelet[2864]: E0508 08:42:08.520822 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.520841 kubelet[2864]: W0508 08:42:08.520835 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.521034 kubelet[2864]: E0508 08:42:08.520859 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.521169 kubelet[2864]: E0508 08:42:08.521058 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.521169 kubelet[2864]: W0508 08:42:08.521071 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.521169 kubelet[2864]: E0508 08:42:08.521092 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:08.521562 kubelet[2864]: E0508 08:42:08.521511 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.521562 kubelet[2864]: W0508 08:42:08.521528 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.521562 kubelet[2864]: E0508 08:42:08.521537 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:08.521958 kubelet[2864]: E0508 08:42:08.521937 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:08.521958 kubelet[2864]: W0508 08:42:08.521952 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:08.522618 kubelet[2864]: E0508 08:42:08.521964 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.488454 kubelet[2864]: I0508 08:42:09.488297 2864 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 8 08:42:09.520852 kubelet[2864]: E0508 08:42:09.520606 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.520852 kubelet[2864]: W0508 08:42:09.520648 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.520852 kubelet[2864]: E0508 08:42:09.520687 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.521316 kubelet[2864]: E0508 08:42:09.521157 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.521316 kubelet[2864]: W0508 08:42:09.521180 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.521316 kubelet[2864]: E0508 08:42:09.521202 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.521709 kubelet[2864]: E0508 08:42:09.521531 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.521709 kubelet[2864]: W0508 08:42:09.521551 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.521709 kubelet[2864]: E0508 08:42:09.521572 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:09.522023 kubelet[2864]: E0508 08:42:09.521898 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.522023 kubelet[2864]: W0508 08:42:09.521918 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.522023 kubelet[2864]: E0508 08:42:09.521938 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.522374 kubelet[2864]: E0508 08:42:09.522357 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.522452 kubelet[2864]: W0508 08:42:09.522377 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.522452 kubelet[2864]: E0508 08:42:09.522399 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.522813 kubelet[2864]: E0508 08:42:09.522724 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.522813 kubelet[2864]: W0508 08:42:09.522751 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.522813 kubelet[2864]: E0508 08:42:09.522772 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.523314 kubelet[2864]: E0508 08:42:09.523135 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.523314 kubelet[2864]: W0508 08:42:09.523156 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.523314 kubelet[2864]: E0508 08:42:09.523178 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.523577 kubelet[2864]: E0508 08:42:09.523509 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.523577 kubelet[2864]: W0508 08:42:09.523528 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.523577 kubelet[2864]: E0508 08:42:09.523549 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:09.523952 kubelet[2864]: E0508 08:42:09.523913 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.523952 kubelet[2864]: W0508 08:42:09.523946 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.524184 kubelet[2864]: E0508 08:42:09.523966 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.524420 kubelet[2864]: E0508 08:42:09.524380 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.524420 kubelet[2864]: W0508 08:42:09.524408 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.524749 kubelet[2864]: E0508 08:42:09.524428 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.524833 kubelet[2864]: E0508 08:42:09.524749 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.524833 kubelet[2864]: W0508 08:42:09.524768 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.524833 kubelet[2864]: E0508 08:42:09.524789 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.525215 kubelet[2864]: E0508 08:42:09.525181 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.525215 kubelet[2864]: W0508 08:42:09.525209 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.525385 kubelet[2864]: E0508 08:42:09.525230 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.525742 kubelet[2864]: E0508 08:42:09.525708 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.525742 kubelet[2864]: W0508 08:42:09.525737 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.525929 kubelet[2864]: E0508 08:42:09.525758 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:09.526182 kubelet[2864]: E0508 08:42:09.526140 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.526182 kubelet[2864]: W0508 08:42:09.526171 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.526394 kubelet[2864]: E0508 08:42:09.526191 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.526577 kubelet[2864]: E0508 08:42:09.526545 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.526577 kubelet[2864]: W0508 08:42:09.526567 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.526730 kubelet[2864]: E0508 08:42:09.526588 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.527133 kubelet[2864]: E0508 08:42:09.527052 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.527133 kubelet[2864]: W0508 08:42:09.527078 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.527133 kubelet[2864]: E0508 08:42:09.527098 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.527623 kubelet[2864]: E0508 08:42:09.527581 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.527623 kubelet[2864]: W0508 08:42:09.527610 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.527780 kubelet[2864]: E0508 08:42:09.527643 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.528240 kubelet[2864]: E0508 08:42:09.528161 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.528240 kubelet[2864]: W0508 08:42:09.528190 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.528240 kubelet[2864]: E0508 08:42:09.528222 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:09.528636 kubelet[2864]: E0508 08:42:09.528597 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.528636 kubelet[2864]: W0508 08:42:09.528624 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.528796 kubelet[2864]: E0508 08:42:09.528761 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.529161 kubelet[2864]: E0508 08:42:09.529128 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.529161 kubelet[2864]: W0508 08:42:09.529159 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.529415 kubelet[2864]: E0508 08:42:09.529373 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.529652 kubelet[2864]: E0508 08:42:09.529616 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.529652 kubelet[2864]: W0508 08:42:09.529637 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.530029 kubelet[2864]: E0508 08:42:09.529862 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.530029 kubelet[2864]: E0508 08:42:09.529955 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.530029 kubelet[2864]: W0508 08:42:09.529975 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.530517 kubelet[2864]: E0508 08:42:09.530066 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.530517 kubelet[2864]: E0508 08:42:09.530463 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.530517 kubelet[2864]: W0508 08:42:09.530483 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.530858 kubelet[2864]: E0508 08:42:09.530538 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:09.531075 kubelet[2864]: E0508 08:42:09.530969 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.531075 kubelet[2864]: W0508 08:42:09.531071 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.531947 kubelet[2864]: E0508 08:42:09.531354 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.531947 kubelet[2864]: E0508 08:42:09.531402 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.531947 kubelet[2864]: W0508 08:42:09.531422 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.531947 kubelet[2864]: E0508 08:42:09.531723 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.531947 kubelet[2864]: W0508 08:42:09.531744 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.531947 kubelet[2864]: E0508 08:42:09.531767 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.532399 kubelet[2864]: E0508 08:42:09.532097 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.532399 kubelet[2864]: W0508 08:42:09.532116 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.532399 kubelet[2864]: E0508 08:42:09.532137 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.532579 kubelet[2864]: E0508 08:42:09.532464 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.532579 kubelet[2864]: W0508 08:42:09.532483 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.532579 kubelet[2864]: E0508 08:42:09.532503 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.533123 kubelet[2864]: E0508 08:42:09.531726 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:09.533661 kubelet[2864]: E0508 08:42:09.533372 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.533661 kubelet[2864]: W0508 08:42:09.533400 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.533661 kubelet[2864]: E0508 08:42:09.533442 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.534117 kubelet[2864]: E0508 08:42:09.534090 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.534368 kubelet[2864]: W0508 08:42:09.534336 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.534531 kubelet[2864]: E0508 08:42:09.534503 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.534892 kubelet[2864]: E0508 08:42:09.534840 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.534892 kubelet[2864]: W0508 08:42:09.534877 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.535108 kubelet[2864]: E0508 08:42:09.534903 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.535688 kubelet[2864]: E0508 08:42:09.535324 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.535688 kubelet[2864]: W0508 08:42:09.535346 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.535688 kubelet[2864]: E0508 08:42:09.535367 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 8 08:42:09.536090 kubelet[2864]: E0508 08:42:09.536042 2864 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 8 08:42:09.536313 kubelet[2864]: W0508 08:42:09.536285 2864 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 8 08:42:09.536473 kubelet[2864]: E0508 08:42:09.536448 2864 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 8 08:42:10.270332 containerd[1535]: time="2025-05-08T08:42:10.270243880Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:10.273325 containerd[1535]: time="2025-05-08T08:42:10.273275934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 8 08:42:10.275216 containerd[1535]: time="2025-05-08T08:42:10.275131096Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:10.278370 containerd[1535]: time="2025-05-08T08:42:10.278345392Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:10.279457 containerd[1535]: time="2025-05-08T08:42:10.279331527Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.211180172s" May 8 08:42:10.279457 containerd[1535]: time="2025-05-08T08:42:10.279380290Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 8 08:42:10.282558 containerd[1535]: time="2025-05-08T08:42:10.282067819Z" level=info msg="CreateContainer within sandbox \"5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 8 08:42:10.306111 containerd[1535]: time="2025-05-08T08:42:10.306068702Z" level=info msg="Container 392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:10.321589 containerd[1535]: time="2025-05-08T08:42:10.321547879Z" level=info msg="CreateContainer within sandbox \"5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8\"" May 8 08:42:10.322853 containerd[1535]: time="2025-05-08T08:42:10.322555966Z" level=info msg="StartContainer for \"392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8\"" May 8 08:42:10.325833 containerd[1535]: time="2025-05-08T08:42:10.325791723Z" level=info msg="connecting to shim 392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8" address="unix:///run/containerd/s/1082b432953f868a99114a80f333a3a520c2e13784f7bcdb704fd0b9b8440e96" protocol=ttrpc version=3 May 8 08:42:10.339278 kubelet[2864]: E0508 08:42:10.338480 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:10.352145 systemd[1]: Started 
cri-containerd-392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8.scope - libcontainer container 392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8. May 8 08:42:10.400656 containerd[1535]: time="2025-05-08T08:42:10.400625735Z" level=info msg="StartContainer for \"392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8\" returns successfully" May 8 08:42:10.412079 systemd[1]: cri-containerd-392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8.scope: Deactivated successfully. May 8 08:42:10.416523 containerd[1535]: time="2025-05-08T08:42:10.416229223Z" level=info msg="received exit event container_id:\"392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8\" id:\"392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8\" pid:3493 exited_at:{seconds:1746693730 nanos:415891280}" May 8 08:42:10.416523 containerd[1535]: time="2025-05-08T08:42:10.416489897Z" level=info msg="TaskExit event in podsandbox handler container_id:\"392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8\" id:\"392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8\" pid:3493 exited_at:{seconds:1746693730 nanos:415891280}" May 8 08:42:10.444933 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8-rootfs.mount: Deactivated successfully. May 8 08:42:11.514100 containerd[1535]: time="2025-05-08T08:42:11.513839434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 8 08:42:12.340033 kubelet[2864]: E0508 08:42:12.338887 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:14.341128 kubelet[2864]: E0508 08:42:14.340369 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:16.339095 kubelet[2864]: E0508 08:42:16.339053 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:17.659584 containerd[1535]: time="2025-05-08T08:42:17.659459735Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:17.661237 containerd[1535]: time="2025-05-08T08:42:17.661183405Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 8 08:42:17.662064 containerd[1535]: time="2025-05-08T08:42:17.661911494Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:17.664795 containerd[1535]: time="2025-05-08T08:42:17.664713695Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:17.666042 containerd[1535]: time="2025-05-08T08:42:17.665446923Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 6.151542363s" May 8 08:42:17.666042 containerd[1535]: time="2025-05-08T08:42:17.665483713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 8 08:42:17.669072 containerd[1535]: time="2025-05-08T08:42:17.668820652Z" level=info msg="CreateContainer within sandbox \"5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 8 08:42:17.684017 containerd[1535]: time="2025-05-08T08:42:17.683325567Z" level=info msg="Container d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:17.698959 containerd[1535]: time="2025-05-08T08:42:17.698902170Z" level=info msg="CreateContainer within sandbox \"5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543\"" May 8 08:42:17.700322 containerd[1535]: time="2025-05-08T08:42:17.700197287Z" level=info msg="StartContainer for \"d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543\"" May 8 08:42:17.704954 containerd[1535]: time="2025-05-08T08:42:17.704849722Z" level=info msg="connecting to shim d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543" address="unix:///run/containerd/s/1082b432953f868a99114a80f333a3a520c2e13784f7bcdb704fd0b9b8440e96" protocol=ttrpc version=3 May 8 08:42:17.737131 systemd[1]: Started cri-containerd-d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543.scope - libcontainer container d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543. May 8 08:42:17.782901 containerd[1535]: time="2025-05-08T08:42:17.782863096Z" level=info msg="StartContainer for \"d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543\" returns successfully" May 8 08:42:18.339119 kubelet[2864]: E0508 08:42:18.338133 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:19.081057 containerd[1535]: time="2025-05-08T08:42:19.080929292Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 8 08:42:19.087238 systemd[1]: cri-containerd-d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543.scope: Deactivated successfully. May 8 08:42:19.087812 systemd[1]: cri-containerd-d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543.scope: Consumed 783ms CPU time, 175.4M memory peak, 154M written to disk. 
May 8 08:42:19.093394 containerd[1535]: time="2025-05-08T08:42:19.093037909Z" level=info msg="received exit event container_id:\"d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543\" id:\"d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543\" pid:3551 exited_at:{seconds:1746693739 nanos:92151519}" May 8 08:42:19.094254 containerd[1535]: time="2025-05-08T08:42:19.094162897Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543\" id:\"d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543\" pid:3551 exited_at:{seconds:1746693739 nanos:92151519}" May 8 08:42:19.108048 kubelet[2864]: I0508 08:42:19.107251 2864 kubelet_node_status.go:497] "Fast updating node status as it just became ready" May 8 08:42:19.151547 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543-rootfs.mount: Deactivated successfully. May 8 08:42:19.239682 kubelet[2864]: I0508 08:42:19.239570 2864 topology_manager.go:215] "Topology Admit Handler" podUID="2aad45b6-81f3-4d71-a771-048450f82553" podNamespace="calico-system" podName="calico-kube-controllers-574667564d-5z62w" May 8 08:42:19.481032 kubelet[2864]: I0508 08:42:19.406521 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s482w\" (UniqueName: \"kubernetes.io/projected/2aad45b6-81f3-4d71-a771-048450f82553-kube-api-access-s482w\") pod \"calico-kube-controllers-574667564d-5z62w\" (UID: \"2aad45b6-81f3-4d71-a771-048450f82553\") " pod="calico-system/calico-kube-controllers-574667564d-5z62w" May 8 08:42:19.481032 kubelet[2864]: I0508 08:42:19.406618 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aad45b6-81f3-4d71-a771-048450f82553-tigera-ca-bundle\") pod \"calico-kube-controllers-574667564d-5z62w\" (UID: \"2aad45b6-81f3-4d71-a771-048450f82553\") " pod="calico-system/calico-kube-controllers-574667564d-5z62w" May 8 08:42:19.251589 systemd[1]: Created slice kubepods-besteffort-pod2aad45b6_81f3_4d71_a771_048450f82553.slice - libcontainer container kubepods-besteffort-pod2aad45b6_81f3_4d71_a771_048450f82553.slice. 
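Two files gate the networking errors in this log: the kubelet keeps reporting NetworkReady=false until a network configuration appears under /etc/cni/net.d (the install-cni container above populates it, typically a *.conflist alongside the calico-kubeconfig seen in the fsnotify event), and the RunPodSandbox failures further below fail on a stat of /var/lib/calico/nodename, a file the calico-node container creates once it is running. The Go sketch below is only an illustrative check of those two preconditions, with both paths taken verbatim from the log; it is not code from kubelet or Calico.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// The Calico CNI plugin stats this file; until calico-node has written it,
	// sandbox setup fails with the "no such file or directory" error seen below.
	if _, err := os.Stat("/var/lib/calico/nodename"); err != nil {
		fmt.Println("calico-node not ready:", err)
	}

	// The kubelet stays NetworkReady=false while this directory holds no
	// network configuration; the install-cni container writes it.
	configs, _ := filepath.Glob("/etc/cni/net.d/*.conflist")
	if len(configs) == 0 {
		fmt.Println("no CNI network config found in /etc/cni/net.d")
	} else {
		fmt.Println("CNI configs:", configs)
	}
}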
May 8 08:42:19.519581 kubelet[2864]: I0508 08:42:19.518576 2864 topology_manager.go:215] "Topology Admit Handler" podUID="0b760017-9088-41e6-b686-98d1a18df87b" podNamespace="kube-system" podName="coredns-7db6d8ff4d-qpbnv" May 8 08:42:19.521212 kubelet[2864]: I0508 08:42:19.520379 2864 topology_manager.go:215] "Topology Admit Handler" podUID="1b3ab818-593e-4e7b-812b-16baea71dd0e" podNamespace="calico-apiserver" podName="calico-apiserver-dbff64874-l66hm" May 8 08:42:19.532059 kubelet[2864]: I0508 08:42:19.531569 2864 topology_manager.go:215] "Topology Admit Handler" podUID="6ff13dc1-7d0b-4fee-a69f-8166ad87f47b" podNamespace="calico-apiserver" podName="calico-apiserver-dbff64874-f6gff" May 8 08:42:19.536706 kubelet[2864]: I0508 08:42:19.536192 2864 topology_manager.go:215] "Topology Admit Handler" podUID="45d8b150-711f-48ea-9546-c9d41ba961e3" podNamespace="kube-system" podName="coredns-7db6d8ff4d-64rw6" May 8 08:42:19.542605 kubelet[2864]: W0508 08:42:19.542428 2864 reflector.go:547] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-4327-0-0-w-78bcb828ec.novalocal" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4327-0-0-w-78bcb828ec.novalocal' and this object May 8 08:42:19.542857 kubelet[2864]: W0508 08:42:19.542686 2864 reflector.go:547] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4327-0-0-w-78bcb828ec.novalocal" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4327-0-0-w-78bcb828ec.novalocal' and this object May 8 08:42:19.542857 kubelet[2864]: E0508 08:42:19.542765 2864 reflector.go:150] object-"calico-apiserver"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4327-0-0-w-78bcb828ec.novalocal" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4327-0-0-w-78bcb828ec.novalocal' and this object May 8 08:42:19.545125 kubelet[2864]: W0508 08:42:19.542879 2864 reflector.go:547] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4327-0-0-w-78bcb828ec.novalocal" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4327-0-0-w-78bcb828ec.novalocal' and this object May 8 08:42:19.545125 kubelet[2864]: E0508 08:42:19.542909 2864 reflector.go:150] object-"calico-apiserver"/"calico-apiserver-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4327-0-0-w-78bcb828ec.novalocal" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4327-0-0-w-78bcb828ec.novalocal' and this object May 8 08:42:19.545125 kubelet[2864]: E0508 08:42:19.543093 2864 reflector.go:150] object-"kube-system"/"coredns": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-4327-0-0-w-78bcb828ec.novalocal" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4327-0-0-w-78bcb828ec.novalocal' and this object May 8 
08:42:19.560213 systemd[1]: Created slice kubepods-burstable-pod0b760017_9088_41e6_b686_98d1a18df87b.slice - libcontainer container kubepods-burstable-pod0b760017_9088_41e6_b686_98d1a18df87b.slice. May 8 08:42:19.594478 systemd[1]: Created slice kubepods-besteffort-pod1b3ab818_593e_4e7b_812b_16baea71dd0e.slice - libcontainer container kubepods-besteffort-pod1b3ab818_593e_4e7b_812b_16baea71dd0e.slice. May 8 08:42:19.604064 systemd[1]: Created slice kubepods-burstable-pod45d8b150_711f_48ea_9546_c9d41ba961e3.slice - libcontainer container kubepods-burstable-pod45d8b150_711f_48ea_9546_c9d41ba961e3.slice. May 8 08:42:19.614931 systemd[1]: Created slice kubepods-besteffort-pod6ff13dc1_7d0b_4fee_a69f_8166ad87f47b.slice - libcontainer container kubepods-besteffort-pod6ff13dc1_7d0b_4fee_a69f_8166ad87f47b.slice. May 8 08:42:19.709322 kubelet[2864]: I0508 08:42:19.709111 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crj68\" (UniqueName: \"kubernetes.io/projected/6ff13dc1-7d0b-4fee-a69f-8166ad87f47b-kube-api-access-crj68\") pod \"calico-apiserver-dbff64874-f6gff\" (UID: \"6ff13dc1-7d0b-4fee-a69f-8166ad87f47b\") " pod="calico-apiserver/calico-apiserver-dbff64874-f6gff" May 8 08:42:19.709322 kubelet[2864]: I0508 08:42:19.709164 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b760017-9088-41e6-b686-98d1a18df87b-config-volume\") pod \"coredns-7db6d8ff4d-qpbnv\" (UID: \"0b760017-9088-41e6-b686-98d1a18df87b\") " pod="kube-system/coredns-7db6d8ff4d-qpbnv" May 8 08:42:19.709322 kubelet[2864]: I0508 08:42:19.709186 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1b3ab818-593e-4e7b-812b-16baea71dd0e-calico-apiserver-certs\") pod \"calico-apiserver-dbff64874-l66hm\" (UID: \"1b3ab818-593e-4e7b-812b-16baea71dd0e\") " pod="calico-apiserver/calico-apiserver-dbff64874-l66hm" May 8 08:42:19.709322 kubelet[2864]: I0508 08:42:19.709284 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknbx\" (UniqueName: \"kubernetes.io/projected/1b3ab818-593e-4e7b-812b-16baea71dd0e-kube-api-access-fknbx\") pod \"calico-apiserver-dbff64874-l66hm\" (UID: \"1b3ab818-593e-4e7b-812b-16baea71dd0e\") " pod="calico-apiserver/calico-apiserver-dbff64874-l66hm" May 8 08:42:19.709322 kubelet[2864]: I0508 08:42:19.709308 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45d8b150-711f-48ea-9546-c9d41ba961e3-config-volume\") pod \"coredns-7db6d8ff4d-64rw6\" (UID: \"45d8b150-711f-48ea-9546-c9d41ba961e3\") " pod="kube-system/coredns-7db6d8ff4d-64rw6" May 8 08:42:19.709971 kubelet[2864]: I0508 08:42:19.709357 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2clw7\" (UniqueName: \"kubernetes.io/projected/45d8b150-711f-48ea-9546-c9d41ba961e3-kube-api-access-2clw7\") pod \"coredns-7db6d8ff4d-64rw6\" (UID: \"45d8b150-711f-48ea-9546-c9d41ba961e3\") " pod="kube-system/coredns-7db6d8ff4d-64rw6" May 8 08:42:19.709971 kubelet[2864]: I0508 08:42:19.709379 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfsbl\" (UniqueName: 
\"kubernetes.io/projected/0b760017-9088-41e6-b686-98d1a18df87b-kube-api-access-lfsbl\") pod \"coredns-7db6d8ff4d-qpbnv\" (UID: \"0b760017-9088-41e6-b686-98d1a18df87b\") " pod="kube-system/coredns-7db6d8ff4d-qpbnv" May 8 08:42:19.709971 kubelet[2864]: I0508 08:42:19.709411 2864 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6ff13dc1-7d0b-4fee-a69f-8166ad87f47b-calico-apiserver-certs\") pod \"calico-apiserver-dbff64874-f6gff\" (UID: \"6ff13dc1-7d0b-4fee-a69f-8166ad87f47b\") " pod="calico-apiserver/calico-apiserver-dbff64874-f6gff" May 8 08:42:19.784835 containerd[1535]: time="2025-05-08T08:42:19.784660165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-574667564d-5z62w,Uid:2aad45b6-81f3-4d71-a771-048450f82553,Namespace:calico-system,Attempt:0,}" May 8 08:42:20.247310 containerd[1535]: time="2025-05-08T08:42:20.247248008Z" level=error msg="Failed to destroy network for sandbox \"7778182d922e57d4c4fb9e31881e5516b79b48bb17b5d7e9d6c6e7e364e3d4d6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.249477 systemd[1]: run-netns-cni\x2d618e52f0\x2d89ea\x2df788\x2d07c6\x2d107cbb5ae93c.mount: Deactivated successfully. May 8 08:42:20.251500 containerd[1535]: time="2025-05-08T08:42:20.251019909Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-574667564d-5z62w,Uid:2aad45b6-81f3-4d71-a771-048450f82553,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7778182d922e57d4c4fb9e31881e5516b79b48bb17b5d7e9d6c6e7e364e3d4d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.251611 kubelet[2864]: E0508 08:42:20.251297 2864 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7778182d922e57d4c4fb9e31881e5516b79b48bb17b5d7e9d6c6e7e364e3d4d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.252037 kubelet[2864]: E0508 08:42:20.251458 2864 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7778182d922e57d4c4fb9e31881e5516b79b48bb17b5d7e9d6c6e7e364e3d4d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-574667564d-5z62w" May 8 08:42:20.252037 kubelet[2864]: E0508 08:42:20.251693 2864 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7778182d922e57d4c4fb9e31881e5516b79b48bb17b5d7e9d6c6e7e364e3d4d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-574667564d-5z62w" May 8 08:42:20.252037 kubelet[2864]: E0508 08:42:20.251754 2864 
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-574667564d-5z62w_calico-system(2aad45b6-81f3-4d71-a771-048450f82553)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-574667564d-5z62w_calico-system(2aad45b6-81f3-4d71-a771-048450f82553)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7778182d922e57d4c4fb9e31881e5516b79b48bb17b5d7e9d6c6e7e364e3d4d6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-574667564d-5z62w" podUID="2aad45b6-81f3-4d71-a771-048450f82553" May 8 08:42:20.350430 systemd[1]: Created slice kubepods-besteffort-pod55738467_875a_4dbf_84b4_c2a4f16cd05d.slice - libcontainer container kubepods-besteffort-pod55738467_875a_4dbf_84b4_c2a4f16cd05d.slice. May 8 08:42:20.355179 containerd[1535]: time="2025-05-08T08:42:20.355074378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hfrk7,Uid:55738467-875a-4dbf-84b4-c2a4f16cd05d,Namespace:calico-system,Attempt:0,}" May 8 08:42:20.484023 containerd[1535]: time="2025-05-08T08:42:20.482301553Z" level=error msg="Failed to destroy network for sandbox \"0608f033accb6d7acb7cc6b4f7bec39f7069324516416d1587d8dbc9794f9b45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.484696 systemd[1]: run-netns-cni\x2d1262cc8b\x2dfc50\x2d3d56\x2def53\x2d4afbf7372e9a.mount: Deactivated successfully. May 8 08:42:20.485858 containerd[1535]: time="2025-05-08T08:42:20.485766787Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hfrk7,Uid:55738467-875a-4dbf-84b4-c2a4f16cd05d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0608f033accb6d7acb7cc6b4f7bec39f7069324516416d1587d8dbc9794f9b45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.486567 kubelet[2864]: E0508 08:42:20.486527 2864 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0608f033accb6d7acb7cc6b4f7bec39f7069324516416d1587d8dbc9794f9b45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.486839 kubelet[2864]: E0508 08:42:20.486584 2864 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0608f033accb6d7acb7cc6b4f7bec39f7069324516416d1587d8dbc9794f9b45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hfrk7" May 8 08:42:20.486839 kubelet[2864]: E0508 08:42:20.486609 2864 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0608f033accb6d7acb7cc6b4f7bec39f7069324516416d1587d8dbc9794f9b45\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hfrk7" May 8 08:42:20.486839 kubelet[2864]: E0508 08:42:20.486654 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hfrk7_calico-system(55738467-875a-4dbf-84b4-c2a4f16cd05d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hfrk7_calico-system(55738467-875a-4dbf-84b4-c2a4f16cd05d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0608f033accb6d7acb7cc6b4f7bec39f7069324516416d1587d8dbc9794f9b45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hfrk7" podUID="55738467-875a-4dbf-84b4-c2a4f16cd05d" May 8 08:42:20.503505 containerd[1535]: time="2025-05-08T08:42:20.503152818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dbff64874-l66hm,Uid:1b3ab818-593e-4e7b-812b-16baea71dd0e,Namespace:calico-apiserver,Attempt:0,}" May 8 08:42:20.518537 containerd[1535]: time="2025-05-08T08:42:20.518340505Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dbff64874-f6gff,Uid:6ff13dc1-7d0b-4fee-a69f-8166ad87f47b,Namespace:calico-apiserver,Attempt:0,}" May 8 08:42:20.559223 containerd[1535]: time="2025-05-08T08:42:20.558476753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 8 08:42:20.604240 containerd[1535]: time="2025-05-08T08:42:20.604199321Z" level=error msg="Failed to destroy network for sandbox \"409802fc592ac46aec3033986648de82b6642f9df0ca531575ebafa1c4f7d3e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.611152 containerd[1535]: time="2025-05-08T08:42:20.611082086Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dbff64874-l66hm,Uid:1b3ab818-593e-4e7b-812b-16baea71dd0e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"409802fc592ac46aec3033986648de82b6642f9df0ca531575ebafa1c4f7d3e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.611740 kubelet[2864]: E0508 08:42:20.611582 2864 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"409802fc592ac46aec3033986648de82b6642f9df0ca531575ebafa1c4f7d3e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.611740 kubelet[2864]: E0508 08:42:20.611660 2864 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"409802fc592ac46aec3033986648de82b6642f9df0ca531575ebafa1c4f7d3e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-dbff64874-l66hm" May 8 08:42:20.611740 kubelet[2864]: E0508 08:42:20.611683 2864 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"409802fc592ac46aec3033986648de82b6642f9df0ca531575ebafa1c4f7d3e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dbff64874-l66hm" May 8 08:42:20.612042 kubelet[2864]: E0508 08:42:20.611945 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-dbff64874-l66hm_calico-apiserver(1b3ab818-593e-4e7b-812b-16baea71dd0e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-dbff64874-l66hm_calico-apiserver(1b3ab818-593e-4e7b-812b-16baea71dd0e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"409802fc592ac46aec3033986648de82b6642f9df0ca531575ebafa1c4f7d3e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-dbff64874-l66hm" podUID="1b3ab818-593e-4e7b-812b-16baea71dd0e" May 8 08:42:20.623484 containerd[1535]: time="2025-05-08T08:42:20.623422003Z" level=error msg="Failed to destroy network for sandbox \"8f80d57fba45975ccd14c3dec04129c17dcbbacc2c00f4a06e8a35d35874fd43\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.625425 containerd[1535]: time="2025-05-08T08:42:20.625375659Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dbff64874-f6gff,Uid:6ff13dc1-7d0b-4fee-a69f-8166ad87f47b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f80d57fba45975ccd14c3dec04129c17dcbbacc2c00f4a06e8a35d35874fd43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.625633 kubelet[2864]: E0508 08:42:20.625595 2864 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f80d57fba45975ccd14c3dec04129c17dcbbacc2c00f4a06e8a35d35874fd43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:20.625696 kubelet[2864]: E0508 08:42:20.625661 2864 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f80d57fba45975ccd14c3dec04129c17dcbbacc2c00f4a06e8a35d35874fd43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dbff64874-f6gff" May 8 08:42:20.625696 kubelet[2864]: E0508 08:42:20.625687 2864 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8f80d57fba45975ccd14c3dec04129c17dcbbacc2c00f4a06e8a35d35874fd43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dbff64874-f6gff" May 8 08:42:20.625772 kubelet[2864]: E0508 08:42:20.625734 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-dbff64874-f6gff_calico-apiserver(6ff13dc1-7d0b-4fee-a69f-8166ad87f47b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-dbff64874-f6gff_calico-apiserver(6ff13dc1-7d0b-4fee-a69f-8166ad87f47b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8f80d57fba45975ccd14c3dec04129c17dcbbacc2c00f4a06e8a35d35874fd43\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-dbff64874-f6gff" podUID="6ff13dc1-7d0b-4fee-a69f-8166ad87f47b" May 8 08:42:20.811338 kubelet[2864]: E0508 08:42:20.810857 2864 configmap.go:199] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition May 8 08:42:20.811338 kubelet[2864]: E0508 08:42:20.811049 2864 configmap.go:199] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition May 8 08:42:20.811338 kubelet[2864]: E0508 08:42:20.811099 2864 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0b760017-9088-41e6-b686-98d1a18df87b-config-volume podName:0b760017-9088-41e6-b686-98d1a18df87b nodeName:}" failed. No retries permitted until 2025-05-08 08:42:21.311060447 +0000 UTC m=+39.063377831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/0b760017-9088-41e6-b686-98d1a18df87b-config-volume") pod "coredns-7db6d8ff4d-qpbnv" (UID: "0b760017-9088-41e6-b686-98d1a18df87b") : failed to sync configmap cache: timed out waiting for the condition May 8 08:42:20.811338 kubelet[2864]: E0508 08:42:20.811173 2864 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/45d8b150-711f-48ea-9546-c9d41ba961e3-config-volume podName:45d8b150-711f-48ea-9546-c9d41ba961e3 nodeName:}" failed. No retries permitted until 2025-05-08 08:42:21.311135701 +0000 UTC m=+39.063453075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/45d8b150-711f-48ea-9546-c9d41ba961e3-config-volume") pod "coredns-7db6d8ff4d-64rw6" (UID: "45d8b150-711f-48ea-9546-c9d41ba961e3") : failed to sync configmap cache: timed out waiting for the condition May 8 08:42:21.374158 containerd[1535]: time="2025-05-08T08:42:21.374097008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qpbnv,Uid:0b760017-9088-41e6-b686-98d1a18df87b,Namespace:kube-system,Attempt:0,}" May 8 08:42:21.413129 containerd[1535]: time="2025-05-08T08:42:21.412901636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-64rw6,Uid:45d8b150-711f-48ea-9546-c9d41ba961e3,Namespace:kube-system,Attempt:0,}" May 8 08:42:21.484059 containerd[1535]: time="2025-05-08T08:42:21.483955071Z" level=error msg="Failed to destroy network for sandbox \"090008990bbb7e9c67246e230f09d8b6bfc25ebff65b999897d97cbc70845c4f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:21.486514 systemd[1]: run-netns-cni\x2d08b67a76\x2df5a4\x2d9f26\x2def58\x2d433e5ab39aed.mount: Deactivated successfully. May 8 08:42:21.489715 containerd[1535]: time="2025-05-08T08:42:21.489628121Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qpbnv,Uid:0b760017-9088-41e6-b686-98d1a18df87b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"090008990bbb7e9c67246e230f09d8b6bfc25ebff65b999897d97cbc70845c4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:21.490699 kubelet[2864]: E0508 08:42:21.490657 2864 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"090008990bbb7e9c67246e230f09d8b6bfc25ebff65b999897d97cbc70845c4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:21.491067 kubelet[2864]: E0508 08:42:21.490719 2864 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"090008990bbb7e9c67246e230f09d8b6bfc25ebff65b999897d97cbc70845c4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-qpbnv" May 8 08:42:21.491067 kubelet[2864]: E0508 08:42:21.490742 2864 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"090008990bbb7e9c67246e230f09d8b6bfc25ebff65b999897d97cbc70845c4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-qpbnv" May 8 08:42:21.491067 kubelet[2864]: E0508 08:42:21.490784 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-qpbnv_kube-system(0b760017-9088-41e6-b686-98d1a18df87b)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-qpbnv_kube-system(0b760017-9088-41e6-b686-98d1a18df87b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"090008990bbb7e9c67246e230f09d8b6bfc25ebff65b999897d97cbc70845c4f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-qpbnv" podUID="0b760017-9088-41e6-b686-98d1a18df87b" May 8 08:42:21.522556 containerd[1535]: time="2025-05-08T08:42:21.522446939Z" level=error msg="Failed to destroy network for sandbox \"a6a522398a23337e5af8f68ec1f9146bd84c4425f555655481e9a32216a4728b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:21.525769 containerd[1535]: time="2025-05-08T08:42:21.525646439Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-64rw6,Uid:45d8b150-711f-48ea-9546-c9d41ba961e3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6a522398a23337e5af8f68ec1f9146bd84c4425f555655481e9a32216a4728b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:21.525964 kubelet[2864]: E0508 08:42:21.525920 2864 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6a522398a23337e5af8f68ec1f9146bd84c4425f555655481e9a32216a4728b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:21.526205 kubelet[2864]: E0508 08:42:21.526016 2864 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6a522398a23337e5af8f68ec1f9146bd84c4425f555655481e9a32216a4728b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-64rw6" May 8 08:42:21.526205 kubelet[2864]: E0508 08:42:21.526047 2864 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6a522398a23337e5af8f68ec1f9146bd84c4425f555655481e9a32216a4728b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-64rw6" May 8 08:42:21.526205 kubelet[2864]: E0508 08:42:21.526116 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-64rw6_kube-system(45d8b150-711f-48ea-9546-c9d41ba961e3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-64rw6_kube-system(45d8b150-711f-48ea-9546-c9d41ba961e3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a6a522398a23337e5af8f68ec1f9146bd84c4425f555655481e9a32216a4728b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-64rw6" podUID="45d8b150-711f-48ea-9546-c9d41ba961e3" May 8 08:42:21.526936 systemd[1]: run-netns-cni\x2dd07b8d41\x2d693e\x2d5991\x2df9b9\x2d51212326cc00.mount: Deactivated successfully. May 8 08:42:30.172068 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3660945608.mount: Deactivated successfully. May 8 08:42:31.034278 containerd[1535]: time="2025-05-08T08:42:31.033616163Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:31.036765 containerd[1535]: time="2025-05-08T08:42:31.035779363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 8 08:42:31.038469 containerd[1535]: time="2025-05-08T08:42:31.038393294Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:31.043526 containerd[1535]: time="2025-05-08T08:42:31.043459428Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:31.045019 containerd[1535]: time="2025-05-08T08:42:31.044897454Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 10.486370223s" May 8 08:42:31.045190 containerd[1535]: time="2025-05-08T08:42:31.044975983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 8 08:42:31.081332 containerd[1535]: time="2025-05-08T08:42:31.080207117Z" level=info msg="CreateContainer within sandbox \"5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 8 08:42:31.116176 containerd[1535]: time="2025-05-08T08:42:31.115427721Z" level=info msg="Container f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:31.147900 containerd[1535]: time="2025-05-08T08:42:31.147782753Z" level=info msg="CreateContainer within sandbox \"5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\"" May 8 08:42:31.150690 containerd[1535]: time="2025-05-08T08:42:31.149195510Z" level=info msg="StartContainer for \"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\"" May 8 08:42:31.158066 containerd[1535]: time="2025-05-08T08:42:31.157609297Z" level=info msg="connecting to shim f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475" address="unix:///run/containerd/s/1082b432953f868a99114a80f333a3a520c2e13784f7bcdb704fd0b9b8440e96" protocol=ttrpc version=3 May 8 08:42:31.202136 systemd[1]: Started cri-containerd-f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475.scope - libcontainer container 
f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475. May 8 08:42:31.283057 containerd[1535]: time="2025-05-08T08:42:31.283014598Z" level=info msg="StartContainer for \"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" returns successfully" May 8 08:42:31.339490 containerd[1535]: time="2025-05-08T08:42:31.338726308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-574667564d-5z62w,Uid:2aad45b6-81f3-4d71-a771-048450f82553,Namespace:calico-system,Attempt:0,}" May 8 08:42:31.385376 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 8 08:42:31.385480 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 8 08:42:31.424765 containerd[1535]: time="2025-05-08T08:42:31.424719722Z" level=error msg="Failed to destroy network for sandbox \"594583dba30b61bace57c936a8fc72991dda0d421378148d9ee36b5b18b8475b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:31.427031 containerd[1535]: time="2025-05-08T08:42:31.426748386Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-574667564d-5z62w,Uid:2aad45b6-81f3-4d71-a771-048450f82553,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"594583dba30b61bace57c936a8fc72991dda0d421378148d9ee36b5b18b8475b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:31.428242 kubelet[2864]: E0508 08:42:31.428038 2864 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"594583dba30b61bace57c936a8fc72991dda0d421378148d9ee36b5b18b8475b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 8 08:42:31.428242 kubelet[2864]: E0508 08:42:31.428122 2864 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"594583dba30b61bace57c936a8fc72991dda0d421378148d9ee36b5b18b8475b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-574667564d-5z62w" May 8 08:42:31.428242 kubelet[2864]: E0508 08:42:31.428149 2864 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"594583dba30b61bace57c936a8fc72991dda0d421378148d9ee36b5b18b8475b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-574667564d-5z62w" May 8 08:42:31.428743 kubelet[2864]: E0508 08:42:31.428204 2864 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-574667564d-5z62w_calico-system(2aad45b6-81f3-4d71-a771-048450f82553)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-574667564d-5z62w_calico-system(2aad45b6-81f3-4d71-a771-048450f82553)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"594583dba30b61bace57c936a8fc72991dda0d421378148d9ee36b5b18b8475b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-574667564d-5z62w" podUID="2aad45b6-81f3-4d71-a771-048450f82553" May 8 08:42:31.429846 systemd[1]: run-netns-cni\x2d3b292c41\x2dcfef\x2d9d52\x2d1b74\x2dfd1a7eb747b2.mount: Deactivated successfully. May 8 08:42:31.720039 containerd[1535]: time="2025-05-08T08:42:31.719835294Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"4830fef33213404d455e7c72e0688f8df05f9def75d37a5ef18d667852209be2\" pid:3862 exit_status:1 exited_at:{seconds:1746693751 nanos:719546923}" May 8 08:42:32.340188 containerd[1535]: time="2025-05-08T08:42:32.340105599Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hfrk7,Uid:55738467-875a-4dbf-84b4-c2a4f16cd05d,Namespace:calico-system,Attempt:0,}" May 8 08:42:32.559907 systemd-networkd[1424]: cali07d5f36c4f2: Link UP May 8 08:42:32.560153 systemd-networkd[1424]: cali07d5f36c4f2: Gained carrier May 8 08:42:32.594779 kubelet[2864]: I0508 08:42:32.594167 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-t5jrg" podStartSLOduration=2.388972924 podStartE2EDuration="28.594147564s" podCreationTimestamp="2025-05-08 08:42:04 +0000 UTC" firstStartedPulling="2025-05-08 08:42:04.842722462 +0000 UTC m=+22.595039796" lastFinishedPulling="2025-05-08 08:42:31.047897062 +0000 UTC m=+48.800214436" observedRunningTime="2025-05-08 08:42:31.64267171 +0000 UTC m=+49.394989034" watchObservedRunningTime="2025-05-08 08:42:32.594147564 +0000 UTC m=+50.346464898" May 8 08:42:32.599432 containerd[1535]: 2025-05-08 08:42:32.373 [INFO][3885] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 8 08:42:32.599432 containerd[1535]: 2025-05-08 08:42:32.414 [INFO][3885] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0 csi-node-driver- calico-system 55738467-875a-4dbf-84b4-c2a4f16cd05d 594 0 2025-05-08 08:42:04 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b7b4b9d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4327-0-0-w-78bcb828ec.novalocal csi-node-driver-hfrk7 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali07d5f36c4f2 [] []}} ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Namespace="calico-system" Pod="csi-node-driver-hfrk7" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-" May 8 08:42:32.599432 containerd[1535]: 2025-05-08 08:42:32.414 [INFO][3885] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Namespace="calico-system" Pod="csi-node-driver-hfrk7" 
WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" May 8 08:42:32.599432 containerd[1535]: 2025-05-08 08:42:32.473 [INFO][3898] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" HandleID="k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.492 [INFO][3898] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" HandleID="k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051bd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4327-0-0-w-78bcb828ec.novalocal", "pod":"csi-node-driver-hfrk7", "timestamp":"2025-05-08 08:42:32.473145527 +0000 UTC"}, Hostname:"ci-4327-0-0-w-78bcb828ec.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.492 [INFO][3898] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.492 [INFO][3898] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.492 [INFO][3898] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4327-0-0-w-78bcb828ec.novalocal' May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.496 [INFO][3898] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.503 [INFO][3898] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.510 [INFO][3898] ipam/ipam.go 489: Trying affinity for 192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.514 [INFO][3898] ipam/ipam.go 155: Attempting to load block cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.599690 containerd[1535]: 2025-05-08 08:42:32.517 [INFO][3898] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.600005 containerd[1535]: 2025-05-08 08:42:32.517 [INFO][3898] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.79.128/26 handle="k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.600005 containerd[1535]: 2025-05-08 08:42:32.520 [INFO][3898] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd May 8 08:42:32.600005 containerd[1535]: 2025-05-08 08:42:32.529 [INFO][3898] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.79.128/26 
handle="k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.600005 containerd[1535]: 2025-05-08 08:42:32.544 [INFO][3898] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.79.129/26] block=192.168.79.128/26 handle="k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.600005 containerd[1535]: 2025-05-08 08:42:32.544 [INFO][3898] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.79.129/26] handle="k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:32.600005 containerd[1535]: 2025-05-08 08:42:32.544 [INFO][3898] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 8 08:42:32.600005 containerd[1535]: 2025-05-08 08:42:32.544 [INFO][3898] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.79.129/26] IPv6=[] ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" HandleID="k8s-pod-network.cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" May 8 08:42:32.600210 containerd[1535]: 2025-05-08 08:42:32.549 [INFO][3885] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Namespace="calico-system" Pod="csi-node-driver-hfrk7" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"55738467-875a-4dbf-84b4-c2a4f16cd05d", ResourceVersion:"594", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 42, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"", Pod:"csi-node-driver-hfrk7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.79.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali07d5f36c4f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:32.600290 containerd[1535]: 2025-05-08 08:42:32.549 [INFO][3885] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.79.129/32] ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Namespace="calico-system" Pod="csi-node-driver-hfrk7" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" May 8 08:42:32.600290 containerd[1535]: 2025-05-08 08:42:32.550 [INFO][3885] 
cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali07d5f36c4f2 ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Namespace="calico-system" Pod="csi-node-driver-hfrk7" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" May 8 08:42:32.600290 containerd[1535]: 2025-05-08 08:42:32.560 [INFO][3885] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Namespace="calico-system" Pod="csi-node-driver-hfrk7" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" May 8 08:42:32.600379 containerd[1535]: 2025-05-08 08:42:32.561 [INFO][3885] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Namespace="calico-system" Pod="csi-node-driver-hfrk7" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"55738467-875a-4dbf-84b4-c2a4f16cd05d", ResourceVersion:"594", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 42, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd", Pod:"csi-node-driver-hfrk7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.79.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali07d5f36c4f2", MAC:"92:b9:b2:7a:16:d9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:32.600448 containerd[1535]: 2025-05-08 08:42:32.594 [INFO][3885] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" Namespace="calico-system" Pod="csi-node-driver-hfrk7" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-csi--node--driver--hfrk7-eth0" May 8 08:42:32.677553 containerd[1535]: time="2025-05-08T08:42:32.677499175Z" level=info msg="connecting to shim cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd" address="unix:///run/containerd/s/1c977e0cd254984620c8e96cdbe5b9bd4ee2a3fc01ac0a4e20a6fef18dbabc80" namespace=k8s.io protocol=ttrpc version=3 May 8 08:42:32.712163 systemd[1]: Started cri-containerd-cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd.scope - libcontainer container cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd. 
May 8 08:42:32.772460 containerd[1535]: time="2025-05-08T08:42:32.772401556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hfrk7,Uid:55738467-875a-4dbf-84b4-c2a4f16cd05d,Namespace:calico-system,Attempt:0,} returns sandbox id \"cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd\"" May 8 08:42:32.775193 containerd[1535]: time="2025-05-08T08:42:32.775147508Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 8 08:42:32.973902 containerd[1535]: time="2025-05-08T08:42:32.973751309Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"62f2bf5380a9c04943d0f638b72f4996379ddf4cdaca28e06a2b156fcebcb2ea\" pid:3933 exit_status:1 exited_at:{seconds:1746693752 nanos:973147306}" May 8 08:42:33.338789 containerd[1535]: time="2025-05-08T08:42:33.338650156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dbff64874-f6gff,Uid:6ff13dc1-7d0b-4fee-a69f-8166ad87f47b,Namespace:calico-apiserver,Attempt:0,}" May 8 08:42:33.550588 systemd-networkd[1424]: cali2d9e31faec2: Link UP May 8 08:42:33.551859 systemd-networkd[1424]: cali2d9e31faec2: Gained carrier May 8 08:42:33.568245 containerd[1535]: 2025-05-08 08:42:33.401 [INFO][4079] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 8 08:42:33.568245 containerd[1535]: 2025-05-08 08:42:33.439 [INFO][4079] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0 calico-apiserver-dbff64874- calico-apiserver 6ff13dc1-7d0b-4fee-a69f-8166ad87f47b 687 0 2025-05-08 08:42:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:dbff64874 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4327-0-0-w-78bcb828ec.novalocal calico-apiserver-dbff64874-f6gff eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2d9e31faec2 [] []}} ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-f6gff" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-" May 8 08:42:33.568245 containerd[1535]: 2025-05-08 08:42:33.439 [INFO][4079] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-f6gff" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" May 8 08:42:33.568245 containerd[1535]: 2025-05-08 08:42:33.501 [INFO][4090] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" HandleID="k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.512 [INFO][4090] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" HandleID="k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" 
Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000287310), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4327-0-0-w-78bcb828ec.novalocal", "pod":"calico-apiserver-dbff64874-f6gff", "timestamp":"2025-05-08 08:42:33.501365463 +0000 UTC"}, Hostname:"ci-4327-0-0-w-78bcb828ec.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.512 [INFO][4090] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.513 [INFO][4090] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.513 [INFO][4090] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4327-0-0-w-78bcb828ec.novalocal' May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.515 [INFO][4090] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.520 [INFO][4090] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.526 [INFO][4090] ipam/ipam.go 489: Trying affinity for 192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.529 [INFO][4090] ipam/ipam.go 155: Attempting to load block cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.568858 containerd[1535]: 2025-05-08 08:42:33.531 [INFO][4090] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.569175 containerd[1535]: 2025-05-08 08:42:33.532 [INFO][4090] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.79.128/26 handle="k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.569175 containerd[1535]: 2025-05-08 08:42:33.533 [INFO][4090] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a May 8 08:42:33.569175 containerd[1535]: 2025-05-08 08:42:33.539 [INFO][4090] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.79.128/26 handle="k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.569175 containerd[1535]: 2025-05-08 08:42:33.545 [INFO][4090] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.79.130/26] block=192.168.79.128/26 handle="k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.569175 containerd[1535]: 2025-05-08 08:42:33.545 [INFO][4090] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.79.130/26] handle="k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:33.569175 containerd[1535]: 2025-05-08 08:42:33.545 [INFO][4090] ipam/ipam_plugin.go 374: Released 
host-wide IPAM lock. May 8 08:42:33.569175 containerd[1535]: 2025-05-08 08:42:33.545 [INFO][4090] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.79.130/26] IPv6=[] ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" HandleID="k8s-pod-network.8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" May 8 08:42:33.569645 containerd[1535]: 2025-05-08 08:42:33.548 [INFO][4079] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-f6gff" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0", GenerateName:"calico-apiserver-dbff64874-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ff13dc1-7d0b-4fee-a69f-8166ad87f47b", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 42, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dbff64874", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"", Pod:"calico-apiserver-dbff64874-f6gff", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2d9e31faec2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:33.569741 containerd[1535]: 2025-05-08 08:42:33.548 [INFO][4079] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.79.130/32] ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-f6gff" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" May 8 08:42:33.569741 containerd[1535]: 2025-05-08 08:42:33.548 [INFO][4079] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2d9e31faec2 ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-f6gff" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" May 8 08:42:33.569741 containerd[1535]: 2025-05-08 08:42:33.552 [INFO][4079] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-f6gff" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" May 8 08:42:33.569859 
containerd[1535]: 2025-05-08 08:42:33.552 [INFO][4079] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-f6gff" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0", GenerateName:"calico-apiserver-dbff64874-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ff13dc1-7d0b-4fee-a69f-8166ad87f47b", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 42, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dbff64874", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a", Pod:"calico-apiserver-dbff64874-f6gff", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2d9e31faec2", MAC:"1e:8b:d8:8b:f8:2c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:33.570392 containerd[1535]: 2025-05-08 08:42:33.565 [INFO][4079] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-f6gff" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--f6gff-eth0" May 8 08:42:33.606402 kubelet[2864]: I0508 08:42:33.606250 2864 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 8 08:42:33.675685 containerd[1535]: time="2025-05-08T08:42:33.675481411Z" level=info msg="connecting to shim 8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a" address="unix:///run/containerd/s/4f3e450e65b5de0a41607dce4075b403dd3226a3c94f1a551be7ced8073ccaa4" namespace=k8s.io protocol=ttrpc version=3 May 8 08:42:33.749163 systemd[1]: Started cri-containerd-8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a.scope - libcontainer container 8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a. 
May 8 08:42:33.796688 containerd[1535]: time="2025-05-08T08:42:33.796644542Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dbff64874-f6gff,Uid:6ff13dc1-7d0b-4fee-a69f-8166ad87f47b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a\"" May 8 08:42:34.131031 kernel: bpftool[4196]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 8 08:42:34.136700 systemd-networkd[1424]: cali07d5f36c4f2: Gained IPv6LL May 8 08:42:34.467827 systemd-networkd[1424]: vxlan.calico: Link UP May 8 08:42:34.469591 systemd-networkd[1424]: vxlan.calico: Gained carrier May 8 08:42:34.903273 systemd-networkd[1424]: cali2d9e31faec2: Gained IPv6LL May 8 08:42:35.339617 containerd[1535]: time="2025-05-08T08:42:35.339350142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-64rw6,Uid:45d8b150-711f-48ea-9546-c9d41ba961e3,Namespace:kube-system,Attempt:0,}" May 8 08:42:35.340450 containerd[1535]: time="2025-05-08T08:42:35.340300073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dbff64874-l66hm,Uid:1b3ab818-593e-4e7b-812b-16baea71dd0e,Namespace:calico-apiserver,Attempt:0,}" May 8 08:42:35.735293 systemd-networkd[1424]: vxlan.calico: Gained IPv6LL May 8 08:42:35.792936 systemd-networkd[1424]: cali848dd0c7c92: Link UP May 8 08:42:35.794507 systemd-networkd[1424]: cali848dd0c7c92: Gained carrier May 8 08:42:35.836398 containerd[1535]: 2025-05-08 08:42:35.636 [INFO][4280] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0 calico-apiserver-dbff64874- calico-apiserver 1b3ab818-593e-4e7b-812b-16baea71dd0e 692 0 2025-05-08 08:42:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:dbff64874 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4327-0-0-w-78bcb828ec.novalocal calico-apiserver-dbff64874-l66hm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali848dd0c7c92 [] []}} ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-l66hm" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-" May 8 08:42:35.836398 containerd[1535]: 2025-05-08 08:42:35.636 [INFO][4280] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-l66hm" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" May 8 08:42:35.836398 containerd[1535]: 2025-05-08 08:42:35.707 [INFO][4310] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" HandleID="k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.727 [INFO][4310] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" 
HandleID="k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bc690), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4327-0-0-w-78bcb828ec.novalocal", "pod":"calico-apiserver-dbff64874-l66hm", "timestamp":"2025-05-08 08:42:35.707495211 +0000 UTC"}, Hostname:"ci-4327-0-0-w-78bcb828ec.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.727 [INFO][4310] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.727 [INFO][4310] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.727 [INFO][4310] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4327-0-0-w-78bcb828ec.novalocal' May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.732 [INFO][4310] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.739 [INFO][4310] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.747 [INFO][4310] ipam/ipam.go 489: Trying affinity for 192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.752 [INFO][4310] ipam/ipam.go 155: Attempting to load block cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.836830 containerd[1535]: 2025-05-08 08:42:35.757 [INFO][4310] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.837409 containerd[1535]: 2025-05-08 08:42:35.758 [INFO][4310] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.79.128/26 handle="k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.837409 containerd[1535]: 2025-05-08 08:42:35.762 [INFO][4310] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a May 8 08:42:35.837409 containerd[1535]: 2025-05-08 08:42:35.769 [INFO][4310] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.79.128/26 handle="k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.837409 containerd[1535]: 2025-05-08 08:42:35.782 [INFO][4310] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.79.131/26] block=192.168.79.128/26 handle="k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.837409 containerd[1535]: 2025-05-08 08:42:35.782 [INFO][4310] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.79.131/26] handle="k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.837409 
containerd[1535]: 2025-05-08 08:42:35.783 [INFO][4310] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 8 08:42:35.837409 containerd[1535]: 2025-05-08 08:42:35.783 [INFO][4310] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.79.131/26] IPv6=[] ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" HandleID="k8s-pod-network.f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" May 8 08:42:35.837583 containerd[1535]: 2025-05-08 08:42:35.787 [INFO][4280] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-l66hm" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0", GenerateName:"calico-apiserver-dbff64874-", Namespace:"calico-apiserver", SelfLink:"", UID:"1b3ab818-593e-4e7b-812b-16baea71dd0e", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 42, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dbff64874", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"", Pod:"calico-apiserver-dbff64874-l66hm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali848dd0c7c92", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:35.837881 containerd[1535]: 2025-05-08 08:42:35.787 [INFO][4280] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.79.131/32] ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-l66hm" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" May 8 08:42:35.837881 containerd[1535]: 2025-05-08 08:42:35.787 [INFO][4280] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali848dd0c7c92 ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-l66hm" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" May 8 08:42:35.837881 containerd[1535]: 2025-05-08 08:42:35.797 [INFO][4280] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-l66hm" 
WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" May 8 08:42:35.837965 containerd[1535]: 2025-05-08 08:42:35.800 [INFO][4280] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-l66hm" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0", GenerateName:"calico-apiserver-dbff64874-", Namespace:"calico-apiserver", SelfLink:"", UID:"1b3ab818-593e-4e7b-812b-16baea71dd0e", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 42, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dbff64874", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a", Pod:"calico-apiserver-dbff64874-l66hm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali848dd0c7c92", MAC:"3a:ad:34:48:3b:b2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:35.838056 containerd[1535]: 2025-05-08 08:42:35.827 [INFO][4280] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" Namespace="calico-apiserver" Pod="calico-apiserver-dbff64874-l66hm" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--apiserver--dbff64874--l66hm-eth0" May 8 08:42:35.881630 systemd-networkd[1424]: calic1ef7a6c3d6: Link UP May 8 08:42:35.882757 systemd-networkd[1424]: calic1ef7a6c3d6: Gained carrier May 8 08:42:35.912149 containerd[1535]: 2025-05-08 08:42:35.664 [INFO][4292] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0 coredns-7db6d8ff4d- kube-system 45d8b150-711f-48ea-9546-c9d41ba961e3 693 0 2025-05-08 08:41:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4327-0-0-w-78bcb828ec.novalocal coredns-7db6d8ff4d-64rw6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic1ef7a6c3d6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Namespace="kube-system" Pod="coredns-7db6d8ff4d-64rw6" 
WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-" May 8 08:42:35.912149 containerd[1535]: 2025-05-08 08:42:35.665 [INFO][4292] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Namespace="kube-system" Pod="coredns-7db6d8ff4d-64rw6" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" May 8 08:42:35.912149 containerd[1535]: 2025-05-08 08:42:35.740 [INFO][4318] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" HandleID="k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.761 [INFO][4318] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" HandleID="k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319780), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4327-0-0-w-78bcb828ec.novalocal", "pod":"coredns-7db6d8ff4d-64rw6", "timestamp":"2025-05-08 08:42:35.740834887 +0000 UTC"}, Hostname:"ci-4327-0-0-w-78bcb828ec.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.762 [INFO][4318] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.783 [INFO][4318] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.783 [INFO][4318] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4327-0-0-w-78bcb828ec.novalocal' May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.787 [INFO][4318] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.800 [INFO][4318] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.816 [INFO][4318] ipam/ipam.go 489: Trying affinity for 192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.830 [INFO][4318] ipam/ipam.go 155: Attempting to load block cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912463 containerd[1535]: 2025-05-08 08:42:35.838 [INFO][4318] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912727 containerd[1535]: 2025-05-08 08:42:35.840 [INFO][4318] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.79.128/26 handle="k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912727 containerd[1535]: 2025-05-08 08:42:35.845 [INFO][4318] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50 May 8 08:42:35.912727 containerd[1535]: 2025-05-08 08:42:35.858 [INFO][4318] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.79.128/26 handle="k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912727 containerd[1535]: 2025-05-08 08:42:35.870 [INFO][4318] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.79.132/26] block=192.168.79.128/26 handle="k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912727 containerd[1535]: 2025-05-08 08:42:35.871 [INFO][4318] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.79.132/26] handle="k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:35.912727 containerd[1535]: 2025-05-08 08:42:35.871 [INFO][4318] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
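[annotation] The entries above trace a Calico IPAM auto-assignment end to end: the plugin takes the host-wide IPAM lock, confirms that the block 192.168.79.128/26 is affine to this node, picks the next free address in it (192.168.79.131 for the apiserver pod, then 192.168.79.132 for coredns-64rw6), writes the block back to claim the IP, and releases the lock. The sketch below is a minimal conceptual illustration of the "next free address in a block" step only; the already-allocated addresses are assumed for the example, and this is not the libcalico-go/ipam implementation.

    package main

    import (
        "fmt"
        "net"
    )

    // nextFreeIP returns the first address in block that is not in allocated.
    // Purely illustrative: the real allocator also tracks handles, affinities
    // and reserved attributes, and runs under the host-wide lock seen in the
    // log above.
    func nextFreeIP(block *net.IPNet, allocated map[string]bool) (net.IP, error) {
        base := block.IP.To4()
        ones, bits := block.Mask.Size()
        size := 1 << (bits - ones) // 64 addresses in a /26
        start := uint32(base[0])<<24 | uint32(base[1])<<16 | uint32(base[2])<<8 | uint32(base[3])
        for i := 0; i < size; i++ {
            n := start + uint32(i)
            cand := net.IPv4(byte(n>>24), byte(n>>16), byte(n>>8), byte(n))
            if !allocated[cand.String()] {
                return cand, nil
            }
        }
        return nil, fmt.Errorf("block %s is full", block)
    }

    func main() {
        _, block, _ := net.ParseCIDR("192.168.79.128/26")
        // Assume .128-.130 were claimed earlier in the boot (not shown here);
        // the next assignment then yields .131, matching the apiserver pod.
        allocated := map[string]bool{
            "192.168.79.128": true,
            "192.168.79.129": true,
            "192.168.79.130": true,
        }
        ip, err := nextFreeIP(block, allocated)
        fmt.Println(ip, err) // 192.168.79.131 <nil>
    }

[end annotation]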
May 8 08:42:35.912727 containerd[1535]: 2025-05-08 08:42:35.871 [INFO][4318] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.79.132/26] IPv6=[] ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" HandleID="k8s-pod-network.ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" May 8 08:42:35.912900 containerd[1535]: 2025-05-08 08:42:35.874 [INFO][4292] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Namespace="kube-system" Pod="coredns-7db6d8ff4d-64rw6" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"45d8b150-711f-48ea-9546-c9d41ba961e3", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 41, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-64rw6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic1ef7a6c3d6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:35.912900 containerd[1535]: 2025-05-08 08:42:35.874 [INFO][4292] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.79.132/32] ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Namespace="kube-system" Pod="coredns-7db6d8ff4d-64rw6" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" May 8 08:42:35.912900 containerd[1535]: 2025-05-08 08:42:35.874 [INFO][4292] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic1ef7a6c3d6 ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Namespace="kube-system" Pod="coredns-7db6d8ff4d-64rw6" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" May 8 08:42:35.912900 containerd[1535]: 2025-05-08 08:42:35.883 [INFO][4292] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" 
Namespace="kube-system" Pod="coredns-7db6d8ff4d-64rw6" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" May 8 08:42:35.912900 containerd[1535]: 2025-05-08 08:42:35.883 [INFO][4292] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Namespace="kube-system" Pod="coredns-7db6d8ff4d-64rw6" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"45d8b150-711f-48ea-9546-c9d41ba961e3", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 41, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50", Pod:"coredns-7db6d8ff4d-64rw6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic1ef7a6c3d6", MAC:"ca:aa:a2:14:f7:37", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:35.912900 containerd[1535]: 2025-05-08 08:42:35.905 [INFO][4292] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" Namespace="kube-system" Pod="coredns-7db6d8ff4d-64rw6" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--64rw6-eth0" May 8 08:42:35.921310 containerd[1535]: time="2025-05-08T08:42:35.921232350Z" level=info msg="connecting to shim f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a" address="unix:///run/containerd/s/29ead4ec994c1ab582788820c25590d86439b73d73ac2a08138dfc4e42553630" namespace=k8s.io protocol=ttrpc version=3 May 8 08:42:36.011480 containerd[1535]: time="2025-05-08T08:42:36.011000197Z" level=info msg="connecting to shim ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50" address="unix:///run/containerd/s/8059caa0374751d416fc4fdf15d14c1f2cdf08e86c0e1fd48e1c907ae07e1a08" namespace=k8s.io protocol=ttrpc version=3 May 8 08:42:36.022470 systemd[1]: Started cri-containerd-f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a.scope - libcontainer 
container f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a. May 8 08:42:36.032334 containerd[1535]: time="2025-05-08T08:42:36.031294688Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:36.036719 containerd[1535]: time="2025-05-08T08:42:36.036485148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 8 08:42:36.039263 containerd[1535]: time="2025-05-08T08:42:36.038160875Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:36.047092 containerd[1535]: time="2025-05-08T08:42:36.047028519Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:36.052257 containerd[1535]: time="2025-05-08T08:42:36.052207758Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 3.277023941s" May 8 08:42:36.052257 containerd[1535]: time="2025-05-08T08:42:36.052255189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 8 08:42:36.066946 containerd[1535]: time="2025-05-08T08:42:36.066386413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 8 08:42:36.066489 systemd[1]: Started cri-containerd-ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50.scope - libcontainer container ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50. 
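[annotation] In the WorkloadEndpoint dumps above, containerd prints the coredns port list as Go hex literals: Port:0x35 is 53 (the dns and dns-tcp ports) and Port:0x23c1 is 9153 (the coredns metrics port). A throwaway decoder, for reference only:

    package main

    import "fmt"

    func main() {
        // Values copied from the endpoint dump above; hex as logged.
        ports := []struct {
            name  string
            proto string
            port  uint16
        }{
            {"dns", "UDP", 0x35},       // 53
            {"dns-tcp", "TCP", 0x35},   // 53
            {"metrics", "TCP", 0x23c1}, // 9153
        }
        for _, p := range ports {
            fmt.Printf("%-8s %-4s %d\n", p.name, p.proto, p.port)
        }
    }

[end annotation]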
May 8 08:42:36.091625 containerd[1535]: time="2025-05-08T08:42:36.091592723Z" level=info msg="CreateContainer within sandbox \"cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 8 08:42:36.119488 containerd[1535]: time="2025-05-08T08:42:36.119435221Z" level=info msg="Container 00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:36.147946 containerd[1535]: time="2025-05-08T08:42:36.147867664Z" level=info msg="CreateContainer within sandbox \"cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b\"" May 8 08:42:36.149455 containerd[1535]: time="2025-05-08T08:42:36.149251474Z" level=info msg="StartContainer for \"00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b\"" May 8 08:42:36.155602 containerd[1535]: time="2025-05-08T08:42:36.155406335Z" level=info msg="connecting to shim 00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b" address="unix:///run/containerd/s/1c977e0cd254984620c8e96cdbe5b9bd4ee2a3fc01ac0a4e20a6fef18dbabc80" protocol=ttrpc version=3 May 8 08:42:36.164945 containerd[1535]: time="2025-05-08T08:42:36.164447710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dbff64874-l66hm,Uid:1b3ab818-593e-4e7b-812b-16baea71dd0e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a\"" May 8 08:42:36.196517 containerd[1535]: time="2025-05-08T08:42:36.196369870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-64rw6,Uid:45d8b150-711f-48ea-9546-c9d41ba961e3,Namespace:kube-system,Attempt:0,} returns sandbox id \"ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50\"" May 8 08:42:36.204370 containerd[1535]: time="2025-05-08T08:42:36.204130974Z" level=info msg="CreateContainer within sandbox \"ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 8 08:42:36.212166 systemd[1]: Started cri-containerd-00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b.scope - libcontainer container 00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b. 
May 8 08:42:36.227461 containerd[1535]: time="2025-05-08T08:42:36.226714851Z" level=info msg="Container 48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:36.237607 containerd[1535]: time="2025-05-08T08:42:36.237548977Z" level=info msg="CreateContainer within sandbox \"ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b\"" May 8 08:42:36.239423 containerd[1535]: time="2025-05-08T08:42:36.239400318Z" level=info msg="StartContainer for \"48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b\"" May 8 08:42:36.241286 containerd[1535]: time="2025-05-08T08:42:36.241118956Z" level=info msg="connecting to shim 48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b" address="unix:///run/containerd/s/8059caa0374751d416fc4fdf15d14c1f2cdf08e86c0e1fd48e1c907ae07e1a08" protocol=ttrpc version=3 May 8 08:42:36.271251 systemd[1]: Started cri-containerd-48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b.scope - libcontainer container 48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b. May 8 08:42:36.287467 containerd[1535]: time="2025-05-08T08:42:36.287426416Z" level=info msg="StartContainer for \"00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b\" returns successfully" May 8 08:42:36.321264 containerd[1535]: time="2025-05-08T08:42:36.321222840Z" level=info msg="StartContainer for \"48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b\" returns successfully" May 8 08:42:36.339594 containerd[1535]: time="2025-05-08T08:42:36.339544599Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qpbnv,Uid:0b760017-9088-41e6-b686-98d1a18df87b,Namespace:kube-system,Attempt:0,}" May 8 08:42:36.491853 systemd-networkd[1424]: cali1dfc7cdd5d6: Link UP May 8 08:42:36.492733 systemd-networkd[1424]: cali1dfc7cdd5d6: Gained carrier May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.395 [INFO][4501] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0 coredns-7db6d8ff4d- kube-system 0b760017-9088-41e6-b686-98d1a18df87b 691 0 2025-05-08 08:41:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4327-0-0-w-78bcb828ec.novalocal coredns-7db6d8ff4d-qpbnv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1dfc7cdd5d6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qpbnv" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.395 [INFO][4501] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qpbnv" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.425 [INFO][4513] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" HandleID="k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.436 [INFO][4513] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" HandleID="k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004d6b60), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4327-0-0-w-78bcb828ec.novalocal", "pod":"coredns-7db6d8ff4d-qpbnv", "timestamp":"2025-05-08 08:42:36.425437415 +0000 UTC"}, Hostname:"ci-4327-0-0-w-78bcb828ec.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.436 [INFO][4513] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.436 [INFO][4513] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.436 [INFO][4513] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4327-0-0-w-78bcb828ec.novalocal' May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.438 [INFO][4513] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.450 [INFO][4513] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.460 [INFO][4513] ipam/ipam.go 489: Trying affinity for 192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.463 [INFO][4513] ipam/ipam.go 155: Attempting to load block cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.466 [INFO][4513] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.466 [INFO][4513] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.79.128/26 handle="k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.468 [INFO][4513] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.474 [INFO][4513] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.79.128/26 handle="k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.486 [INFO][4513] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.79.133/26] 
block=192.168.79.128/26 handle="k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.486 [INFO][4513] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.79.133/26] handle="k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.486 [INFO][4513] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 8 08:42:36.535710 containerd[1535]: 2025-05-08 08:42:36.486 [INFO][4513] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.79.133/26] IPv6=[] ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" HandleID="k8s-pod-network.d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" May 8 08:42:36.540407 containerd[1535]: 2025-05-08 08:42:36.488 [INFO][4501] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qpbnv" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"0b760017-9088-41e6-b686-98d1a18df87b", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 41, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-qpbnv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1dfc7cdd5d6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:36.540407 containerd[1535]: 2025-05-08 08:42:36.488 [INFO][4501] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.79.133/32] ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qpbnv" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" May 8 08:42:36.540407 containerd[1535]: 2025-05-08 
08:42:36.488 [INFO][4501] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1dfc7cdd5d6 ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qpbnv" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" May 8 08:42:36.540407 containerd[1535]: 2025-05-08 08:42:36.492 [INFO][4501] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qpbnv" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" May 8 08:42:36.540407 containerd[1535]: 2025-05-08 08:42:36.493 [INFO][4501] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qpbnv" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"0b760017-9088-41e6-b686-98d1a18df87b", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 41, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e", Pod:"coredns-7db6d8ff4d-qpbnv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1dfc7cdd5d6", MAC:"c2:99:51:b6:5a:79", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:36.540407 containerd[1535]: 2025-05-08 08:42:36.531 [INFO][4501] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qpbnv" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-coredns--7db6d8ff4d--qpbnv-eth0" May 8 08:42:36.591275 containerd[1535]: time="2025-05-08T08:42:36.590803212Z" level=info msg="connecting to shim d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e" 
address="unix:///run/containerd/s/547122466641c98656dc9677e6e7b9c78bf709b893b8ba6f0792f9e7866c63a5" namespace=k8s.io protocol=ttrpc version=3 May 8 08:42:36.624152 systemd[1]: Started cri-containerd-d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e.scope - libcontainer container d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e. May 8 08:42:36.664882 kubelet[2864]: I0508 08:42:36.664821 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-64rw6" podStartSLOduration=39.664800327 podStartE2EDuration="39.664800327s" podCreationTimestamp="2025-05-08 08:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-08 08:42:36.662964195 +0000 UTC m=+54.415281549" watchObservedRunningTime="2025-05-08 08:42:36.664800327 +0000 UTC m=+54.417117651" May 8 08:42:36.701191 containerd[1535]: time="2025-05-08T08:42:36.701135254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qpbnv,Uid:0b760017-9088-41e6-b686-98d1a18df87b,Namespace:kube-system,Attempt:0,} returns sandbox id \"d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e\"" May 8 08:42:36.706391 containerd[1535]: time="2025-05-08T08:42:36.706345633Z" level=info msg="CreateContainer within sandbox \"d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 8 08:42:36.727016 containerd[1535]: time="2025-05-08T08:42:36.723205282Z" level=info msg="Container 1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:36.742372 containerd[1535]: time="2025-05-08T08:42:36.742264867Z" level=info msg="CreateContainer within sandbox \"d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8\"" May 8 08:42:36.744779 containerd[1535]: time="2025-05-08T08:42:36.743238885Z" level=info msg="StartContainer for \"1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8\"" May 8 08:42:36.744779 containerd[1535]: time="2025-05-08T08:42:36.744174289Z" level=info msg="connecting to shim 1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8" address="unix:///run/containerd/s/547122466641c98656dc9677e6e7b9c78bf709b893b8ba6f0792f9e7866c63a5" protocol=ttrpc version=3 May 8 08:42:36.774199 systemd[1]: Started cri-containerd-1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8.scope - libcontainer container 1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8. 
May 8 08:42:36.841844 containerd[1535]: time="2025-05-08T08:42:36.841674616Z" level=info msg="StartContainer for \"1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8\" returns successfully" May 8 08:42:37.143129 systemd-networkd[1424]: calic1ef7a6c3d6: Gained IPv6LL May 8 08:42:37.144494 systemd-networkd[1424]: cali848dd0c7c92: Gained IPv6LL May 8 08:42:37.677002 kubelet[2864]: I0508 08:42:37.676839 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-qpbnv" podStartSLOduration=40.676794879 podStartE2EDuration="40.676794879s" podCreationTimestamp="2025-05-08 08:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-08 08:42:37.676318501 +0000 UTC m=+55.428635825" watchObservedRunningTime="2025-05-08 08:42:37.676794879 +0000 UTC m=+55.429112264" May 8 08:42:38.295458 systemd-networkd[1424]: cali1dfc7cdd5d6: Gained IPv6LL May 8 08:42:41.543475 containerd[1535]: time="2025-05-08T08:42:41.541876452Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:41.543475 containerd[1535]: time="2025-05-08T08:42:41.543384446Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 8 08:42:41.544749 containerd[1535]: time="2025-05-08T08:42:41.544613718Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:41.547487 containerd[1535]: time="2025-05-08T08:42:41.547404247Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:41.549413 containerd[1535]: time="2025-05-08T08:42:41.548211596Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 5.48039818s" May 8 08:42:41.549413 containerd[1535]: time="2025-05-08T08:42:41.548248867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 8 08:42:41.573928 containerd[1535]: time="2025-05-08T08:42:41.573859701Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 8 08:42:41.588851 containerd[1535]: time="2025-05-08T08:42:41.587235665Z" level=info msg="CreateContainer within sandbox \"8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 8 08:42:41.609216 containerd[1535]: time="2025-05-08T08:42:41.609165112Z" level=info msg="Container c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:41.618003 containerd[1535]: time="2025-05-08T08:42:41.617847600Z" level=info msg="CreateContainer within sandbox \"8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id 
\"c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26\"" May 8 08:42:41.624923 containerd[1535]: time="2025-05-08T08:42:41.624323963Z" level=info msg="StartContainer for \"c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26\"" May 8 08:42:41.629204 containerd[1535]: time="2025-05-08T08:42:41.629137508Z" level=info msg="connecting to shim c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26" address="unix:///run/containerd/s/4f3e450e65b5de0a41607dce4075b403dd3226a3c94f1a551be7ced8073ccaa4" protocol=ttrpc version=3 May 8 08:42:41.666163 systemd[1]: Started cri-containerd-c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26.scope - libcontainer container c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26. May 8 08:42:41.721583 containerd[1535]: time="2025-05-08T08:42:41.721243940Z" level=info msg="StartContainer for \"c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26\" returns successfully" May 8 08:42:41.820511 containerd[1535]: time="2025-05-08T08:42:41.820070982Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"dc625354f9a1b89d8756243cd92dd6d7e6f6ff03bd5cfc7c40f2863681490b0a\" pid:4676 exited_at:{seconds:1746693761 nanos:819600795}" May 8 08:42:42.132508 containerd[1535]: time="2025-05-08T08:42:42.132418504Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:42.134439 containerd[1535]: time="2025-05-08T08:42:42.134392125Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 8 08:42:42.141381 containerd[1535]: time="2025-05-08T08:42:42.141324316Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 567.336441ms" May 8 08:42:42.141433 containerd[1535]: time="2025-05-08T08:42:42.141394059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 8 08:42:42.146372 containerd[1535]: time="2025-05-08T08:42:42.146324506Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 8 08:42:42.148796 containerd[1535]: time="2025-05-08T08:42:42.148731763Z" level=info msg="CreateContainer within sandbox \"f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 8 08:42:42.175006 containerd[1535]: time="2025-05-08T08:42:42.170215475Z" level=info msg="Container 1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:42.235101 containerd[1535]: time="2025-05-08T08:42:42.232839520Z" level=info msg="CreateContainer within sandbox \"f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751\"" May 8 08:42:42.238220 containerd[1535]: time="2025-05-08T08:42:42.238162173Z" level=info msg="StartContainer for 
\"1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751\"" May 8 08:42:42.242757 containerd[1535]: time="2025-05-08T08:42:42.242694500Z" level=info msg="connecting to shim 1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751" address="unix:///run/containerd/s/29ead4ec994c1ab582788820c25590d86439b73d73ac2a08138dfc4e42553630" protocol=ttrpc version=3 May 8 08:42:42.292149 systemd[1]: Started cri-containerd-1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751.scope - libcontainer container 1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751. May 8 08:42:42.393860 containerd[1535]: time="2025-05-08T08:42:42.393327334Z" level=info msg="StartContainer for \"1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751\" returns successfully" May 8 08:42:42.752255 kubelet[2864]: I0508 08:42:42.751918 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-dbff64874-f6gff" podStartSLOduration=30.976532863 podStartE2EDuration="38.751900857s" podCreationTimestamp="2025-05-08 08:42:04 +0000 UTC" firstStartedPulling="2025-05-08 08:42:33.798232743 +0000 UTC m=+51.550550077" lastFinishedPulling="2025-05-08 08:42:41.573600747 +0000 UTC m=+59.325918071" observedRunningTime="2025-05-08 08:42:42.722832181 +0000 UTC m=+60.475149515" watchObservedRunningTime="2025-05-08 08:42:42.751900857 +0000 UTC m=+60.504218192" May 8 08:42:43.340250 containerd[1535]: time="2025-05-08T08:42:43.339399799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-574667564d-5z62w,Uid:2aad45b6-81f3-4d71-a771-048450f82553,Namespace:calico-system,Attempt:0,}" May 8 08:42:43.521266 systemd-networkd[1424]: calicee87c5e616: Link UP May 8 08:42:43.522387 systemd-networkd[1424]: calicee87c5e616: Gained carrier May 8 08:42:43.538656 kubelet[2864]: I0508 08:42:43.537974 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-dbff64874-l66hm" podStartSLOduration=33.564074638 podStartE2EDuration="39.537955436s" podCreationTimestamp="2025-05-08 08:42:04 +0000 UTC" firstStartedPulling="2025-05-08 08:42:36.169215996 +0000 UTC m=+53.921533330" lastFinishedPulling="2025-05-08 08:42:42.143096744 +0000 UTC m=+59.895414128" observedRunningTime="2025-05-08 08:42:42.752350645 +0000 UTC m=+60.504667979" watchObservedRunningTime="2025-05-08 08:42:43.537955436 +0000 UTC m=+61.290272760" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.401 [INFO][4731] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0 calico-kube-controllers-574667564d- calico-system 2aad45b6-81f3-4d71-a771-048450f82553 683 0 2025-05-08 08:42:04 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:574667564d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4327-0-0-w-78bcb828ec.novalocal calico-kube-controllers-574667564d-5z62w eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calicee87c5e616 [] []}} ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Namespace="calico-system" Pod="calico-kube-controllers-574667564d-5z62w" 
WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.402 [INFO][4731] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Namespace="calico-system" Pod="calico-kube-controllers-574667564d-5z62w" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.466 [INFO][4744] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" HandleID="k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.479 [INFO][4744] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" HandleID="k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bdba0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4327-0-0-w-78bcb828ec.novalocal", "pod":"calico-kube-controllers-574667564d-5z62w", "timestamp":"2025-05-08 08:42:43.466590548 +0000 UTC"}, Hostname:"ci-4327-0-0-w-78bcb828ec.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.479 [INFO][4744] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.479 [INFO][4744] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.479 [INFO][4744] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4327-0-0-w-78bcb828ec.novalocal' May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.481 [INFO][4744] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.486 [INFO][4744] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.491 [INFO][4744] ipam/ipam.go 489: Trying affinity for 192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.494 [INFO][4744] ipam/ipam.go 155: Attempting to load block cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.498 [INFO][4744] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.79.128/26 host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.498 [INFO][4744] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.79.128/26 handle="k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.500 [INFO][4744] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.507 [INFO][4744] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.79.128/26 handle="k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.515 [INFO][4744] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.79.134/26] block=192.168.79.128/26 handle="k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.516 [INFO][4744] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.79.134/26] handle="k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" host="ci-4327-0-0-w-78bcb828ec.novalocal" May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.516 [INFO][4744] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
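[annotation] The same IPAM sequence repeats here for calico-kube-controllers-574667564d-5z62w, which receives 192.168.79.134/26 from the same block. Each CNI ADD in this section logs "About to acquire host-wide IPAM lock" / "Acquired" / "Released" around its assignment, and concurrent ADDs serialize on that lock (request [4318] earlier waits from 35.762 to 35.783 while [4310] finishes). The sketch below mimics that serialization with an in-process mutex; it is only an analogy, since the real lock is shared across separate calico-ipam invocations on the host, not a Go mutex.

    package main

    import (
        "fmt"
        "sync"
    )

    // A toy host-wide allocator: each "CNI ADD" goroutine must take the lock
    // before touching the block, so assignments are handed out one at a time.
    // Which pod wins which address depends on lock acquisition order, just as
    // in the log.
    type allocator struct {
        mu   sync.Mutex
        next int
    }

    func (a *allocator) assign(pod string) string {
        a.mu.Lock()
        defer a.mu.Unlock()
        ip := fmt.Sprintf("192.168.79.%d", 131+a.next)
        a.next++
        return fmt.Sprintf("%s -> %s", pod, ip)
    }

    func main() {
        a := &allocator{}
        var wg sync.WaitGroup
        pods := []string{"apiserver-l66hm", "coredns-64rw6", "coredns-qpbnv", "kube-controllers-5z62w"}
        for _, pod := range pods {
            wg.Add(1)
            go func(p string) {
                defer wg.Done()
                fmt.Println(a.assign(p))
            }(pod)
        }
        wg.Wait()
    }

[end annotation]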
May 8 08:42:43.542057 containerd[1535]: 2025-05-08 08:42:43.516 [INFO][4744] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.79.134/26] IPv6=[] ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" HandleID="k8s-pod-network.0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Workload="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" May 8 08:42:43.543298 containerd[1535]: 2025-05-08 08:42:43.517 [INFO][4731] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Namespace="calico-system" Pod="calico-kube-controllers-574667564d-5z62w" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0", GenerateName:"calico-kube-controllers-574667564d-", Namespace:"calico-system", SelfLink:"", UID:"2aad45b6-81f3-4d71-a771-048450f82553", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 42, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"574667564d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"", Pod:"calico-kube-controllers-574667564d-5z62w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.79.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calicee87c5e616", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:43.543298 containerd[1535]: 2025-05-08 08:42:43.517 [INFO][4731] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.79.134/32] ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Namespace="calico-system" Pod="calico-kube-controllers-574667564d-5z62w" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" May 8 08:42:43.543298 containerd[1535]: 2025-05-08 08:42:43.517 [INFO][4731] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicee87c5e616 ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Namespace="calico-system" Pod="calico-kube-controllers-574667564d-5z62w" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" May 8 08:42:43.543298 containerd[1535]: 2025-05-08 08:42:43.519 [INFO][4731] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Namespace="calico-system" Pod="calico-kube-controllers-574667564d-5z62w" 
WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" May 8 08:42:43.543298 containerd[1535]: 2025-05-08 08:42:43.519 [INFO][4731] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Namespace="calico-system" Pod="calico-kube-controllers-574667564d-5z62w" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0", GenerateName:"calico-kube-controllers-574667564d-", Namespace:"calico-system", SelfLink:"", UID:"2aad45b6-81f3-4d71-a771-048450f82553", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.May, 8, 8, 42, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"574667564d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4327-0-0-w-78bcb828ec.novalocal", ContainerID:"0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e", Pod:"calico-kube-controllers-574667564d-5z62w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.79.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calicee87c5e616", MAC:"f2:9b:cc:11:5b:70", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 8 08:42:43.543298 containerd[1535]: 2025-05-08 08:42:43.539 [INFO][4731] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" Namespace="calico-system" Pod="calico-kube-controllers-574667564d-5z62w" WorkloadEndpoint="ci--4327--0--0--w--78bcb828ec.novalocal-k8s-calico--kube--controllers--574667564d--5z62w-eth0" May 8 08:42:43.591604 containerd[1535]: time="2025-05-08T08:42:43.590735908Z" level=info msg="connecting to shim 0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e" address="unix:///run/containerd/s/f33b1208a8599ce58ebf14e481fe17b4e5dc07a04682f902e17654facf404b40" namespace=k8s.io protocol=ttrpc version=3 May 8 08:42:43.624465 systemd[1]: Started cri-containerd-0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e.scope - libcontainer container 0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e. 
May 8 08:42:43.671685 kubelet[2864]: I0508 08:42:43.671648 2864 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 8 08:42:43.673088 kubelet[2864]: I0508 08:42:43.673067 2864 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 8 08:42:43.711425 containerd[1535]: time="2025-05-08T08:42:43.709569224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-574667564d-5z62w,Uid:2aad45b6-81f3-4d71-a771-048450f82553,Namespace:calico-system,Attempt:0,} returns sandbox id \"0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e\"" May 8 08:42:44.759489 systemd-networkd[1424]: calicee87c5e616: Gained IPv6LL May 8 08:42:44.993460 containerd[1535]: time="2025-05-08T08:42:44.993399962Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:44.997667 containerd[1535]: time="2025-05-08T08:42:44.996033745Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 8 08:42:44.997667 containerd[1535]: time="2025-05-08T08:42:44.996688224Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:45.000409 containerd[1535]: time="2025-05-08T08:42:45.000360134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:45.001193 containerd[1535]: time="2025-05-08T08:42:45.000970456Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.854581277s" May 8 08:42:45.001310 containerd[1535]: time="2025-05-08T08:42:45.001283628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 8 08:42:45.004301 containerd[1535]: time="2025-05-08T08:42:45.004274494Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 8 08:42:45.007015 containerd[1535]: time="2025-05-08T08:42:45.006046101Z" level=info msg="CreateContainer within sandbox \"cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 8 08:42:45.030085 containerd[1535]: time="2025-05-08T08:42:45.029112394Z" level=info msg="Container bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:45.051215 containerd[1535]: time="2025-05-08T08:42:45.051155109Z" level=info msg="CreateContainer within sandbox \"cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209\"" May 8 08:42:45.052926 containerd[1535]: time="2025-05-08T08:42:45.052892279Z" level=info msg="StartContainer for 
\"bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209\"" May 8 08:42:45.055971 containerd[1535]: time="2025-05-08T08:42:45.055880341Z" level=info msg="connecting to shim bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209" address="unix:///run/containerd/s/1c977e0cd254984620c8e96cdbe5b9bd4ee2a3fc01ac0a4e20a6fef18dbabc80" protocol=ttrpc version=3 May 8 08:42:45.098199 systemd[1]: Started cri-containerd-bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209.scope - libcontainer container bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209. May 8 08:42:45.170410 containerd[1535]: time="2025-05-08T08:42:45.170351567Z" level=info msg="StartContainer for \"bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209\" returns successfully" May 8 08:42:45.478944 kubelet[2864]: I0508 08:42:45.478566 2864 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 8 08:42:45.478944 kubelet[2864]: I0508 08:42:45.478613 2864 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 8 08:42:48.636315 containerd[1535]: time="2025-05-08T08:42:48.636272720Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:48.638248 containerd[1535]: time="2025-05-08T08:42:48.638096803Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 8 08:42:48.639648 containerd[1535]: time="2025-05-08T08:42:48.639575212Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:48.642448 containerd[1535]: time="2025-05-08T08:42:48.642414832Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 8 08:42:48.643366 containerd[1535]: time="2025-05-08T08:42:48.643224930Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 3.638614431s" May 8 08:42:48.643366 containerd[1535]: time="2025-05-08T08:42:48.643263504Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 8 08:42:48.662935 containerd[1535]: time="2025-05-08T08:42:48.660276979Z" level=info msg="CreateContainer within sandbox \"0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 8 08:42:48.676270 containerd[1535]: time="2025-05-08T08:42:48.673786618Z" level=info msg="Container d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490: CDI devices from CRI Config.CDIDevices: []" May 8 08:42:48.687174 containerd[1535]: time="2025-05-08T08:42:48.687130415Z" level=info msg="CreateContainer within 
sandbox \"0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\"" May 8 08:42:48.687725 containerd[1535]: time="2025-05-08T08:42:48.687530875Z" level=info msg="StartContainer for \"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\"" May 8 08:42:48.688685 containerd[1535]: time="2025-05-08T08:42:48.688645967Z" level=info msg="connecting to shim d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490" address="unix:///run/containerd/s/f33b1208a8599ce58ebf14e481fe17b4e5dc07a04682f902e17654facf404b40" protocol=ttrpc version=3 May 8 08:42:48.721133 systemd[1]: Started cri-containerd-d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490.scope - libcontainer container d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490. May 8 08:42:48.790909 containerd[1535]: time="2025-05-08T08:42:48.790864797Z" level=info msg="StartContainer for \"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" returns successfully" May 8 08:42:49.742356 kubelet[2864]: I0508 08:42:49.742221 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-574667564d-5z62w" podStartSLOduration=40.814853521 podStartE2EDuration="45.742202746s" podCreationTimestamp="2025-05-08 08:42:04 +0000 UTC" firstStartedPulling="2025-05-08 08:42:43.717046224 +0000 UTC m=+61.469363558" lastFinishedPulling="2025-05-08 08:42:48.644395459 +0000 UTC m=+66.396712783" observedRunningTime="2025-05-08 08:42:49.740698528 +0000 UTC m=+67.493015902" watchObservedRunningTime="2025-05-08 08:42:49.742202746 +0000 UTC m=+67.494520080" May 8 08:42:49.744795 kubelet[2864]: I0508 08:42:49.742621 2864 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-hfrk7" podStartSLOduration=33.514190444 podStartE2EDuration="45.742614968s" podCreationTimestamp="2025-05-08 08:42:04 +0000 UTC" firstStartedPulling="2025-05-08 08:42:32.77443086 +0000 UTC m=+50.526748184" lastFinishedPulling="2025-05-08 08:42:45.002855384 +0000 UTC m=+62.755172708" observedRunningTime="2025-05-08 08:42:45.706753005 +0000 UTC m=+63.459070339" watchObservedRunningTime="2025-05-08 08:42:49.742614968 +0000 UTC m=+67.494932302" May 8 08:42:49.796175 containerd[1535]: time="2025-05-08T08:42:49.796128358Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"5cb44174f50e80d06e6dd94b764c2fe0ad0eeea2d2fc3a63563e147f1d4fb99b\" pid:4908 exited_at:{seconds:1746693769 nanos:794151140}" May 8 08:42:49.906421 containerd[1535]: time="2025-05-08T08:42:49.906355658Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"0c769a93a1c9cba730b4ead9eb71db74ed91da35f73a30e175cc35020c567b69\" pid:4930 exited_at:{seconds:1746693769 nanos:905897436}" May 8 08:42:59.168936 kubelet[2864]: I0508 08:42:59.168800 2864 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 8 08:43:11.870064 containerd[1535]: time="2025-05-08T08:43:11.869743180Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"ed65062570ff496bd9969d855ca09f9ee10c035673c39900a29a415390efecaa\" pid:4971 exited_at:{seconds:1746693791 nanos:868754705}" May 
8 08:43:12.374017 kubelet[2864]: I0508 08:43:12.373873 2864 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 8 08:43:19.858522 containerd[1535]: time="2025-05-08T08:43:19.857964055Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"23ea18e7532b825918dbc43d3836d62de89a48887afd846bb777c87858afe929\" pid:5003 exited_at:{seconds:1746693799 nanos:857410611}" May 8 08:43:41.922049 containerd[1535]: time="2025-05-08T08:43:41.921904839Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"1f0cdc97f4a436774a483ed2c2a32104832e5aa2a3cc500be2b652c68947a78b\" pid:5033 exited_at:{seconds:1746693821 nanos:920554664}" May 8 08:43:43.535322 containerd[1535]: time="2025-05-08T08:43:43.535258303Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"6c9a0701b82877be1876a4fad6868a62b80fa0f2ce8900fd7e7798f195cb79b5\" pid:5060 exited_at:{seconds:1746693823 nanos:534900398}" May 8 08:43:49.912192 containerd[1535]: time="2025-05-08T08:43:49.912145209Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"8d21ac737bfa03532c096eb4524bbf420e77493840d00ae19c579d2da535c959\" pid:5082 exited_at:{seconds:1746693829 nanos:911704837}" May 8 08:44:11.904206 containerd[1535]: time="2025-05-08T08:44:11.903792481Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"ecdef245ea29ae8a423584b3121e738740c10da09b58b9e531ba65ba1bc5dc88\" pid:5121 exited_at:{seconds:1746693851 nanos:898187993}" May 8 08:44:19.925554 containerd[1535]: time="2025-05-08T08:44:19.925487987Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"00c4004c9546e5700600f530d6e2dcb7ab26f418d8fbe6c7b552a7273fde6c59\" pid:5155 exited_at:{seconds:1746693859 nanos:924941934}" May 8 08:44:41.925824 containerd[1535]: time="2025-05-08T08:44:41.924712888Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"0d6affbd497a3a52150756f6bacf3a18da27d699cd104548737854d4d50d5def\" pid:5180 exited_at:{seconds:1746693881 nanos:923538547}" May 8 08:44:43.549307 containerd[1535]: time="2025-05-08T08:44:43.548396011Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"b95b1cf152f7875ff11cf61a043e204b050cf5be161d0bde858f8025521093e1\" pid:5207 exited_at:{seconds:1746693883 nanos:547418354}" May 8 08:44:49.897187 containerd[1535]: time="2025-05-08T08:44:49.896849244Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"b72473d14e39c7ea8dddd348a053969fa052b6bdd2f38df65466be7a5252a956\" pid:5229 exited_at:{seconds:1746693889 nanos:896101065}" May 8 08:45:11.965069 containerd[1535]: time="2025-05-08T08:45:11.963286029Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"0f02b915dcc9372e8cb95ecfba6049e0d58b5572e0f4ff80a8e5bce899457d76\" pid:5253 exited_at:{seconds:1746693911 nanos:958135116}" 
May 8 08:45:19.912052 containerd[1535]: time="2025-05-08T08:45:19.911895883Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"3a39d29bec5e813b0f741d28b87cf660788ecae89aac3dee39a19ffffd261dd5\" pid:5283 exited_at:{seconds:1746693919 nanos:911273964}" May 8 08:45:30.054710 systemd[1]: Started sshd@9-172.24.4.129:22-172.24.4.1:58420.service - OpenSSH per-connection server daemon (172.24.4.1:58420). May 8 08:45:31.331760 sshd[5300]: Accepted publickey for core from 172.24.4.1 port 58420 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:45:31.339949 sshd-session[5300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:45:31.369493 systemd-logind[1511]: New session 12 of user core. May 8 08:45:31.378635 systemd[1]: Started session-12.scope - Session 12 of User core. May 8 08:45:32.054067 sshd[5302]: Connection closed by 172.24.4.1 port 58420 May 8 08:45:32.052918 sshd-session[5300]: pam_unix(sshd:session): session closed for user core May 8 08:45:32.061877 systemd-logind[1511]: Session 12 logged out. Waiting for processes to exit. May 8 08:45:32.062681 systemd[1]: sshd@9-172.24.4.129:22-172.24.4.1:58420.service: Deactivated successfully. May 8 08:45:32.071675 systemd[1]: session-12.scope: Deactivated successfully. May 8 08:45:32.079761 systemd-logind[1511]: Removed session 12. May 8 08:45:37.085700 systemd[1]: Started sshd@10-172.24.4.129:22-172.24.4.1:53168.service - OpenSSH per-connection server daemon (172.24.4.1:53168). May 8 08:45:38.493769 sshd[5317]: Accepted publickey for core from 172.24.4.1 port 53168 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:45:38.497548 sshd-session[5317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:45:38.518439 systemd-logind[1511]: New session 13 of user core. May 8 08:45:38.523382 systemd[1]: Started session-13.scope - Session 13 of User core. May 8 08:45:39.270092 sshd[5322]: Connection closed by 172.24.4.1 port 53168 May 8 08:45:39.271868 sshd-session[5317]: pam_unix(sshd:session): session closed for user core May 8 08:45:39.282828 systemd[1]: sshd@10-172.24.4.129:22-172.24.4.1:53168.service: Deactivated successfully. May 8 08:45:39.288968 systemd[1]: session-13.scope: Deactivated successfully. May 8 08:45:39.292743 systemd-logind[1511]: Session 13 logged out. Waiting for processes to exit. May 8 08:45:39.295206 systemd-logind[1511]: Removed session 13. May 8 08:45:41.931536 containerd[1535]: time="2025-05-08T08:45:41.931410318Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"72495285bacda339e5faeeeb514087d45be20b0f67df665b666cd402e49d8d1d\" pid:5347 exited_at:{seconds:1746693941 nanos:930574680}" May 8 08:45:43.527973 containerd[1535]: time="2025-05-08T08:45:43.527685076Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"a22aa87a539e81714586a72a1635aa967db5d1c18b790125a9ad6c8293b7e4d5\" pid:5372 exited_at:{seconds:1746693943 nanos:527371965}" May 8 08:45:44.301338 systemd[1]: Started sshd@11-172.24.4.129:22-172.24.4.1:48138.service - OpenSSH per-connection server daemon (172.24.4.1:48138). 
May 8 08:45:45.575625 sshd[5383]: Accepted publickey for core from 172.24.4.1 port 48138 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:45:45.579054 sshd-session[5383]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:45:45.587140 systemd-logind[1511]: New session 14 of user core. May 8 08:45:45.593217 systemd[1]: Started session-14.scope - Session 14 of User core. May 8 08:45:46.548022 sshd[5385]: Connection closed by 172.24.4.1 port 48138 May 8 08:45:46.549468 sshd-session[5383]: pam_unix(sshd:session): session closed for user core May 8 08:45:46.574948 systemd[1]: sshd@11-172.24.4.129:22-172.24.4.1:48138.service: Deactivated successfully. May 8 08:45:46.582602 systemd[1]: session-14.scope: Deactivated successfully. May 8 08:45:46.584683 systemd-logind[1511]: Session 14 logged out. Waiting for processes to exit. May 8 08:45:46.593210 systemd[1]: Started sshd@12-172.24.4.129:22-172.24.4.1:48148.service - OpenSSH per-connection server daemon (172.24.4.1:48148). May 8 08:45:46.612596 systemd-logind[1511]: Removed session 14. May 8 08:45:47.824093 sshd[5396]: Accepted publickey for core from 172.24.4.1 port 48148 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:45:47.829236 sshd-session[5396]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:45:47.858660 systemd-logind[1511]: New session 15 of user core. May 8 08:45:47.869403 systemd[1]: Started session-15.scope - Session 15 of User core. May 8 08:45:48.800101 sshd[5404]: Connection closed by 172.24.4.1 port 48148 May 8 08:45:48.801424 sshd-session[5396]: pam_unix(sshd:session): session closed for user core May 8 08:45:48.816327 systemd[1]: sshd@12-172.24.4.129:22-172.24.4.1:48148.service: Deactivated successfully. May 8 08:45:48.822675 systemd[1]: session-15.scope: Deactivated successfully. May 8 08:45:48.830194 systemd-logind[1511]: Session 15 logged out. Waiting for processes to exit. May 8 08:45:48.836842 systemd[1]: Started sshd@13-172.24.4.129:22-172.24.4.1:48160.service - OpenSSH per-connection server daemon (172.24.4.1:48160). May 8 08:45:48.845656 systemd-logind[1511]: Removed session 15. May 8 08:45:49.889159 containerd[1535]: time="2025-05-08T08:45:49.888950742Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"77e811d5c9a8ee99cee70f65021ccd3107371099f85b482fb417ab8699eb5921\" pid:5440 exited_at:{seconds:1746693949 nanos:888387558}" May 8 08:45:50.172257 sshd[5424]: Accepted publickey for core from 172.24.4.1 port 48160 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:45:50.175961 sshd-session[5424]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:45:50.190162 systemd-logind[1511]: New session 16 of user core. May 8 08:45:50.198374 systemd[1]: Started session-16.scope - Session 16 of User core. May 8 08:45:50.962348 sshd[5449]: Connection closed by 172.24.4.1 port 48160 May 8 08:45:50.963845 sshd-session[5424]: pam_unix(sshd:session): session closed for user core May 8 08:45:50.973401 systemd[1]: sshd@13-172.24.4.129:22-172.24.4.1:48160.service: Deactivated successfully. May 8 08:45:50.979648 systemd[1]: session-16.scope: Deactivated successfully. May 8 08:45:50.982552 systemd-logind[1511]: Session 16 logged out. Waiting for processes to exit. May 8 08:45:50.985790 systemd-logind[1511]: Removed session 16. 
May 8 08:45:56.004812 systemd[1]: Started sshd@14-172.24.4.129:22-172.24.4.1:43392.service - OpenSSH per-connection server daemon (172.24.4.1:43392). May 8 08:45:57.172552 sshd[5463]: Accepted publickey for core from 172.24.4.1 port 43392 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:45:57.172169 sshd-session[5463]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:45:57.189814 systemd-logind[1511]: New session 17 of user core. May 8 08:45:57.207777 systemd[1]: Started session-17.scope - Session 17 of User core. May 8 08:45:57.872055 sshd[5468]: Connection closed by 172.24.4.1 port 43392 May 8 08:45:57.871680 sshd-session[5463]: pam_unix(sshd:session): session closed for user core May 8 08:45:57.878647 systemd[1]: sshd@14-172.24.4.129:22-172.24.4.1:43392.service: Deactivated successfully. May 8 08:45:57.878680 systemd-logind[1511]: Session 17 logged out. Waiting for processes to exit. May 8 08:45:57.881736 systemd[1]: session-17.scope: Deactivated successfully. May 8 08:45:57.885790 systemd-logind[1511]: Removed session 17. May 8 08:46:02.889473 systemd[1]: Started sshd@15-172.24.4.129:22-172.24.4.1:43394.service - OpenSSH per-connection server daemon (172.24.4.1:43394). May 8 08:46:04.053089 sshd[5483]: Accepted publickey for core from 172.24.4.1 port 43394 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:04.055248 sshd-session[5483]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:04.066730 systemd-logind[1511]: New session 18 of user core. May 8 08:46:04.074220 systemd[1]: Started session-18.scope - Session 18 of User core. May 8 08:46:04.764790 sshd[5485]: Connection closed by 172.24.4.1 port 43394 May 8 08:46:04.765482 sshd-session[5483]: pam_unix(sshd:session): session closed for user core May 8 08:46:04.776431 systemd-logind[1511]: Session 18 logged out. Waiting for processes to exit. May 8 08:46:04.776780 systemd[1]: sshd@15-172.24.4.129:22-172.24.4.1:43394.service: Deactivated successfully. May 8 08:46:04.781363 systemd[1]: session-18.scope: Deactivated successfully. May 8 08:46:04.786279 systemd-logind[1511]: Removed session 18. May 8 08:46:09.791581 systemd[1]: Started sshd@16-172.24.4.129:22-172.24.4.1:47748.service - OpenSSH per-connection server daemon (172.24.4.1:47748). May 8 08:46:11.219397 sshd[5498]: Accepted publickey for core from 172.24.4.1 port 47748 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:11.223608 sshd-session[5498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:11.238650 systemd-logind[1511]: New session 19 of user core. May 8 08:46:11.247316 systemd[1]: Started session-19.scope - Session 19 of User core. May 8 08:46:11.864561 containerd[1535]: time="2025-05-08T08:46:11.864443214Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"6bc59663fae65e49df0d46aa1de1ca2431a81c26f48a6fdbef52ea2054412261\" pid:5520 exited_at:{seconds:1746693971 nanos:862963492}" May 8 08:46:12.039071 sshd[5500]: Connection closed by 172.24.4.1 port 47748 May 8 08:46:12.040926 sshd-session[5498]: pam_unix(sshd:session): session closed for user core May 8 08:46:12.052930 systemd[1]: sshd@16-172.24.4.129:22-172.24.4.1:47748.service: Deactivated successfully. May 8 08:46:12.059785 systemd[1]: session-19.scope: Deactivated successfully. May 8 08:46:12.062884 systemd-logind[1511]: Session 19 logged out. 
Waiting for processes to exit. May 8 08:46:12.068305 systemd[1]: Started sshd@17-172.24.4.129:22-172.24.4.1:47764.service - OpenSSH per-connection server daemon (172.24.4.1:47764). May 8 08:46:12.070511 systemd-logind[1511]: Removed session 19. May 8 08:46:13.115899 sshd[5535]: Accepted publickey for core from 172.24.4.1 port 47764 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:13.119639 sshd-session[5535]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:13.138121 systemd-logind[1511]: New session 20 of user core. May 8 08:46:13.144309 systemd[1]: Started session-20.scope - Session 20 of User core. May 8 08:46:14.272080 sshd[5538]: Connection closed by 172.24.4.1 port 47764 May 8 08:46:14.272544 sshd-session[5535]: pam_unix(sshd:session): session closed for user core May 8 08:46:14.284183 systemd[1]: Started sshd@18-172.24.4.129:22-172.24.4.1:40438.service - OpenSSH per-connection server daemon (172.24.4.1:40438). May 8 08:46:14.286659 systemd[1]: sshd@17-172.24.4.129:22-172.24.4.1:47764.service: Deactivated successfully. May 8 08:46:14.325940 systemd[1]: session-20.scope: Deactivated successfully. May 8 08:46:14.331478 systemd-logind[1511]: Session 20 logged out. Waiting for processes to exit. May 8 08:46:14.337802 systemd-logind[1511]: Removed session 20. May 8 08:46:15.491358 sshd[5545]: Accepted publickey for core from 172.24.4.1 port 40438 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:15.495267 sshd-session[5545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:15.511106 systemd-logind[1511]: New session 21 of user core. May 8 08:46:15.519502 systemd[1]: Started session-21.scope - Session 21 of User core. May 8 08:46:19.270281 sshd[5550]: Connection closed by 172.24.4.1 port 40438 May 8 08:46:19.274747 sshd-session[5545]: pam_unix(sshd:session): session closed for user core May 8 08:46:19.288587 systemd[1]: sshd@18-172.24.4.129:22-172.24.4.1:40438.service: Deactivated successfully. May 8 08:46:19.292864 systemd[1]: session-21.scope: Deactivated successfully. May 8 08:46:19.293371 systemd[1]: session-21.scope: Consumed 1.034s CPU time, 67.8M memory peak. May 8 08:46:19.294368 systemd-logind[1511]: Session 21 logged out. Waiting for processes to exit. May 8 08:46:19.298719 systemd[1]: Started sshd@19-172.24.4.129:22-172.24.4.1:40444.service - OpenSSH per-connection server daemon (172.24.4.1:40444). May 8 08:46:19.301482 systemd-logind[1511]: Removed session 21. May 8 08:46:19.877795 containerd[1535]: time="2025-05-08T08:46:19.877727313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"0a25e1a4662c52eac53d7ac154be867e8b82e381e57dda7c940cfdd50f212661\" pid:5583 exited_at:{seconds:1746693979 nanos:877292600}" May 8 08:46:20.754435 sshd[5566]: Accepted publickey for core from 172.24.4.1 port 40444 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:20.756654 sshd-session[5566]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:20.766102 systemd-logind[1511]: New session 22 of user core. May 8 08:46:20.769247 systemd[1]: Started session-22.scope - Session 22 of User core. 
May 8 08:46:21.961772 sshd[5592]: Connection closed by 172.24.4.1 port 40444 May 8 08:46:21.961371 sshd-session[5566]: pam_unix(sshd:session): session closed for user core May 8 08:46:21.978338 systemd[1]: sshd@19-172.24.4.129:22-172.24.4.1:40444.service: Deactivated successfully. May 8 08:46:21.983787 systemd[1]: session-22.scope: Deactivated successfully. May 8 08:46:21.987144 systemd-logind[1511]: Session 22 logged out. Waiting for processes to exit. May 8 08:46:21.995623 systemd[1]: Started sshd@20-172.24.4.129:22-172.24.4.1:40460.service - OpenSSH per-connection server daemon (172.24.4.1:40460). May 8 08:46:21.999416 systemd-logind[1511]: Removed session 22. May 8 08:46:23.349336 sshd[5600]: Accepted publickey for core from 172.24.4.1 port 40460 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:23.361238 sshd-session[5600]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:23.387505 systemd-logind[1511]: New session 23 of user core. May 8 08:46:23.403647 systemd[1]: Started session-23.scope - Session 23 of User core. May 8 08:46:24.248820 sshd[5603]: Connection closed by 172.24.4.1 port 40460 May 8 08:46:24.249718 sshd-session[5600]: pam_unix(sshd:session): session closed for user core May 8 08:46:24.262816 systemd-logind[1511]: Session 23 logged out. Waiting for processes to exit. May 8 08:46:24.264427 systemd[1]: sshd@20-172.24.4.129:22-172.24.4.1:40460.service: Deactivated successfully. May 8 08:46:24.273736 systemd[1]: session-23.scope: Deactivated successfully. May 8 08:46:24.281909 systemd-logind[1511]: Removed session 23. May 8 08:46:29.280839 systemd[1]: Started sshd@21-172.24.4.129:22-172.24.4.1:51670.service - OpenSSH per-connection server daemon (172.24.4.1:51670). May 8 08:46:30.461478 sshd[5620]: Accepted publickey for core from 172.24.4.1 port 51670 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:30.465253 sshd-session[5620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:30.483391 systemd-logind[1511]: New session 24 of user core. May 8 08:46:30.493378 systemd[1]: Started session-24.scope - Session 24 of User core. May 8 08:46:31.270773 sshd[5622]: Connection closed by 172.24.4.1 port 51670 May 8 08:46:31.272493 sshd-session[5620]: pam_unix(sshd:session): session closed for user core May 8 08:46:31.279476 systemd[1]: sshd@21-172.24.4.129:22-172.24.4.1:51670.service: Deactivated successfully. May 8 08:46:31.283440 systemd[1]: session-24.scope: Deactivated successfully. May 8 08:46:31.285289 systemd-logind[1511]: Session 24 logged out. Waiting for processes to exit. May 8 08:46:31.288077 systemd-logind[1511]: Removed session 24. May 8 08:46:36.297728 systemd[1]: Started sshd@22-172.24.4.129:22-172.24.4.1:51256.service - OpenSSH per-connection server daemon (172.24.4.1:51256). 
May 8 08:46:36.719530 containerd[1535]: time="2025-05-08T08:46:36.719157277Z" level=warning msg="container event discarded" container=9d9bb516d9b493040459030c5d4d40896eaa2d18bef7d9e297ad6660beb9726e type=CONTAINER_CREATED_EVENT May 8 08:46:36.731008 containerd[1535]: time="2025-05-08T08:46:36.730843963Z" level=warning msg="container event discarded" container=9d9bb516d9b493040459030c5d4d40896eaa2d18bef7d9e297ad6660beb9726e type=CONTAINER_STARTED_EVENT May 8 08:46:36.731008 containerd[1535]: time="2025-05-08T08:46:36.730953291Z" level=warning msg="container event discarded" container=0654f7847ee51992150abc605270bd62592ed820bfac188a081eac71cb19489e type=CONTAINER_CREATED_EVENT May 8 08:46:36.731307 containerd[1535]: time="2025-05-08T08:46:36.730978498Z" level=warning msg="container event discarded" container=0654f7847ee51992150abc605270bd62592ed820bfac188a081eac71cb19489e type=CONTAINER_STARTED_EVENT May 8 08:46:36.756423 containerd[1535]: time="2025-05-08T08:46:36.756272554Z" level=warning msg="container event discarded" container=9ec8b6a358655c55a8b547723c8de72d66d9a9f3ded45da2e97477f71240100b type=CONTAINER_CREATED_EVENT May 8 08:46:36.756423 containerd[1535]: time="2025-05-08T08:46:36.756373356Z" level=warning msg="container event discarded" container=9ec8b6a358655c55a8b547723c8de72d66d9a9f3ded45da2e97477f71240100b type=CONTAINER_STARTED_EVENT May 8 08:46:36.785820 containerd[1535]: time="2025-05-08T08:46:36.785721692Z" level=warning msg="container event discarded" container=247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea type=CONTAINER_CREATED_EVENT May 8 08:46:36.797117 containerd[1535]: time="2025-05-08T08:46:36.797037034Z" level=warning msg="container event discarded" container=e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165 type=CONTAINER_CREATED_EVENT May 8 08:46:36.808019 containerd[1535]: time="2025-05-08T08:46:36.807924885Z" level=warning msg="container event discarded" container=cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428 type=CONTAINER_CREATED_EVENT May 8 08:46:36.917903 containerd[1535]: time="2025-05-08T08:46:36.917797585Z" level=warning msg="container event discarded" container=247de2e4b5e9f71e2e1014d7886f9bce34b901e47a95286e33b599cf8ef5f0ea type=CONTAINER_STARTED_EVENT May 8 08:46:36.956324 containerd[1535]: time="2025-05-08T08:46:36.956150760Z" level=warning msg="container event discarded" container=cc69cd8baa72a45bfd5c490344eaa375ced26bf863cec673bc84ce8a9cbcc428 type=CONTAINER_STARTED_EVENT May 8 08:46:36.956324 containerd[1535]: time="2025-05-08T08:46:36.956273543Z" level=warning msg="container event discarded" container=e1ccecf9a7e564775b4b86c4474c0fad580f67408835da923f7a9e99df6ce165 type=CONTAINER_STARTED_EVENT May 8 08:46:37.461541 sshd[5634]: Accepted publickey for core from 172.24.4.1 port 51256 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:37.465313 sshd-session[5634]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:37.477286 systemd-logind[1511]: New session 25 of user core. May 8 08:46:37.489335 systemd[1]: Started session-25.scope - Session 25 of User core. May 8 08:46:38.271347 sshd[5636]: Connection closed by 172.24.4.1 port 51256 May 8 08:46:38.272843 sshd-session[5634]: pam_unix(sshd:session): session closed for user core May 8 08:46:38.282361 systemd[1]: sshd@22-172.24.4.129:22-172.24.4.1:51256.service: Deactivated successfully. May 8 08:46:38.291253 systemd[1]: session-25.scope: Deactivated successfully. 
May 8 08:46:38.296059 systemd-logind[1511]: Session 25 logged out. Waiting for processes to exit. May 8 08:46:38.299541 systemd-logind[1511]: Removed session 25. May 8 08:46:41.895817 containerd[1535]: time="2025-05-08T08:46:41.895722138Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"91a874b1131793225ff0117fe3ec424e5edec83b4c0f5ffbe4ac5777618eb2ad\" pid:5660 exited_at:{seconds:1746694001 nanos:894675322}" May 8 08:46:43.303732 systemd[1]: Started sshd@23-172.24.4.129:22-172.24.4.1:51266.service - OpenSSH per-connection server daemon (172.24.4.1:51266). May 8 08:46:43.521292 containerd[1535]: time="2025-05-08T08:46:43.521254702Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"cdf02285b6c829bd8462f7c2a94a0b2abbeb50c38c24618500f78ebb47db1b01\" pid:5691 exited_at:{seconds:1746694003 nanos:520864041}" May 8 08:46:44.448406 sshd[5676]: Accepted publickey for core from 172.24.4.1 port 51266 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:44.452884 sshd-session[5676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:44.468301 systemd-logind[1511]: New session 26 of user core. May 8 08:46:44.473385 systemd[1]: Started session-26.scope - Session 26 of User core. May 8 08:46:45.149268 sshd[5700]: Connection closed by 172.24.4.1 port 51266 May 8 08:46:45.150908 sshd-session[5676]: pam_unix(sshd:session): session closed for user core May 8 08:46:45.159539 systemd[1]: sshd@23-172.24.4.129:22-172.24.4.1:51266.service: Deactivated successfully. May 8 08:46:45.166885 systemd[1]: session-26.scope: Deactivated successfully. May 8 08:46:45.173236 systemd-logind[1511]: Session 26 logged out. Waiting for processes to exit. May 8 08:46:45.176624 systemd-logind[1511]: Removed session 26. May 8 08:46:49.896298 containerd[1535]: time="2025-05-08T08:46:49.896216872Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"0494d5f9e31666994d59b0b9561226655497a5b626323b09a333abe5a146afa8\" pid:5723 exited_at:{seconds:1746694009 nanos:895703928}" May 8 08:46:50.177750 systemd[1]: Started sshd@24-172.24.4.129:22-172.24.4.1:42712.service - OpenSSH per-connection server daemon (172.24.4.1:42712). May 8 08:46:51.486740 sshd[5733]: Accepted publickey for core from 172.24.4.1 port 42712 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:51.490295 sshd-session[5733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:51.503734 systemd-logind[1511]: New session 27 of user core. May 8 08:46:51.513403 systemd[1]: Started session-27.scope - Session 27 of User core. May 8 08:46:52.271791 sshd[5736]: Connection closed by 172.24.4.1 port 42712 May 8 08:46:52.272541 sshd-session[5733]: pam_unix(sshd:session): session closed for user core May 8 08:46:52.278775 systemd[1]: sshd@24-172.24.4.129:22-172.24.4.1:42712.service: Deactivated successfully. May 8 08:46:52.279066 systemd-logind[1511]: Session 27 logged out. Waiting for processes to exit. May 8 08:46:52.281798 systemd[1]: session-27.scope: Deactivated successfully. May 8 08:46:52.284343 systemd-logind[1511]: Removed session 27. May 8 08:46:57.297616 systemd[1]: Started sshd@25-172.24.4.129:22-172.24.4.1:51920.service - OpenSSH per-connection server daemon (172.24.4.1:51920). 
May 8 08:46:57.904705 containerd[1535]: time="2025-05-08T08:46:57.904511395Z" level=warning msg="container event discarded" container=0a6dce8666649044275e88a14e7bdf2610a1322593752b7c92b2e9daddf1628b type=CONTAINER_CREATED_EVENT May 8 08:46:57.904705 containerd[1535]: time="2025-05-08T08:46:57.904644027Z" level=warning msg="container event discarded" container=0a6dce8666649044275e88a14e7bdf2610a1322593752b7c92b2e9daddf1628b type=CONTAINER_STARTED_EVENT May 8 08:46:57.944251 containerd[1535]: time="2025-05-08T08:46:57.944109487Z" level=warning msg="container event discarded" container=769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c type=CONTAINER_CREATED_EVENT May 8 08:46:58.024546 containerd[1535]: time="2025-05-08T08:46:58.024371285Z" level=warning msg="container event discarded" container=769ce901e03f5225019c6123543d69f5565238999d5755c5e026143056cbae7c type=CONTAINER_STARTED_EVENT May 8 08:46:58.290963 containerd[1535]: time="2025-05-08T08:46:58.290656890Z" level=warning msg="container event discarded" container=cb3023a8ccd79dc08236fadf977953f91bb371344adaf3203109efa951845732 type=CONTAINER_CREATED_EVENT May 8 08:46:58.290963 containerd[1535]: time="2025-05-08T08:46:58.290752642Z" level=warning msg="container event discarded" container=cb3023a8ccd79dc08236fadf977953f91bb371344adaf3203109efa951845732 type=CONTAINER_STARTED_EVENT May 8 08:46:58.442726 sshd[5748]: Accepted publickey for core from 172.24.4.1 port 51920 ssh2: RSA SHA256:A7ARL4Y05iWzdU2bVMe6EX052U/8RNKsjYLLn3yhVVk May 8 08:46:58.445773 sshd-session[5748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 8 08:46:58.459626 systemd-logind[1511]: New session 28 of user core. May 8 08:46:58.469391 systemd[1]: Started session-28.scope - Session 28 of User core. May 8 08:46:59.042457 sshd[5752]: Connection closed by 172.24.4.1 port 51920 May 8 08:46:59.043816 sshd-session[5748]: pam_unix(sshd:session): session closed for user core May 8 08:46:59.052431 systemd[1]: sshd@25-172.24.4.129:22-172.24.4.1:51920.service: Deactivated successfully. May 8 08:46:59.057668 systemd[1]: session-28.scope: Deactivated successfully. May 8 08:46:59.060694 systemd-logind[1511]: Session 28 logged out. Waiting for processes to exit. May 8 08:46:59.063665 systemd-logind[1511]: Removed session 28. 
May 8 08:47:00.796833 containerd[1535]: time="2025-05-08T08:47:00.796594235Z" level=warning msg="container event discarded" container=d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490 type=CONTAINER_CREATED_EVENT May 8 08:47:00.874266 containerd[1535]: time="2025-05-08T08:47:00.874143750Z" level=warning msg="container event discarded" container=d8f4754ddf0f8ba098ed73375865402ed3ad8d43d351733863cd6559bfe63490 type=CONTAINER_STARTED_EVENT May 8 08:47:04.759556 containerd[1535]: time="2025-05-08T08:47:04.759238378Z" level=warning msg="container event discarded" container=3ef9f70c56203e4522d049a3b382b54c3af9b2fd10eb666f50cda61e8136216c type=CONTAINER_CREATED_EVENT May 8 08:47:04.760532 containerd[1535]: time="2025-05-08T08:47:04.759787090Z" level=warning msg="container event discarded" container=3ef9f70c56203e4522d049a3b382b54c3af9b2fd10eb666f50cda61e8136216c type=CONTAINER_STARTED_EVENT May 8 08:47:04.851122 containerd[1535]: time="2025-05-08T08:47:04.850829962Z" level=warning msg="container event discarded" container=5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172 type=CONTAINER_CREATED_EVENT May 8 08:47:04.851122 containerd[1535]: time="2025-05-08T08:47:04.850951233Z" level=warning msg="container event discarded" container=5002bd4fc0fa1a87f9f5cda20442072de3020f92e26967e1b46a379d04766172 type=CONTAINER_STARTED_EVENT May 8 08:47:08.121468 containerd[1535]: time="2025-05-08T08:47:08.121351113Z" level=warning msg="container event discarded" container=fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417 type=CONTAINER_CREATED_EVENT May 8 08:47:08.210167 containerd[1535]: time="2025-05-08T08:47:08.210031671Z" level=warning msg="container event discarded" container=fa672adedf7644d7cbc00eda08e00e05d7168c5e979455576c4faca77e1f2417 type=CONTAINER_STARTED_EVENT May 8 08:47:10.330832 containerd[1535]: time="2025-05-08T08:47:10.330572554Z" level=warning msg="container event discarded" container=392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8 type=CONTAINER_CREATED_EVENT May 8 08:47:10.408056 containerd[1535]: time="2025-05-08T08:47:10.407869450Z" level=warning msg="container event discarded" container=392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8 type=CONTAINER_STARTED_EVENT May 8 08:47:11.266676 containerd[1535]: time="2025-05-08T08:47:11.266498059Z" level=warning msg="container event discarded" container=392d0b836f0c466f74d3badb22579aefb957d39a92e84c54d6f52f3a531072d8 type=CONTAINER_STOPPED_EVENT May 8 08:47:11.896618 containerd[1535]: time="2025-05-08T08:47:11.896539486Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"dd4ef77105006e52f73951d417e4da615392afc391281674d0d244db6f5acc4c\" pid:5776 exited_at:{seconds:1746694031 nanos:895339686}" May 8 08:47:17.708826 containerd[1535]: time="2025-05-08T08:47:17.707924745Z" level=warning msg="container event discarded" container=d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543 type=CONTAINER_CREATED_EVENT May 8 08:47:17.791902 containerd[1535]: time="2025-05-08T08:47:17.791771427Z" level=warning msg="container event discarded" container=d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543 type=CONTAINER_STARTED_EVENT May 8 08:47:19.897632 containerd[1535]: time="2025-05-08T08:47:19.897516828Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" 
id:\"32b475e923ff67bbe253e5c01eb6427c328cc662cdde6ca7fe9fd19306a89a43\" pid:5801 exited_at:{seconds:1746694039 nanos:895311728}" May 8 08:47:20.211550 containerd[1535]: time="2025-05-08T08:47:20.210369137Z" level=warning msg="container event discarded" container=d7df25cfdd985900128f94f81683bc388d50a910fa466953e6519c2818cb8543 type=CONTAINER_STOPPED_EVENT May 8 08:47:31.155476 containerd[1535]: time="2025-05-08T08:47:31.155319913Z" level=warning msg="container event discarded" container=f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475 type=CONTAINER_CREATED_EVENT May 8 08:47:31.289118 containerd[1535]: time="2025-05-08T08:47:31.288884627Z" level=warning msg="container event discarded" container=f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475 type=CONTAINER_STARTED_EVENT May 8 08:47:32.783050 containerd[1535]: time="2025-05-08T08:47:32.782906505Z" level=warning msg="container event discarded" container=cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd type=CONTAINER_CREATED_EVENT May 8 08:47:32.783050 containerd[1535]: time="2025-05-08T08:47:32.783046732Z" level=warning msg="container event discarded" container=cf80b194f228f6ac237fc9417f6261bfa54a89eb54a267073dd9b4439ea45fbd type=CONTAINER_STARTED_EVENT May 8 08:47:33.806956 containerd[1535]: time="2025-05-08T08:47:33.806754540Z" level=warning msg="container event discarded" container=8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a type=CONTAINER_CREATED_EVENT May 8 08:47:33.806956 containerd[1535]: time="2025-05-08T08:47:33.806893445Z" level=warning msg="container event discarded" container=8601508d49a194ca5a2e729cf6377bfc72a4d0020f7404bc757aac0cec69072a type=CONTAINER_STARTED_EVENT May 8 08:47:36.153561 containerd[1535]: time="2025-05-08T08:47:36.153293757Z" level=warning msg="container event discarded" container=00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b type=CONTAINER_CREATED_EVENT May 8 08:47:36.174088 containerd[1535]: time="2025-05-08T08:47:36.173862315Z" level=warning msg="container event discarded" container=f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a type=CONTAINER_CREATED_EVENT May 8 08:47:36.174088 containerd[1535]: time="2025-05-08T08:47:36.173964710Z" level=warning msg="container event discarded" container=f5c36d5626934e04ed0de967044580797a07f17d3417e250485e876a2c108f5a type=CONTAINER_STARTED_EVENT May 8 08:47:36.207459 containerd[1535]: time="2025-05-08T08:47:36.207347825Z" level=warning msg="container event discarded" container=ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50 type=CONTAINER_CREATED_EVENT May 8 08:47:36.207459 containerd[1535]: time="2025-05-08T08:47:36.207464557Z" level=warning msg="container event discarded" container=ecef06d09fa5f56348806d1561be492b1cf2c7f283f11b521b7ff42a95f91c50 type=CONTAINER_STARTED_EVENT May 8 08:47:36.247249 containerd[1535]: time="2025-05-08T08:47:36.247112530Z" level=warning msg="container event discarded" container=48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b type=CONTAINER_CREATED_EVENT May 8 08:47:36.296627 containerd[1535]: time="2025-05-08T08:47:36.296469550Z" level=warning msg="container event discarded" container=00b9629bffe7b0e6fb413d6028ddea5523313518fa632bed517ceae5396ed42b type=CONTAINER_STARTED_EVENT May 8 08:47:36.330959 containerd[1535]: time="2025-05-08T08:47:36.330795356Z" level=warning msg="container event discarded" container=48923a21a40c9447beb4dd53ae13394193df3ef87bd4fb977277f5413f828a3b type=CONTAINER_STARTED_EVENT May 8 
08:47:36.711730 containerd[1535]: time="2025-05-08T08:47:36.711420207Z" level=warning msg="container event discarded" container=d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e type=CONTAINER_CREATED_EVENT May 8 08:47:36.711730 containerd[1535]: time="2025-05-08T08:47:36.711664352Z" level=warning msg="container event discarded" container=d22e6b8d7a34b67ae99b6212e8ca5900c7819e37a5cbd656db5aaa7a67508b0e type=CONTAINER_STARTED_EVENT May 8 08:47:36.752151 containerd[1535]: time="2025-05-08T08:47:36.752047933Z" level=warning msg="container event discarded" container=1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8 type=CONTAINER_CREATED_EVENT May 8 08:47:36.850638 containerd[1535]: time="2025-05-08T08:47:36.850423318Z" level=warning msg="container event discarded" container=1012c281cf6e11256245e169ae539c38dd9e9ba0c5779f350dcc44a58dc047a8 type=CONTAINER_STARTED_EVENT May 8 08:47:36.912916 update_engine[1517]: I20250508 08:47:36.912555 1517 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 8 08:47:36.912916 update_engine[1517]: I20250508 08:47:36.912914 1517 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 8 08:47:36.914757 update_engine[1517]: I20250508 08:47:36.914642 1517 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 8 08:47:36.916903 update_engine[1517]: I20250508 08:47:36.916821 1517 omaha_request_params.cc:62] Current group set to developer May 8 08:47:36.919953 update_engine[1517]: I20250508 08:47:36.919809 1517 update_attempter.cc:499] Already updated boot flags. Skipping. May 8 08:47:36.919953 update_engine[1517]: I20250508 08:47:36.919867 1517 update_attempter.cc:643] Scheduling an action processor start. May 8 08:47:36.920512 update_engine[1517]: I20250508 08:47:36.919952 1517 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 8 08:47:36.920512 update_engine[1517]: I20250508 08:47:36.920243 1517 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 8 08:47:36.920512 update_engine[1517]: I20250508 08:47:36.920435 1517 omaha_request_action.cc:271] Posting an Omaha request to disabled May 8 08:47:36.920512 update_engine[1517]: I20250508 08:47:36.920464 1517 omaha_request_action.cc:272] Request: May 8 08:47:36.920512 update_engine[1517]: May 8 08:47:36.920512 update_engine[1517]: May 8 08:47:36.920512 update_engine[1517]: May 8 08:47:36.920512 update_engine[1517]: May 8 08:47:36.920512 update_engine[1517]: May 8 08:47:36.920512 update_engine[1517]: May 8 08:47:36.920512 update_engine[1517]: May 8 08:47:36.920512 update_engine[1517]: May 8 08:47:36.920512 update_engine[1517]: I20250508 08:47:36.920487 1517 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 8 08:47:36.946437 update_engine[1517]: I20250508 08:47:36.938670 1517 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 8 08:47:36.946437 update_engine[1517]: I20250508 08:47:36.941391 1517 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 8 08:47:36.951882 update_engine[1517]: E20250508 08:47:36.947340 1517 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 8 08:47:36.951882 update_engine[1517]: I20250508 08:47:36.947583 1517 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 8 08:47:36.953343 locksmithd[1545]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 8 08:47:41.627678 containerd[1535]: time="2025-05-08T08:47:41.627519494Z" level=warning msg="container event discarded" container=c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26 type=CONTAINER_CREATED_EVENT May 8 08:47:41.730121 containerd[1535]: time="2025-05-08T08:47:41.729925479Z" level=warning msg="container event discarded" container=c8a1660298c3f1346b8c48b689f977cdea908b30009daa6cea080a554ae67a26 type=CONTAINER_STARTED_EVENT May 8 08:47:41.929157 containerd[1535]: time="2025-05-08T08:47:41.928857985Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f92847dba7022d34af46136a45bf3a9bdd41fc41b5abe05fdf26d629f3452475\" id:\"0fc4f04aa4f9ef94ba21117846794e26bee18f24b95da99ab680c8be88bc05fd\" pid:5841 exited_at:{seconds:1746694061 nanos:927576208}" May 8 08:47:42.242146 containerd[1535]: time="2025-05-08T08:47:42.241976629Z" level=warning msg="container event discarded" container=1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751 type=CONTAINER_CREATED_EVENT May 8 08:47:42.402405 containerd[1535]: time="2025-05-08T08:47:42.401954025Z" level=warning msg="container event discarded" container=1fa287be1ca113701a34054b8ec878c15b46bfcfdd24d834989fe33c02f2a751 type=CONTAINER_STARTED_EVENT May 8 08:47:43.520483 containerd[1535]: time="2025-05-08T08:47:43.520429291Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"3d6b257c86b3b323673138749f4616ac602919e078a118e71d9b6ba5fc211b6e\" pid:5869 exited_at:{seconds:1746694063 nanos:520205806}" May 8 08:47:43.719914 containerd[1535]: time="2025-05-08T08:47:43.719753294Z" level=warning msg="container event discarded" container=0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e type=CONTAINER_CREATED_EVENT May 8 08:47:43.719914 containerd[1535]: time="2025-05-08T08:47:43.719857281Z" level=warning msg="container event discarded" container=0c9080abe6e7ce998512c45f4bb4357ae981fbde5b5e7313f1d0ca9e8acde12e type=CONTAINER_STARTED_EVENT May 8 08:47:45.061043 containerd[1535]: time="2025-05-08T08:47:45.060845318Z" level=warning msg="container event discarded" container=bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209 type=CONTAINER_CREATED_EVENT May 8 08:47:45.178192 containerd[1535]: time="2025-05-08T08:47:45.178043311Z" level=warning msg="container event discarded" container=bd8ec5072d15a036bb6ca1d8f31f003f178dfef3dcc578460be32b5da7151209 type=CONTAINER_STARTED_EVENT May 8 08:47:46.883944 update_engine[1517]: I20250508 08:47:46.883760 1517 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 8 08:47:46.885829 update_engine[1517]: I20250508 08:47:46.884317 1517 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 8 08:47:46.885829 update_engine[1517]: I20250508 08:47:46.884932 1517 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 8 08:47:46.891237 update_engine[1517]: E20250508 08:47:46.891124 1517 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 8 08:47:46.891397 update_engine[1517]: I20250508 08:47:46.891307 1517 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 8 08:47:48.697368 containerd[1535]: time="2025-05-08T08:47:48.697235480Z" level=warning msg="container event discarded" container=d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490 type=CONTAINER_CREATED_EVENT May 8 08:47:48.799849 containerd[1535]: time="2025-05-08T08:47:48.799661680Z" level=warning msg="container event discarded" container=d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490 type=CONTAINER_STARTED_EVENT May 8 08:47:49.884352 containerd[1535]: time="2025-05-08T08:47:49.884306683Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6eb3ed9c66ca9dd81f8240466d047e3b5dd3549a7991c171a64b35b2c2e9490\" id:\"76fb0903d256b586179fa4c30f4c85799dc4ed5a3ac96d73692b4b074df7c137\" pid:5891 exited_at:{seconds:1746694069 nanos:883665775}" May 8 08:47:56.880254 update_engine[1517]: I20250508 08:47:56.880084 1517 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 8 08:47:56.882460 update_engine[1517]: I20250508 08:47:56.881592 1517 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 8 08:47:56.882673 update_engine[1517]: I20250508 08:47:56.882495 1517 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 8 08:47:56.888268 update_engine[1517]: E20250508 08:47:56.888144 1517 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 8 08:47:56.888563 update_engine[1517]: I20250508 08:47:56.888305 1517 libcurl_http_fetcher.cc:283] No HTTP response, retry 3