Sep 9 23:16:48.941363 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 9 19:55:16 -00 2025 Sep 9 23:16:48.941422 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812 Sep 9 23:16:48.941444 kernel: BIOS-provided physical RAM map: Sep 9 23:16:48.941454 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Sep 9 23:16:48.941464 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Sep 9 23:16:48.941477 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Sep 9 23:16:48.941490 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable Sep 9 23:16:48.941501 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved Sep 9 23:16:48.941511 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Sep 9 23:16:48.941521 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Sep 9 23:16:48.941550 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 9 23:16:48.941561 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Sep 9 23:16:48.941571 kernel: NX (Execute Disable) protection: active Sep 9 23:16:48.941582 kernel: APIC: Static calls initialized Sep 9 23:16:48.941593 kernel: SMBIOS 2.8 present. Sep 9 23:16:48.941605 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014 Sep 9 23:16:48.941620 kernel: DMI: Memory slots populated: 1/1 Sep 9 23:16:48.941631 kernel: Hypervisor detected: KVM Sep 9 23:16:48.941642 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 9 23:16:48.941653 kernel: kvm-clock: using sched offset of 5641286338 cycles Sep 9 23:16:48.941664 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 9 23:16:48.941676 kernel: tsc: Detected 2799.998 MHz processor Sep 9 23:16:48.941687 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 9 23:16:48.941698 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 9 23:16:48.941709 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000 Sep 9 23:16:48.941724 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Sep 9 23:16:48.941735 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 9 23:16:48.941746 kernel: Using GB pages for direct mapping Sep 9 23:16:48.941757 kernel: ACPI: Early table checksum verification disabled Sep 9 23:16:48.941768 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS ) Sep 9 23:16:48.941779 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 23:16:48.941803 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 23:16:48.941813 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 23:16:48.941824 kernel: ACPI: FACS 0x000000007FFDFD40 000040 Sep 9 23:16:48.941838 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 23:16:48.941862 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS 
BXPC 00000001 BXPC 00000001) Sep 9 23:16:48.941872 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 23:16:48.941882 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 23:16:48.941893 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480] Sep 9 23:16:48.941903 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c] Sep 9 23:16:48.941919 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f] Sep 9 23:16:48.941933 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570] Sep 9 23:16:48.941944 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740] Sep 9 23:16:48.941955 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c] Sep 9 23:16:48.941966 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4] Sep 9 23:16:48.941976 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Sep 9 23:16:48.941999 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Sep 9 23:16:48.942010 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug Sep 9 23:16:48.942025 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00001000-0x7ffdbfff] Sep 9 23:16:48.942037 kernel: NODE_DATA(0) allocated [mem 0x7ffd4dc0-0x7ffdbfff] Sep 9 23:16:48.942048 kernel: Zone ranges: Sep 9 23:16:48.942059 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 9 23:16:48.942070 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff] Sep 9 23:16:48.942081 kernel: Normal empty Sep 9 23:16:48.942104 kernel: Device empty Sep 9 23:16:48.942116 kernel: Movable zone start for each node Sep 9 23:16:48.942127 kernel: Early memory node ranges Sep 9 23:16:48.942138 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Sep 9 23:16:48.942153 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff] Sep 9 23:16:48.942165 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff] Sep 9 23:16:48.942177 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 9 23:16:48.942188 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Sep 9 23:16:48.942199 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges Sep 9 23:16:48.942211 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 9 23:16:48.942222 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 9 23:16:48.942234 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 9 23:16:48.942692 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 9 23:16:48.942720 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 9 23:16:48.942733 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 9 23:16:48.942744 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 9 23:16:48.942756 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 9 23:16:48.942767 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 9 23:16:48.942779 kernel: TSC deadline timer available Sep 9 23:16:48.942790 kernel: CPU topo: Max. logical packages: 16 Sep 9 23:16:48.942802 kernel: CPU topo: Max. logical dies: 16 Sep 9 23:16:48.942825 kernel: CPU topo: Max. dies per package: 1 Sep 9 23:16:48.942840 kernel: CPU topo: Max. threads per core: 1 Sep 9 23:16:48.942851 kernel: CPU topo: Num. cores per package: 1 Sep 9 23:16:48.942862 kernel: CPU topo: Num. 
threads per package: 1 Sep 9 23:16:48.942885 kernel: CPU topo: Allowing 2 present CPUs plus 14 hotplug CPUs Sep 9 23:16:48.942895 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 9 23:16:48.942905 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Sep 9 23:16:48.942916 kernel: Booting paravirtualized kernel on KVM Sep 9 23:16:48.942927 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 9 23:16:48.942950 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 9 23:16:48.942965 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 9 23:16:48.942976 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 9 23:16:48.942986 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 9 23:16:48.942997 kernel: kvm-guest: PV spinlocks enabled Sep 9 23:16:48.943008 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 9 23:16:48.943020 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812 Sep 9 23:16:48.943032 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 9 23:16:48.943042 kernel: random: crng init done Sep 9 23:16:48.943057 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 9 23:16:48.943068 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Sep 9 23:16:48.943078 kernel: Fallback order for Node 0: 0 Sep 9 23:16:48.943089 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524154 Sep 9 23:16:48.943100 kernel: Policy zone: DMA32 Sep 9 23:16:48.943111 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 9 23:16:48.943122 kernel: software IO TLB: area num 16. Sep 9 23:16:48.943132 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 9 23:16:48.943143 kernel: Kernel/User page tables isolation: enabled Sep 9 23:16:48.943158 kernel: ftrace: allocating 40102 entries in 157 pages Sep 9 23:16:48.943169 kernel: ftrace: allocated 157 pages with 5 groups Sep 9 23:16:48.943179 kernel: Dynamic Preempt: voluntary Sep 9 23:16:48.943190 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 9 23:16:48.943202 kernel: rcu: RCU event tracing is enabled. Sep 9 23:16:48.943213 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 9 23:16:48.943224 kernel: Trampoline variant of Tasks RCU enabled. Sep 9 23:16:48.943235 kernel: Rude variant of Tasks RCU enabled. Sep 9 23:16:48.943245 kernel: Tracing variant of Tasks RCU enabled. Sep 9 23:16:48.943260 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 9 23:16:48.944265 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 9 23:16:48.944313 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 9 23:16:48.944340 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
Sep 9 23:16:48.944352 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 9 23:16:48.944363 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16 Sep 9 23:16:48.944374 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 9 23:16:48.944399 kernel: Console: colour VGA+ 80x25 Sep 9 23:16:48.944423 kernel: printk: legacy console [tty0] enabled Sep 9 23:16:48.944434 kernel: printk: legacy console [ttyS0] enabled Sep 9 23:16:48.944445 kernel: ACPI: Core revision 20240827 Sep 9 23:16:48.944456 kernel: APIC: Switch to symmetric I/O mode setup Sep 9 23:16:48.944470 kernel: x2apic enabled Sep 9 23:16:48.944494 kernel: APIC: Switched APIC routing to: physical x2apic Sep 9 23:16:48.944506 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Sep 9 23:16:48.944518 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998) Sep 9 23:16:48.944553 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 9 23:16:48.944571 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Sep 9 23:16:48.944584 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Sep 9 23:16:48.944596 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 9 23:16:48.944608 kernel: Spectre V2 : Mitigation: Retpolines Sep 9 23:16:48.944619 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 9 23:16:48.944631 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls Sep 9 23:16:48.944643 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 9 23:16:48.944655 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 9 23:16:48.944667 kernel: MDS: Mitigation: Clear CPU buffers Sep 9 23:16:48.944679 kernel: MMIO Stale Data: Unknown: No mitigations Sep 9 23:16:48.944690 kernel: SRBDS: Unknown: Dependent on hypervisor status Sep 9 23:16:48.944706 kernel: active return thunk: its_return_thunk Sep 9 23:16:48.944718 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 9 23:16:48.944730 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 9 23:16:48.944742 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 9 23:16:48.944754 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 9 23:16:48.944766 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 9 23:16:48.944778 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. Sep 9 23:16:48.944790 kernel: Freeing SMP alternatives memory: 32K Sep 9 23:16:48.944802 kernel: pid_max: default: 32768 minimum: 301 Sep 9 23:16:48.944814 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 9 23:16:48.944825 kernel: landlock: Up and running. Sep 9 23:16:48.944841 kernel: SELinux: Initializing. Sep 9 23:16:48.944853 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 9 23:16:48.944865 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 9 23:16:48.944878 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9) Sep 9 23:16:48.944890 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only. 
Sep 9 23:16:48.944902 kernel: signal: max sigframe size: 1776 Sep 9 23:16:48.944914 kernel: rcu: Hierarchical SRCU implementation. Sep 9 23:16:48.944927 kernel: rcu: Max phase no-delay instances is 400. Sep 9 23:16:48.944939 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 9 23:16:48.944951 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Sep 9 23:16:48.944967 kernel: smp: Bringing up secondary CPUs ... Sep 9 23:16:48.944979 kernel: smpboot: x86: Booting SMP configuration: Sep 9 23:16:48.944991 kernel: .... node #0, CPUs: #1 Sep 9 23:16:48.945003 kernel: smp: Brought up 1 node, 2 CPUs Sep 9 23:16:48.945015 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS) Sep 9 23:16:48.945028 kernel: Memory: 1895680K/2096616K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54092K init, 2876K bss, 194928K reserved, 0K cma-reserved) Sep 9 23:16:48.945040 kernel: devtmpfs: initialized Sep 9 23:16:48.945052 kernel: x86/mm: Memory block size: 128MB Sep 9 23:16:48.945065 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 9 23:16:48.945081 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 9 23:16:48.945094 kernel: pinctrl core: initialized pinctrl subsystem Sep 9 23:16:48.945106 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 9 23:16:48.945118 kernel: audit: initializing netlink subsys (disabled) Sep 9 23:16:48.945130 kernel: audit: type=2000 audit(1757459805.553:1): state=initialized audit_enabled=0 res=1 Sep 9 23:16:48.945142 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 9 23:16:48.945154 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 9 23:16:48.945165 kernel: cpuidle: using governor menu Sep 9 23:16:48.945177 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 9 23:16:48.945193 kernel: dca service started, version 1.12.1 Sep 9 23:16:48.945206 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Sep 9 23:16:48.945218 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry Sep 9 23:16:48.945230 kernel: PCI: Using configuration type 1 for base access Sep 9 23:16:48.945242 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Sep 9 23:16:48.945267 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 9 23:16:48.945280 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 9 23:16:48.945292 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 9 23:16:48.945309 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 9 23:16:48.945321 kernel: ACPI: Added _OSI(Module Device) Sep 9 23:16:48.945333 kernel: ACPI: Added _OSI(Processor Device) Sep 9 23:16:48.945346 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 9 23:16:48.945358 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 9 23:16:48.945370 kernel: ACPI: Interpreter enabled Sep 9 23:16:48.945382 kernel: ACPI: PM: (supports S0 S5) Sep 9 23:16:48.945394 kernel: ACPI: Using IOAPIC for interrupt routing Sep 9 23:16:48.945406 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 9 23:16:48.945418 kernel: PCI: Using E820 reservations for host bridge windows Sep 9 23:16:48.945435 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 9 23:16:48.945447 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 9 23:16:48.945722 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 9 23:16:48.945883 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 9 23:16:48.946038 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 9 23:16:48.946056 kernel: PCI host bridge to bus 0000:00 Sep 9 23:16:48.947306 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 9 23:16:48.947491 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 9 23:16:48.947662 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 9 23:16:48.947805 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window] Sep 9 23:16:48.947946 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Sep 9 23:16:48.948094 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window] Sep 9 23:16:48.948252 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 9 23:16:48.949474 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 9 23:16:48.949686 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 conventional PCI endpoint Sep 9 23:16:48.949847 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfa000000-0xfbffffff pref] Sep 9 23:16:48.950001 kernel: pci 0000:00:01.0: BAR 1 [mem 0xfea50000-0xfea50fff] Sep 9 23:16:48.950188 kernel: pci 0000:00:01.0: ROM [mem 0xfea40000-0xfea4ffff pref] Sep 9 23:16:48.951376 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 9 23:16:48.951591 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 23:16:48.951763 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea51000-0xfea51fff] Sep 9 23:16:48.951921 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 23:16:48.952085 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Sep 9 23:16:48.952235 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 23:16:48.952445 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 23:16:48.952629 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea52000-0xfea52fff] Sep 9 23:16:48.952786 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 23:16:48.952965 kernel: pci 0000:00:02.1: 
bridge window [mem 0xfe800000-0xfe9fffff] Sep 9 23:16:48.953142 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 23:16:48.954148 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 23:16:48.954414 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea53000-0xfea53fff] Sep 9 23:16:48.954593 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 23:16:48.954754 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Sep 9 23:16:48.954912 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 23:16:48.955111 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 23:16:48.955296 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea54000-0xfea54fff] Sep 9 23:16:48.955456 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 23:16:48.955629 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Sep 9 23:16:48.957432 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 23:16:48.957632 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 23:16:48.957796 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea55000-0xfea55fff] Sep 9 23:16:48.957969 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 23:16:48.958138 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Sep 9 23:16:48.958320 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 23:16:48.958491 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 23:16:48.958672 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea56000-0xfea56fff] Sep 9 23:16:48.958828 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 23:16:48.958982 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Sep 9 23:16:48.959144 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 23:16:48.960351 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 23:16:48.960552 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea57000-0xfea57fff] Sep 9 23:16:48.960713 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 23:16:48.960869 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Sep 9 23:16:48.961025 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 23:16:48.961197 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 23:16:48.961390 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea58000-0xfea58fff] Sep 9 23:16:48.961598 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 23:16:48.961755 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Sep 9 23:16:48.961909 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 23:16:48.962078 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 9 23:16:48.962234 kernel: pci 0000:00:03.0: BAR 0 [io 0xc0c0-0xc0df] Sep 9 23:16:48.965123 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfea59000-0xfea59fff] Sep 9 23:16:48.965332 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref] Sep 9 23:16:48.965547 kernel: pci 0000:00:03.0: ROM [mem 0xfea00000-0xfea3ffff pref] Sep 9 23:16:48.965719 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 9 23:16:48.965889 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f] Sep 9 23:16:48.966058 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfea5a000-0xfea5afff] Sep 9 23:16:48.966214 kernel: pci 0000:00:04.0: BAR 4 
[mem 0xfd004000-0xfd007fff 64bit pref] Sep 9 23:16:48.968439 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 9 23:16:48.968655 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 9 23:16:48.968836 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 9 23:16:48.969014 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc0e0-0xc0ff] Sep 9 23:16:48.969169 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea5b000-0xfea5bfff] Sep 9 23:16:48.969389 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 9 23:16:48.969564 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Sep 9 23:16:48.969746 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 9 23:16:48.969918 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfda00000-0xfda000ff 64bit] Sep 9 23:16:48.970091 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 23:16:48.970249 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Sep 9 23:16:48.970433 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 23:16:48.970645 kernel: pci_bus 0000:02: extended config space not accessible Sep 9 23:16:48.970837 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 conventional PCI endpoint Sep 9 23:16:48.971027 kernel: pci 0000:02:01.0: BAR 0 [mem 0xfd800000-0xfd80000f] Sep 9 23:16:48.971190 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 23:16:48.972430 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Sep 9 23:16:48.972634 kernel: pci 0000:03:00.0: BAR 0 [mem 0xfe800000-0xfe803fff 64bit] Sep 9 23:16:48.972798 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 23:16:48.972999 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Sep 9 23:16:48.973192 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref] Sep 9 23:16:48.973389 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 23:16:48.973561 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 23:16:48.973720 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 23:16:48.973901 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 23:16:48.974086 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 23:16:48.974265 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 23:16:48.974291 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 9 23:16:48.974304 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 9 23:16:48.975160 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 9 23:16:48.975186 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 9 23:16:48.975199 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 9 23:16:48.975211 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 9 23:16:48.975223 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 9 23:16:48.975236 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 9 23:16:48.975248 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 9 23:16:48.975267 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 9 23:16:48.975279 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 9 23:16:48.975322 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 9 23:16:48.975336 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 9 23:16:48.975348 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 9 23:16:48.975360 kernel: ACPI: PCI: 
Interrupt link GSIG configured for IRQ 22 Sep 9 23:16:48.975372 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 9 23:16:48.975384 kernel: iommu: Default domain type: Translated Sep 9 23:16:48.975396 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 9 23:16:48.975414 kernel: PCI: Using ACPI for IRQ routing Sep 9 23:16:48.975427 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 9 23:16:48.975439 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Sep 9 23:16:48.975451 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff] Sep 9 23:16:48.975633 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 9 23:16:48.975793 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 9 23:16:48.975949 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 9 23:16:48.975968 kernel: vgaarb: loaded Sep 9 23:16:48.975988 kernel: clocksource: Switched to clocksource kvm-clock Sep 9 23:16:48.976000 kernel: VFS: Disk quotas dquot_6.6.0 Sep 9 23:16:48.976013 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 9 23:16:48.976025 kernel: pnp: PnP ACPI init Sep 9 23:16:48.976194 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved Sep 9 23:16:48.976215 kernel: pnp: PnP ACPI: found 5 devices Sep 9 23:16:48.976228 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 9 23:16:48.976256 kernel: NET: Registered PF_INET protocol family Sep 9 23:16:48.976275 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 9 23:16:48.976287 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 9 23:16:48.976300 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 9 23:16:48.976313 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 9 23:16:48.976325 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 9 23:16:48.976337 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 9 23:16:48.976349 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 9 23:16:48.976361 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 9 23:16:48.976373 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 9 23:16:48.976390 kernel: NET: Registered PF_XDP protocol family Sep 9 23:16:48.976559 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000 Sep 9 23:16:48.976718 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Sep 9 23:16:48.976873 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Sep 9 23:16:48.977028 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Sep 9 23:16:48.977181 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 9 23:16:48.979337 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 9 23:16:48.979505 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 9 23:16:48.979687 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 9 23:16:48.979847 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Sep 9 23:16:48.980003 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Sep 9 23:16:48.980167 kernel: pci 0000:00:02.2: bridge 
window [io 0x3000-0x3fff]: assigned Sep 9 23:16:48.981804 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Sep 9 23:16:48.981988 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Sep 9 23:16:48.982146 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Sep 9 23:16:48.982334 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Sep 9 23:16:48.982499 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Sep 9 23:16:48.982674 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 23:16:48.982860 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Sep 9 23:16:48.983015 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 23:16:48.983168 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Sep 9 23:16:48.983363 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Sep 9 23:16:48.983535 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 23:16:48.983696 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 23:16:48.983858 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Sep 9 23:16:48.984032 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Sep 9 23:16:48.984209 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 23:16:48.984411 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 23:16:48.984580 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Sep 9 23:16:48.984735 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Sep 9 23:16:48.985200 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 23:16:48.985395 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 23:16:48.985565 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Sep 9 23:16:48.985729 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Sep 9 23:16:48.985883 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 23:16:48.986045 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 23:16:48.986199 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Sep 9 23:16:48.986383 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Sep 9 23:16:48.986553 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 23:16:48.986710 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 23:16:48.986863 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Sep 9 23:16:48.987017 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Sep 9 23:16:48.987171 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 23:16:48.987384 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 23:16:48.987551 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Sep 9 23:16:48.987716 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Sep 9 23:16:48.987870 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 23:16:48.988025 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 23:16:48.988178 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Sep 9 23:16:48.988403 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Sep 9 23:16:48.988574 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 23:16:48.988724 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 9 23:16:48.988866 kernel: pci_bus 0000:00: resource 5 [io 
0x0d00-0xffff window] Sep 9 23:16:48.989006 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 9 23:16:48.989179 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window] Sep 9 23:16:48.989336 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Sep 9 23:16:48.989478 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window] Sep 9 23:16:48.989651 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Sep 9 23:16:48.989800 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff] Sep 9 23:16:48.989946 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 23:16:48.990103 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff] Sep 9 23:16:48.990293 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff] Sep 9 23:16:48.990444 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff] Sep 9 23:16:48.990604 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 23:16:48.990768 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff] Sep 9 23:16:48.990915 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff] Sep 9 23:16:48.991061 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 23:16:48.991224 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Sep 9 23:16:48.991400 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff] Sep 9 23:16:48.991562 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 23:16:48.991719 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff] Sep 9 23:16:48.991866 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff] Sep 9 23:16:48.992011 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 23:16:48.992164 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff] Sep 9 23:16:48.992349 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff] Sep 9 23:16:48.992497 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 23:16:48.992675 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff] Sep 9 23:16:48.992823 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Sep 9 23:16:48.992969 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 23:16:48.993124 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff] Sep 9 23:16:48.993307 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Sep 9 23:16:48.993464 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 23:16:48.993484 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 9 23:16:48.993498 kernel: PCI: CLS 0 bytes, default 64 Sep 9 23:16:48.993511 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 9 23:16:48.993534 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Sep 9 23:16:48.993549 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 9 23:16:48.993562 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Sep 9 23:16:48.993575 kernel: Initialise system trusted keyrings Sep 9 23:16:48.993594 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 9 23:16:48.993607 kernel: Key type asymmetric registered Sep 9 23:16:48.993619 kernel: Asymmetric key parser 'x509' registered Sep 9 23:16:48.993632 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 9 23:16:48.993644 kernel: io scheduler 
mq-deadline registered Sep 9 23:16:48.993657 kernel: io scheduler kyber registered Sep 9 23:16:48.993670 kernel: io scheduler bfq registered Sep 9 23:16:48.993825 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 9 23:16:48.993980 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 9 23:16:48.994144 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:16:48.994329 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 9 23:16:48.994493 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 9 23:16:48.994661 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:16:48.994817 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 9 23:16:48.994972 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 9 23:16:48.995134 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:16:48.995308 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 9 23:16:48.995463 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Sep 9 23:16:48.995632 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:16:48.995789 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Sep 9 23:16:48.995943 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 9 23:16:48.996106 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:16:48.996287 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 9 23:16:48.996444 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 9 23:16:48.996612 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:16:48.996768 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 9 23:16:48.996922 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 9 23:16:48.997085 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:16:48.997255 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 9 23:16:48.997413 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Sep 9 23:16:48.997582 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:16:48.997602 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 9 23:16:48.997616 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 9 23:16:48.997635 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 9 23:16:48.997649 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 23:16:48.997662 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 9 23:16:48.997675 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 9 23:16:48.997687 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 9 23:16:48.997700 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 9 23:16:48.997713 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 9 23:16:48.997873 kernel: rtc_cmos 
00:03: RTC can wake from S4 Sep 9 23:16:48.998022 kernel: rtc_cmos 00:03: registered as rtc0 Sep 9 23:16:48.998188 kernel: rtc_cmos 00:03: setting system clock to 2025-09-09T23:16:48 UTC (1757459808) Sep 9 23:16:48.998366 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Sep 9 23:16:48.998386 kernel: intel_pstate: CPU model not supported Sep 9 23:16:48.998400 kernel: NET: Registered PF_INET6 protocol family Sep 9 23:16:48.998412 kernel: Segment Routing with IPv6 Sep 9 23:16:48.998425 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 23:16:48.998438 kernel: NET: Registered PF_PACKET protocol family Sep 9 23:16:48.998451 kernel: Key type dns_resolver registered Sep 9 23:16:48.998469 kernel: IPI shorthand broadcast: enabled Sep 9 23:16:48.998483 kernel: sched_clock: Marking stable (3328023541, 221040160)->(3672594998, -123531297) Sep 9 23:16:48.998495 kernel: registered taskstats version 1 Sep 9 23:16:48.998508 kernel: Loading compiled-in X.509 certificates Sep 9 23:16:48.998521 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 003b39862f2a560eb5545d7d88a07fc5bdfce075' Sep 9 23:16:48.998546 kernel: Demotion targets for Node 0: null Sep 9 23:16:48.998559 kernel: Key type .fscrypt registered Sep 9 23:16:48.998572 kernel: Key type fscrypt-provisioning registered Sep 9 23:16:48.998585 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 9 23:16:48.998603 kernel: ima: Allocated hash algorithm: sha1 Sep 9 23:16:48.998616 kernel: ima: No architecture policies found Sep 9 23:16:48.998629 kernel: clk: Disabling unused clocks Sep 9 23:16:48.998642 kernel: Warning: unable to open an initial console. Sep 9 23:16:48.998655 kernel: Freeing unused kernel image (initmem) memory: 54092K Sep 9 23:16:48.998667 kernel: Write protecting the kernel read-only data: 24576k Sep 9 23:16:48.998680 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 9 23:16:48.998693 kernel: Run /init as init process Sep 9 23:16:48.998705 kernel: with arguments: Sep 9 23:16:48.998722 kernel: /init Sep 9 23:16:48.998735 kernel: with environment: Sep 9 23:16:48.998747 kernel: HOME=/ Sep 9 23:16:48.998760 kernel: TERM=linux Sep 9 23:16:48.998772 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 23:16:48.998795 systemd[1]: Successfully made /usr/ read-only. Sep 9 23:16:48.998813 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 23:16:48.998828 systemd[1]: Detected virtualization kvm. Sep 9 23:16:48.998846 systemd[1]: Detected architecture x86-64. Sep 9 23:16:48.998860 systemd[1]: Running in initrd. Sep 9 23:16:48.998877 systemd[1]: No hostname configured, using default hostname. Sep 9 23:16:48.998890 systemd[1]: Hostname set to <localhost>. Sep 9 23:16:48.998907 systemd[1]: Initializing machine ID from VM UUID. Sep 9 23:16:48.998920 systemd[1]: Queued start job for default target initrd.target. Sep 9 23:16:48.998934 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 23:16:48.998947 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 23:16:48.998966 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 23:16:48.998979 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 23:16:48.998993 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 9 23:16:48.999007 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 9 23:16:48.999022 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 9 23:16:48.999036 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 9 23:16:48.999053 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 23:16:48.999067 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 23:16:48.999080 systemd[1]: Reached target paths.target - Path Units. Sep 9 23:16:48.999094 systemd[1]: Reached target slices.target - Slice Units. Sep 9 23:16:48.999107 systemd[1]: Reached target swap.target - Swaps. Sep 9 23:16:48.999120 systemd[1]: Reached target timers.target - Timer Units. Sep 9 23:16:48.999134 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 23:16:48.999147 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 23:16:48.999161 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 9 23:16:48.999178 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 9 23:16:48.999192 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 23:16:48.999205 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 23:16:48.999218 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 23:16:48.999247 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 23:16:48.999263 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 9 23:16:48.999277 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 23:16:48.999290 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 9 23:16:48.999304 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 9 23:16:48.999323 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 23:16:48.999337 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 23:16:48.999351 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 23:16:48.999364 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 23:16:48.999378 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 23:16:48.999396 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 23:16:48.999410 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 23:16:48.999424 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 23:16:48.999483 systemd-journald[229]: Collecting audit messages is disabled. Sep 9 23:16:48.999520 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Sep 9 23:16:48.999546 kernel: Bridge firewalling registered Sep 9 23:16:48.999560 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 23:16:48.999575 systemd-journald[229]: Journal started Sep 9 23:16:48.999605 systemd-journald[229]: Runtime Journal (/run/log/journal/1dd2e870ae174eb1a05ec7f4f0f11ff7) is 4.7M, max 38.2M, 33.4M free. Sep 9 23:16:48.938596 systemd-modules-load[230]: Inserted module 'overlay' Sep 9 23:16:49.051086 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 23:16:48.984169 systemd-modules-load[230]: Inserted module 'br_netfilter' Sep 9 23:16:49.052022 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 23:16:49.053478 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 23:16:49.056718 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 23:16:49.061448 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 23:16:49.063964 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 23:16:49.066843 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 23:16:49.087508 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 23:16:49.087945 systemd-tmpfiles[248]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 23:16:49.091022 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 23:16:49.099143 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 23:16:49.101767 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 23:16:49.104551 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 23:16:49.108384 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 23:16:49.135721 dracut-cmdline[270]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812 Sep 9 23:16:49.159268 systemd-resolved[268]: Positive Trust Anchors: Sep 9 23:16:49.159306 systemd-resolved[268]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 23:16:49.159349 systemd-resolved[268]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 23:16:49.163390 systemd-resolved[268]: Defaulting to hostname 'linux'. Sep 9 23:16:49.166455 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Sep 9 23:16:49.167498 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 23:16:49.252281 kernel: SCSI subsystem initialized Sep 9 23:16:49.263307 kernel: Loading iSCSI transport class v2.0-870. Sep 9 23:16:49.276268 kernel: iscsi: registered transport (tcp) Sep 9 23:16:49.301860 kernel: iscsi: registered transport (qla4xxx) Sep 9 23:16:49.301930 kernel: QLogic iSCSI HBA Driver Sep 9 23:16:49.325219 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 23:16:49.344585 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 23:16:49.346093 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 23:16:49.405260 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 23:16:49.407960 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 23:16:49.471301 kernel: raid6: sse2x4 gen() 13368 MB/s Sep 9 23:16:49.489382 kernel: raid6: sse2x2 gen() 9281 MB/s Sep 9 23:16:49.507881 kernel: raid6: sse2x1 gen() 9229 MB/s Sep 9 23:16:49.507920 kernel: raid6: using algorithm sse2x4 gen() 13368 MB/s Sep 9 23:16:49.526913 kernel: raid6: .... xor() 8023 MB/s, rmw enabled Sep 9 23:16:49.527004 kernel: raid6: using ssse3x2 recovery algorithm Sep 9 23:16:49.552295 kernel: xor: automatically using best checksumming function avx Sep 9 23:16:49.736332 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 23:16:49.744683 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 23:16:49.748423 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 23:16:49.781898 systemd-udevd[479]: Using default interface naming scheme 'v255'. Sep 9 23:16:49.792127 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 23:16:49.795032 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 23:16:49.821318 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation Sep 9 23:16:49.852521 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 23:16:49.855253 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 23:16:49.965994 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 23:16:49.971048 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 23:16:50.087261 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Sep 9 23:16:50.089828 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 23:16:50.103302 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Sep 9 23:16:50.115265 kernel: AES CTR mode by8 optimization enabled Sep 9 23:16:50.131267 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 9 23:16:50.131326 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 23:16:50.134534 kernel: GPT:17805311 != 125829119 Sep 9 23:16:50.135634 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 23:16:50.137634 kernel: GPT:17805311 != 125829119 Sep 9 23:16:50.137666 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 23:16:50.139338 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 23:16:50.152618 kernel: libata version 3.00 loaded. 
Sep 9 23:16:50.172401 kernel: ahci 0000:00:1f.2: version 3.0 Sep 9 23:16:50.174312 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 9 23:16:50.180258 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 9 23:16:50.180539 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 9 23:16:50.180733 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 9 23:16:50.189979 kernel: ACPI: bus type USB registered Sep 9 23:16:50.190040 kernel: scsi host0: ahci Sep 9 23:16:50.190330 kernel: usbcore: registered new interface driver usbfs Sep 9 23:16:50.191104 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 23:16:50.191311 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 23:16:50.193967 kernel: scsi host1: ahci Sep 9 23:16:50.198465 kernel: scsi host2: ahci Sep 9 23:16:50.198760 kernel: usbcore: registered new interface driver hub Sep 9 23:16:50.198782 kernel: usbcore: registered new device driver usb Sep 9 23:16:50.195693 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 23:16:50.200785 kernel: scsi host3: ahci Sep 9 23:16:50.203603 kernel: scsi host4: ahci Sep 9 23:16:50.202299 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 23:16:50.203686 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 23:16:50.231256 kernel: scsi host5: ahci Sep 9 23:16:50.231521 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 lpm-pol 1 Sep 9 23:16:50.231544 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 lpm-pol 1 Sep 9 23:16:50.231560 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 lpm-pol 1 Sep 9 23:16:50.231576 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 lpm-pol 1 Sep 9 23:16:50.231591 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 lpm-pol 1 Sep 9 23:16:50.231607 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 lpm-pol 1 Sep 9 23:16:50.253735 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 9 23:16:50.266721 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 9 23:16:50.328190 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 9 23:16:50.330013 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 23:16:50.366829 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 9 23:16:50.378863 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 23:16:50.380776 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 23:16:50.402701 disk-uuid[631]: Primary Header is updated. Sep 9 23:16:50.402701 disk-uuid[631]: Secondary Entries is updated. Sep 9 23:16:50.402701 disk-uuid[631]: Secondary Header is updated. 
Sep 9 23:16:50.407263 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 23:16:50.415273 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 23:16:50.531368 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 9 23:16:50.531428 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 9 23:16:50.534051 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 9 23:16:50.541255 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 9 23:16:50.541291 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 9 23:16:50.541309 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 9 23:16:50.585331 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 9 23:16:50.585624 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Sep 9 23:16:50.589269 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 9 23:16:50.592881 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 9 23:16:50.593097 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Sep 9 23:16:50.596204 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Sep 9 23:16:50.596455 kernel: hub 1-0:1.0: USB hub found Sep 9 23:16:50.600163 kernel: hub 1-0:1.0: 4 ports detected Sep 9 23:16:50.600389 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Sep 9 23:16:50.602741 kernel: hub 2-0:1.0: USB hub found Sep 9 23:16:50.602960 kernel: hub 2-0:1.0: 4 ports detected Sep 9 23:16:50.620814 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 23:16:50.622744 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 23:16:50.623548 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 23:16:50.625082 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 23:16:50.627529 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 23:16:50.653384 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 23:16:50.837273 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 9 23:16:50.977271 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 9 23:16:50.983451 kernel: usbcore: registered new interface driver usbhid Sep 9 23:16:50.983534 kernel: usbhid: USB HID core driver Sep 9 23:16:50.990648 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Sep 9 23:16:50.990691 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Sep 9 23:16:51.415979 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 23:16:51.417434 disk-uuid[632]: The operation has completed successfully. Sep 9 23:16:51.468136 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 23:16:51.469326 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 23:16:51.523550 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 23:16:51.553551 sh[657]: Success Sep 9 23:16:51.576579 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 9 23:16:51.576658 kernel: device-mapper: uevent: version 1.0.3 Sep 9 23:16:51.578516 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 23:16:51.592271 kernel: device-mapper: verity: sha256 using shash "sha256-avx" Sep 9 23:16:51.641533 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 23:16:51.644643 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 23:16:51.657024 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 23:16:51.669433 kernel: BTRFS: device fsid f72d0a81-8b28-47a3-b3ab-bf6ecd8938f0 devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (669) Sep 9 23:16:51.674462 kernel: BTRFS info (device dm-0): first mount of filesystem f72d0a81-8b28-47a3-b3ab-bf6ecd8938f0 Sep 9 23:16:51.674514 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 23:16:51.684173 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 23:16:51.684219 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 23:16:51.686995 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 23:16:51.688276 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 23:16:51.689111 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 23:16:51.690102 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 23:16:51.693430 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 23:16:51.729263 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (704) Sep 9 23:16:51.732264 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 23:16:51.735674 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 23:16:51.741127 kernel: BTRFS info (device vda6): turning on async discard Sep 9 23:16:51.741162 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 23:16:51.748290 kernel: BTRFS info (device vda6): last unmount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 23:16:51.750421 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 23:16:51.755462 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 23:16:51.832011 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 23:16:51.835395 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 23:16:51.891303 systemd-networkd[841]: lo: Link UP Sep 9 23:16:51.892523 systemd-networkd[841]: lo: Gained carrier Sep 9 23:16:51.894621 systemd-networkd[841]: Enumeration completed Sep 9 23:16:51.895129 systemd-networkd[841]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 23:16:51.895136 systemd-networkd[841]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 23:16:51.896016 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 23:16:51.897368 systemd[1]: Reached target network.target - Network. 
Sep 9 23:16:51.901738 systemd-networkd[841]: eth0: Link UP Sep 9 23:16:51.901964 systemd-networkd[841]: eth0: Gained carrier Sep 9 23:16:51.901983 systemd-networkd[841]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 23:16:51.928392 systemd-networkd[841]: eth0: DHCPv4 address 10.230.66.202/30, gateway 10.230.66.201 acquired from 10.230.66.201 Sep 9 23:16:51.971776 ignition[757]: Ignition 2.22.0 Sep 9 23:16:51.971802 ignition[757]: Stage: fetch-offline Sep 9 23:16:51.971891 ignition[757]: no configs at "/usr/lib/ignition/base.d" Sep 9 23:16:51.971908 ignition[757]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 23:16:51.976045 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 23:16:51.972090 ignition[757]: parsed url from cmdline: "" Sep 9 23:16:51.972097 ignition[757]: no config URL provided Sep 9 23:16:51.972111 ignition[757]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 23:16:51.979620 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 9 23:16:51.972125 ignition[757]: no config at "/usr/lib/ignition/user.ign" Sep 9 23:16:51.972139 ignition[757]: failed to fetch config: resource requires networking Sep 9 23:16:51.972447 ignition[757]: Ignition finished successfully Sep 9 23:16:52.030674 ignition[851]: Ignition 2.22.0 Sep 9 23:16:52.030697 ignition[851]: Stage: fetch Sep 9 23:16:52.030919 ignition[851]: no configs at "/usr/lib/ignition/base.d" Sep 9 23:16:52.030936 ignition[851]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 23:16:52.031047 ignition[851]: parsed url from cmdline: "" Sep 9 23:16:52.031053 ignition[851]: no config URL provided Sep 9 23:16:52.031062 ignition[851]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 23:16:52.031076 ignition[851]: no config at "/usr/lib/ignition/user.ign" Sep 9 23:16:52.031231 ignition[851]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Sep 9 23:16:52.031295 ignition[851]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Sep 9 23:16:52.031392 ignition[851]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Sep 9 23:16:52.048622 ignition[851]: GET result: OK Sep 9 23:16:52.048948 ignition[851]: parsing config with SHA512: a64c027f83c38009a4d227883a4b1c6900f4db1f4c45d4e0cde4e10463e24123ad556c0ba3817febe5157dae39343eeaed372ab5aa54dd379398f35f7b20f4a1 Sep 9 23:16:52.056823 unknown[851]: fetched base config from "system" Sep 9 23:16:52.056839 unknown[851]: fetched base config from "system" Sep 9 23:16:52.057203 ignition[851]: fetch: fetch complete Sep 9 23:16:52.056847 unknown[851]: fetched user config from "openstack" Sep 9 23:16:52.057211 ignition[851]: fetch: fetch passed Sep 9 23:16:52.059831 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 9 23:16:52.057287 ignition[851]: Ignition finished successfully Sep 9 23:16:52.063426 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Sep 9 23:16:52.103293 ignition[857]: Ignition 2.22.0 Sep 9 23:16:52.104259 ignition[857]: Stage: kargs Sep 9 23:16:52.104455 ignition[857]: no configs at "/usr/lib/ignition/base.d" Sep 9 23:16:52.104485 ignition[857]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 23:16:52.107059 ignition[857]: kargs: kargs passed Sep 9 23:16:52.107127 ignition[857]: Ignition finished successfully Sep 9 23:16:52.108711 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 23:16:52.111362 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 23:16:52.151028 ignition[863]: Ignition 2.22.0 Sep 9 23:16:52.151833 ignition[863]: Stage: disks Sep 9 23:16:52.152044 ignition[863]: no configs at "/usr/lib/ignition/base.d" Sep 9 23:16:52.152074 ignition[863]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 23:16:52.153067 ignition[863]: disks: disks passed Sep 9 23:16:52.154659 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 23:16:52.153148 ignition[863]: Ignition finished successfully Sep 9 23:16:52.156373 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 23:16:52.157388 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 23:16:52.158811 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 23:16:52.160384 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 23:16:52.162098 systemd[1]: Reached target basic.target - Basic System. Sep 9 23:16:52.164682 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 9 23:16:52.192971 systemd-fsck[871]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 9 23:16:52.196819 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 23:16:52.200330 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 23:16:52.323265 kernel: EXT4-fs (vda9): mounted filesystem b54acc07-9600-49db-baed-d5fd6f41a1a5 r/w with ordered data mode. Quota mode: none. Sep 9 23:16:52.323848 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 23:16:52.325090 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 23:16:52.328180 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 23:16:52.330488 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 23:16:52.333731 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 23:16:52.342401 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Sep 9 23:16:52.344190 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 23:16:52.345340 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 23:16:52.349305 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 23:16:52.354762 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (879) Sep 9 23:16:52.354818 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 23:16:52.357550 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Sep 9 23:16:52.360729 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 23:16:52.370279 kernel: BTRFS info (device vda6): turning on async discard Sep 9 23:16:52.370334 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 23:16:52.376285 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 23:16:52.428549 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:16:52.446202 initrd-setup-root[907]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 23:16:52.454949 initrd-setup-root[914]: cut: /sysroot/etc/group: No such file or directory Sep 9 23:16:52.461797 initrd-setup-root[921]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 23:16:52.467591 initrd-setup-root[928]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 23:16:52.575984 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 23:16:52.579534 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 23:16:52.582437 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 23:16:52.608272 kernel: BTRFS info (device vda6): last unmount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 23:16:52.625155 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 9 23:16:52.646163 ignition[997]: INFO : Ignition 2.22.0 Sep 9 23:16:52.646163 ignition[997]: INFO : Stage: mount Sep 9 23:16:52.648564 ignition[997]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 23:16:52.648564 ignition[997]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 23:16:52.648564 ignition[997]: INFO : mount: mount passed Sep 9 23:16:52.648564 ignition[997]: INFO : Ignition finished successfully Sep 9 23:16:52.648919 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 23:16:52.668401 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 23:16:53.147930 systemd-networkd[841]: eth0: Gained IPv6LL Sep 9 23:16:53.454323 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:16:54.654557 systemd-networkd[841]: eth0: Ignoring DHCPv6 address 2a02:1348:179:90b2:24:19ff:fee6:42ca/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:90b2:24:19ff:fee6:42ca/64 assigned by NDisc. Sep 9 23:16:54.654570 systemd-networkd[841]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Sep 9 23:16:55.464302 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:16:59.474297 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:16:59.482652 coreos-metadata[881]: Sep 09 23:16:59.482 WARN failed to locate config-drive, using the metadata service API instead Sep 9 23:16:59.505820 coreos-metadata[881]: Sep 09 23:16:59.505 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 9 23:16:59.521771 coreos-metadata[881]: Sep 09 23:16:59.521 INFO Fetch successful Sep 9 23:16:59.522574 coreos-metadata[881]: Sep 09 23:16:59.522 INFO wrote hostname srv-5qwy1.gb1.brightbox.com to /sysroot/etc/hostname Sep 9 23:16:59.524177 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Sep 9 23:16:59.524378 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Sep 9 23:16:59.528491 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 23:16:59.549012 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Sep 9 23:16:59.572311 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1013) Sep 9 23:16:59.577375 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 23:16:59.577437 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 23:16:59.583827 kernel: BTRFS info (device vda6): turning on async discard Sep 9 23:16:59.583865 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 23:16:59.586777 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 23:16:59.628579 ignition[1031]: INFO : Ignition 2.22.0 Sep 9 23:16:59.628579 ignition[1031]: INFO : Stage: files Sep 9 23:16:59.630305 ignition[1031]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 23:16:59.630305 ignition[1031]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 23:16:59.630305 ignition[1031]: DEBUG : files: compiled without relabeling support, skipping Sep 9 23:16:59.633099 ignition[1031]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 23:16:59.633099 ignition[1031]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 23:16:59.640582 ignition[1031]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 23:16:59.640582 ignition[1031]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 23:16:59.640582 ignition[1031]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 23:16:59.640582 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 9 23:16:59.640582 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 9 23:16:59.637022 unknown[1031]: wrote ssh authorized keys file for user: core Sep 9 23:16:59.814332 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 23:17:00.337105 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 9 23:17:00.348022 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 23:17:00.348022 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 9 23:17:00.348022 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 23:17:00.348022 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 23:17:00.348022 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 23:17:00.348022 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 23:17:00.348022 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 23:17:00.348022 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 23:17:00.357760 ignition[1031]: INFO : files: createFilesystemsFiles: 
createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 23:17:00.357760 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 23:17:00.357760 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 23:17:00.357760 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 23:17:00.357760 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 23:17:00.357760 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 9 23:17:00.724003 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 23:17:02.787837 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 9 23:17:02.792924 ignition[1031]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 23:17:02.794224 ignition[1031]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 23:17:02.797377 ignition[1031]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 23:17:02.797377 ignition[1031]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 23:17:02.799559 ignition[1031]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 9 23:17:02.799559 ignition[1031]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 23:17:02.799559 ignition[1031]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 9 23:17:02.799559 ignition[1031]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 23:17:02.799559 ignition[1031]: INFO : files: files passed Sep 9 23:17:02.799559 ignition[1031]: INFO : Ignition finished successfully Sep 9 23:17:02.800959 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 23:17:02.807447 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 23:17:02.810771 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 9 23:17:02.831444 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 23:17:02.831619 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
Sep 9 23:17:02.841141 initrd-setup-root-after-ignition[1061]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 23:17:02.843407 initrd-setup-root-after-ignition[1061]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 23:17:02.844440 initrd-setup-root-after-ignition[1065]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 23:17:02.845791 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 23:17:02.847094 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 23:17:02.849364 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 23:17:02.905851 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 23:17:02.906014 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 23:17:02.907648 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 23:17:02.909272 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 23:17:02.910807 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 9 23:17:02.912040 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 23:17:02.955952 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 23:17:02.958796 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 9 23:17:02.984402 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 23:17:02.986178 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 23:17:02.988052 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 23:17:02.989517 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 23:17:02.989747 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 23:17:02.991985 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 23:17:02.992832 systemd[1]: Stopped target basic.target - Basic System. Sep 9 23:17:02.994972 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 23:17:02.995811 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 23:17:02.997376 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 23:17:02.998938 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 9 23:17:03.000585 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 23:17:03.002165 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 23:17:03.003734 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 23:17:03.005420 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 9 23:17:03.006218 systemd[1]: Stopped target swap.target - Swaps. Sep 9 23:17:03.006919 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 23:17:03.007201 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 9 23:17:03.008270 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 9 23:17:03.009130 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 23:17:03.010610 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Sep 9 23:17:03.010798 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 23:17:03.012000 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 9 23:17:03.012250 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 9 23:17:03.013774 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 9 23:17:03.013943 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 23:17:03.015753 systemd[1]: ignition-files.service: Deactivated successfully. Sep 9 23:17:03.015968 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 9 23:17:03.020300 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 9 23:17:03.029570 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 9 23:17:03.032597 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 9 23:17:03.032859 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 23:17:03.033796 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 9 23:17:03.034033 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 23:17:03.045682 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 9 23:17:03.045839 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 9 23:17:03.063791 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 9 23:17:03.066666 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 9 23:17:03.067668 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 9 23:17:03.075611 ignition[1086]: INFO : Ignition 2.22.0 Sep 9 23:17:03.075611 ignition[1086]: INFO : Stage: umount Sep 9 23:17:03.077127 ignition[1086]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 23:17:03.077127 ignition[1086]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 23:17:03.079806 ignition[1086]: INFO : umount: umount passed Sep 9 23:17:03.079806 ignition[1086]: INFO : Ignition finished successfully Sep 9 23:17:03.079711 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 9 23:17:03.079892 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 9 23:17:03.081441 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 9 23:17:03.081523 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 9 23:17:03.082518 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 9 23:17:03.082590 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 9 23:17:03.083941 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 9 23:17:03.084005 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 9 23:17:03.085318 systemd[1]: Stopped target network.target - Network. Sep 9 23:17:03.086443 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 9 23:17:03.086521 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 23:17:03.087836 systemd[1]: Stopped target paths.target - Path Units. Sep 9 23:17:03.089108 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 9 23:17:03.092311 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 23:17:03.093504 systemd[1]: Stopped target slices.target - Slice Units. Sep 9 23:17:03.094976 systemd[1]: Stopped target sockets.target - Socket Units. 
Sep 9 23:17:03.096350 systemd[1]: iscsid.socket: Deactivated successfully. Sep 9 23:17:03.096421 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 23:17:03.097802 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 9 23:17:03.097860 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 23:17:03.099404 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 9 23:17:03.099481 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 9 23:17:03.100704 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 9 23:17:03.100777 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 9 23:17:03.102090 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 9 23:17:03.102185 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 9 23:17:03.103591 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 9 23:17:03.105697 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 9 23:17:03.108429 systemd-networkd[841]: eth0: DHCPv6 lease lost Sep 9 23:17:03.113596 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 9 23:17:03.113793 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 9 23:17:03.119834 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 9 23:17:03.120195 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 9 23:17:03.120495 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 9 23:17:03.122676 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 9 23:17:03.123605 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 9 23:17:03.124501 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 9 23:17:03.124574 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 9 23:17:03.126970 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 9 23:17:03.128576 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 9 23:17:03.128643 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 23:17:03.130051 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 9 23:17:03.130113 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 9 23:17:03.132907 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 9 23:17:03.132971 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 9 23:17:03.134625 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 9 23:17:03.134699 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 23:17:03.136692 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 23:17:03.138760 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 9 23:17:03.138867 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 9 23:17:03.159115 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 9 23:17:03.160597 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 23:17:03.161900 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 9 23:17:03.162039 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Sep 9 23:17:03.163763 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 9 23:17:03.163858 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 9 23:17:03.165555 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 9 23:17:03.165606 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 23:17:03.166909 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 9 23:17:03.166979 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 9 23:17:03.169083 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 9 23:17:03.169154 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 9 23:17:03.170610 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 9 23:17:03.170675 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 23:17:03.173150 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 9 23:17:03.174564 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 9 23:17:03.174628 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 23:17:03.177041 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 9 23:17:03.177110 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 23:17:03.180405 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 9 23:17:03.180469 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 23:17:03.181566 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 9 23:17:03.181628 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 23:17:03.182354 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 23:17:03.182423 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 23:17:03.185887 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 9 23:17:03.185964 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 9 23:17:03.186026 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 9 23:17:03.186090 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 23:17:03.196261 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 9 23:17:03.196447 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 9 23:17:03.198121 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 9 23:17:03.201406 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 9 23:17:03.223615 systemd[1]: Switching root. Sep 9 23:17:03.255804 systemd-journald[229]: Journal stopped Sep 9 23:17:04.707885 systemd-journald[229]: Received SIGTERM from PID 1 (systemd). 
Sep 9 23:17:04.707988 kernel: SELinux: policy capability network_peer_controls=1 Sep 9 23:17:04.708019 kernel: SELinux: policy capability open_perms=1 Sep 9 23:17:04.708039 kernel: SELinux: policy capability extended_socket_class=1 Sep 9 23:17:04.708057 kernel: SELinux: policy capability always_check_network=0 Sep 9 23:17:04.708081 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 9 23:17:04.708099 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 9 23:17:04.708116 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 9 23:17:04.708154 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 9 23:17:04.708172 kernel: SELinux: policy capability userspace_initial_context=0 Sep 9 23:17:04.708190 kernel: audit: type=1403 audit(1757459823.511:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 9 23:17:04.708209 systemd[1]: Successfully loaded SELinux policy in 73.212ms. Sep 9 23:17:04.711531 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.691ms. Sep 9 23:17:04.711563 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 23:17:04.711590 systemd[1]: Detected virtualization kvm. Sep 9 23:17:04.711611 systemd[1]: Detected architecture x86-64. Sep 9 23:17:04.711652 systemd[1]: Detected first boot. Sep 9 23:17:04.711674 systemd[1]: Hostname set to . Sep 9 23:17:04.711692 systemd[1]: Initializing machine ID from VM UUID. Sep 9 23:17:04.711711 zram_generator::config[1130]: No configuration found. Sep 9 23:17:04.711737 kernel: Guest personality initialized and is inactive Sep 9 23:17:04.711756 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 9 23:17:04.711780 kernel: Initialized host personality Sep 9 23:17:04.711799 kernel: NET: Registered PF_VSOCK protocol family Sep 9 23:17:04.711817 systemd[1]: Populated /etc with preset unit settings. Sep 9 23:17:04.711849 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 9 23:17:04.711871 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 9 23:17:04.711890 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 9 23:17:04.711909 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 9 23:17:04.711928 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 9 23:17:04.711947 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 9 23:17:04.711966 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 9 23:17:04.711985 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 9 23:17:04.712022 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 9 23:17:04.712053 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 9 23:17:04.712074 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 9 23:17:04.712100 systemd[1]: Created slice user.slice - User and Session Slice. Sep 9 23:17:04.712119 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 9 23:17:04.712137 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 23:17:04.712167 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 9 23:17:04.712188 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 9 23:17:04.712213 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 9 23:17:04.712247 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 23:17:04.712287 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 9 23:17:04.712308 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 23:17:04.712344 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 23:17:04.712365 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 9 23:17:04.712384 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 9 23:17:04.712402 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 9 23:17:04.712420 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 9 23:17:04.712438 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 23:17:04.712457 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 23:17:04.712483 systemd[1]: Reached target slices.target - Slice Units. Sep 9 23:17:04.712502 systemd[1]: Reached target swap.target - Swaps. Sep 9 23:17:04.712532 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 9 23:17:04.712552 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 9 23:17:04.712571 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 9 23:17:04.712589 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 23:17:04.712608 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 23:17:04.712626 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 23:17:04.712644 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 9 23:17:04.712663 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 9 23:17:04.712681 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 9 23:17:04.712710 systemd[1]: Mounting media.mount - External Media Directory... Sep 9 23:17:04.712734 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 23:17:04.712753 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 9 23:17:04.712771 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 9 23:17:04.712790 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 9 23:17:04.712810 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 9 23:17:04.712828 systemd[1]: Reached target machines.target - Containers. Sep 9 23:17:04.712846 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Sep 9 23:17:04.712865 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 23:17:04.712896 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 23:17:04.712916 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 9 23:17:04.716117 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 23:17:04.716145 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 23:17:04.716165 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 23:17:04.716184 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 9 23:17:04.716221 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 23:17:04.721995 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 9 23:17:04.722042 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 9 23:17:04.722065 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 9 23:17:04.722084 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 9 23:17:04.722170 systemd[1]: Stopped systemd-fsck-usr.service. Sep 9 23:17:04.722221 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 23:17:04.722280 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 23:17:04.722317 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 23:17:04.722339 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 23:17:04.722359 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 9 23:17:04.722379 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 9 23:17:04.722409 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 23:17:04.722440 systemd[1]: verity-setup.service: Deactivated successfully. Sep 9 23:17:04.722461 systemd[1]: Stopped verity-setup.service. Sep 9 23:17:04.722480 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 23:17:04.722499 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 9 23:17:04.722519 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 9 23:17:04.722538 systemd[1]: Mounted media.mount - External Media Directory. Sep 9 23:17:04.722556 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 9 23:17:04.722587 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 9 23:17:04.722607 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 9 23:17:04.722626 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 23:17:04.722651 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 9 23:17:04.722669 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. 
Sep 9 23:17:04.722687 kernel: loop: module loaded Sep 9 23:17:04.722714 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 23:17:04.722740 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 23:17:04.722761 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 23:17:04.722799 kernel: fuse: init (API version 7.41) Sep 9 23:17:04.722825 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 23:17:04.722845 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 9 23:17:04.722922 systemd-journald[1224]: Collecting audit messages is disabled. Sep 9 23:17:04.722961 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 9 23:17:04.722982 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 23:17:04.723001 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 23:17:04.723033 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 23:17:04.723054 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 23:17:04.723074 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 9 23:17:04.723093 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 23:17:04.723112 systemd-journald[1224]: Journal started Sep 9 23:17:04.723157 systemd-journald[1224]: Runtime Journal (/run/log/journal/1dd2e870ae174eb1a05ec7f4f0f11ff7) is 4.7M, max 38.2M, 33.4M free. Sep 9 23:17:04.293129 systemd[1]: Queued start job for default target multi-user.target. Sep 9 23:17:04.309563 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 9 23:17:04.310312 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 9 23:17:04.730985 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 9 23:17:04.734298 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 9 23:17:04.742504 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 9 23:17:04.742577 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 23:17:04.753500 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 9 23:17:04.753581 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 9 23:17:04.757304 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 23:17:04.757352 kernel: ACPI: bus type drm_connector registered Sep 9 23:17:04.759253 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 9 23:17:04.763314 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 23:17:04.768350 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 9 23:17:04.772264 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 23:17:04.779265 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 23:17:04.791287 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... 
Sep 9 23:17:04.802267 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 23:17:04.807269 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 23:17:04.811207 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 9 23:17:04.812328 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 23:17:04.812592 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 23:17:04.813648 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 9 23:17:04.814597 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 9 23:17:04.815490 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 9 23:17:04.820523 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 9 23:17:04.842643 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 9 23:17:04.846518 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 9 23:17:04.855752 kernel: loop0: detected capacity change from 0 to 128016 Sep 9 23:17:04.861519 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 9 23:17:04.888345 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 23:17:04.926509 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 9 23:17:04.945413 systemd-journald[1224]: Time spent on flushing to /var/log/journal/1dd2e870ae174eb1a05ec7f4f0f11ff7 is 25.516ms for 1173 entries. Sep 9 23:17:04.945413 systemd-journald[1224]: System Journal (/var/log/journal/1dd2e870ae174eb1a05ec7f4f0f11ff7) is 8M, max 584.8M, 576.8M free. Sep 9 23:17:04.999139 systemd-journald[1224]: Received client request to flush runtime journal. Sep 9 23:17:04.999213 kernel: loop1: detected capacity change from 0 to 221472 Sep 9 23:17:04.947425 systemd-tmpfiles[1250]: ACLs are not supported, ignoring. Sep 9 23:17:04.947444 systemd-tmpfiles[1250]: ACLs are not supported, ignoring. Sep 9 23:17:04.964639 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 23:17:04.966108 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 9 23:17:04.974547 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 9 23:17:05.003401 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 9 23:17:05.046275 kernel: loop2: detected capacity change from 0 to 8 Sep 9 23:17:05.064227 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 9 23:17:05.070568 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 23:17:05.077297 kernel: loop3: detected capacity change from 0 to 110984 Sep 9 23:17:05.139293 kernel: loop4: detected capacity change from 0 to 128016 Sep 9 23:17:05.141826 systemd-tmpfiles[1289]: ACLs are not supported, ignoring. Sep 9 23:17:05.142836 systemd-tmpfiles[1289]: ACLs are not supported, ignoring. Sep 9 23:17:05.162270 kernel: loop5: detected capacity change from 0 to 221472 Sep 9 23:17:05.164468 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Sep 9 23:17:05.183267 kernel: loop6: detected capacity change from 0 to 8 Sep 9 23:17:05.193216 kernel: loop7: detected capacity change from 0 to 110984 Sep 9 23:17:05.215751 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 23:17:05.217487 (sd-merge)[1292]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Sep 9 23:17:05.218491 (sd-merge)[1292]: Merged extensions into '/usr'. Sep 9 23:17:05.232408 systemd[1]: Reload requested from client PID 1249 ('systemd-sysext') (unit systemd-sysext.service)... Sep 9 23:17:05.232446 systemd[1]: Reloading... Sep 9 23:17:05.419552 zram_generator::config[1319]: No configuration found. Sep 9 23:17:05.569850 ldconfig[1245]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 23:17:05.783762 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 9 23:17:05.783964 systemd[1]: Reloading finished in 550 ms. Sep 9 23:17:05.819514 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 23:17:05.823600 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 9 23:17:05.847923 systemd[1]: Starting ensure-sysext.service... Sep 9 23:17:05.851396 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 23:17:05.876418 systemd[1]: Reload requested from client PID 1376 ('systemctl') (unit ensure-sysext.service)... Sep 9 23:17:05.876449 systemd[1]: Reloading... Sep 9 23:17:05.895128 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 9 23:17:05.895189 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 9 23:17:05.896486 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 9 23:17:05.896883 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 9 23:17:05.898405 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 9 23:17:05.898968 systemd-tmpfiles[1377]: ACLs are not supported, ignoring. Sep 9 23:17:05.899293 systemd-tmpfiles[1377]: ACLs are not supported, ignoring. Sep 9 23:17:05.907513 systemd-tmpfiles[1377]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 23:17:05.907641 systemd-tmpfiles[1377]: Skipping /boot Sep 9 23:17:05.924039 systemd-tmpfiles[1377]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 23:17:05.924220 systemd-tmpfiles[1377]: Skipping /boot Sep 9 23:17:05.975284 zram_generator::config[1404]: No configuration found. Sep 9 23:17:06.227601 systemd[1]: Reloading finished in 350 ms. Sep 9 23:17:06.250074 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 9 23:17:06.263284 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 23:17:06.273327 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 23:17:06.278526 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 9 23:17:06.286400 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 9 23:17:06.294013 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 9 23:17:06.300695 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 23:17:06.305909 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 9 23:17:06.311458 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 23:17:06.311721 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 23:17:06.319333 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 23:17:06.322785 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 23:17:06.331706 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 23:17:06.332608 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 23:17:06.332796 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 23:17:06.332955 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 23:17:06.342949 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 9 23:17:06.346456 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 23:17:06.347031 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 23:17:06.347771 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 23:17:06.347913 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 23:17:06.348042 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 23:17:06.354089 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 23:17:06.354733 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 23:17:06.358481 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 23:17:06.359363 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 23:17:06.359513 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 23:17:06.359696 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 23:17:06.363517 systemd[1]: Finished ensure-sysext.service. Sep 9 23:17:06.367537 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... 
Sep 9 23:17:06.383758 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 23:17:06.402125 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 9 23:17:06.409528 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 23:17:06.415629 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 23:17:06.416551 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 23:17:06.428204 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 23:17:06.429299 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 23:17:06.431465 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 23:17:06.432275 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 23:17:06.433866 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 23:17:06.434211 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 23:17:06.438180 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 23:17:06.438465 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 23:17:06.447258 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 23:17:06.449077 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 23:17:06.465418 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 23:17:06.469511 systemd-udevd[1467]: Using default interface naming scheme 'v255'. Sep 9 23:17:06.475033 augenrules[1504]: No rules Sep 9 23:17:06.475728 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 23:17:06.476078 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 23:17:06.495990 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 9 23:17:06.508108 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 23:17:06.517298 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 23:17:06.798886 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 9 23:17:06.799948 systemd[1]: Reached target time-set.target - System Time Set. Sep 9 23:17:06.802554 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 9 23:17:06.806367 systemd-networkd[1518]: lo: Link UP Sep 9 23:17:06.806380 systemd-networkd[1518]: lo: Gained carrier Sep 9 23:17:06.807688 systemd-networkd[1518]: Enumeration completed Sep 9 23:17:06.807803 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 23:17:06.810401 systemd-timesyncd[1480]: No network connectivity, watching for changes. Sep 9 23:17:06.811180 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 23:17:06.813539 systemd-resolved[1465]: Positive Trust Anchors: Sep 9 23:17:06.813559 systemd-resolved[1465]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 23:17:06.813600 systemd-resolved[1465]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 23:17:06.816496 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 23:17:06.825697 systemd-resolved[1465]: Using system hostname 'srv-5qwy1.gb1.brightbox.com'. Sep 9 23:17:06.830083 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 23:17:06.830977 systemd[1]: Reached target network.target - Network. Sep 9 23:17:06.831993 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 23:17:06.833340 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 23:17:06.834351 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 9 23:17:06.835553 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 23:17:06.837057 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 9 23:17:06.838622 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 23:17:06.839757 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 23:17:06.842368 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 23:17:06.843170 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 23:17:06.843218 systemd[1]: Reached target paths.target - Path Units. Sep 9 23:17:06.844312 systemd[1]: Reached target timers.target - Timer Units. Sep 9 23:17:06.847491 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 23:17:06.852161 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 23:17:06.858179 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 23:17:06.860507 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 23:17:06.862146 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 23:17:06.872491 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 23:17:06.874128 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 23:17:06.876964 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 23:17:06.882503 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 23:17:06.885301 systemd[1]: Reached target basic.target - Basic System. Sep 9 23:17:06.886019 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 23:17:06.886075 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. 
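Before claiming the hostname, systemd-resolved records the DNSSEC root trust anchor and its built-in negative trust anchors. An illustrative sketch for querying the resolver state afterwards, assuming the stock resolvectl client; the hostname is the one reported in the log:

    resolvectl status                               # per-link DNS servers, DNSSEC mode, search domains
    resolvectl query srv-5qwy1.gb1.brightbox.com    # resolve through the local stub resolver
    resolvectl statistics                           # cache hits and DNSSEC verdict counters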
Sep 9 23:17:06.889336 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 23:17:06.894424 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 9 23:17:06.899485 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 23:17:06.903406 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 23:17:06.906090 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 23:17:06.910458 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 23:17:06.912307 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 23:17:06.915028 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 9 23:17:06.919515 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 23:17:06.924924 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 23:17:06.931442 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 23:17:06.938457 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 23:17:06.946274 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:17:06.960264 jq[1555]: false Sep 9 23:17:06.960900 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 23:17:06.962783 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 23:17:06.965581 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 23:17:06.967597 google_oslogin_nss_cache[1557]: oslogin_cache_refresh[1557]: Refreshing passwd entry cache Sep 9 23:17:06.967605 oslogin_cache_refresh[1557]: Refreshing passwd entry cache Sep 9 23:17:06.970674 google_oslogin_nss_cache[1557]: oslogin_cache_refresh[1557]: Failure getting users, quitting Sep 9 23:17:06.970665 oslogin_cache_refresh[1557]: Failure getting users, quitting Sep 9 23:17:06.970832 google_oslogin_nss_cache[1557]: oslogin_cache_refresh[1557]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 23:17:06.970832 google_oslogin_nss_cache[1557]: oslogin_cache_refresh[1557]: Refreshing group entry cache Sep 9 23:17:06.970702 oslogin_cache_refresh[1557]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 23:17:06.970766 oslogin_cache_refresh[1557]: Refreshing group entry cache Sep 9 23:17:06.974911 google_oslogin_nss_cache[1557]: oslogin_cache_refresh[1557]: Failure getting groups, quitting Sep 9 23:17:06.974911 google_oslogin_nss_cache[1557]: oslogin_cache_refresh[1557]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 23:17:06.971935 oslogin_cache_refresh[1557]: Failure getting groups, quitting Sep 9 23:17:06.971949 oslogin_cache_refresh[1557]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 23:17:06.981510 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 23:17:06.983976 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 23:17:06.986708 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
Sep 9 23:17:06.988494 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 23:17:06.989591 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 23:17:06.989863 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 23:17:06.990320 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 9 23:17:06.990578 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 9 23:17:06.992757 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 23:17:06.993029 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 23:17:07.002168 extend-filesystems[1556]: Found /dev/vda6 Sep 9 23:17:07.025079 extend-filesystems[1556]: Found /dev/vda9 Sep 9 23:17:07.035707 jq[1570]: true Sep 9 23:17:07.043251 extend-filesystems[1556]: Checking size of /dev/vda9 Sep 9 23:17:07.048770 (ntainerd)[1589]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 23:17:07.081538 jq[1593]: true Sep 9 23:17:07.093256 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 23:17:07.093976 tar[1573]: linux-amd64/helm Sep 9 23:17:07.095460 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 23:17:07.101058 extend-filesystems[1556]: Resized partition /dev/vda9 Sep 9 23:17:07.104332 extend-filesystems[1603]: resize2fs 1.47.3 (8-Jul-2025) Sep 9 23:17:07.145404 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Sep 9 23:17:07.132195 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 23:17:07.131896 dbus-daemon[1552]: [system] SELinux support is enabled Sep 9 23:17:07.146362 update_engine[1565]: I20250909 23:17:07.121317 1565 main.cc:92] Flatcar Update Engine starting Sep 9 23:17:07.136138 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 23:17:07.136175 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 23:17:07.137470 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 23:17:07.137511 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 23:17:07.166266 update_engine[1565]: I20250909 23:17:07.160969 1565 update_check_scheduler.cc:74] Next update check in 3m1s Sep 9 23:17:07.161547 systemd-networkd[1518]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 23:17:07.161554 systemd-networkd[1518]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 23:17:07.163920 systemd[1]: Started update-engine.service - Update Engine. Sep 9 23:17:07.169635 systemd-networkd[1518]: eth0: Link UP Sep 9 23:17:07.170997 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 23:17:07.175186 systemd-networkd[1518]: eth0: Gained carrier Sep 9 23:17:07.175372 systemd-networkd[1518]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
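update_engine schedules its first check ("Next update check in 3m1s") and locksmithd starts with strategy="reboot". On Flatcar the reboot strategy is normally taken from /etc/flatcar/update.conf; the file and client tools below are the documented Flatcar convention rather than anything shown in this log, so treat them as assumptions:

    cat /etc/flatcar/update.conf        # e.g. REBOOT_STRATEGY=reboot, etcd-lock or off
    locksmithctl status                 # reboot-lock state coordinated by locksmithd
    update_engine_client -status        # current update_engine state and target version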
Sep 9 23:17:07.201260 kernel: mousedev: PS/2 mouse device common for all mice Sep 9 23:17:07.203860 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 23:17:07.210528 systemd-networkd[1518]: eth0: DHCPv4 address 10.230.66.202/30, gateway 10.230.66.201 acquired from 10.230.66.201 Sep 9 23:17:07.210470 dbus-daemon[1552]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1518 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 9 23:17:07.220483 systemd-timesyncd[1480]: Network configuration changed, trying to establish connection. Sep 9 23:17:07.233127 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 23:17:07.247049 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 9 23:17:07.253543 bash[1618]: Updated "/home/core/.ssh/authorized_keys" Sep 9 23:17:07.260801 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 23:17:07.268863 systemd[1]: Starting sshkeys.service... Sep 9 23:17:07.980158 systemd-timesyncd[1480]: Contacted time server 217.154.60.177:123 (1.flatcar.pool.ntp.org). Sep 9 23:17:07.980663 systemd-timesyncd[1480]: Initial clock synchronization to Tue 2025-09-09 23:17:07.979967 UTC. Sep 9 23:17:07.982067 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 9 23:17:07.982568 systemd-resolved[1465]: Clock change detected. Flushing caches. Sep 9 23:17:07.987065 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 9 23:17:07.995444 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 23:17:08.047430 systemd-logind[1563]: New seat seat0. Sep 9 23:17:08.052705 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 23:17:08.058646 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 23:17:08.067198 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:17:08.081748 locksmithd[1619]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 23:17:08.089187 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Sep 9 23:17:08.117084 extend-filesystems[1603]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 9 23:17:08.117084 extend-filesystems[1603]: old_desc_blocks = 1, new_desc_blocks = 8 Sep 9 23:17:08.117084 extend-filesystems[1603]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Sep 9 23:17:08.122535 extend-filesystems[1556]: Resized filesystem in /dev/vda9 Sep 9 23:17:08.121984 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 23:17:08.124821 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 23:17:08.163821 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 9 23:17:08.170020 kernel: ACPI: button: Power Button [PWRF] Sep 9 23:17:08.243135 systemd[1]: Started systemd-hostnamed.service - Hostname Service. 
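extend-filesystems grows the root filesystem online: resize2fs takes the mounted /dev/vda9 from 1617920 to 15121403 4k blocks. An illustrative sketch of the equivalent manual step, assuming the underlying partition has already been enlarged; the device name and block count are those in the log:

    resize2fs /dev/vda9                              # online grow of the mounted ext4 filesystem to fill the partition
    dumpe2fs -h /dev/vda9 | grep 'Block count'       # should now report 15121403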
Sep 9 23:17:08.250506 dbus-daemon[1552]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 9 23:17:08.255275 dbus-daemon[1552]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1621 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 9 23:17:08.263475 containerd[1589]: time="2025-09-09T23:17:08Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 23:17:08.265634 systemd[1]: Starting polkit.service - Authorization Manager... Sep 9 23:17:08.267440 containerd[1589]: time="2025-09-09T23:17:08.267218453Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 23:17:08.318984 containerd[1589]: time="2025-09-09T23:17:08.318927936Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="35.516µs" Sep 9 23:17:08.319210 containerd[1589]: time="2025-09-09T23:17:08.319153366Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 23:17:08.319335 containerd[1589]: time="2025-09-09T23:17:08.319311668Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 23:17:08.321057 containerd[1589]: time="2025-09-09T23:17:08.320726888Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 23:17:08.321756 containerd[1589]: time="2025-09-09T23:17:08.321295501Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 23:17:08.321756 containerd[1589]: time="2025-09-09T23:17:08.321355561Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 23:17:08.321756 containerd[1589]: time="2025-09-09T23:17:08.321468518Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 23:17:08.321756 containerd[1589]: time="2025-09-09T23:17:08.321491368Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 23:17:08.324719 containerd[1589]: time="2025-09-09T23:17:08.324236574Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 23:17:08.324719 containerd[1589]: time="2025-09-09T23:17:08.324268304Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 23:17:08.324719 containerd[1589]: time="2025-09-09T23:17:08.324286923Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 23:17:08.324719 containerd[1589]: time="2025-09-09T23:17:08.324300243Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 23:17:08.324719 containerd[1589]: time="2025-09-09T23:17:08.324422707Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 
9 23:17:08.326601 containerd[1589]: time="2025-09-09T23:17:08.325624351Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 23:17:08.326870 containerd[1589]: time="2025-09-09T23:17:08.326799255Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 23:17:08.326870 containerd[1589]: time="2025-09-09T23:17:08.326826616Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 23:17:08.327245 containerd[1589]: time="2025-09-09T23:17:08.327218747Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 23:17:08.331874 sshd_keygen[1595]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 23:17:08.332027 containerd[1589]: time="2025-09-09T23:17:08.331545281Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 23:17:08.332027 containerd[1589]: time="2025-09-09T23:17:08.331631897Z" level=info msg="metadata content store policy set" policy=shared Sep 9 23:17:08.356780 containerd[1589]: time="2025-09-09T23:17:08.356455250Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 23:17:08.356780 containerd[1589]: time="2025-09-09T23:17:08.356583140Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 23:17:08.356780 containerd[1589]: time="2025-09-09T23:17:08.356628698Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 23:17:08.356780 containerd[1589]: time="2025-09-09T23:17:08.356647858Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 23:17:08.356780 containerd[1589]: time="2025-09-09T23:17:08.356704230Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 23:17:08.356780 containerd[1589]: time="2025-09-09T23:17:08.356724590Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 23:17:08.357267 containerd[1589]: time="2025-09-09T23:17:08.357128413Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 23:17:08.357267 containerd[1589]: time="2025-09-09T23:17:08.357199021Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 23:17:08.357267 containerd[1589]: time="2025-09-09T23:17:08.357223308Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 23:17:08.359320 containerd[1589]: time="2025-09-09T23:17:08.357246887Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 23:17:08.359320 containerd[1589]: time="2025-09-09T23:17:08.358777705Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 23:17:08.359320 containerd[1589]: time="2025-09-09T23:17:08.358839256Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 23:17:08.359320 containerd[1589]: time="2025-09-09T23:17:08.359068329Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service 
type=io.containerd.service.v1 Sep 9 23:17:08.359320 containerd[1589]: time="2025-09-09T23:17:08.359156001Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 23:17:08.359320 containerd[1589]: time="2025-09-09T23:17:08.359205216Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 23:17:08.359320 containerd[1589]: time="2025-09-09T23:17:08.359233413Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 23:17:08.359832 containerd[1589]: time="2025-09-09T23:17:08.359763724Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 23:17:08.359832 containerd[1589]: time="2025-09-09T23:17:08.359793828Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 23:17:08.361033 containerd[1589]: time="2025-09-09T23:17:08.359811412Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 23:17:08.361033 containerd[1589]: time="2025-09-09T23:17:08.359956552Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 23:17:08.361033 containerd[1589]: time="2025-09-09T23:17:08.360873562Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 23:17:08.361033 containerd[1589]: time="2025-09-09T23:17:08.360900533Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 23:17:08.361033 containerd[1589]: time="2025-09-09T23:17:08.360928865Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 23:17:08.367280 containerd[1589]: time="2025-09-09T23:17:08.361439073Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 23:17:08.367280 containerd[1589]: time="2025-09-09T23:17:08.364843453Z" level=info msg="Start snapshots syncer" Sep 9 23:17:08.367280 containerd[1589]: time="2025-09-09T23:17:08.364926900Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 23:17:08.367421 containerd[1589]: time="2025-09-09T23:17:08.365412820Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 23:17:08.367421 containerd[1589]: time="2025-09-09T23:17:08.365531563Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 23:17:08.367711 containerd[1589]: time="2025-09-09T23:17:08.367259597Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 23:17:08.367947 containerd[1589]: time="2025-09-09T23:17:08.367911592Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371218430Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371253735Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371293261Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371314037Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371329699Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371362735Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371394661Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: 
time="2025-09-09T23:17:08.371413317Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371452137Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371528496Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371555302Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371569684Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371596183Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371628161Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371656995Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371672529Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371727255Z" level=info msg="runtime interface created" Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371740701Z" level=info msg="created NRI interface" Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371757639Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371791399Z" level=info msg="Connect containerd service" Sep 9 23:17:08.373850 containerd[1589]: time="2025-09-09T23:17:08.371839766Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 23:17:08.377086 containerd[1589]: time="2025-09-09T23:17:08.376665248Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 23:17:08.392916 polkitd[1644]: Started polkitd version 126 Sep 9 23:17:08.407770 polkitd[1644]: Loading rules from directory /etc/polkit-1/rules.d Sep 9 23:17:08.411282 polkitd[1644]: Loading rules from directory /run/polkit-1/rules.d Sep 9 23:17:08.411375 polkitd[1644]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 23:17:08.411758 polkitd[1644]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 9 23:17:08.411805 polkitd[1644]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 23:17:08.411863 polkitd[1644]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 9 23:17:08.415009 polkitd[1644]: Finished loading, 
compiling and executing 2 rules Sep 9 23:17:08.415903 systemd[1]: Started polkit.service - Authorization Manager. Sep 9 23:17:08.416844 dbus-daemon[1552]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 9 23:17:08.419076 polkitd[1644]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 9 23:17:08.459854 systemd-hostnamed[1621]: Hostname set to (static) Sep 9 23:17:08.464627 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 23:17:08.472052 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 23:17:08.475846 systemd[1]: Started sshd@0-10.230.66.202:22-139.178.68.195:35994.service - OpenSSH per-connection server daemon (139.178.68.195:35994). Sep 9 23:17:08.523716 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 23:17:08.524285 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 23:17:08.539335 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 9 23:17:08.540038 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 23:17:08.547216 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 9 23:17:08.578864 containerd[1589]: time="2025-09-09T23:17:08.578818108Z" level=info msg="Start subscribing containerd event" Sep 9 23:17:08.578963 containerd[1589]: time="2025-09-09T23:17:08.578876875Z" level=info msg="Start recovering state" Sep 9 23:17:08.579079 containerd[1589]: time="2025-09-09T23:17:08.579054263Z" level=info msg="Start event monitor" Sep 9 23:17:08.579185 containerd[1589]: time="2025-09-09T23:17:08.579079425Z" level=info msg="Start cni network conf syncer for default" Sep 9 23:17:08.579185 containerd[1589]: time="2025-09-09T23:17:08.579097229Z" level=info msg="Start streaming server" Sep 9 23:17:08.579185 containerd[1589]: time="2025-09-09T23:17:08.579150022Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 23:17:08.582251 containerd[1589]: time="2025-09-09T23:17:08.581933229Z" level=info msg="runtime interface starting up..." Sep 9 23:17:08.582251 containerd[1589]: time="2025-09-09T23:17:08.581967919Z" level=info msg="starting plugins..." Sep 9 23:17:08.582251 containerd[1589]: time="2025-09-09T23:17:08.582017999Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 23:17:08.583335 containerd[1589]: time="2025-09-09T23:17:08.583281152Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 23:17:08.583412 containerd[1589]: time="2025-09-09T23:17:08.583388078Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 23:17:08.583509 containerd[1589]: time="2025-09-09T23:17:08.583487271Z" level=info msg="containerd successfully booted in 0.320934s" Sep 9 23:17:08.584129 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 23:17:08.621733 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 23:17:08.627801 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 23:17:08.634523 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 9 23:17:08.635774 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 23:17:08.707484 systemd-logind[1563]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 23:17:08.868021 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
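containerd has now booted, but earlier it warned that its configuration was migrated from version 2 at startup and suggested running `containerd config migrate`. An illustrative sketch of acting on that suggestion; installing the result under /etc/containerd is an assumption about the override location, since this log only shows the shipped /usr/share/containerd/config.toml:

    containerd config migrate > /tmp/config-v3.toml                    # emit the current config in the newer schema
    diff /usr/share/containerd/config.toml /tmp/config-v3.toml || true # review what changed
    # cp /tmp/config-v3.toml /etc/containerd/config.toml && systemctl restart containerd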
Sep 9 23:17:08.878896 systemd-networkd[1518]: eth0: Gained IPv6LL Sep 9 23:17:08.887259 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 23:17:08.889042 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 23:17:08.905600 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:17:08.912999 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 23:17:09.039739 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 23:17:09.058053 systemd-logind[1563]: Watching system buttons on /dev/input/event3 (Power Button) Sep 9 23:17:09.138269 tar[1573]: linux-amd64/LICENSE Sep 9 23:17:09.138754 tar[1573]: linux-amd64/README.md Sep 9 23:17:09.181265 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 23:17:09.328258 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 23:17:09.543230 sshd[1675]: Accepted publickey for core from 139.178.68.195 port 35994 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:09.547511 sshd-session[1675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:09.558843 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 23:17:09.561882 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 23:17:09.585649 systemd-logind[1563]: New session 1 of user core. Sep 9 23:17:09.601470 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 23:17:09.606910 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 23:17:09.624745 (systemd)[1720]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 23:17:09.630797 systemd-logind[1563]: New session c1 of user core. Sep 9 23:17:09.811128 systemd[1720]: Queued start job for default target default.target. Sep 9 23:17:09.819748 systemd[1720]: Created slice app.slice - User Application Slice. Sep 9 23:17:09.819906 systemd[1720]: Reached target paths.target - Paths. Sep 9 23:17:09.820224 systemd[1720]: Reached target timers.target - Timers. Sep 9 23:17:09.823291 systemd[1720]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 23:17:09.844494 systemd[1720]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 23:17:09.844700 systemd[1720]: Reached target sockets.target - Sockets. Sep 9 23:17:09.844759 systemd[1720]: Reached target basic.target - Basic System. Sep 9 23:17:09.844837 systemd[1720]: Reached target default.target - Main User Target. Sep 9 23:17:09.844897 systemd[1720]: Startup finished in 201ms. Sep 9 23:17:09.845719 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 23:17:09.855452 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 23:17:10.085120 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 23:17:10.104989 (kubelet)[1734]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 23:17:10.113368 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:17:10.117274 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:17:10.386770 systemd-networkd[1518]: eth0: Ignoring DHCPv6 address 2a02:1348:179:90b2:24:19ff:fee6:42ca/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:90b2:24:19ff:fee6:42ca/64 assigned by NDisc. Sep 9 23:17:10.386782 systemd-networkd[1518]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Sep 9 23:17:10.489263 systemd[1]: Started sshd@1-10.230.66.202:22-139.178.68.195:43882.service - OpenSSH per-connection server daemon (139.178.68.195:43882). Sep 9 23:17:10.739890 kubelet[1734]: E0909 23:17:10.739696 1734 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 23:17:10.742850 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 23:17:10.743092 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 23:17:10.743964 systemd[1]: kubelet.service: Consumed 1.025s CPU time, 265M memory peak. Sep 9 23:17:11.390920 sshd[1744]: Accepted publickey for core from 139.178.68.195 port 43882 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:11.392800 sshd-session[1744]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:11.402222 systemd-logind[1563]: New session 2 of user core. Sep 9 23:17:11.408444 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 23:17:12.007058 sshd[1749]: Connection closed by 139.178.68.195 port 43882 Sep 9 23:17:12.007962 sshd-session[1744]: pam_unix(sshd:session): session closed for user core Sep 9 23:17:12.012706 systemd[1]: sshd@1-10.230.66.202:22-139.178.68.195:43882.service: Deactivated successfully. Sep 9 23:17:12.015549 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 23:17:12.017637 systemd-logind[1563]: Session 2 logged out. Waiting for processes to exit. Sep 9 23:17:12.020304 systemd-logind[1563]: Removed session 2. Sep 9 23:17:12.132221 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:17:12.136200 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:17:12.160396 systemd[1]: Started sshd@2-10.230.66.202:22-139.178.68.195:43894.service - OpenSSH per-connection server daemon (139.178.68.195:43894). Sep 9 23:17:13.080782 sshd[1757]: Accepted publickey for core from 139.178.68.195 port 43894 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:13.082438 sshd-session[1757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:13.089252 systemd-logind[1563]: New session 3 of user core. Sep 9 23:17:13.098487 systemd[1]: Started session-3.scope - Session 3 of User core. 
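The kubelet exits because /var/lib/kubelet/config.yaml does not exist yet, so the unit keeps crash-looping (the scheduled restarts later in the log confirm this) until the node is provisioned. An illustrative sketch, assuming kubeadm-based provisioning, which this log does not itself show:

    systemctl status kubelet --no-pager      # restart counter and last exit status
    ls /var/lib/kubelet/config.yaml          # absent until the node is initialised or joined
    # kubeadm init ...   (control plane)   or   kubeadm join <endpoint> --token ...   both write config.yaml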
Sep 9 23:17:13.706150 sshd[1760]: Connection closed by 139.178.68.195 port 43894 Sep 9 23:17:13.711326 sshd-session[1757]: pam_unix(sshd:session): session closed for user core Sep 9 23:17:13.716991 login[1692]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 23:17:13.717794 systemd[1]: sshd@2-10.230.66.202:22-139.178.68.195:43894.service: Deactivated successfully. Sep 9 23:17:13.721762 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 23:17:13.723359 systemd-logind[1563]: Session 3 logged out. Waiting for processes to exit. Sep 9 23:17:13.726595 login[1691]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 23:17:13.727240 systemd-logind[1563]: Removed session 3. Sep 9 23:17:13.733545 systemd-logind[1563]: New session 4 of user core. Sep 9 23:17:13.737449 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 23:17:13.742434 systemd-logind[1563]: New session 5 of user core. Sep 9 23:17:13.746890 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 23:17:16.149201 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:17:16.154192 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 23:17:16.160850 coreos-metadata[1551]: Sep 09 23:17:16.160 WARN failed to locate config-drive, using the metadata service API instead Sep 9 23:17:16.164486 coreos-metadata[1631]: Sep 09 23:17:16.164 WARN failed to locate config-drive, using the metadata service API instead Sep 9 23:17:16.185249 coreos-metadata[1631]: Sep 09 23:17:16.185 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Sep 9 23:17:16.185407 coreos-metadata[1551]: Sep 09 23:17:16.185 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Sep 9 23:17:16.193256 coreos-metadata[1551]: Sep 09 23:17:16.193 INFO Fetch failed with 404: resource not found Sep 9 23:17:16.193392 coreos-metadata[1551]: Sep 09 23:17:16.193 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 9 23:17:16.194211 coreos-metadata[1551]: Sep 09 23:17:16.194 INFO Fetch successful Sep 9 23:17:16.194337 coreos-metadata[1551]: Sep 09 23:17:16.194 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Sep 9 23:17:16.206364 coreos-metadata[1551]: Sep 09 23:17:16.206 INFO Fetch successful Sep 9 23:17:16.206494 coreos-metadata[1551]: Sep 09 23:17:16.206 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Sep 9 23:17:16.215563 coreos-metadata[1631]: Sep 09 23:17:16.215 INFO Fetch successful Sep 9 23:17:16.215803 coreos-metadata[1631]: Sep 09 23:17:16.215 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 9 23:17:16.221934 coreos-metadata[1551]: Sep 09 23:17:16.221 INFO Fetch successful Sep 9 23:17:16.222036 coreos-metadata[1551]: Sep 09 23:17:16.221 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Sep 9 23:17:16.239267 coreos-metadata[1551]: Sep 09 23:17:16.239 INFO Fetch successful Sep 9 23:17:16.239370 coreos-metadata[1551]: Sep 09 23:17:16.239 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Sep 9 23:17:16.248589 coreos-metadata[1631]: Sep 09 23:17:16.248 INFO Fetch successful Sep 9 23:17:16.250680 unknown[1631]: wrote ssh authorized keys file for user: core Sep 9 23:17:16.256258 coreos-metadata[1551]: Sep 09 23:17:16.256 INFO Fetch successful Sep 9 23:17:16.277215 update-ssh-keys[1796]: Updated 
"/home/core/.ssh/authorized_keys" Sep 9 23:17:16.277364 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 9 23:17:16.280097 systemd[1]: Finished sshkeys.service. Sep 9 23:17:16.288806 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 9 23:17:16.289824 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 23:17:16.290049 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 23:17:16.292268 systemd[1]: Startup finished in 3.403s (kernel) + 14.850s (initrd) + 12.224s (userspace) = 30.478s. Sep 9 23:17:20.781082 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 23:17:20.783735 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:17:20.959182 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:17:20.970712 (kubelet)[1813]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 23:17:21.086035 kubelet[1813]: E0909 23:17:21.085822 1813 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 23:17:21.090603 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 23:17:21.090824 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 23:17:21.091918 systemd[1]: kubelet.service: Consumed 215ms CPU time, 108.7M memory peak. Sep 9 23:17:23.860686 systemd[1]: Started sshd@3-10.230.66.202:22-139.178.68.195:36088.service - OpenSSH per-connection server daemon (139.178.68.195:36088). Sep 9 23:17:24.776951 sshd[1821]: Accepted publickey for core from 139.178.68.195 port 36088 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:24.777673 sshd-session[1821]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:24.784030 systemd-logind[1563]: New session 6 of user core. Sep 9 23:17:24.793391 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 23:17:25.397165 sshd[1824]: Connection closed by 139.178.68.195 port 36088 Sep 9 23:17:25.397027 sshd-session[1821]: pam_unix(sshd:session): session closed for user core Sep 9 23:17:25.401878 systemd[1]: sshd@3-10.230.66.202:22-139.178.68.195:36088.service: Deactivated successfully. Sep 9 23:17:25.404348 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 23:17:25.405716 systemd-logind[1563]: Session 6 logged out. Waiting for processes to exit. Sep 9 23:17:25.407920 systemd-logind[1563]: Removed session 6. Sep 9 23:17:25.551579 systemd[1]: Started sshd@4-10.230.66.202:22-139.178.68.195:36098.service - OpenSSH per-connection server daemon (139.178.68.195:36098). Sep 9 23:17:26.464032 sshd[1830]: Accepted publickey for core from 139.178.68.195 port 36098 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:26.465622 sshd-session[1830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:26.473072 systemd-logind[1563]: New session 7 of user core. Sep 9 23:17:26.480424 systemd[1]: Started session-7.scope - Session 7 of User core. 
Sep 9 23:17:27.080839 sshd[1833]: Connection closed by 139.178.68.195 port 36098 Sep 9 23:17:27.081657 sshd-session[1830]: pam_unix(sshd:session): session closed for user core Sep 9 23:17:27.086854 systemd[1]: sshd@4-10.230.66.202:22-139.178.68.195:36098.service: Deactivated successfully. Sep 9 23:17:27.089391 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 23:17:27.091287 systemd-logind[1563]: Session 7 logged out. Waiting for processes to exit. Sep 9 23:17:27.093045 systemd-logind[1563]: Removed session 7. Sep 9 23:17:27.248558 systemd[1]: Started sshd@5-10.230.66.202:22-139.178.68.195:36106.service - OpenSSH per-connection server daemon (139.178.68.195:36106). Sep 9 23:17:28.221064 sshd[1839]: Accepted publickey for core from 139.178.68.195 port 36106 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:28.222726 sshd-session[1839]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:28.229286 systemd-logind[1563]: New session 8 of user core. Sep 9 23:17:28.237553 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 23:17:28.884727 sshd[1842]: Connection closed by 139.178.68.195 port 36106 Sep 9 23:17:28.884488 sshd-session[1839]: pam_unix(sshd:session): session closed for user core Sep 9 23:17:28.890989 systemd[1]: sshd@5-10.230.66.202:22-139.178.68.195:36106.service: Deactivated successfully. Sep 9 23:17:28.893855 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 23:17:28.895183 systemd-logind[1563]: Session 8 logged out. Waiting for processes to exit. Sep 9 23:17:28.897506 systemd-logind[1563]: Removed session 8. Sep 9 23:17:29.041894 systemd[1]: Started sshd@6-10.230.66.202:22-139.178.68.195:36108.service - OpenSSH per-connection server daemon (139.178.68.195:36108). Sep 9 23:17:29.962712 sshd[1848]: Accepted publickey for core from 139.178.68.195 port 36108 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:29.964405 sshd-session[1848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:29.971025 systemd-logind[1563]: New session 9 of user core. Sep 9 23:17:29.982420 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 9 23:17:30.464707 sudo[1852]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 23:17:30.465129 sudo[1852]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 23:17:30.478540 sudo[1852]: pam_unix(sudo:session): session closed for user root Sep 9 23:17:30.622301 sshd[1851]: Connection closed by 139.178.68.195 port 36108 Sep 9 23:17:30.623334 sshd-session[1848]: pam_unix(sshd:session): session closed for user core Sep 9 23:17:30.628580 systemd[1]: sshd@6-10.230.66.202:22-139.178.68.195:36108.service: Deactivated successfully. Sep 9 23:17:30.630678 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 23:17:30.631852 systemd-logind[1563]: Session 9 logged out. Waiting for processes to exit. Sep 9 23:17:30.633568 systemd-logind[1563]: Removed session 9. Sep 9 23:17:30.777206 systemd[1]: Started sshd@7-10.230.66.202:22-139.178.68.195:56734.service - OpenSSH per-connection server daemon (139.178.68.195:56734). Sep 9 23:17:31.280898 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 9 23:17:31.284484 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:17:31.568481 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 23:17:31.581662 (kubelet)[1869]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 23:17:31.631762 kubelet[1869]: E0909 23:17:31.631681 1869 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 23:17:31.634102 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 23:17:31.634384 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 23:17:31.635242 systemd[1]: kubelet.service: Consumed 182ms CPU time, 108.6M memory peak. Sep 9 23:17:31.680936 sshd[1858]: Accepted publickey for core from 139.178.68.195 port 56734 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:31.682595 sshd-session[1858]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:31.691009 systemd-logind[1563]: New session 10 of user core. Sep 9 23:17:31.696381 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 23:17:32.155879 sudo[1878]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 23:17:32.156288 sudo[1878]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 23:17:32.163527 sudo[1878]: pam_unix(sudo:session): session closed for user root Sep 9 23:17:32.171038 sudo[1877]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 23:17:32.171441 sudo[1877]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 23:17:32.186302 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 23:17:32.229679 augenrules[1900]: No rules Sep 9 23:17:32.231581 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 23:17:32.232057 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 23:17:32.234899 sudo[1877]: pam_unix(sudo:session): session closed for user root Sep 9 23:17:32.377896 sshd[1876]: Connection closed by 139.178.68.195 port 56734 Sep 9 23:17:32.378978 sshd-session[1858]: pam_unix(sshd:session): session closed for user core Sep 9 23:17:32.384598 systemd[1]: sshd@7-10.230.66.202:22-139.178.68.195:56734.service: Deactivated successfully. Sep 9 23:17:32.386691 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 23:17:32.388001 systemd-logind[1563]: Session 10 logged out. Waiting for processes to exit. Sep 9 23:17:32.389708 systemd-logind[1563]: Removed session 10. Sep 9 23:17:32.535352 systemd[1]: Started sshd@8-10.230.66.202:22-139.178.68.195:56740.service - OpenSSH per-connection server daemon (139.178.68.195:56740). Sep 9 23:17:33.447917 sshd[1909]: Accepted publickey for core from 139.178.68.195 port 56740 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:17:33.449493 sshd-session[1909]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:17:33.456704 systemd-logind[1563]: New session 11 of user core. Sep 9 23:17:33.467425 systemd[1]: Started session-11.scope - Session 11 of User core. 
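The kubelet failures above (restart counters 1 and 2) all trip over the same condition: /var/lib/kubelet/config.yaml does not exist yet, so run.go:72 aborts and systemd records status=1/FAILURE; the loop repeats until something (typically kubeadm) writes that file. A minimal stand-alone sketch of the same pre-flight check follows; only the path is taken from the log, the rest is illustrative and not kubelet code:

import os
import sys

CONFIG_PATH = "/var/lib/kubelet/config.yaml"   # path named in the run.go:72 error above

def main() -> int:
    if not os.path.isfile(CONFIG_PATH):
        # Mirrors what systemd records above: Main process exited, status=1/FAILURE
        print(f"failed to load kubelet config file, path: {CONFIG_PATH}: "
              "no such file or directory", file=sys.stderr)
        return 1
    print(f"kubelet config present ({os.path.getsize(CONFIG_PATH)} bytes)")
    return 0

if __name__ == "__main__":
    sys.exit(main())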
Sep 9 23:17:33.923465 sudo[1913]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 23:17:33.923889 sudo[1913]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 23:17:34.404162 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 23:17:34.426687 (dockerd)[1930]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 23:17:34.783662 dockerd[1930]: time="2025-09-09T23:17:34.783566421Z" level=info msg="Starting up" Sep 9 23:17:34.785504 dockerd[1930]: time="2025-09-09T23:17:34.785466436Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 23:17:34.802607 dockerd[1930]: time="2025-09-09T23:17:34.802546411Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 23:17:34.821764 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport812706709-merged.mount: Deactivated successfully. Sep 9 23:17:34.832786 systemd[1]: var-lib-docker-metacopy\x2dcheck910709068-merged.mount: Deactivated successfully. Sep 9 23:17:34.857409 dockerd[1930]: time="2025-09-09T23:17:34.857124243Z" level=info msg="Loading containers: start." Sep 9 23:17:34.870218 kernel: Initializing XFRM netlink socket Sep 9 23:17:35.215077 systemd-networkd[1518]: docker0: Link UP Sep 9 23:17:35.219730 dockerd[1930]: time="2025-09-09T23:17:35.218880201Z" level=info msg="Loading containers: done." Sep 9 23:17:35.238367 dockerd[1930]: time="2025-09-09T23:17:35.238326832Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 23:17:35.238654 dockerd[1930]: time="2025-09-09T23:17:35.238625254Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 23:17:35.238874 dockerd[1930]: time="2025-09-09T23:17:35.238850750Z" level=info msg="Initializing buildkit" Sep 9 23:17:35.265430 dockerd[1930]: time="2025-09-09T23:17:35.265372640Z" level=info msg="Completed buildkit initialization" Sep 9 23:17:35.276053 dockerd[1930]: time="2025-09-09T23:17:35.275987757Z" level=info msg="Daemon has completed initialization" Sep 9 23:17:35.276398 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 23:17:35.277076 dockerd[1930]: time="2025-09-09T23:17:35.276994735Z" level=info msg="API listen on /run/docker.sock" Sep 9 23:17:35.818210 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3140194152-merged.mount: Deactivated successfully. Sep 9 23:17:36.351759 containerd[1589]: time="2025-09-09T23:17:36.351061946Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Sep 9 23:17:37.152109 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2883883338.mount: Deactivated successfully. 
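dockerd above completes initialization and reports "API listen on /run/docker.sock". A small stdlib-only sketch that dials that socket and asks the Engine API for its version; it assumes the daemon is still running and that the caller may read /run/docker.sock. GET /version is a standard Engine API endpoint; the UnixHTTPConnection helper is just illustrative glue:

import http.client
import json
import socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """http.client connection that dials a unix socket instead of TCP."""
    def __init__(self, path: str):
        super().__init__("localhost")
        self._path = path

    def connect(self) -> None:
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(self._path)

conn = UnixHTTPConnection("/run/docker.sock")   # socket named in the log above
conn.request("GET", "/version")
info = json.loads(conn.getresponse().read())
print(info["Version"], info["ApiVersion"])      # e.g. 28.0.4, as logged above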
Sep 9 23:17:39.037204 containerd[1589]: time="2025-09-09T23:17:39.036245354Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:39.037845 containerd[1589]: time="2025-09-09T23:17:39.037816452Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079639" Sep 9 23:17:39.039502 containerd[1589]: time="2025-09-09T23:17:39.039468322Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:39.044089 containerd[1589]: time="2025-09-09T23:17:39.044052445Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:39.046537 containerd[1589]: time="2025-09-09T23:17:39.046502881Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 2.695353221s" Sep 9 23:17:39.046685 containerd[1589]: time="2025-09-09T23:17:39.046657909Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\"" Sep 9 23:17:39.047751 containerd[1589]: time="2025-09-09T23:17:39.047596163Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 9 23:17:40.410274 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Sep 9 23:17:41.781155 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 9 23:17:41.785098 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
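The "Pulled image ... in ..." entries pair an image size with the wall-clock pull time, so a rough transfer-rate estimate falls out directly; the same arithmetic applies to the controller-manager, scheduler, proxy, coredns, pause and etcd pulls that follow. The values below are copied from the kube-apiserver line above:

size_bytes = 28_076_431        # size "28076431" from the Pulled image entry above
duration_s = 2.695353221       # "in 2.695353221s" from the same entry

rate = size_bytes / duration_s
print(f"~{rate / (1024 * 1024):.1f} MiB/s effective pull rate "
      f"for kube-apiserver:v1.31.12")   # roughly 9.9 MiB/s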
Sep 9 23:17:41.794421 containerd[1589]: time="2025-09-09T23:17:41.794365874Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:41.795949 containerd[1589]: time="2025-09-09T23:17:41.795885529Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714689" Sep 9 23:17:41.797086 containerd[1589]: time="2025-09-09T23:17:41.797015719Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:41.801933 containerd[1589]: time="2025-09-09T23:17:41.801897192Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:41.804530 containerd[1589]: time="2025-09-09T23:17:41.804480308Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 2.756509201s" Sep 9 23:17:41.804611 containerd[1589]: time="2025-09-09T23:17:41.804548123Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\"" Sep 9 23:17:41.805312 containerd[1589]: time="2025-09-09T23:17:41.805101348Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 9 23:17:41.962188 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:17:41.973705 (kubelet)[2213]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 23:17:42.066598 kubelet[2213]: E0909 23:17:42.066374 2213 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 23:17:42.069910 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 23:17:42.070182 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 23:17:42.071517 systemd[1]: kubelet.service: Consumed 206ms CPU time, 107.6M memory peak. 
Sep 9 23:17:43.888512 containerd[1589]: time="2025-09-09T23:17:43.888427901Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:43.890079 containerd[1589]: time="2025-09-09T23:17:43.889826861Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782435" Sep 9 23:17:43.890846 containerd[1589]: time="2025-09-09T23:17:43.890811113Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:43.894442 containerd[1589]: time="2025-09-09T23:17:43.894410847Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:43.896342 containerd[1589]: time="2025-09-09T23:17:43.896279316Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 2.091131925s" Sep 9 23:17:43.896517 containerd[1589]: time="2025-09-09T23:17:43.896318165Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\"" Sep 9 23:17:43.897475 containerd[1589]: time="2025-09-09T23:17:43.897437936Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 9 23:17:46.067388 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount892339249.mount: Deactivated successfully. 
Sep 9 23:17:46.707199 containerd[1589]: time="2025-09-09T23:17:46.706356784Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:46.708779 containerd[1589]: time="2025-09-09T23:17:46.708709481Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384263" Sep 9 23:17:46.709439 containerd[1589]: time="2025-09-09T23:17:46.709408234Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:46.712541 containerd[1589]: time="2025-09-09T23:17:46.712511110Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:46.714008 containerd[1589]: time="2025-09-09T23:17:46.713973650Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 2.816389226s" Sep 9 23:17:46.714392 containerd[1589]: time="2025-09-09T23:17:46.714365288Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\"" Sep 9 23:17:46.715094 containerd[1589]: time="2025-09-09T23:17:46.715065396Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 23:17:47.325037 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1363479184.mount: Deactivated successfully. 
Sep 9 23:17:48.685497 containerd[1589]: time="2025-09-09T23:17:48.685348004Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:48.687590 containerd[1589]: time="2025-09-09T23:17:48.687551022Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Sep 9 23:17:48.705086 containerd[1589]: time="2025-09-09T23:17:48.705041589Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:48.708935 containerd[1589]: time="2025-09-09T23:17:48.708872092Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:48.710125 containerd[1589]: time="2025-09-09T23:17:48.710065961Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.994818627s" Sep 9 23:17:48.710125 containerd[1589]: time="2025-09-09T23:17:48.710115902Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 9 23:17:48.711189 containerd[1589]: time="2025-09-09T23:17:48.710907772Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 23:17:49.319728 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount409294712.mount: Deactivated successfully. 
Sep 9 23:17:49.326030 containerd[1589]: time="2025-09-09T23:17:49.325866060Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 23:17:49.327497 containerd[1589]: time="2025-09-09T23:17:49.327432366Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Sep 9 23:17:49.329189 containerd[1589]: time="2025-09-09T23:17:49.328548363Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 23:17:49.331460 containerd[1589]: time="2025-09-09T23:17:49.331429088Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 23:17:49.333512 containerd[1589]: time="2025-09-09T23:17:49.333476356Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 622.531129ms" Sep 9 23:17:49.333599 containerd[1589]: time="2025-09-09T23:17:49.333516953Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 9 23:17:49.334133 containerd[1589]: time="2025-09-09T23:17:49.334102558Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 9 23:17:49.905336 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount775851588.mount: Deactivated successfully. Sep 9 23:17:52.282698 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 9 23:17:52.289423 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:17:52.722442 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:17:52.741904 (kubelet)[2348]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 23:17:52.827518 update_engine[1565]: I20250909 23:17:52.826351 1565 update_attempter.cc:509] Updating boot flags... Sep 9 23:17:52.866327 kubelet[2348]: E0909 23:17:52.866267 2348 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 23:17:52.881245 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 23:17:52.881498 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 23:17:52.881988 systemd[1]: kubelet.service: Consumed 240ms CPU time, 108.9M memory peak. 
Sep 9 23:17:54.602795 containerd[1589]: time="2025-09-09T23:17:54.602663350Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:54.604601 containerd[1589]: time="2025-09-09T23:17:54.604233510Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910717" Sep 9 23:17:54.605387 containerd[1589]: time="2025-09-09T23:17:54.605349448Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:54.610251 containerd[1589]: time="2025-09-09T23:17:54.610215937Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:17:54.611599 containerd[1589]: time="2025-09-09T23:17:54.611565476Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 5.277422982s" Sep 9 23:17:54.611788 containerd[1589]: time="2025-09-09T23:17:54.611760021Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 9 23:17:59.364993 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:17:59.366159 systemd[1]: kubelet.service: Consumed 240ms CPU time, 108.9M memory peak. Sep 9 23:17:59.370066 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:17:59.409487 systemd[1]: Reload requested from client PID 2402 ('systemctl') (unit session-11.scope)... Sep 9 23:17:59.409731 systemd[1]: Reloading... Sep 9 23:17:59.542265 zram_generator::config[2443]: No configuration found. Sep 9 23:17:59.915525 systemd[1]: Reloading finished in 504 ms. Sep 9 23:18:00.005799 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 9 23:18:00.005956 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 9 23:18:00.006760 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:18:00.006941 systemd[1]: kubelet.service: Consumed 129ms CPU time, 98M memory peak. Sep 9 23:18:00.009445 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:18:00.188351 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:18:00.205940 (kubelet)[2514]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 23:18:00.262960 kubelet[2514]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:18:00.262960 kubelet[2514]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 23:18:00.262960 kubelet[2514]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:18:00.262960 kubelet[2514]: I0909 23:18:00.262766 2514 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 23:18:00.591456 kubelet[2514]: I0909 23:18:00.591309 2514 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 23:18:00.591456 kubelet[2514]: I0909 23:18:00.591357 2514 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 23:18:00.591753 kubelet[2514]: I0909 23:18:00.591716 2514 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 23:18:00.624190 kubelet[2514]: I0909 23:18:00.623344 2514 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 23:18:00.628324 kubelet[2514]: E0909 23:18:00.628101 2514 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.66.202:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:00.637585 kubelet[2514]: I0909 23:18:00.637556 2514 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 23:18:00.646401 kubelet[2514]: I0909 23:18:00.646374 2514 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 9 23:18:00.647732 kubelet[2514]: I0909 23:18:00.647708 2514 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 23:18:00.648137 kubelet[2514]: I0909 23:18:00.648092 2514 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 23:18:00.648525 kubelet[2514]: I0909 23:18:00.648259 2514 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"srv-5qwy1.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 23:18:00.648861 kubelet[2514]: I0909 23:18:00.648831 2514 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 23:18:00.649192 kubelet[2514]: I0909 23:18:00.648982 2514 container_manager_linux.go:300] "Creating device plugin manager" Sep 9 23:18:00.649365 kubelet[2514]: I0909 23:18:00.649345 2514 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:18:00.653238 kubelet[2514]: I0909 23:18:00.653215 2514 kubelet.go:408] "Attempting to sync node with API server" Sep 9 23:18:00.653402 kubelet[2514]: I0909 23:18:00.653365 2514 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 23:18:00.653562 kubelet[2514]: I0909 23:18:00.653544 2514 kubelet.go:314] "Adding apiserver pod source" Sep 9 23:18:00.653704 kubelet[2514]: I0909 23:18:00.653685 2514 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 23:18:00.660322 kubelet[2514]: I0909 23:18:00.659635 2514 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 23:18:00.662789 kubelet[2514]: I0909 23:18:00.662761 2514 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 23:18:00.662991 kubelet[2514]: W0909 23:18:00.662892 2514 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Sep 9 23:18:00.663767 kubelet[2514]: I0909 23:18:00.663743 2514 server.go:1274] "Started kubelet" Sep 9 23:18:00.664019 kubelet[2514]: W0909 23:18:00.663948 2514 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.66.202:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-5qwy1.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.66.202:6443: connect: connection refused Sep 9 23:18:00.664092 kubelet[2514]: E0909 23:18:00.664024 2514 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.66.202:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-5qwy1.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:00.665014 kubelet[2514]: W0909 23:18:00.664954 2514 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.66.202:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.66.202:6443: connect: connection refused Sep 9 23:18:00.665220 kubelet[2514]: E0909 23:18:00.665121 2514 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.66.202:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:00.665420 kubelet[2514]: I0909 23:18:00.665206 2514 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 23:18:00.666761 kubelet[2514]: I0909 23:18:00.666691 2514 server.go:449] "Adding debug handlers to kubelet server" Sep 9 23:18:00.670144 kubelet[2514]: I0909 23:18:00.669585 2514 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 23:18:00.670144 kubelet[2514]: I0909 23:18:00.669961 2514 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 23:18:00.672647 kubelet[2514]: I0909 23:18:00.672529 2514 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 23:18:00.674782 kubelet[2514]: E0909 23:18:00.671364 2514 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.66.202:6443/api/v1/namespaces/default/events\": dial tcp 10.230.66.202:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-5qwy1.gb1.brightbox.com.1863c07335f60e10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-5qwy1.gb1.brightbox.com,UID:srv-5qwy1.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-5qwy1.gb1.brightbox.com,},FirstTimestamp:2025-09-09 23:18:00.663715344 +0000 UTC m=+0.453025998,LastTimestamp:2025-09-09 23:18:00.663715344 +0000 UTC m=+0.453025998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-5qwy1.gb1.brightbox.com,}" Sep 9 23:18:00.678329 kubelet[2514]: I0909 23:18:00.677615 2514 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 23:18:00.681574 kubelet[2514]: I0909 
23:18:00.681547 2514 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 9 23:18:00.683060 kubelet[2514]: I0909 23:18:00.683040 2514 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 9 23:18:00.683266 kubelet[2514]: I0909 23:18:00.683247 2514 reconciler.go:26] "Reconciler: start to sync state" Sep 9 23:18:00.683989 kubelet[2514]: E0909 23:18:00.683952 2514 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-5qwy1.gb1.brightbox.com\" not found" Sep 9 23:18:00.684697 kubelet[2514]: E0909 23:18:00.684663 2514 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.66.202:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-5qwy1.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.66.202:6443: connect: connection refused" interval="200ms" Sep 9 23:18:00.684945 kubelet[2514]: W0909 23:18:00.684888 2514 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.66.202:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.66.202:6443: connect: connection refused Sep 9 23:18:00.685122 kubelet[2514]: E0909 23:18:00.685044 2514 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.66.202:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:00.690220 kubelet[2514]: I0909 23:18:00.690196 2514 factory.go:221] Registration of the systemd container factory successfully Sep 9 23:18:00.690440 kubelet[2514]: I0909 23:18:00.690414 2514 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 23:18:00.692440 kubelet[2514]: E0909 23:18:00.692418 2514 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 23:18:00.693172 kubelet[2514]: I0909 23:18:00.693136 2514 factory.go:221] Registration of the containerd container factory successfully Sep 9 23:18:00.718602 kubelet[2514]: I0909 23:18:00.718525 2514 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 23:18:00.719953 kubelet[2514]: I0909 23:18:00.719915 2514 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 23:18:00.720012 kubelet[2514]: I0909 23:18:00.719967 2514 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 9 23:18:00.720012 kubelet[2514]: I0909 23:18:00.720009 2514 kubelet.go:2321] "Starting kubelet main sync loop" Sep 9 23:18:00.720121 kubelet[2514]: E0909 23:18:00.720075 2514 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 23:18:00.730322 kubelet[2514]: W0909 23:18:00.730261 2514 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.66.202:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.66.202:6443: connect: connection refused Sep 9 23:18:00.730432 kubelet[2514]: E0909 23:18:00.730330 2514 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.66.202:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:00.745064 kubelet[2514]: I0909 23:18:00.745037 2514 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 9 23:18:00.745340 kubelet[2514]: I0909 23:18:00.745231 2514 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 9 23:18:00.745532 kubelet[2514]: I0909 23:18:00.745515 2514 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:18:00.747272 kubelet[2514]: I0909 23:18:00.747238 2514 policy_none.go:49] "None policy: Start" Sep 9 23:18:00.748502 kubelet[2514]: I0909 23:18:00.748064 2514 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 9 23:18:00.748502 kubelet[2514]: I0909 23:18:00.748101 2514 state_mem.go:35] "Initializing new in-memory state store" Sep 9 23:18:00.758187 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 23:18:00.771680 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 23:18:00.776835 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 23:18:00.784466 kubelet[2514]: E0909 23:18:00.784436 2514 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-5qwy1.gb1.brightbox.com\" not found" Sep 9 23:18:00.794199 kubelet[2514]: I0909 23:18:00.793695 2514 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 23:18:00.794199 kubelet[2514]: I0909 23:18:00.793977 2514 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 23:18:00.794199 kubelet[2514]: I0909 23:18:00.794002 2514 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 23:18:00.794744 kubelet[2514]: I0909 23:18:00.794654 2514 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 23:18:00.798956 kubelet[2514]: E0909 23:18:00.798932 2514 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-5qwy1.gb1.brightbox.com\" not found" Sep 9 23:18:00.838082 systemd[1]: Created slice kubepods-burstable-poda7dfa7df0342970545a36db935509be0.slice - libcontainer container kubepods-burstable-poda7dfa7df0342970545a36db935509be0.slice. 
Sep 9 23:18:00.850988 systemd[1]: Created slice kubepods-burstable-pode9eff9b5235007e6f5861f19975e1375.slice - libcontainer container kubepods-burstable-pode9eff9b5235007e6f5861f19975e1375.slice. Sep 9 23:18:00.867980 systemd[1]: Created slice kubepods-burstable-podda6f294bb597b2f31363d3dae2e154e5.slice - libcontainer container kubepods-burstable-podda6f294bb597b2f31363d3dae2e154e5.slice. Sep 9 23:18:00.885245 kubelet[2514]: I0909 23:18:00.884795 2514 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-ca-certs\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.885245 kubelet[2514]: I0909 23:18:00.884898 2514 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/da6f294bb597b2f31363d3dae2e154e5-kubeconfig\") pod \"kube-scheduler-srv-5qwy1.gb1.brightbox.com\" (UID: \"da6f294bb597b2f31363d3dae2e154e5\") " pod="kube-system/kube-scheduler-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.885245 kubelet[2514]: I0909 23:18:00.884934 2514 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a7dfa7df0342970545a36db935509be0-usr-share-ca-certificates\") pod \"kube-apiserver-srv-5qwy1.gb1.brightbox.com\" (UID: \"a7dfa7df0342970545a36db935509be0\") " pod="kube-system/kube-apiserver-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.885245 kubelet[2514]: I0909 23:18:00.884986 2514 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a7dfa7df0342970545a36db935509be0-k8s-certs\") pod \"kube-apiserver-srv-5qwy1.gb1.brightbox.com\" (UID: \"a7dfa7df0342970545a36db935509be0\") " pod="kube-system/kube-apiserver-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.885245 kubelet[2514]: I0909 23:18:00.885020 2514 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-flexvolume-dir\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.885601 kubelet[2514]: I0909 23:18:00.885085 2514 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-k8s-certs\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.885601 kubelet[2514]: I0909 23:18:00.885113 2514 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-kubeconfig\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.885601 kubelet[2514]: I0909 23:18:00.885162 2514 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.885601 kubelet[2514]: I0909 23:18:00.885319 2514 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a7dfa7df0342970545a36db935509be0-ca-certs\") pod \"kube-apiserver-srv-5qwy1.gb1.brightbox.com\" (UID: \"a7dfa7df0342970545a36db935509be0\") " pod="kube-system/kube-apiserver-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.886086 kubelet[2514]: E0909 23:18:00.886049 2514 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.66.202:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-5qwy1.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.66.202:6443: connect: connection refused" interval="400ms" Sep 9 23:18:00.897568 kubelet[2514]: I0909 23:18:00.897521 2514 kubelet_node_status.go:72] "Attempting to register node" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:00.898101 kubelet[2514]: E0909 23:18:00.898067 2514 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.66.202:6443/api/v1/nodes\": dial tcp 10.230.66.202:6443: connect: connection refused" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:01.100887 kubelet[2514]: I0909 23:18:01.100780 2514 kubelet_node_status.go:72] "Attempting to register node" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:01.102287 kubelet[2514]: E0909 23:18:01.101749 2514 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.66.202:6443/api/v1/nodes\": dial tcp 10.230.66.202:6443: connect: connection refused" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:01.153829 containerd[1589]: time="2025-09-09T23:18:01.153760984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-5qwy1.gb1.brightbox.com,Uid:a7dfa7df0342970545a36db935509be0,Namespace:kube-system,Attempt:0,}" Sep 9 23:18:01.174011 containerd[1589]: time="2025-09-09T23:18:01.173768208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-5qwy1.gb1.brightbox.com,Uid:e9eff9b5235007e6f5861f19975e1375,Namespace:kube-system,Attempt:0,}" Sep 9 23:18:01.174291 containerd[1589]: time="2025-09-09T23:18:01.174236982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-5qwy1.gb1.brightbox.com,Uid:da6f294bb597b2f31363d3dae2e154e5,Namespace:kube-system,Attempt:0,}" Sep 9 23:18:01.287695 kubelet[2514]: E0909 23:18:01.287614 2514 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.66.202:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-5qwy1.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.66.202:6443: connect: connection refused" interval="800ms" Sep 9 23:18:01.296256 containerd[1589]: time="2025-09-09T23:18:01.295916479Z" level=info msg="connecting to shim 1c6f3ea92f2fa9c5fc7b83e6a423fb9000413a8deb8b428666d6bc151d66d971" address="unix:///run/containerd/s/7aa3328924092ceb8cca6847a1fc056ab1570e546d89b170ce44aa53f1ed9dfd" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:01.306369 containerd[1589]: time="2025-09-09T23:18:01.306322048Z" level=info msg="connecting to shim 
5b5418e074f5e05f1fa2cb912ca2fa1877e85c7e4db9519384fd1b5a7ab8dadf" address="unix:///run/containerd/s/49ed2581dd8c296d5088736cd25c6fd4e5a835f761b7a121c75a44d5566297f0" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:01.318888 containerd[1589]: time="2025-09-09T23:18:01.318802833Z" level=info msg="connecting to shim 546923d88554017d03d0a9a4d234fb9f29d8ef61d568e9125eef8a12bbab2144" address="unix:///run/containerd/s/317728ed6678046bbad62ca2570ababb4ad23ea30a59a37825d9999ff7540dd4" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:01.447398 systemd[1]: Started cri-containerd-1c6f3ea92f2fa9c5fc7b83e6a423fb9000413a8deb8b428666d6bc151d66d971.scope - libcontainer container 1c6f3ea92f2fa9c5fc7b83e6a423fb9000413a8deb8b428666d6bc151d66d971. Sep 9 23:18:01.450905 systemd[1]: Started cri-containerd-546923d88554017d03d0a9a4d234fb9f29d8ef61d568e9125eef8a12bbab2144.scope - libcontainer container 546923d88554017d03d0a9a4d234fb9f29d8ef61d568e9125eef8a12bbab2144. Sep 9 23:18:01.453512 systemd[1]: Started cri-containerd-5b5418e074f5e05f1fa2cb912ca2fa1877e85c7e4db9519384fd1b5a7ab8dadf.scope - libcontainer container 5b5418e074f5e05f1fa2cb912ca2fa1877e85c7e4db9519384fd1b5a7ab8dadf. Sep 9 23:18:01.481694 kubelet[2514]: W0909 23:18:01.481625 2514 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.66.202:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.66.202:6443: connect: connection refused Sep 9 23:18:01.481889 kubelet[2514]: E0909 23:18:01.481708 2514 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.66.202:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:01.505336 kubelet[2514]: I0909 23:18:01.505305 2514 kubelet_node_status.go:72] "Attempting to register node" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:01.506207 kubelet[2514]: E0909 23:18:01.506144 2514 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.66.202:6443/api/v1/nodes\": dial tcp 10.230.66.202:6443: connect: connection refused" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:01.563005 containerd[1589]: time="2025-09-09T23:18:01.562946620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-5qwy1.gb1.brightbox.com,Uid:e9eff9b5235007e6f5861f19975e1375,Namespace:kube-system,Attempt:0,} returns sandbox id \"5b5418e074f5e05f1fa2cb912ca2fa1877e85c7e4db9519384fd1b5a7ab8dadf\"" Sep 9 23:18:01.566609 containerd[1589]: time="2025-09-09T23:18:01.566556399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-5qwy1.gb1.brightbox.com,Uid:a7dfa7df0342970545a36db935509be0,Namespace:kube-system,Attempt:0,} returns sandbox id \"546923d88554017d03d0a9a4d234fb9f29d8ef61d568e9125eef8a12bbab2144\"" Sep 9 23:18:01.567624 containerd[1589]: time="2025-09-09T23:18:01.567592570Z" level=info msg="CreateContainer within sandbox \"5b5418e074f5e05f1fa2cb912ca2fa1877e85c7e4db9519384fd1b5a7ab8dadf\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 23:18:01.579070 containerd[1589]: time="2025-09-09T23:18:01.579035645Z" level=info msg="Container dbece09cbe54c92e07ec7ff5c36bbd2d39f232a9a3a3b475c558f2f5c231ec27: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:01.591131 
containerd[1589]: time="2025-09-09T23:18:01.591081010Z" level=info msg="CreateContainer within sandbox \"546923d88554017d03d0a9a4d234fb9f29d8ef61d568e9125eef8a12bbab2144\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 23:18:01.608789 containerd[1589]: time="2025-09-09T23:18:01.608733550Z" level=info msg="CreateContainer within sandbox \"5b5418e074f5e05f1fa2cb912ca2fa1877e85c7e4db9519384fd1b5a7ab8dadf\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"dbece09cbe54c92e07ec7ff5c36bbd2d39f232a9a3a3b475c558f2f5c231ec27\"" Sep 9 23:18:01.609599 containerd[1589]: time="2025-09-09T23:18:01.609570455Z" level=info msg="StartContainer for \"dbece09cbe54c92e07ec7ff5c36bbd2d39f232a9a3a3b475c558f2f5c231ec27\"" Sep 9 23:18:01.612061 containerd[1589]: time="2025-09-09T23:18:01.610988341Z" level=info msg="Container cd6fcecfec448b8b8c8366696c35d5e0491eda8fba8ad7e1b088e7e0f27df1e7: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:01.621016 containerd[1589]: time="2025-09-09T23:18:01.620836885Z" level=info msg="connecting to shim dbece09cbe54c92e07ec7ff5c36bbd2d39f232a9a3a3b475c558f2f5c231ec27" address="unix:///run/containerd/s/49ed2581dd8c296d5088736cd25c6fd4e5a835f761b7a121c75a44d5566297f0" protocol=ttrpc version=3 Sep 9 23:18:01.627923 containerd[1589]: time="2025-09-09T23:18:01.627882508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-5qwy1.gb1.brightbox.com,Uid:da6f294bb597b2f31363d3dae2e154e5,Namespace:kube-system,Attempt:0,} returns sandbox id \"1c6f3ea92f2fa9c5fc7b83e6a423fb9000413a8deb8b428666d6bc151d66d971\"" Sep 9 23:18:01.629859 containerd[1589]: time="2025-09-09T23:18:01.629826134Z" level=info msg="CreateContainer within sandbox \"546923d88554017d03d0a9a4d234fb9f29d8ef61d568e9125eef8a12bbab2144\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"cd6fcecfec448b8b8c8366696c35d5e0491eda8fba8ad7e1b088e7e0f27df1e7\"" Sep 9 23:18:01.630399 containerd[1589]: time="2025-09-09T23:18:01.630355427Z" level=info msg="StartContainer for \"cd6fcecfec448b8b8c8366696c35d5e0491eda8fba8ad7e1b088e7e0f27df1e7\"" Sep 9 23:18:01.633192 containerd[1589]: time="2025-09-09T23:18:01.633002178Z" level=info msg="connecting to shim cd6fcecfec448b8b8c8366696c35d5e0491eda8fba8ad7e1b088e7e0f27df1e7" address="unix:///run/containerd/s/317728ed6678046bbad62ca2570ababb4ad23ea30a59a37825d9999ff7540dd4" protocol=ttrpc version=3 Sep 9 23:18:01.635357 containerd[1589]: time="2025-09-09T23:18:01.635159329Z" level=info msg="CreateContainer within sandbox \"1c6f3ea92f2fa9c5fc7b83e6a423fb9000413a8deb8b428666d6bc151d66d971\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 23:18:01.648459 containerd[1589]: time="2025-09-09T23:18:01.648409981Z" level=info msg="Container fdaec4015ea514129a4ba8334d2b17c580d00e0c16b3f6c924fee01ff1c1ea11: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:01.659498 systemd[1]: Started cri-containerd-cd6fcecfec448b8b8c8366696c35d5e0491eda8fba8ad7e1b088e7e0f27df1e7.scope - libcontainer container cd6fcecfec448b8b8c8366696c35d5e0491eda8fba8ad7e1b088e7e0f27df1e7. Sep 9 23:18:01.667418 systemd[1]: Started cri-containerd-dbece09cbe54c92e07ec7ff5c36bbd2d39f232a9a3a3b475c558f2f5c231ec27.scope - libcontainer container dbece09cbe54c92e07ec7ff5c36bbd2d39f232a9a3a3b475c558f2f5c231ec27. 
Sep 9 23:18:01.670244 containerd[1589]: time="2025-09-09T23:18:01.670119609Z" level=info msg="CreateContainer within sandbox \"1c6f3ea92f2fa9c5fc7b83e6a423fb9000413a8deb8b428666d6bc151d66d971\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"fdaec4015ea514129a4ba8334d2b17c580d00e0c16b3f6c924fee01ff1c1ea11\"" Sep 9 23:18:01.672044 containerd[1589]: time="2025-09-09T23:18:01.671898460Z" level=info msg="StartContainer for \"fdaec4015ea514129a4ba8334d2b17c580d00e0c16b3f6c924fee01ff1c1ea11\"" Sep 9 23:18:01.676357 containerd[1589]: time="2025-09-09T23:18:01.676325988Z" level=info msg="connecting to shim fdaec4015ea514129a4ba8334d2b17c580d00e0c16b3f6c924fee01ff1c1ea11" address="unix:///run/containerd/s/7aa3328924092ceb8cca6847a1fc056ab1570e546d89b170ce44aa53f1ed9dfd" protocol=ttrpc version=3 Sep 9 23:18:01.679923 kubelet[2514]: W0909 23:18:01.679851 2514 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.66.202:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.66.202:6443: connect: connection refused Sep 9 23:18:01.680103 kubelet[2514]: E0909 23:18:01.679916 2514 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.66.202:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:01.717391 systemd[1]: Started cri-containerd-fdaec4015ea514129a4ba8334d2b17c580d00e0c16b3f6c924fee01ff1c1ea11.scope - libcontainer container fdaec4015ea514129a4ba8334d2b17c580d00e0c16b3f6c924fee01ff1c1ea11. Sep 9 23:18:01.784276 kubelet[2514]: W0909 23:18:01.784031 2514 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.66.202:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.66.202:6443: connect: connection refused Sep 9 23:18:01.784599 kubelet[2514]: E0909 23:18:01.784569 2514 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.66.202:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:01.799843 containerd[1589]: time="2025-09-09T23:18:01.799729161Z" level=info msg="StartContainer for \"cd6fcecfec448b8b8c8366696c35d5e0491eda8fba8ad7e1b088e7e0f27df1e7\" returns successfully" Sep 9 23:18:01.813046 containerd[1589]: time="2025-09-09T23:18:01.812998171Z" level=info msg="StartContainer for \"dbece09cbe54c92e07ec7ff5c36bbd2d39f232a9a3a3b475c558f2f5c231ec27\" returns successfully" Sep 9 23:18:01.883475 containerd[1589]: time="2025-09-09T23:18:01.883382983Z" level=info msg="StartContainer for \"fdaec4015ea514129a4ba8334d2b17c580d00e0c16b3f6c924fee01ff1c1ea11\" returns successfully" Sep 9 23:18:01.944140 kubelet[2514]: W0909 23:18:01.944057 2514 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.66.202:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-5qwy1.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.66.202:6443: connect: connection refused Sep 9 23:18:01.944358 kubelet[2514]: E0909 23:18:01.944154 2514 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.66.202:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-5qwy1.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.66.202:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:18:02.089102 kubelet[2514]: E0909 23:18:02.089042 2514 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.66.202:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-5qwy1.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.66.202:6443: connect: connection refused" interval="1.6s" Sep 9 23:18:02.309894 kubelet[2514]: I0909 23:18:02.309389 2514 kubelet_node_status.go:72] "Attempting to register node" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:04.584183 kubelet[2514]: E0909 23:18:04.584072 2514 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-5qwy1.gb1.brightbox.com\" not found" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:04.637561 kubelet[2514]: I0909 23:18:04.637500 2514 kubelet_node_status.go:75] "Successfully registered node" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:04.637896 kubelet[2514]: E0909 23:18:04.637570 2514 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"srv-5qwy1.gb1.brightbox.com\": node \"srv-5qwy1.gb1.brightbox.com\" not found" Sep 9 23:18:04.661047 kubelet[2514]: E0909 23:18:04.661002 2514 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-5qwy1.gb1.brightbox.com\" not found" Sep 9 23:18:04.762032 kubelet[2514]: E0909 23:18:04.761916 2514 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-5qwy1.gb1.brightbox.com\" not found" Sep 9 23:18:05.666150 kubelet[2514]: I0909 23:18:05.666042 2514 apiserver.go:52] "Watching apiserver" Sep 9 23:18:05.683874 kubelet[2514]: I0909 23:18:05.683808 2514 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 23:18:06.744421 systemd[1]: Reload requested from client PID 2784 ('systemctl') (unit session-11.scope)... Sep 9 23:18:06.744939 systemd[1]: Reloading... Sep 9 23:18:06.867260 zram_generator::config[2835]: No configuration found. Sep 9 23:18:07.211426 systemd[1]: Reloading finished in 465 ms. Sep 9 23:18:07.247041 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:18:07.264585 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 23:18:07.265445 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:18:07.265767 systemd[1]: kubelet.service: Consumed 962ms CPU time, 126.5M memory peak. Sep 9 23:18:07.270856 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:18:07.564688 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:18:07.582122 (kubelet)[2893]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 23:18:07.670861 kubelet[2893]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:18:07.671690 kubelet[2893]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
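While 10.230.66.202:6443 keeps refusing connections, the "Failed to ensure lease exists, will retry" entries back off 200ms, 400ms, 800ms and finally 1.6s before the apiserver static pod comes up. The doubling is reproduced below as a generic capped exponential backoff; the base, factor and cap are assumptions chosen for illustration, not the kubelet's actual client-go parameters:

def backoff_intervals(base_s: float = 0.2, factor: float = 2.0,
                      cap_s: float = 7.0, steps: int = 4):
    """Yield capped, exponentially growing retry intervals."""
    interval = base_s
    for _ in range(steps):
        yield min(interval, cap_s)
        interval *= factor

print([f"{i:g}s" for i in backoff_intervals()])
# ['0.2s', '0.4s', '0.8s', '1.6s'] -- the same progression as the lease retries above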
Sep 9 23:18:07.671690 kubelet[2893]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:18:07.672429 kubelet[2893]: I0909 23:18:07.671616 2893 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 23:18:07.691194 kubelet[2893]: I0909 23:18:07.690030 2893 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 23:18:07.691363 kubelet[2893]: I0909 23:18:07.691344 2893 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 23:18:07.691875 kubelet[2893]: I0909 23:18:07.691749 2893 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 23:18:07.695265 kubelet[2893]: I0909 23:18:07.694792 2893 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 9 23:18:07.701062 kubelet[2893]: I0909 23:18:07.701025 2893 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 23:18:07.708735 kubelet[2893]: I0909 23:18:07.708705 2893 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 23:18:07.719545 kubelet[2893]: I0909 23:18:07.718766 2893 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 9 23:18:07.719545 kubelet[2893]: I0909 23:18:07.718966 2893 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 23:18:07.719545 kubelet[2893]: I0909 23:18:07.719162 2893 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 23:18:07.719759 kubelet[2893]: I0909 23:18:07.719241 2893 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"srv-5qwy1.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 23:18:07.719759 kubelet[2893]: I0909 23:18:07.719666 2893 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 23:18:07.719759 kubelet[2893]: I0909 23:18:07.719685 2893 container_manager_linux.go:300] "Creating device plugin manager" Sep 9 23:18:07.720034 kubelet[2893]: I0909 23:18:07.719762 2893 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:18:07.720034 kubelet[2893]: I0909 23:18:07.719977 2893 kubelet.go:408] "Attempting to sync node with API server" Sep 9 23:18:07.720034 kubelet[2893]: I0909 23:18:07.720006 2893 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 23:18:07.720187 kubelet[2893]: I0909 23:18:07.720053 2893 kubelet.go:314] "Adding apiserver pod source" Sep 9 23:18:07.720187 kubelet[2893]: I0909 23:18:07.720069 2893 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 23:18:07.727959 kubelet[2893]: I0909 23:18:07.727896 2893 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 23:18:07.730814 kubelet[2893]: I0909 23:18:07.730746 2893 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 23:18:07.732974 kubelet[2893]: I0909 23:18:07.731783 2893 server.go:1274] "Started kubelet" Sep 9 23:18:07.746144 kubelet[2893]: I0909 23:18:07.745927 2893 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 23:18:07.747627 kubelet[2893]: I0909 23:18:07.747581 2893 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 23:18:07.749676 kubelet[2893]: I0909 23:18:07.749611 2893 server.go:449] "Adding debug handlers to kubelet server" Sep 9 23:18:07.752194 kubelet[2893]: I0909 23:18:07.752001 2893 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 23:18:07.752644 kubelet[2893]: I0909 23:18:07.752295 2893 server.go:236] "Starting to serve the podresources API" 
endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 23:18:07.752644 kubelet[2893]: I0909 23:18:07.752575 2893 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 23:18:07.755584 kubelet[2893]: I0909 23:18:07.755497 2893 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 9 23:18:07.755758 kubelet[2893]: E0909 23:18:07.755725 2893 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-5qwy1.gb1.brightbox.com\" not found" Sep 9 23:18:07.757695 kubelet[2893]: I0909 23:18:07.757051 2893 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 9 23:18:07.757695 kubelet[2893]: I0909 23:18:07.757292 2893 reconciler.go:26] "Reconciler: start to sync state" Sep 9 23:18:07.782920 kubelet[2893]: I0909 23:18:07.782128 2893 factory.go:221] Registration of the systemd container factory successfully Sep 9 23:18:07.782920 kubelet[2893]: I0909 23:18:07.782274 2893 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 23:18:07.789404 kubelet[2893]: I0909 23:18:07.789279 2893 factory.go:221] Registration of the containerd container factory successfully Sep 9 23:18:07.821997 kubelet[2893]: I0909 23:18:07.821869 2893 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 23:18:07.828290 kubelet[2893]: I0909 23:18:07.827087 2893 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 23:18:07.828290 kubelet[2893]: I0909 23:18:07.827129 2893 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 9 23:18:07.828290 kubelet[2893]: I0909 23:18:07.827155 2893 kubelet.go:2321] "Starting kubelet main sync loop" Sep 9 23:18:07.828290 kubelet[2893]: E0909 23:18:07.827267 2893 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 23:18:07.904058 kubelet[2893]: I0909 23:18:07.903991 2893 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 9 23:18:07.904431 kubelet[2893]: I0909 23:18:07.904270 2893 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 9 23:18:07.904431 kubelet[2893]: I0909 23:18:07.904313 2893 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:18:07.905409 kubelet[2893]: I0909 23:18:07.905339 2893 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 23:18:07.905409 kubelet[2893]: I0909 23:18:07.905361 2893 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 23:18:07.905696 kubelet[2893]: I0909 23:18:07.905615 2893 policy_none.go:49] "None policy: Start" Sep 9 23:18:07.907607 kubelet[2893]: I0909 23:18:07.907502 2893 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 9 23:18:07.908125 kubelet[2893]: I0909 23:18:07.908051 2893 state_mem.go:35] "Initializing new in-memory state store" Sep 9 23:18:07.908684 kubelet[2893]: I0909 23:18:07.908665 2893 state_mem.go:75] "Updated machine memory state" Sep 9 23:18:07.922949 kubelet[2893]: I0909 23:18:07.920655 2893 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 23:18:07.922949 kubelet[2893]: I0909 23:18:07.922610 2893 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 23:18:07.922949 
kubelet[2893]: I0909 23:18:07.922635 2893 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 23:18:07.924928 kubelet[2893]: I0909 23:18:07.924891 2893 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 23:18:07.995380 kubelet[2893]: W0909 23:18:07.995064 2893 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 23:18:07.996264 kubelet[2893]: W0909 23:18:07.996232 2893 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 23:18:07.999272 kubelet[2893]: W0909 23:18:07.999251 2893 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 23:18:08.059350 kubelet[2893]: I0909 23:18:08.059303 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-ca-certs\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.060137 kubelet[2893]: I0909 23:18:08.060087 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-flexvolume-dir\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.060480 kubelet[2893]: I0909 23:18:08.060455 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-kubeconfig\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.060604 kubelet[2893]: I0909 23:18:08.060580 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.061247 kubelet[2893]: I0909 23:18:08.061223 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a7dfa7df0342970545a36db935509be0-ca-certs\") pod \"kube-apiserver-srv-5qwy1.gb1.brightbox.com\" (UID: \"a7dfa7df0342970545a36db935509be0\") " pod="kube-system/kube-apiserver-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.061596 kubelet[2893]: I0909 23:18:08.061370 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a7dfa7df0342970545a36db935509be0-k8s-certs\") pod \"kube-apiserver-srv-5qwy1.gb1.brightbox.com\" (UID: \"a7dfa7df0342970545a36db935509be0\") " pod="kube-system/kube-apiserver-srv-5qwy1.gb1.brightbox.com" Sep 9 
23:18:08.061596 kubelet[2893]: I0909 23:18:08.061410 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a7dfa7df0342970545a36db935509be0-usr-share-ca-certificates\") pod \"kube-apiserver-srv-5qwy1.gb1.brightbox.com\" (UID: \"a7dfa7df0342970545a36db935509be0\") " pod="kube-system/kube-apiserver-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.061596 kubelet[2893]: I0909 23:18:08.061436 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e9eff9b5235007e6f5861f19975e1375-k8s-certs\") pod \"kube-controller-manager-srv-5qwy1.gb1.brightbox.com\" (UID: \"e9eff9b5235007e6f5861f19975e1375\") " pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.061596 kubelet[2893]: I0909 23:18:08.061462 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/da6f294bb597b2f31363d3dae2e154e5-kubeconfig\") pod \"kube-scheduler-srv-5qwy1.gb1.brightbox.com\" (UID: \"da6f294bb597b2f31363d3dae2e154e5\") " pod="kube-system/kube-scheduler-srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.064278 kubelet[2893]: I0909 23:18:08.064244 2893 kubelet_node_status.go:72] "Attempting to register node" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.084410 kubelet[2893]: I0909 23:18:08.082873 2893 kubelet_node_status.go:111] "Node was previously registered" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.084864 kubelet[2893]: I0909 23:18:08.084620 2893 kubelet_node_status.go:75] "Successfully registered node" node="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:08.740971 kubelet[2893]: I0909 23:18:08.740911 2893 apiserver.go:52] "Watching apiserver" Sep 9 23:18:08.757392 kubelet[2893]: I0909 23:18:08.757349 2893 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 23:18:08.923745 kubelet[2893]: I0909 23:18:08.923562 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-5qwy1.gb1.brightbox.com" podStartSLOduration=1.923477102 podStartE2EDuration="1.923477102s" podCreationTimestamp="2025-09-09 23:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:18:08.909562337 +0000 UTC m=+1.319747261" watchObservedRunningTime="2025-09-09 23:18:08.923477102 +0000 UTC m=+1.333662006" Sep 9 23:18:08.938296 kubelet[2893]: I0909 23:18:08.937941 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-5qwy1.gb1.brightbox.com" podStartSLOduration=1.937911706 podStartE2EDuration="1.937911706s" podCreationTimestamp="2025-09-09 23:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:18:08.925067619 +0000 UTC m=+1.335252553" watchObservedRunningTime="2025-09-09 23:18:08.937911706 +0000 UTC m=+1.348096614" Sep 9 23:18:08.956968 kubelet[2893]: I0909 23:18:08.956550 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-5qwy1.gb1.brightbox.com" podStartSLOduration=1.956529653 podStartE2EDuration="1.956529653s" podCreationTimestamp="2025-09-09 23:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:18:08.939423343 +0000 UTC m=+1.349608273" watchObservedRunningTime="2025-09-09 23:18:08.956529653 +0000 UTC m=+1.366714554" Sep 9 23:18:12.534158 kubelet[2893]: I0909 23:18:12.534031 2893 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 23:18:12.536295 containerd[1589]: time="2025-09-09T23:18:12.536229147Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 23:18:12.538122 kubelet[2893]: I0909 23:18:12.536780 2893 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 23:18:13.487443 systemd[1]: Created slice kubepods-besteffort-podc80a8bb6_30e6_4f3d_9cb4_f07d739835a7.slice - libcontainer container kubepods-besteffort-podc80a8bb6_30e6_4f3d_9cb4_f07d739835a7.slice. Sep 9 23:18:13.502200 kubelet[2893]: I0909 23:18:13.502119 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c80a8bb6-30e6-4f3d-9cb4-f07d739835a7-xtables-lock\") pod \"kube-proxy-x65jj\" (UID: \"c80a8bb6-30e6-4f3d-9cb4-f07d739835a7\") " pod="kube-system/kube-proxy-x65jj" Sep 9 23:18:13.502406 kubelet[2893]: I0909 23:18:13.502173 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c80a8bb6-30e6-4f3d-9cb4-f07d739835a7-lib-modules\") pod \"kube-proxy-x65jj\" (UID: \"c80a8bb6-30e6-4f3d-9cb4-f07d739835a7\") " pod="kube-system/kube-proxy-x65jj" Sep 9 23:18:13.502527 kubelet[2893]: I0909 23:18:13.502426 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/c80a8bb6-30e6-4f3d-9cb4-f07d739835a7-kube-proxy\") pod \"kube-proxy-x65jj\" (UID: \"c80a8bb6-30e6-4f3d-9cb4-f07d739835a7\") " pod="kube-system/kube-proxy-x65jj" Sep 9 23:18:13.502527 kubelet[2893]: I0909 23:18:13.502457 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hglrq\" (UniqueName: \"kubernetes.io/projected/c80a8bb6-30e6-4f3d-9cb4-f07d739835a7-kube-api-access-hglrq\") pod \"kube-proxy-x65jj\" (UID: \"c80a8bb6-30e6-4f3d-9cb4-f07d739835a7\") " pod="kube-system/kube-proxy-x65jj" Sep 9 23:18:13.693078 systemd[1]: Created slice kubepods-besteffort-podca3ea8ec_dc7d_4b1a_88f6_c34ef05f55c4.slice - libcontainer container kubepods-besteffort-podca3ea8ec_dc7d_4b1a_88f6_c34ef05f55c4.slice. 
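The "Updating runtime config through cri with podcidr" / "Updating Pod CIDR" pair above records the kubelet pushing the node's 192.168.0.0/24 range to containerd over CRI, while the CNI configuration itself is still absent ("No cni config template is specified"), which is why the calico components are scheduled next. A minimal sketch of that CRI call, assuming the default containerd socket path and the published cri-api Go bindings rather than the kubelet's own wiring:

package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Assumed socket path for a stock containerd install.
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	// Equivalent of the kubelet's runtime-config update carrying the pod CIDR.
	_, err = rt.UpdateRuntimeConfig(context.Background(), &runtimeapi.UpdateRuntimeConfigRequest{
		RuntimeConfig: &runtimeapi.RuntimeConfig{
			NetworkConfig: &runtimeapi.NetworkConfig{PodCidr: "192.168.0.0/24"},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Println("pod CIDR pushed to the runtime")
}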
Sep 9 23:18:13.703749 kubelet[2893]: I0909 23:18:13.703600 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2qb\" (UniqueName: \"kubernetes.io/projected/ca3ea8ec-dc7d-4b1a-88f6-c34ef05f55c4-kube-api-access-xw2qb\") pod \"tigera-operator-58fc44c59b-6cr8d\" (UID: \"ca3ea8ec-dc7d-4b1a-88f6-c34ef05f55c4\") " pod="tigera-operator/tigera-operator-58fc44c59b-6cr8d" Sep 9 23:18:13.705736 kubelet[2893]: I0909 23:18:13.705419 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ca3ea8ec-dc7d-4b1a-88f6-c34ef05f55c4-var-lib-calico\") pod \"tigera-operator-58fc44c59b-6cr8d\" (UID: \"ca3ea8ec-dc7d-4b1a-88f6-c34ef05f55c4\") " pod="tigera-operator/tigera-operator-58fc44c59b-6cr8d" Sep 9 23:18:13.800828 containerd[1589]: time="2025-09-09T23:18:13.800712618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-x65jj,Uid:c80a8bb6-30e6-4f3d-9cb4-f07d739835a7,Namespace:kube-system,Attempt:0,}" Sep 9 23:18:13.836189 containerd[1589]: time="2025-09-09T23:18:13.836066936Z" level=info msg="connecting to shim ed7ba393212ce5b41f294bdcee744d288dfb544f15bb8ce00f995acf777d5526" address="unix:///run/containerd/s/8f91adeee0447ddd3833c10ad7d8c4da88ee48dcf22adc1b394e184138888af0" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:13.882646 systemd[1]: Started cri-containerd-ed7ba393212ce5b41f294bdcee744d288dfb544f15bb8ce00f995acf777d5526.scope - libcontainer container ed7ba393212ce5b41f294bdcee744d288dfb544f15bb8ce00f995acf777d5526. Sep 9 23:18:13.935117 containerd[1589]: time="2025-09-09T23:18:13.935054795Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-x65jj,Uid:c80a8bb6-30e6-4f3d-9cb4-f07d739835a7,Namespace:kube-system,Attempt:0,} returns sandbox id \"ed7ba393212ce5b41f294bdcee744d288dfb544f15bb8ce00f995acf777d5526\"" Sep 9 23:18:13.942544 containerd[1589]: time="2025-09-09T23:18:13.941906587Z" level=info msg="CreateContainer within sandbox \"ed7ba393212ce5b41f294bdcee744d288dfb544f15bb8ce00f995acf777d5526\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 23:18:13.954420 containerd[1589]: time="2025-09-09T23:18:13.954351954Z" level=info msg="Container 0c81174b2cd02cc0776868435c3c1779f117052476d7cefb7fc012752105d0d5: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:13.964034 containerd[1589]: time="2025-09-09T23:18:13.963974099Z" level=info msg="CreateContainer within sandbox \"ed7ba393212ce5b41f294bdcee744d288dfb544f15bb8ce00f995acf777d5526\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0c81174b2cd02cc0776868435c3c1779f117052476d7cefb7fc012752105d0d5\"" Sep 9 23:18:13.966232 containerd[1589]: time="2025-09-09T23:18:13.964962559Z" level=info msg="StartContainer for \"0c81174b2cd02cc0776868435c3c1779f117052476d7cefb7fc012752105d0d5\"" Sep 9 23:18:13.966864 containerd[1589]: time="2025-09-09T23:18:13.966739127Z" level=info msg="connecting to shim 0c81174b2cd02cc0776868435c3c1779f117052476d7cefb7fc012752105d0d5" address="unix:///run/containerd/s/8f91adeee0447ddd3833c10ad7d8c4da88ee48dcf22adc1b394e184138888af0" protocol=ttrpc version=3 Sep 9 23:18:13.998525 systemd[1]: Started cri-containerd-0c81174b2cd02cc0776868435c3c1779f117052476d7cefb7fc012752105d0d5.scope - libcontainer container 0c81174b2cd02cc0776868435c3c1779f117052476d7cefb7fc012752105d0d5. 
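The RunPodSandbox, "connecting to shim", and StartContainer lines above are containerd's view of the standard CRI pod lifecycle for kube-proxy-x65jj. A compressed sketch of the same three RPCs from a CRI client's side, with placeholder metadata and image values (illustrative only, not what the kubelet literally sends):

package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx := context.Background()

	sandboxCfg := &runtimeapi.PodSandboxConfig{
		Metadata: &runtimeapi.PodSandboxMetadata{
			Name:      "kube-proxy-x65jj",
			Namespace: "kube-system",
			Uid:       "c80a8bb6-30e6-4f3d-9cb4-f07d739835a7",
		},
	}
	// 1. RunPodSandbox returns the sandbox id logged above (ed7ba393...).
	sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{Config: sandboxCfg})
	if err != nil {
		log.Fatal(err)
	}
	// 2. CreateContainer inside that sandbox; containerd's "connecting to shim"
	//    messages belong to this step. Image name here is a placeholder.
	cc, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId: sb.PodSandboxId,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "kube-proxy"},
			Image:    &runtimeapi.ImageSpec{Image: "registry.k8s.io/kube-proxy:v1.31.8"},
		},
		SandboxConfig: sandboxCfg,
	})
	if err != nil {
		log.Fatal(err)
	}
	// 3. StartContainer matches the "StartContainer ... returns successfully" lines.
	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: cc.ContainerId}); err != nil {
		log.Fatal(err)
	}
	log.Println("container running:", cc.ContainerId)
}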
Sep 9 23:18:14.003951 containerd[1589]: time="2025-09-09T23:18:14.003897205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-6cr8d,Uid:ca3ea8ec-dc7d-4b1a-88f6-c34ef05f55c4,Namespace:tigera-operator,Attempt:0,}" Sep 9 23:18:14.046942 containerd[1589]: time="2025-09-09T23:18:14.046868348Z" level=info msg="connecting to shim 63f143ac7bdc5671b74340bad4a2c2864e402164f33c105b0d69bd8be61ba4ab" address="unix:///run/containerd/s/8f554b2d231d5b6b233f8dd471c4c070a464c2a812fc0884b7271e79a2bcf71a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:14.099385 systemd[1]: Started cri-containerd-63f143ac7bdc5671b74340bad4a2c2864e402164f33c105b0d69bd8be61ba4ab.scope - libcontainer container 63f143ac7bdc5671b74340bad4a2c2864e402164f33c105b0d69bd8be61ba4ab. Sep 9 23:18:14.111335 containerd[1589]: time="2025-09-09T23:18:14.111295314Z" level=info msg="StartContainer for \"0c81174b2cd02cc0776868435c3c1779f117052476d7cefb7fc012752105d0d5\" returns successfully" Sep 9 23:18:14.208961 containerd[1589]: time="2025-09-09T23:18:14.208778967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-6cr8d,Uid:ca3ea8ec-dc7d-4b1a-88f6-c34ef05f55c4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"63f143ac7bdc5671b74340bad4a2c2864e402164f33c105b0d69bd8be61ba4ab\"" Sep 9 23:18:14.212750 containerd[1589]: time="2025-09-09T23:18:14.212654448Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 23:18:14.907251 kubelet[2893]: I0909 23:18:14.907196 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-x65jj" podStartSLOduration=1.90716689 podStartE2EDuration="1.90716689s" podCreationTimestamp="2025-09-09 23:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:18:14.906067209 +0000 UTC m=+7.316252130" watchObservedRunningTime="2025-09-09 23:18:14.90716689 +0000 UTC m=+7.317351791" Sep 9 23:18:16.520736 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2559578541.mount: Deactivated successfully. 
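The PullImage "quay.io/tigera/operator:v1.38.6" entry above starts the pull whose completion (resolved digest, size, ~3.2 s duration) is logged a few seconds later. The pull goes through CRI's ImageService; a minimal sketch of the equivalent call, again assuming the default containerd socket:

package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	img := runtimeapi.NewImageServiceClient(conn)
	resp, err := img.PullImage(context.Background(), &runtimeapi.PullImageRequest{
		Image: &runtimeapi.ImageSpec{Image: "quay.io/tigera/operator:v1.38.6"},
	})
	if err != nil {
		log.Fatal(err)
	}
	// ImageRef is the resolved reference, i.e. the sha256 id / repo digest
	// reported in the "Pulled image" log entry.
	log.Println("pulled:", resp.ImageRef)
}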
Sep 9 23:18:17.412501 containerd[1589]: time="2025-09-09T23:18:17.412430288Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:17.413650 containerd[1589]: time="2025-09-09T23:18:17.413438089Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 9 23:18:17.414734 containerd[1589]: time="2025-09-09T23:18:17.414699195Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:17.417244 containerd[1589]: time="2025-09-09T23:18:17.417207121Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:17.418774 containerd[1589]: time="2025-09-09T23:18:17.418284964Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.205499386s" Sep 9 23:18:17.418774 containerd[1589]: time="2025-09-09T23:18:17.418336612Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 9 23:18:17.422822 containerd[1589]: time="2025-09-09T23:18:17.422789765Z" level=info msg="CreateContainer within sandbox \"63f143ac7bdc5671b74340bad4a2c2864e402164f33c105b0d69bd8be61ba4ab\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 23:18:17.432224 containerd[1589]: time="2025-09-09T23:18:17.431774050Z" level=info msg="Container e884eab812a51f60c6e7ecc0f396492be6a93e32b1ff80ed2d06dfaf098f6fe1: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:17.438435 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4176355236.mount: Deactivated successfully. Sep 9 23:18:17.451604 containerd[1589]: time="2025-09-09T23:18:17.451473275Z" level=info msg="CreateContainer within sandbox \"63f143ac7bdc5671b74340bad4a2c2864e402164f33c105b0d69bd8be61ba4ab\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e884eab812a51f60c6e7ecc0f396492be6a93e32b1ff80ed2d06dfaf098f6fe1\"" Sep 9 23:18:17.452432 containerd[1589]: time="2025-09-09T23:18:17.452384101Z" level=info msg="StartContainer for \"e884eab812a51f60c6e7ecc0f396492be6a93e32b1ff80ed2d06dfaf098f6fe1\"" Sep 9 23:18:17.454580 containerd[1589]: time="2025-09-09T23:18:17.454519297Z" level=info msg="connecting to shim e884eab812a51f60c6e7ecc0f396492be6a93e32b1ff80ed2d06dfaf098f6fe1" address="unix:///run/containerd/s/8f554b2d231d5b6b233f8dd471c4c070a464c2a812fc0884b7271e79a2bcf71a" protocol=ttrpc version=3 Sep 9 23:18:17.490528 systemd[1]: Started cri-containerd-e884eab812a51f60c6e7ecc0f396492be6a93e32b1ff80ed2d06dfaf098f6fe1.scope - libcontainer container e884eab812a51f60c6e7ecc0f396492be6a93e32b1ff80ed2d06dfaf098f6fe1. 
Sep 9 23:18:17.539519 containerd[1589]: time="2025-09-09T23:18:17.539386773Z" level=info msg="StartContainer for \"e884eab812a51f60c6e7ecc0f396492be6a93e32b1ff80ed2d06dfaf098f6fe1\" returns successfully" Sep 9 23:18:17.943453 kubelet[2893]: I0909 23:18:17.943389 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-6cr8d" podStartSLOduration=1.734683684 podStartE2EDuration="4.943369647s" podCreationTimestamp="2025-09-09 23:18:13 +0000 UTC" firstStartedPulling="2025-09-09 23:18:14.211073805 +0000 UTC m=+6.621258707" lastFinishedPulling="2025-09-09 23:18:17.419759775 +0000 UTC m=+9.829944670" observedRunningTime="2025-09-09 23:18:17.916830687 +0000 UTC m=+10.327015617" watchObservedRunningTime="2025-09-09 23:18:17.943369647 +0000 UTC m=+10.353554556" Sep 9 23:18:22.907465 sudo[1913]: pam_unix(sudo:session): session closed for user root Sep 9 23:18:23.055204 sshd[1912]: Connection closed by 139.178.68.195 port 56740 Sep 9 23:18:23.056780 sshd-session[1909]: pam_unix(sshd:session): session closed for user core Sep 9 23:18:23.063919 systemd[1]: sshd@8-10.230.66.202:22-139.178.68.195:56740.service: Deactivated successfully. Sep 9 23:18:23.068575 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 23:18:23.069111 systemd[1]: session-11.scope: Consumed 7.058s CPU time, 159M memory peak. Sep 9 23:18:23.074981 systemd-logind[1563]: Session 11 logged out. Waiting for processes to exit. Sep 9 23:18:23.077337 systemd-logind[1563]: Removed session 11. Sep 9 23:18:27.403236 kubelet[2893]: W0909 23:18:27.403153 2893 reflector.go:561] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:srv-5qwy1.gb1.brightbox.com" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'srv-5qwy1.gb1.brightbox.com' and this object Sep 9 23:18:27.404845 kubelet[2893]: E0909 23:18:27.404559 2893 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"typha-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"typha-certs\" is forbidden: User \"system:node:srv-5qwy1.gb1.brightbox.com\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'srv-5qwy1.gb1.brightbox.com' and this object" logger="UnhandledError" Sep 9 23:18:27.404845 kubelet[2893]: W0909 23:18:27.403301 2893 reflector.go:561] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-5qwy1.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'srv-5qwy1.gb1.brightbox.com' and this object Sep 9 23:18:27.404845 kubelet[2893]: E0909 23:18:27.404631 2893 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:srv-5qwy1.gb1.brightbox.com\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'srv-5qwy1.gb1.brightbox.com' and this object" logger="UnhandledError" Sep 9 23:18:27.412942 systemd[1]: Created slice kubepods-besteffort-pod5768268f_8c27_4a22_953d_5e34ee4c5133.slice - libcontainer container kubepods-besteffort-pod5768268f_8c27_4a22_953d_5e34ee4c5133.slice. 
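The tigera-operator startup record earlier in this stretch is the first one where the two tracked durations differ: podStartE2EDuration runs from the pod's creation timestamp (23:18:13) to the observed running time (23:18:17.943...), while podStartSLOduration additionally subtracts the image-pull window (firstStartedPulling to lastFinishedPulling). Reconstructing that from the logged timestamps; this is an approximation of the tracker's bookkeeping, and the result differs from the logged 1.734683684s by a few nanoseconds because the tracker takes its own clock readings:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-09-09 23:18:13 +0000 UTC")
	running := parse("2025-09-09 23:18:17.943369647 +0000 UTC")
	pullStart := parse("2025-09-09 23:18:14.211073805 +0000 UTC")
	pullEnd := parse("2025-09-09 23:18:17.419759775 +0000 UTC")

	e2e := running.Sub(created)    // 4.943369647s, matches podStartE2EDuration
	pull := pullEnd.Sub(pullStart) // ~3.208685970s spent pulling the operator image
	fmt.Println("E2E:", e2e)
	fmt.Println("SLO (E2E minus pull):", e2e-pull) // ~1.734683677s vs logged 1.734683684s
}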
Sep 9 23:18:27.501942 kubelet[2893]: I0909 23:18:27.501763 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94tbg\" (UniqueName: \"kubernetes.io/projected/5768268f-8c27-4a22-953d-5e34ee4c5133-kube-api-access-94tbg\") pod \"calico-typha-845d5cc58-2wlp9\" (UID: \"5768268f-8c27-4a22-953d-5e34ee4c5133\") " pod="calico-system/calico-typha-845d5cc58-2wlp9" Sep 9 23:18:27.501942 kubelet[2893]: I0909 23:18:27.501854 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5768268f-8c27-4a22-953d-5e34ee4c5133-tigera-ca-bundle\") pod \"calico-typha-845d5cc58-2wlp9\" (UID: \"5768268f-8c27-4a22-953d-5e34ee4c5133\") " pod="calico-system/calico-typha-845d5cc58-2wlp9" Sep 9 23:18:27.501942 kubelet[2893]: I0909 23:18:27.501887 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5768268f-8c27-4a22-953d-5e34ee4c5133-typha-certs\") pod \"calico-typha-845d5cc58-2wlp9\" (UID: \"5768268f-8c27-4a22-953d-5e34ee4c5133\") " pod="calico-system/calico-typha-845d5cc58-2wlp9" Sep 9 23:18:27.698326 systemd[1]: Created slice kubepods-besteffort-podb98ba589_a7d1_4b07_81a3_3a2a321aed79.slice - libcontainer container kubepods-besteffort-podb98ba589_a7d1_4b07_81a3_3a2a321aed79.slice. Sep 9 23:18:27.704920 kubelet[2893]: I0909 23:18:27.704869 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-lib-modules\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.705160 kubelet[2893]: I0909 23:18:27.704926 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlvtl\" (UniqueName: \"kubernetes.io/projected/b98ba589-a7d1-4b07-81a3-3a2a321aed79-kube-api-access-vlvtl\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.705160 kubelet[2893]: I0909 23:18:27.704957 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-policysync\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.705160 kubelet[2893]: I0909 23:18:27.705031 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-cni-log-dir\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.705160 kubelet[2893]: I0909 23:18:27.705071 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-cni-net-dir\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.705160 kubelet[2893]: I0909 23:18:27.705113 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b98ba589-a7d1-4b07-81a3-3a2a321aed79-tigera-ca-bundle\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.706668 kubelet[2893]: I0909 23:18:27.705141 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-flexvol-driver-host\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.706668 kubelet[2893]: I0909 23:18:27.705190 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-xtables-lock\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.706668 kubelet[2893]: I0909 23:18:27.705221 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-cni-bin-dir\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.706668 kubelet[2893]: I0909 23:18:27.705247 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b98ba589-a7d1-4b07-81a3-3a2a321aed79-node-certs\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.706668 kubelet[2893]: I0909 23:18:27.705274 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-var-lib-calico\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.706894 kubelet[2893]: I0909 23:18:27.705298 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b98ba589-a7d1-4b07-81a3-3a2a321aed79-var-run-calico\") pod \"calico-node-9g7dw\" (UID: \"b98ba589-a7d1-4b07-81a3-3a2a321aed79\") " pod="calico-system/calico-node-9g7dw" Sep 9 23:18:27.813055 kubelet[2893]: E0909 23:18:27.813015 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.813535 kubelet[2893]: W0909 23:18:27.813363 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.813535 kubelet[2893]: E0909 23:18:27.813412 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:27.815288 kubelet[2893]: E0909 23:18:27.814900 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.815288 kubelet[2893]: W0909 23:18:27.814921 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.815288 kubelet[2893]: E0909 23:18:27.814937 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.815860 kubelet[2893]: E0909 23:18:27.815752 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.816044 kubelet[2893]: W0909 23:18:27.816021 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.816394 kubelet[2893]: E0909 23:18:27.816286 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.817344 kubelet[2893]: E0909 23:18:27.817324 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.817728 kubelet[2893]: W0909 23:18:27.817623 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.818196 kubelet[2893]: E0909 23:18:27.818014 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.826835 kubelet[2893]: E0909 23:18:27.826793 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.826835 kubelet[2893]: W0909 23:18:27.826825 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.826990 kubelet[2893]: E0909 23:18:27.826849 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.907981 kubelet[2893]: E0909 23:18:27.907774 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.907981 kubelet[2893]: W0909 23:18:27.907806 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.907981 kubelet[2893]: E0909 23:18:27.907833 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:27.908793 kubelet[2893]: E0909 23:18:27.908636 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.908793 kubelet[2893]: W0909 23:18:27.908655 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.908793 kubelet[2893]: E0909 23:18:27.908671 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.909087 kubelet[2893]: E0909 23:18:27.909068 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.909206 kubelet[2893]: W0909 23:18:27.909154 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.909310 kubelet[2893]: E0909 23:18:27.909292 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.981948 kubelet[2893]: E0909 23:18:27.981605 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkvhf" podUID="88a3a96f-1289-44aa-b5ca-25670a32dc3d" Sep 9 23:18:27.991299 kubelet[2893]: E0909 23:18:27.991262 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.991299 kubelet[2893]: W0909 23:18:27.991291 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.991578 kubelet[2893]: E0909 23:18:27.991318 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.991669 kubelet[2893]: E0909 23:18:27.991619 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.991669 kubelet[2893]: W0909 23:18:27.991635 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.991669 kubelet[2893]: E0909 23:18:27.991649 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:27.992195 kubelet[2893]: E0909 23:18:27.991858 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.992195 kubelet[2893]: W0909 23:18:27.991875 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.992195 kubelet[2893]: E0909 23:18:27.991890 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.993305 kubelet[2893]: E0909 23:18:27.992500 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.993305 kubelet[2893]: W0909 23:18:27.992519 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.993305 kubelet[2893]: E0909 23:18:27.992537 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.993305 kubelet[2893]: E0909 23:18:27.992785 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.993305 kubelet[2893]: W0909 23:18:27.992798 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.993305 kubelet[2893]: E0909 23:18:27.992812 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.993305 kubelet[2893]: E0909 23:18:27.993004 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.993305 kubelet[2893]: W0909 23:18:27.993016 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.993305 kubelet[2893]: E0909 23:18:27.993029 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.994329 kubelet[2893]: E0909 23:18:27.994303 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.994329 kubelet[2893]: W0909 23:18:27.994327 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.994771 kubelet[2893]: E0909 23:18:27.994343 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:27.994771 kubelet[2893]: E0909 23:18:27.994557 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.994771 kubelet[2893]: W0909 23:18:27.994570 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.994771 kubelet[2893]: E0909 23:18:27.994586 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.995233 kubelet[2893]: E0909 23:18:27.994810 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.995233 kubelet[2893]: W0909 23:18:27.994823 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.995233 kubelet[2893]: E0909 23:18:27.994836 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.995233 kubelet[2893]: E0909 23:18:27.995139 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.995233 kubelet[2893]: W0909 23:18:27.995154 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.995233 kubelet[2893]: E0909 23:18:27.995193 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.995494 kubelet[2893]: E0909 23:18:27.995421 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.995494 kubelet[2893]: W0909 23:18:27.995433 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.995494 kubelet[2893]: E0909 23:18:27.995447 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.996346 kubelet[2893]: E0909 23:18:27.995972 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.996346 kubelet[2893]: W0909 23:18:27.995986 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.996346 kubelet[2893]: E0909 23:18:27.996005 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:27.996346 kubelet[2893]: E0909 23:18:27.996266 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.996346 kubelet[2893]: W0909 23:18:27.996278 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.996346 kubelet[2893]: E0909 23:18:27.996292 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.998515 kubelet[2893]: E0909 23:18:27.997126 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.998515 kubelet[2893]: W0909 23:18:27.997139 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.998515 kubelet[2893]: E0909 23:18:27.997153 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.998515 kubelet[2893]: E0909 23:18:27.997417 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.998515 kubelet[2893]: W0909 23:18:27.997430 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.998515 kubelet[2893]: E0909 23:18:27.997444 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.998515 kubelet[2893]: E0909 23:18:27.997668 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.998515 kubelet[2893]: W0909 23:18:27.997680 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.998515 kubelet[2893]: E0909 23:18:27.997694 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.998515 kubelet[2893]: E0909 23:18:27.998027 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.999252 kubelet[2893]: W0909 23:18:27.999227 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.999347 kubelet[2893]: E0909 23:18:27.999258 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:27.999511 kubelet[2893]: E0909 23:18:27.999468 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.999511 kubelet[2893]: W0909 23:18:27.999487 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.999511 kubelet[2893]: E0909 23:18:27.999502 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:27.999825 kubelet[2893]: E0909 23:18:27.999755 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:27.999825 kubelet[2893]: W0909 23:18:27.999768 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:27.999825 kubelet[2893]: E0909 23:18:27.999783 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.000002 kubelet[2893]: E0909 23:18:27.999984 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.000002 kubelet[2893]: W0909 23:18:27.999996 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.000086 kubelet[2893]: E0909 23:18:28.000009 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.010069 kubelet[2893]: E0909 23:18:28.009865 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.010069 kubelet[2893]: W0909 23:18:28.009899 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.010069 kubelet[2893]: E0909 23:18:28.009923 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.010599 kubelet[2893]: E0909 23:18:28.010572 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.010599 kubelet[2893]: W0909 23:18:28.010593 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.010890 kubelet[2893]: E0909 23:18:28.010620 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.010890 kubelet[2893]: I0909 23:18:28.010658 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/88a3a96f-1289-44aa-b5ca-25670a32dc3d-varrun\") pod \"csi-node-driver-lkvhf\" (UID: \"88a3a96f-1289-44aa-b5ca-25670a32dc3d\") " pod="calico-system/csi-node-driver-lkvhf" Sep 9 23:18:28.011287 kubelet[2893]: E0909 23:18:28.011260 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.011287 kubelet[2893]: W0909 23:18:28.011282 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.011498 kubelet[2893]: E0909 23:18:28.011304 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.011498 kubelet[2893]: I0909 23:18:28.011330 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/88a3a96f-1289-44aa-b5ca-25670a32dc3d-kubelet-dir\") pod \"csi-node-driver-lkvhf\" (UID: \"88a3a96f-1289-44aa-b5ca-25670a32dc3d\") " pod="calico-system/csi-node-driver-lkvhf" Sep 9 23:18:28.011751 kubelet[2893]: E0909 23:18:28.011720 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.011751 kubelet[2893]: W0909 23:18:28.011742 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.011948 kubelet[2893]: E0909 23:18:28.011764 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.013599 kubelet[2893]: E0909 23:18:28.013572 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.013599 kubelet[2893]: W0909 23:18:28.013594 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.013893 kubelet[2893]: E0909 23:18:28.013791 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.013893 kubelet[2893]: E0909 23:18:28.013826 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.013893 kubelet[2893]: I0909 23:18:28.013830 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88a3a96f-1289-44aa-b5ca-25670a32dc3d-registration-dir\") pod \"csi-node-driver-lkvhf\" (UID: \"88a3a96f-1289-44aa-b5ca-25670a32dc3d\") " pod="calico-system/csi-node-driver-lkvhf" Sep 9 23:18:28.013893 kubelet[2893]: W0909 23:18:28.013839 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.013893 kubelet[2893]: E0909 23:18:28.013866 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.014672 kubelet[2893]: E0909 23:18:28.014626 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.014672 kubelet[2893]: W0909 23:18:28.014646 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.014789 kubelet[2893]: E0909 23:18:28.014680 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.015335 kubelet[2893]: E0909 23:18:28.015322 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.015393 kubelet[2893]: W0909 23:18:28.015336 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.015393 kubelet[2893]: E0909 23:18:28.015380 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.015682 kubelet[2893]: E0909 23:18:28.015651 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.015682 kubelet[2893]: W0909 23:18:28.015670 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.015779 kubelet[2893]: E0909 23:18:28.015703 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.016695 kubelet[2893]: E0909 23:18:28.016640 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.016695 kubelet[2893]: W0909 23:18:28.016659 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.016695 kubelet[2893]: E0909 23:18:28.016674 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.017278 kubelet[2893]: E0909 23:18:28.017015 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.017278 kubelet[2893]: W0909 23:18:28.017029 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.017278 kubelet[2893]: E0909 23:18:28.017043 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.017416 kubelet[2893]: E0909 23:18:28.017345 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.017416 kubelet[2893]: W0909 23:18:28.017362 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.017416 kubelet[2893]: E0909 23:18:28.017377 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.018061 kubelet[2893]: E0909 23:18:28.017969 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.018061 kubelet[2893]: W0909 23:18:28.017983 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.018061 kubelet[2893]: E0909 23:18:28.017997 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.018061 kubelet[2893]: I0909 23:18:28.018053 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88a3a96f-1289-44aa-b5ca-25670a32dc3d-socket-dir\") pod \"csi-node-driver-lkvhf\" (UID: \"88a3a96f-1289-44aa-b5ca-25670a32dc3d\") " pod="calico-system/csi-node-driver-lkvhf" Sep 9 23:18:28.019347 kubelet[2893]: E0909 23:18:28.019325 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.019347 kubelet[2893]: W0909 23:18:28.019345 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.019480 kubelet[2893]: E0909 23:18:28.019381 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.019480 kubelet[2893]: I0909 23:18:28.019407 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzgx2\" (UniqueName: \"kubernetes.io/projected/88a3a96f-1289-44aa-b5ca-25670a32dc3d-kube-api-access-pzgx2\") pod \"csi-node-driver-lkvhf\" (UID: \"88a3a96f-1289-44aa-b5ca-25670a32dc3d\") " pod="calico-system/csi-node-driver-lkvhf" Sep 9 23:18:28.019694 kubelet[2893]: E0909 23:18:28.019672 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.019694 kubelet[2893]: W0909 23:18:28.019691 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.019917 kubelet[2893]: E0909 23:18:28.019712 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.019972 kubelet[2893]: E0909 23:18:28.019954 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.019972 kubelet[2893]: W0909 23:18:28.019967 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.020058 kubelet[2893]: E0909 23:18:28.019983 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.020243 kubelet[2893]: E0909 23:18:28.020223 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.020243 kubelet[2893]: W0909 23:18:28.020241 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.020683 kubelet[2893]: E0909 23:18:28.020256 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.020992 kubelet[2893]: E0909 23:18:28.020963 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.020992 kubelet[2893]: W0909 23:18:28.020982 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.021223 kubelet[2893]: E0909 23:18:28.020997 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.120822 kubelet[2893]: E0909 23:18:28.120780 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.120822 kubelet[2893]: W0909 23:18:28.120813 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.121041 kubelet[2893]: E0909 23:18:28.120840 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.121112 kubelet[2893]: E0909 23:18:28.121093 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.121204 kubelet[2893]: W0909 23:18:28.121112 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.121204 kubelet[2893]: E0909 23:18:28.121127 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.122369 kubelet[2893]: E0909 23:18:28.122344 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.122369 kubelet[2893]: W0909 23:18:28.122364 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.122485 kubelet[2893]: E0909 23:18:28.122382 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.122705 kubelet[2893]: E0909 23:18:28.122684 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.122705 kubelet[2893]: W0909 23:18:28.122702 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.122809 kubelet[2893]: E0909 23:18:28.122733 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.123358 kubelet[2893]: E0909 23:18:28.123335 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.123358 kubelet[2893]: W0909 23:18:28.123354 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.123467 kubelet[2893]: E0909 23:18:28.123387 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.123693 kubelet[2893]: E0909 23:18:28.123669 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.123693 kubelet[2893]: W0909 23:18:28.123689 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.123806 kubelet[2893]: E0909 23:18:28.123720 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.124471 kubelet[2893]: E0909 23:18:28.124441 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.124471 kubelet[2893]: W0909 23:18:28.124460 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.124574 kubelet[2893]: E0909 23:18:28.124549 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.124894 kubelet[2893]: E0909 23:18:28.124777 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.124894 kubelet[2893]: W0909 23:18:28.124796 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.126310 kubelet[2893]: E0909 23:18:28.126286 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.126415 kubelet[2893]: E0909 23:18:28.126388 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.126415 kubelet[2893]: W0909 23:18:28.126406 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.126511 kubelet[2893]: E0909 23:18:28.126494 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.126726 kubelet[2893]: E0909 23:18:28.126702 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.126726 kubelet[2893]: W0909 23:18:28.126720 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.126820 kubelet[2893]: E0909 23:18:28.126807 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.127360 kubelet[2893]: E0909 23:18:28.127338 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.127360 kubelet[2893]: W0909 23:18:28.127357 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.127494 kubelet[2893]: E0909 23:18:28.127472 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.127691 kubelet[2893]: E0909 23:18:28.127669 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.127691 kubelet[2893]: W0909 23:18:28.127688 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.127970 kubelet[2893]: E0909 23:18:28.127783 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.128033 kubelet[2893]: E0909 23:18:28.128012 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.128033 kubelet[2893]: W0909 23:18:28.128025 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.128125 kubelet[2893]: E0909 23:18:28.128112 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.128365 kubelet[2893]: E0909 23:18:28.128343 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.128365 kubelet[2893]: W0909 23:18:28.128361 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.128620 kubelet[2893]: E0909 23:18:28.128448 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.128689 kubelet[2893]: E0909 23:18:28.128646 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.128689 kubelet[2893]: W0909 23:18:28.128659 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.129263 kubelet[2893]: E0909 23:18:28.128763 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.129263 kubelet[2893]: E0909 23:18:28.128869 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.129263 kubelet[2893]: W0909 23:18:28.128881 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.130236 kubelet[2893]: E0909 23:18:28.130209 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.130481 kubelet[2893]: E0909 23:18:28.130457 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.130481 kubelet[2893]: W0909 23:18:28.130478 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.130581 kubelet[2893]: E0909 23:18:28.130566 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.130786 kubelet[2893]: E0909 23:18:28.130766 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.130839 kubelet[2893]: W0909 23:18:28.130787 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.131044 kubelet[2893]: E0909 23:18:28.130876 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.131112 kubelet[2893]: E0909 23:18:28.131087 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.131112 kubelet[2893]: W0909 23:18:28.131100 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.131311 kubelet[2893]: E0909 23:18:28.131291 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.131737 kubelet[2893]: E0909 23:18:28.131388 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.131737 kubelet[2893]: W0909 23:18:28.131400 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.131737 kubelet[2893]: E0909 23:18:28.131485 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.131737 kubelet[2893]: E0909 23:18:28.131713 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.131737 kubelet[2893]: W0909 23:18:28.131725 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.132041 kubelet[2893]: E0909 23:18:28.132009 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.132106 kubelet[2893]: E0909 23:18:28.132086 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.132157 kubelet[2893]: W0909 23:18:28.132104 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.132386 kubelet[2893]: E0909 23:18:28.132270 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.132386 kubelet[2893]: E0909 23:18:28.132382 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.132475 kubelet[2893]: W0909 23:18:28.132395 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.132521 kubelet[2893]: E0909 23:18:28.132483 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.133733 kubelet[2893]: E0909 23:18:28.133707 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.133733 kubelet[2893]: W0909 23:18:28.133726 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.133941 kubelet[2893]: E0909 23:18:28.133919 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.134307 kubelet[2893]: E0909 23:18:28.134285 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.134307 kubelet[2893]: W0909 23:18:28.134305 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.134516 kubelet[2893]: E0909 23:18:28.134496 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.134616 kubelet[2893]: E0909 23:18:28.134587 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.134673 kubelet[2893]: W0909 23:18:28.134605 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.134729 kubelet[2893]: E0909 23:18:28.134704 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.134919 kubelet[2893]: E0909 23:18:28.134895 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.134919 kubelet[2893]: W0909 23:18:28.134913 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.135000 kubelet[2893]: E0909 23:18:28.134928 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.135226 kubelet[2893]: E0909 23:18:28.135206 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.135226 kubelet[2893]: W0909 23:18:28.135224 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.135345 kubelet[2893]: E0909 23:18:28.135239 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.228892 kubelet[2893]: E0909 23:18:28.228845 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.228892 kubelet[2893]: W0909 23:18:28.228878 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.229102 kubelet[2893]: E0909 23:18:28.228906 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.230420 kubelet[2893]: E0909 23:18:28.230388 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.230420 kubelet[2893]: W0909 23:18:28.230409 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.230538 kubelet[2893]: E0909 23:18:28.230425 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.230704 kubelet[2893]: E0909 23:18:28.230678 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.230704 kubelet[2893]: W0909 23:18:28.230698 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.230820 kubelet[2893]: E0909 23:18:28.230714 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.230977 kubelet[2893]: E0909 23:18:28.230952 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.230977 kubelet[2893]: W0909 23:18:28.230972 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.231072 kubelet[2893]: E0909 23:18:28.230987 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.284208 kubelet[2893]: E0909 23:18:28.284157 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.284208 kubelet[2893]: W0909 23:18:28.284206 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.284476 kubelet[2893]: E0909 23:18:28.284232 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.332252 kubelet[2893]: E0909 23:18:28.332204 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.332252 kubelet[2893]: W0909 23:18:28.332239 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.332468 kubelet[2893]: E0909 23:18:28.332268 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.332563 kubelet[2893]: E0909 23:18:28.332536 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.332563 kubelet[2893]: W0909 23:18:28.332561 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.332702 kubelet[2893]: E0909 23:18:28.332580 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.332861 kubelet[2893]: E0909 23:18:28.332832 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.332861 kubelet[2893]: W0909 23:18:28.332854 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.332976 kubelet[2893]: E0909 23:18:28.332870 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.434108 kubelet[2893]: E0909 23:18:28.434063 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.434108 kubelet[2893]: W0909 23:18:28.434096 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.434675 kubelet[2893]: E0909 23:18:28.434123 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.434675 kubelet[2893]: E0909 23:18:28.434415 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.434675 kubelet[2893]: W0909 23:18:28.434428 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.434675 kubelet[2893]: E0909 23:18:28.434442 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.434841 kubelet[2893]: E0909 23:18:28.434696 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.434841 kubelet[2893]: W0909 23:18:28.434709 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.434841 kubelet[2893]: E0909 23:18:28.434723 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.536061 kubelet[2893]: E0909 23:18:28.535368 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.536061 kubelet[2893]: W0909 23:18:28.535401 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.536061 kubelet[2893]: E0909 23:18:28.535426 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.536351 kubelet[2893]: E0909 23:18:28.536293 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.536351 kubelet[2893]: W0909 23:18:28.536312 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.536351 kubelet[2893]: E0909 23:18:28.536330 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.536731 kubelet[2893]: E0909 23:18:28.536699 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.536731 kubelet[2893]: W0909 23:18:28.536722 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.536837 kubelet[2893]: E0909 23:18:28.536738 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.588197 kubelet[2893]: E0909 23:18:28.587996 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.588197 kubelet[2893]: W0909 23:18:28.588027 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.588197 kubelet[2893]: E0909 23:18:28.588051 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.591831 kubelet[2893]: E0909 23:18:28.591809 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.591936 kubelet[2893]: W0909 23:18:28.591917 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.592192 kubelet[2893]: E0909 23:18:28.592043 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:18:28.592484 kubelet[2893]: E0909 23:18:28.592421 2893 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:18:28.592484 kubelet[2893]: W0909 23:18:28.592439 2893 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:18:28.592484 kubelet[2893]: E0909 23:18:28.592454 2893 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:18:28.610391 containerd[1589]: time="2025-09-09T23:18:28.610341930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9g7dw,Uid:b98ba589-a7d1-4b07-81a3-3a2a321aed79,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:28.619381 containerd[1589]: time="2025-09-09T23:18:28.619338504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-845d5cc58-2wlp9,Uid:5768268f-8c27-4a22-953d-5e34ee4c5133,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:28.641645 containerd[1589]: time="2025-09-09T23:18:28.641527754Z" level=info msg="connecting to shim f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b" address="unix:///run/containerd/s/91a2988c0fb9fa433ea9bf245773b67fef2a5325367b256f8d707f9070f88946" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:28.656929 containerd[1589]: time="2025-09-09T23:18:28.656862480Z" level=info msg="connecting to shim a14eb485ccf78e4983dbca38582542c02637a72057aec70971ae81aadee5fa5c" address="unix:///run/containerd/s/51291f75ce81e68a34be54933137b6df529bac7bff2d2317fc7b4056b2959b55" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:28.708442 systemd[1]: Started cri-containerd-f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b.scope - libcontainer container f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b. Sep 9 23:18:28.731407 systemd[1]: Started cri-containerd-a14eb485ccf78e4983dbca38582542c02637a72057aec70971ae81aadee5fa5c.scope - libcontainer container a14eb485ccf78e4983dbca38582542c02637a72057aec70971ae81aadee5fa5c. 
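The long run of kubelet errors above is one failure repeating: the FlexVolume prober keeps finding the nodeagent~uds plugin directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, tries to exec its uds driver with the init argument, finds no executable and therefore gets an empty stdout, and then fails to unmarshal that empty string as JSON ("unexpected end of JSON input"), so the plugin is skipped and re-probed on the next pass. The noise stops once a working driver binary is installed, which is presumably what the calico-node flexvol-driver container started below is for. For reference, a minimal sketch of the stdout kubelet expects from a FlexVolume driver's init call, following the standard FlexVolume call convention; this is an illustrative stand-in, not the Calico uds driver itself:

    // Hypothetical FlexVolume driver sketch: kubelet calls the binary with "init"
    // and expects a JSON status object on stdout. An empty stdout is exactly what
    // produces the "unexpected end of JSON input" errors in the log above.
    package main

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    type driverStatus struct {
        Status       string          `json:"status"`
        Message      string          `json:"message,omitempty"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
        if len(os.Args) > 1 && os.Args[1] == "init" {
            out, _ := json.Marshal(driverStatus{
                Status:       "Success",
                Capabilities: map[string]bool{"attach": false},
            })
            fmt.Println(string(out))
            return
        }
        // Other calls (mount, unmount, ...) would be handled here; per the
        // FlexVolume convention, unimplemented calls report "Not supported".
        out, _ := json.Marshal(driverStatus{Status: "Not supported"})
        fmt.Println(string(out))
    }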
Sep 9 23:18:28.797818 containerd[1589]: time="2025-09-09T23:18:28.796920347Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9g7dw,Uid:b98ba589-a7d1-4b07-81a3-3a2a321aed79,Namespace:calico-system,Attempt:0,} returns sandbox id \"f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b\"" Sep 9 23:18:28.802195 containerd[1589]: time="2025-09-09T23:18:28.801289933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 23:18:28.877690 containerd[1589]: time="2025-09-09T23:18:28.877635560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-845d5cc58-2wlp9,Uid:5768268f-8c27-4a22-953d-5e34ee4c5133,Namespace:calico-system,Attempt:0,} returns sandbox id \"a14eb485ccf78e4983dbca38582542c02637a72057aec70971ae81aadee5fa5c\"" Sep 9 23:18:29.831219 kubelet[2893]: E0909 23:18:29.830191 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkvhf" podUID="88a3a96f-1289-44aa-b5ca-25670a32dc3d" Sep 9 23:18:30.401012 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount769707693.mount: Deactivated successfully. Sep 9 23:18:30.579617 containerd[1589]: time="2025-09-09T23:18:30.579490526Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:30.581323 containerd[1589]: time="2025-09-09T23:18:30.581296238Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=5939501" Sep 9 23:18:30.582200 containerd[1589]: time="2025-09-09T23:18:30.582012381Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:30.585634 containerd[1589]: time="2025-09-09T23:18:30.585550219Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:30.586564 containerd[1589]: time="2025-09-09T23:18:30.586457566Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.784771532s" Sep 9 23:18:30.586564 containerd[1589]: time="2025-09-09T23:18:30.586494770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 23:18:30.589804 containerd[1589]: time="2025-09-09T23:18:30.589727740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 23:18:30.592377 containerd[1589]: time="2025-09-09T23:18:30.592315425Z" level=info msg="CreateContainer within sandbox \"f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 23:18:30.631471 containerd[1589]: time="2025-09-09T23:18:30.630225544Z" level=info msg="Container 
d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:30.640079 containerd[1589]: time="2025-09-09T23:18:30.640016806Z" level=info msg="CreateContainer within sandbox \"f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df\"" Sep 9 23:18:30.640955 containerd[1589]: time="2025-09-09T23:18:30.640926830Z" level=info msg="StartContainer for \"d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df\"" Sep 9 23:18:30.642884 containerd[1589]: time="2025-09-09T23:18:30.642854569Z" level=info msg="connecting to shim d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df" address="unix:///run/containerd/s/91a2988c0fb9fa433ea9bf245773b67fef2a5325367b256f8d707f9070f88946" protocol=ttrpc version=3 Sep 9 23:18:30.678387 systemd[1]: Started cri-containerd-d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df.scope - libcontainer container d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df. Sep 9 23:18:30.768591 containerd[1589]: time="2025-09-09T23:18:30.768495642Z" level=info msg="StartContainer for \"d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df\" returns successfully" Sep 9 23:18:30.788393 systemd[1]: cri-containerd-d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df.scope: Deactivated successfully. Sep 9 23:18:30.817758 containerd[1589]: time="2025-09-09T23:18:30.817701277Z" level=info msg="received exit event container_id:\"d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df\" id:\"d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df\" pid:3506 exited_at:{seconds:1757459910 nanos:794437110}" Sep 9 23:18:30.829055 containerd[1589]: time="2025-09-09T23:18:30.829002799Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df\" id:\"d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df\" pid:3506 exited_at:{seconds:1757459910 nanos:794437110}" Sep 9 23:18:31.328963 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d1ab8e7384c9c62f91e4808cd9fd5e5b41ef387ce13e28e167d6ec0e9b9fe6df-rootfs.mount: Deactivated successfully. 
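The flexvol-driver container above runs briefly and exits, which is why its systemd scope is deactivated moments after the StartContainer success. The exit time travels in the containerd event as a seconds/nanos pair; a quick sketch (values copied from the exited_at field above) decodes it and lands between the scope deactivation at 23:18:30.788 and the exit event logged at 23:18:30.817:

    // Illustrative only: decode the {seconds, nanos} exited_at pair from the
    // TaskExit event above into a wall-clock timestamp.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        exitedAt := time.Unix(1757459910, 794437110).UTC() // values copied from the log
        // Prints 2025-09-09T23:18:30.79443711Z (RFC3339Nano trims the trailing zero).
        fmt.Println(exitedAt.Format(time.RFC3339Nano))
    }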
Sep 9 23:18:31.828204 kubelet[2893]: E0909 23:18:31.827997 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkvhf" podUID="88a3a96f-1289-44aa-b5ca-25670a32dc3d" Sep 9 23:18:33.828858 kubelet[2893]: E0909 23:18:33.828449 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkvhf" podUID="88a3a96f-1289-44aa-b5ca-25670a32dc3d" Sep 9 23:18:33.912573 containerd[1589]: time="2025-09-09T23:18:33.912357950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:33.914279 containerd[1589]: time="2025-09-09T23:18:33.914112387Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33744548" Sep 9 23:18:33.915597 containerd[1589]: time="2025-09-09T23:18:33.915556250Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:33.919902 containerd[1589]: time="2025-09-09T23:18:33.919835618Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:33.921218 containerd[1589]: time="2025-09-09T23:18:33.920943037Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.331170135s" Sep 9 23:18:33.921218 containerd[1589]: time="2025-09-09T23:18:33.921004336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 9 23:18:33.924584 containerd[1589]: time="2025-09-09T23:18:33.923619173Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 23:18:33.947066 containerd[1589]: time="2025-09-09T23:18:33.947010433Z" level=info msg="CreateContainer within sandbox \"a14eb485ccf78e4983dbca38582542c02637a72057aec70971ae81aadee5fa5c\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 23:18:33.956767 containerd[1589]: time="2025-09-09T23:18:33.956720794Z" level=info msg="Container 1a9e51042bf039fd0c6e7530bbbc1b831c5e3a0cb3e8816da4baaf595ec9697d: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:33.977114 containerd[1589]: time="2025-09-09T23:18:33.977027581Z" level=info msg="CreateContainer within sandbox \"a14eb485ccf78e4983dbca38582542c02637a72057aec70971ae81aadee5fa5c\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"1a9e51042bf039fd0c6e7530bbbc1b831c5e3a0cb3e8816da4baaf595ec9697d\"" Sep 9 23:18:33.978610 containerd[1589]: time="2025-09-09T23:18:33.977596697Z" level=info msg="StartContainer for \"1a9e51042bf039fd0c6e7530bbbc1b831c5e3a0cb3e8816da4baaf595ec9697d\"" Sep 9 23:18:33.982024 
containerd[1589]: time="2025-09-09T23:18:33.981973664Z" level=info msg="connecting to shim 1a9e51042bf039fd0c6e7530bbbc1b831c5e3a0cb3e8816da4baaf595ec9697d" address="unix:///run/containerd/s/51291f75ce81e68a34be54933137b6df529bac7bff2d2317fc7b4056b2959b55" protocol=ttrpc version=3 Sep 9 23:18:34.038453 systemd[1]: Started cri-containerd-1a9e51042bf039fd0c6e7530bbbc1b831c5e3a0cb3e8816da4baaf595ec9697d.scope - libcontainer container 1a9e51042bf039fd0c6e7530bbbc1b831c5e3a0cb3e8816da4baaf595ec9697d. Sep 9 23:18:34.124783 containerd[1589]: time="2025-09-09T23:18:34.124586849Z" level=info msg="StartContainer for \"1a9e51042bf039fd0c6e7530bbbc1b831c5e3a0cb3e8816da4baaf595ec9697d\" returns successfully" Sep 9 23:18:35.829740 kubelet[2893]: E0909 23:18:35.828468 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkvhf" podUID="88a3a96f-1289-44aa-b5ca-25670a32dc3d" Sep 9 23:18:35.982721 kubelet[2893]: I0909 23:18:35.982432 2893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 23:18:37.829856 kubelet[2893]: E0909 23:18:37.829540 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkvhf" podUID="88a3a96f-1289-44aa-b5ca-25670a32dc3d" Sep 9 23:18:38.983852 containerd[1589]: time="2025-09-09T23:18:38.983800505Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:38.985810 containerd[1589]: time="2025-09-09T23:18:38.985774713Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 23:18:38.986833 containerd[1589]: time="2025-09-09T23:18:38.986779789Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:38.989154 containerd[1589]: time="2025-09-09T23:18:38.989113324Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:38.991004 containerd[1589]: time="2025-09-09T23:18:38.990948466Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 5.067290769s" Sep 9 23:18:38.991088 containerd[1589]: time="2025-09-09T23:18:38.991002734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 23:18:38.994836 containerd[1589]: time="2025-09-09T23:18:38.994775951Z" level=info msg="CreateContainer within sandbox \"f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 23:18:39.016623 containerd[1589]: 
time="2025-09-09T23:18:39.016571028Z" level=info msg="Container 78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:39.040771 containerd[1589]: time="2025-09-09T23:18:39.040694979Z" level=info msg="CreateContainer within sandbox \"f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59\"" Sep 9 23:18:39.042739 containerd[1589]: time="2025-09-09T23:18:39.042709275Z" level=info msg="StartContainer for \"78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59\"" Sep 9 23:18:39.045042 containerd[1589]: time="2025-09-09T23:18:39.044981965Z" level=info msg="connecting to shim 78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59" address="unix:///run/containerd/s/91a2988c0fb9fa433ea9bf245773b67fef2a5325367b256f8d707f9070f88946" protocol=ttrpc version=3 Sep 9 23:18:39.077391 systemd[1]: Started cri-containerd-78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59.scope - libcontainer container 78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59. Sep 9 23:18:39.154929 containerd[1589]: time="2025-09-09T23:18:39.154733933Z" level=info msg="StartContainer for \"78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59\" returns successfully" Sep 9 23:18:39.842384 kubelet[2893]: E0909 23:18:39.842312 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lkvhf" podUID="88a3a96f-1289-44aa-b5ca-25670a32dc3d" Sep 9 23:18:40.032473 kubelet[2893]: I0909 23:18:40.030565 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-845d5cc58-2wlp9" podStartSLOduration=7.987329426 podStartE2EDuration="13.030545839s" podCreationTimestamp="2025-09-09 23:18:27 +0000 UTC" firstStartedPulling="2025-09-09 23:18:28.879252943 +0000 UTC m=+21.289437843" lastFinishedPulling="2025-09-09 23:18:33.922469361 +0000 UTC m=+26.332654256" observedRunningTime="2025-09-09 23:18:34.997671437 +0000 UTC m=+27.407856375" watchObservedRunningTime="2025-09-09 23:18:40.030545839 +0000 UTC m=+32.440730746" Sep 9 23:18:40.132608 systemd[1]: cri-containerd-78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59.scope: Deactivated successfully. Sep 9 23:18:40.133890 systemd[1]: cri-containerd-78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59.scope: Consumed 711ms CPU time, 169.4M memory peak, 9.3M read from disk, 171.3M written to disk. Sep 9 23:18:40.206266 kubelet[2893]: I0909 23:18:40.205106 2893 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 9 23:18:40.280982 systemd[1]: Created slice kubepods-burstable-pod6bee4708_c6c2_4c72_85d5_67acf5711a98.slice - libcontainer container kubepods-burstable-pod6bee4708_c6c2_4c72_85d5_67acf5711a98.slice. 
Sep 9 23:18:40.291383 containerd[1589]: time="2025-09-09T23:18:40.291322038Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59\" id:\"78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59\" pid:3605 exited_at:{seconds:1757459920 nanos:261343606}" Sep 9 23:18:40.291818 containerd[1589]: time="2025-09-09T23:18:40.291720352Z" level=info msg="received exit event container_id:\"78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59\" id:\"78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59\" pid:3605 exited_at:{seconds:1757459920 nanos:261343606}" Sep 9 23:18:40.294899 kubelet[2893]: W0909 23:18:40.294242 2893 reflector.go:561] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:srv-5qwy1.gb1.brightbox.com" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'srv-5qwy1.gb1.brightbox.com' and this object Sep 9 23:18:40.294899 kubelet[2893]: E0909 23:18:40.294298 2893 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:srv-5qwy1.gb1.brightbox.com\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'srv-5qwy1.gb1.brightbox.com' and this object" logger="UnhandledError" Sep 9 23:18:40.303764 systemd[1]: Created slice kubepods-besteffort-pode5d2f19e_a775_4e1a_873a_f65ee624f7f0.slice - libcontainer container kubepods-besteffort-pode5d2f19e_a775_4e1a_873a_f65ee624f7f0.slice. Sep 9 23:18:40.317541 systemd[1]: Created slice kubepods-burstable-pod38bec17a_1d12_40cd_be11_efa68728ce36.slice - libcontainer container kubepods-burstable-pod38bec17a_1d12_40cd_be11_efa68728ce36.slice. 
Sep 9 23:18:40.334459 kubelet[2893]: I0909 23:18:40.334261 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njkrt\" (UniqueName: \"kubernetes.io/projected/e5d2f19e-a775-4e1a-873a-f65ee624f7f0-kube-api-access-njkrt\") pod \"calico-apiserver-c5bc589cc-wtzzr\" (UID: \"e5d2f19e-a775-4e1a-873a-f65ee624f7f0\") " pod="calico-apiserver/calico-apiserver-c5bc589cc-wtzzr" Sep 9 23:18:40.335441 kubelet[2893]: I0909 23:18:40.335405 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4437d017-b50d-4c6e-9b27-8d321a43b551-whisker-ca-bundle\") pod \"whisker-6ccbfd889c-n54ct\" (UID: \"4437d017-b50d-4c6e-9b27-8d321a43b551\") " pod="calico-system/whisker-6ccbfd889c-n54ct" Sep 9 23:18:40.336379 kubelet[2893]: I0909 23:18:40.335602 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhfg2\" (UniqueName: \"kubernetes.io/projected/6bee4708-c6c2-4c72-85d5-67acf5711a98-kube-api-access-jhfg2\") pod \"coredns-7c65d6cfc9-w4xgg\" (UID: \"6bee4708-c6c2-4c72-85d5-67acf5711a98\") " pod="kube-system/coredns-7c65d6cfc9-w4xgg" Sep 9 23:18:40.337076 kubelet[2893]: I0909 23:18:40.337052 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/ef7990c2-8108-400b-aa6e-868d127aa7b8-goldmane-key-pair\") pod \"goldmane-7988f88666-lvhtm\" (UID: \"ef7990c2-8108-400b-aa6e-868d127aa7b8\") " pod="calico-system/goldmane-7988f88666-lvhtm" Sep 9 23:18:40.339831 kubelet[2893]: I0909 23:18:40.337218 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df965b2b-b1fe-46f6-8aa0-5651c4d730d6-tigera-ca-bundle\") pod \"calico-kube-controllers-5c6cd884f-hjkmf\" (UID: \"df965b2b-b1fe-46f6-8aa0-5651c4d730d6\") " pod="calico-system/calico-kube-controllers-5c6cd884f-hjkmf" Sep 9 23:18:40.337933 systemd[1]: Created slice kubepods-besteffort-pod4437d017_b50d_4c6e_9b27_8d321a43b551.slice - libcontainer container kubepods-besteffort-pod4437d017_b50d_4c6e_9b27_8d321a43b551.slice. 
Sep 9 23:18:40.341267 kubelet[2893]: I0909 23:18:40.341019 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e5d2f19e-a775-4e1a-873a-f65ee624f7f0-calico-apiserver-certs\") pod \"calico-apiserver-c5bc589cc-wtzzr\" (UID: \"e5d2f19e-a775-4e1a-873a-f65ee624f7f0\") " pod="calico-apiserver/calico-apiserver-c5bc589cc-wtzzr" Sep 9 23:18:40.341511 kubelet[2893]: I0909 23:18:40.341486 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef7990c2-8108-400b-aa6e-868d127aa7b8-goldmane-ca-bundle\") pod \"goldmane-7988f88666-lvhtm\" (UID: \"ef7990c2-8108-400b-aa6e-868d127aa7b8\") " pod="calico-system/goldmane-7988f88666-lvhtm" Sep 9 23:18:40.346213 kubelet[2893]: I0909 23:18:40.341630 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jk6v\" (UniqueName: \"kubernetes.io/projected/bcc9abc6-6d48-4bc6-a634-589f1b9e5771-kube-api-access-6jk6v\") pod \"calico-apiserver-c5bc589cc-26mdm\" (UID: \"bcc9abc6-6d48-4bc6-a634-589f1b9e5771\") " pod="calico-apiserver/calico-apiserver-c5bc589cc-26mdm" Sep 9 23:18:40.347583 kubelet[2893]: I0909 23:18:40.347227 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc5nh\" (UniqueName: \"kubernetes.io/projected/ef7990c2-8108-400b-aa6e-868d127aa7b8-kube-api-access-fc5nh\") pod \"goldmane-7988f88666-lvhtm\" (UID: \"ef7990c2-8108-400b-aa6e-868d127aa7b8\") " pod="calico-system/goldmane-7988f88666-lvhtm" Sep 9 23:18:40.347583 kubelet[2893]: I0909 23:18:40.347272 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bee4708-c6c2-4c72-85d5-67acf5711a98-config-volume\") pod \"coredns-7c65d6cfc9-w4xgg\" (UID: \"6bee4708-c6c2-4c72-85d5-67acf5711a98\") " pod="kube-system/coredns-7c65d6cfc9-w4xgg" Sep 9 23:18:40.347583 kubelet[2893]: I0909 23:18:40.347323 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7990c2-8108-400b-aa6e-868d127aa7b8-config\") pod \"goldmane-7988f88666-lvhtm\" (UID: \"ef7990c2-8108-400b-aa6e-868d127aa7b8\") " pod="calico-system/goldmane-7988f88666-lvhtm" Sep 9 23:18:40.347583 kubelet[2893]: I0909 23:18:40.347348 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rshqn\" (UniqueName: \"kubernetes.io/projected/df965b2b-b1fe-46f6-8aa0-5651c4d730d6-kube-api-access-rshqn\") pod \"calico-kube-controllers-5c6cd884f-hjkmf\" (UID: \"df965b2b-b1fe-46f6-8aa0-5651c4d730d6\") " pod="calico-system/calico-kube-controllers-5c6cd884f-hjkmf" Sep 9 23:18:40.349262 kubelet[2893]: I0909 23:18:40.347680 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4437d017-b50d-4c6e-9b27-8d321a43b551-whisker-backend-key-pair\") pod \"whisker-6ccbfd889c-n54ct\" (UID: \"4437d017-b50d-4c6e-9b27-8d321a43b551\") " pod="calico-system/whisker-6ccbfd889c-n54ct" Sep 9 23:18:40.349262 kubelet[2893]: I0909 23:18:40.347732 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/bcc9abc6-6d48-4bc6-a634-589f1b9e5771-calico-apiserver-certs\") pod \"calico-apiserver-c5bc589cc-26mdm\" (UID: \"bcc9abc6-6d48-4bc6-a634-589f1b9e5771\") " pod="calico-apiserver/calico-apiserver-c5bc589cc-26mdm" Sep 9 23:18:40.349262 kubelet[2893]: I0909 23:18:40.347758 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38bec17a-1d12-40cd-be11-efa68728ce36-config-volume\") pod \"coredns-7c65d6cfc9-cht4z\" (UID: \"38bec17a-1d12-40cd-be11-efa68728ce36\") " pod="kube-system/coredns-7c65d6cfc9-cht4z" Sep 9 23:18:40.349262 kubelet[2893]: I0909 23:18:40.347783 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qwnh\" (UniqueName: \"kubernetes.io/projected/4437d017-b50d-4c6e-9b27-8d321a43b551-kube-api-access-8qwnh\") pod \"whisker-6ccbfd889c-n54ct\" (UID: \"4437d017-b50d-4c6e-9b27-8d321a43b551\") " pod="calico-system/whisker-6ccbfd889c-n54ct" Sep 9 23:18:40.349262 kubelet[2893]: I0909 23:18:40.347812 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw2b8\" (UniqueName: \"kubernetes.io/projected/38bec17a-1d12-40cd-be11-efa68728ce36-kube-api-access-gw2b8\") pod \"coredns-7c65d6cfc9-cht4z\" (UID: \"38bec17a-1d12-40cd-be11-efa68728ce36\") " pod="kube-system/coredns-7c65d6cfc9-cht4z" Sep 9 23:18:40.368744 systemd[1]: Created slice kubepods-besteffort-podef7990c2_8108_400b_aa6e_868d127aa7b8.slice - libcontainer container kubepods-besteffort-podef7990c2_8108_400b_aa6e_868d127aa7b8.slice. Sep 9 23:18:40.389531 systemd[1]: Created slice kubepods-besteffort-podbcc9abc6_6d48_4bc6_a634_589f1b9e5771.slice - libcontainer container kubepods-besteffort-podbcc9abc6_6d48_4bc6_a634_589f1b9e5771.slice. Sep 9 23:18:40.409846 systemd[1]: Created slice kubepods-besteffort-poddf965b2b_b1fe_46f6_8aa0_5651c4d730d6.slice - libcontainer container kubepods-besteffort-poddf965b2b_b1fe_46f6_8aa0_5651c4d730d6.slice. Sep 9 23:18:40.425280 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-78ff2f106ba5f418d9d2a646b5fed1d4a989bcc207ba9490c2268cbc4be3ea59-rootfs.mount: Deactivated successfully. 
Sep 9 23:18:40.609489 containerd[1589]: time="2025-09-09T23:18:40.609191601Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w4xgg,Uid:6bee4708-c6c2-4c72-85d5-67acf5711a98,Namespace:kube-system,Attempt:0,}" Sep 9 23:18:40.615198 containerd[1589]: time="2025-09-09T23:18:40.615029606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5bc589cc-wtzzr,Uid:e5d2f19e-a775-4e1a-873a-f65ee624f7f0,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:18:40.633231 containerd[1589]: time="2025-09-09T23:18:40.632887511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cht4z,Uid:38bec17a-1d12-40cd-be11-efa68728ce36,Namespace:kube-system,Attempt:0,}" Sep 9 23:18:40.676775 containerd[1589]: time="2025-09-09T23:18:40.676565271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6ccbfd889c-n54ct,Uid:4437d017-b50d-4c6e-9b27-8d321a43b551,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:40.700823 containerd[1589]: time="2025-09-09T23:18:40.699226134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5bc589cc-26mdm,Uid:bcc9abc6-6d48-4bc6-a634-589f1b9e5771,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:18:40.720773 containerd[1589]: time="2025-09-09T23:18:40.720723262Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6cd884f-hjkmf,Uid:df965b2b-b1fe-46f6-8aa0-5651c4d730d6,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:41.025100 containerd[1589]: time="2025-09-09T23:18:41.024866359Z" level=error msg="Failed to destroy network for sandbox \"009249e6bcc082e5d5a7af2c653c251da9d267acb9ce6edf0cb8e480e8e9e224\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.025431 containerd[1589]: time="2025-09-09T23:18:41.025133472Z" level=error msg="Failed to destroy network for sandbox \"19dea9b3453f8e7e7a6fab981d0732c02f9d0cd50016722aab2703d6ba9ba321\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.059483 containerd[1589]: time="2025-09-09T23:18:41.036063365Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w4xgg,Uid:6bee4708-c6c2-4c72-85d5-67acf5711a98,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"009249e6bcc082e5d5a7af2c653c251da9d267acb9ce6edf0cb8e480e8e9e224\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.061499 containerd[1589]: time="2025-09-09T23:18:41.061434348Z" level=error msg="Failed to destroy network for sandbox \"798121c0707beb89a6fd314ac2cfc187a3e177728d235614efb6aa4055eae15f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.065560 containerd[1589]: time="2025-09-09T23:18:41.065483241Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6ccbfd889c-n54ct,Uid:4437d017-b50d-4c6e-9b27-8d321a43b551,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"19dea9b3453f8e7e7a6fab981d0732c02f9d0cd50016722aab2703d6ba9ba321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.068541 kubelet[2893]: E0909 23:18:41.068326 2893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"009249e6bcc082e5d5a7af2c653c251da9d267acb9ce6edf0cb8e480e8e9e224\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.072587 kubelet[2893]: E0909 23:18:41.069128 2893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"009249e6bcc082e5d5a7af2c653c251da9d267acb9ce6edf0cb8e480e8e9e224\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-w4xgg" Sep 9 23:18:41.072587 kubelet[2893]: E0909 23:18:41.069205 2893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"009249e6bcc082e5d5a7af2c653c251da9d267acb9ce6edf0cb8e480e8e9e224\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-w4xgg" Sep 9 23:18:41.072587 kubelet[2893]: E0909 23:18:41.069302 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-w4xgg_kube-system(6bee4708-c6c2-4c72-85d5-67acf5711a98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-w4xgg_kube-system(6bee4708-c6c2-4c72-85d5-67acf5711a98)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"009249e6bcc082e5d5a7af2c653c251da9d267acb9ce6edf0cb8e480e8e9e224\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-w4xgg" podUID="6bee4708-c6c2-4c72-85d5-67acf5711a98" Sep 9 23:18:41.082284 containerd[1589]: time="2025-09-09T23:18:41.082201244Z" level=error msg="Failed to destroy network for sandbox \"f2994f07f08ad4a20033da2cff1c55e143fef8eac6fce50d1331ce3157ff3d1b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.084193 kubelet[2893]: E0909 23:18:41.083146 2893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19dea9b3453f8e7e7a6fab981d0732c02f9d0cd50016722aab2703d6ba9ba321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.084193 kubelet[2893]: E0909 23:18:41.083260 2893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"19dea9b3453f8e7e7a6fab981d0732c02f9d0cd50016722aab2703d6ba9ba321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6ccbfd889c-n54ct" Sep 9 23:18:41.084193 kubelet[2893]: E0909 23:18:41.083289 2893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19dea9b3453f8e7e7a6fab981d0732c02f9d0cd50016722aab2703d6ba9ba321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6ccbfd889c-n54ct" Sep 9 23:18:41.084365 kubelet[2893]: E0909 23:18:41.083348 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6ccbfd889c-n54ct_calico-system(4437d017-b50d-4c6e-9b27-8d321a43b551)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6ccbfd889c-n54ct_calico-system(4437d017-b50d-4c6e-9b27-8d321a43b551)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"19dea9b3453f8e7e7a6fab981d0732c02f9d0cd50016722aab2703d6ba9ba321\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6ccbfd889c-n54ct" podUID="4437d017-b50d-4c6e-9b27-8d321a43b551" Sep 9 23:18:41.086732 containerd[1589]: time="2025-09-09T23:18:41.086676834Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5bc589cc-wtzzr,Uid:e5d2f19e-a775-4e1a-873a-f65ee624f7f0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2994f07f08ad4a20033da2cff1c55e143fef8eac6fce50d1331ce3157ff3d1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.088230 containerd[1589]: time="2025-09-09T23:18:41.087982570Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cht4z,Uid:38bec17a-1d12-40cd-be11-efa68728ce36,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"798121c0707beb89a6fd314ac2cfc187a3e177728d235614efb6aa4055eae15f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.088998 kubelet[2893]: E0909 23:18:41.088951 2893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"798121c0707beb89a6fd314ac2cfc187a3e177728d235614efb6aa4055eae15f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.090138 kubelet[2893]: E0909 23:18:41.090090 2893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"798121c0707beb89a6fd314ac2cfc187a3e177728d235614efb6aa4055eae15f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-cht4z" Sep 9 23:18:41.090245 kubelet[2893]: E0909 23:18:41.088951 2893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2994f07f08ad4a20033da2cff1c55e143fef8eac6fce50d1331ce3157ff3d1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.090245 kubelet[2893]: E0909 23:18:41.090221 2893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2994f07f08ad4a20033da2cff1c55e143fef8eac6fce50d1331ce3157ff3d1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c5bc589cc-wtzzr" Sep 9 23:18:41.091288 kubelet[2893]: E0909 23:18:41.090377 2893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"798121c0707beb89a6fd314ac2cfc187a3e177728d235614efb6aa4055eae15f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-cht4z" Sep 9 23:18:41.091288 kubelet[2893]: E0909 23:18:41.090471 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-cht4z_kube-system(38bec17a-1d12-40cd-be11-efa68728ce36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-cht4z_kube-system(38bec17a-1d12-40cd-be11-efa68728ce36)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"798121c0707beb89a6fd314ac2cfc187a3e177728d235614efb6aa4055eae15f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-cht4z" podUID="38bec17a-1d12-40cd-be11-efa68728ce36" Sep 9 23:18:41.091288 kubelet[2893]: E0909 23:18:41.091207 2893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2994f07f08ad4a20033da2cff1c55e143fef8eac6fce50d1331ce3157ff3d1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c5bc589cc-wtzzr" Sep 9 23:18:41.091532 containerd[1589]: time="2025-09-09T23:18:41.090558850Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 23:18:41.091588 kubelet[2893]: E0909 23:18:41.091260 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c5bc589cc-wtzzr_calico-apiserver(e5d2f19e-a775-4e1a-873a-f65ee624f7f0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c5bc589cc-wtzzr_calico-apiserver(e5d2f19e-a775-4e1a-873a-f65ee624f7f0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"f2994f07f08ad4a20033da2cff1c55e143fef8eac6fce50d1331ce3157ff3d1b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c5bc589cc-wtzzr" podUID="e5d2f19e-a775-4e1a-873a-f65ee624f7f0" Sep 9 23:18:41.129970 containerd[1589]: time="2025-09-09T23:18:41.129339639Z" level=error msg="Failed to destroy network for sandbox \"13af6f67b23220a24b86c5dc0e973c3d0cc84f79751c391a1ae50beab2feffb7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.131960 containerd[1589]: time="2025-09-09T23:18:41.131841282Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6cd884f-hjkmf,Uid:df965b2b-b1fe-46f6-8aa0-5651c4d730d6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"13af6f67b23220a24b86c5dc0e973c3d0cc84f79751c391a1ae50beab2feffb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.132243 kubelet[2893]: E0909 23:18:41.132196 2893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13af6f67b23220a24b86c5dc0e973c3d0cc84f79751c391a1ae50beab2feffb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.132470 kubelet[2893]: E0909 23:18:41.132267 2893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13af6f67b23220a24b86c5dc0e973c3d0cc84f79751c391a1ae50beab2feffb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5c6cd884f-hjkmf" Sep 9 23:18:41.132470 kubelet[2893]: E0909 23:18:41.132297 2893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13af6f67b23220a24b86c5dc0e973c3d0cc84f79751c391a1ae50beab2feffb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5c6cd884f-hjkmf" Sep 9 23:18:41.132470 kubelet[2893]: E0909 23:18:41.132368 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5c6cd884f-hjkmf_calico-system(df965b2b-b1fe-46f6-8aa0-5651c4d730d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5c6cd884f-hjkmf_calico-system(df965b2b-b1fe-46f6-8aa0-5651c4d730d6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"13af6f67b23220a24b86c5dc0e973c3d0cc84f79751c391a1ae50beab2feffb7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-5c6cd884f-hjkmf" podUID="df965b2b-b1fe-46f6-8aa0-5651c4d730d6" Sep 9 23:18:41.135777 containerd[1589]: time="2025-09-09T23:18:41.135704878Z" level=error msg="Failed to destroy network for sandbox \"4ceabb6c91ced843d0fc144aefa68f15a6d6253000fd306e444241981223222b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.137031 containerd[1589]: time="2025-09-09T23:18:41.136947753Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5bc589cc-26mdm,Uid:bcc9abc6-6d48-4bc6-a634-589f1b9e5771,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ceabb6c91ced843d0fc144aefa68f15a6d6253000fd306e444241981223222b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.137710 kubelet[2893]: E0909 23:18:41.137639 2893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ceabb6c91ced843d0fc144aefa68f15a6d6253000fd306e444241981223222b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.137859 kubelet[2893]: E0909 23:18:41.137724 2893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ceabb6c91ced843d0fc144aefa68f15a6d6253000fd306e444241981223222b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c5bc589cc-26mdm" Sep 9 23:18:41.137859 kubelet[2893]: E0909 23:18:41.137772 2893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ceabb6c91ced843d0fc144aefa68f15a6d6253000fd306e444241981223222b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c5bc589cc-26mdm" Sep 9 23:18:41.138257 kubelet[2893]: E0909 23:18:41.137979 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c5bc589cc-26mdm_calico-apiserver(bcc9abc6-6d48-4bc6-a634-589f1b9e5771)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c5bc589cc-26mdm_calico-apiserver(bcc9abc6-6d48-4bc6-a634-589f1b9e5771)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ceabb6c91ced843d0fc144aefa68f15a6d6253000fd306e444241981223222b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c5bc589cc-26mdm" podUID="bcc9abc6-6d48-4bc6-a634-589f1b9e5771" Sep 9 23:18:41.582840 containerd[1589]: time="2025-09-09T23:18:41.582774356Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-7988f88666-lvhtm,Uid:ef7990c2-8108-400b-aa6e-868d127aa7b8,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:41.674203 containerd[1589]: time="2025-09-09T23:18:41.674095732Z" level=error msg="Failed to destroy network for sandbox \"6d60807da4f2b73c8c3b01f4111e92a50baebc463238537b67f38d8a48f8b308\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.677610 containerd[1589]: time="2025-09-09T23:18:41.677472671Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-lvhtm,Uid:ef7990c2-8108-400b-aa6e-868d127aa7b8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d60807da4f2b73c8c3b01f4111e92a50baebc463238537b67f38d8a48f8b308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.678233 kubelet[2893]: E0909 23:18:41.678112 2893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d60807da4f2b73c8c3b01f4111e92a50baebc463238537b67f38d8a48f8b308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.678316 kubelet[2893]: E0909 23:18:41.678262 2893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d60807da4f2b73c8c3b01f4111e92a50baebc463238537b67f38d8a48f8b308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-lvhtm" Sep 9 23:18:41.678940 systemd[1]: run-netns-cni\x2d93b0cca1\x2d54f7\x2db2cb\x2d88c9\x2d26d708deb37c.mount: Deactivated successfully. 
Sep 9 23:18:41.679712 kubelet[2893]: E0909 23:18:41.679263 2893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d60807da4f2b73c8c3b01f4111e92a50baebc463238537b67f38d8a48f8b308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-lvhtm" Sep 9 23:18:41.679712 kubelet[2893]: E0909 23:18:41.679390 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-lvhtm_calico-system(ef7990c2-8108-400b-aa6e-868d127aa7b8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-lvhtm_calico-system(ef7990c2-8108-400b-aa6e-868d127aa7b8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6d60807da4f2b73c8c3b01f4111e92a50baebc463238537b67f38d8a48f8b308\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-lvhtm" podUID="ef7990c2-8108-400b-aa6e-868d127aa7b8" Sep 9 23:18:41.837784 systemd[1]: Created slice kubepods-besteffort-pod88a3a96f_1289_44aa_b5ca_25670a32dc3d.slice - libcontainer container kubepods-besteffort-pod88a3a96f_1289_44aa_b5ca_25670a32dc3d.slice. Sep 9 23:18:41.842597 containerd[1589]: time="2025-09-09T23:18:41.842538546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lkvhf,Uid:88a3a96f-1289-44aa-b5ca-25670a32dc3d,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:41.927389 containerd[1589]: time="2025-09-09T23:18:41.927324309Z" level=error msg="Failed to destroy network for sandbox \"6e82b5a076258865a7c0c55c60accbc334252a621ae8f09657b02200c2bc5c95\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.930353 systemd[1]: run-netns-cni\x2d5e9a195f\x2d27cc\x2d4be7\x2de108\x2dcfd9fa111ddc.mount: Deactivated successfully. 
Sep 9 23:18:41.931377 containerd[1589]: time="2025-09-09T23:18:41.930330535Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lkvhf,Uid:88a3a96f-1289-44aa-b5ca-25670a32dc3d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e82b5a076258865a7c0c55c60accbc334252a621ae8f09657b02200c2bc5c95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.932518 kubelet[2893]: E0909 23:18:41.931780 2893 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e82b5a076258865a7c0c55c60accbc334252a621ae8f09657b02200c2bc5c95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:18:41.932724 kubelet[2893]: E0909 23:18:41.932689 2893 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e82b5a076258865a7c0c55c60accbc334252a621ae8f09657b02200c2bc5c95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lkvhf" Sep 9 23:18:41.932902 kubelet[2893]: E0909 23:18:41.932874 2893 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e82b5a076258865a7c0c55c60accbc334252a621ae8f09657b02200c2bc5c95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lkvhf" Sep 9 23:18:41.933378 kubelet[2893]: E0909 23:18:41.933040 2893 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-lkvhf_calico-system(88a3a96f-1289-44aa-b5ca-25670a32dc3d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-lkvhf_calico-system(88a3a96f-1289-44aa-b5ca-25670a32dc3d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6e82b5a076258865a7c0c55c60accbc334252a621ae8f09657b02200c2bc5c95\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-lkvhf" podUID="88a3a96f-1289-44aa-b5ca-25670a32dc3d" Sep 9 23:18:50.847732 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2526105275.mount: Deactivated successfully. 
Sep 9 23:18:50.906260 containerd[1589]: time="2025-09-09T23:18:50.894922443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:50.906260 containerd[1589]: time="2025-09-09T23:18:50.905902113Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 23:18:50.907837 containerd[1589]: time="2025-09-09T23:18:50.907804486Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:50.908944 containerd[1589]: time="2025-09-09T23:18:50.908906458Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 9.818299872s" Sep 9 23:18:50.909019 containerd[1589]: time="2025-09-09T23:18:50.908950410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 23:18:50.909806 containerd[1589]: time="2025-09-09T23:18:50.909694347Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:50.931783 containerd[1589]: time="2025-09-09T23:18:50.931598799Z" level=info msg="CreateContainer within sandbox \"f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 23:18:50.975410 containerd[1589]: time="2025-09-09T23:18:50.975358735Z" level=info msg="Container 798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:50.975661 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2053699211.mount: Deactivated successfully. Sep 9 23:18:50.999406 containerd[1589]: time="2025-09-09T23:18:50.999344042Z" level=info msg="CreateContainer within sandbox \"f925e06b94a50aab4131fabc19e8a036fa66103be37a444c780acfa37c19296b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\"" Sep 9 23:18:51.000603 containerd[1589]: time="2025-09-09T23:18:51.000556185Z" level=info msg="StartContainer for \"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\"" Sep 9 23:18:51.010615 containerd[1589]: time="2025-09-09T23:18:51.010548946Z" level=info msg="connecting to shim 798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e" address="unix:///run/containerd/s/91a2988c0fb9fa433ea9bf245773b67fef2a5325367b256f8d707f9070f88946" protocol=ttrpc version=3 Sep 9 23:18:51.108434 systemd[1]: Started cri-containerd-798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e.scope - libcontainer container 798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e. Sep 9 23:18:51.191995 containerd[1589]: time="2025-09-09T23:18:51.191920429Z" level=info msg="StartContainer for \"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\" returns successfully" Sep 9 23:18:51.331656 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Sep 9 23:18:51.338668 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Sep 9 23:18:51.635109 kubelet[2893]: I0909 23:18:51.634123 2893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qwnh\" (UniqueName: \"kubernetes.io/projected/4437d017-b50d-4c6e-9b27-8d321a43b551-kube-api-access-8qwnh\") pod \"4437d017-b50d-4c6e-9b27-8d321a43b551\" (UID: \"4437d017-b50d-4c6e-9b27-8d321a43b551\") " Sep 9 23:18:51.636944 kubelet[2893]: I0909 23:18:51.636129 2893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4437d017-b50d-4c6e-9b27-8d321a43b551-whisker-ca-bundle\") pod \"4437d017-b50d-4c6e-9b27-8d321a43b551\" (UID: \"4437d017-b50d-4c6e-9b27-8d321a43b551\") " Sep 9 23:18:51.636944 kubelet[2893]: I0909 23:18:51.636249 2893 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4437d017-b50d-4c6e-9b27-8d321a43b551-whisker-backend-key-pair\") pod \"4437d017-b50d-4c6e-9b27-8d321a43b551\" (UID: \"4437d017-b50d-4c6e-9b27-8d321a43b551\") " Sep 9 23:18:51.642248 kubelet[2893]: I0909 23:18:51.641349 2893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4437d017-b50d-4c6e-9b27-8d321a43b551-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "4437d017-b50d-4c6e-9b27-8d321a43b551" (UID: "4437d017-b50d-4c6e-9b27-8d321a43b551"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 9 23:18:51.643516 kubelet[2893]: I0909 23:18:51.643468 2893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4437d017-b50d-4c6e-9b27-8d321a43b551-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "4437d017-b50d-4c6e-9b27-8d321a43b551" (UID: "4437d017-b50d-4c6e-9b27-8d321a43b551"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 9 23:18:51.644856 kubelet[2893]: I0909 23:18:51.644819 2893 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4437d017-b50d-4c6e-9b27-8d321a43b551-kube-api-access-8qwnh" (OuterVolumeSpecName: "kube-api-access-8qwnh") pod "4437d017-b50d-4c6e-9b27-8d321a43b551" (UID: "4437d017-b50d-4c6e-9b27-8d321a43b551"). InnerVolumeSpecName "kube-api-access-8qwnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 9 23:18:51.737616 kubelet[2893]: I0909 23:18:51.737478 2893 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qwnh\" (UniqueName: \"kubernetes.io/projected/4437d017-b50d-4c6e-9b27-8d321a43b551-kube-api-access-8qwnh\") on node \"srv-5qwy1.gb1.brightbox.com\" DevicePath \"\"" Sep 9 23:18:51.738162 kubelet[2893]: I0909 23:18:51.737824 2893 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4437d017-b50d-4c6e-9b27-8d321a43b551-whisker-ca-bundle\") on node \"srv-5qwy1.gb1.brightbox.com\" DevicePath \"\"" Sep 9 23:18:51.738162 kubelet[2893]: I0909 23:18:51.738131 2893 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4437d017-b50d-4c6e-9b27-8d321a43b551-whisker-backend-key-pair\") on node \"srv-5qwy1.gb1.brightbox.com\" DevicePath \"\"" Sep 9 23:18:51.830679 containerd[1589]: time="2025-09-09T23:18:51.830546470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w4xgg,Uid:6bee4708-c6c2-4c72-85d5-67acf5711a98,Namespace:kube-system,Attempt:0,}" Sep 9 23:18:51.832159 containerd[1589]: time="2025-09-09T23:18:51.831995861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5bc589cc-wtzzr,Uid:e5d2f19e-a775-4e1a-873a-f65ee624f7f0,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:18:51.858772 systemd[1]: var-lib-kubelet-pods-4437d017\x2db50d\x2d4c6e\x2d9b27\x2d8d321a43b551-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8qwnh.mount: Deactivated successfully. Sep 9 23:18:51.861634 systemd[1]: var-lib-kubelet-pods-4437d017\x2db50d\x2d4c6e\x2d9b27\x2d8d321a43b551-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 23:18:51.883136 systemd[1]: Removed slice kubepods-besteffort-pod4437d017_b50d_4c6e_9b27_8d321a43b551.slice - libcontainer container kubepods-besteffort-pod4437d017_b50d_4c6e_9b27_8d321a43b551.slice. Sep 9 23:18:52.188580 kubelet[2893]: I0909 23:18:52.187337 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9g7dw" podStartSLOduration=3.077557322 podStartE2EDuration="25.18731674s" podCreationTimestamp="2025-09-09 23:18:27 +0000 UTC" firstStartedPulling="2025-09-09 23:18:28.800560679 +0000 UTC m=+21.210745572" lastFinishedPulling="2025-09-09 23:18:50.91032009 +0000 UTC m=+43.320504990" observedRunningTime="2025-09-09 23:18:52.185941804 +0000 UTC m=+44.596126724" watchObservedRunningTime="2025-09-09 23:18:52.18731674 +0000 UTC m=+44.597501648" Sep 9 23:18:52.315383 systemd[1]: Created slice kubepods-besteffort-poda26bb48e_2502_48d2_8d96_9dd3b152d029.slice - libcontainer container kubepods-besteffort-poda26bb48e_2502_48d2_8d96_9dd3b152d029.slice. 
Sep 9 23:18:52.342462 kubelet[2893]: I0909 23:18:52.342409 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a26bb48e-2502-48d2-8d96-9dd3b152d029-whisker-backend-key-pair\") pod \"whisker-77cdf898b8-n8jqb\" (UID: \"a26bb48e-2502-48d2-8d96-9dd3b152d029\") " pod="calico-system/whisker-77cdf898b8-n8jqb" Sep 9 23:18:52.342462 kubelet[2893]: I0909 23:18:52.342471 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp8zd\" (UniqueName: \"kubernetes.io/projected/a26bb48e-2502-48d2-8d96-9dd3b152d029-kube-api-access-hp8zd\") pod \"whisker-77cdf898b8-n8jqb\" (UID: \"a26bb48e-2502-48d2-8d96-9dd3b152d029\") " pod="calico-system/whisker-77cdf898b8-n8jqb" Sep 9 23:18:52.342843 kubelet[2893]: I0909 23:18:52.342528 2893 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a26bb48e-2502-48d2-8d96-9dd3b152d029-whisker-ca-bundle\") pod \"whisker-77cdf898b8-n8jqb\" (UID: \"a26bb48e-2502-48d2-8d96-9dd3b152d029\") " pod="calico-system/whisker-77cdf898b8-n8jqb" Sep 9 23:18:52.379368 systemd-networkd[1518]: calidb6b67bc792: Link UP Sep 9 23:18:52.380987 systemd-networkd[1518]: calidb6b67bc792: Gained carrier Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:51.934 [INFO][3919] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:51.967 [INFO][3919] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0 calico-apiserver-c5bc589cc- calico-apiserver e5d2f19e-a775-4e1a-873a-f65ee624f7f0 839 0 2025-09-09 23:18:24 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c5bc589cc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-5qwy1.gb1.brightbox.com calico-apiserver-c5bc589cc-wtzzr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidb6b67bc792 [] [] }} ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-wtzzr" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:51.967 [INFO][3919] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-wtzzr" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.173 [INFO][3954] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" HandleID="k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.174 [INFO][3954] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" 
HandleID="k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000395bc0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-5qwy1.gb1.brightbox.com", "pod":"calico-apiserver-c5bc589cc-wtzzr", "timestamp":"2025-09-09 23:18:52.173274484 +0000 UTC"}, Hostname:"srv-5qwy1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.174 [INFO][3954] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.175 [INFO][3954] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.175 [INFO][3954] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5qwy1.gb1.brightbox.com' Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.248 [INFO][3954] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.270 [INFO][3954] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.298 [INFO][3954] ipam/ipam.go 511: Trying affinity for 192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.318 [INFO][3954] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.329 [INFO][3954] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.329 [INFO][3954] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.192/26 handle="k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.331 [INFO][3954] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.345 [INFO][3954] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.192/26 handle="k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.354 [INFO][3954] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.193/26] block=192.168.8.192/26 handle="k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.354 [INFO][3954] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.193/26] handle="k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.354 [INFO][3954] ipam/ipam_plugin.go 374: Released host-wide IPAM 
lock. Sep 9 23:18:52.417054 containerd[1589]: 2025-09-09 23:18:52.354 [INFO][3954] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.193/26] IPv6=[] ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" HandleID="k8s-pod-network.d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" Sep 9 23:18:52.420155 containerd[1589]: 2025-09-09 23:18:52.358 [INFO][3919] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-wtzzr" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0", GenerateName:"calico-apiserver-c5bc589cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"e5d2f19e-a775-4e1a-873a-f65ee624f7f0", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5bc589cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-c5bc589cc-wtzzr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb6b67bc792", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:52.420155 containerd[1589]: 2025-09-09 23:18:52.359 [INFO][3919] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.193/32] ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-wtzzr" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" Sep 9 23:18:52.420155 containerd[1589]: 2025-09-09 23:18:52.359 [INFO][3919] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidb6b67bc792 ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-wtzzr" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" Sep 9 23:18:52.420155 containerd[1589]: 2025-09-09 23:18:52.381 [INFO][3919] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-wtzzr" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" Sep 9 23:18:52.420155 containerd[1589]: 2025-09-09 23:18:52.386 [INFO][3919] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-wtzzr" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0", GenerateName:"calico-apiserver-c5bc589cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"e5d2f19e-a775-4e1a-873a-f65ee624f7f0", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5bc589cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a", Pod:"calico-apiserver-c5bc589cc-wtzzr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb6b67bc792", MAC:"fe:34:a9:cc:3b:ef", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:52.420155 containerd[1589]: 2025-09-09 23:18:52.410 [INFO][3919] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-wtzzr" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--wtzzr-eth0" Sep 9 23:18:52.480394 systemd-networkd[1518]: cali52091bd45fa: Link UP Sep 9 23:18:52.481261 systemd-networkd[1518]: cali52091bd45fa: Gained carrier Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:51.921 [INFO][3918] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:51.966 [INFO][3918] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0 coredns-7c65d6cfc9- kube-system 6bee4708-c6c2-4c72-85d5-67acf5711a98 830 0 2025-09-09 23:18:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-5qwy1.gb1.brightbox.com coredns-7c65d6cfc9-w4xgg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali52091bd45fa [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w4xgg" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-" Sep 
9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:51.968 [INFO][3918] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w4xgg" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.172 [INFO][3952] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" HandleID="k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Workload="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.174 [INFO][3952] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" HandleID="k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Workload="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00062c8b0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-5qwy1.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-w4xgg", "timestamp":"2025-09-09 23:18:52.172578136 +0000 UTC"}, Hostname:"srv-5qwy1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.174 [INFO][3952] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.354 [INFO][3952] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.354 [INFO][3952] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5qwy1.gb1.brightbox.com' Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.378 [INFO][3952] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.399 [INFO][3952] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.414 [INFO][3952] ipam/ipam.go 511: Trying affinity for 192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.420 [INFO][3952] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.430 [INFO][3952] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.430 [INFO][3952] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.192/26 handle="k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.433 [INFO][3952] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.439 [INFO][3952] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.192/26 handle="k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.453 [INFO][3952] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.194/26] block=192.168.8.192/26 handle="k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.454 [INFO][3952] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.194/26] handle="k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.454 [INFO][3952] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
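[editor's note] The two IPAM cycles above (handles d2050f0f… and fb2f38a0…) follow the same pattern: take the host-wide IPAM lock, confirm the node's affinity to block 192.168.8.192/26, claim the next free address from that block (.193, then .194), write the block back, release the lock. The following is a minimal, self-contained Go sketch of that allocation pattern under those assumptions; the `blockAllocator` type and its fields are illustrative only and are not Calico's actual ipam package.

```go
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// blockAllocator is a toy model of the per-host block allocation visible in
// the log: one affine /26 block per node, a host-wide lock, and sequential
// assignment of the next free address. It is NOT Calico's real IPAM code.
type blockAllocator struct {
	mu    sync.Mutex            // stands in for the "host-wide IPAM lock"
	block netip.Prefix          // the node's affine block
	used  map[netip.Addr]string // addr -> handle ID
}

func newBlockAllocator(cidr string) *blockAllocator {
	return &blockAllocator{
		block: netip.MustParsePrefix(cidr),
		used:  map[netip.Addr]string{},
	}
}

// autoAssign claims the next free address in the affine block for a handle.
func (b *blockAllocator) autoAssign(handle string) (netip.Addr, error) {
	b.mu.Lock()
	defer b.mu.Unlock()

	// Skip the block's network address and walk the block sequentially.
	for a := b.block.Addr().Next(); b.block.Contains(a); a = a.Next() {
		if _, taken := b.used[a]; !taken {
			b.used[a] = handle
			return a, nil
		}
	}
	return netip.Addr{}, fmt.Errorf("block %s exhausted", b.block)
}

func main() {
	alloc := newBlockAllocator("192.168.8.192/26")
	for _, handle := range []string{
		"k8s-pod-network.d2050f0f...", // calico-apiserver-c5bc589cc-wtzzr
		"k8s-pod-network.fb2f38a0...", // coredns-7c65d6cfc9-w4xgg
	} {
		ip, _ := alloc.autoAssign(handle)
		fmt.Printf("%s -> %s\n", handle, ip) // .193 then .194, as in the log
	}
}
```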
Sep 9 23:18:52.515565 containerd[1589]: 2025-09-09 23:18:52.454 [INFO][3952] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.194/26] IPv6=[] ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" HandleID="k8s-pod-network.fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Workload="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" Sep 9 23:18:52.516988 containerd[1589]: 2025-09-09 23:18:52.469 [INFO][3918] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w4xgg" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"6bee4708-c6c2-4c72-85d5-67acf5711a98", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-w4xgg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali52091bd45fa", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:52.516988 containerd[1589]: 2025-09-09 23:18:52.470 [INFO][3918] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.194/32] ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w4xgg" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" Sep 9 23:18:52.516988 containerd[1589]: 2025-09-09 23:18:52.470 [INFO][3918] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali52091bd45fa ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w4xgg" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" Sep 9 23:18:52.516988 containerd[1589]: 2025-09-09 23:18:52.481 [INFO][3918] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-w4xgg" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" Sep 9 23:18:52.516988 containerd[1589]: 2025-09-09 23:18:52.482 [INFO][3918] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w4xgg" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"6bee4708-c6c2-4c72-85d5-67acf5711a98", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f", Pod:"coredns-7c65d6cfc9-w4xgg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali52091bd45fa", MAC:"16:f3:e3:13:0d:06", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:52.518367 containerd[1589]: 2025-09-09 23:18:52.503 [INFO][3918] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w4xgg" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--w4xgg-eth0" Sep 9 23:18:52.625313 containerd[1589]: time="2025-09-09T23:18:52.625253538Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77cdf898b8-n8jqb,Uid:a26bb48e-2502-48d2-8d96-9dd3b152d029,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:52.707873 containerd[1589]: time="2025-09-09T23:18:52.707818820Z" level=info msg="connecting to shim d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a" address="unix:///run/containerd/s/3d95c0fef768609e37ce469731a981e93aa632cb279d7f7ce015ff1c44b76c45" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:52.729302 containerd[1589]: time="2025-09-09T23:18:52.729162613Z" level=info msg="connecting to shim fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f" 
address="unix:///run/containerd/s/919e72365dda7ad74437d67c89c0fc247e437cc9614a41db52261037291ef0b5" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:52.759376 systemd[1]: Started cri-containerd-d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a.scope - libcontainer container d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a. Sep 9 23:18:52.797612 systemd[1]: Started cri-containerd-fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f.scope - libcontainer container fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f. Sep 9 23:18:52.832959 containerd[1589]: time="2025-09-09T23:18:52.832905992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6cd884f-hjkmf,Uid:df965b2b-b1fe-46f6-8aa0-5651c4d730d6,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:52.956525 systemd-networkd[1518]: cali356f65cd283: Link UP Sep 9 23:18:52.960728 systemd-networkd[1518]: cali356f65cd283: Gained carrier Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.689 [INFO][3991] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.719 [INFO][3991] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0 whisker-77cdf898b8- calico-system a26bb48e-2502-48d2-8d96-9dd3b152d029 911 0 2025-09-09 23:18:52 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:77cdf898b8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-5qwy1.gb1.brightbox.com whisker-77cdf898b8-n8jqb eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali356f65cd283 [] [] }} ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Namespace="calico-system" Pod="whisker-77cdf898b8-n8jqb" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.719 [INFO][3991] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Namespace="calico-system" Pod="whisker-77cdf898b8-n8jqb" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.809 [INFO][4031] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" HandleID="k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Workload="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.811 [INFO][4031] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" HandleID="k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Workload="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00037c600), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-5qwy1.gb1.brightbox.com", "pod":"whisker-77cdf898b8-n8jqb", "timestamp":"2025-09-09 23:18:52.809897843 +0000 UTC"}, Hostname:"srv-5qwy1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.811 [INFO][4031] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.811 [INFO][4031] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.812 [INFO][4031] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5qwy1.gb1.brightbox.com' Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.834 [INFO][4031] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.847 [INFO][4031] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.866 [INFO][4031] ipam/ipam.go 511: Trying affinity for 192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.869 [INFO][4031] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.875 [INFO][4031] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.875 [INFO][4031] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.192/26 handle="k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.880 [INFO][4031] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571 Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.893 [INFO][4031] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.192/26 handle="k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.909 [INFO][4031] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.195/26] block=192.168.8.192/26 handle="k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.909 [INFO][4031] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.195/26] handle="k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.909 [INFO][4031] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
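[editor's note] Each IPAM cycle closes with a "Successfully claimed IPs" record naming the claimed address, the block, and the handle (192.168.8.195/26 for the whisker pod above). When auditing a node's address usage from the journal alone, a small line parser over these records is often enough. The sketch below is a hypothetical helper written for this log format, not part of Calico or containerd.

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// claimRe matches Calico's "Successfully claimed IPs" records as they appear
// in the journal, capturing the claimed address, the block, and the handle.
var claimRe = regexp.MustCompile(
	`Successfully claimed IPs: \[([0-9./]+)\] block=([0-9./]+) handle="([^"]+)"`)

func main() {
	// Feed journal text on stdin, e.g.:
	//   journalctl -u containerd | go run claims.go
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // these log lines are long
	for sc.Scan() {
		if m := claimRe.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("claimed %-18s block %-18s handle %s\n", m[1], m[2], m[3])
		}
	}
}
```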
Sep 9 23:18:53.004385 containerd[1589]: 2025-09-09 23:18:52.909 [INFO][4031] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.195/26] IPv6=[] ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" HandleID="k8s-pod-network.be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Workload="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" Sep 9 23:18:53.008616 containerd[1589]: 2025-09-09 23:18:52.915 [INFO][3991] cni-plugin/k8s.go 418: Populated endpoint ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Namespace="calico-system" Pod="whisker-77cdf898b8-n8jqb" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0", GenerateName:"whisker-77cdf898b8-", Namespace:"calico-system", SelfLink:"", UID:"a26bb48e-2502-48d2-8d96-9dd3b152d029", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"77cdf898b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"", Pod:"whisker-77cdf898b8-n8jqb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.8.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali356f65cd283", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:53.008616 containerd[1589]: 2025-09-09 23:18:52.915 [INFO][3991] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.195/32] ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Namespace="calico-system" Pod="whisker-77cdf898b8-n8jqb" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" Sep 9 23:18:53.008616 containerd[1589]: 2025-09-09 23:18:52.915 [INFO][3991] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali356f65cd283 ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Namespace="calico-system" Pod="whisker-77cdf898b8-n8jqb" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" Sep 9 23:18:53.008616 containerd[1589]: 2025-09-09 23:18:52.962 [INFO][3991] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Namespace="calico-system" Pod="whisker-77cdf898b8-n8jqb" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" Sep 9 23:18:53.008616 containerd[1589]: 2025-09-09 23:18:52.965 [INFO][3991] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Namespace="calico-system" 
Pod="whisker-77cdf898b8-n8jqb" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0", GenerateName:"whisker-77cdf898b8-", Namespace:"calico-system", SelfLink:"", UID:"a26bb48e-2502-48d2-8d96-9dd3b152d029", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"77cdf898b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571", Pod:"whisker-77cdf898b8-n8jqb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.8.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali356f65cd283", MAC:"9e:71:23:c8:4d:57", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:53.008616 containerd[1589]: 2025-09-09 23:18:52.990 [INFO][3991] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" Namespace="calico-system" Pod="whisker-77cdf898b8-n8jqb" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-whisker--77cdf898b8--n8jqb-eth0" Sep 9 23:18:53.024960 containerd[1589]: time="2025-09-09T23:18:53.024813272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w4xgg,Uid:6bee4708-c6c2-4c72-85d5-67acf5711a98,Namespace:kube-system,Attempt:0,} returns sandbox id \"fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f\"" Sep 9 23:18:53.045508 containerd[1589]: time="2025-09-09T23:18:53.044413819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5bc589cc-wtzzr,Uid:e5d2f19e-a775-4e1a-873a-f65ee624f7f0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a\"" Sep 9 23:18:53.065039 containerd[1589]: time="2025-09-09T23:18:53.064976414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 23:18:53.077724 containerd[1589]: time="2025-09-09T23:18:53.077674348Z" level=info msg="CreateContainer within sandbox \"fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 23:18:53.140114 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3697429588.mount: Deactivated successfully. 
Sep 9 23:18:53.155853 containerd[1589]: time="2025-09-09T23:18:53.155800456Z" level=info msg="Container c997c5b78c67e111f8cd3194f7fa6b0f0400028a8fd6e11c4dcafb549c035e68: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:53.187986 containerd[1589]: time="2025-09-09T23:18:53.187937243Z" level=info msg="CreateContainer within sandbox \"fb2f38a05d71dad1edb951227d70f0c34878a02e8f600bfdf767a5dcca24776f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c997c5b78c67e111f8cd3194f7fa6b0f0400028a8fd6e11c4dcafb549c035e68\"" Sep 9 23:18:53.197199 containerd[1589]: time="2025-09-09T23:18:53.196932809Z" level=info msg="StartContainer for \"c997c5b78c67e111f8cd3194f7fa6b0f0400028a8fd6e11c4dcafb549c035e68\"" Sep 9 23:18:53.218191 containerd[1589]: time="2025-09-09T23:18:53.215426882Z" level=info msg="connecting to shim c997c5b78c67e111f8cd3194f7fa6b0f0400028a8fd6e11c4dcafb549c035e68" address="unix:///run/containerd/s/919e72365dda7ad74437d67c89c0fc247e437cc9614a41db52261037291ef0b5" protocol=ttrpc version=3 Sep 9 23:18:53.228014 containerd[1589]: time="2025-09-09T23:18:53.227844110Z" level=info msg="connecting to shim be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571" address="unix:///run/containerd/s/9084cacb1f822237911b3234c7485af1ab0f5cf8f0482405b055695722d4a8ac" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:53.331996 systemd[1]: Started cri-containerd-c997c5b78c67e111f8cd3194f7fa6b0f0400028a8fd6e11c4dcafb549c035e68.scope - libcontainer container c997c5b78c67e111f8cd3194f7fa6b0f0400028a8fd6e11c4dcafb549c035e68. Sep 9 23:18:53.371399 systemd-networkd[1518]: cali185280c9fce: Link UP Sep 9 23:18:53.373078 systemd-networkd[1518]: cali185280c9fce: Gained carrier Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:52.916 [INFO][4081] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:52.951 [INFO][4081] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0 calico-kube-controllers-5c6cd884f- calico-system df965b2b-b1fe-46f6-8aa0-5651c4d730d6 838 0 2025-09-09 23:18:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5c6cd884f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-5qwy1.gb1.brightbox.com calico-kube-controllers-5c6cd884f-hjkmf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali185280c9fce [] [] }} ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Namespace="calico-system" Pod="calico-kube-controllers-5c6cd884f-hjkmf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:52.951 [INFO][4081] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Namespace="calico-system" Pod="calico-kube-controllers-5c6cd884f-hjkmf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.124 [INFO][4103] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" HandleID="k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.124 [INFO][4103] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" HandleID="k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f160), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-5qwy1.gb1.brightbox.com", "pod":"calico-kube-controllers-5c6cd884f-hjkmf", "timestamp":"2025-09-09 23:18:53.122299322 +0000 UTC"}, Hostname:"srv-5qwy1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.124 [INFO][4103] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.124 [INFO][4103] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.126 [INFO][4103] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5qwy1.gb1.brightbox.com' Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.174 [INFO][4103] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.204 [INFO][4103] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.232 [INFO][4103] ipam/ipam.go 511: Trying affinity for 192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.240 [INFO][4103] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.258 [INFO][4103] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.258 [INFO][4103] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.192/26 handle="k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.262 [INFO][4103] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.271 [INFO][4103] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.192/26 handle="k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.286 [INFO][4103] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.196/26] block=192.168.8.192/26 
handle="k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.287 [INFO][4103] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.196/26] handle="k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:53.401222 containerd[1589]: 2025-09-09 23:18:53.287 [INFO][4103] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:18:53.407098 containerd[1589]: 2025-09-09 23:18:53.287 [INFO][4103] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.196/26] IPv6=[] ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" HandleID="k8s-pod-network.d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" Sep 9 23:18:53.407098 containerd[1589]: 2025-09-09 23:18:53.303 [INFO][4081] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Namespace="calico-system" Pod="calico-kube-controllers-5c6cd884f-hjkmf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0", GenerateName:"calico-kube-controllers-5c6cd884f-", Namespace:"calico-system", SelfLink:"", UID:"df965b2b-b1fe-46f6-8aa0-5651c4d730d6", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c6cd884f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-5c6cd884f-hjkmf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.8.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali185280c9fce", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:53.407098 containerd[1589]: 2025-09-09 23:18:53.312 [INFO][4081] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.196/32] ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Namespace="calico-system" Pod="calico-kube-controllers-5c6cd884f-hjkmf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" Sep 9 23:18:53.407098 containerd[1589]: 2025-09-09 23:18:53.312 [INFO][4081] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali185280c9fce ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Namespace="calico-system" 
Pod="calico-kube-controllers-5c6cd884f-hjkmf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" Sep 9 23:18:53.407098 containerd[1589]: 2025-09-09 23:18:53.373 [INFO][4081] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Namespace="calico-system" Pod="calico-kube-controllers-5c6cd884f-hjkmf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" Sep 9 23:18:53.407098 containerd[1589]: 2025-09-09 23:18:53.374 [INFO][4081] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Namespace="calico-system" Pod="calico-kube-controllers-5c6cd884f-hjkmf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0", GenerateName:"calico-kube-controllers-5c6cd884f-", Namespace:"calico-system", SelfLink:"", UID:"df965b2b-b1fe-46f6-8aa0-5651c4d730d6", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c6cd884f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df", Pod:"calico-kube-controllers-5c6cd884f-hjkmf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.8.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali185280c9fce", MAC:"d2:2a:47:91:e8:3e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:53.411145 containerd[1589]: 2025-09-09 23:18:53.396 [INFO][4081] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" Namespace="calico-system" Pod="calico-kube-controllers-5c6cd884f-hjkmf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--kube--controllers--5c6cd884f--hjkmf-eth0" Sep 9 23:18:53.425591 systemd[1]: Started cri-containerd-be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571.scope - libcontainer container be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571. 
Sep 9 23:18:53.521447 containerd[1589]: time="2025-09-09T23:18:53.521375721Z" level=info msg="connecting to shim d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df" address="unix:///run/containerd/s/026062917350160bc089c11bebde898c415d3b33cf73071ae2f4afc0ac5e7f22" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:53.544845 containerd[1589]: time="2025-09-09T23:18:53.544802463Z" level=info msg="StartContainer for \"c997c5b78c67e111f8cd3194f7fa6b0f0400028a8fd6e11c4dcafb549c035e68\" returns successfully" Sep 9 23:18:53.589574 systemd-networkd[1518]: cali52091bd45fa: Gained IPv6LL Sep 9 23:18:53.612503 systemd[1]: Started cri-containerd-d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df.scope - libcontainer container d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df. Sep 9 23:18:53.667629 containerd[1589]: time="2025-09-09T23:18:53.667455167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77cdf898b8-n8jqb,Uid:a26bb48e-2502-48d2-8d96-9dd3b152d029,Namespace:calico-system,Attempt:0,} returns sandbox id \"be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571\"" Sep 9 23:18:53.782988 containerd[1589]: time="2025-09-09T23:18:53.782750495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6cd884f-hjkmf,Uid:df965b2b-b1fe-46f6-8aa0-5651c4d730d6,Namespace:calico-system,Attempt:0,} returns sandbox id \"d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df\"" Sep 9 23:18:53.840299 containerd[1589]: time="2025-09-09T23:18:53.840110807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5bc589cc-26mdm,Uid:bcc9abc6-6d48-4bc6-a634-589f1b9e5771,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:18:53.858201 kubelet[2893]: I0909 23:18:53.858137 2893 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4437d017-b50d-4c6e-9b27-8d321a43b551" path="/var/lib/kubelet/pods/4437d017-b50d-4c6e-9b27-8d321a43b551/volumes" Sep 9 23:18:53.864715 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount988000941.mount: Deactivated successfully. 
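[editor's note] The "Gained IPv6LL" events from systemd-networkd record each cali* veth acquiring its fe80::/64 link-local address. Unless stable-privacy address generation is configured, the kernel derives that address from the interface MAC via modified EUI-64 (flip the universal/local bit of the first octet and splice ff:fe into the middle). A short stdlib sketch of that derivation, using the MAC 16:f3:e3:13:0d:06 that the endpoint dump above assigns to cali52091bd45fa:

```go
package main

import (
	"fmt"
	"net"
	"net/netip"
)

// linkLocalFromMAC computes the modified EUI-64 link-local address for a MAC:
// flip the universal/local bit of the first octet, insert ff:fe in the middle,
// and prepend the fe80::/64 prefix.
func linkLocalFromMAC(mac net.HardwareAddr) netip.Addr {
	var b [16]byte
	b[0], b[1] = 0xfe, 0x80
	b[8] = mac[0] ^ 0x02 // flip the U/L bit
	b[9], b[10] = mac[1], mac[2]
	b[11], b[12] = 0xff, 0xfe
	b[13], b[14], b[15] = mac[3], mac[4], mac[5]
	return netip.AddrFrom16(b)
}

func main() {
	// MAC assigned to cali52091bd45fa in the endpoint dump above.
	mac, _ := net.ParseMAC("16:f3:e3:13:0d:06")
	fmt.Println(linkLocalFromMAC(mac)) // fe80::14f3:e3ff:fe13:d06
}
```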
Sep 9 23:18:54.158535 systemd-networkd[1518]: calica0b230b099: Link UP Sep 9 23:18:54.159674 systemd-networkd[1518]: calica0b230b099: Gained carrier Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:53.958 [INFO][4370] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:53.992 [INFO][4370] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0 calico-apiserver-c5bc589cc- calico-apiserver bcc9abc6-6d48-4bc6-a634-589f1b9e5771 841 0 2025-09-09 23:18:24 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c5bc589cc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-5qwy1.gb1.brightbox.com calico-apiserver-c5bc589cc-26mdm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calica0b230b099 [] [] }} ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-26mdm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:53.995 [INFO][4370] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-26mdm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.088 [INFO][4384] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" HandleID="k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.089 [INFO][4384] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" HandleID="k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5180), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-5qwy1.gb1.brightbox.com", "pod":"calico-apiserver-c5bc589cc-26mdm", "timestamp":"2025-09-09 23:18:54.088701989 +0000 UTC"}, Hostname:"srv-5qwy1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.089 [INFO][4384] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.089 [INFO][4384] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.090 [INFO][4384] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5qwy1.gb1.brightbox.com' Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.104 [INFO][4384] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.114 [INFO][4384] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.122 [INFO][4384] ipam/ipam.go 511: Trying affinity for 192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.124 [INFO][4384] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.127 [INFO][4384] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.128 [INFO][4384] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.192/26 handle="k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.131 [INFO][4384] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.137 [INFO][4384] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.192/26 handle="k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.145 [INFO][4384] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.197/26] block=192.168.8.192/26 handle="k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.145 [INFO][4384] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.197/26] handle="k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.146 [INFO][4384] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
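[editor's note] By this point five workload endpoints (.193 through .197) have been assigned sequentially from the node's single affine block 192.168.8.192/26, which can hold at most 2^(32-26) = 64 addresses before Calico would need to claim another block for this host. A quick stdlib check of the block's size and bounds:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.8.192/26")

	// A /26 spans 2^(32-26) = 64 addresses: 192.168.8.192 .. 192.168.8.255.
	size := 1 << (32 - block.Bits())
	first := block.Addr()
	last := first
	for i := 0; i < size-1; i++ {
		last = last.Next()
	}
	fmt.Printf("block %s: %d addresses, %s .. %s\n", block, size, first, last)

	// Five addresses (.193-.197) have been claimed at this point in the log.
	fmt.Printf("claimed so far: 5 of %d\n", size)
}
```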
Sep 9 23:18:54.192083 containerd[1589]: 2025-09-09 23:18:54.146 [INFO][4384] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.197/26] IPv6=[] ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" HandleID="k8s-pod-network.e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Workload="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" Sep 9 23:18:54.198084 containerd[1589]: 2025-09-09 23:18:54.151 [INFO][4370] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-26mdm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0", GenerateName:"calico-apiserver-c5bc589cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"bcc9abc6-6d48-4bc6-a634-589f1b9e5771", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5bc589cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-c5bc589cc-26mdm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calica0b230b099", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:54.198084 containerd[1589]: 2025-09-09 23:18:54.152 [INFO][4370] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.197/32] ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-26mdm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" Sep 9 23:18:54.198084 containerd[1589]: 2025-09-09 23:18:54.152 [INFO][4370] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica0b230b099 ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-26mdm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" Sep 9 23:18:54.198084 containerd[1589]: 2025-09-09 23:18:54.161 [INFO][4370] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-26mdm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" Sep 9 23:18:54.198084 containerd[1589]: 2025-09-09 23:18:54.162 [INFO][4370] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-26mdm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0", GenerateName:"calico-apiserver-c5bc589cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"bcc9abc6-6d48-4bc6-a634-589f1b9e5771", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5bc589cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e", Pod:"calico-apiserver-c5bc589cc-26mdm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calica0b230b099", MAC:"62:42:96:e2:c9:62", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:54.198084 containerd[1589]: 2025-09-09 23:18:54.176 [INFO][4370] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" Namespace="calico-apiserver" Pod="calico-apiserver-c5bc589cc-26mdm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-calico--apiserver--c5bc589cc--26mdm-eth0" Sep 9 23:18:54.284540 containerd[1589]: time="2025-09-09T23:18:54.284346919Z" level=info msg="connecting to shim e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e" address="unix:///run/containerd/s/ffc93bb29c6fa1ee6d13acc95acb4ebd3eb2106f3e2b61d2c0b0accb6a440cea" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:54.296728 kubelet[2893]: I0909 23:18:54.296665 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-w4xgg" podStartSLOduration=41.296641174 podStartE2EDuration="41.296641174s" podCreationTimestamp="2025-09-09 23:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:18:54.295410649 +0000 UTC m=+46.705595574" watchObservedRunningTime="2025-09-09 23:18:54.296641174 +0000 UTC m=+46.706826084" Sep 9 23:18:54.347349 systemd[1]: Started cri-containerd-e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e.scope - libcontainer container e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e. 
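The kubelet entry above reports podStartSLOduration=41.296641174 for coredns-7c65d6cfc9-w4xgg, which is simply the gap between podCreationTimestamp (2025-09-09 23:18:13) and watchObservedRunningTime (23:18:54.296641174). A small sketch reproducing that arithmetic with the standard time package; the layout string is an assumption about how kubelet prints these timestamps:

```go
// Sketch: reproduce the podStartSLOduration arithmetic from the kubelet entry
// above. Timestamps are copied from the log (monotonic "m=+..." suffix dropped);
// the layout string is an assumption. Go's time.Parse accepts a fractional
// second in the input even when the layout omits it.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST"

	created, err := time.Parse(layout, "2025-09-09 23:18:13 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-09-09 23:18:54.296641174 +0000 UTC")
	if err != nil {
		panic(err)
	}

	// Matches podStartSLOduration=41.296641174 in the log.
	fmt.Println(observed.Sub(created)) // 41.296641174s
}
```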
Sep 9 23:18:54.356314 systemd-networkd[1518]: cali356f65cd283: Gained IPv6LL Sep 9 23:18:54.414400 systemd-networkd[1518]: calidb6b67bc792: Gained IPv6LL Sep 9 23:18:54.509907 containerd[1589]: time="2025-09-09T23:18:54.509804943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\" id:\"338cd5a26c9a8fe035025d95801854e15b630a03ee5b9cd194ef0d30a6ab34fd\" pid:4280 exit_status:1 exited_at:{seconds:1757459934 nanos:502700470}" Sep 9 23:18:54.609188 containerd[1589]: time="2025-09-09T23:18:54.609069081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5bc589cc-26mdm,Uid:bcc9abc6-6d48-4bc6-a634-589f1b9e5771,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e\"" Sep 9 23:18:54.789478 containerd[1589]: time="2025-09-09T23:18:54.789437445Z" level=info msg="TaskExit event in podsandbox handler container_id:\"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\" id:\"e50c17db1345ca64c1c1e8fb96fb7816288a8d127ec7c3b4628405cad1824e5b\" pid:4483 exit_status:1 exited_at:{seconds:1757459934 nanos:780689895}" Sep 9 23:18:55.310372 systemd-networkd[1518]: cali185280c9fce: Gained IPv6LL Sep 9 23:18:55.438472 systemd-networkd[1518]: calica0b230b099: Gained IPv6LL Sep 9 23:18:55.832451 containerd[1589]: time="2025-09-09T23:18:55.831851378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cht4z,Uid:38bec17a-1d12-40cd-be11-efa68728ce36,Namespace:kube-system,Attempt:0,}" Sep 9 23:18:55.843323 containerd[1589]: time="2025-09-09T23:18:55.840324295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lkvhf,Uid:88a3a96f-1289-44aa-b5ca-25670a32dc3d,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:56.280120 systemd-networkd[1518]: cali65e8c0f35c7: Link UP Sep 9 23:18:56.283745 systemd-networkd[1518]: cali65e8c0f35c7: Gained carrier Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:55.993 [INFO][4520] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.049 [INFO][4520] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0 csi-node-driver- calico-system 88a3a96f-1289-44aa-b5ca-25670a32dc3d 719 0 2025-09-09 23:18:27 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-5qwy1.gb1.brightbox.com csi-node-driver-lkvhf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali65e8c0f35c7 [] [] }} ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Namespace="calico-system" Pod="csi-node-driver-lkvhf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.049 [INFO][4520] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Namespace="calico-system" Pod="csi-node-driver-lkvhf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.147 
[INFO][4543] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" HandleID="k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Workload="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.148 [INFO][4543] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" HandleID="k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Workload="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000317d30), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-5qwy1.gb1.brightbox.com", "pod":"csi-node-driver-lkvhf", "timestamp":"2025-09-09 23:18:56.147936403 +0000 UTC"}, Hostname:"srv-5qwy1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.148 [INFO][4543] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.150 [INFO][4543] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.150 [INFO][4543] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5qwy1.gb1.brightbox.com' Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.162 [INFO][4543] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.173 [INFO][4543] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.210 [INFO][4543] ipam/ipam.go 511: Trying affinity for 192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.217 [INFO][4543] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.226 [INFO][4543] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.226 [INFO][4543] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.192/26 handle="k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.231 [INFO][4543] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.242 [INFO][4543] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.192/26 handle="k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.254 [INFO][4543] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.198/26] block=192.168.8.192/26 
handle="k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.254 [INFO][4543] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.198/26] handle="k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.255 [INFO][4543] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:18:56.373935 containerd[1589]: 2025-09-09 23:18:56.255 [INFO][4543] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.198/26] IPv6=[] ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" HandleID="k8s-pod-network.17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Workload="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" Sep 9 23:18:56.375921 containerd[1589]: 2025-09-09 23:18:56.266 [INFO][4520] cni-plugin/k8s.go 418: Populated endpoint ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Namespace="calico-system" Pod="csi-node-driver-lkvhf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"88a3a96f-1289-44aa-b5ca-25670a32dc3d", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-lkvhf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.8.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali65e8c0f35c7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:56.375921 containerd[1589]: 2025-09-09 23:18:56.267 [INFO][4520] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.198/32] ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Namespace="calico-system" Pod="csi-node-driver-lkvhf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" Sep 9 23:18:56.375921 containerd[1589]: 2025-09-09 23:18:56.268 [INFO][4520] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali65e8c0f35c7 ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Namespace="calico-system" Pod="csi-node-driver-lkvhf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" Sep 9 23:18:56.375921 
containerd[1589]: 2025-09-09 23:18:56.301 [INFO][4520] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Namespace="calico-system" Pod="csi-node-driver-lkvhf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" Sep 9 23:18:56.375921 containerd[1589]: 2025-09-09 23:18:56.309 [INFO][4520] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Namespace="calico-system" Pod="csi-node-driver-lkvhf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"88a3a96f-1289-44aa-b5ca-25670a32dc3d", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a", Pod:"csi-node-driver-lkvhf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.8.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali65e8c0f35c7", MAC:"8a:cd:bd:8c:6d:70", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:56.375921 containerd[1589]: 2025-09-09 23:18:56.356 [INFO][4520] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" Namespace="calico-system" Pod="csi-node-driver-lkvhf" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-csi--node--driver--lkvhf-eth0" Sep 9 23:18:56.438495 systemd-networkd[1518]: cali6a8998ca600: Link UP Sep 9 23:18:56.441442 systemd-networkd[1518]: cali6a8998ca600: Gained carrier Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.004 [INFO][4512] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.037 [INFO][4512] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0 coredns-7c65d6cfc9- kube-system 38bec17a-1d12-40cd-be11-efa68728ce36 834 0 2025-09-09 23:18:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-5qwy1.gb1.brightbox.com coredns-7c65d6cfc9-cht4z eth0 
coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6a8998ca600 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cht4z" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.038 [INFO][4512] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cht4z" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.151 [INFO][4538] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" HandleID="k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Workload="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.152 [INFO][4538] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" HandleID="k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Workload="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123a30), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-5qwy1.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-cht4z", "timestamp":"2025-09-09 23:18:56.151845645 +0000 UTC"}, Hostname:"srv-5qwy1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.152 [INFO][4538] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.257 [INFO][4538] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.257 [INFO][4538] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5qwy1.gb1.brightbox.com' Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.295 [INFO][4538] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.315 [INFO][4538] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.334 [INFO][4538] ipam/ipam.go 511: Trying affinity for 192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.339 [INFO][4538] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.344 [INFO][4538] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.344 [INFO][4538] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.192/26 handle="k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.352 [INFO][4538] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.370 [INFO][4538] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.192/26 handle="k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.391 [INFO][4538] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.199/26] block=192.168.8.192/26 handle="k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.391 [INFO][4538] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.199/26] handle="k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.391 [INFO][4538] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
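Across the endpoints above, addresses are claimed sequentially out of the same affine block (.197, .198, .199, and .200 further below), with each claim serialized by the host-wide IPAM lock. A toy "first free address in the block" allocator illustrating that pattern; Calico's real allocator persists per-block bitmaps in the datastore rather than holding them in memory:

```go
// Sketch: sequential claims out of the host-affine block 192.168.8.192/26,
// serialized by a lock standing in for the host-wide IPAM lock seen in the log.
// Toy in-memory version for illustration only.
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

type blockAllocator struct {
	mu    sync.Mutex // stand-in for the host-wide IPAM lock
	block netip.Prefix
	used  map[netip.Addr]bool
}

func (b *blockAllocator) claim() (netip.Addr, bool) {
	b.mu.Lock()
	defer b.mu.Unlock()
	for a := b.block.Addr(); b.block.Contains(a); a = a.Next() {
		if !b.used[a] {
			b.used[a] = true
			return a, true
		}
	}
	return netip.Addr{}, false // block exhausted
}

func main() {
	alloc := &blockAllocator{
		block: netip.MustParsePrefix("192.168.8.192/26"),
		used:  map[netip.Addr]bool{},
	}
	for i := 0; i < 4; i++ {
		a, _ := alloc.claim()
		fmt.Println(a) // .192, .193, .194, .195 in this toy version; the log starts at .197
	}
}
```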
Sep 9 23:18:56.497669 containerd[1589]: 2025-09-09 23:18:56.391 [INFO][4538] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.199/26] IPv6=[] ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" HandleID="k8s-pod-network.f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Workload="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" Sep 9 23:18:56.499257 containerd[1589]: 2025-09-09 23:18:56.415 [INFO][4512] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cht4z" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"38bec17a-1d12-40cd-be11-efa68728ce36", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-cht4z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6a8998ca600", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:56.499257 containerd[1589]: 2025-09-09 23:18:56.416 [INFO][4512] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.199/32] ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cht4z" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" Sep 9 23:18:56.499257 containerd[1589]: 2025-09-09 23:18:56.419 [INFO][4512] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6a8998ca600 ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cht4z" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" Sep 9 23:18:56.499257 containerd[1589]: 2025-09-09 23:18:56.456 [INFO][4512] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-cht4z" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" Sep 9 23:18:56.499257 containerd[1589]: 2025-09-09 23:18:56.461 [INFO][4512] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cht4z" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"38bec17a-1d12-40cd-be11-efa68728ce36", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd", Pod:"coredns-7c65d6cfc9-cht4z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6a8998ca600", MAC:"9e:76:21:1e:a7:ed", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:56.500753 containerd[1589]: 2025-09-09 23:18:56.489 [INFO][4512] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cht4z" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cht4z-eth0" Sep 9 23:18:56.505371 containerd[1589]: time="2025-09-09T23:18:56.505300550Z" level=info msg="connecting to shim 17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a" address="unix:///run/containerd/s/9b53c6350b798680f9d2934374f5db0020c770d6bb98c9a18a3731806d9f87fe" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:56.585432 containerd[1589]: time="2025-09-09T23:18:56.585277381Z" level=info msg="connecting to shim f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd" address="unix:///run/containerd/s/1b462625642f5ae26ad4813a5a4a4eb25fa03abd1c3f3efb3f1dfc714243c317" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:56.641988 systemd[1]: Started cri-containerd-17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a.scope - libcontainer container 
17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a. Sep 9 23:18:56.718532 systemd[1]: Started cri-containerd-f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd.scope - libcontainer container f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd. Sep 9 23:18:56.830660 containerd[1589]: time="2025-09-09T23:18:56.830534855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-lvhtm,Uid:ef7990c2-8108-400b-aa6e-868d127aa7b8,Namespace:calico-system,Attempt:0,}" Sep 9 23:18:57.009785 containerd[1589]: time="2025-09-09T23:18:57.008280661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lkvhf,Uid:88a3a96f-1289-44aa-b5ca-25670a32dc3d,Namespace:calico-system,Attempt:0,} returns sandbox id \"17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a\"" Sep 9 23:18:57.022134 containerd[1589]: time="2025-09-09T23:18:57.022075211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cht4z,Uid:38bec17a-1d12-40cd-be11-efa68728ce36,Namespace:kube-system,Attempt:0,} returns sandbox id \"f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd\"" Sep 9 23:18:57.113945 containerd[1589]: time="2025-09-09T23:18:57.113779868Z" level=info msg="CreateContainer within sandbox \"f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 23:18:57.159264 containerd[1589]: time="2025-09-09T23:18:57.159004049Z" level=info msg="Container f1c2a4fc03077de7b2f7d2692674d6570e6e9955f351553549469063ec831a99: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:57.164853 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount390312376.mount: Deactivated successfully. Sep 9 23:18:57.179664 containerd[1589]: time="2025-09-09T23:18:57.179545003Z" level=info msg="CreateContainer within sandbox \"f1cff9bcb94efc286c2c1daf5b19b8a65627cccaa29628c6e5dee55c42c488dd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f1c2a4fc03077de7b2f7d2692674d6570e6e9955f351553549469063ec831a99\"" Sep 9 23:18:57.184833 containerd[1589]: time="2025-09-09T23:18:57.184698102Z" level=info msg="StartContainer for \"f1c2a4fc03077de7b2f7d2692674d6570e6e9955f351553549469063ec831a99\"" Sep 9 23:18:57.202948 containerd[1589]: time="2025-09-09T23:18:57.202889384Z" level=info msg="connecting to shim f1c2a4fc03077de7b2f7d2692674d6570e6e9955f351553549469063ec831a99" address="unix:///run/containerd/s/1b462625642f5ae26ad4813a5a4a4eb25fa03abd1c3f3efb3f1dfc714243c317" protocol=ttrpc version=3 Sep 9 23:18:57.286512 systemd[1]: Started cri-containerd-f1c2a4fc03077de7b2f7d2692674d6570e6e9955f351553549469063ec831a99.scope - libcontainer container f1c2a4fc03077de7b2f7d2692674d6570e6e9955f351553549469063ec831a99. 
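The "connecting to shim" entries record the per-container ttrpc socket, and the systemd scope units embed the container IDs, all under containerd's k8s.io namespace. A hedged sketch of listing those containers through the containerd Go client, assuming the pre-2.0 import path github.com/containerd/containerd and the default socket path, neither of which is stated in the log itself:

```go
// Sketch: list the containers in the "k8s.io" namespace used by the entries
// above. Assumes the containerd 1.x Go client import path and the default
// socket /run/containerd/containerd.sock.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The CRI plugin keeps its containers in the "k8s.io" namespace, which is
	// also what the "connecting to shim" lines above report (namespace=k8s.io).
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	containers, err := client.Containers(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range containers {
		fmt.Println(c.ID()) // e.g. f1cff9bcb94e... and 17e20c770d27... on this node
	}
}
```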
Sep 9 23:18:57.375830 systemd-networkd[1518]: cali17016d09fe7: Link UP Sep 9 23:18:57.379839 systemd-networkd[1518]: cali17016d09fe7: Gained carrier Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.012 [INFO][4654] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.077 [INFO][4654] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0 goldmane-7988f88666- calico-system ef7990c2-8108-400b-aa6e-868d127aa7b8 840 0 2025-09-09 23:18:27 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-5qwy1.gb1.brightbox.com goldmane-7988f88666-lvhtm eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali17016d09fe7 [] [] }} ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Namespace="calico-system" Pod="goldmane-7988f88666-lvhtm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.080 [INFO][4654] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Namespace="calico-system" Pod="goldmane-7988f88666-lvhtm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.256 [INFO][4688] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" HandleID="k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Workload="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.262 [INFO][4688] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" HandleID="k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Workload="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002dfd90), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-5qwy1.gb1.brightbox.com", "pod":"goldmane-7988f88666-lvhtm", "timestamp":"2025-09-09 23:18:57.256019953 +0000 UTC"}, Hostname:"srv-5qwy1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.264 [INFO][4688] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.267 [INFO][4688] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.267 [INFO][4688] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5qwy1.gb1.brightbox.com' Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.291 [INFO][4688] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.303 [INFO][4688] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.314 [INFO][4688] ipam/ipam.go 511: Trying affinity for 192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.320 [INFO][4688] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.324 [INFO][4688] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.192/26 host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.325 [INFO][4688] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.192/26 handle="k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.328 [INFO][4688] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75 Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.336 [INFO][4688] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.192/26 handle="k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.349 [INFO][4688] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.200/26] block=192.168.8.192/26 handle="k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.351 [INFO][4688] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.200/26] handle="k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" host="srv-5qwy1.gb1.brightbox.com" Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.351 [INFO][4688] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 23:18:57.445413 containerd[1589]: 2025-09-09 23:18:57.351 [INFO][4688] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.200/26] IPv6=[] ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" HandleID="k8s-pod-network.02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Workload="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" Sep 9 23:18:57.449493 containerd[1589]: 2025-09-09 23:18:57.356 [INFO][4654] cni-plugin/k8s.go 418: Populated endpoint ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Namespace="calico-system" Pod="goldmane-7988f88666-lvhtm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"ef7990c2-8108-400b-aa6e-868d127aa7b8", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-7988f88666-lvhtm", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.8.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali17016d09fe7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:57.449493 containerd[1589]: 2025-09-09 23:18:57.356 [INFO][4654] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.200/32] ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Namespace="calico-system" Pod="goldmane-7988f88666-lvhtm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" Sep 9 23:18:57.449493 containerd[1589]: 2025-09-09 23:18:57.356 [INFO][4654] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali17016d09fe7 ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Namespace="calico-system" Pod="goldmane-7988f88666-lvhtm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" Sep 9 23:18:57.449493 containerd[1589]: 2025-09-09 23:18:57.383 [INFO][4654] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Namespace="calico-system" Pod="goldmane-7988f88666-lvhtm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" Sep 9 23:18:57.449493 containerd[1589]: 2025-09-09 23:18:57.401 [INFO][4654] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" 
Namespace="calico-system" Pod="goldmane-7988f88666-lvhtm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"ef7990c2-8108-400b-aa6e-868d127aa7b8", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5qwy1.gb1.brightbox.com", ContainerID:"02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75", Pod:"goldmane-7988f88666-lvhtm", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.8.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali17016d09fe7", MAC:"1a:8e:d3:32:6f:f6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:18:57.449493 containerd[1589]: 2025-09-09 23:18:57.430 [INFO][4654] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" Namespace="calico-system" Pod="goldmane-7988f88666-lvhtm" WorkloadEndpoint="srv--5qwy1.gb1.brightbox.com-k8s-goldmane--7988f88666--lvhtm-eth0" Sep 9 23:18:57.449493 containerd[1589]: time="2025-09-09T23:18:57.448897238Z" level=info msg="StartContainer for \"f1c2a4fc03077de7b2f7d2692674d6570e6e9955f351553549469063ec831a99\" returns successfully" Sep 9 23:18:57.477583 kubelet[2893]: I0909 23:18:57.476563 2893 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 23:18:57.488328 systemd-networkd[1518]: cali65e8c0f35c7: Gained IPv6LL Sep 9 23:18:57.584949 containerd[1589]: time="2025-09-09T23:18:57.584742942Z" level=info msg="connecting to shim 02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75" address="unix:///run/containerd/s/a0e9aa39490b2883adfc3465901d7193e84f6ad05d5a2906d9aa6ce35b92b8a7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:18:57.678626 systemd-networkd[1518]: cali6a8998ca600: Gained IPv6LL Sep 9 23:18:57.735272 systemd[1]: Started cri-containerd-02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75.scope - libcontainer container 02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75. 
Sep 9 23:18:58.098106 containerd[1589]: time="2025-09-09T23:18:58.098022397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-lvhtm,Uid:ef7990c2-8108-400b-aa6e-868d127aa7b8,Namespace:calico-system,Attempt:0,} returns sandbox id \"02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75\"" Sep 9 23:18:58.393601 kubelet[2893]: I0909 23:18:58.390643 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-cht4z" podStartSLOduration=45.39059157 podStartE2EDuration="45.39059157s" podCreationTimestamp="2025-09-09 23:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:18:58.388814407 +0000 UTC m=+50.798999319" watchObservedRunningTime="2025-09-09 23:18:58.39059157 +0000 UTC m=+50.800776479" Sep 9 23:18:58.575532 systemd-networkd[1518]: cali17016d09fe7: Gained IPv6LL Sep 9 23:18:59.521416 containerd[1589]: time="2025-09-09T23:18:59.521278473Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:59.521416 containerd[1589]: time="2025-09-09T23:18:59.521369892Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 23:18:59.524051 containerd[1589]: time="2025-09-09T23:18:59.523985812Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:59.528249 containerd[1589]: time="2025-09-09T23:18:59.526538121Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 6.461500899s" Sep 9 23:18:59.528249 containerd[1589]: time="2025-09-09T23:18:59.526600314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 23:18:59.528249 containerd[1589]: time="2025-09-09T23:18:59.527135266Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:18:59.533387 containerd[1589]: time="2025-09-09T23:18:59.533357248Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 23:18:59.534406 containerd[1589]: time="2025-09-09T23:18:59.534320616Z" level=info msg="CreateContainer within sandbox \"d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 23:18:59.545609 containerd[1589]: time="2025-09-09T23:18:59.545571577Z" level=info msg="Container 2259818661860da50744da2063433175d3698df0332cc65af47a696075c404eb: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:18:59.563788 containerd[1589]: time="2025-09-09T23:18:59.563708778Z" level=info msg="CreateContainer within sandbox \"d2050f0f9c13f2da14c22bf2df486402bdc5d79f7cd997b9c33bc59706891e8a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id 
\"2259818661860da50744da2063433175d3698df0332cc65af47a696075c404eb\"" Sep 9 23:18:59.564970 containerd[1589]: time="2025-09-09T23:18:59.564767655Z" level=info msg="StartContainer for \"2259818661860da50744da2063433175d3698df0332cc65af47a696075c404eb\"" Sep 9 23:18:59.567446 containerd[1589]: time="2025-09-09T23:18:59.567411792Z" level=info msg="connecting to shim 2259818661860da50744da2063433175d3698df0332cc65af47a696075c404eb" address="unix:///run/containerd/s/3d95c0fef768609e37ce469731a981e93aa632cb279d7f7ce015ff1c44b76c45" protocol=ttrpc version=3 Sep 9 23:18:59.623606 systemd[1]: Started cri-containerd-2259818661860da50744da2063433175d3698df0332cc65af47a696075c404eb.scope - libcontainer container 2259818661860da50744da2063433175d3698df0332cc65af47a696075c404eb. Sep 9 23:18:59.654007 systemd-networkd[1518]: vxlan.calico: Link UP Sep 9 23:18:59.654019 systemd-networkd[1518]: vxlan.calico: Gained carrier Sep 9 23:18:59.807036 containerd[1589]: time="2025-09-09T23:18:59.806786248Z" level=info msg="StartContainer for \"2259818661860da50744da2063433175d3698df0332cc65af47a696075c404eb\" returns successfully" Sep 9 23:19:00.371144 kubelet[2893]: I0909 23:19:00.371065 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-c5bc589cc-wtzzr" podStartSLOduration=29.904122322 podStartE2EDuration="36.371036128s" podCreationTimestamp="2025-09-09 23:18:24 +0000 UTC" firstStartedPulling="2025-09-09 23:18:53.062372669 +0000 UTC m=+45.472557562" lastFinishedPulling="2025-09-09 23:18:59.529286457 +0000 UTC m=+51.939471368" observedRunningTime="2025-09-09 23:19:00.369749223 +0000 UTC m=+52.779934160" watchObservedRunningTime="2025-09-09 23:19:00.371036128 +0000 UTC m=+52.781221028" Sep 9 23:19:01.198365 systemd-networkd[1518]: vxlan.calico: Gained IPv6LL Sep 9 23:19:01.306303 containerd[1589]: time="2025-09-09T23:19:01.305667417Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:01.308460 containerd[1589]: time="2025-09-09T23:19:01.308397256Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 23:19:01.311161 containerd[1589]: time="2025-09-09T23:19:01.311126893Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:01.320026 containerd[1589]: time="2025-09-09T23:19:01.319956130Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.786331907s" Sep 9 23:19:01.320389 containerd[1589]: time="2025-09-09T23:19:01.320339025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 23:19:01.320727 containerd[1589]: time="2025-09-09T23:19:01.320679384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:01.325912 containerd[1589]: time="2025-09-09T23:19:01.325860980Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 23:19:01.330888 containerd[1589]: time="2025-09-09T23:19:01.330733818Z" level=info msg="CreateContainer within sandbox \"be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 23:19:01.343394 containerd[1589]: time="2025-09-09T23:19:01.343344477Z" level=info msg="Container 463505011d385efaccb112d42e602598415f2b1f964546395c62aa43fd059db7: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:19:01.372488 containerd[1589]: time="2025-09-09T23:19:01.372349950Z" level=info msg="CreateContainer within sandbox \"be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"463505011d385efaccb112d42e602598415f2b1f964546395c62aa43fd059db7\"" Sep 9 23:19:01.376288 containerd[1589]: time="2025-09-09T23:19:01.375140931Z" level=info msg="StartContainer for \"463505011d385efaccb112d42e602598415f2b1f964546395c62aa43fd059db7\"" Sep 9 23:19:01.377547 containerd[1589]: time="2025-09-09T23:19:01.377122406Z" level=info msg="connecting to shim 463505011d385efaccb112d42e602598415f2b1f964546395c62aa43fd059db7" address="unix:///run/containerd/s/9084cacb1f822237911b3234c7485af1ab0f5cf8f0482405b055695722d4a8ac" protocol=ttrpc version=3 Sep 9 23:19:01.430444 systemd[1]: Started cri-containerd-463505011d385efaccb112d42e602598415f2b1f964546395c62aa43fd059db7.scope - libcontainer container 463505011d385efaccb112d42e602598415f2b1f964546395c62aa43fd059db7. Sep 9 23:19:01.540639 containerd[1589]: time="2025-09-09T23:19:01.540587977Z" level=info msg="StartContainer for \"463505011d385efaccb112d42e602598415f2b1f964546395c62aa43fd059db7\" returns successfully" Sep 9 23:19:05.255826 containerd[1589]: time="2025-09-09T23:19:05.255764662Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:05.290852 containerd[1589]: time="2025-09-09T23:19:05.258104201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 23:19:05.290852 containerd[1589]: time="2025-09-09T23:19:05.286939409Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:05.305208 containerd[1589]: time="2025-09-09T23:19:05.305137617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:05.306317 containerd[1589]: time="2025-09-09T23:19:05.305836035Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.979789118s" Sep 9 23:19:05.306317 containerd[1589]: time="2025-09-09T23:19:05.305878368Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 23:19:05.309186 containerd[1589]: 
time="2025-09-09T23:19:05.308419604Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 23:19:05.349633 containerd[1589]: time="2025-09-09T23:19:05.349361159Z" level=info msg="CreateContainer within sandbox \"d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 23:19:05.364576 containerd[1589]: time="2025-09-09T23:19:05.364530470Z" level=info msg="Container 2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:19:05.371793 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3766886996.mount: Deactivated successfully. Sep 9 23:19:05.395226 containerd[1589]: time="2025-09-09T23:19:05.394341347Z" level=info msg="CreateContainer within sandbox \"d431f2a3a0ec4221ef6b71c06813aeaac6784a223c75529264172c0e833db5df\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\"" Sep 9 23:19:05.398116 containerd[1589]: time="2025-09-09T23:19:05.398033885Z" level=info msg="StartContainer for \"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\"" Sep 9 23:19:05.418906 containerd[1589]: time="2025-09-09T23:19:05.418582689Z" level=info msg="connecting to shim 2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b" address="unix:///run/containerd/s/026062917350160bc089c11bebde898c415d3b33cf73071ae2f4afc0ac5e7f22" protocol=ttrpc version=3 Sep 9 23:19:05.506485 systemd[1]: Started cri-containerd-2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b.scope - libcontainer container 2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b. Sep 9 23:19:05.586945 containerd[1589]: time="2025-09-09T23:19:05.586719461Z" level=info msg="StartContainer for \"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" returns successfully" Sep 9 23:19:05.771280 containerd[1589]: time="2025-09-09T23:19:05.770422990Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:05.772558 containerd[1589]: time="2025-09-09T23:19:05.772530906Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 23:19:05.775795 containerd[1589]: time="2025-09-09T23:19:05.775754387Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 467.259128ms" Sep 9 23:19:05.775932 containerd[1589]: time="2025-09-09T23:19:05.775907267Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 23:19:05.778200 containerd[1589]: time="2025-09-09T23:19:05.778149910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 23:19:05.781278 containerd[1589]: time="2025-09-09T23:19:05.781246061Z" level=info msg="CreateContainer within sandbox \"e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 23:19:05.788727 containerd[1589]: 
time="2025-09-09T23:19:05.788647053Z" level=info msg="Container 610046b1ca53dc8181da8702eb22e3ec35806aa4f295bcc84349b0e8b3daaf09: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:19:05.825190 containerd[1589]: time="2025-09-09T23:19:05.824961862Z" level=info msg="CreateContainer within sandbox \"e5ae579e943c5fed043263d83e14b484202feb4f54898908c5b7ebd8d63ca86e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"610046b1ca53dc8181da8702eb22e3ec35806aa4f295bcc84349b0e8b3daaf09\"" Sep 9 23:19:05.826407 containerd[1589]: time="2025-09-09T23:19:05.826374783Z" level=info msg="StartContainer for \"610046b1ca53dc8181da8702eb22e3ec35806aa4f295bcc84349b0e8b3daaf09\"" Sep 9 23:19:05.835196 containerd[1589]: time="2025-09-09T23:19:05.833676183Z" level=info msg="connecting to shim 610046b1ca53dc8181da8702eb22e3ec35806aa4f295bcc84349b0e8b3daaf09" address="unix:///run/containerd/s/ffc93bb29c6fa1ee6d13acc95acb4ebd3eb2106f3e2b61d2c0b0accb6a440cea" protocol=ttrpc version=3 Sep 9 23:19:05.872723 systemd[1]: Started cri-containerd-610046b1ca53dc8181da8702eb22e3ec35806aa4f295bcc84349b0e8b3daaf09.scope - libcontainer container 610046b1ca53dc8181da8702eb22e3ec35806aa4f295bcc84349b0e8b3daaf09. Sep 9 23:19:05.996352 containerd[1589]: time="2025-09-09T23:19:05.996301945Z" level=info msg="StartContainer for \"610046b1ca53dc8181da8702eb22e3ec35806aa4f295bcc84349b0e8b3daaf09\" returns successfully" Sep 9 23:19:06.507722 kubelet[2893]: I0909 23:19:06.505074 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5c6cd884f-hjkmf" podStartSLOduration=26.98361694 podStartE2EDuration="38.504945294s" podCreationTimestamp="2025-09-09 23:18:28 +0000 UTC" firstStartedPulling="2025-09-09 23:18:53.786449884 +0000 UTC m=+46.196634781" lastFinishedPulling="2025-09-09 23:19:05.307778233 +0000 UTC m=+57.717963135" observedRunningTime="2025-09-09 23:19:06.504035492 +0000 UTC m=+58.914220394" watchObservedRunningTime="2025-09-09 23:19:06.504945294 +0000 UTC m=+58.915130200" Sep 9 23:19:06.533134 kubelet[2893]: I0909 23:19:06.533064 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-c5bc589cc-26mdm" podStartSLOduration=31.368930643 podStartE2EDuration="42.533041038s" podCreationTimestamp="2025-09-09 23:18:24 +0000 UTC" firstStartedPulling="2025-09-09 23:18:54.61359234 +0000 UTC m=+47.023777234" lastFinishedPulling="2025-09-09 23:19:05.777702723 +0000 UTC m=+58.187887629" observedRunningTime="2025-09-09 23:19:06.531057796 +0000 UTC m=+58.941242733" watchObservedRunningTime="2025-09-09 23:19:06.533041038 +0000 UTC m=+58.943225944" Sep 9 23:19:06.624383 containerd[1589]: time="2025-09-09T23:19:06.624328219Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"355600ead087c6a49e9e85ad61cb4f1811763997a5ec95b6b04e25a0a8bf514d\" pid:5109 exited_at:{seconds:1757459946 nanos:604427664}" Sep 9 23:19:07.697317 containerd[1589]: time="2025-09-09T23:19:07.697257266Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:07.699185 containerd[1589]: time="2025-09-09T23:19:07.699110776Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 23:19:07.700061 containerd[1589]: time="2025-09-09T23:19:07.700026706Z" level=info msg="ImageCreate event 
name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:07.703461 containerd[1589]: time="2025-09-09T23:19:07.703426239Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:07.705304 containerd[1589]: time="2025-09-09T23:19:07.705270170Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.92670487s" Sep 9 23:19:07.705769 containerd[1589]: time="2025-09-09T23:19:07.705310662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 23:19:07.706654 containerd[1589]: time="2025-09-09T23:19:07.706621941Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 23:19:07.710606 containerd[1589]: time="2025-09-09T23:19:07.710541456Z" level=info msg="CreateContainer within sandbox \"17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 23:19:07.765552 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3142237412.mount: Deactivated successfully. Sep 9 23:19:07.768972 containerd[1589]: time="2025-09-09T23:19:07.766603077Z" level=info msg="Container 93185762403b6db315f693c9bf0ce802763ec74b3f06a0d2f84ea6e630842c5f: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:19:07.859824 containerd[1589]: time="2025-09-09T23:19:07.858994292Z" level=info msg="CreateContainer within sandbox \"17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"93185762403b6db315f693c9bf0ce802763ec74b3f06a0d2f84ea6e630842c5f\"" Sep 9 23:19:07.884845 containerd[1589]: time="2025-09-09T23:19:07.882955750Z" level=info msg="StartContainer for \"93185762403b6db315f693c9bf0ce802763ec74b3f06a0d2f84ea6e630842c5f\"" Sep 9 23:19:07.886833 containerd[1589]: time="2025-09-09T23:19:07.886802445Z" level=info msg="connecting to shim 93185762403b6db315f693c9bf0ce802763ec74b3f06a0d2f84ea6e630842c5f" address="unix:///run/containerd/s/9b53c6350b798680f9d2934374f5db0020c770d6bb98c9a18a3731806d9f87fe" protocol=ttrpc version=3 Sep 9 23:19:07.960292 systemd[1]: Started cri-containerd-93185762403b6db315f693c9bf0ce802763ec74b3f06a0d2f84ea6e630842c5f.scope - libcontainer container 93185762403b6db315f693c9bf0ce802763ec74b3f06a0d2f84ea6e630842c5f. 
Sep 9 23:19:08.158382 containerd[1589]: time="2025-09-09T23:19:08.158260765Z" level=info msg="StartContainer for \"93185762403b6db315f693c9bf0ce802763ec74b3f06a0d2f84ea6e630842c5f\" returns successfully" Sep 9 23:19:09.713095 containerd[1589]: time="2025-09-09T23:19:09.713017548Z" level=info msg="TaskExit event in podsandbox handler container_id:\"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\" id:\"bfa7e2cb517b0e01bc4b5c1e9e13f99603f5f8f79930347ba5774b22fb395ce7\" pid:5170 exited_at:{seconds:1757459949 nanos:711676461}" Sep 9 23:19:10.842764 containerd[1589]: time="2025-09-09T23:19:10.842379567Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"2c9e3c2277b987964c70563573b325d259468b372aab7e00464ba759b3090972\" pid:5203 exited_at:{seconds:1757459950 nanos:836771054}" Sep 9 23:19:13.587869 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3318139148.mount: Deactivated successfully. Sep 9 23:19:15.189585 containerd[1589]: time="2025-09-09T23:19:15.189506200Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:15.192202 containerd[1589]: time="2025-09-09T23:19:15.191363803Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 23:19:15.192782 containerd[1589]: time="2025-09-09T23:19:15.192733346Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:15.200492 containerd[1589]: time="2025-09-09T23:19:15.200450026Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:15.202186 containerd[1589]: time="2025-09-09T23:19:15.201509446Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 7.494850569s" Sep 9 23:19:15.202186 containerd[1589]: time="2025-09-09T23:19:15.201549147Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 23:19:15.249155 containerd[1589]: time="2025-09-09T23:19:15.249086288Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 23:19:15.331243 containerd[1589]: time="2025-09-09T23:19:15.331164261Z" level=info msg="CreateContainer within sandbox \"02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 23:19:15.354326 containerd[1589]: time="2025-09-09T23:19:15.354273630Z" level=info msg="Container d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:19:15.380116 containerd[1589]: time="2025-09-09T23:19:15.380052908Z" level=info msg="CreateContainer within sandbox \"02ee04aef710a329803e279a7a99a9cd30f321ddab6092b27ae8808bb3c74d75\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container 
id \"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\"" Sep 9 23:19:15.380762 containerd[1589]: time="2025-09-09T23:19:15.380714400Z" level=info msg="StartContainer for \"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\"" Sep 9 23:19:15.382605 containerd[1589]: time="2025-09-09T23:19:15.382567291Z" level=info msg="connecting to shim d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7" address="unix:///run/containerd/s/a0e9aa39490b2883adfc3465901d7193e84f6ad05d5a2906d9aa6ce35b92b8a7" protocol=ttrpc version=3 Sep 9 23:19:15.571515 systemd[1]: Started cri-containerd-d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7.scope - libcontainer container d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7. Sep 9 23:19:15.804988 containerd[1589]: time="2025-09-09T23:19:15.804329883Z" level=info msg="StartContainer for \"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" returns successfully" Sep 9 23:19:18.432442 containerd[1589]: time="2025-09-09T23:19:18.432180986Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"8a8e105a12a0a264b9353c6fcf665275191d5b42e283cdd4f937e7692c35f2f4\" pid:5279 exit_status:1 exited_at:{seconds:1757459958 nanos:373070519}" Sep 9 23:19:20.028608 containerd[1589]: time="2025-09-09T23:19:20.028516785Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"adc924591d02a3426309d34383860cbca84ed59df49d92c1b623053aea8a1ba5\" pid:5311 exit_status:1 exited_at:{seconds:1757459960 nanos:15030736}" Sep 9 23:19:20.283895 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount786063032.mount: Deactivated successfully. 
Sep 9 23:19:20.363229 containerd[1589]: time="2025-09-09T23:19:20.363107452Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:20.365142 containerd[1589]: time="2025-09-09T23:19:20.365112276Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 23:19:20.365333 containerd[1589]: time="2025-09-09T23:19:20.365298709Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:20.369375 containerd[1589]: time="2025-09-09T23:19:20.369317054Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:20.371414 containerd[1589]: time="2025-09-09T23:19:20.371381426Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 5.122214671s" Sep 9 23:19:20.371555 containerd[1589]: time="2025-09-09T23:19:20.371529569Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 23:19:20.374156 containerd[1589]: time="2025-09-09T23:19:20.373433561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 23:19:20.390767 containerd[1589]: time="2025-09-09T23:19:20.390331128Z" level=info msg="CreateContainer within sandbox \"be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 23:19:20.413358 containerd[1589]: time="2025-09-09T23:19:20.413311531Z" level=info msg="Container 901c388d1d2332f31be060a783c39b218b14bf526b75b6a9abfb06d1d1f49cf8: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:19:20.468484 containerd[1589]: time="2025-09-09T23:19:20.468397754Z" level=info msg="CreateContainer within sandbox \"be0f0a048f146dec70144bb9fbd499b2e501b201f80aadff6f5729162b78e571\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"901c388d1d2332f31be060a783c39b218b14bf526b75b6a9abfb06d1d1f49cf8\"" Sep 9 23:19:20.469894 containerd[1589]: time="2025-09-09T23:19:20.469622354Z" level=info msg="StartContainer for \"901c388d1d2332f31be060a783c39b218b14bf526b75b6a9abfb06d1d1f49cf8\"" Sep 9 23:19:20.475416 containerd[1589]: time="2025-09-09T23:19:20.475364002Z" level=info msg="connecting to shim 901c388d1d2332f31be060a783c39b218b14bf526b75b6a9abfb06d1d1f49cf8" address="unix:///run/containerd/s/9084cacb1f822237911b3234c7485af1ab0f5cf8f0482405b055695722d4a8ac" protocol=ttrpc version=3 Sep 9 23:19:20.555335 systemd[1]: Started cri-containerd-901c388d1d2332f31be060a783c39b218b14bf526b75b6a9abfb06d1d1f49cf8.scope - libcontainer container 901c388d1d2332f31be060a783c39b218b14bf526b75b6a9abfb06d1d1f49cf8. 
Sep 9 23:19:20.757445 containerd[1589]: time="2025-09-09T23:19:20.757385362Z" level=info msg="StartContainer for \"901c388d1d2332f31be060a783c39b218b14bf526b75b6a9abfb06d1d1f49cf8\" returns successfully" Sep 9 23:19:21.150211 kubelet[2893]: I0909 23:19:21.132675 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-lvhtm" podStartSLOduration=37.009673917 podStartE2EDuration="54.119837672s" podCreationTimestamp="2025-09-09 23:18:27 +0000 UTC" firstStartedPulling="2025-09-09 23:18:58.106200785 +0000 UTC m=+50.516385686" lastFinishedPulling="2025-09-09 23:19:15.216364534 +0000 UTC m=+67.626549441" observedRunningTime="2025-09-09 23:19:16.841880982 +0000 UTC m=+69.252065910" watchObservedRunningTime="2025-09-09 23:19:21.119837672 +0000 UTC m=+73.530022580" Sep 9 23:19:21.150211 kubelet[2893]: I0909 23:19:21.149883 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-77cdf898b8-n8jqb" podStartSLOduration=2.448467589 podStartE2EDuration="29.149858807s" podCreationTimestamp="2025-09-09 23:18:52 +0000 UTC" firstStartedPulling="2025-09-09 23:18:53.671486347 +0000 UTC m=+46.081671253" lastFinishedPulling="2025-09-09 23:19:20.372877572 +0000 UTC m=+72.783062471" observedRunningTime="2025-09-09 23:19:21.116078929 +0000 UTC m=+73.526263872" watchObservedRunningTime="2025-09-09 23:19:21.149858807 +0000 UTC m=+73.560043708" Sep 9 23:19:22.819260 containerd[1589]: time="2025-09-09T23:19:22.819206011Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:22.820565 containerd[1589]: time="2025-09-09T23:19:22.820532912Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 23:19:22.822036 containerd[1589]: time="2025-09-09T23:19:22.821304513Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:22.825046 containerd[1589]: time="2025-09-09T23:19:22.825008737Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:19:22.826068 containerd[1589]: time="2025-09-09T23:19:22.825971856Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.452500674s" Sep 9 23:19:22.826422 containerd[1589]: time="2025-09-09T23:19:22.826394908Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 23:19:22.841665 containerd[1589]: time="2025-09-09T23:19:22.841613336Z" level=info msg="CreateContainer within sandbox \"17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 23:19:22.915195 containerd[1589]: time="2025-09-09T23:19:22.913492053Z" level=info 
msg="Container 9ddb70e9a760703cd4b570e17304fcca0ace51b11df92ea73c2915d1f1d193dc: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:19:22.922517 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2602786899.mount: Deactivated successfully. Sep 9 23:19:22.976526 containerd[1589]: time="2025-09-09T23:19:22.976332767Z" level=info msg="CreateContainer within sandbox \"17e20c770d279e619ddfd4e573a214b1412468e677335f51b878380c8715383a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9ddb70e9a760703cd4b570e17304fcca0ace51b11df92ea73c2915d1f1d193dc\"" Sep 9 23:19:22.980230 containerd[1589]: time="2025-09-09T23:19:22.979062440Z" level=info msg="StartContainer for \"9ddb70e9a760703cd4b570e17304fcca0ace51b11df92ea73c2915d1f1d193dc\"" Sep 9 23:19:22.983271 containerd[1589]: time="2025-09-09T23:19:22.982142987Z" level=info msg="connecting to shim 9ddb70e9a760703cd4b570e17304fcca0ace51b11df92ea73c2915d1f1d193dc" address="unix:///run/containerd/s/9b53c6350b798680f9d2934374f5db0020c770d6bb98c9a18a3731806d9f87fe" protocol=ttrpc version=3 Sep 9 23:19:23.042310 systemd[1]: Started cri-containerd-9ddb70e9a760703cd4b570e17304fcca0ace51b11df92ea73c2915d1f1d193dc.scope - libcontainer container 9ddb70e9a760703cd4b570e17304fcca0ace51b11df92ea73c2915d1f1d193dc. Sep 9 23:19:23.265898 containerd[1589]: time="2025-09-09T23:19:23.265343272Z" level=info msg="StartContainer for \"9ddb70e9a760703cd4b570e17304fcca0ace51b11df92ea73c2915d1f1d193dc\" returns successfully" Sep 9 23:19:24.169441 kubelet[2893]: I0909 23:19:24.168913 2893 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-lkvhf" podStartSLOduration=31.355794879 podStartE2EDuration="57.168888764s" podCreationTimestamp="2025-09-09 23:18:27 +0000 UTC" firstStartedPulling="2025-09-09 23:18:57.015515022 +0000 UTC m=+49.425699922" lastFinishedPulling="2025-09-09 23:19:22.828608907 +0000 UTC m=+75.238793807" observedRunningTime="2025-09-09 23:19:24.166906096 +0000 UTC m=+76.577091021" watchObservedRunningTime="2025-09-09 23:19:24.168888764 +0000 UTC m=+76.579073687" Sep 9 23:19:24.234907 kubelet[2893]: I0909 23:19:24.231088 2893 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 23:19:24.236924 kubelet[2893]: I0909 23:19:24.236775 2893 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 23:19:28.313446 containerd[1589]: time="2025-09-09T23:19:28.313381288Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"2296c3db577cac1df499867c9bc5a131e9e706a2d17436bea70a9a3c6796ae2f\" pid:5422 exited_at:{seconds:1757459968 nanos:298308430}" Sep 9 23:19:29.273199 containerd[1589]: time="2025-09-09T23:19:29.272340807Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"8daa8c73bb1f08819c236ac423174a791a81f347783d1cb0fec1e4db60fb0a3b\" pid:5444 exited_at:{seconds:1757459969 nanos:271598435}" Sep 9 23:19:40.013377 containerd[1589]: time="2025-09-09T23:19:40.013312279Z" level=info msg="TaskExit event in podsandbox handler container_id:\"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\" id:\"a566bde79e6317db2015db84004d854c28fdd394973bf23843d9cdb89ae0fee1\" pid:5468 
exited_at:{seconds:1757459980 nanos:12507308}" Sep 9 23:19:40.820575 containerd[1589]: time="2025-09-09T23:19:40.800875686Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"172e33df999bfadc2d01be2c3b4d676d7f90a087f251e9647c98639ed3430616\" pid:5501 exited_at:{seconds:1757459980 nanos:800125086}" Sep 9 23:19:41.722461 containerd[1589]: time="2025-09-09T23:19:41.722376574Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"3bf33650cc7c06a892ef74725707604065625fdb980cacc6cb984d0d715756b8\" pid:5522 exited_at:{seconds:1757459981 nanos:721732424}" Sep 9 23:19:58.564831 systemd[1]: Started sshd@9-10.230.66.202:22-139.178.68.195:47682.service - OpenSSH per-connection server daemon (139.178.68.195:47682). Sep 9 23:19:59.559653 sshd[5539]: Accepted publickey for core from 139.178.68.195 port 47682 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:19:59.563151 sshd-session[5539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:19:59.573538 systemd-logind[1563]: New session 12 of user core. Sep 9 23:19:59.581357 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 23:20:00.730368 sshd[5542]: Connection closed by 139.178.68.195 port 47682 Sep 9 23:20:00.731237 sshd-session[5539]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:00.739489 systemd[1]: sshd@9-10.230.66.202:22-139.178.68.195:47682.service: Deactivated successfully. Sep 9 23:20:00.744147 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 23:20:00.745574 systemd-logind[1563]: Session 12 logged out. Waiting for processes to exit. Sep 9 23:20:00.747730 systemd-logind[1563]: Removed session 12. Sep 9 23:20:05.897891 systemd[1]: Started sshd@10-10.230.66.202:22-139.178.68.195:41038.service - OpenSSH per-connection server daemon (139.178.68.195:41038). Sep 9 23:20:06.881911 sshd[5555]: Accepted publickey for core from 139.178.68.195 port 41038 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:06.885961 sshd-session[5555]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:06.895595 systemd-logind[1563]: New session 13 of user core. Sep 9 23:20:06.903254 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 23:20:07.814768 sshd[5558]: Connection closed by 139.178.68.195 port 41038 Sep 9 23:20:07.815268 sshd-session[5555]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:07.823303 systemd-logind[1563]: Session 13 logged out. Waiting for processes to exit. Sep 9 23:20:07.823660 systemd[1]: sshd@10-10.230.66.202:22-139.178.68.195:41038.service: Deactivated successfully. Sep 9 23:20:07.831327 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 23:20:07.841198 systemd-logind[1563]: Removed session 13. 
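The kubelet pod_startup_latency_tracker records in this stretch carry two durations per pod: podStartE2EDuration, measured from pod creation to observed running, and podStartSLOduration, the same interval with image-pull time excluded. The excluded window can be read straight off the monotonic m=+ offsets: for calico-kube-controllers above, 57.72 - 46.20 = 11.52 s of pulling, and 38.50 s E2E minus that window gives the 26.98 s SLO figure. A small sketch that recomputes this from the raw lines, assuming journal text on stdin and the klog field layout shown here.

import re
import sys

# kubelet "Observed pod startup duration" records; the m=+<seconds> suffix is the
# process-monotonic clock, so subtracting two of them gives an exact interval.
POD = re.compile(
    r'pod="(?P<pod>[^"]+)" podStartSLOduration=(?P<slo>[\d.]+) '
    r'podStartE2EDuration="(?P<e2e>[\d.]+)s".*?'
    r'firstStartedPulling=".*? m=\+(?P<pull_start>[\d.]+)".*?'
    r'lastFinishedPulling=".*? m=\+(?P<pull_end>[\d.]+)"'
)

for line in sys.stdin:
    m = POD.search(line)
    if not m:
        continue
    pull = float(m['pull_end']) - float(m['pull_start'])
    # SLO duration excludes pulling, so e2e should come out close to slo + pull.
    print(f"{m['pod']:<55} e2e={float(m['e2e']):7.2f}s  slo={float(m['slo']):7.2f}s  pull={pull:7.2f}s")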
Sep 9 23:20:08.876484 update_engine[1565]: I20250909 23:20:08.876344 1565 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 9 23:20:08.876484 update_engine[1565]: I20250909 23:20:08.876477 1565 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 9 23:20:08.878106 update_engine[1565]: I20250909 23:20:08.877918 1565 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 9 23:20:08.878804 update_engine[1565]: I20250909 23:20:08.878747 1565 omaha_request_params.cc:62] Current group set to developer Sep 9 23:20:08.879249 update_engine[1565]: I20250909 23:20:08.878953 1565 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 9 23:20:08.879249 update_engine[1565]: I20250909 23:20:08.878978 1565 update_attempter.cc:643] Scheduling an action processor start. Sep 9 23:20:08.879249 update_engine[1565]: I20250909 23:20:08.879016 1565 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 9 23:20:08.879249 update_engine[1565]: I20250909 23:20:08.879089 1565 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 9 23:20:08.879249 update_engine[1565]: I20250909 23:20:08.879199 1565 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 9 23:20:08.879249 update_engine[1565]: I20250909 23:20:08.879216 1565 omaha_request_action.cc:272] Request: Sep 9 23:20:08.879249 update_engine[1565]: Sep 9 23:20:08.879249 update_engine[1565]: Sep 9 23:20:08.879249 update_engine[1565]: Sep 9 23:20:08.879249 update_engine[1565]: Sep 9 23:20:08.879249 update_engine[1565]: Sep 9 23:20:08.879249 update_engine[1565]: Sep 9 23:20:08.879249 update_engine[1565]: Sep 9 23:20:08.879249 update_engine[1565]: Sep 9 23:20:08.879249 update_engine[1565]: I20250909 23:20:08.879228 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 9 23:20:08.885786 update_engine[1565]: I20250909 23:20:08.885157 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 9 23:20:08.892358 update_engine[1565]: I20250909 23:20:08.886011 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 9 23:20:08.905299 locksmithd[1619]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 9 23:20:08.909385 update_engine[1565]: E20250909 23:20:08.909152 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 9 23:20:08.909385 update_engine[1565]: I20250909 23:20:08.909313 1565 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 9 23:20:09.809513 containerd[1589]: time="2025-09-09T23:20:09.809406695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\" id:\"ea82843bebe6e4768141936d19832604e5b7d013d1dd7f3f4e3298954f75322d\" pid:5584 exited_at:{seconds:1757460009 nanos:808402495}" Sep 9 23:20:10.807402 containerd[1589]: time="2025-09-09T23:20:10.807243198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"046c9146270155302983e2c3e136e1331e3f8ffceb0b0151b7d07a1e61f08471\" pid:5608 exited_at:{seconds:1757460010 nanos:805797105}" Sep 9 23:20:11.749206 containerd[1589]: time="2025-09-09T23:20:11.749056072Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"54c7eee949b5b7b74b2cd9674311b9881078add108ceb65a57c274dd1b9840e2\" pid:5630 exited_at:{seconds:1757460011 nanos:748499150}" Sep 9 23:20:12.971345 systemd[1]: Started sshd@11-10.230.66.202:22-139.178.68.195:58456.service - OpenSSH per-connection server daemon (139.178.68.195:58456). Sep 9 23:20:13.934191 sshd[5641]: Accepted publickey for core from 139.178.68.195 port 58456 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:13.936184 sshd-session[5641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:13.943077 systemd-logind[1563]: New session 14 of user core. Sep 9 23:20:13.955400 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 23:20:14.763674 sshd[5644]: Connection closed by 139.178.68.195 port 58456 Sep 9 23:20:14.766137 sshd-session[5641]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:14.775296 systemd[1]: sshd@11-10.230.66.202:22-139.178.68.195:58456.service: Deactivated successfully. Sep 9 23:20:14.778819 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 23:20:14.780551 systemd-logind[1563]: Session 14 logged out. Waiting for processes to exit. Sep 9 23:20:14.782612 systemd-logind[1563]: Removed session 14. Sep 9 23:20:18.816040 update_engine[1565]: I20250909 23:20:18.815855 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 9 23:20:18.816040 update_engine[1565]: I20250909 23:20:18.816040 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 9 23:20:18.817992 update_engine[1565]: I20250909 23:20:18.817503 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 9 23:20:18.854610 update_engine[1565]: E20250909 23:20:18.854554 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 9 23:20:18.854713 update_engine[1565]: I20250909 23:20:18.854681 1565 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 9 23:20:19.924545 systemd[1]: Started sshd@12-10.230.66.202:22-139.178.68.195:58458.service - OpenSSH per-connection server daemon (139.178.68.195:58458). 
Sep 9 23:20:20.852802 sshd[5658]: Accepted publickey for core from 139.178.68.195 port 58458 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:20.855004 sshd-session[5658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:20.863889 systemd-logind[1563]: New session 15 of user core. Sep 9 23:20:20.868413 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 23:20:21.572527 sshd[5669]: Connection closed by 139.178.68.195 port 58458 Sep 9 23:20:21.573424 sshd-session[5658]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:21.580981 systemd-logind[1563]: Session 15 logged out. Waiting for processes to exit. Sep 9 23:20:21.581410 systemd[1]: sshd@12-10.230.66.202:22-139.178.68.195:58458.service: Deactivated successfully. Sep 9 23:20:21.584304 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 23:20:21.588453 systemd-logind[1563]: Removed session 15. Sep 9 23:20:26.731336 systemd[1]: Started sshd@13-10.230.66.202:22-139.178.68.195:37100.service - OpenSSH per-connection server daemon (139.178.68.195:37100). Sep 9 23:20:27.720500 sshd[5682]: Accepted publickey for core from 139.178.68.195 port 37100 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:27.722524 sshd-session[5682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:27.730960 systemd-logind[1563]: New session 16 of user core. Sep 9 23:20:27.740355 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 9 23:20:28.284161 containerd[1589]: time="2025-09-09T23:20:28.284092511Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"bd700034ab275517aa0ef58bf9782fce8088df5a451e1483da896fc5bb3d6808\" pid:5698 exited_at:{seconds:1757460028 nanos:279827067}" Sep 9 23:20:28.551735 sshd[5685]: Connection closed by 139.178.68.195 port 37100 Sep 9 23:20:28.552574 sshd-session[5682]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:28.563479 systemd[1]: sshd@13-10.230.66.202:22-139.178.68.195:37100.service: Deactivated successfully. Sep 9 23:20:28.564733 systemd-logind[1563]: Session 16 logged out. Waiting for processes to exit. Sep 9 23:20:28.568213 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 23:20:28.571174 systemd-logind[1563]: Removed session 16. Sep 9 23:20:28.814667 update_engine[1565]: I20250909 23:20:28.814242 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 9 23:20:28.814667 update_engine[1565]: I20250909 23:20:28.814374 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 9 23:20:28.815471 update_engine[1565]: I20250909 23:20:28.814846 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 9 23:20:28.815471 update_engine[1565]: E20250909 23:20:28.815333 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 9 23:20:28.815471 update_engine[1565]: I20250909 23:20:28.815424 1565 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 9 23:20:29.151011 containerd[1589]: time="2025-09-09T23:20:29.150581717Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"7153661f6461526a3526fe1a0194471e525e0835b293b15d191f577a847e843e\" pid:5735 exited_at:{seconds:1757460029 nanos:150133882}" Sep 9 23:20:33.708710 systemd[1]: Started sshd@14-10.230.66.202:22-139.178.68.195:39558.service - OpenSSH per-connection server daemon (139.178.68.195:39558). Sep 9 23:20:34.668218 sshd[5761]: Accepted publickey for core from 139.178.68.195 port 39558 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:34.670897 sshd-session[5761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:34.686155 systemd-logind[1563]: New session 17 of user core. Sep 9 23:20:34.689714 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 9 23:20:35.416172 sshd[5764]: Connection closed by 139.178.68.195 port 39558 Sep 9 23:20:35.417728 sshd-session[5761]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:35.426517 systemd[1]: sshd@14-10.230.66.202:22-139.178.68.195:39558.service: Deactivated successfully. Sep 9 23:20:35.431012 systemd[1]: session-17.scope: Deactivated successfully. Sep 9 23:20:35.432824 systemd-logind[1563]: Session 17 logged out. Waiting for processes to exit. Sep 9 23:20:35.434412 systemd-logind[1563]: Removed session 17. Sep 9 23:20:35.577031 systemd[1]: Started sshd@15-10.230.66.202:22-139.178.68.195:39570.service - OpenSSH per-connection server daemon (139.178.68.195:39570). Sep 9 23:20:36.559284 sshd[5777]: Accepted publickey for core from 139.178.68.195 port 39570 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:36.562635 sshd-session[5777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:36.581814 systemd-logind[1563]: New session 18 of user core. Sep 9 23:20:36.587819 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 9 23:20:37.536712 sshd[5780]: Connection closed by 139.178.68.195 port 39570 Sep 9 23:20:37.545677 sshd-session[5777]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:37.558252 systemd[1]: sshd@15-10.230.66.202:22-139.178.68.195:39570.service: Deactivated successfully. Sep 9 23:20:37.562753 systemd[1]: session-18.scope: Deactivated successfully. Sep 9 23:20:37.565346 systemd-logind[1563]: Session 18 logged out. Waiting for processes to exit. Sep 9 23:20:37.569467 systemd-logind[1563]: Removed session 18. Sep 9 23:20:37.694958 systemd[1]: Started sshd@16-10.230.66.202:22-139.178.68.195:39572.service - OpenSSH per-connection server daemon (139.178.68.195:39572). Sep 9 23:20:38.636294 sshd[5790]: Accepted publickey for core from 139.178.68.195 port 39572 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:38.638730 sshd-session[5790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:38.647304 systemd-logind[1563]: New session 19 of user core. Sep 9 23:20:38.658619 systemd[1]: Started session-19.scope - Session 19 of User core. 
Sep 9 23:20:38.814575 update_engine[1565]: I20250909 23:20:38.814396 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 9 23:20:38.814575 update_engine[1565]: I20250909 23:20:38.814587 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 9 23:20:38.815474 update_engine[1565]: I20250909 23:20:38.815327 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 9 23:20:38.815988 update_engine[1565]: E20250909 23:20:38.815927 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 9 23:20:38.816084 update_engine[1565]: I20250909 23:20:38.816062 1565 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 9 23:20:38.816136 update_engine[1565]: I20250909 23:20:38.816082 1565 omaha_request_action.cc:617] Omaha request response: Sep 9 23:20:38.816302 update_engine[1565]: E20250909 23:20:38.816268 1565 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 9 23:20:38.816471 update_engine[1565]: I20250909 23:20:38.816322 1565 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 9 23:20:38.816471 update_engine[1565]: I20250909 23:20:38.816334 1565 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 9 23:20:38.816471 update_engine[1565]: I20250909 23:20:38.816344 1565 update_attempter.cc:306] Processing Done. Sep 9 23:20:38.816471 update_engine[1565]: E20250909 23:20:38.816366 1565 update_attempter.cc:619] Update failed. Sep 9 23:20:38.818547 update_engine[1565]: I20250909 23:20:38.818190 1565 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 9 23:20:38.818705 update_engine[1565]: I20250909 23:20:38.818222 1565 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 9 23:20:38.818705 update_engine[1565]: I20250909 23:20:38.818603 1565 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Sep 9 23:20:38.818810 update_engine[1565]: I20250909 23:20:38.818719 1565 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 9 23:20:38.818810 update_engine[1565]: I20250909 23:20:38.818773 1565 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 9 23:20:38.818810 update_engine[1565]: I20250909 23:20:38.818785 1565 omaha_request_action.cc:272] Request: Sep 9 23:20:38.818810 update_engine[1565]: Sep 9 23:20:38.818810 update_engine[1565]: Sep 9 23:20:38.818810 update_engine[1565]: Sep 9 23:20:38.818810 update_engine[1565]: Sep 9 23:20:38.818810 update_engine[1565]: Sep 9 23:20:38.818810 update_engine[1565]: Sep 9 23:20:38.818810 update_engine[1565]: I20250909 23:20:38.818795 1565 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 9 23:20:38.819326 update_engine[1565]: I20250909 23:20:38.818840 1565 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 9 23:20:38.819382 update_engine[1565]: I20250909 23:20:38.819340 1565 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 9 23:20:38.820091 locksmithd[1619]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 9 23:20:38.820667 update_engine[1565]: E20250909 23:20:38.820394 1565 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 9 23:20:38.820667 update_engine[1565]: I20250909 23:20:38.820484 1565 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 9 23:20:38.820667 update_engine[1565]: I20250909 23:20:38.820509 1565 omaha_request_action.cc:617] Omaha request response: Sep 9 23:20:38.820667 update_engine[1565]: I20250909 23:20:38.820519 1565 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 9 23:20:38.820667 update_engine[1565]: I20250909 23:20:38.820528 1565 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 9 23:20:38.820667 update_engine[1565]: I20250909 23:20:38.820536 1565 update_attempter.cc:306] Processing Done. Sep 9 23:20:38.820667 update_engine[1565]: I20250909 23:20:38.820545 1565 update_attempter.cc:310] Error event sent. Sep 9 23:20:38.820667 update_engine[1565]: I20250909 23:20:38.820560 1565 update_check_scheduler.cc:74] Next update check in 47m36s Sep 9 23:20:38.821091 locksmithd[1619]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 9 23:20:39.454449 sshd[5800]: Connection closed by 139.178.68.195 port 39572 Sep 9 23:20:39.456689 sshd-session[5790]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:39.470636 systemd-logind[1563]: Session 19 logged out. Waiting for processes to exit. Sep 9 23:20:39.470997 systemd[1]: sshd@16-10.230.66.202:22-139.178.68.195:39572.service: Deactivated successfully. Sep 9 23:20:39.474137 systemd[1]: session-19.scope: Deactivated successfully. Sep 9 23:20:39.477672 systemd-logind[1563]: Removed session 19. Sep 9 23:20:39.716351 containerd[1589]: time="2025-09-09T23:20:39.715102590Z" level=info msg="TaskExit event in podsandbox handler container_id:\"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\" id:\"f3a08163bc490badc9d57fc5aa50730daa4b8bc0bf556d7b1c7074dac0d5ec54\" pid:5828 exit_status:1 exited_at:{seconds:1757460039 nanos:714570186}" Sep 9 23:20:40.797933 containerd[1589]: time="2025-09-09T23:20:40.797839866Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"37c268db7b5850057f9fcea2c849195877ce3d8b904dd0904dbdd131bd2bc0a2\" pid:5854 exited_at:{seconds:1757460040 nanos:797347084}" Sep 9 23:20:41.709455 containerd[1589]: time="2025-09-09T23:20:41.709398745Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"b4255f58b923c3f0a96f3ba08c7235d10b88d6dc541c475a9c76558111051794\" pid:5874 exited_at:{seconds:1757460041 nanos:708464411}" Sep 9 23:20:44.614392 systemd[1]: Started sshd@17-10.230.66.202:22-139.178.68.195:60796.service - OpenSSH per-connection server daemon (139.178.68.195:60796). Sep 9 23:20:45.569043 sshd[5886]: Accepted publickey for core from 139.178.68.195 port 60796 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:45.571291 sshd-session[5886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:45.580031 systemd-logind[1563]: New session 20 of user core. 
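The update_engine block above is one complete, failed Omaha update check: the request is posted to the literal host "disabled" (which suggests the update server in /etc/flatcar/update.conf has been pointed at a placeholder name to switch automatic updates off), curl cannot resolve it, three retries spaced roughly ten seconds apart fail the same way, the attempt is recorded as error 2000 / kActionCodeOmahaErrorInHTTPResponse, the follow-up error-report request hits the same resolution failure, and the next check is scheduled in 47m36s. A short sketch that condenses such a block into a single summary line, assuming journal text on stdin and the exact update_engine message wording seen here.

import re
import sys

RETRY = re.compile(r'No HTTP response, retry (\d+)')
RESOLVE = re.compile(r'Could not resolve host: (\S+)')
NEXT = re.compile(r'Next update check in (\S+)')

attempts, host, wait = 0, None, None
for line in sys.stdin:
    if (m := RETRY.search(line)):
        attempts = max(attempts, int(m.group(1)))
    if (m := RESOLVE.search(line)):
        host = m.group(1)
    if (m := NEXT.search(line)):
        wait = m.group(1)

print(f"update_engine: gave up after {attempts} retries against host {host!r}, next check in {wait}")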
Sep 9 23:20:45.587363 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 9 23:20:46.485489 sshd[5889]: Connection closed by 139.178.68.195 port 60796 Sep 9 23:20:46.486986 sshd-session[5886]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:46.493980 systemd[1]: sshd@17-10.230.66.202:22-139.178.68.195:60796.service: Deactivated successfully. Sep 9 23:20:46.498529 systemd[1]: session-20.scope: Deactivated successfully. Sep 9 23:20:46.507727 systemd-logind[1563]: Session 20 logged out. Waiting for processes to exit. Sep 9 23:20:46.509580 systemd-logind[1563]: Removed session 20. Sep 9 23:20:51.646410 systemd[1]: Started sshd@18-10.230.66.202:22-139.178.68.195:55584.service - OpenSSH per-connection server daemon (139.178.68.195:55584). Sep 9 23:20:52.598158 sshd[5900]: Accepted publickey for core from 139.178.68.195 port 55584 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:52.600092 sshd-session[5900]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:52.608757 systemd-logind[1563]: New session 21 of user core. Sep 9 23:20:52.616501 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 9 23:20:53.394382 sshd[5903]: Connection closed by 139.178.68.195 port 55584 Sep 9 23:20:53.395572 sshd-session[5900]: pam_unix(sshd:session): session closed for user core Sep 9 23:20:53.401804 systemd[1]: sshd@18-10.230.66.202:22-139.178.68.195:55584.service: Deactivated successfully. Sep 9 23:20:53.404700 systemd[1]: session-21.scope: Deactivated successfully. Sep 9 23:20:53.408568 systemd-logind[1563]: Session 21 logged out. Waiting for processes to exit. Sep 9 23:20:53.410560 systemd-logind[1563]: Removed session 21. Sep 9 23:20:58.557740 systemd[1]: Started sshd@19-10.230.66.202:22-139.178.68.195:55588.service - OpenSSH per-connection server daemon (139.178.68.195:55588). Sep 9 23:20:59.482685 sshd[5915]: Accepted publickey for core from 139.178.68.195 port 55588 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:20:59.484338 sshd-session[5915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:20:59.491071 systemd-logind[1563]: New session 22 of user core. Sep 9 23:20:59.503393 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 9 23:21:00.243338 sshd[5918]: Connection closed by 139.178.68.195 port 55588 Sep 9 23:21:00.245466 sshd-session[5915]: pam_unix(sshd:session): session closed for user core Sep 9 23:21:00.254474 systemd[1]: sshd@19-10.230.66.202:22-139.178.68.195:55588.service: Deactivated successfully. Sep 9 23:21:00.256784 systemd[1]: session-22.scope: Deactivated successfully. Sep 9 23:21:00.258633 systemd-logind[1563]: Session 22 logged out. Waiting for processes to exit. Sep 9 23:21:00.260743 systemd-logind[1563]: Removed session 22. Sep 9 23:21:00.403020 systemd[1]: Started sshd@20-10.230.66.202:22-139.178.68.195:47920.service - OpenSSH per-connection server daemon (139.178.68.195:47920). Sep 9 23:21:01.334349 sshd[5930]: Accepted publickey for core from 139.178.68.195 port 47920 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:21:01.336029 sshd-session[5930]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:21:01.343298 systemd-logind[1563]: New session 23 of user core. Sep 9 23:21:01.352404 systemd[1]: Started session-23.scope - Session 23 of User core. 
Sep 9 23:21:02.302971 sshd[5933]: Connection closed by 139.178.68.195 port 47920 Sep 9 23:21:02.306936 sshd-session[5930]: pam_unix(sshd:session): session closed for user core Sep 9 23:21:02.319530 systemd-logind[1563]: Session 23 logged out. Waiting for processes to exit. Sep 9 23:21:02.320715 systemd[1]: sshd@20-10.230.66.202:22-139.178.68.195:47920.service: Deactivated successfully. Sep 9 23:21:02.323506 systemd[1]: session-23.scope: Deactivated successfully. Sep 9 23:21:02.325993 systemd-logind[1563]: Removed session 23. Sep 9 23:21:02.470660 systemd[1]: Started sshd@21-10.230.66.202:22-139.178.68.195:47924.service - OpenSSH per-connection server daemon (139.178.68.195:47924). Sep 9 23:21:03.496331 sshd[5943]: Accepted publickey for core from 139.178.68.195 port 47924 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:21:03.498503 sshd-session[5943]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:21:03.506376 systemd-logind[1563]: New session 24 of user core. Sep 9 23:21:03.514540 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 9 23:21:07.099552 sshd[5946]: Connection closed by 139.178.68.195 port 47924 Sep 9 23:21:07.122585 sshd-session[5943]: pam_unix(sshd:session): session closed for user core Sep 9 23:21:07.166610 systemd[1]: sshd@21-10.230.66.202:22-139.178.68.195:47924.service: Deactivated successfully. Sep 9 23:21:07.166653 systemd-logind[1563]: Session 24 logged out. Waiting for processes to exit. Sep 9 23:21:07.171996 systemd[1]: session-24.scope: Deactivated successfully. Sep 9 23:21:07.175415 systemd[1]: session-24.scope: Consumed 789ms CPU time, 80M memory peak. Sep 9 23:21:07.181618 systemd-logind[1563]: Removed session 24. Sep 9 23:21:07.284615 systemd[1]: Started sshd@22-10.230.66.202:22-139.178.68.195:47934.service - OpenSSH per-connection server daemon (139.178.68.195:47934). Sep 9 23:21:08.312780 sshd[5963]: Accepted publickey for core from 139.178.68.195 port 47934 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:21:08.316554 sshd-session[5963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:21:08.362063 systemd-logind[1563]: New session 25 of user core. Sep 9 23:21:08.378218 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 9 23:21:10.704637 sshd[5969]: Connection closed by 139.178.68.195 port 47934 Sep 9 23:21:10.712677 sshd-session[5963]: pam_unix(sshd:session): session closed for user core Sep 9 23:21:10.791899 systemd[1]: sshd@22-10.230.66.202:22-139.178.68.195:47934.service: Deactivated successfully. Sep 9 23:21:10.795737 systemd[1]: session-25.scope: Deactivated successfully. Sep 9 23:21:10.798567 systemd[1]: session-25.scope: Consumed 769ms CPU time, 68.5M memory peak. Sep 9 23:21:10.802394 systemd-logind[1563]: Session 25 logged out. Waiting for processes to exit. Sep 9 23:21:10.817434 systemd-logind[1563]: Removed session 25. Sep 9 23:21:10.849554 systemd[1]: Started sshd@23-10.230.66.202:22-139.178.68.195:48808.service - OpenSSH per-connection server daemon (139.178.68.195:48808). 
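From here the journal is dominated by short SSH sessions from 139.178.68.195: each one is a publickey login for user core, a session scope started by systemd, and a teardown a few seconds later, with sessions 24 and 25 the only ones where systemd also reports consumed CPU time and peak memory at exit. A sketch that pairs the systemd-logind open/close records into per-session durations; it assumes journal text on stdin with the syslog-style timestamp prefix used here, and the year is supplied by hand because that prefix omits it.

import re
import sys
from datetime import datetime

STAMP = r'(?P<ts>\w{3} +\d+ \d{2}:\d{2}:\d{2}\.\d+)'
NEW = re.compile(STAMP + r'.*systemd-logind\[\d+\]: New session (?P<sid>\d+) of user (?P<user>\S+)\.')
GONE = re.compile(STAMP + r'.*systemd-logind\[\d+\]: Removed session (?P<sid>\d+)\.')

def parse(ts: str) -> datetime:
    # The journal prefix carries no year; 2025 is taken from the boot banner of this log.
    return datetime.strptime(f"2025 {ts}", "%Y %b %d %H:%M:%S.%f")

opened = {}
for line in sys.stdin:
    if (m := NEW.search(line)):
        opened[m['sid']] = (m['user'], parse(m['ts']))
    elif (m := GONE.search(line)) and m['sid'] in opened:
        user, start = opened.pop(m['sid'])
        length = (parse(m['ts']) - start).total_seconds()
        print(f"session {m['sid']:>3} user={user:<6} duration={length:6.1f}s")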
Sep 9 23:21:11.579521 containerd[1589]: time="2025-09-09T23:21:11.561989298Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"537f7a28843d08088bdb08d930e29303c4baa08993d388e9042b20f5fa1a16c7\" pid:6017 exited_at:{seconds:1757460071 nanos:374008566}" Sep 9 23:21:11.897508 sshd[6001]: Accepted publickey for core from 139.178.68.195 port 48808 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:21:11.904432 sshd-session[6001]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:21:11.944812 systemd-logind[1563]: New session 26 of user core. Sep 9 23:21:11.950390 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 9 23:21:12.054576 containerd[1589]: time="2025-09-09T23:21:12.054259616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"798385f6c70110704fe7840b91e0ad0ba2a97e2f49f79e7f8c6a10692f5c519e\" id:\"b7413fb1fc8d06b2f5d845588769abf18f8525bc7ea64f801da83176d15ee9b5\" pid:5988 exited_at:{seconds:1757460072 nanos:46878133}" Sep 9 23:21:12.425619 containerd[1589]: time="2025-09-09T23:21:12.425546202Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"09da80f30163cd1d1303acc4189fcbfa79c3271786c75d3915bffe32e7c5da57\" pid:6039 exited_at:{seconds:1757460072 nanos:421589358}" Sep 9 23:21:13.419742 sshd[6048]: Connection closed by 139.178.68.195 port 48808 Sep 9 23:21:13.421042 sshd-session[6001]: pam_unix(sshd:session): session closed for user core Sep 9 23:21:13.430497 systemd[1]: sshd@23-10.230.66.202:22-139.178.68.195:48808.service: Deactivated successfully. Sep 9 23:21:13.435242 systemd[1]: session-26.scope: Deactivated successfully. Sep 9 23:21:13.436789 systemd-logind[1563]: Session 26 logged out. Waiting for processes to exit. Sep 9 23:21:13.439617 systemd-logind[1563]: Removed session 26. Sep 9 23:21:18.579141 systemd[1]: Started sshd@24-10.230.66.202:22-139.178.68.195:48818.service - OpenSSH per-connection server daemon (139.178.68.195:48818). Sep 9 23:21:19.578250 sshd[6067]: Accepted publickey for core from 139.178.68.195 port 48818 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:21:19.580915 sshd-session[6067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:21:19.590461 systemd-logind[1563]: New session 27 of user core. Sep 9 23:21:19.598487 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 9 23:21:20.882990 sshd[6070]: Connection closed by 139.178.68.195 port 48818 Sep 9 23:21:20.883823 sshd-session[6067]: pam_unix(sshd:session): session closed for user core Sep 9 23:21:20.890441 systemd[1]: sshd@24-10.230.66.202:22-139.178.68.195:48818.service: Deactivated successfully. Sep 9 23:21:20.894265 systemd[1]: session-27.scope: Deactivated successfully. Sep 9 23:21:20.901130 systemd-logind[1563]: Session 27 logged out. Waiting for processes to exit. Sep 9 23:21:20.902901 systemd-logind[1563]: Removed session 27. Sep 9 23:21:26.045769 systemd[1]: Started sshd@25-10.230.66.202:22-139.178.68.195:44770.service - OpenSSH per-connection server daemon (139.178.68.195:44770). 
Sep 9 23:21:26.989955 sshd[6083]: Accepted publickey for core from 139.178.68.195 port 44770 ssh2: RSA SHA256:5l6k2ma6SSo0HOaaqvjb7RCCsoKKVX+U0QDH5LJjdtQ Sep 9 23:21:26.990880 sshd-session[6083]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:21:26.999669 systemd-logind[1563]: New session 28 of user core. Sep 9 23:21:27.007385 systemd[1]: Started session-28.scope - Session 28 of User core. Sep 9 23:21:27.970106 sshd[6086]: Connection closed by 139.178.68.195 port 44770 Sep 9 23:21:27.970861 sshd-session[6083]: pam_unix(sshd:session): session closed for user core Sep 9 23:21:27.979315 systemd[1]: sshd@25-10.230.66.202:22-139.178.68.195:44770.service: Deactivated successfully. Sep 9 23:21:27.982229 systemd[1]: session-28.scope: Deactivated successfully. Sep 9 23:21:27.989286 systemd-logind[1563]: Session 28 logged out. Waiting for processes to exit. Sep 9 23:21:27.995267 systemd-logind[1563]: Removed session 28. Sep 9 23:21:28.407272 containerd[1589]: time="2025-09-09T23:21:28.405873050Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2648cd8308f41e6f3a73c4b035fb7daa15439e8e724eadd860a3cbe6684c6f7\" id:\"261007c2d2c94170b128cf678b887c2cc7716ab391334150c553481098b9cff8\" pid:6108 exited_at:{seconds:1757460088 nanos:400314022}" Sep 9 23:21:29.164211 containerd[1589]: time="2025-09-09T23:21:29.164074195Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f37d51e1c55c2d48a08117770473141b569a432ba22c41f1d8aee5c62f6898b\" id:\"9a29a3e545eed4ca474df95fc94958fa9f08ff404a8bca44824c361057858876\" pid:6130 exited_at:{seconds:1757460089 nanos:163734966}"