Sep 9 22:14:07.933931 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 9 19:55:16 -00 2025 Sep 9 22:14:07.933973 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812 Sep 9 22:14:07.933992 kernel: BIOS-provided physical RAM map: Sep 9 22:14:07.934002 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Sep 9 22:14:07.934012 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Sep 9 22:14:07.934021 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Sep 9 22:14:07.934033 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable Sep 9 22:14:07.934043 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved Sep 9 22:14:07.934053 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Sep 9 22:14:07.934062 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Sep 9 22:14:07.934076 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 9 22:14:07.934086 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Sep 9 22:14:07.934096 kernel: NX (Execute Disable) protection: active Sep 9 22:14:07.934106 kernel: APIC: Static calls initialized Sep 9 22:14:07.934129 kernel: SMBIOS 2.8 present. Sep 9 22:14:07.934141 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014 Sep 9 22:14:07.934157 kernel: DMI: Memory slots populated: 1/1 Sep 9 22:14:07.934168 kernel: Hypervisor detected: KVM Sep 9 22:14:07.934179 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 9 22:14:07.934189 kernel: kvm-clock: using sched offset of 5616834171 cycles Sep 9 22:14:07.934201 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 9 22:14:07.934212 kernel: tsc: Detected 2799.998 MHz processor Sep 9 22:14:07.934223 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 9 22:14:07.934234 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 9 22:14:07.934245 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000 Sep 9 22:14:07.934260 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Sep 9 22:14:07.934272 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 9 22:14:07.934283 kernel: Using GB pages for direct mapping Sep 9 22:14:07.934293 kernel: ACPI: Early table checksum verification disabled Sep 9 22:14:07.934304 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS ) Sep 9 22:14:07.934315 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 22:14:07.934326 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 22:14:07.934337 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 22:14:07.934348 kernel: ACPI: FACS 0x000000007FFDFD40 000040 Sep 9 22:14:07.934364 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 22:14:07.934375 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS 
BXPC 00000001 BXPC 00000001) Sep 9 22:14:07.934386 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 22:14:07.934397 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 22:14:07.934408 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480] Sep 9 22:14:07.934419 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c] Sep 9 22:14:07.934435 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f] Sep 9 22:14:07.934450 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570] Sep 9 22:14:07.934462 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740] Sep 9 22:14:07.934473 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c] Sep 9 22:14:07.934485 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4] Sep 9 22:14:07.934496 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Sep 9 22:14:07.934508 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Sep 9 22:14:07.934519 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug Sep 9 22:14:07.934535 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00001000-0x7ffdbfff] Sep 9 22:14:07.934546 kernel: NODE_DATA(0) allocated [mem 0x7ffd4dc0-0x7ffdbfff] Sep 9 22:14:07.934558 kernel: Zone ranges: Sep 9 22:14:07.934569 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 9 22:14:07.934581 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff] Sep 9 22:14:07.934592 kernel: Normal empty Sep 9 22:14:07.934603 kernel: Device empty Sep 9 22:14:07.934614 kernel: Movable zone start for each node Sep 9 22:14:07.934626 kernel: Early memory node ranges Sep 9 22:14:07.934637 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Sep 9 22:14:07.934652 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff] Sep 9 22:14:07.934664 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff] Sep 9 22:14:07.934675 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 9 22:14:07.934687 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Sep 9 22:14:07.934698 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges Sep 9 22:14:07.934719 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 9 22:14:07.934730 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 9 22:14:07.934741 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 9 22:14:07.934753 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 9 22:14:07.934769 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 9 22:14:07.937404 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 9 22:14:07.937420 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 9 22:14:07.937432 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 9 22:14:07.937443 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 9 22:14:07.937455 kernel: TSC deadline timer available Sep 9 22:14:07.937466 kernel: CPU topo: Max. logical packages: 16 Sep 9 22:14:07.937478 kernel: CPU topo: Max. logical dies: 16 Sep 9 22:14:07.937489 kernel: CPU topo: Max. dies per package: 1 Sep 9 22:14:07.937520 kernel: CPU topo: Max. threads per core: 1 Sep 9 22:14:07.937531 kernel: CPU topo: Num. cores per package: 1 Sep 9 22:14:07.937542 kernel: CPU topo: Num. 
threads per package: 1 Sep 9 22:14:07.937553 kernel: CPU topo: Allowing 2 present CPUs plus 14 hotplug CPUs Sep 9 22:14:07.937564 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 9 22:14:07.937588 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Sep 9 22:14:07.937599 kernel: Booting paravirtualized kernel on KVM Sep 9 22:14:07.937611 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 9 22:14:07.937623 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 9 22:14:07.937639 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 9 22:14:07.937650 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 9 22:14:07.937662 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 9 22:14:07.937673 kernel: kvm-guest: PV spinlocks enabled Sep 9 22:14:07.937685 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 9 22:14:07.937698 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812 Sep 9 22:14:07.937710 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 9 22:14:07.937729 kernel: random: crng init done Sep 9 22:14:07.938062 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 9 22:14:07.938079 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Sep 9 22:14:07.938091 kernel: Fallback order for Node 0: 0 Sep 9 22:14:07.938102 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524154 Sep 9 22:14:07.938124 kernel: Policy zone: DMA32 Sep 9 22:14:07.938137 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 9 22:14:07.938148 kernel: software IO TLB: area num 16. Sep 9 22:14:07.938160 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 9 22:14:07.938171 kernel: Kernel/User page tables isolation: enabled Sep 9 22:14:07.938189 kernel: ftrace: allocating 40102 entries in 157 pages Sep 9 22:14:07.938201 kernel: ftrace: allocated 157 pages with 5 groups Sep 9 22:14:07.938212 kernel: Dynamic Preempt: voluntary Sep 9 22:14:07.938224 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 9 22:14:07.938236 kernel: rcu: RCU event tracing is enabled. Sep 9 22:14:07.938248 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 9 22:14:07.938260 kernel: Trampoline variant of Tasks RCU enabled. Sep 9 22:14:07.938271 kernel: Rude variant of Tasks RCU enabled. Sep 9 22:14:07.938283 kernel: Tracing variant of Tasks RCU enabled. Sep 9 22:14:07.938298 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 9 22:14:07.938310 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 9 22:14:07.938322 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 9 22:14:07.938334 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
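
The command line recorded above appears twice: once as received from the bootloader ("Command line:" near the top) and once after early parsing ("Kernel command line:" here), with the note that unknown parameters such as BOOT_IMAGE=/flatcar/vmlinuz-a are passed through to user space. As an aid for reading it, a minimal Python sketch (illustrative only, not the kernel's own parser) that splits such a command line, e.g. from /proc/cmdline, into bare flags and key=value options:

    # Minimal sketch: split a kernel command line like the one logged above (or read
    # from /proc/cmdline) into bare flags and key=value options. Illustrative only;
    # this is not the kernel's parser and it ignores quoting of values with spaces.
    cmdline = (
        "BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr "
        "verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw "
        "mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 "
        "console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack "
        "flatcar.autologin"
    )

    flags, options = [], {}
    for token in cmdline.split():
        if "=" in token:
            key, value = token.split("=", 1)
            options.setdefault(key, []).append(value)   # repeated keys (console=) keep all values
        else:
            flags.append(token)

    print(flags)               # ['flatcar.autologin']
    print(options["console"])  # ['ttyS0,115200n8', 'tty0']
    print(options["root"])     # ['LABEL=ROOT']
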
Sep 9 22:14:07.938345 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 9 22:14:07.938357 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16 Sep 9 22:14:07.938369 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 9 22:14:07.938404 kernel: Console: colour VGA+ 80x25 Sep 9 22:14:07.938416 kernel: printk: legacy console [tty0] enabled Sep 9 22:14:07.938428 kernel: printk: legacy console [ttyS0] enabled Sep 9 22:14:07.938440 kernel: ACPI: Core revision 20240827 Sep 9 22:14:07.938451 kernel: APIC: Switch to symmetric I/O mode setup Sep 9 22:14:07.938467 kernel: x2apic enabled Sep 9 22:14:07.938478 kernel: APIC: Switched APIC routing to: physical x2apic Sep 9 22:14:07.938490 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Sep 9 22:14:07.938502 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998) Sep 9 22:14:07.938514 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 9 22:14:07.938529 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Sep 9 22:14:07.938541 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Sep 9 22:14:07.938565 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 9 22:14:07.938576 kernel: Spectre V2 : Mitigation: Retpolines Sep 9 22:14:07.938587 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 9 22:14:07.938598 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls Sep 9 22:14:07.938622 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 9 22:14:07.938634 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 9 22:14:07.938646 kernel: MDS: Mitigation: Clear CPU buffers Sep 9 22:14:07.938663 kernel: MMIO Stale Data: Unknown: No mitigations Sep 9 22:14:07.938675 kernel: SRBDS: Unknown: Dependent on hypervisor status Sep 9 22:14:07.938691 kernel: active return thunk: its_return_thunk Sep 9 22:14:07.938703 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 9 22:14:07.938715 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 9 22:14:07.938727 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 9 22:14:07.938738 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 9 22:14:07.938750 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 9 22:14:07.938762 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. Sep 9 22:14:07.938774 kernel: Freeing SMP alternatives memory: 32K Sep 9 22:14:07.938786 kernel: pid_max: default: 32768 minimum: 301 Sep 9 22:14:07.938798 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 9 22:14:07.938809 kernel: landlock: Up and running. Sep 9 22:14:07.938839 kernel: SELinux: Initializing. Sep 9 22:14:07.938852 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 9 22:14:07.938864 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 9 22:14:07.938876 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9) Sep 9 22:14:07.938888 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only. 
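
A small consistency check on the timing figures above: the TSC is detected at 2799.998 MHz, the boot CPU reports 5599.99 BogoMIPS with lpj=2799998, and the two online CPUs later total 11199.99 BogoMIPS. Assuming a 1000 Hz tick (CONFIG_HZ=1000 is an assumption; the log does not state it), all of these follow from the same loops_per_jiffy value:

    # BogoMIPS is derived from loops_per_jiffy as lpj / (500000 / HZ).
    # Assumption: HZ = 1000; note lpj then equals the TSC frequency in kHz (2799.998 MHz).
    HZ = 1000
    lpj = 2799998                      # from "(lpj=2799998)" in the log

    per_cpu = lpj * HZ / 500_000
    print(per_cpu)                     # 5599.996 -> printed (truncated) as 5599.99

    total = 2 * lpj * HZ / 500_000     # two CPUs are brought online later in the log
    print(total)                       # 11199.992 -> printed as 11199.99
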
Sep 9 22:14:07.938900 kernel: signal: max sigframe size: 1776 Sep 9 22:14:07.938912 kernel: rcu: Hierarchical SRCU implementation. Sep 9 22:14:07.938924 kernel: rcu: Max phase no-delay instances is 400. Sep 9 22:14:07.938937 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 9 22:14:07.938949 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Sep 9 22:14:07.938965 kernel: smp: Bringing up secondary CPUs ... Sep 9 22:14:07.938977 kernel: smpboot: x86: Booting SMP configuration: Sep 9 22:14:07.938989 kernel: .... node #0, CPUs: #1 Sep 9 22:14:07.939001 kernel: smp: Brought up 1 node, 2 CPUs Sep 9 22:14:07.939013 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS) Sep 9 22:14:07.939026 kernel: Memory: 1895688K/2096616K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54092K init, 2876K bss, 194920K reserved, 0K cma-reserved) Sep 9 22:14:07.939038 kernel: devtmpfs: initialized Sep 9 22:14:07.939050 kernel: x86/mm: Memory block size: 128MB Sep 9 22:14:07.939063 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 9 22:14:07.939079 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 9 22:14:07.939091 kernel: pinctrl core: initialized pinctrl subsystem Sep 9 22:14:07.939103 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 9 22:14:07.939125 kernel: audit: initializing netlink subsys (disabled) Sep 9 22:14:07.939138 kernel: audit: type=2000 audit(1757456044.149:1): state=initialized audit_enabled=0 res=1 Sep 9 22:14:07.939149 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 9 22:14:07.939161 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 9 22:14:07.939173 kernel: cpuidle: using governor menu Sep 9 22:14:07.939185 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 9 22:14:07.939202 kernel: dca service started, version 1.12.1 Sep 9 22:14:07.939214 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Sep 9 22:14:07.939226 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry Sep 9 22:14:07.939238 kernel: PCI: Using configuration type 1 for base access Sep 9 22:14:07.939251 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
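
The ECAM window reserved above ([mem 0xb0000000-0xbfffffff] for domain 0000, buses 00-ff) is 256 MB: 1 MB per bus, 32 KB per device, 4 KB of configuration space per function. A short sketch of the standard ECAM address formula, using the AHCI controller enumerated later in this log (0000:00:1f.2) as the example; the resulting address is computed here, not taken from the log:

    # ECAM ("enhanced configuration access mechanism") address calculation:
    # each PCIe function gets 4 KiB of config space at base + (bus<<20 | dev<<15 | fn<<12).
    ECAM_BASE = 0xB0000000              # from "PCI: ECAM [mem 0xb0000000-0xbfffffff]"

    def ecam_addr(bus: int, dev: int, fn: int, offset: int = 0) -> int:
        assert 0 <= bus < 256 and 0 <= dev < 32 and 0 <= fn < 8 and 0 <= offset < 4096
        return ECAM_BASE + (bus << 20) + (dev << 15) + (fn << 12) + offset

    # 0000:00:1f.2 -- the SATA/AHCI controller that appears later in this log
    print(hex(ecam_addr(0x00, 0x1F, 0x2)))   # 0xb00fa000
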
Sep 9 22:14:07.939263 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 9 22:14:07.939275 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 9 22:14:07.939286 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 9 22:14:07.939303 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 9 22:14:07.939315 kernel: ACPI: Added _OSI(Module Device) Sep 9 22:14:07.939327 kernel: ACPI: Added _OSI(Processor Device) Sep 9 22:14:07.939339 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 9 22:14:07.939351 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 9 22:14:07.939363 kernel: ACPI: Interpreter enabled Sep 9 22:14:07.939375 kernel: ACPI: PM: (supports S0 S5) Sep 9 22:14:07.939387 kernel: ACPI: Using IOAPIC for interrupt routing Sep 9 22:14:07.939399 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 9 22:14:07.939411 kernel: PCI: Using E820 reservations for host bridge windows Sep 9 22:14:07.939427 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 9 22:14:07.939439 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 9 22:14:07.939679 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 9 22:14:07.941124 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 9 22:14:07.941288 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 9 22:14:07.941308 kernel: PCI host bridge to bus 0000:00 Sep 9 22:14:07.941489 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 9 22:14:07.941649 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 9 22:14:07.941788 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 9 22:14:07.948973 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window] Sep 9 22:14:07.949128 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Sep 9 22:14:07.949269 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window] Sep 9 22:14:07.949408 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 9 22:14:07.949594 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 9 22:14:07.949790 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 conventional PCI endpoint Sep 9 22:14:07.949950 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfa000000-0xfbffffff pref] Sep 9 22:14:07.950132 kernel: pci 0000:00:01.0: BAR 1 [mem 0xfea50000-0xfea50fff] Sep 9 22:14:07.950304 kernel: pci 0000:00:01.0: ROM [mem 0xfea40000-0xfea4ffff pref] Sep 9 22:14:07.950460 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 9 22:14:07.950635 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 22:14:07.950816 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea51000-0xfea51fff] Sep 9 22:14:07.951002 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 22:14:07.951189 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Sep 9 22:14:07.951440 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 22:14:07.951653 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 22:14:07.951847 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea52000-0xfea52fff] Sep 9 22:14:07.952003 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 22:14:07.952188 kernel: pci 0000:00:02.1: 
bridge window [mem 0xfe800000-0xfe9fffff] Sep 9 22:14:07.952341 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 22:14:07.952540 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 22:14:07.952706 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea53000-0xfea53fff] Sep 9 22:14:07.952873 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 22:14:07.953026 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Sep 9 22:14:07.953206 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 22:14:07.953375 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 22:14:07.953551 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea54000-0xfea54fff] Sep 9 22:14:07.953711 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 22:14:07.953989 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Sep 9 22:14:07.954188 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 22:14:07.954354 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 22:14:07.954506 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea55000-0xfea55fff] Sep 9 22:14:07.954665 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 22:14:07.954853 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Sep 9 22:14:07.955020 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 22:14:07.955202 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 22:14:07.955356 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea56000-0xfea56fff] Sep 9 22:14:07.955508 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 22:14:07.955681 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Sep 9 22:14:07.956869 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 22:14:07.957059 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 22:14:07.957254 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea57000-0xfea57fff] Sep 9 22:14:07.957409 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 22:14:07.957562 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Sep 9 22:14:07.957715 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 22:14:07.958415 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 22:14:07.958598 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea58000-0xfea58fff] Sep 9 22:14:07.958752 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 22:14:07.961967 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Sep 9 22:14:07.962177 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 22:14:07.962343 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 9 22:14:07.962496 kernel: pci 0000:00:03.0: BAR 0 [io 0xc0c0-0xc0df] Sep 9 22:14:07.962669 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfea59000-0xfea59fff] Sep 9 22:14:07.962883 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref] Sep 9 22:14:07.963041 kernel: pci 0000:00:03.0: ROM [mem 0xfea00000-0xfea3ffff pref] Sep 9 22:14:07.963217 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 9 22:14:07.963370 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f] Sep 9 22:14:07.963521 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfea5a000-0xfea5afff] Sep 9 22:14:07.963703 kernel: pci 0000:00:04.0: BAR 4 
[mem 0xfd004000-0xfd007fff 64bit pref] Sep 9 22:14:07.963907 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 9 22:14:07.964068 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 9 22:14:07.964247 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 9 22:14:07.964415 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc0e0-0xc0ff] Sep 9 22:14:07.964566 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea5b000-0xfea5bfff] Sep 9 22:14:07.964744 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 9 22:14:07.964943 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Sep 9 22:14:07.965170 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 9 22:14:07.965328 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfda00000-0xfda000ff 64bit] Sep 9 22:14:07.965483 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 22:14:07.965671 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Sep 9 22:14:07.965854 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 22:14:07.966048 kernel: pci_bus 0000:02: extended config space not accessible Sep 9 22:14:07.966260 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 conventional PCI endpoint Sep 9 22:14:07.966431 kernel: pci 0000:02:01.0: BAR 0 [mem 0xfd800000-0xfd80000f] Sep 9 22:14:07.966589 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 22:14:07.966802 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Sep 9 22:14:07.966985 kernel: pci 0000:03:00.0: BAR 0 [mem 0xfe800000-0xfe803fff 64bit] Sep 9 22:14:07.967151 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 22:14:07.967354 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Sep 9 22:14:07.967525 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref] Sep 9 22:14:07.967697 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 22:14:07.968651 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 22:14:07.968845 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 22:14:07.969017 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 22:14:07.969217 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 22:14:07.969369 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 22:14:07.969395 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 9 22:14:07.969409 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 9 22:14:07.969422 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 9 22:14:07.969434 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 9 22:14:07.969446 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 9 22:14:07.969467 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 9 22:14:07.969478 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 9 22:14:07.969490 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 9 22:14:07.969507 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 9 22:14:07.969519 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 9 22:14:07.969538 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 9 22:14:07.969551 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 9 22:14:07.969562 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 9 22:14:07.969574 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 9 22:14:07.969587 kernel: ACPI: PCI: 
Interrupt link GSIG configured for IRQ 22 Sep 9 22:14:07.969599 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 9 22:14:07.969611 kernel: iommu: Default domain type: Translated Sep 9 22:14:07.969627 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 9 22:14:07.969639 kernel: PCI: Using ACPI for IRQ routing Sep 9 22:14:07.969651 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 9 22:14:07.969663 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Sep 9 22:14:07.969675 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff] Sep 9 22:14:07.969891 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 9 22:14:07.970058 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 9 22:14:07.970231 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 9 22:14:07.970250 kernel: vgaarb: loaded Sep 9 22:14:07.970269 kernel: clocksource: Switched to clocksource kvm-clock Sep 9 22:14:07.970282 kernel: VFS: Disk quotas dquot_6.6.0 Sep 9 22:14:07.970294 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 9 22:14:07.970306 kernel: pnp: PnP ACPI init Sep 9 22:14:07.970465 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved Sep 9 22:14:07.970485 kernel: pnp: PnP ACPI: found 5 devices Sep 9 22:14:07.970498 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 9 22:14:07.970510 kernel: NET: Registered PF_INET protocol family Sep 9 22:14:07.970528 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 9 22:14:07.970540 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 9 22:14:07.970553 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 9 22:14:07.970565 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 9 22:14:07.970578 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 9 22:14:07.970590 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 9 22:14:07.970602 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 9 22:14:07.970614 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 9 22:14:07.970626 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 9 22:14:07.970643 kernel: NET: Registered PF_XDP protocol family Sep 9 22:14:07.970810 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000 Sep 9 22:14:07.970964 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Sep 9 22:14:07.971150 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Sep 9 22:14:07.971302 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Sep 9 22:14:07.971453 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 9 22:14:07.971604 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 9 22:14:07.971755 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 9 22:14:07.971932 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 9 22:14:07.972083 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Sep 9 22:14:07.972246 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Sep 9 22:14:07.972397 kernel: pci 0000:00:02.2: bridge 
window [io 0x3000-0x3fff]: assigned Sep 9 22:14:07.972547 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Sep 9 22:14:07.972701 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Sep 9 22:14:07.972867 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Sep 9 22:14:07.973100 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Sep 9 22:14:07.973273 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Sep 9 22:14:07.973444 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 22:14:07.973627 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Sep 9 22:14:07.973805 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 22:14:07.973962 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Sep 9 22:14:07.974133 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Sep 9 22:14:07.974286 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 22:14:07.974447 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 22:14:07.974617 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Sep 9 22:14:07.976850 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Sep 9 22:14:07.977026 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 22:14:07.977204 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 22:14:07.977368 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Sep 9 22:14:07.977547 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Sep 9 22:14:07.977713 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 22:14:07.977912 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 22:14:07.978077 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Sep 9 22:14:07.978264 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Sep 9 22:14:07.978418 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 22:14:07.978578 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 22:14:07.978729 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Sep 9 22:14:07.978907 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Sep 9 22:14:07.979060 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 22:14:07.979225 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 22:14:07.979378 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Sep 9 22:14:07.979542 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Sep 9 22:14:07.979733 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 22:14:07.979899 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 22:14:07.980051 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Sep 9 22:14:07.980224 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Sep 9 22:14:07.980376 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 22:14:07.980555 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 22:14:07.980708 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Sep 9 22:14:07.981107 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Sep 9 22:14:07.981277 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 22:14:07.981422 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 9 22:14:07.981583 kernel: pci_bus 0000:00: resource 5 [io 
0x0d00-0xffff window] Sep 9 22:14:07.981721 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 9 22:14:07.981883 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window] Sep 9 22:14:07.982053 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Sep 9 22:14:07.982221 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window] Sep 9 22:14:07.982376 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Sep 9 22:14:07.982520 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff] Sep 9 22:14:07.982666 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 22:14:07.982832 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff] Sep 9 22:14:07.982992 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff] Sep 9 22:14:07.983150 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff] Sep 9 22:14:07.983294 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 22:14:07.983445 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff] Sep 9 22:14:07.983588 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff] Sep 9 22:14:07.983729 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 22:14:07.983914 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Sep 9 22:14:07.984081 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff] Sep 9 22:14:07.984236 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 22:14:07.984393 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff] Sep 9 22:14:07.984536 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff] Sep 9 22:14:07.984682 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 22:14:07.984869 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff] Sep 9 22:14:07.985036 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff] Sep 9 22:14:07.985196 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 22:14:07.985348 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff] Sep 9 22:14:07.985491 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Sep 9 22:14:07.985634 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 22:14:07.985799 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff] Sep 9 22:14:07.985946 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Sep 9 22:14:07.986096 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 22:14:07.986126 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 9 22:14:07.986140 kernel: PCI: CLS 0 bytes, default 64 Sep 9 22:14:07.986153 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 9 22:14:07.986166 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Sep 9 22:14:07.986179 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 9 22:14:07.986191 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Sep 9 22:14:07.986204 kernel: Initialise system trusted keyrings Sep 9 22:14:07.986223 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 9 22:14:07.986236 kernel: Key type asymmetric registered Sep 9 22:14:07.986249 kernel: Asymmetric key parser 'x509' registered Sep 9 22:14:07.986261 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 9 22:14:07.986274 kernel: io scheduler 
mq-deadline registered Sep 9 22:14:07.986287 kernel: io scheduler kyber registered Sep 9 22:14:07.986304 kernel: io scheduler bfq registered Sep 9 22:14:07.986455 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 9 22:14:07.986607 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 9 22:14:07.986767 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 22:14:07.986934 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 9 22:14:07.987085 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 9 22:14:07.987250 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 22:14:07.987414 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 9 22:14:07.987574 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 9 22:14:07.987733 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 22:14:07.988580 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 9 22:14:07.988761 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Sep 9 22:14:07.988935 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 22:14:07.989088 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Sep 9 22:14:07.989259 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 9 22:14:07.989420 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 22:14:07.989595 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 9 22:14:07.989747 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 9 22:14:07.990256 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 22:14:07.990485 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 9 22:14:07.990668 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 9 22:14:07.990863 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 22:14:07.991018 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 9 22:14:07.991193 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Sep 9 22:14:07.991345 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 22:14:07.991365 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 9 22:14:07.991379 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 9 22:14:07.991398 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 9 22:14:07.991411 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 22:14:07.991424 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 9 22:14:07.991437 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 9 22:14:07.991450 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 9 22:14:07.991463 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 9 22:14:07.991619 kernel: rtc_cmos 00:03: RTC can wake from S4 Sep 9 22:14:07.991640 kernel: input: AT Translated Set 2 keyboard as 
/devices/platform/i8042/serio0/input/input0 Sep 9 22:14:07.991810 kernel: rtc_cmos 00:03: registered as rtc0 Sep 9 22:14:07.991982 kernel: rtc_cmos 00:03: setting system clock to 2025-09-09T22:14:07 UTC (1757456047) Sep 9 22:14:07.992141 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Sep 9 22:14:07.992167 kernel: intel_pstate: CPU model not supported Sep 9 22:14:07.992180 kernel: NET: Registered PF_INET6 protocol family Sep 9 22:14:07.992192 kernel: Segment Routing with IPv6 Sep 9 22:14:07.992205 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 22:14:07.992218 kernel: NET: Registered PF_PACKET protocol family Sep 9 22:14:07.992231 kernel: Key type dns_resolver registered Sep 9 22:14:07.992249 kernel: IPI shorthand broadcast: enabled Sep 9 22:14:07.992262 kernel: sched_clock: Marking stable (3359052771, 221692592)->(3709474896, -128729533) Sep 9 22:14:07.992275 kernel: registered taskstats version 1 Sep 9 22:14:07.992288 kernel: Loading compiled-in X.509 certificates Sep 9 22:14:07.992301 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 003b39862f2a560eb5545d7d88a07fc5bdfce075' Sep 9 22:14:07.992313 kernel: Demotion targets for Node 0: null Sep 9 22:14:07.992326 kernel: Key type .fscrypt registered Sep 9 22:14:07.992338 kernel: Key type fscrypt-provisioning registered Sep 9 22:14:07.992351 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 9 22:14:07.992368 kernel: ima: Allocated hash algorithm: sha1 Sep 9 22:14:07.992393 kernel: ima: No architecture policies found Sep 9 22:14:07.992405 kernel: clk: Disabling unused clocks Sep 9 22:14:07.992417 kernel: Warning: unable to open an initial console. Sep 9 22:14:07.992430 kernel: Freeing unused kernel image (initmem) memory: 54092K Sep 9 22:14:07.992442 kernel: Write protecting the kernel read-only data: 24576k Sep 9 22:14:07.992467 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 9 22:14:07.992479 kernel: Run /init as init process Sep 9 22:14:07.992492 kernel: with arguments: Sep 9 22:14:07.992509 kernel: /init Sep 9 22:14:07.992522 kernel: with environment: Sep 9 22:14:07.992534 kernel: HOME=/ Sep 9 22:14:07.992546 kernel: TERM=linux Sep 9 22:14:07.992559 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 22:14:07.992584 systemd[1]: Successfully made /usr/ read-only. Sep 9 22:14:07.992603 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 22:14:07.992625 systemd[1]: Detected virtualization kvm. Sep 9 22:14:07.992639 systemd[1]: Detected architecture x86-64. Sep 9 22:14:07.992652 systemd[1]: Running in initrd. Sep 9 22:14:07.992665 systemd[1]: No hostname configured, using default hostname. Sep 9 22:14:07.992679 systemd[1]: Hostname set to . Sep 9 22:14:07.992693 systemd[1]: Initializing machine ID from VM UUID. Sep 9 22:14:07.992706 systemd[1]: Queued start job for default target initrd.target. Sep 9 22:14:07.992720 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 22:14:07.992733 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
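
The rtc_cmos line above sets the system clock to 2025-09-09T22:14:07 UTC and shows the corresponding epoch value (1757456047). A one-liner confirming the two are the same instant:

    from datetime import datetime, timezone

    # "rtc_cmos 00:03: setting system clock to 2025-09-09T22:14:07 UTC (1757456047)"
    print(datetime.fromtimestamp(1757456047, tz=timezone.utc).isoformat())
    # -> 2025-09-09T22:14:07+00:00
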
Sep 9 22:14:07.992765 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 9 22:14:07.992778 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 22:14:07.992839 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 9 22:14:07.992861 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 9 22:14:07.992876 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 9 22:14:07.992890 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 9 22:14:07.992910 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 22:14:07.992924 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 22:14:07.992937 systemd[1]: Reached target paths.target - Path Units. Sep 9 22:14:07.992951 systemd[1]: Reached target slices.target - Slice Units. Sep 9 22:14:07.992964 systemd[1]: Reached target swap.target - Swaps. Sep 9 22:14:07.992978 systemd[1]: Reached target timers.target - Timer Units. Sep 9 22:14:07.992991 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 22:14:07.993005 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 22:14:07.993018 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 9 22:14:07.993037 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 9 22:14:07.993050 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 22:14:07.993064 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 22:14:07.993077 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 22:14:07.993091 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 22:14:07.993126 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 9 22:14:07.993140 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 22:14:07.993154 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 9 22:14:07.993173 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 9 22:14:07.993187 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 22:14:07.993200 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 22:14:07.993214 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 22:14:07.993227 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 22:14:07.993241 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 22:14:07.993300 systemd-journald[230]: Collecting audit messages is disabled. Sep 9 22:14:07.993332 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 22:14:07.993347 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 22:14:07.993366 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
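
The device unit names systemd waits for above, such as dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device for /dev/disk/by-label/EFI-SYSTEM, come from systemd's path escaping: "/" becomes "-" and problematic characters such as a literal "-" become \xNN. A rough Python approximation of that escaping; the authoritative behaviour is systemd-escape --path, and this sketch skips corner cases such as leading dots and the root path:

    # Approximate systemd path escaping (cf. `systemd-escape --path --suffix=device`).
    # Simplified and illustrative only.
    def escape_path_unit(path: str, suffix: str = "device") -> str:
        trimmed = path.strip("/")
        out = []
        for i, ch in enumerate(trimmed):
            if ch == "/":
                out.append("-")                       # path separators become dashes
            elif ch.isalnum() or ch == "_" or (ch == "." and i > 0):
                out.append(ch)
            else:
                out.append("\\x%02x" % ord(ch))       # everything else becomes \xNN
        return "".join(out) + "." + suffix

    print(escape_path_unit("/dev/disk/by-label/EFI-SYSTEM"))
    # -> dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device
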
Sep 9 22:14:07.993381 systemd-journald[230]: Journal started Sep 9 22:14:07.993409 systemd-journald[230]: Runtime Journal (/run/log/journal/015ebde908bb4833ace0e87e54de2c92) is 4.7M, max 38.2M, 33.4M free. Sep 9 22:14:07.951530 systemd-modules-load[231]: Inserted module 'overlay' Sep 9 22:14:08.060830 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 22:14:08.060874 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 9 22:14:08.060894 kernel: Bridge firewalling registered Sep 9 22:14:08.008162 systemd-modules-load[231]: Inserted module 'br_netfilter' Sep 9 22:14:08.061461 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 22:14:08.062804 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 22:14:08.064246 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 22:14:08.069736 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 22:14:08.073991 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 22:14:08.078811 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 22:14:08.080970 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 22:14:08.102618 systemd-tmpfiles[250]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 22:14:08.102738 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 22:14:08.110138 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 22:14:08.112160 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 22:14:08.114021 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 22:14:08.117562 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 22:14:08.120957 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 22:14:08.155021 dracut-cmdline[267]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812 Sep 9 22:14:08.177044 systemd-resolved[268]: Positive Trust Anchors: Sep 9 22:14:08.177070 systemd-resolved[268]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 22:14:08.177140 systemd-resolved[268]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 22:14:08.181673 systemd-resolved[268]: Defaulting to hostname 'linux'. Sep 9 22:14:08.183464 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 22:14:08.185123 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 22:14:08.268840 kernel: SCSI subsystem initialized Sep 9 22:14:08.281821 kernel: Loading iSCSI transport class v2.0-870. Sep 9 22:14:08.295813 kernel: iscsi: registered transport (tcp) Sep 9 22:14:08.321902 kernel: iscsi: registered transport (qla4xxx) Sep 9 22:14:08.321957 kernel: QLogic iSCSI HBA Driver Sep 9 22:14:08.345247 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 22:14:08.368828 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 22:14:08.371795 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 22:14:08.426708 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 22:14:08.430075 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 22:14:08.489900 kernel: raid6: sse2x4 gen() 7954 MB/s Sep 9 22:14:08.507805 kernel: raid6: sse2x2 gen() 5608 MB/s Sep 9 22:14:08.526273 kernel: raid6: sse2x1 gen() 5626 MB/s Sep 9 22:14:08.526336 kernel: raid6: using algorithm sse2x4 gen() 7954 MB/s Sep 9 22:14:08.545301 kernel: raid6: .... xor() 5088 MB/s, rmw enabled Sep 9 22:14:08.545352 kernel: raid6: using ssse3x2 recovery algorithm Sep 9 22:14:08.569818 kernel: xor: automatically using best checksumming function avx Sep 9 22:14:08.753837 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 22:14:08.762848 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 22:14:08.766415 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 22:14:08.814510 systemd-udevd[477]: Using default interface naming scheme 'v255'. Sep 9 22:14:08.824000 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 22:14:08.828168 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 22:14:08.854004 dracut-pre-trigger[484]: rd.md=0: removing MD RAID activation Sep 9 22:14:08.885003 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 22:14:08.887849 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 22:14:09.003546 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 22:14:09.007053 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
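
The raid6 lines above benchmark each available gen() implementation (sse2x4 7954 MB/s, sse2x2 5608 MB/s, sse2x1 5626 MB/s) and keep the fastest, which is why the log settles on "using algorithm sse2x4 gen() 7954 MB/s" (and, separately, picks AVX for xor checksumming). The selection step itself is just an argmax over measured throughput; a trivial sketch with the numbers from this log:

    # Pick the fastest raid6 gen() routine, as the kernel does after benchmarking.
    # The MB/s figures are the ones printed in the log above.
    results = {"sse2x4": 7954, "sse2x2": 5608, "sse2x1": 5626}

    best = max(results, key=results.get)
    print(f"raid6: using algorithm {best} gen() {results[best]} MB/s")
    # -> raid6: using algorithm sse2x4 gen() 7954 MB/s
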
Sep 9 22:14:09.110068 kernel: ACPI: bus type USB registered Sep 9 22:14:09.110162 kernel: usbcore: registered new interface driver usbfs Sep 9 22:14:09.112820 kernel: usbcore: registered new interface driver hub Sep 9 22:14:09.112855 kernel: usbcore: registered new device driver usb Sep 9 22:14:09.151822 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Sep 9 22:14:09.155798 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Sep 9 22:14:09.157329 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 22:14:09.167069 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Sep 9 22:14:09.180066 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 9 22:14:09.185834 kernel: AES CTR mode by8 optimization enabled Sep 9 22:14:09.200264 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 22:14:09.201522 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 22:14:09.211729 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 22:14:09.217837 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Sep 9 22:14:09.218337 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 22:14:09.220609 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 22:14:09.223304 kernel: libata version 3.00 loaded. Sep 9 22:14:09.237417 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 9 22:14:09.237662 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 22:14:09.237682 kernel: GPT:17805311 != 125829119 Sep 9 22:14:09.237707 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 22:14:09.237723 kernel: GPT:17805311 != 125829119 Sep 9 22:14:09.237739 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 22:14:09.237772 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 22:14:09.241547 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 9 22:14:09.241769 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Sep 9 22:14:09.244860 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Sep 9 22:14:09.246815 kernel: hub 1-0:1.0: USB hub found Sep 9 22:14:09.250803 kernel: hub 1-0:1.0: 4 ports detected Sep 9 22:14:09.253449 kernel: ahci 0000:00:1f.2: version 3.0 Sep 9 22:14:09.253676 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 9 22:14:09.256812 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
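
The GPT warnings above ("Primary header thinks Alt. header is not at the end of the disk", "GPT:17805311 != 125829119") mean the backup GPT header sits at LBA 17805311 while the last LBA of the 64.4 GB virtio disk is 125829119, the usual sign of a disk image that was written and then attached to a larger block device; the disk-uuid step later in this log rewrites the headers ("Primary Header is updated ... Secondary Header is updated"). The arithmetic, using only numbers from the log:

    # From the log: "[vda] 125829120 512-byte logical blocks" and "GPT:17805311 != 125829119".
    SECTOR = 512
    disk_sectors = 125_829_120
    backup_hdr_lba = 17_805_311               # where the alternate GPT header currently sits
    expected_lba = disk_sectors - 1           # where it belongs: the last LBA of the disk

    print(expected_lba)                       # 125829119
    print(backup_hdr_lba * SECTOR / 2**30)    # ~8.49 (GiB covered by the original image's GPT)
    print(disk_sectors * SECTOR / 2**30)      # 60.0  (GiB of the actual virtual disk)
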
Sep 9 22:14:09.262195 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 9 22:14:09.262440 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 9 22:14:09.271111 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 9 22:14:09.272008 kernel: hub 2-0:1.0: USB hub found Sep 9 22:14:09.275039 kernel: scsi host0: ahci Sep 9 22:14:09.275803 kernel: scsi host1: ahci Sep 9 22:14:09.277802 kernel: scsi host2: ahci Sep 9 22:14:09.278004 kernel: scsi host3: ahci Sep 9 22:14:09.278204 kernel: scsi host4: ahci Sep 9 22:14:09.278799 kernel: scsi host5: ahci Sep 9 22:14:09.278994 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41 lpm-pol 1 Sep 9 22:14:09.279013 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41 lpm-pol 1 Sep 9 22:14:09.279037 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41 lpm-pol 1 Sep 9 22:14:09.279054 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41 lpm-pol 1 Sep 9 22:14:09.279070 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41 lpm-pol 1 Sep 9 22:14:09.279110 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41 lpm-pol 1 Sep 9 22:14:09.279803 kernel: hub 2-0:1.0: 4 ports detected Sep 9 22:14:09.326995 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 9 22:14:09.381123 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 22:14:09.402604 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 9 22:14:09.422967 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 22:14:09.433024 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 9 22:14:09.433823 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 9 22:14:09.436499 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 22:14:09.470307 disk-uuid[630]: Primary Header is updated. Sep 9 22:14:09.470307 disk-uuid[630]: Secondary Entries is updated. Sep 9 22:14:09.470307 disk-uuid[630]: Secondary Header is updated. 
Sep 9 22:14:09.474128 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 22:14:09.496432 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 9 22:14:09.593897 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 9 22:14:09.593963 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 9 22:14:09.593981 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 9 22:14:09.594006 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 9 22:14:09.594024 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 9 22:14:09.594930 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 9 22:14:09.642917 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 9 22:14:09.649599 kernel: usbcore: registered new interface driver usbhid Sep 9 22:14:09.649646 kernel: usbhid: USB HID core driver Sep 9 22:14:09.659485 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4 Sep 9 22:14:09.659516 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Sep 9 22:14:09.676451 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 22:14:09.677778 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 22:14:09.679048 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 22:14:09.680656 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 22:14:09.683326 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 22:14:09.709587 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 22:14:10.483428 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 22:14:10.483948 disk-uuid[631]: The operation has completed successfully. Sep 9 22:14:10.542657 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 22:14:10.542833 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 22:14:10.601170 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 22:14:10.628551 sh[657]: Success Sep 9 22:14:10.653622 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 22:14:10.653688 kernel: device-mapper: uevent: version 1.0.3 Sep 9 22:14:10.654541 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 22:14:10.666812 kernel: device-mapper: verity: sha256 using shash "sha256-avx" Sep 9 22:14:10.716052 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 22:14:10.719184 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 22:14:10.732608 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 22:14:10.744834 kernel: BTRFS: device fsid f72d0a81-8b28-47a3-b3ab-bf6ecd8938f0 devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (669) Sep 9 22:14:10.750202 kernel: BTRFS info (device dm-0): first mount of filesystem f72d0a81-8b28-47a3-b3ab-bf6ecd8938f0 Sep 9 22:14:10.750239 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 22:14:10.760073 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 22:14:10.760106 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 22:14:10.762400 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
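[Editor's note] verity-setup.service above builds the integrity-checked, read-only /usr device (/dev/mapper/usr) from the USR-A partition with dm-verity, using the root hash supplied on the kernel command line. A minimal sketch of the equivalent manual steps with veritysetup; the device paths and root hash are placeholders, not values taken from this log:

    # Open a verity mapping: <data-device> <name> <hash-device> <root-hash>
    veritysetup open <data-device> usr <hash-device> <root-hash>
    # Inspect the resulting device-mapper target
    veritysetup status usr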
Sep 9 22:14:10.763605 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 22:14:10.764693 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 22:14:10.765687 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 22:14:10.769267 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 22:14:10.795819 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (700) Sep 9 22:14:10.798798 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 22:14:10.801795 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 22:14:10.808054 kernel: BTRFS info (device vda6): turning on async discard Sep 9 22:14:10.808094 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 22:14:10.816836 kernel: BTRFS info (device vda6): last unmount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 22:14:10.817413 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 22:14:10.821973 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 22:14:10.909388 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 22:14:10.913217 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 22:14:10.969824 systemd-networkd[840]: lo: Link UP Sep 9 22:14:10.970904 systemd-networkd[840]: lo: Gained carrier Sep 9 22:14:10.972988 systemd-networkd[840]: Enumeration completed Sep 9 22:14:10.973539 systemd-networkd[840]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 22:14:10.973546 systemd-networkd[840]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 22:14:10.974525 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 22:14:10.975856 systemd[1]: Reached target network.target - Network. Sep 9 22:14:10.979946 systemd-networkd[840]: eth0: Link UP Sep 9 22:14:10.980907 systemd-networkd[840]: eth0: Gained carrier Sep 9 22:14:10.980923 systemd-networkd[840]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 22:14:11.002870 systemd-networkd[840]: eth0: DHCPv4 address 10.230.51.18/30, gateway 10.230.51.17 acquired from 10.230.51.17 Sep 9 22:14:11.035259 ignition[753]: Ignition 2.22.0 Sep 9 22:14:11.035986 ignition[753]: Stage: fetch-offline Sep 9 22:14:11.036089 ignition[753]: no configs at "/usr/lib/ignition/base.d" Sep 9 22:14:11.036107 ignition[753]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 22:14:11.038833 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 22:14:11.036260 ignition[753]: parsed url from cmdline: "" Sep 9 22:14:11.036266 ignition[753]: no config URL provided Sep 9 22:14:11.036279 ignition[753]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 22:14:11.042948 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
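[Editor's note] eth0 is matched by the stock zz-default.network policy and obtains 10.230.51.18/30 over DHCP, which is what lets the following Ignition fetch stage reach the metadata service. Illustrative commands (not taken from this log) for inspecting the same state on a running system:

    networkctl list              # links known to systemd-networkd
    networkctl status eth0       # carrier, DHCP lease, gateway, DNS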
Sep 9 22:14:11.036293 ignition[753]: no config at "/usr/lib/ignition/user.ign" Sep 9 22:14:11.036306 ignition[753]: failed to fetch config: resource requires networking Sep 9 22:14:11.036539 ignition[753]: Ignition finished successfully Sep 9 22:14:11.089457 ignition[850]: Ignition 2.22.0 Sep 9 22:14:11.089476 ignition[850]: Stage: fetch Sep 9 22:14:11.089707 ignition[850]: no configs at "/usr/lib/ignition/base.d" Sep 9 22:14:11.089723 ignition[850]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 22:14:11.090880 ignition[850]: parsed url from cmdline: "" Sep 9 22:14:11.090887 ignition[850]: no config URL provided Sep 9 22:14:11.090896 ignition[850]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 22:14:11.090910 ignition[850]: no config at "/usr/lib/ignition/user.ign" Sep 9 22:14:11.091096 ignition[850]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Sep 9 22:14:11.092864 ignition[850]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Sep 9 22:14:11.092909 ignition[850]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Sep 9 22:14:11.115608 ignition[850]: GET result: OK Sep 9 22:14:11.116585 ignition[850]: parsing config with SHA512: 1d4cc7ae884b57b64614c06011b170f031e320d97c4bc09402b01ddfc7e5a033937bd64957e585f54c1d0e957c4c6eba2db27349f968d067ff2d35237a4fc774 Sep 9 22:14:11.122534 unknown[850]: fetched base config from "system" Sep 9 22:14:11.122551 unknown[850]: fetched base config from "system" Sep 9 22:14:11.123144 ignition[850]: fetch: fetch complete Sep 9 22:14:11.122564 unknown[850]: fetched user config from "openstack" Sep 9 22:14:11.123152 ignition[850]: fetch: fetch passed Sep 9 22:14:11.126463 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 9 22:14:11.123218 ignition[850]: Ignition finished successfully Sep 9 22:14:11.130015 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 22:14:11.167598 ignition[856]: Ignition 2.22.0 Sep 9 22:14:11.168655 ignition[856]: Stage: kargs Sep 9 22:14:11.168910 ignition[856]: no configs at "/usr/lib/ignition/base.d" Sep 9 22:14:11.168928 ignition[856]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 22:14:11.172124 ignition[856]: kargs: kargs passed Sep 9 22:14:11.172196 ignition[856]: Ignition finished successfully Sep 9 22:14:11.173869 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 22:14:11.176525 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 22:14:11.215271 ignition[862]: Ignition 2.22.0 Sep 9 22:14:11.216375 ignition[862]: Stage: disks Sep 9 22:14:11.217229 ignition[862]: no configs at "/usr/lib/ignition/base.d" Sep 9 22:14:11.217248 ignition[862]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 22:14:11.218649 ignition[862]: disks: disks passed Sep 9 22:14:11.218725 ignition[862]: Ignition finished successfully Sep 9 22:14:11.221117 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 22:14:11.223240 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 22:14:11.224116 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 22:14:11.225644 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 22:14:11.227138 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 22:14:11.228467 systemd[1]: Reached target basic.target - Basic System. 
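[Editor's note] On the openstack platform the fetch stage first waits for a config drive labelled config-2 and, when none appears, pulls the user data from the metadata endpoint, as shown above. For debugging, the same sources can be probed by hand from inside the instance (a sketch; it assumes the DHCP-provided link-local route is in place):

    # Query the metadata service Ignition used
    curl -s http://169.254.169.254/openstack/latest/user_data
    # Check whether a config drive is attached instead
    blkid -L config-2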
Sep 9 22:14:11.231159 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 9 22:14:11.259873 systemd-fsck[871]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 9 22:14:11.264293 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 22:14:11.267864 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 22:14:11.397836 kernel: EXT4-fs (vda9): mounted filesystem b54acc07-9600-49db-baed-d5fd6f41a1a5 r/w with ordered data mode. Quota mode: none. Sep 9 22:14:11.398473 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 22:14:11.399715 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 22:14:11.402765 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 22:14:11.405158 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 22:14:11.407983 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 22:14:11.417077 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Sep 9 22:14:11.419895 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 22:14:11.419941 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 22:14:11.425189 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 22:14:11.427851 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 9 22:14:11.440080 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (879) Sep 9 22:14:11.440147 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 22:14:11.447822 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 22:14:11.451926 kernel: BTRFS info (device vda6): turning on async discard Sep 9 22:14:11.451960 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 22:14:11.455498 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 22:14:11.507844 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:11.529863 initrd-setup-root[907]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 22:14:11.537061 initrd-setup-root[914]: cut: /sysroot/etc/group: No such file or directory Sep 9 22:14:11.543149 initrd-setup-root[921]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 22:14:11.550114 initrd-setup-root[928]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 22:14:11.659556 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 22:14:11.662671 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 22:14:11.664380 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 22:14:11.683860 kernel: BTRFS info (device vda6): last unmount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 22:14:11.702247 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 9 22:14:11.725117 ignition[997]: INFO : Ignition 2.22.0 Sep 9 22:14:11.727423 ignition[997]: INFO : Stage: mount Sep 9 22:14:11.727423 ignition[997]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 22:14:11.727423 ignition[997]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 22:14:11.727423 ignition[997]: INFO : mount: mount passed Sep 9 22:14:11.727423 ignition[997]: INFO : Ignition finished successfully Sep 9 22:14:11.728713 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 22:14:11.744130 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 22:14:12.537832 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:12.821214 systemd-networkd[840]: eth0: Gained IPv6LL Sep 9 22:14:14.330255 systemd-networkd[840]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8cc4:24:19ff:fee6:3312/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8cc4:24:19ff:fee6:3312/64 assigned by NDisc. Sep 9 22:14:14.330268 systemd-networkd[840]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Sep 9 22:14:14.550930 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:18.557823 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:18.564756 coreos-metadata[881]: Sep 09 22:14:18.564 WARN failed to locate config-drive, using the metadata service API instead Sep 9 22:14:18.588259 coreos-metadata[881]: Sep 09 22:14:18.588 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 9 22:14:18.605181 coreos-metadata[881]: Sep 09 22:14:18.605 INFO Fetch successful Sep 9 22:14:18.606126 coreos-metadata[881]: Sep 09 22:14:18.605 INFO wrote hostname srv-rokxy.gb1.brightbox.com to /sysroot/etc/hostname Sep 9 22:14:18.608492 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Sep 9 22:14:18.610371 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Sep 9 22:14:18.614616 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 22:14:18.639322 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 22:14:18.679750 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1012) Sep 9 22:14:18.679881 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7 Sep 9 22:14:18.682073 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 22:14:18.687647 kernel: BTRFS info (device vda6): turning on async discard Sep 9 22:14:18.687704 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 22:14:18.690526 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 22:14:18.735090 ignition[1030]: INFO : Ignition 2.22.0 Sep 9 22:14:18.735090 ignition[1030]: INFO : Stage: files Sep 9 22:14:18.736976 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 22:14:18.736976 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 22:14:18.736976 ignition[1030]: DEBUG : files: compiled without relabeling support, skipping Sep 9 22:14:18.739654 ignition[1030]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 22:14:18.739654 ignition[1030]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 22:14:18.747238 ignition[1030]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 22:14:18.747238 ignition[1030]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 22:14:18.747238 ignition[1030]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 22:14:18.747238 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 9 22:14:18.747238 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 9 22:14:18.741524 unknown[1030]: wrote ssh authorized keys file for user: core Sep 9 22:14:18.975816 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 22:14:20.193932 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 9 22:14:20.199138 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 22:14:20.199138 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 9 22:14:20.199138 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 22:14:20.199138 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 22:14:20.199138 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 22:14:20.205756 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 22:14:20.205756 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 22:14:20.205756 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 22:14:20.205756 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 22:14:20.205756 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 22:14:20.205756 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 9 22:14:20.214147 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 9 22:14:20.214147 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 9 22:14:20.214147 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 9 22:14:20.563754 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 22:14:21.766666 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 9 22:14:21.766666 ignition[1030]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 22:14:21.770700 ignition[1030]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 22:14:21.774127 ignition[1030]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 22:14:21.774127 ignition[1030]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 22:14:21.774127 ignition[1030]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 9 22:14:21.774127 ignition[1030]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 22:14:21.774127 ignition[1030]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 9 22:14:21.774127 ignition[1030]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 22:14:21.774127 ignition[1030]: INFO : files: files passed Sep 9 22:14:21.774127 ignition[1030]: INFO : Ignition finished successfully Sep 9 22:14:21.777642 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 22:14:21.782985 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 22:14:21.786930 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 9 22:14:21.802734 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 22:14:21.802941 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 22:14:21.811724 initrd-setup-root-after-ignition[1059]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 22:14:21.813850 initrd-setup-root-after-ignition[1059]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 22:14:21.815208 initrd-setup-root-after-ignition[1063]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 22:14:21.816556 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 22:14:21.818004 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 22:14:21.820974 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 22:14:21.875556 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 22:14:21.875735 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 22:14:21.877437 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Sep 9 22:14:21.878697 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 22:14:21.880285 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 9 22:14:21.882936 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 22:14:21.925493 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 22:14:21.928281 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 9 22:14:21.954217 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 22:14:21.956063 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 22:14:21.957003 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 22:14:21.958576 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 22:14:21.958819 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 22:14:21.961037 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 22:14:21.961900 systemd[1]: Stopped target basic.target - Basic System. Sep 9 22:14:21.963693 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 22:14:21.965114 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 22:14:21.966558 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 22:14:21.968968 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 9 22:14:21.970464 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 22:14:21.972110 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 22:14:21.973674 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 22:14:21.975108 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 9 22:14:21.976720 systemd[1]: Stopped target swap.target - Swaps. Sep 9 22:14:21.977993 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 22:14:21.978194 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 9 22:14:21.979959 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 9 22:14:21.980980 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 22:14:21.982387 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 9 22:14:21.982564 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 22:14:21.984023 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 9 22:14:21.984263 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 9 22:14:21.985914 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 9 22:14:21.986191 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 22:14:21.987948 systemd[1]: ignition-files.service: Deactivated successfully. Sep 9 22:14:21.988093 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 9 22:14:21.996904 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 9 22:14:21.998140 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 9 22:14:21.999967 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 22:14:22.003987 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... 
Sep 9 22:14:22.005381 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 9 22:14:22.006414 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 22:14:22.008227 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 9 22:14:22.008418 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 22:14:22.018568 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 9 22:14:22.019502 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 9 22:14:22.037037 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 9 22:14:22.040531 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 9 22:14:22.040695 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 9 22:14:22.053496 ignition[1083]: INFO : Ignition 2.22.0 Sep 9 22:14:22.056037 ignition[1083]: INFO : Stage: umount Sep 9 22:14:22.056037 ignition[1083]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 22:14:22.056037 ignition[1083]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 22:14:22.058488 ignition[1083]: INFO : umount: umount passed Sep 9 22:14:22.058488 ignition[1083]: INFO : Ignition finished successfully Sep 9 22:14:22.058681 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 9 22:14:22.058870 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 9 22:14:22.060356 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 9 22:14:22.060519 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 9 22:14:22.061456 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 9 22:14:22.061533 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 9 22:14:22.063018 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 9 22:14:22.063088 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 9 22:14:22.064273 systemd[1]: Stopped target network.target - Network. Sep 9 22:14:22.065542 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 9 22:14:22.065618 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 22:14:22.066976 systemd[1]: Stopped target paths.target - Path Units. Sep 9 22:14:22.068240 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 9 22:14:22.071854 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 22:14:22.072857 systemd[1]: Stopped target slices.target - Slice Units. Sep 9 22:14:22.074358 systemd[1]: Stopped target sockets.target - Socket Units. Sep 9 22:14:22.075712 systemd[1]: iscsid.socket: Deactivated successfully. Sep 9 22:14:22.075856 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 22:14:22.078396 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 9 22:14:22.078467 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 22:14:22.079671 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 9 22:14:22.079745 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 9 22:14:22.081040 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 9 22:14:22.081106 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 9 22:14:22.082324 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 9 22:14:22.082390 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. 
Sep 9 22:14:22.083943 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 9 22:14:22.085712 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 9 22:14:22.087244 systemd-networkd[840]: eth0: DHCPv6 lease lost Sep 9 22:14:22.092508 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 9 22:14:22.092685 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 9 22:14:22.098608 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 9 22:14:22.099031 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 9 22:14:22.099215 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 9 22:14:22.101933 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 9 22:14:22.103417 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 9 22:14:22.104280 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 9 22:14:22.104360 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 9 22:14:22.107877 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 9 22:14:22.108591 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 9 22:14:22.108671 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 22:14:22.111163 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 9 22:14:22.111238 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 9 22:14:22.113234 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 9 22:14:22.113297 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 9 22:14:22.114104 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 9 22:14:22.114166 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 22:14:22.115628 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 22:14:22.118548 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 9 22:14:22.118633 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 9 22:14:22.125617 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 9 22:14:22.127970 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 22:14:22.129055 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 9 22:14:22.129117 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 9 22:14:22.130400 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 9 22:14:22.130448 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 22:14:22.132678 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 9 22:14:22.132746 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 9 22:14:22.136237 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 9 22:14:22.136306 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 9 22:14:22.137992 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 9 22:14:22.138062 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 22:14:22.140400 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Sep 9 22:14:22.142930 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 9 22:14:22.143021 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 22:14:22.145572 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 9 22:14:22.145637 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 22:14:22.148412 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 22:14:22.148491 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 22:14:22.152623 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 9 22:14:22.152709 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 9 22:14:22.152825 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 22:14:22.153366 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 9 22:14:22.154248 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 9 22:14:22.163539 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 9 22:14:22.163717 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 9 22:14:22.166092 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 9 22:14:22.168336 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 9 22:14:22.193075 systemd[1]: Switching root. Sep 9 22:14:22.244038 systemd-journald[230]: Journal stopped Sep 9 22:14:23.727901 systemd-journald[230]: Received SIGTERM from PID 1 (systemd). Sep 9 22:14:23.728032 kernel: SELinux: policy capability network_peer_controls=1 Sep 9 22:14:23.728069 kernel: SELinux: policy capability open_perms=1 Sep 9 22:14:23.728103 kernel: SELinux: policy capability extended_socket_class=1 Sep 9 22:14:23.728127 kernel: SELinux: policy capability always_check_network=0 Sep 9 22:14:23.728160 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 9 22:14:23.728179 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 9 22:14:23.728196 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 9 22:14:23.728219 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 9 22:14:23.728260 kernel: SELinux: policy capability userspace_initial_context=0 Sep 9 22:14:23.728294 kernel: audit: type=1403 audit(1757456062.506:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 9 22:14:23.728321 systemd[1]: Successfully loaded SELinux policy in 75.021ms. Sep 9 22:14:23.728358 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.851ms. Sep 9 22:14:23.728379 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 22:14:23.728399 systemd[1]: Detected virtualization kvm. Sep 9 22:14:23.728424 systemd[1]: Detected architecture x86-64. Sep 9 22:14:23.728443 systemd[1]: Detected first boot. Sep 9 22:14:23.728462 systemd[1]: Hostname set to . Sep 9 22:14:23.728496 systemd[1]: Initializing machine ID from VM UUID. Sep 9 22:14:23.728517 zram_generator::config[1128]: No configuration found. 
Sep 9 22:14:23.728551 kernel: Guest personality initialized and is inactive Sep 9 22:14:23.728615 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 9 22:14:23.728640 kernel: Initialized host personality Sep 9 22:14:23.728666 kernel: NET: Registered PF_VSOCK protocol family Sep 9 22:14:23.728693 systemd[1]: Populated /etc with preset unit settings. Sep 9 22:14:23.728723 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 9 22:14:23.728754 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 9 22:14:23.728807 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 9 22:14:23.728844 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 9 22:14:23.728865 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 9 22:14:23.728885 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 9 22:14:23.728904 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 9 22:14:23.728922 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 9 22:14:23.728948 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 9 22:14:23.728969 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 9 22:14:23.729000 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 9 22:14:23.729022 systemd[1]: Created slice user.slice - User and Session Slice. Sep 9 22:14:23.729059 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 22:14:23.729103 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 22:14:23.729124 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 9 22:14:23.729153 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 9 22:14:23.729186 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 9 22:14:23.729212 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 22:14:23.729239 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 9 22:14:23.729259 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 22:14:23.729277 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 22:14:23.729303 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 9 22:14:23.729340 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 9 22:14:23.729360 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 9 22:14:23.729380 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 9 22:14:23.729400 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 22:14:23.729425 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 22:14:23.729451 systemd[1]: Reached target slices.target - Slice Units. Sep 9 22:14:23.729470 systemd[1]: Reached target swap.target - Swaps. Sep 9 22:14:23.729495 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 9 22:14:23.729515 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. 
Sep 9 22:14:23.729545 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 9 22:14:23.729566 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 22:14:23.729590 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 22:14:23.729610 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 22:14:23.729630 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 9 22:14:23.729648 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 9 22:14:23.729667 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 9 22:14:23.729686 systemd[1]: Mounting media.mount - External Media Directory... Sep 9 22:14:23.729705 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 22:14:23.729748 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 9 22:14:23.729770 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 9 22:14:23.730169 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 9 22:14:23.730197 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 9 22:14:23.730227 systemd[1]: Reached target machines.target - Containers. Sep 9 22:14:23.730248 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 9 22:14:23.730267 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 22:14:23.730285 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 22:14:23.730310 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 9 22:14:23.730344 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 22:14:23.730372 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 22:14:23.730399 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 22:14:23.730418 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 9 22:14:23.730437 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 22:14:23.730464 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 9 22:14:23.730483 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 9 22:14:23.730501 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 9 22:14:23.730532 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 9 22:14:23.730561 systemd[1]: Stopped systemd-fsck-usr.service. Sep 9 22:14:23.730629 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 22:14:23.730653 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 22:14:23.730672 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 22:14:23.730691 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Sep 9 22:14:23.730728 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 9 22:14:23.730750 kernel: fuse: init (API version 7.41) Sep 9 22:14:23.730829 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 9 22:14:23.730862 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 22:14:23.730898 systemd[1]: verity-setup.service: Deactivated successfully. Sep 9 22:14:23.730925 systemd[1]: Stopped verity-setup.service. Sep 9 22:14:23.730946 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 22:14:23.730966 kernel: ACPI: bus type drm_connector registered Sep 9 22:14:23.730996 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 9 22:14:23.731017 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 9 22:14:23.731069 systemd[1]: Mounted media.mount - External Media Directory. Sep 9 22:14:23.731093 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 9 22:14:23.731128 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 9 22:14:23.731154 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 9 22:14:23.731174 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 22:14:23.731192 kernel: loop: module loaded Sep 9 22:14:23.731217 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 9 22:14:23.731236 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 9 22:14:23.731256 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 22:14:23.731274 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 22:14:23.731293 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 22:14:23.731324 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 22:14:23.731369 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 22:14:23.731397 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 22:14:23.731424 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 9 22:14:23.731445 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 9 22:14:23.731464 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 22:14:23.731482 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 22:14:23.731502 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 22:14:23.731557 systemd-journald[1218]: Collecting audit messages is disabled. Sep 9 22:14:23.731622 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 22:14:23.731657 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 9 22:14:23.731684 systemd-journald[1218]: Journal started Sep 9 22:14:23.731728 systemd-journald[1218]: Runtime Journal (/run/log/journal/015ebde908bb4833ace0e87e54de2c92) is 4.7M, max 38.2M, 33.4M free. Sep 9 22:14:23.296405 systemd[1]: Queued start job for default target multi-user.target. Sep 9 22:14:23.310078 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 9 22:14:23.310849 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 9 22:14:23.736812 systemd[1]: Started systemd-journald.service - Journal Service. 
Sep 9 22:14:23.736629 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 9 22:14:23.737741 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 9 22:14:23.752883 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 22:14:23.756885 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 9 22:14:23.758952 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 9 22:14:23.760860 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 9 22:14:23.760907 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 22:14:23.764797 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 9 22:14:23.769003 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 9 22:14:23.771835 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 22:14:23.779955 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 9 22:14:23.785303 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 9 22:14:23.786145 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 22:14:23.793660 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 9 22:14:23.794527 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 22:14:23.798155 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 22:14:23.808391 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 9 22:14:23.812184 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 9 22:14:23.818015 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 9 22:14:23.818929 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 9 22:14:23.827449 systemd-journald[1218]: Time spent on flushing to /var/log/journal/015ebde908bb4833ace0e87e54de2c92 is 19.759ms for 1163 entries. Sep 9 22:14:23.827449 systemd-journald[1218]: System Journal (/var/log/journal/015ebde908bb4833ace0e87e54de2c92) is 8M, max 584.8M, 576.8M free. Sep 9 22:14:23.853840 systemd-journald[1218]: Received client request to flush runtime journal. Sep 9 22:14:23.858834 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 9 22:14:23.864900 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 9 22:14:23.866664 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 9 22:14:23.873267 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 9 22:14:23.886854 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 22:14:23.912812 kernel: loop0: detected capacity change from 0 to 128016 Sep 9 22:14:23.938643 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. 
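[Editor's note] Once systemd-journal-flush.service completes, the runtime journal from /run is carried over into /var/log/journal, so the early-boot sequence reproduced here remains queryable after the fact. Illustrative queries:

    journalctl -b -u ignition-files.service      # one unit from this boot
    journalctl -b -k | grep -iE 'gpt|verity'     # kernel messages of interest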
Sep 9 22:14:23.951337 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 9 22:14:23.977804 kernel: loop1: detected capacity change from 0 to 224512 Sep 9 22:14:23.998097 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 22:14:24.007268 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 9 22:14:24.017173 kernel: loop2: detected capacity change from 0 to 8 Sep 9 22:14:24.014320 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 22:14:24.044877 kernel: loop3: detected capacity change from 0 to 110984 Sep 9 22:14:24.076124 systemd-tmpfiles[1285]: ACLs are not supported, ignoring. Sep 9 22:14:24.084150 kernel: loop4: detected capacity change from 0 to 128016 Sep 9 22:14:24.077308 systemd-tmpfiles[1285]: ACLs are not supported, ignoring. Sep 9 22:14:24.106442 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 22:14:24.115966 kernel: loop5: detected capacity change from 0 to 224512 Sep 9 22:14:24.164810 kernel: loop6: detected capacity change from 0 to 8 Sep 9 22:14:24.170806 kernel: loop7: detected capacity change from 0 to 110984 Sep 9 22:14:24.209120 (sd-merge)[1289]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Sep 9 22:14:24.212668 (sd-merge)[1289]: Merged extensions into '/usr'. Sep 9 22:14:24.221063 systemd[1]: Reload requested from client PID 1266 ('systemd-sysext') (unit systemd-sysext.service)... Sep 9 22:14:24.221100 systemd[1]: Reloading... Sep 9 22:14:24.341899 zram_generator::config[1313]: No configuration found. Sep 9 22:14:24.627293 ldconfig[1261]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 22:14:24.764558 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 9 22:14:24.765734 systemd[1]: Reloading finished in 542 ms. Sep 9 22:14:24.786992 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 22:14:24.788258 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 9 22:14:24.789500 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 9 22:14:24.803754 systemd[1]: Starting ensure-sysext.service... Sep 9 22:14:24.808923 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 22:14:24.815188 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 22:14:24.830312 systemd[1]: Reload requested from client PID 1374 ('systemctl') (unit ensure-sysext.service)... Sep 9 22:14:24.830334 systemd[1]: Reloading... Sep 9 22:14:24.852201 systemd-tmpfiles[1375]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 9 22:14:24.852265 systemd-tmpfiles[1375]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 9 22:14:24.852678 systemd-tmpfiles[1375]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 9 22:14:24.855730 systemd-tmpfiles[1375]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 9 22:14:24.859682 systemd-tmpfiles[1375]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 9 22:14:24.860093 systemd-tmpfiles[1375]: ACLs are not supported, ignoring. 
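[Editor's note] The (sd-merge) lines above record systemd-sysext overlaying the extension images (containerd-flatcar, docker-flatcar, kubernetes, oem-openstack) onto /usr, followed by the service reload. The merged state can be reviewed at runtime with (illustrative):

    systemd-sysext status    # which hierarchies are overlaid, and by what
    systemd-sysext list      # extension images found on disk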
Sep 9 22:14:24.860193 systemd-tmpfiles[1375]: ACLs are not supported, ignoring. Sep 9 22:14:24.868839 systemd-tmpfiles[1375]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 22:14:24.868855 systemd-tmpfiles[1375]: Skipping /boot Sep 9 22:14:24.883391 systemd-udevd[1376]: Using default interface naming scheme 'v255'. Sep 9 22:14:24.894306 systemd-tmpfiles[1375]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 22:14:24.894323 systemd-tmpfiles[1375]: Skipping /boot Sep 9 22:14:24.939903 zram_generator::config[1402]: No configuration found. Sep 9 22:14:25.256812 kernel: mousedev: PS/2 mouse device common for all mice Sep 9 22:14:25.328814 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Sep 9 22:14:25.346799 kernel: ACPI: button: Power Button [PWRF] Sep 9 22:14:25.415393 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 9 22:14:25.415590 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 22:14:25.418932 systemd[1]: Reloading finished in 588 ms. Sep 9 22:14:25.434114 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 22:14:25.436827 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 9 22:14:25.437459 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 22:14:25.449849 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 9 22:14:25.527952 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 22:14:25.533070 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 22:14:25.547980 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 9 22:14:25.550070 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 22:14:25.555545 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 22:14:25.574160 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 22:14:25.577797 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 22:14:25.582099 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 22:14:25.582955 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 22:14:25.587145 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 22:14:25.589935 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 22:14:25.592735 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 9 22:14:25.598148 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 22:14:25.608129 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 22:14:25.619511 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 9 22:14:25.620329 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Sep 9 22:14:25.626824 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 22:14:25.627865 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 22:14:25.635006 systemd[1]: Finished ensure-sysext.service. Sep 9 22:14:25.640883 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 22:14:25.655053 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 9 22:14:25.656746 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 22:14:25.676102 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 9 22:14:25.678375 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 22:14:25.679888 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 22:14:25.681552 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 22:14:25.683005 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 22:14:25.687199 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 22:14:25.710295 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 9 22:14:25.713348 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 22:14:25.714141 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 22:14:25.720677 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 22:14:25.726597 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 22:14:25.728734 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 22:14:25.730587 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 22:14:25.783169 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 22:14:25.794077 augenrules[1547]: No rules Sep 9 22:14:25.795294 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 22:14:25.796189 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 22:14:25.810011 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 22:14:25.945299 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 9 22:14:26.050732 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 22:14:26.073155 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 9 22:14:26.075051 systemd[1]: Reached target time-set.target - System Time Set. Sep 9 22:14:26.100763 systemd-resolved[1516]: Positive Trust Anchors: Sep 9 22:14:26.100863 systemd-resolved[1516]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 22:14:26.100906 systemd-resolved[1516]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 22:14:26.103341 systemd-networkd[1513]: lo: Link UP Sep 9 22:14:26.103713 systemd-networkd[1513]: lo: Gained carrier Sep 9 22:14:26.106165 systemd-networkd[1513]: Enumeration completed Sep 9 22:14:26.106726 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 22:14:26.107076 systemd-networkd[1513]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 22:14:26.107286 systemd-networkd[1513]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 22:14:26.108741 systemd-networkd[1513]: eth0: Link UP Sep 9 22:14:26.109186 systemd-networkd[1513]: eth0: Gained carrier Sep 9 22:14:26.109297 systemd-networkd[1513]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 22:14:26.110142 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 22:14:26.111448 systemd-resolved[1516]: Using system hostname 'srv-rokxy.gb1.brightbox.com'. Sep 9 22:14:26.114423 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 22:14:26.115340 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 22:14:26.116148 systemd[1]: Reached target network.target - Network. Sep 9 22:14:26.116749 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 22:14:26.117462 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 22:14:26.120264 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 9 22:14:26.121051 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 22:14:26.121867 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 9 22:14:26.122796 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 22:14:26.123573 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 22:14:26.124349 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 22:14:26.124859 systemd-networkd[1513]: eth0: DHCPv4 address 10.230.51.18/30, gateway 10.230.51.17 acquired from 10.230.51.17 Sep 9 22:14:26.125109 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 22:14:26.125147 systemd[1]: Reached target paths.target - Path Units. Sep 9 22:14:26.125743 systemd[1]: Reached target timers.target - Timer Units. Sep 9 22:14:26.127798 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. 
Sep 9 22:14:26.127898 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. Sep 9 22:14:26.130336 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 22:14:26.134818 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 22:14:26.135874 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 22:14:26.136735 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 22:14:26.145566 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 22:14:26.146889 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 22:14:26.148764 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 22:14:26.151086 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 22:14:26.151893 systemd[1]: Reached target basic.target - Basic System. Sep 9 22:14:26.152698 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 22:14:26.152888 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 22:14:26.158881 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 22:14:26.163042 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 9 22:14:26.173150 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 22:14:26.176926 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 22:14:26.179811 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:26.181909 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 22:14:26.186968 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 22:14:26.188292 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 22:14:26.191072 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 9 22:14:26.201119 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 22:14:26.206324 jq[1577]: false Sep 9 22:14:26.207902 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 22:14:26.215087 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 22:14:26.219048 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 22:14:26.226046 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 22:14:26.229058 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 22:14:26.229619 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 22:14:26.229775 extend-filesystems[1579]: Found /dev/vda6 Sep 9 22:14:26.233357 oslogin_cache_refresh[1580]: Refreshing passwd entry cache Sep 9 22:14:26.234276 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Refreshing passwd entry cache Sep 9 22:14:26.235065 systemd[1]: Starting update-engine.service - Update Engine... 
Sep 9 22:14:26.241416 extend-filesystems[1579]: Found /dev/vda9 Sep 9 22:14:26.242663 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 22:14:26.246335 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 22:14:26.251885 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 22:14:26.255290 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 22:14:26.255713 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 22:14:26.256123 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 22:14:26.256440 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 22:14:26.265801 extend-filesystems[1579]: Checking size of /dev/vda9 Sep 9 22:14:26.273603 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 22:14:26.275109 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 22:14:26.303896 jq[1592]: true Sep 9 22:14:26.307424 extend-filesystems[1579]: Resized partition /dev/vda9 Sep 9 22:14:26.315806 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Failure getting users, quitting Sep 9 22:14:26.315806 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 22:14:26.315806 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Refreshing group entry cache Sep 9 22:14:26.314859 oslogin_cache_refresh[1580]: Failure getting users, quitting Sep 9 22:14:26.314892 oslogin_cache_refresh[1580]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 22:14:26.314977 oslogin_cache_refresh[1580]: Refreshing group entry cache Sep 9 22:14:26.320378 extend-filesystems[1618]: resize2fs 1.47.3 (8-Jul-2025) Sep 9 22:14:26.318969 oslogin_cache_refresh[1580]: Failure getting groups, quitting Sep 9 22:14:26.324978 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Failure getting groups, quitting Sep 9 22:14:26.324978 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 22:14:26.318983 oslogin_cache_refresh[1580]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 22:14:26.329445 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 9 22:14:26.329986 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 9 22:14:26.339224 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Sep 9 22:14:26.335313 (ntainerd)[1617]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 22:14:26.339714 update_engine[1590]: I20250909 22:14:26.336304 1590 main.cc:92] Flatcar Update Engine starting Sep 9 22:14:26.350114 tar[1600]: linux-amd64/LICENSE Sep 9 22:14:26.350114 tar[1600]: linux-amd64/helm Sep 9 22:14:26.377724 dbus-daemon[1574]: [system] SELinux support is enabled Sep 9 22:14:26.377977 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 22:14:26.384514 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
Sep 9 22:14:26.384544 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 22:14:26.386109 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 22:14:26.386143 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 22:14:26.399102 jq[1619]: true Sep 9 22:14:26.423028 dbus-daemon[1574]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1513 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 9 22:14:26.429043 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 9 22:14:26.444637 systemd[1]: Started update-engine.service - Update Engine. Sep 9 22:14:26.455155 update_engine[1590]: I20250909 22:14:26.448278 1590 update_check_scheduler.cc:74] Next update check in 9m39s Sep 9 22:14:26.464696 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 22:14:26.542933 systemd-logind[1588]: Watching system buttons on /dev/input/event3 (Power Button) Sep 9 22:14:26.542972 systemd-logind[1588]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 22:14:26.543837 systemd-logind[1588]: New seat seat0. Sep 9 22:14:26.545046 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 22:14:26.643879 bash[1642]: Updated "/home/core/.ssh/authorized_keys" Sep 9 22:14:26.658185 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 22:14:26.662032 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 22:14:26.667849 systemd[1]: Starting sshkeys.service... Sep 9 22:14:26.676864 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Sep 9 22:14:26.701083 extend-filesystems[1618]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 9 22:14:26.701083 extend-filesystems[1618]: old_desc_blocks = 1, new_desc_blocks = 8 Sep 9 22:14:26.701083 extend-filesystems[1618]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Sep 9 22:14:26.698505 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 9 22:14:26.713371 dbus-daemon[1574]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 9 22:14:26.728181 extend-filesystems[1579]: Resized filesystem in /dev/vda9 Sep 9 22:14:26.702316 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 22:14:26.721910 dbus-daemon[1574]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1626 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 9 22:14:26.702680 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 22:14:26.727860 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 9 22:14:26.735252 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 9 22:14:26.742192 systemd[1]: Starting polkit.service - Authorization Manager... 
Sep 9 22:14:26.779805 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:26.860145 locksmithd[1627]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 22:14:26.894542 containerd[1617]: time="2025-09-09T22:14:26Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 22:14:26.898085 containerd[1617]: time="2025-09-09T22:14:26.898022049Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.927582983Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="15.407µs" Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.927643116Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.927672489Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.927977267Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.928001056Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.928058216Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.928159782Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.928178957Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.928464661Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.928492470Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.928508387Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 22:14:26.930796 containerd[1617]: time="2025-09-09T22:14:26.928533850Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 22:14:26.931288 containerd[1617]: time="2025-09-09T22:14:26.928716937Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 22:14:26.934892 containerd[1617]: time="2025-09-09T22:14:26.934854239Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 22:14:26.934963 containerd[1617]: 
time="2025-09-09T22:14:26.934923131Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 22:14:26.934963 containerd[1617]: time="2025-09-09T22:14:26.934951398Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 22:14:26.935053 containerd[1617]: time="2025-09-09T22:14:26.935022154Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 22:14:26.935533 containerd[1617]: time="2025-09-09T22:14:26.935495643Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 22:14:26.935656 containerd[1617]: time="2025-09-09T22:14:26.935608653Z" level=info msg="metadata content store policy set" policy=shared Sep 9 22:14:26.941461 sshd_keygen[1620]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941528121Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941616258Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941708209Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941733881Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941753467Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941771422Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941812436Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941832077Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941848387Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941865325Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941879996Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 22:14:26.941908 containerd[1617]: time="2025-09-09T22:14:26.941906506Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942061729Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942099691Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 
22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942137662Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942157730Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942173770Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942189559Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942205447Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942222544Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942239784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942256483Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 22:14:26.942277 containerd[1617]: time="2025-09-09T22:14:26.942271701Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 22:14:26.942631 containerd[1617]: time="2025-09-09T22:14:26.942399418Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 22:14:26.942631 containerd[1617]: time="2025-09-09T22:14:26.942422547Z" level=info msg="Start snapshots syncer" Sep 9 22:14:26.944652 containerd[1617]: time="2025-09-09T22:14:26.943948382Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 22:14:26.944652 containerd[1617]: time="2025-09-09T22:14:26.944264628Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 22:14:26.944946 containerd[1617]: time="2025-09-09T22:14:26.944333168Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 22:14:26.946383 containerd[1617]: time="2025-09-09T22:14:26.946348479Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 22:14:26.946534 containerd[1617]: time="2025-09-09T22:14:26.946507719Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 22:14:26.946580 containerd[1617]: time="2025-09-09T22:14:26.946547344Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 22:14:26.946580 containerd[1617]: time="2025-09-09T22:14:26.946567218Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 22:14:26.946672 containerd[1617]: time="2025-09-09T22:14:26.946584182Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 22:14:26.946672 containerd[1617]: time="2025-09-09T22:14:26.946603534Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 22:14:26.946672 containerd[1617]: time="2025-09-09T22:14:26.946619950Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 22:14:26.946672 containerd[1617]: time="2025-09-09T22:14:26.946646798Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 22:14:26.947861 containerd[1617]: time="2025-09-09T22:14:26.946679369Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 22:14:26.947861 containerd[1617]: 
time="2025-09-09T22:14:26.946698159Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 22:14:26.947861 containerd[1617]: time="2025-09-09T22:14:26.946714411Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 22:14:26.947861 containerd[1617]: time="2025-09-09T22:14:26.946762881Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 22:14:26.951077 containerd[1617]: time="2025-09-09T22:14:26.950960384Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 22:14:26.951202 containerd[1617]: time="2025-09-09T22:14:26.951076832Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 22:14:26.951202 containerd[1617]: time="2025-09-09T22:14:26.951099425Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 22:14:26.951202 containerd[1617]: time="2025-09-09T22:14:26.951113817Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 22:14:26.951202 containerd[1617]: time="2025-09-09T22:14:26.951138286Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 22:14:26.951202 containerd[1617]: time="2025-09-09T22:14:26.951155867Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 22:14:26.951406 containerd[1617]: time="2025-09-09T22:14:26.951206230Z" level=info msg="runtime interface created" Sep 9 22:14:26.951406 containerd[1617]: time="2025-09-09T22:14:26.951217708Z" level=info msg="created NRI interface" Sep 9 22:14:26.951406 containerd[1617]: time="2025-09-09T22:14:26.951230102Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 22:14:26.951406 containerd[1617]: time="2025-09-09T22:14:26.951248566Z" level=info msg="Connect containerd service" Sep 9 22:14:26.951406 containerd[1617]: time="2025-09-09T22:14:26.951291097Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 22:14:26.956366 containerd[1617]: time="2025-09-09T22:14:26.956323062Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 22:14:26.982168 polkitd[1656]: Started polkitd version 126 Sep 9 22:14:27.004144 polkitd[1656]: Loading rules from directory /etc/polkit-1/rules.d Sep 9 22:14:27.004560 polkitd[1656]: Loading rules from directory /run/polkit-1/rules.d Sep 9 22:14:27.004630 polkitd[1656]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 22:14:27.004993 polkitd[1656]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 9 22:14:27.005039 polkitd[1656]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 22:14:27.005093 polkitd[1656]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 9 22:14:27.005838 polkitd[1656]: Finished loading, 
compiling and executing 2 rules Sep 9 22:14:27.008715 systemd[1]: Started polkit.service - Authorization Manager. Sep 9 22:14:27.009445 dbus-daemon[1574]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 9 22:14:27.011150 polkitd[1656]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 9 22:14:27.023903 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 22:14:27.043528 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 22:14:27.049141 systemd[1]: Started sshd@0-10.230.51.18:22-139.178.68.195:48902.service - OpenSSH per-connection server daemon (139.178.68.195:48902). Sep 9 22:14:27.055202 systemd-hostnamed[1626]: Hostname set to (static) Sep 9 22:14:27.072700 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 22:14:27.074879 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 22:14:27.080911 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 22:14:27.138152 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 22:14:27.146037 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 22:14:27.155812 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 9 22:14:27.157985 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179493208Z" level=info msg="Start subscribing containerd event" Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179559405Z" level=info msg="Start recovering state" Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179748435Z" level=info msg="Start event monitor" Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179774519Z" level=info msg="Start cni network conf syncer for default" Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179813211Z" level=info msg="Start streaming server" Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179836473Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179849488Z" level=info msg="runtime interface starting up..." Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179861725Z" level=info msg="starting plugins..." Sep 9 22:14:27.180246 containerd[1617]: time="2025-09-09T22:14:27.179895154Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 22:14:27.181197 containerd[1617]: time="2025-09-09T22:14:27.180978602Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 22:14:27.181197 containerd[1617]: time="2025-09-09T22:14:27.181075664Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 22:14:27.181602 containerd[1617]: time="2025-09-09T22:14:27.181519154Z" level=info msg="containerd successfully booted in 0.289317s" Sep 9 22:14:27.181655 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 22:14:27.207805 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:27.332615 tar[1600]: linux-amd64/README.md Sep 9 22:14:27.349984 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 22:14:27.824877 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:28.053135 systemd-networkd[1513]: eth0: Gained IPv6LL Sep 9 22:14:28.054693 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. 
Sep 9 22:14:28.057296 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 22:14:28.060365 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 22:14:28.064571 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:14:28.069137 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 22:14:28.081806 sshd[1685]: Accepted publickey for core from 139.178.68.195 port 48902 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:28.081765 sshd-session[1685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:28.101632 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 22:14:28.105130 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 22:14:28.126211 systemd-logind[1588]: New session 1 of user core. Sep 9 22:14:28.143633 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 22:14:28.145124 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 22:14:28.151438 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 22:14:28.169695 (systemd)[1721]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 22:14:28.176577 systemd-logind[1588]: New session c1 of user core. Sep 9 22:14:28.356687 systemd[1721]: Queued start job for default target default.target. Sep 9 22:14:28.365103 systemd[1721]: Created slice app.slice - User Application Slice. Sep 9 22:14:28.365288 systemd[1721]: Reached target paths.target - Paths. Sep 9 22:14:28.365366 systemd[1721]: Reached target timers.target - Timers. Sep 9 22:14:28.369887 systemd[1721]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 22:14:28.391692 systemd[1721]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 22:14:28.391893 systemd[1721]: Reached target sockets.target - Sockets. Sep 9 22:14:28.391959 systemd[1721]: Reached target basic.target - Basic System. Sep 9 22:14:28.392031 systemd[1721]: Reached target default.target - Main User Target. Sep 9 22:14:28.392103 systemd[1721]: Startup finished in 204ms. Sep 9 22:14:28.392438 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 22:14:28.398071 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 22:14:28.779016 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. Sep 9 22:14:28.782421 systemd-networkd[1513]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8cc4:24:19ff:fee6:3312/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8cc4:24:19ff:fee6:3312/64 assigned by NDisc. Sep 9 22:14:28.782432 systemd-networkd[1513]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Sep 9 22:14:29.060136 systemd[1]: Started sshd@1-10.230.51.18:22-139.178.68.195:48908.service - OpenSSH per-connection server daemon (139.178.68.195:48908). Sep 9 22:14:29.206685 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 22:14:29.221773 (kubelet)[1741]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 22:14:29.223800 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:29.843056 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:29.845808 kubelet[1741]: E0909 22:14:29.845754 1741 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 22:14:29.849699 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 22:14:29.850256 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 22:14:29.851166 systemd[1]: kubelet.service: Consumed 1.053s CPU time, 263.2M memory peak. Sep 9 22:14:29.965619 sshd[1733]: Accepted publickey for core from 139.178.68.195 port 48908 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:29.967379 sshd-session[1733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:29.975304 systemd-logind[1588]: New session 2 of user core. Sep 9 22:14:29.984143 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 22:14:30.037728 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. Sep 9 22:14:30.582062 sshd[1750]: Connection closed by 139.178.68.195 port 48908 Sep 9 22:14:30.582526 sshd-session[1733]: pam_unix(sshd:session): session closed for user core Sep 9 22:14:30.588849 systemd-logind[1588]: Session 2 logged out. Waiting for processes to exit. Sep 9 22:14:30.589528 systemd[1]: sshd@1-10.230.51.18:22-139.178.68.195:48908.service: Deactivated successfully. Sep 9 22:14:30.592167 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 22:14:30.594698 systemd-logind[1588]: Removed session 2. Sep 9 22:14:30.748376 systemd[1]: Started sshd@2-10.230.51.18:22-139.178.68.195:51166.service - OpenSSH per-connection server daemon (139.178.68.195:51166). Sep 9 22:14:31.712808 sshd[1756]: Accepted publickey for core from 139.178.68.195 port 51166 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:31.714464 sshd-session[1756]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:31.722509 systemd-logind[1588]: New session 3 of user core. Sep 9 22:14:31.732060 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 22:14:32.238226 login[1701]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 22:14:32.242271 login[1700]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 22:14:32.245593 systemd-logind[1588]: New session 4 of user core. Sep 9 22:14:32.253069 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 22:14:32.257049 systemd-logind[1588]: New session 5 of user core. Sep 9 22:14:32.268224 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 22:14:32.413851 sshd[1759]: Connection closed by 139.178.68.195 port 51166 Sep 9 22:14:32.414686 sshd-session[1756]: pam_unix(sshd:session): session closed for user core Sep 9 22:14:32.419678 systemd[1]: sshd@2-10.230.51.18:22-139.178.68.195:51166.service: Deactivated successfully. Sep 9 22:14:32.422055 systemd[1]: session-3.scope: Deactivated successfully. 
Sep 9 22:14:32.424071 systemd-logind[1588]: Session 3 logged out. Waiting for processes to exit. Sep 9 22:14:32.426173 systemd-logind[1588]: Removed session 3. Sep 9 22:14:33.239817 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:33.249128 coreos-metadata[1573]: Sep 09 22:14:33.249 WARN failed to locate config-drive, using the metadata service API instead Sep 9 22:14:33.273961 coreos-metadata[1573]: Sep 09 22:14:33.273 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Sep 9 22:14:33.281689 coreos-metadata[1573]: Sep 09 22:14:33.281 INFO Fetch failed with 404: resource not found Sep 9 22:14:33.281689 coreos-metadata[1573]: Sep 09 22:14:33.281 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 9 22:14:33.282575 coreos-metadata[1573]: Sep 09 22:14:33.282 INFO Fetch successful Sep 9 22:14:33.282658 coreos-metadata[1573]: Sep 09 22:14:33.282 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Sep 9 22:14:33.293642 coreos-metadata[1573]: Sep 09 22:14:33.293 INFO Fetch successful Sep 9 22:14:33.293642 coreos-metadata[1573]: Sep 09 22:14:33.293 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Sep 9 22:14:33.311837 coreos-metadata[1573]: Sep 09 22:14:33.311 INFO Fetch successful Sep 9 22:14:33.311983 coreos-metadata[1573]: Sep 09 22:14:33.311 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Sep 9 22:14:33.331262 coreos-metadata[1573]: Sep 09 22:14:33.331 INFO Fetch successful Sep 9 22:14:33.331262 coreos-metadata[1573]: Sep 09 22:14:33.331 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Sep 9 22:14:33.350407 coreos-metadata[1573]: Sep 09 22:14:33.350 INFO Fetch successful Sep 9 22:14:33.382317 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 9 22:14:33.383515 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 22:14:33.856829 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 22:14:33.865897 coreos-metadata[1655]: Sep 09 22:14:33.865 WARN failed to locate config-drive, using the metadata service API instead Sep 9 22:14:33.886654 coreos-metadata[1655]: Sep 09 22:14:33.886 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Sep 9 22:14:33.907851 coreos-metadata[1655]: Sep 09 22:14:33.907 INFO Fetch successful Sep 9 22:14:33.908166 coreos-metadata[1655]: Sep 09 22:14:33.908 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 9 22:14:33.937012 coreos-metadata[1655]: Sep 09 22:14:33.936 INFO Fetch successful Sep 9 22:14:33.942716 unknown[1655]: wrote ssh authorized keys file for user: core Sep 9 22:14:33.968425 update-ssh-keys[1799]: Updated "/home/core/.ssh/authorized_keys" Sep 9 22:14:33.970171 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 9 22:14:33.972576 systemd[1]: Finished sshkeys.service. Sep 9 22:14:33.976778 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 22:14:33.977343 systemd[1]: Startup finished in 3.438s (kernel) + 14.839s (initrd) + 11.542s (userspace) = 29.820s. Sep 9 22:14:39.782297 systemd[1]: Started sshd@3-10.230.51.18:22-14.194.76.134:31519.service - OpenSSH per-connection server daemon (14.194.76.134:31519). Sep 9 22:14:40.022746 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Sep 9 22:14:40.026547 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:14:40.224261 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 22:14:40.254666 (kubelet)[1814]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 22:14:40.305278 kubelet[1814]: E0909 22:14:40.305199 1814 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 22:14:40.309757 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 22:14:40.310016 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 22:14:40.310523 systemd[1]: kubelet.service: Consumed 223ms CPU time, 110M memory peak. Sep 9 22:14:41.444397 sshd[1803]: Received disconnect from 14.194.76.134 port 31519:11: Bye Bye [preauth] Sep 9 22:14:41.444397 sshd[1803]: Disconnected from authenticating user root 14.194.76.134 port 31519 [preauth] Sep 9 22:14:41.447314 systemd[1]: sshd@3-10.230.51.18:22-14.194.76.134:31519.service: Deactivated successfully. Sep 9 22:14:42.573634 systemd[1]: Started sshd@4-10.230.51.18:22-139.178.68.195:33998.service - OpenSSH per-connection server daemon (139.178.68.195:33998). Sep 9 22:14:43.481944 sshd[1823]: Accepted publickey for core from 139.178.68.195 port 33998 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:43.483484 sshd-session[1823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:43.490741 systemd-logind[1588]: New session 6 of user core. Sep 9 22:14:43.497004 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 22:14:44.101140 sshd[1826]: Connection closed by 139.178.68.195 port 33998 Sep 9 22:14:44.101961 sshd-session[1823]: pam_unix(sshd:session): session closed for user core Sep 9 22:14:44.107130 systemd[1]: sshd@4-10.230.51.18:22-139.178.68.195:33998.service: Deactivated successfully. Sep 9 22:14:44.109582 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 22:14:44.110926 systemd-logind[1588]: Session 6 logged out. Waiting for processes to exit. Sep 9 22:14:44.112740 systemd-logind[1588]: Removed session 6. Sep 9 22:14:44.260588 systemd[1]: Started sshd@5-10.230.51.18:22-139.178.68.195:34010.service - OpenSSH per-connection server daemon (139.178.68.195:34010). Sep 9 22:14:45.167484 sshd[1832]: Accepted publickey for core from 139.178.68.195 port 34010 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:45.169088 sshd-session[1832]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:45.177444 systemd-logind[1588]: New session 7 of user core. Sep 9 22:14:45.180968 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 22:14:45.783740 sshd[1835]: Connection closed by 139.178.68.195 port 34010 Sep 9 22:14:45.783613 sshd-session[1832]: pam_unix(sshd:session): session closed for user core Sep 9 22:14:45.788367 systemd[1]: sshd@5-10.230.51.18:22-139.178.68.195:34010.service: Deactivated successfully. Sep 9 22:14:45.788963 systemd-logind[1588]: Session 7 logged out. Waiting for processes to exit. Sep 9 22:14:45.790707 systemd[1]: session-7.scope: Deactivated successfully. 
Sep 9 22:14:45.793318 systemd-logind[1588]: Removed session 7. Sep 9 22:14:45.940436 systemd[1]: Started sshd@6-10.230.51.18:22-139.178.68.195:34020.service - OpenSSH per-connection server daemon (139.178.68.195:34020). Sep 9 22:14:46.855990 sshd[1841]: Accepted publickey for core from 139.178.68.195 port 34020 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:46.858359 sshd-session[1841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:46.865950 systemd-logind[1588]: New session 8 of user core. Sep 9 22:14:46.872020 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 22:14:47.470844 sshd[1844]: Connection closed by 139.178.68.195 port 34020 Sep 9 22:14:47.471645 sshd-session[1841]: pam_unix(sshd:session): session closed for user core Sep 9 22:14:47.476637 systemd-logind[1588]: Session 8 logged out. Waiting for processes to exit. Sep 9 22:14:47.477091 systemd[1]: sshd@6-10.230.51.18:22-139.178.68.195:34020.service: Deactivated successfully. Sep 9 22:14:47.479414 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 22:14:47.481498 systemd-logind[1588]: Removed session 8. Sep 9 22:14:47.629582 systemd[1]: Started sshd@7-10.230.51.18:22-139.178.68.195:34034.service - OpenSSH per-connection server daemon (139.178.68.195:34034). Sep 9 22:14:48.547001 sshd[1850]: Accepted publickey for core from 139.178.68.195 port 34034 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:48.548483 sshd-session[1850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:48.555892 systemd-logind[1588]: New session 9 of user core. Sep 9 22:14:48.563005 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 9 22:14:49.033911 sudo[1854]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 22:14:49.034344 sudo[1854]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 22:14:49.047197 sudo[1854]: pam_unix(sudo:session): session closed for user root Sep 9 22:14:49.190062 sshd[1853]: Connection closed by 139.178.68.195 port 34034 Sep 9 22:14:49.191049 sshd-session[1850]: pam_unix(sshd:session): session closed for user core Sep 9 22:14:49.196425 systemd[1]: sshd@7-10.230.51.18:22-139.178.68.195:34034.service: Deactivated successfully. Sep 9 22:14:49.198616 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 22:14:49.200402 systemd-logind[1588]: Session 9 logged out. Waiting for processes to exit. Sep 9 22:14:49.202042 systemd-logind[1588]: Removed session 9. Sep 9 22:14:49.343281 systemd[1]: Started sshd@8-10.230.51.18:22-139.178.68.195:34038.service - OpenSSH per-connection server daemon (139.178.68.195:34038). Sep 9 22:14:50.243408 sshd[1860]: Accepted publickey for core from 139.178.68.195 port 34038 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:50.245114 sshd-session[1860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:50.252649 systemd-logind[1588]: New session 10 of user core. Sep 9 22:14:50.258978 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 22:14:50.522622 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 9 22:14:50.524920 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:14:50.693079 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 22:14:50.705304 (kubelet)[1872]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 22:14:50.719913 sudo[1878]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 22:14:50.720908 sudo[1878]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 22:14:50.787177 sudo[1878]: pam_unix(sudo:session): session closed for user root Sep 9 22:14:50.797942 sudo[1873]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 22:14:50.798384 sudo[1873]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 22:14:50.818887 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 22:14:50.831283 kubelet[1872]: E0909 22:14:50.831219 1872 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 22:14:50.833134 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 22:14:50.833885 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 22:14:50.834724 systemd[1]: kubelet.service: Consumed 211ms CPU time, 108.1M memory peak. Sep 9 22:14:50.867526 augenrules[1902]: No rules Sep 9 22:14:50.868372 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 22:14:50.868712 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 22:14:50.870409 sudo[1873]: pam_unix(sudo:session): session closed for user root Sep 9 22:14:51.013859 sshd[1863]: Connection closed by 139.178.68.195 port 34038 Sep 9 22:14:51.014712 sshd-session[1860]: pam_unix(sshd:session): session closed for user core Sep 9 22:14:51.020469 systemd-logind[1588]: Session 10 logged out. Waiting for processes to exit. Sep 9 22:14:51.020943 systemd[1]: sshd@8-10.230.51.18:22-139.178.68.195:34038.service: Deactivated successfully. Sep 9 22:14:51.023285 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 22:14:51.025587 systemd-logind[1588]: Removed session 10. Sep 9 22:14:51.176959 systemd[1]: Started sshd@9-10.230.51.18:22-139.178.68.195:36504.service - OpenSSH per-connection server daemon (139.178.68.195:36504). Sep 9 22:14:52.138864 sshd[1911]: Accepted publickey for core from 139.178.68.195 port 36504 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:14:52.140383 sshd-session[1911]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:14:52.147945 systemd-logind[1588]: New session 11 of user core. Sep 9 22:14:52.155048 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 22:14:52.648022 sudo[1915]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 22:14:52.648437 sudo[1915]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 22:14:53.155142 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Sep 9 22:14:53.168305 (dockerd)[1933]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 22:14:53.520754 dockerd[1933]: time="2025-09-09T22:14:53.516502653Z" level=info msg="Starting up" Sep 9 22:14:53.520754 dockerd[1933]: time="2025-09-09T22:14:53.520676728Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 22:14:53.536989 dockerd[1933]: time="2025-09-09T22:14:53.536891892Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 22:14:53.584729 dockerd[1933]: time="2025-09-09T22:14:53.584659091Z" level=info msg="Loading containers: start." Sep 9 22:14:53.599918 kernel: Initializing XFRM netlink socket Sep 9 22:14:53.864291 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. Sep 9 22:14:53.929645 systemd-networkd[1513]: docker0: Link UP Sep 9 22:14:53.934052 dockerd[1933]: time="2025-09-09T22:14:53.933993972Z" level=info msg="Loading containers: done." Sep 9 22:14:53.953625 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4260055590-merged.mount: Deactivated successfully. Sep 9 22:14:53.955700 dockerd[1933]: time="2025-09-09T22:14:53.955176699Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 22:14:53.955700 dockerd[1933]: time="2025-09-09T22:14:53.955284455Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 22:14:53.955700 dockerd[1933]: time="2025-09-09T22:14:53.955460050Z" level=info msg="Initializing buildkit" Sep 9 22:14:53.981837 dockerd[1933]: time="2025-09-09T22:14:53.981773567Z" level=info msg="Completed buildkit initialization" Sep 9 22:14:53.991439 dockerd[1933]: time="2025-09-09T22:14:53.991282212Z" level=info msg="Daemon has completed initialization" Sep 9 22:14:53.991628 dockerd[1933]: time="2025-09-09T22:14:53.991568464Z" level=info msg="API listen on /run/docker.sock" Sep 9 22:14:53.992014 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 22:14:54.263578 systemd-timesyncd[1522]: Contacted time server [2a00:da00:1800:83b0::1]:123 (2.flatcar.pool.ntp.org). Sep 9 22:14:54.263687 systemd-timesyncd[1522]: Initial clock synchronization to Tue 2025-09-09 22:14:54.523521 UTC. Sep 9 22:14:55.268185 containerd[1617]: time="2025-09-09T22:14:55.268053845Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\"" Sep 9 22:14:56.236354 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3761159037.mount: Deactivated successfully. 
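The dockerd startup above finishes with "API listen on /run/docker.sock". As a minimal illustration (not taken from this host; it assumes the default socket path and uses only the Go standard library rather than the official Docker client), the daemon's /_ping endpoint can be reached over that Unix socket like this:

    package main

    import (
        "context"
        "fmt"
        "io"
        "log"
        "net"
        "net/http"
    )

    func main() {
        // Dial the Docker engine's Unix socket instead of a TCP address.
        tr := &http.Transport{
            DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
                return (&net.Dialer{}).DialContext(ctx, "unix", "/run/docker.sock")
            },
        }
        client := &http.Client{Transport: tr}

        // The host part of the URL is ignored once DialContext is overridden;
        // /_ping answers "OK" while the daemon is up.
        resp, err := client.Get("http://docker/_ping")
        if err != nil {
            log.Fatalf("docker daemon not reachable: %v", err)
        }
        defer resp.Body.Close()

        body, _ := io.ReadAll(resp.Body)
        fmt.Printf("status=%s body=%q\n", resp.Status, body)
    }

The equivalent one-liner from a shell is curl --unix-socket /run/docker.sock http://localhost/_ping.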
Sep 9 22:14:58.235854 containerd[1617]: time="2025-09-09T22:14:58.235305258Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:14:58.237178 containerd[1617]: time="2025-09-09T22:14:58.237147904Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28800695" Sep 9 22:14:58.237500 containerd[1617]: time="2025-09-09T22:14:58.237467996Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:14:58.241895 containerd[1617]: time="2025-09-09T22:14:58.241092081Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:14:58.242974 containerd[1617]: time="2025-09-09T22:14:58.242413842Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 2.973627255s" Sep 9 22:14:58.242974 containerd[1617]: time="2025-09-09T22:14:58.242466687Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\"" Sep 9 22:14:58.243496 containerd[1617]: time="2025-09-09T22:14:58.243463366Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\"" Sep 9 22:14:58.833755 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
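The PullImage lines are emitted by containerd acting as the kubelet's CRI image service. The same pull can be reproduced directly with the containerd Go client; this is only a sketch, the import path shown is the pre-2.0 one (containerd 2.x, as logged here, moved the client under github.com/containerd/containerd/v2/client), and the "k8s.io" namespace matches where kubelet-pulled images live:

    package main

    import (
        "context"
        "fmt"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Connect to the same containerd socket the kubelet uses.
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // Kubernetes images are stored in the "k8s.io" namespace.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        img, err := client.Pull(ctx, "registry.k8s.io/kube-apiserver:v1.32.8", containerd.WithPullUnpack)
        if err != nil {
            log.Fatal(err)
        }

        size, err := img.Size(ctx)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Printf("pulled %s (%d bytes)\n", img.Name(), size)
    }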
Sep 9 22:15:00.480505 containerd[1617]: time="2025-09-09T22:15:00.480432358Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:00.481769 containerd[1617]: time="2025-09-09T22:15:00.481735326Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24784136" Sep 9 22:15:00.483859 containerd[1617]: time="2025-09-09T22:15:00.482427042Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:00.485984 containerd[1617]: time="2025-09-09T22:15:00.485948628Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:00.487467 containerd[1617]: time="2025-09-09T22:15:00.487432375Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 2.243845237s" Sep 9 22:15:00.487589 containerd[1617]: time="2025-09-09T22:15:00.487564452Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\"" Sep 9 22:15:00.488610 containerd[1617]: time="2025-09-09T22:15:00.488564332Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\"" Sep 9 22:15:01.023967 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 9 22:15:01.028112 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:15:01.237096 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 22:15:01.250360 (kubelet)[2216]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 22:15:01.314113 kubelet[2216]: E0909 22:15:01.313887 2216 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 22:15:01.317723 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 22:15:01.318016 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 22:15:01.319307 systemd[1]: kubelet.service: Consumed 222ms CPU time, 108.8M memory peak. Sep 9 22:15:01.878431 systemd[1]: Started sshd@10-10.230.51.18:22-103.146.52.252:35416.service - OpenSSH per-connection server daemon (103.146.52.252:35416). 
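kubelet.service keeps crash-looping above (restart counters 2 and 3) because /var/lib/kubelet/config.yaml has not been written yet; kubeadm creates it during init/join, so these failures are expected at this stage. Below is a small check of the files involved, assuming the standard kubeadm locations for the kubeconfig and flags file (only config.yaml is confirmed by the log itself):

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        // Files kubeadm normally drops on a node; until they exist the kubelet
        // exits with "failed to load Kubelet config file" as in the entries above.
        paths := []string{
            "/var/lib/kubelet/config.yaml",       // kubelet configuration written by kubeadm
            "/etc/kubernetes/kubelet.conf",       // kubeconfig used after bootstrap (assumed default path)
            "/var/lib/kubelet/kubeadm-flags.env", // source of KUBELET_KUBEADM_ARGS (assumed default path)
        }
        for _, p := range paths {
            if _, err := os.Stat(p); err != nil {
                fmt.Printf("missing: %s (%v)\n", p, err)
                continue
            }
            fmt.Printf("present: %s\n", p)
        }
    }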
Sep 9 22:15:02.276814 containerd[1617]: time="2025-09-09T22:15:02.275870274Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:02.278534 containerd[1617]: time="2025-09-09T22:15:02.278477091Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19175044" Sep 9 22:15:02.279719 containerd[1617]: time="2025-09-09T22:15:02.279686593Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:02.282649 containerd[1617]: time="2025-09-09T22:15:02.282594505Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:02.285502 containerd[1617]: time="2025-09-09T22:15:02.285422970Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 1.79679512s" Sep 9 22:15:02.285502 containerd[1617]: time="2025-09-09T22:15:02.285481715Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\"" Sep 9 22:15:02.286406 containerd[1617]: time="2025-09-09T22:15:02.286349079Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\"" Sep 9 22:15:02.685303 sshd[2229]: Invalid user test2 from 103.146.52.252 port 35416 Sep 9 22:15:02.832066 sshd[2229]: Received disconnect from 103.146.52.252 port 35416:11: Bye Bye [preauth] Sep 9 22:15:02.832066 sshd[2229]: Disconnected from invalid user test2 103.146.52.252 port 35416 [preauth] Sep 9 22:15:02.836864 systemd[1]: sshd@10-10.230.51.18:22-103.146.52.252:35416.service: Deactivated successfully. Sep 9 22:15:04.069916 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2296260392.mount: Deactivated successfully. 
Sep 9 22:15:04.767697 containerd[1617]: time="2025-09-09T22:15:04.767622630Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:04.769076 containerd[1617]: time="2025-09-09T22:15:04.769025107Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30897178" Sep 9 22:15:04.769523 containerd[1617]: time="2025-09-09T22:15:04.769477896Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:04.771869 containerd[1617]: time="2025-09-09T22:15:04.771837805Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:04.772837 containerd[1617]: time="2025-09-09T22:15:04.772778235Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 2.486275518s" Sep 9 22:15:04.772974 containerd[1617]: time="2025-09-09T22:15:04.772949050Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\"" Sep 9 22:15:04.773907 containerd[1617]: time="2025-09-09T22:15:04.773841980Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 22:15:05.383114 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1311303911.mount: Deactivated successfully. 
Sep 9 22:15:06.847443 containerd[1617]: time="2025-09-09T22:15:06.847381963Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:06.850308 containerd[1617]: time="2025-09-09T22:15:06.850256318Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Sep 9 22:15:06.855867 containerd[1617]: time="2025-09-09T22:15:06.855834914Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:06.861406 containerd[1617]: time="2025-09-09T22:15:06.861331041Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:06.862852 containerd[1617]: time="2025-09-09T22:15:06.862690949Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.088748618s" Sep 9 22:15:06.862852 containerd[1617]: time="2025-09-09T22:15:06.862731392Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 9 22:15:06.864087 containerd[1617]: time="2025-09-09T22:15:06.864043857Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 22:15:07.065026 systemd[1]: Started sshd@11-10.230.51.18:22-172.245.45.194:54384.service - OpenSSH per-connection server daemon (172.245.45.194:54384). Sep 9 22:15:07.503382 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1610375072.mount: Deactivated successfully. 
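Each finished pull reports both the image size and the wall-clock time, so the effective throughput is easy to recompute; for example, roughly 28.8 MB in 2.97 s is about 9.7 MB/s for kube-apiserver. The figures logged so far, recalculated:

    package main

    import "fmt"

    func main() {
        // Size (bytes) and duration (seconds) as reported in the pull entries above.
        pulls := []struct {
            name    string
            bytes   float64
            seconds float64
        }{
            {"kube-apiserver:v1.32.8", 28797487, 2.973627255},
            {"kube-controller-manager:v1.32.8", 26387322, 2.243845237},
            {"kube-scheduler:v1.32.8", 20778248, 1.79679512},
            {"coredns:v1.11.3", 18562039, 2.088748618},
        }
        for _, p := range pulls {
            mbps := p.bytes / p.seconds / 1e6
            fmt.Printf("%-35s %6.1f MB/s\n", p.name, mbps)
        }
    }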
Sep 9 22:15:07.535237 containerd[1617]: time="2025-09-09T22:15:07.535168053Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 22:15:07.538476 containerd[1617]: time="2025-09-09T22:15:07.538436807Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Sep 9 22:15:07.541770 containerd[1617]: time="2025-09-09T22:15:07.541703190Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 22:15:07.547647 containerd[1617]: time="2025-09-09T22:15:07.547567298Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 22:15:07.548712 containerd[1617]: time="2025-09-09T22:15:07.548048573Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 683.965799ms" Sep 9 22:15:07.548712 containerd[1617]: time="2025-09-09T22:15:07.548090271Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 9 22:15:07.549651 containerd[1617]: time="2025-09-09T22:15:07.548957757Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 9 22:15:07.855863 sshd[2296]: Invalid user dmdba from 172.245.45.194 port 54384 Sep 9 22:15:08.002152 sshd[2296]: Received disconnect from 172.245.45.194 port 54384:11: Bye Bye [preauth] Sep 9 22:15:08.002152 sshd[2296]: Disconnected from invalid user dmdba 172.245.45.194 port 54384 [preauth] Sep 9 22:15:08.004380 systemd[1]: sshd@11-10.230.51.18:22-172.245.45.194:54384.service: Deactivated successfully. Sep 9 22:15:08.278518 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2454493326.mount: Deactivated successfully. Sep 9 22:15:09.546120 systemd[1]: Started sshd@12-10.230.51.18:22-85.209.134.43:31812.service - OpenSSH per-connection server daemon (85.209.134.43:31812). Sep 9 22:15:10.022628 sshd[2358]: Invalid user grid from 85.209.134.43 port 31812 Sep 9 22:15:10.100616 sshd[2358]: Received disconnect from 85.209.134.43 port 31812:11: Bye Bye [preauth] Sep 9 22:15:10.100616 sshd[2358]: Disconnected from invalid user grid 85.209.134.43 port 31812 [preauth] Sep 9 22:15:10.103077 systemd[1]: sshd@12-10.230.51.18:22-85.209.134.43:31812.service: Deactivated successfully. 
Sep 9 22:15:10.936040 containerd[1617]: time="2025-09-09T22:15:10.935964103Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:10.937726 containerd[1617]: time="2025-09-09T22:15:10.937678583Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682064" Sep 9 22:15:10.938950 containerd[1617]: time="2025-09-09T22:15:10.938890881Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:10.957424 containerd[1617]: time="2025-09-09T22:15:10.957303158Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:10.959059 containerd[1617]: time="2025-09-09T22:15:10.958843507Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.409847457s" Sep 9 22:15:10.959059 containerd[1617]: time="2025-09-09T22:15:10.958893736Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 9 22:15:11.522998 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 9 22:15:11.525863 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:15:11.851625 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 22:15:11.865418 (kubelet)[2384]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 22:15:11.940948 kubelet[2384]: E0909 22:15:11.940833 2384 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 22:15:11.943157 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 22:15:11.943447 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 22:15:11.943939 systemd[1]: kubelet.service: Consumed 214ms CPU time, 108.2M memory peak. Sep 9 22:15:12.171800 update_engine[1590]: I20250909 22:15:12.169930 1590 update_attempter.cc:509] Updating boot flags... Sep 9 22:15:15.931441 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 22:15:15.932137 systemd[1]: kubelet.service: Consumed 214ms CPU time, 108.2M memory peak. Sep 9 22:15:15.935899 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:15:15.972879 systemd[1]: Reload requested from client PID 2424 ('systemctl') (unit session-11.scope)... Sep 9 22:15:15.973142 systemd[1]: Reloading... Sep 9 22:15:16.192881 zram_generator::config[2478]: No configuration found. Sep 9 22:15:16.443812 systemd[1]: Reloading finished in 469 ms. 
Sep 9 22:15:16.535308 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 9 22:15:16.535434 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 9 22:15:16.536542 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 22:15:16.536617 systemd[1]: kubelet.service: Consumed 131ms CPU time, 98.3M memory peak. Sep 9 22:15:16.539610 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:15:16.721263 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 22:15:16.733210 (kubelet)[2537]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 22:15:16.833243 kubelet[2537]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 22:15:16.833243 kubelet[2537]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 22:15:16.833243 kubelet[2537]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 22:15:16.833872 kubelet[2537]: I0909 22:15:16.833395 2537 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 22:15:17.508372 kubelet[2537]: I0909 22:15:17.508278 2537 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 22:15:17.508372 kubelet[2537]: I0909 22:15:17.508321 2537 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 22:15:17.508873 kubelet[2537]: I0909 22:15:17.508840 2537 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 22:15:17.609359 kubelet[2537]: E0909 22:15:17.609220 2537 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.51.18:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:17.614878 kubelet[2537]: I0909 22:15:17.614801 2537 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 22:15:17.632293 kubelet[2537]: I0909 22:15:17.632254 2537 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 22:15:17.643506 kubelet[2537]: I0909 22:15:17.643345 2537 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 22:15:17.648814 kubelet[2537]: I0909 22:15:17.648742 2537 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 22:15:17.649239 kubelet[2537]: I0909 22:15:17.648922 2537 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-rokxy.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 22:15:17.651428 kubelet[2537]: I0909 22:15:17.651126 2537 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 22:15:17.651428 kubelet[2537]: I0909 22:15:17.651156 2537 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 22:15:17.652455 kubelet[2537]: I0909 22:15:17.652434 2537 state_mem.go:36] "Initialized new in-memory state store" Sep 9 22:15:17.656235 kubelet[2537]: I0909 22:15:17.656214 2537 kubelet.go:446] "Attempting to sync node with API server" Sep 9 22:15:17.656426 kubelet[2537]: I0909 22:15:17.656406 2537 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 22:15:17.657689 kubelet[2537]: I0909 22:15:17.657666 2537 kubelet.go:352] "Adding apiserver pod source" Sep 9 22:15:17.657828 kubelet[2537]: I0909 22:15:17.657809 2537 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 22:15:17.664703 kubelet[2537]: W0909 22:15:17.664041 2537 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.51.18:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-rokxy.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.51.18:6443: connect: connection refused Sep 9 22:15:17.664703 kubelet[2537]: E0909 22:15:17.664134 2537 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.51.18:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-rokxy.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:17.664703 
kubelet[2537]: W0909 22:15:17.664654 2537 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.51.18:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.51.18:6443: connect: connection refused Sep 9 22:15:17.664895 kubelet[2537]: E0909 22:15:17.664699 2537 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.51.18:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:17.666406 kubelet[2537]: I0909 22:15:17.666352 2537 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 22:15:17.670149 kubelet[2537]: I0909 22:15:17.669614 2537 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 22:15:17.670149 kubelet[2537]: W0909 22:15:17.669710 2537 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 9 22:15:17.671335 kubelet[2537]: I0909 22:15:17.671313 2537 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 22:15:17.671480 kubelet[2537]: I0909 22:15:17.671460 2537 server.go:1287] "Started kubelet" Sep 9 22:15:17.675489 kubelet[2537]: I0909 22:15:17.675082 2537 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 22:15:17.676603 kubelet[2537]: I0909 22:15:17.676572 2537 server.go:479] "Adding debug handlers to kubelet server" Sep 9 22:15:17.677995 kubelet[2537]: I0909 22:15:17.677925 2537 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 22:15:17.678424 kubelet[2537]: I0909 22:15:17.678402 2537 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 22:15:17.681959 kubelet[2537]: E0909 22:15:17.679513 2537 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.51.18:6443/api/v1/namespaces/default/events\": dial tcp 10.230.51.18:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-rokxy.gb1.brightbox.com.1863bd0712248d26 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-rokxy.gb1.brightbox.com,UID:srv-rokxy.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-rokxy.gb1.brightbox.com,},FirstTimestamp:2025-09-09 22:15:17.671431462 +0000 UTC m=+0.932833762,LastTimestamp:2025-09-09 22:15:17.671431462 +0000 UTC m=+0.932833762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-rokxy.gb1.brightbox.com,}" Sep 9 22:15:17.689138 kubelet[2537]: I0909 22:15:17.689103 2537 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 22:15:17.692800 kubelet[2537]: I0909 22:15:17.692248 2537 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 22:15:17.694527 kubelet[2537]: I0909 22:15:17.694505 2537 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 22:15:17.696610 kubelet[2537]: I0909 22:15:17.694806 
2537 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 22:15:17.696722 kubelet[2537]: E0909 22:15:17.695297 2537 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-rokxy.gb1.brightbox.com\" not found" Sep 9 22:15:17.696883 kubelet[2537]: I0909 22:15:17.696865 2537 reconciler.go:26] "Reconciler: start to sync state" Sep 9 22:15:17.697493 kubelet[2537]: W0909 22:15:17.697438 2537 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.51.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.51.18:6443: connect: connection refused Sep 9 22:15:17.697646 kubelet[2537]: E0909 22:15:17.697620 2537 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.51.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:17.697876 kubelet[2537]: E0909 22:15:17.697824 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.51.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-rokxy.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.51.18:6443: connect: connection refused" interval="200ms" Sep 9 22:15:17.702001 kubelet[2537]: E0909 22:15:17.701978 2537 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 22:15:17.702645 kubelet[2537]: I0909 22:15:17.702620 2537 factory.go:221] Registration of the containerd container factory successfully Sep 9 22:15:17.703807 kubelet[2537]: I0909 22:15:17.702726 2537 factory.go:221] Registration of the systemd container factory successfully Sep 9 22:15:17.703807 kubelet[2537]: I0909 22:15:17.702863 2537 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 22:15:17.721534 kubelet[2537]: I0909 22:15:17.721473 2537 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 22:15:17.723022 kubelet[2537]: I0909 22:15:17.722985 2537 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 22:15:17.723085 kubelet[2537]: I0909 22:15:17.723034 2537 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 22:15:17.723085 kubelet[2537]: I0909 22:15:17.723082 2537 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 9 22:15:17.723185 kubelet[2537]: I0909 22:15:17.723094 2537 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 22:15:17.723185 kubelet[2537]: E0909 22:15:17.723169 2537 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 22:15:17.733980 kubelet[2537]: W0909 22:15:17.733925 2537 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.51.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.51.18:6443: connect: connection refused Sep 9 22:15:17.734126 kubelet[2537]: E0909 22:15:17.734088 2537 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.51.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:17.763874 kubelet[2537]: I0909 22:15:17.763755 2537 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 22:15:17.764010 kubelet[2537]: I0909 22:15:17.763991 2537 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 22:15:17.764132 kubelet[2537]: I0909 22:15:17.764113 2537 state_mem.go:36] "Initialized new in-memory state store" Sep 9 22:15:17.775481 kubelet[2537]: I0909 22:15:17.775454 2537 policy_none.go:49] "None policy: Start" Sep 9 22:15:17.775656 kubelet[2537]: I0909 22:15:17.775627 2537 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 22:15:17.775810 kubelet[2537]: I0909 22:15:17.775772 2537 state_mem.go:35] "Initializing new in-memory state store" Sep 9 22:15:17.786148 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 22:15:17.797078 kubelet[2537]: E0909 22:15:17.796921 2537 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-rokxy.gb1.brightbox.com\" not found" Sep 9 22:15:17.799903 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 22:15:17.822487 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 22:15:17.824067 kubelet[2537]: E0909 22:15:17.823574 2537 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 9 22:15:17.825125 kubelet[2537]: I0909 22:15:17.825102 2537 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 22:15:17.825593 kubelet[2537]: I0909 22:15:17.825512 2537 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 22:15:17.825723 kubelet[2537]: I0909 22:15:17.825676 2537 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 22:15:17.827975 kubelet[2537]: I0909 22:15:17.827857 2537 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 22:15:17.828408 kubelet[2537]: E0909 22:15:17.828385 2537 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 22:15:17.828558 kubelet[2537]: E0909 22:15:17.828536 2537 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-rokxy.gb1.brightbox.com\" not found" Sep 9 22:15:17.899073 kubelet[2537]: E0909 22:15:17.899030 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.51.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-rokxy.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.51.18:6443: connect: connection refused" interval="400ms" Sep 9 22:15:17.928617 kubelet[2537]: I0909 22:15:17.928523 2537 kubelet_node_status.go:75] "Attempting to register node" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:17.929227 kubelet[2537]: E0909 22:15:17.929194 2537 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.230.51.18:6443/api/v1/nodes\": dial tcp 10.230.51.18:6443: connect: connection refused" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.040699 systemd[1]: Created slice kubepods-burstable-pod07b3c056376fafdee159e16590c2a689.slice - libcontainer container kubepods-burstable-pod07b3c056376fafdee159e16590c2a689.slice. Sep 9 22:15:18.054130 kubelet[2537]: E0909 22:15:18.053211 2537 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.054002 systemd[1]: Created slice kubepods-burstable-pod02fb051d04ef43b5da9607f182f8f09c.slice - libcontainer container kubepods-burstable-pod02fb051d04ef43b5da9607f182f8f09c.slice. Sep 9 22:15:18.059534 kubelet[2537]: E0909 22:15:18.059486 2537 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.062098 systemd[1]: Created slice kubepods-burstable-pod011c52450fb356b8c58e4596f8f1b21f.slice - libcontainer container kubepods-burstable-pod011c52450fb356b8c58e4596f8f1b21f.slice. 
Sep 9 22:15:18.064833 kubelet[2537]: E0909 22:15:18.064684 2537 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.098258 kubelet[2537]: I0909 22:15:18.098184 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-flexvolume-dir\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.098537 kubelet[2537]: I0909 22:15:18.098469 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-k8s-certs\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.098715 kubelet[2537]: I0909 22:15:18.098624 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-ca-certs\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.098872 kubelet[2537]: I0909 22:15:18.098829 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-kubeconfig\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.099261 kubelet[2537]: I0909 22:15:18.099036 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.099261 kubelet[2537]: I0909 22:15:18.099072 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/011c52450fb356b8c58e4596f8f1b21f-kubeconfig\") pod \"kube-scheduler-srv-rokxy.gb1.brightbox.com\" (UID: \"011c52450fb356b8c58e4596f8f1b21f\") " pod="kube-system/kube-scheduler-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.099261 kubelet[2537]: I0909 22:15:18.099103 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/07b3c056376fafdee159e16590c2a689-ca-certs\") pod \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" (UID: \"07b3c056376fafdee159e16590c2a689\") " pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.099261 kubelet[2537]: I0909 22:15:18.099135 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/07b3c056376fafdee159e16590c2a689-k8s-certs\") 
pod \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" (UID: \"07b3c056376fafdee159e16590c2a689\") " pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.099261 kubelet[2537]: I0909 22:15:18.099170 2537 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/07b3c056376fafdee159e16590c2a689-usr-share-ca-certificates\") pod \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" (UID: \"07b3c056376fafdee159e16590c2a689\") " pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.132703 kubelet[2537]: I0909 22:15:18.132305 2537 kubelet_node_status.go:75] "Attempting to register node" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.133011 kubelet[2537]: E0909 22:15:18.132982 2537 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.230.51.18:6443/api/v1/nodes\": dial tcp 10.230.51.18:6443: connect: connection refused" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.300182 kubelet[2537]: E0909 22:15:18.299947 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.51.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-rokxy.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.51.18:6443: connect: connection refused" interval="800ms" Sep 9 22:15:18.357825 containerd[1617]: time="2025-09-09T22:15:18.357620054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-rokxy.gb1.brightbox.com,Uid:07b3c056376fafdee159e16590c2a689,Namespace:kube-system,Attempt:0,}" Sep 9 22:15:18.367022 containerd[1617]: time="2025-09-09T22:15:18.366773014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-rokxy.gb1.brightbox.com,Uid:02fb051d04ef43b5da9607f182f8f09c,Namespace:kube-system,Attempt:0,}" Sep 9 22:15:18.367341 containerd[1617]: time="2025-09-09T22:15:18.367293677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-rokxy.gb1.brightbox.com,Uid:011c52450fb356b8c58e4596f8f1b21f,Namespace:kube-system,Attempt:0,}" Sep 9 22:15:18.505590 containerd[1617]: time="2025-09-09T22:15:18.505218368Z" level=info msg="connecting to shim ada0ec632d411a2c8a529c8d45b30ca37c85de1789eb13e8d700cc9649063829" address="unix:///run/containerd/s/2c75c8dfcade8c405a48349e183c5b30b9b49cc909eec706dc15aed126590b49" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:15:18.507647 containerd[1617]: time="2025-09-09T22:15:18.507612431Z" level=info msg="connecting to shim 814db517d97094d91687fdc8a9444c7567728570230c1dab33076f227730535b" address="unix:///run/containerd/s/1fffecdcf7695db5a7696b43860a47d78b384edbf5c7379197f5cc09743ff065" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:15:18.547527 containerd[1617]: time="2025-09-09T22:15:18.547468651Z" level=info msg="connecting to shim 2199853586f82eadd88ad89462b5b17c55dea97d413b6b793f306fd260a60637" address="unix:///run/containerd/s/54245155f356fce4ac14d08186bf57b30963d77fde7d8960d90789762dc83e5b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:15:18.551280 kubelet[2537]: I0909 22:15:18.551039 2537 kubelet_node_status.go:75] "Attempting to register node" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:18.552248 kubelet[2537]: E0909 22:15:18.552154 2537 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.230.51.18:6443/api/v1/nodes\": dial tcp 10.230.51.18:6443: connect: connection refused" node="srv-rokxy.gb1.brightbox.com" Sep 9 
22:15:18.695065 systemd[1]: Started cri-containerd-2199853586f82eadd88ad89462b5b17c55dea97d413b6b793f306fd260a60637.scope - libcontainer container 2199853586f82eadd88ad89462b5b17c55dea97d413b6b793f306fd260a60637. Sep 9 22:15:18.698237 systemd[1]: Started cri-containerd-814db517d97094d91687fdc8a9444c7567728570230c1dab33076f227730535b.scope - libcontainer container 814db517d97094d91687fdc8a9444c7567728570230c1dab33076f227730535b. Sep 9 22:15:18.701282 systemd[1]: Started cri-containerd-ada0ec632d411a2c8a529c8d45b30ca37c85de1789eb13e8d700cc9649063829.scope - libcontainer container ada0ec632d411a2c8a529c8d45b30ca37c85de1789eb13e8d700cc9649063829. Sep 9 22:15:18.844264 containerd[1617]: time="2025-09-09T22:15:18.843954797Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-rokxy.gb1.brightbox.com,Uid:011c52450fb356b8c58e4596f8f1b21f,Namespace:kube-system,Attempt:0,} returns sandbox id \"814db517d97094d91687fdc8a9444c7567728570230c1dab33076f227730535b\"" Sep 9 22:15:18.852489 containerd[1617]: time="2025-09-09T22:15:18.852284774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-rokxy.gb1.brightbox.com,Uid:02fb051d04ef43b5da9607f182f8f09c,Namespace:kube-system,Attempt:0,} returns sandbox id \"ada0ec632d411a2c8a529c8d45b30ca37c85de1789eb13e8d700cc9649063829\"" Sep 9 22:15:18.858810 containerd[1617]: time="2025-09-09T22:15:18.857973571Z" level=info msg="CreateContainer within sandbox \"814db517d97094d91687fdc8a9444c7567728570230c1dab33076f227730535b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 22:15:18.860494 containerd[1617]: time="2025-09-09T22:15:18.860462407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-rokxy.gb1.brightbox.com,Uid:07b3c056376fafdee159e16590c2a689,Namespace:kube-system,Attempt:0,} returns sandbox id \"2199853586f82eadd88ad89462b5b17c55dea97d413b6b793f306fd260a60637\"" Sep 9 22:15:18.861704 containerd[1617]: time="2025-09-09T22:15:18.861671162Z" level=info msg="CreateContainer within sandbox \"ada0ec632d411a2c8a529c8d45b30ca37c85de1789eb13e8d700cc9649063829\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 22:15:18.864684 containerd[1617]: time="2025-09-09T22:15:18.864621149Z" level=info msg="CreateContainer within sandbox \"2199853586f82eadd88ad89462b5b17c55dea97d413b6b793f306fd260a60637\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 22:15:18.871502 containerd[1617]: time="2025-09-09T22:15:18.871466412Z" level=info msg="Container 0dcb9eda53ee4c01f4838ded050fa96022c42ae94a00aae334941ae03a210415: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:15:18.878872 containerd[1617]: time="2025-09-09T22:15:18.878803777Z" level=info msg="Container ce5e23366cd3baa9c49d2a9aeb0cd9d3610036d17423a1d7212da266a0fb0fd0: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:15:18.885574 containerd[1617]: time="2025-09-09T22:15:18.885528405Z" level=info msg="Container 7af234191fec757f13142d894117fb12e7c1c178da28f0f2574e433238cb46de: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:15:18.889349 containerd[1617]: time="2025-09-09T22:15:18.889313999Z" level=info msg="CreateContainer within sandbox \"814db517d97094d91687fdc8a9444c7567728570230c1dab33076f227730535b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0dcb9eda53ee4c01f4838ded050fa96022c42ae94a00aae334941ae03a210415\"" Sep 9 22:15:18.889579 containerd[1617]: time="2025-09-09T22:15:18.889550667Z" level=info 
msg="CreateContainer within sandbox \"ada0ec632d411a2c8a529c8d45b30ca37c85de1789eb13e8d700cc9649063829\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ce5e23366cd3baa9c49d2a9aeb0cd9d3610036d17423a1d7212da266a0fb0fd0\"" Sep 9 22:15:18.890472 containerd[1617]: time="2025-09-09T22:15:18.890252892Z" level=info msg="StartContainer for \"0dcb9eda53ee4c01f4838ded050fa96022c42ae94a00aae334941ae03a210415\"" Sep 9 22:15:18.890701 containerd[1617]: time="2025-09-09T22:15:18.890657000Z" level=info msg="StartContainer for \"ce5e23366cd3baa9c49d2a9aeb0cd9d3610036d17423a1d7212da266a0fb0fd0\"" Sep 9 22:15:18.893092 containerd[1617]: time="2025-09-09T22:15:18.892892946Z" level=info msg="connecting to shim 0dcb9eda53ee4c01f4838ded050fa96022c42ae94a00aae334941ae03a210415" address="unix:///run/containerd/s/1fffecdcf7695db5a7696b43860a47d78b384edbf5c7379197f5cc09743ff065" protocol=ttrpc version=3 Sep 9 22:15:18.894347 containerd[1617]: time="2025-09-09T22:15:18.894317249Z" level=info msg="connecting to shim ce5e23366cd3baa9c49d2a9aeb0cd9d3610036d17423a1d7212da266a0fb0fd0" address="unix:///run/containerd/s/2c75c8dfcade8c405a48349e183c5b30b9b49cc909eec706dc15aed126590b49" protocol=ttrpc version=3 Sep 9 22:15:18.898129 containerd[1617]: time="2025-09-09T22:15:18.898087972Z" level=info msg="CreateContainer within sandbox \"2199853586f82eadd88ad89462b5b17c55dea97d413b6b793f306fd260a60637\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"7af234191fec757f13142d894117fb12e7c1c178da28f0f2574e433238cb46de\"" Sep 9 22:15:18.899836 containerd[1617]: time="2025-09-09T22:15:18.898577494Z" level=info msg="StartContainer for \"7af234191fec757f13142d894117fb12e7c1c178da28f0f2574e433238cb46de\"" Sep 9 22:15:18.901677 containerd[1617]: time="2025-09-09T22:15:18.899765385Z" level=info msg="connecting to shim 7af234191fec757f13142d894117fb12e7c1c178da28f0f2574e433238cb46de" address="unix:///run/containerd/s/54245155f356fce4ac14d08186bf57b30963d77fde7d8960d90789762dc83e5b" protocol=ttrpc version=3 Sep 9 22:15:18.910373 kubelet[2537]: W0909 22:15:18.910202 2537 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.51.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.51.18:6443: connect: connection refused Sep 9 22:15:18.910861 kubelet[2537]: E0909 22:15:18.910391 2537 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.51.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:18.926985 systemd[1]: Started cri-containerd-ce5e23366cd3baa9c49d2a9aeb0cd9d3610036d17423a1d7212da266a0fb0fd0.scope - libcontainer container ce5e23366cd3baa9c49d2a9aeb0cd9d3610036d17423a1d7212da266a0fb0fd0. Sep 9 22:15:18.943024 systemd[1]: Started cri-containerd-0dcb9eda53ee4c01f4838ded050fa96022c42ae94a00aae334941ae03a210415.scope - libcontainer container 0dcb9eda53ee4c01f4838ded050fa96022c42ae94a00aae334941ae03a210415. Sep 9 22:15:18.964985 systemd[1]: Started cri-containerd-7af234191fec757f13142d894117fb12e7c1c178da28f0f2574e433238cb46de.scope - libcontainer container 7af234191fec757f13142d894117fb12e7c1c178da28f0f2574e433238cb46de. 
Sep 9 22:15:19.000531 kubelet[2537]: W0909 22:15:19.000451 2537 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.51.18:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-rokxy.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.51.18:6443: connect: connection refused Sep 9 22:15:19.000732 kubelet[2537]: E0909 22:15:19.000536 2537 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.51.18:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-rokxy.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:19.043094 containerd[1617]: time="2025-09-09T22:15:19.042936518Z" level=info msg="StartContainer for \"0dcb9eda53ee4c01f4838ded050fa96022c42ae94a00aae334941ae03a210415\" returns successfully" Sep 9 22:15:19.097756 containerd[1617]: time="2025-09-09T22:15:19.096896006Z" level=info msg="StartContainer for \"ce5e23366cd3baa9c49d2a9aeb0cd9d3610036d17423a1d7212da266a0fb0fd0\" returns successfully" Sep 9 22:15:19.103106 kubelet[2537]: E0909 22:15:19.101773 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.51.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-rokxy.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.51.18:6443: connect: connection refused" interval="1.6s" Sep 9 22:15:19.124820 kubelet[2537]: W0909 22:15:19.123519 2537 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.51.18:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.51.18:6443: connect: connection refused Sep 9 22:15:19.126401 kubelet[2537]: E0909 22:15:19.124963 2537 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.51.18:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:19.131039 containerd[1617]: time="2025-09-09T22:15:19.130995555Z" level=info msg="StartContainer for \"7af234191fec757f13142d894117fb12e7c1c178da28f0f2574e433238cb46de\" returns successfully" Sep 9 22:15:19.244832 kubelet[2537]: W0909 22:15:19.244494 2537 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.51.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.51.18:6443: connect: connection refused Sep 9 22:15:19.246924 kubelet[2537]: E0909 22:15:19.246851 2537 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.51.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.51.18:6443: connect: connection refused" logger="UnhandledError" Sep 9 22:15:19.355930 kubelet[2537]: I0909 22:15:19.355433 2537 kubelet_node_status.go:75] "Attempting to register node" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:19.356570 kubelet[2537]: E0909 22:15:19.356179 2537 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.230.51.18:6443/api/v1/nodes\": dial tcp 10.230.51.18:6443: connect: connection refused" 
node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:19.754240 kubelet[2537]: E0909 22:15:19.754118 2537 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:19.758709 kubelet[2537]: E0909 22:15:19.758683 2537 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:19.763777 kubelet[2537]: E0909 22:15:19.763749 2537 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:20.770380 kubelet[2537]: E0909 22:15:20.770319 2537 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:20.772456 kubelet[2537]: E0909 22:15:20.771700 2537 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:20.960495 kubelet[2537]: I0909 22:15:20.960460 2537 kubelet_node_status.go:75] "Attempting to register node" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:21.741018 kubelet[2537]: E0909 22:15:21.740972 2537 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-rokxy.gb1.brightbox.com\" not found" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:21.829734 kubelet[2537]: I0909 22:15:21.829372 2537 kubelet_node_status.go:78] "Successfully registered node" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:21.895917 kubelet[2537]: I0909 22:15:21.895874 2537 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:21.901905 kubelet[2537]: E0909 22:15:21.901846 2537 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:21.901905 kubelet[2537]: I0909 22:15:21.901894 2537 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:21.904577 kubelet[2537]: E0909 22:15:21.904545 2537 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:21.904577 kubelet[2537]: I0909 22:15:21.904575 2537 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:21.906415 kubelet[2537]: E0909 22:15:21.906383 2537 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-rokxy.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:22.188582 kubelet[2537]: I0909 22:15:22.188258 2537 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:22.191123 kubelet[2537]: E0909 22:15:22.191065 
2537 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:22.666224 kubelet[2537]: I0909 22:15:22.666147 2537 apiserver.go:52] "Watching apiserver" Sep 9 22:15:22.697898 kubelet[2537]: I0909 22:15:22.697851 2537 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 22:15:23.774513 systemd[1]: Reload requested from client PID 2808 ('systemctl') (unit session-11.scope)... Sep 9 22:15:23.775009 systemd[1]: Reloading... Sep 9 22:15:23.909184 zram_generator::config[2865]: No configuration found. Sep 9 22:15:24.243517 systemd[1]: Reloading finished in 467 ms. Sep 9 22:15:24.274269 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:15:24.292378 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 22:15:24.292955 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 22:15:24.293156 systemd[1]: kubelet.service: Consumed 1.331s CPU time, 130.6M memory peak. Sep 9 22:15:24.297517 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 22:15:24.612740 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 22:15:24.624228 (kubelet)[2917]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 22:15:24.704982 kubelet[2917]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 22:15:24.704982 kubelet[2917]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 22:15:24.704982 kubelet[2917]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 22:15:24.705512 kubelet[2917]: I0909 22:15:24.705099 2917 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 22:15:24.716882 kubelet[2917]: I0909 22:15:24.716071 2917 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 22:15:24.716882 kubelet[2917]: I0909 22:15:24.716108 2917 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 22:15:24.717138 kubelet[2917]: I0909 22:15:24.717116 2917 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 22:15:24.721165 kubelet[2917]: I0909 22:15:24.721143 2917 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 9 22:15:24.725843 kubelet[2917]: I0909 22:15:24.725803 2917 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 22:15:24.735608 kubelet[2917]: I0909 22:15:24.735579 2917 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 22:15:24.745930 kubelet[2917]: I0909 22:15:24.745822 2917 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 22:15:24.746444 kubelet[2917]: I0909 22:15:24.746211 2917 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 22:15:24.746569 kubelet[2917]: I0909 22:15:24.746261 2917 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-rokxy.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 22:15:24.746569 kubelet[2917]: I0909 22:15:24.746556 2917 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 22:15:24.746569 kubelet[2917]: I0909 22:15:24.746571 2917 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 22:15:24.746965 kubelet[2917]: I0909 22:15:24.746665 2917 state_mem.go:36] "Initialized new in-memory state store" Sep 9 22:15:24.750687 kubelet[2917]: I0909 22:15:24.748835 2917 kubelet.go:446] "Attempting to sync node with API server" Sep 9 22:15:24.750687 kubelet[2917]: I0909 22:15:24.748885 2917 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 22:15:24.750687 kubelet[2917]: I0909 22:15:24.748933 2917 kubelet.go:352] "Adding apiserver pod source" Sep 9 22:15:24.750687 kubelet[2917]: I0909 22:15:24.748975 2917 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 22:15:24.753002 kubelet[2917]: I0909 22:15:24.752977 2917 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 22:15:24.755029 kubelet[2917]: I0909 22:15:24.753667 2917 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 22:15:24.757587 kubelet[2917]: I0909 22:15:24.757564 2917 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 22:15:24.757760 kubelet[2917]: I0909 22:15:24.757742 2917 server.go:1287] "Started kubelet" Sep 9 22:15:24.773157 kubelet[2917]: I0909 22:15:24.773117 2917 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 22:15:24.777665 kubelet[2917]: I0909 22:15:24.777472 2917 server.go:169] "Starting to 
listen" address="0.0.0.0" port=10250 Sep 9 22:15:24.781810 kubelet[2917]: I0909 22:15:24.781552 2917 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 22:15:24.783817 kubelet[2917]: I0909 22:15:24.782353 2917 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 22:15:24.787889 kubelet[2917]: I0909 22:15:24.786175 2917 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 22:15:24.788173 kubelet[2917]: I0909 22:15:24.788152 2917 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 22:15:24.788803 kubelet[2917]: E0909 22:15:24.788496 2917 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-rokxy.gb1.brightbox.com\" not found" Sep 9 22:15:24.790824 kubelet[2917]: I0909 22:15:24.789430 2917 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 22:15:24.791819 kubelet[2917]: I0909 22:15:24.791653 2917 reconciler.go:26] "Reconciler: start to sync state" Sep 9 22:15:24.812808 kubelet[2917]: I0909 22:15:24.811502 2917 factory.go:221] Registration of the systemd container factory successfully Sep 9 22:15:24.812808 kubelet[2917]: I0909 22:15:24.811637 2917 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 22:15:24.817907 kubelet[2917]: I0909 22:15:24.816881 2917 server.go:479] "Adding debug handlers to kubelet server" Sep 9 22:15:24.822846 kubelet[2917]: I0909 22:15:24.822817 2917 factory.go:221] Registration of the containerd container factory successfully Sep 9 22:15:24.832172 kubelet[2917]: E0909 22:15:24.832024 2917 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 22:15:24.856611 kubelet[2917]: I0909 22:15:24.856537 2917 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 22:15:24.867156 kubelet[2917]: I0909 22:15:24.864725 2917 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 22:15:24.867156 kubelet[2917]: I0909 22:15:24.864844 2917 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 22:15:24.867156 kubelet[2917]: I0909 22:15:24.864912 2917 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 9 22:15:24.867156 kubelet[2917]: I0909 22:15:24.864948 2917 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 22:15:24.867156 kubelet[2917]: E0909 22:15:24.865230 2917 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 22:15:24.969875 kubelet[2917]: E0909 22:15:24.969832 2917 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 9 22:15:25.009878 kubelet[2917]: I0909 22:15:25.009842 2917 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 22:15:25.009878 kubelet[2917]: I0909 22:15:25.009868 2917 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 22:15:25.011470 kubelet[2917]: I0909 22:15:25.009900 2917 state_mem.go:36] "Initialized new in-memory state store" Sep 9 22:15:25.011470 kubelet[2917]: I0909 22:15:25.010142 2917 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 22:15:25.011470 kubelet[2917]: I0909 22:15:25.010160 2917 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 22:15:25.011470 kubelet[2917]: I0909 22:15:25.010195 2917 policy_none.go:49] "None policy: Start" Sep 9 22:15:25.011470 kubelet[2917]: I0909 22:15:25.010222 2917 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 22:15:25.011470 kubelet[2917]: I0909 22:15:25.010262 2917 state_mem.go:35] "Initializing new in-memory state store" Sep 9 22:15:25.011470 kubelet[2917]: I0909 22:15:25.010410 2917 state_mem.go:75] "Updated machine memory state" Sep 9 22:15:25.029272 kubelet[2917]: I0909 22:15:25.028319 2917 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 22:15:25.029272 kubelet[2917]: I0909 22:15:25.029011 2917 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 22:15:25.029872 kubelet[2917]: I0909 22:15:25.029715 2917 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 22:15:25.031785 kubelet[2917]: I0909 22:15:25.031737 2917 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 22:15:25.035991 kubelet[2917]: E0909 22:15:25.035149 2917 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 22:15:25.167136 kubelet[2917]: I0909 22:15:25.166859 2917 kubelet_node_status.go:75] "Attempting to register node" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.174063 kubelet[2917]: I0909 22:15:25.173938 2917 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.174599 kubelet[2917]: I0909 22:15:25.174346 2917 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.175928 kubelet[2917]: I0909 22:15:25.175903 2917 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.191515 kubelet[2917]: W0909 22:15:25.191479 2917 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 22:15:25.193776 kubelet[2917]: I0909 22:15:25.193255 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-k8s-certs\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.193776 kubelet[2917]: I0909 22:15:25.193315 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-kubeconfig\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.193776 kubelet[2917]: I0909 22:15:25.193394 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.193776 kubelet[2917]: I0909 22:15:25.193428 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/011c52450fb356b8c58e4596f8f1b21f-kubeconfig\") pod \"kube-scheduler-srv-rokxy.gb1.brightbox.com\" (UID: \"011c52450fb356b8c58e4596f8f1b21f\") " pod="kube-system/kube-scheduler-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.193776 kubelet[2917]: I0909 22:15:25.193469 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/07b3c056376fafdee159e16590c2a689-ca-certs\") pod \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" (UID: \"07b3c056376fafdee159e16590c2a689\") " pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.194049 kubelet[2917]: I0909 22:15:25.193506 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/07b3c056376fafdee159e16590c2a689-usr-share-ca-certificates\") pod \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" (UID: \"07b3c056376fafdee159e16590c2a689\") " 
pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.194049 kubelet[2917]: I0909 22:15:25.193530 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-ca-certs\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.194049 kubelet[2917]: I0909 22:15:25.193554 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/07b3c056376fafdee159e16590c2a689-k8s-certs\") pod \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" (UID: \"07b3c056376fafdee159e16590c2a689\") " pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.194049 kubelet[2917]: I0909 22:15:25.193583 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/02fb051d04ef43b5da9607f182f8f09c-flexvolume-dir\") pod \"kube-controller-manager-srv-rokxy.gb1.brightbox.com\" (UID: \"02fb051d04ef43b5da9607f182f8f09c\") " pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.197264 kubelet[2917]: W0909 22:15:25.197240 2917 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 22:15:25.200085 kubelet[2917]: I0909 22:15:25.199569 2917 kubelet_node_status.go:124] "Node was previously registered" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.200085 kubelet[2917]: I0909 22:15:25.199892 2917 kubelet_node_status.go:78] "Successfully registered node" node="srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.205543 kubelet[2917]: W0909 22:15:25.205467 2917 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 22:15:25.751634 kubelet[2917]: I0909 22:15:25.751279 2917 apiserver.go:52] "Watching apiserver" Sep 9 22:15:25.792445 kubelet[2917]: I0909 22:15:25.792396 2917 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 22:15:25.923646 kubelet[2917]: I0909 22:15:25.923590 2917 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.926856 kubelet[2917]: I0909 22:15:25.924001 2917 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.935764 kubelet[2917]: W0909 22:15:25.933319 2917 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 22:15:25.935764 kubelet[2917]: E0909 22:15:25.933500 2917 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-rokxy.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.945022 kubelet[2917]: W0909 22:15:25.944988 2917 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 22:15:25.945170 kubelet[2917]: E0909 22:15:25.945096 2917 kubelet.go:3196] "Failed creating a mirror pod" 
err="pods \"kube-apiserver-srv-rokxy.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" Sep 9 22:15:25.971508 kubelet[2917]: I0909 22:15:25.971304 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-rokxy.gb1.brightbox.com" podStartSLOduration=0.971266983 podStartE2EDuration="971.266983ms" podCreationTimestamp="2025-09-09 22:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:15:25.970890793 +0000 UTC m=+1.339047520" watchObservedRunningTime="2025-09-09 22:15:25.971266983 +0000 UTC m=+1.339423675" Sep 9 22:15:25.998127 kubelet[2917]: I0909 22:15:25.997714 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-rokxy.gb1.brightbox.com" podStartSLOduration=0.99769639 podStartE2EDuration="997.69639ms" podCreationTimestamp="2025-09-09 22:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:15:25.986254897 +0000 UTC m=+1.354411626" watchObservedRunningTime="2025-09-09 22:15:25.99769639 +0000 UTC m=+1.365853103" Sep 9 22:15:26.016004 kubelet[2917]: I0909 22:15:26.015464 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-rokxy.gb1.brightbox.com" podStartSLOduration=1.015448775 podStartE2EDuration="1.015448775s" podCreationTimestamp="2025-09-09 22:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:15:26.000388684 +0000 UTC m=+1.368545390" watchObservedRunningTime="2025-09-09 22:15:26.015448775 +0000 UTC m=+1.383605478" Sep 9 22:15:29.508094 kubelet[2917]: I0909 22:15:29.507584 2917 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 22:15:29.508904 containerd[1617]: time="2025-09-09T22:15:29.508711974Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 22:15:29.509425 kubelet[2917]: I0909 22:15:29.509013 2917 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 22:15:30.204902 systemd[1]: Created slice kubepods-besteffort-pod990bd276_98bf_4599_89df_fe6c768f508d.slice - libcontainer container kubepods-besteffort-pod990bd276_98bf_4599_89df_fe6c768f508d.slice. 
Sep 9 22:15:30.226752 kubelet[2917]: I0909 22:15:30.226673 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/990bd276-98bf-4599-89df-fe6c768f508d-kube-proxy\") pod \"kube-proxy-vdv55\" (UID: \"990bd276-98bf-4599-89df-fe6c768f508d\") " pod="kube-system/kube-proxy-vdv55" Sep 9 22:15:30.227978 kubelet[2917]: I0909 22:15:30.226809 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/990bd276-98bf-4599-89df-fe6c768f508d-xtables-lock\") pod \"kube-proxy-vdv55\" (UID: \"990bd276-98bf-4599-89df-fe6c768f508d\") " pod="kube-system/kube-proxy-vdv55" Sep 9 22:15:30.227978 kubelet[2917]: I0909 22:15:30.226869 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/990bd276-98bf-4599-89df-fe6c768f508d-lib-modules\") pod \"kube-proxy-vdv55\" (UID: \"990bd276-98bf-4599-89df-fe6c768f508d\") " pod="kube-system/kube-proxy-vdv55" Sep 9 22:15:30.227978 kubelet[2917]: I0909 22:15:30.226939 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqplc\" (UniqueName: \"kubernetes.io/projected/990bd276-98bf-4599-89df-fe6c768f508d-kube-api-access-rqplc\") pod \"kube-proxy-vdv55\" (UID: \"990bd276-98bf-4599-89df-fe6c768f508d\") " pod="kube-system/kube-proxy-vdv55" Sep 9 22:15:30.517822 containerd[1617]: time="2025-09-09T22:15:30.515501991Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vdv55,Uid:990bd276-98bf-4599-89df-fe6c768f508d,Namespace:kube-system,Attempt:0,}" Sep 9 22:15:30.574855 containerd[1617]: time="2025-09-09T22:15:30.574763213Z" level=info msg="connecting to shim d3148150ef7c99daeb04af98e100219111f3e830276443bc2946bf73ac2f7c56" address="unix:///run/containerd/s/2ed315dab34ee52e58a9d116c8dfc372df9d03340e0ed21139e5d6aa88296a76" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:15:30.629017 systemd[1]: Created slice kubepods-besteffort-podad422346_1daf_4cde_946a_0063afd4425f.slice - libcontainer container kubepods-besteffort-podad422346_1daf_4cde_946a_0063afd4425f.slice. Sep 9 22:15:30.631549 kubelet[2917]: I0909 22:15:30.631426 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ad422346-1daf-4cde-946a-0063afd4425f-var-lib-calico\") pod \"tigera-operator-755d956888-927gw\" (UID: \"ad422346-1daf-4cde-946a-0063afd4425f\") " pod="tigera-operator/tigera-operator-755d956888-927gw" Sep 9 22:15:30.632429 kubelet[2917]: I0909 22:15:30.632142 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd5kr\" (UniqueName: \"kubernetes.io/projected/ad422346-1daf-4cde-946a-0063afd4425f-kube-api-access-zd5kr\") pod \"tigera-operator-755d956888-927gw\" (UID: \"ad422346-1daf-4cde-946a-0063afd4425f\") " pod="tigera-operator/tigera-operator-755d956888-927gw" Sep 9 22:15:30.660051 systemd[1]: Started cri-containerd-d3148150ef7c99daeb04af98e100219111f3e830276443bc2946bf73ac2f7c56.scope - libcontainer container d3148150ef7c99daeb04af98e100219111f3e830276443bc2946bf73ac2f7c56. 
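Annotation: systemd names the pod's cgroup kubepods-besteffort-pod990bd276_98bf_4599_89df_fe6c768f508d.slice (just above), while the kubelet's volume entries refer to the same pod by UID 990bd276-98bf-4599-89df-fe6c768f508d. The slice names in this log are recoverable from the UID by swapping dashes for underscores; the helper below is hypothetical and only mirrors the naming observed here:

```go
// Hypothetical helper that rebuilds the besteffort slice names seen in this log
// from a pod UID; it simply reproduces the observed naming, nothing more.
package main

import (
	"fmt"
	"strings"
)

func besteffortSliceName(podUID string) string {
	return "kubepods-besteffort-pod" + strings.ReplaceAll(podUID, "-", "_") + ".slice"
}

func main() {
	// UID taken from the kube-proxy-vdv55 volume entries above.
	fmt.Println(besteffortSliceName("990bd276-98bf-4599-89df-fe6c768f508d"))
	// Prints: kubepods-besteffort-pod990bd276_98bf_4599_89df_fe6c768f508d.slice
}
```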
Sep 9 22:15:30.724205 containerd[1617]: time="2025-09-09T22:15:30.724146998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vdv55,Uid:990bd276-98bf-4599-89df-fe6c768f508d,Namespace:kube-system,Attempt:0,} returns sandbox id \"d3148150ef7c99daeb04af98e100219111f3e830276443bc2946bf73ac2f7c56\"" Sep 9 22:15:30.730454 containerd[1617]: time="2025-09-09T22:15:30.730403342Z" level=info msg="CreateContainer within sandbox \"d3148150ef7c99daeb04af98e100219111f3e830276443bc2946bf73ac2f7c56\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 22:15:30.749871 containerd[1617]: time="2025-09-09T22:15:30.746918435Z" level=info msg="Container 991d1148033138a9d33899c8c1d7d2bf738bf8cf2305a971b8920eb513eeb109: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:15:30.749497 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2117119983.mount: Deactivated successfully. Sep 9 22:15:30.761190 containerd[1617]: time="2025-09-09T22:15:30.761058841Z" level=info msg="CreateContainer within sandbox \"d3148150ef7c99daeb04af98e100219111f3e830276443bc2946bf73ac2f7c56\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"991d1148033138a9d33899c8c1d7d2bf738bf8cf2305a971b8920eb513eeb109\"" Sep 9 22:15:30.762164 containerd[1617]: time="2025-09-09T22:15:30.762076065Z" level=info msg="StartContainer for \"991d1148033138a9d33899c8c1d7d2bf738bf8cf2305a971b8920eb513eeb109\"" Sep 9 22:15:30.764038 containerd[1617]: time="2025-09-09T22:15:30.763934143Z" level=info msg="connecting to shim 991d1148033138a9d33899c8c1d7d2bf738bf8cf2305a971b8920eb513eeb109" address="unix:///run/containerd/s/2ed315dab34ee52e58a9d116c8dfc372df9d03340e0ed21139e5d6aa88296a76" protocol=ttrpc version=3 Sep 9 22:15:30.798038 systemd[1]: Started cri-containerd-991d1148033138a9d33899c8c1d7d2bf738bf8cf2305a971b8920eb513eeb109.scope - libcontainer container 991d1148033138a9d33899c8c1d7d2bf738bf8cf2305a971b8920eb513eeb109. Sep 9 22:15:30.864848 containerd[1617]: time="2025-09-09T22:15:30.864796546Z" level=info msg="StartContainer for \"991d1148033138a9d33899c8c1d7d2bf738bf8cf2305a971b8920eb513eeb109\" returns successfully" Sep 9 22:15:30.946605 containerd[1617]: time="2025-09-09T22:15:30.946539770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-927gw,Uid:ad422346-1daf-4cde-946a-0063afd4425f,Namespace:tigera-operator,Attempt:0,}" Sep 9 22:15:30.960142 kubelet[2917]: I0909 22:15:30.960023 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-vdv55" podStartSLOduration=0.96000429 podStartE2EDuration="960.00429ms" podCreationTimestamp="2025-09-09 22:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:15:30.958312106 +0000 UTC m=+6.326468817" watchObservedRunningTime="2025-09-09 22:15:30.96000429 +0000 UTC m=+6.328160993" Sep 9 22:15:30.992064 containerd[1617]: time="2025-09-09T22:15:30.991595418Z" level=info msg="connecting to shim 2472a3d7cb8012333b342f67b74851785e606c620ca699674acc8468ef4cec90" address="unix:///run/containerd/s/8097048fcccb1a14730a005e5547ff0219efa86506887a0939a80b28f2f6eb91" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:15:31.031982 systemd[1]: Started cri-containerd-2472a3d7cb8012333b342f67b74851785e606c620ca699674acc8468ef4cec90.scope - libcontainer container 2472a3d7cb8012333b342f67b74851785e606c620ca699674acc8468ef4cec90. 
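Annotation: the startup-latency entry for kube-proxy-vdv55 above shows firstStartedPulling and lastFinishedPulling as 0001-01-01 00:00:00 +0000 UTC. That is Go's zero time.Time, i.e. no image pull happened for this pod, not a pull that somehow ran in year 1. A one-line confirmation of the formatting:

```go
// Shows that an unset Go time.Time prints exactly the timestamp seen in the
// kube-proxy startup-latency entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	var never time.Time        // zero value: no pull was started or finished
	fmt.Println(never)          // 0001-01-01 00:00:00 +0000 UTC
	fmt.Println(never.IsZero()) // true
}
```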
Sep 9 22:15:31.129766 containerd[1617]: time="2025-09-09T22:15:31.129634196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-927gw,Uid:ad422346-1daf-4cde-946a-0063afd4425f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"2472a3d7cb8012333b342f67b74851785e606c620ca699674acc8468ef4cec90\"" Sep 9 22:15:31.135068 containerd[1617]: time="2025-09-09T22:15:31.135034557Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 22:15:33.058516 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2413232353.mount: Deactivated successfully. Sep 9 22:15:35.645981 containerd[1617]: time="2025-09-09T22:15:35.645905611Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:35.647186 containerd[1617]: time="2025-09-09T22:15:35.647014324Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 9 22:15:35.648062 containerd[1617]: time="2025-09-09T22:15:35.648025802Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:35.655399 containerd[1617]: time="2025-09-09T22:15:35.655344557Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:35.656828 containerd[1617]: time="2025-09-09T22:15:35.656718166Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 4.521636326s" Sep 9 22:15:35.656828 containerd[1617]: time="2025-09-09T22:15:35.656769701Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 9 22:15:35.661505 containerd[1617]: time="2025-09-09T22:15:35.660952157Z" level=info msg="CreateContainer within sandbox \"2472a3d7cb8012333b342f67b74851785e606c620ca699674acc8468ef4cec90\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 22:15:35.671566 containerd[1617]: time="2025-09-09T22:15:35.671361944Z" level=info msg="Container 82153c70900fb6d2af9d53adb4b2438fdba9afe2ee89aff8c19249262c9ac4c5: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:15:35.676054 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount484904554.mount: Deactivated successfully. 
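Annotation: containerd reports 25062609 bytes read for quay.io/tigera/operator:v1.38.6 and a pull that completed in 4.521636326s. A back-of-the-envelope effective throughput from exactly those two logged figures (ignoring any layers already present locally or fetched concurrently):

```go
// Rough effective pull rate computed from the two numbers in the containerd log
// above; purely arithmetic, not a containerd measurement.
package main

import "fmt"

func main() {
	const bytesRead = 25062609  // "active requests=0, bytes read=25062609"
	const seconds = 4.521636326 // "... in 4.521636326s"
	mib := float64(bytesRead) / (1024 * 1024)
	fmt.Printf("%.1f MiB in %.2f s ≈ %.2f MiB/s\n", mib, seconds, mib/seconds)
}
```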
Sep 9 22:15:35.686382 containerd[1617]: time="2025-09-09T22:15:35.686298147Z" level=info msg="CreateContainer within sandbox \"2472a3d7cb8012333b342f67b74851785e606c620ca699674acc8468ef4cec90\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"82153c70900fb6d2af9d53adb4b2438fdba9afe2ee89aff8c19249262c9ac4c5\"" Sep 9 22:15:35.689031 containerd[1617]: time="2025-09-09T22:15:35.688986487Z" level=info msg="StartContainer for \"82153c70900fb6d2af9d53adb4b2438fdba9afe2ee89aff8c19249262c9ac4c5\"" Sep 9 22:15:35.690165 containerd[1617]: time="2025-09-09T22:15:35.690133388Z" level=info msg="connecting to shim 82153c70900fb6d2af9d53adb4b2438fdba9afe2ee89aff8c19249262c9ac4c5" address="unix:///run/containerd/s/8097048fcccb1a14730a005e5547ff0219efa86506887a0939a80b28f2f6eb91" protocol=ttrpc version=3 Sep 9 22:15:35.722013 systemd[1]: Started cri-containerd-82153c70900fb6d2af9d53adb4b2438fdba9afe2ee89aff8c19249262c9ac4c5.scope - libcontainer container 82153c70900fb6d2af9d53adb4b2438fdba9afe2ee89aff8c19249262c9ac4c5. Sep 9 22:15:35.771300 containerd[1617]: time="2025-09-09T22:15:35.771252897Z" level=info msg="StartContainer for \"82153c70900fb6d2af9d53adb4b2438fdba9afe2ee89aff8c19249262c9ac4c5\" returns successfully" Sep 9 22:15:37.869201 kubelet[2917]: I0909 22:15:37.869042 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-927gw" podStartSLOduration=3.343557073 podStartE2EDuration="7.869020412s" podCreationTimestamp="2025-09-09 22:15:30 +0000 UTC" firstStartedPulling="2025-09-09 22:15:31.132730629 +0000 UTC m=+6.500887324" lastFinishedPulling="2025-09-09 22:15:35.658193974 +0000 UTC m=+11.026350663" observedRunningTime="2025-09-09 22:15:35.979022535 +0000 UTC m=+11.347179246" watchObservedRunningTime="2025-09-09 22:15:37.869020412 +0000 UTC m=+13.237177106" Sep 9 22:15:42.895676 sudo[1915]: pam_unix(sudo:session): session closed for user root Sep 9 22:15:43.049753 sshd[1914]: Connection closed by 139.178.68.195 port 36504 Sep 9 22:15:43.052582 sshd-session[1911]: pam_unix(sshd:session): session closed for user core Sep 9 22:15:43.061587 systemd[1]: sshd@9-10.230.51.18:22-139.178.68.195:36504.service: Deactivated successfully. Sep 9 22:15:43.061808 systemd-logind[1588]: Session 11 logged out. Waiting for processes to exit. Sep 9 22:15:43.070907 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 22:15:43.072183 systemd[1]: session-11.scope: Consumed 6.981s CPU time, 154.3M memory peak. Sep 9 22:15:43.081866 systemd-logind[1588]: Removed session 11. Sep 9 22:15:46.673799 systemd[1]: Started sshd@13-10.230.51.18:22-14.194.76.134:39651.service - OpenSSH per-connection server daemon (14.194.76.134:39651). Sep 9 22:15:47.229202 systemd[1]: Created slice kubepods-besteffort-podd6ba087c_a4db_4dda_8258_a1d19c3288b3.slice - libcontainer container kubepods-besteffort-podd6ba087c_a4db_4dda_8258_a1d19c3288b3.slice. 
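Annotation: for tigera-operator-755d956888-927gw the tracker above logs podStartE2EDuration=7.869020412s but podStartSLOduration=3.343557073s, and this time firstStartedPulling/lastFinishedPulling carry real timestamps. The logged numbers are consistent with the SLO figure being the end-to-end duration minus the image-pull window; a quick arithmetic check against the values in that entry (small rounding drift aside):

```go
// Re-derives the tigera-operator SLO duration from the other numbers in the
// same log entry; an arithmetic check, not kubelet code.
package main

import (
	"fmt"
	"time"
)

func mustParse(layout, s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	started := mustParse(layout, "2025-09-09 22:15:31.132730629 +0000 UTC")  // firstStartedPulling
	finished := mustParse(layout, "2025-09-09 22:15:35.658193974 +0000 UTC") // lastFinishedPulling
	pull := finished.Sub(started)

	e2e := 7869020412 * time.Nanosecond // podStartE2EDuration from the log
	fmt.Println("pull window:", pull)        // ≈ 4.525463345s
	fmt.Println("e2e - pull: ", e2e-pull)    // ≈ 3.343557067s
	fmt.Println("logged SLO : 3.343557073s") // agrees to within a few nanoseconds
}
```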
Sep 9 22:15:47.246579 kubelet[2917]: I0909 22:15:47.246139 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5lpb\" (UniqueName: \"kubernetes.io/projected/d6ba087c-a4db-4dda-8258-a1d19c3288b3-kube-api-access-z5lpb\") pod \"calico-typha-7cb5dbb58c-858hj\" (UID: \"d6ba087c-a4db-4dda-8258-a1d19c3288b3\") " pod="calico-system/calico-typha-7cb5dbb58c-858hj" Sep 9 22:15:47.247291 kubelet[2917]: I0909 22:15:47.247191 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6ba087c-a4db-4dda-8258-a1d19c3288b3-tigera-ca-bundle\") pod \"calico-typha-7cb5dbb58c-858hj\" (UID: \"d6ba087c-a4db-4dda-8258-a1d19c3288b3\") " pod="calico-system/calico-typha-7cb5dbb58c-858hj" Sep 9 22:15:47.247291 kubelet[2917]: I0909 22:15:47.247235 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d6ba087c-a4db-4dda-8258-a1d19c3288b3-typha-certs\") pod \"calico-typha-7cb5dbb58c-858hj\" (UID: \"d6ba087c-a4db-4dda-8258-a1d19c3288b3\") " pod="calico-system/calico-typha-7cb5dbb58c-858hj" Sep 9 22:15:47.503577 systemd[1]: Created slice kubepods-besteffort-podabdb43db_ee47_461a_a555_8668929fc8cc.slice - libcontainer container kubepods-besteffort-podabdb43db_ee47_461a_a555_8668929fc8cc.slice. Sep 9 22:15:47.536896 containerd[1617]: time="2025-09-09T22:15:47.536829744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7cb5dbb58c-858hj,Uid:d6ba087c-a4db-4dda-8258-a1d19c3288b3,Namespace:calico-system,Attempt:0,}" Sep 9 22:15:47.551307 kubelet[2917]: I0909 22:15:47.549641 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbhlg\" (UniqueName: \"kubernetes.io/projected/abdb43db-ee47-461a-a555-8668929fc8cc-kube-api-access-lbhlg\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.551307 kubelet[2917]: I0909 22:15:47.549894 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-cni-log-dir\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.551307 kubelet[2917]: I0909 22:15:47.549927 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-flexvol-driver-host\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.551307 kubelet[2917]: I0909 22:15:47.549952 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-policysync\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.551307 kubelet[2917]: I0909 22:15:47.550662 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-cni-net-dir\") pod \"calico-node-fk2fc\" (UID: 
\"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.552146 kubelet[2917]: I0909 22:15:47.550700 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-lib-modules\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.552146 kubelet[2917]: I0909 22:15:47.550741 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-var-run-calico\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.552146 kubelet[2917]: I0909 22:15:47.550770 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/abdb43db-ee47-461a-a555-8668929fc8cc-node-certs\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.552146 kubelet[2917]: I0909 22:15:47.550815 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-xtables-lock\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.552146 kubelet[2917]: I0909 22:15:47.550846 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abdb43db-ee47-461a-a555-8668929fc8cc-tigera-ca-bundle\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.552372 kubelet[2917]: I0909 22:15:47.550869 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-cni-bin-dir\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.552372 kubelet[2917]: I0909 22:15:47.550893 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/abdb43db-ee47-461a-a555-8668929fc8cc-var-lib-calico\") pod \"calico-node-fk2fc\" (UID: \"abdb43db-ee47-461a-a555-8668929fc8cc\") " pod="calico-system/calico-node-fk2fc" Sep 9 22:15:47.590517 containerd[1617]: time="2025-09-09T22:15:47.590453580Z" level=info msg="connecting to shim 4553f15ccf2d2b3675c854130867d0516359c348be8fa73e925dadccf921f867" address="unix:///run/containerd/s/a2b26acbd307f734dda1d97df7ba27f74c01a02beaa6e6cc72992bcb7aab570a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:15:47.645118 systemd[1]: Started cri-containerd-4553f15ccf2d2b3675c854130867d0516359c348be8fa73e925dadccf921f867.scope - libcontainer container 4553f15ccf2d2b3675c854130867d0516359c348be8fa73e925dadccf921f867. 
Sep 9 22:15:47.659936 kubelet[2917]: E0909 22:15:47.659883 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.660341 kubelet[2917]: W0909 22:15:47.659912 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.661256 kubelet[2917]: E0909 22:15:47.661227 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.671590 kubelet[2917]: E0909 22:15:47.671444 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.671590 kubelet[2917]: W0909 22:15:47.671485 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.671590 kubelet[2917]: E0909 22:15:47.671524 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.684538 kubelet[2917]: E0909 22:15:47.684374 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.684538 kubelet[2917]: W0909 22:15:47.684396 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.684538 kubelet[2917]: E0909 22:15:47.684544 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:15:47.769900 containerd[1617]: time="2025-09-09T22:15:47.768540349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7cb5dbb58c-858hj,Uid:d6ba087c-a4db-4dda-8258-a1d19c3288b3,Namespace:calico-system,Attempt:0,} returns sandbox id \"4553f15ccf2d2b3675c854130867d0516359c348be8fa73e925dadccf921f867\"" Sep 9 22:15:47.772827 containerd[1617]: time="2025-09-09T22:15:47.772642256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 22:15:47.811006 containerd[1617]: time="2025-09-09T22:15:47.810729394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fk2fc,Uid:abdb43db-ee47-461a-a555-8668929fc8cc,Namespace:calico-system,Attempt:0,}" Sep 9 22:15:47.856264 kubelet[2917]: E0909 22:15:47.855872 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nxmhd" podUID="ee89689e-04d0-4108-8d66-1a161973549c" Sep 9 22:15:47.879679 kubelet[2917]: I0909 22:15:47.879608 2917 status_manager.go:890] "Failed to get status for pod" podUID="ee89689e-04d0-4108-8d66-1a161973549c" pod="calico-system/csi-node-driver-nxmhd" err="pods \"csi-node-driver-nxmhd\" is forbidden: User \"system:node:srv-rokxy.gb1.brightbox.com\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'srv-rokxy.gb1.brightbox.com' and this object" Sep 9 22:15:47.908556 containerd[1617]: time="2025-09-09T22:15:47.908207551Z" level=info msg="connecting to shim 4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf" address="unix:///run/containerd/s/4e58339c1406230f49d9ffe76a393f434127ad8d535c374abcfc0586769aba3b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:15:47.932908 kubelet[2917]: E0909 22:15:47.932691 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.932908 kubelet[2917]: W0909 22:15:47.932724 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.932908 kubelet[2917]: E0909 22:15:47.932755 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.935443 kubelet[2917]: E0909 22:15:47.934040 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.935443 kubelet[2917]: W0909 22:15:47.934071 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.935443 kubelet[2917]: E0909 22:15:47.934098 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:15:47.936355 kubelet[2917]: E0909 22:15:47.935967 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.936355 kubelet[2917]: W0909 22:15:47.936001 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.936355 kubelet[2917]: E0909 22:15:47.936019 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.937048 kubelet[2917]: E0909 22:15:47.937029 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.938683 kubelet[2917]: W0909 22:15:47.937917 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.938683 kubelet[2917]: E0909 22:15:47.937944 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.939082 kubelet[2917]: E0909 22:15:47.938972 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.939082 kubelet[2917]: W0909 22:15:47.939009 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.939082 kubelet[2917]: E0909 22:15:47.939026 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.940741 kubelet[2917]: E0909 22:15:47.940478 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.940741 kubelet[2917]: W0909 22:15:47.940499 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.940741 kubelet[2917]: E0909 22:15:47.940541 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.941657 kubelet[2917]: E0909 22:15:47.941528 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.944377 kubelet[2917]: W0909 22:15:47.944135 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.944723 kubelet[2917]: E0909 22:15:47.944179 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:15:47.945394 kubelet[2917]: E0909 22:15:47.945279 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.945764 kubelet[2917]: W0909 22:15:47.945605 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.945764 kubelet[2917]: E0909 22:15:47.945629 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.947063 kubelet[2917]: E0909 22:15:47.946990 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.947613 kubelet[2917]: W0909 22:15:47.947243 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.947613 kubelet[2917]: E0909 22:15:47.947263 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.948676 kubelet[2917]: E0909 22:15:47.948657 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.949129 kubelet[2917]: W0909 22:15:47.948920 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.949129 kubelet[2917]: E0909 22:15:47.948940 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.951315 kubelet[2917]: E0909 22:15:47.951284 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.951541 kubelet[2917]: W0909 22:15:47.951421 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.951541 kubelet[2917]: E0909 22:15:47.951444 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.952238 kubelet[2917]: E0909 22:15:47.952220 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.952484 kubelet[2917]: W0909 22:15:47.952327 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.952484 kubelet[2917]: E0909 22:15:47.952350 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:15:47.952966 kubelet[2917]: E0909 22:15:47.952948 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.953256 kubelet[2917]: W0909 22:15:47.953053 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.953256 kubelet[2917]: E0909 22:15:47.953077 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.954661 kubelet[2917]: E0909 22:15:47.954622 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.954855 kubelet[2917]: W0909 22:15:47.954765 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.954855 kubelet[2917]: E0909 22:15:47.954789 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.956392 kubelet[2917]: E0909 22:15:47.956339 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.956690 kubelet[2917]: W0909 22:15:47.956519 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.956690 kubelet[2917]: E0909 22:15:47.956543 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.958192 kubelet[2917]: E0909 22:15:47.957918 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.958192 kubelet[2917]: W0909 22:15:47.957936 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.958192 kubelet[2917]: E0909 22:15:47.957952 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.958635 kubelet[2917]: E0909 22:15:47.958617 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.958969 kubelet[2917]: W0909 22:15:47.958725 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.958969 kubelet[2917]: E0909 22:15:47.958748 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:15:47.960066 kubelet[2917]: E0909 22:15:47.959739 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.960066 kubelet[2917]: W0909 22:15:47.959973 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.960066 kubelet[2917]: E0909 22:15:47.959995 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.962073 kubelet[2917]: E0909 22:15:47.961658 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.962073 kubelet[2917]: W0909 22:15:47.961677 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.962073 kubelet[2917]: E0909 22:15:47.961692 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.963367 kubelet[2917]: E0909 22:15:47.963200 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.963367 kubelet[2917]: W0909 22:15:47.963218 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.964840 kubelet[2917]: E0909 22:15:47.964792 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:47.965748 kubelet[2917]: E0909 22:15:47.965718 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:47.966248 kubelet[2917]: W0909 22:15:47.965907 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:47.966248 kubelet[2917]: E0909 22:15:47.965928 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Sep 9 22:15:47.967074 kubelet[2917]: I0909 22:15:47.965956 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee89689e-04d0-4108-8d66-1a161973549c-kubelet-dir\") pod \"csi-node-driver-nxmhd\" (UID: \"ee89689e-04d0-4108-8d66-1a161973549c\") " pod="calico-system/csi-node-driver-nxmhd"
Sep 9 22:15:47.970347 kubelet[2917]: I0909 22:15:47.969792 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ee89689e-04d0-4108-8d66-1a161973549c-registration-dir\") pod \"csi-node-driver-nxmhd\" (UID: \"ee89689e-04d0-4108-8d66-1a161973549c\") " pod="calico-system/csi-node-driver-nxmhd"
Sep 9 22:15:47.972629 kubelet[2917]: I0909 22:15:47.972220 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wc5z\" (UniqueName: \"kubernetes.io/projected/ee89689e-04d0-4108-8d66-1a161973549c-kube-api-access-2wc5z\") pod \"csi-node-driver-nxmhd\" (UID: \"ee89689e-04d0-4108-8d66-1a161973549c\") " pod="calico-system/csi-node-driver-nxmhd"
Sep 9 22:15:47.976170 kubelet[2917]: I0909 22:15:47.976156 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ee89689e-04d0-4108-8d66-1a161973549c-socket-dir\") pod \"csi-node-driver-nxmhd\" (UID: \"ee89689e-04d0-4108-8d66-1a161973549c\") " pod="calico-system/csi-node-driver-nxmhd"
Sep 9 22:15:47.983282 kubelet[2917]: I0909 22:15:47.983015 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ee89689e-04d0-4108-8d66-1a161973549c-varrun\") pod \"csi-node-driver-nxmhd\" (UID: \"ee89689e-04d0-4108-8d66-1a161973549c\") " pod="calico-system/csi-node-driver-nxmhd"
Sep 9 22:15:47.984289 systemd[1]: Started cri-containerd-4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf.scope - libcontainer container 4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf.
Sep 9 22:15:48.131372 sshd[3316]: Invalid user innova from 14.194.76.134 port 39651
Sep 9 22:15:48.209402 containerd[1617]: time="2025-09-09T22:15:48.209301566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fk2fc,Uid:abdb43db-ee47-461a-a555-8668929fc8cc,Namespace:calico-system,Attempt:0,} returns sandbox id \"4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf\""
Sep 9 22:15:48.400612 sshd[3316]: Received disconnect from 14.194.76.134 port 39651:11: Bye Bye [preauth]
Sep 9 22:15:48.400612 sshd[3316]: Disconnected from invalid user innova 14.194.76.134 port 39651 [preauth]
Sep 9 22:15:48.406497 systemd[1]: sshd@13-10.230.51.18:22-14.194.76.134:39651.service: Deactivated successfully.
Sep 9 22:15:49.400920 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1906001818.mount: Deactivated successfully.
Sep 9 22:15:49.868201 kubelet[2917]: E0909 22:15:49.866437 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nxmhd" podUID="ee89689e-04d0-4108-8d66-1a161973549c"
Sep 9 22:15:51.508641 containerd[1617]: time="2025-09-09T22:15:51.508586742Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:15:51.509653 containerd[1617]: time="2025-09-09T22:15:51.509623143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 9 22:15:51.510820 containerd[1617]: time="2025-09-09T22:15:51.510522931Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:15:51.512816 containerd[1617]: time="2025-09-09T22:15:51.512750984Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:15:51.513758 containerd[1617]: time="2025-09-09T22:15:51.513718470Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.741035105s"
Sep 9 22:15:51.513858 containerd[1617]: time="2025-09-09T22:15:51.513762309Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 9 22:15:51.515655 containerd[1617]: time="2025-09-09T22:15:51.515403985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 9 22:15:51.542691 containerd[1617]: time="2025-09-09T22:15:51.542306426Z" level=info msg="CreateContainer within sandbox \"4553f15ccf2d2b3675c854130867d0516359c348be8fa73e925dadccf921f867\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 9 22:15:51.575812 containerd[1617]: time="2025-09-09T22:15:51.575724471Z" level=info msg="Container 23345813d07de71d864fa6699472af5761850cc3857d7af3d44f5539c249fac2: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:15:51.592016 containerd[1617]: time="2025-09-09T22:15:51.591915714Z" level=info msg="CreateContainer within sandbox \"4553f15ccf2d2b3675c854130867d0516359c348be8fa73e925dadccf921f867\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"23345813d07de71d864fa6699472af5761850cc3857d7af3d44f5539c249fac2\""
Sep 9 22:15:51.593761 containerd[1617]: time="2025-09-09T22:15:51.593729608Z" level=info msg="StartContainer for \"23345813d07de71d864fa6699472af5761850cc3857d7af3d44f5539c249fac2\""
Sep 9 22:15:51.615630 containerd[1617]: time="2025-09-09T22:15:51.615586728Z" level=info msg="connecting to shim 23345813d07de71d864fa6699472af5761850cc3857d7af3d44f5539c249fac2" address="unix:///run/containerd/s/a2b26acbd307f734dda1d97df7ba27f74c01a02beaa6e6cc72992bcb7aab570a" protocol=ttrpc version=3
Sep 9 22:15:51.652312 systemd[1]: Started cri-containerd-23345813d07de71d864fa6699472af5761850cc3857d7af3d44f5539c249fac2.scope - libcontainer container 23345813d07de71d864fa6699472af5761850cc3857d7af3d44f5539c249fac2.
Sep 9 22:15:51.744999 containerd[1617]: time="2025-09-09T22:15:51.744674955Z" level=info msg="StartContainer for \"23345813d07de71d864fa6699472af5761850cc3857d7af3d44f5539c249fac2\" returns successfully"
Sep 9 22:15:51.866563 kubelet[2917]: E0909 22:15:51.866497 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nxmhd" podUID="ee89689e-04d0-4108-8d66-1a161973549c"
Sep 9 22:15:52.054233 kubelet[2917]: I0909 22:15:52.053519 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7cb5dbb58c-858hj" podStartSLOduration=1.310884237 podStartE2EDuration="5.053487751s" podCreationTimestamp="2025-09-09 22:15:47 +0000 UTC" firstStartedPulling="2025-09-09 22:15:47.772341234 +0000 UTC m=+23.140497924" lastFinishedPulling="2025-09-09 22:15:51.514944738 +0000 UTC m=+26.883101438" observedRunningTime="2025-09-09 22:15:52.052924273 +0000 UTC m=+27.421080973" watchObservedRunningTime="2025-09-09 22:15:52.053487751 +0000 UTC m=+27.421644445"
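The pod_startup_latency_tracker entry just above reports two durations for calico-typha-7cb5dbb58c-858hj that can be cross-checked against its own timestamps: the end-to-end figure is observedRunningTime minus podCreationTimestamp, while the SLO figure additionally excludes the image-pull window (lastFinishedPulling minus firstStartedPulling). Reading the decomposition off the printed values (a reading aid based on those fields, not a claim about the tracker's exact internals):

```
podStartE2EDuration ≈ 22:15:52.053 - 22:15:47.000 = 5.053 s
image-pull window   ≈ 22:15:51.515 - 22:15:47.772 = 3.743 s
podStartSLOduration ≈ 5.053 s - 3.743 s            = 1.310 s
```

which matches podStartE2EDuration="5.053487751s" and podStartSLOduration=1.310884237 up to rounding of the displayed timestamps.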
Sep 9 22:15:53.027094 kubelet[2917]: I0909 22:15:53.027038 2917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 22:15:53.082007 containerd[1617]: time="2025-09-09T22:15:53.081954375Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:15:53.083738 containerd[1617]: time="2025-09-09T22:15:53.083688559Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660"
Sep 9 22:15:53.084332 containerd[1617]: time="2025-09-09T22:15:53.084260325Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:15:53.088025 containerd[1617]: time="2025-09-09T22:15:53.087970067Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:15:53.089026 containerd[1617]: time="2025-09-09T22:15:53.088818498Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.573351518s"
Sep 9 22:15:53.089026 containerd[1617]: time="2025-09-09T22:15:53.088860523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\""
Sep 9 22:15:53.093955 containerd[1617]: time="2025-09-09T22:15:53.093912244Z" level=info msg="CreateContainer within sandbox \"4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 9 22:15:53.106105 containerd[1617]: time="2025-09-09T22:15:53.104008807Z" level=info msg="Container e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:15:53.109640 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount265587714.mount: Deactivated successfully.
Error: unexpected end of JSON input" Sep 9 22:15:53.120356 kubelet[2917]: E0909 22:15:53.120188 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:53.120356 kubelet[2917]: W0909 22:15:53.120200 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:53.120356 kubelet[2917]: E0909 22:15:53.120213 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:53.128321 containerd[1617]: time="2025-09-09T22:15:53.128108118Z" level=info msg="CreateContainer within sandbox \"4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df\"" Sep 9 22:15:53.130621 containerd[1617]: time="2025-09-09T22:15:53.129826895Z" level=info msg="StartContainer for \"e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df\"" Sep 9 22:15:53.133753 containerd[1617]: time="2025-09-09T22:15:53.133373062Z" level=info msg="connecting to shim e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df" address="unix:///run/containerd/s/4e58339c1406230f49d9ffe76a393f434127ad8d535c374abcfc0586769aba3b" protocol=ttrpc version=3 Sep 9 22:15:53.144585 kubelet[2917]: E0909 22:15:53.144403 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:53.144585 kubelet[2917]: W0909 22:15:53.144465 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:53.144585 kubelet[2917]: E0909 22:15:53.144507 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:53.145238 kubelet[2917]: E0909 22:15:53.145216 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:53.145342 kubelet[2917]: W0909 22:15:53.145264 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:53.145342 kubelet[2917]: E0909 22:15:53.145305 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:15:53.145738 kubelet[2917]: E0909 22:15:53.145629 2917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:15:53.145813 kubelet[2917]: W0909 22:15:53.145765 2917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:15:53.145915 kubelet[2917]: E0909 22:15:53.145826 2917 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:15:53.170274 systemd[1]: Started cri-containerd-e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df.scope - libcontainer container e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df. Sep 9 22:15:53.241449 containerd[1617]: time="2025-09-09T22:15:53.241354533Z" level=info msg="StartContainer for \"e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df\" returns successfully" Sep 9 22:15:53.253176 systemd[1]: cri-containerd-e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df.scope: Deactivated successfully. Sep 9 22:15:53.279180 containerd[1617]: time="2025-09-09T22:15:53.278608844Z" level=info msg="received exit event container_id:\"e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df\" id:\"e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df\" pid:3630 exited_at:{seconds:1757456153 nanos:256730163}" Sep 9 22:15:53.299862 containerd[1617]: time="2025-09-09T22:15:53.299806618Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df\" id:\"e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df\" pid:3630 exited_at:{seconds:1757456153 nanos:256730163}" Sep 9 22:15:53.348685 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e6d67f39dfc9f04a4f67066a43508721d1223cca18a37be207606c8db82848df-rootfs.mount: Deactivated successfully. Sep 9 22:15:53.865966 kubelet[2917]: E0909 22:15:53.865898 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nxmhd" podUID="ee89689e-04d0-4108-8d66-1a161973549c" Sep 9 22:15:54.036177 containerd[1617]: time="2025-09-09T22:15:54.035997786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 22:15:55.865863 kubelet[2917]: E0909 22:15:55.865775 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nxmhd" podUID="ee89689e-04d0-4108-8d66-1a161973549c" Sep 9 22:15:56.260362 kubelet[2917]: I0909 22:15:56.259670 2917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 22:15:57.867858 kubelet[2917]: E0909 22:15:57.866839 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nxmhd" podUID="ee89689e-04d0-4108-8d66-1a161973549c" Sep 9 22:15:59.266076 containerd[1617]: time="2025-09-09T22:15:59.265162787Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:59.266076 containerd[1617]: time="2025-09-09T22:15:59.266036524Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 22:15:59.266879 containerd[1617]: time="2025-09-09T22:15:59.266849017Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 
22:15:59.269020 containerd[1617]: time="2025-09-09T22:15:59.268960859Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:15:59.270037 containerd[1617]: time="2025-09-09T22:15:59.269991470Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 5.233898955s" Sep 9 22:15:59.270127 containerd[1617]: time="2025-09-09T22:15:59.270040719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 22:15:59.324077 containerd[1617]: time="2025-09-09T22:15:59.324028520Z" level=info msg="CreateContainer within sandbox \"4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 22:15:59.339214 containerd[1617]: time="2025-09-09T22:15:59.339156643Z" level=info msg="Container a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:15:59.347266 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount856559427.mount: Deactivated successfully. Sep 9 22:15:59.403237 containerd[1617]: time="2025-09-09T22:15:59.403194182Z" level=info msg="CreateContainer within sandbox \"4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e\"" Sep 9 22:15:59.405414 containerd[1617]: time="2025-09-09T22:15:59.405205526Z" level=info msg="StartContainer for \"a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e\"" Sep 9 22:15:59.408807 containerd[1617]: time="2025-09-09T22:15:59.408706550Z" level=info msg="connecting to shim a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e" address="unix:///run/containerd/s/4e58339c1406230f49d9ffe76a393f434127ad8d535c374abcfc0586769aba3b" protocol=ttrpc version=3 Sep 9 22:15:59.442369 systemd[1]: Started cri-containerd-a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e.scope - libcontainer container a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e. Sep 9 22:15:59.675031 containerd[1617]: time="2025-09-09T22:15:59.674966617Z" level=info msg="StartContainer for \"a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e\" returns successfully" Sep 9 22:15:59.866226 kubelet[2917]: E0909 22:15:59.866121 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nxmhd" podUID="ee89689e-04d0-4108-8d66-1a161973549c" Sep 9 22:16:00.700677 systemd[1]: cri-containerd-a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e.scope: Deactivated successfully. Sep 9 22:16:00.701081 systemd[1]: cri-containerd-a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e.scope: Consumed 710ms CPU time, 166.6M memory peak, 10.3M read from disk, 171.3M written to disk. 
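[Editor's note] The repeated "unexpected end of JSON input" / "executable file not found in $PATH" pairs earlier in this log come from the kubelet's FlexVolume prober: it walks /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, invokes each vendor~driver binary with `init`, and tries to unmarshal the command's stdout as a JSON status. The `nodeagent~uds/uds` binary does not exist yet (Calico's flexvol-driver init container, built from the pod2daemon-flexvol image pulled just above, is what installs it), so the call returns empty output and the unmarshal fails. The following is a minimal, hypothetical sketch of the JSON contract such a driver is conventionally expected to print; it is not the actual nodeagent~uds binary.

```go
// Hypothetical FlexVolume driver handling "init": a sketch of the JSON reply the
// kubelet's prober expects, not the real nodeagent~uds/uds executable.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the conventional FlexVolume response fields.
type driverStatus struct {
	Status       string          `json:"status"` // "Success", "Failure", or "Not supported"
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 {
		// Producing no output at all is exactly what yields "unexpected end of JSON input" above.
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// Tell the kubelet the driver exists but does not implement attach/detach.
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
	default:
		out, _ := json.Marshal(driverStatus{
			Status:  "Not supported",
			Message: fmt.Sprintf("operation %q not implemented in this sketch", os.Args[1]),
		})
		fmt.Println(string(out))
	}
}
```

Once the flexvol-driver init container started above (container e6d67f39…) has copied the real driver binary into that directory, these probe errors stop appearing.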
Sep 9 22:16:00.740358 kubelet[2917]: I0909 22:16:00.739674 2917 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 22:16:00.753931 containerd[1617]: time="2025-09-09T22:16:00.753884115Z" level=info msg="received exit event container_id:\"a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e\" id:\"a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e\" pid:3690 exited_at:{seconds:1757456160 nanos:743046532}" Sep 9 22:16:00.755035 containerd[1617]: time="2025-09-09T22:16:00.754171205Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e\" id:\"a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e\" pid:3690 exited_at:{seconds:1757456160 nanos:743046532}" Sep 9 22:16:00.833402 kubelet[2917]: W0909 22:16:00.833362 2917 reflector.go:569] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:srv-rokxy.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'srv-rokxy.gb1.brightbox.com' and this object Sep 9 22:16:00.834710 kubelet[2917]: E0909 22:16:00.834667 2917 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:srv-rokxy.gb1.brightbox.com\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'srv-rokxy.gb1.brightbox.com' and this object" logger="UnhandledError" Sep 9 22:16:00.837673 systemd[1]: Created slice kubepods-burstable-podae85377c_e3d4_4eb7_96d9_9c28e74f1766.slice - libcontainer container kubepods-burstable-podae85377c_e3d4_4eb7_96d9_9c28e74f1766.slice. Sep 9 22:16:00.842535 kubelet[2917]: I0909 22:16:00.842285 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wrtx\" (UniqueName: \"kubernetes.io/projected/ae85377c-e3d4-4eb7-96d9-9c28e74f1766-kube-api-access-6wrtx\") pod \"coredns-668d6bf9bc-66bzc\" (UID: \"ae85377c-e3d4-4eb7-96d9-9c28e74f1766\") " pod="kube-system/coredns-668d6bf9bc-66bzc" Sep 9 22:16:00.843554 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a2232c3bb69a5422320f0c59db93296eef93b81090288c8a867b2460398cf94e-rootfs.mount: Deactivated successfully. Sep 9 22:16:00.847807 kubelet[2917]: I0909 22:16:00.842618 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae85377c-e3d4-4eb7-96d9-9c28e74f1766-config-volume\") pod \"coredns-668d6bf9bc-66bzc\" (UID: \"ae85377c-e3d4-4eb7-96d9-9c28e74f1766\") " pod="kube-system/coredns-668d6bf9bc-66bzc" Sep 9 22:16:00.873843 systemd[1]: Created slice kubepods-besteffort-pod01ff96af_882d_4b97_9f66_cd447252b78e.slice - libcontainer container kubepods-besteffort-pod01ff96af_882d_4b97_9f66_cd447252b78e.slice. Sep 9 22:16:00.891701 systemd[1]: Created slice kubepods-burstable-poda8772658_fdec_4b6f_ba7d_ab8a7007aafd.slice - libcontainer container kubepods-burstable-poda8772658_fdec_4b6f_ba7d_ab8a7007aafd.slice. Sep 9 22:16:00.916090 systemd[1]: Created slice kubepods-besteffort-poda3c2f6ec_3026_47d2_b3a9_9d8d93bd8613.slice - libcontainer container kubepods-besteffort-poda3c2f6ec_3026_47d2_b3a9_9d8d93bd8613.slice. 
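[Editor's note] The reflector warning just above ("configmaps \"coredns\" is forbidden … no relationship found between node … and this object") is the node authorizer at work: a kubelet may read a ConfigMap only once a pod bound to that node references it, and immediately after the node becomes Ready there is a short window before that relationship is indexed, during which the watch is rejected and the later volume mount times out with "failed to sync configmap cache" (seen a few entries below, and retried after 500ms). A rough client-go sketch of telling that transient Forbidden apart from other failures follows; the in-cluster credentials and the hard-coded namespace/name are illustrative assumptions taken from the log text.

```go
// Sketch: distinguishing a node-authorizer/RBAC "Forbidden" from other Get errors.
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	_, err = cs.CoreV1().ConfigMaps("kube-system").Get(context.TODO(), "coredns", metav1.GetOptions{})
	switch {
	case err == nil:
		fmt.Println("coredns ConfigMap readable; the node-to-pod relationship is established")
	case apierrors.IsForbidden(err):
		// The transient state logged by the kubelet's reflector above.
		fmt.Println("forbidden: no pod on this node references the ConfigMap yet")
	default:
		fmt.Println("unexpected error:", err)
	}
}
```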
Sep 9 22:16:00.928930 systemd[1]: Created slice kubepods-besteffort-podefcf61b5_b781_4fd4_a290_db86c9cc791c.slice - libcontainer container kubepods-besteffort-podefcf61b5_b781_4fd4_a290_db86c9cc791c.slice. Sep 9 22:16:00.946052 systemd[1]: Created slice kubepods-besteffort-pod968c322d_47fa_4065_99b3_0c60d65ccbe3.slice - libcontainer container kubepods-besteffort-pod968c322d_47fa_4065_99b3_0c60d65ccbe3.slice. Sep 9 22:16:00.954047 kubelet[2917]: I0909 22:16:00.953819 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/968c322d-47fa-4065-99b3-0c60d65ccbe3-config\") pod \"goldmane-54d579b49d-hqktr\" (UID: \"968c322d-47fa-4065-99b3-0c60d65ccbe3\") " pod="calico-system/goldmane-54d579b49d-hqktr" Sep 9 22:16:00.954047 kubelet[2917]: I0909 22:16:00.953877 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/01ff96af-882d-4b97-9f66-cd447252b78e-whisker-backend-key-pair\") pod \"whisker-c5869dd9d-bl8z2\" (UID: \"01ff96af-882d-4b97-9f66-cd447252b78e\") " pod="calico-system/whisker-c5869dd9d-bl8z2" Sep 9 22:16:00.954962 kubelet[2917]: I0909 22:16:00.954839 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqvlm\" (UniqueName: \"kubernetes.io/projected/0ead3947-6dec-40c9-9eb3-9969c527f104-kube-api-access-dqvlm\") pod \"calico-apiserver-56d9fcb775-xls5d\" (UID: \"0ead3947-6dec-40c9-9eb3-9969c527f104\") " pod="calico-apiserver/calico-apiserver-56d9fcb775-xls5d" Sep 9 22:16:00.954962 kubelet[2917]: I0909 22:16:00.954884 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ff96af-882d-4b97-9f66-cd447252b78e-whisker-ca-bundle\") pod \"whisker-c5869dd9d-bl8z2\" (UID: \"01ff96af-882d-4b97-9f66-cd447252b78e\") " pod="calico-system/whisker-c5869dd9d-bl8z2" Sep 9 22:16:00.954962 kubelet[2917]: I0909 22:16:00.954928 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nshg8\" (UniqueName: \"kubernetes.io/projected/01ff96af-882d-4b97-9f66-cd447252b78e-kube-api-access-nshg8\") pod \"whisker-c5869dd9d-bl8z2\" (UID: \"01ff96af-882d-4b97-9f66-cd447252b78e\") " pod="calico-system/whisker-c5869dd9d-bl8z2" Sep 9 22:16:00.956101 kubelet[2917]: I0909 22:16:00.954964 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksctl\" (UniqueName: \"kubernetes.io/projected/efcf61b5-b781-4fd4-a290-db86c9cc791c-kube-api-access-ksctl\") pod \"calico-apiserver-56d9fcb775-78pgr\" (UID: \"efcf61b5-b781-4fd4-a290-db86c9cc791c\") " pod="calico-apiserver/calico-apiserver-56d9fcb775-78pgr" Sep 9 22:16:00.956101 kubelet[2917]: I0909 22:16:00.954989 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613-tigera-ca-bundle\") pod \"calico-kube-controllers-545dfbd8d5-xv6qv\" (UID: \"a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613\") " pod="calico-system/calico-kube-controllers-545dfbd8d5-xv6qv" Sep 9 22:16:00.956101 kubelet[2917]: I0909 22:16:00.955021 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5ml2\" (UniqueName: 
\"kubernetes.io/projected/a8772658-fdec-4b6f-ba7d-ab8a7007aafd-kube-api-access-s5ml2\") pod \"coredns-668d6bf9bc-hjrnt\" (UID: \"a8772658-fdec-4b6f-ba7d-ab8a7007aafd\") " pod="kube-system/coredns-668d6bf9bc-hjrnt" Sep 9 22:16:00.956101 kubelet[2917]: I0909 22:16:00.955048 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzkjx\" (UniqueName: \"kubernetes.io/projected/a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613-kube-api-access-zzkjx\") pod \"calico-kube-controllers-545dfbd8d5-xv6qv\" (UID: \"a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613\") " pod="calico-system/calico-kube-controllers-545dfbd8d5-xv6qv" Sep 9 22:16:00.956101 kubelet[2917]: I0909 22:16:00.956070 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/968c322d-47fa-4065-99b3-0c60d65ccbe3-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-hqktr\" (UID: \"968c322d-47fa-4065-99b3-0c60d65ccbe3\") " pod="calico-system/goldmane-54d579b49d-hqktr" Sep 9 22:16:00.957052 kubelet[2917]: I0909 22:16:00.956105 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0ead3947-6dec-40c9-9eb3-9969c527f104-calico-apiserver-certs\") pod \"calico-apiserver-56d9fcb775-xls5d\" (UID: \"0ead3947-6dec-40c9-9eb3-9969c527f104\") " pod="calico-apiserver/calico-apiserver-56d9fcb775-xls5d" Sep 9 22:16:00.957052 kubelet[2917]: I0909 22:16:00.956139 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/efcf61b5-b781-4fd4-a290-db86c9cc791c-calico-apiserver-certs\") pod \"calico-apiserver-56d9fcb775-78pgr\" (UID: \"efcf61b5-b781-4fd4-a290-db86c9cc791c\") " pod="calico-apiserver/calico-apiserver-56d9fcb775-78pgr" Sep 9 22:16:00.957052 kubelet[2917]: I0909 22:16:00.956169 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/968c322d-47fa-4065-99b3-0c60d65ccbe3-goldmane-key-pair\") pod \"goldmane-54d579b49d-hqktr\" (UID: \"968c322d-47fa-4065-99b3-0c60d65ccbe3\") " pod="calico-system/goldmane-54d579b49d-hqktr" Sep 9 22:16:00.957052 kubelet[2917]: I0909 22:16:00.956193 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvpx\" (UniqueName: \"kubernetes.io/projected/968c322d-47fa-4065-99b3-0c60d65ccbe3-kube-api-access-bfvpx\") pod \"goldmane-54d579b49d-hqktr\" (UID: \"968c322d-47fa-4065-99b3-0c60d65ccbe3\") " pod="calico-system/goldmane-54d579b49d-hqktr" Sep 9 22:16:00.957052 kubelet[2917]: I0909 22:16:00.956235 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8772658-fdec-4b6f-ba7d-ab8a7007aafd-config-volume\") pod \"coredns-668d6bf9bc-hjrnt\" (UID: \"a8772658-fdec-4b6f-ba7d-ab8a7007aafd\") " pod="kube-system/coredns-668d6bf9bc-hjrnt" Sep 9 22:16:00.963001 systemd[1]: Created slice kubepods-besteffort-pod0ead3947_6dec_40c9_9eb3_9969c527f104.slice - libcontainer container kubepods-besteffort-pod0ead3947_6dec_40c9_9eb3_9969c527f104.slice. 
Sep 9 22:16:01.120421 containerd[1617]: time="2025-09-09T22:16:01.120001594Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 22:16:01.194547 containerd[1617]: time="2025-09-09T22:16:01.194442901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c5869dd9d-bl8z2,Uid:01ff96af-882d-4b97-9f66-cd447252b78e,Namespace:calico-system,Attempt:0,}" Sep 9 22:16:01.223503 containerd[1617]: time="2025-09-09T22:16:01.223366172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-545dfbd8d5-xv6qv,Uid:a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613,Namespace:calico-system,Attempt:0,}" Sep 9 22:16:01.232659 systemd[1]: Started sshd@14-10.230.51.18:22-85.209.134.43:10372.service - OpenSSH per-connection server daemon (85.209.134.43:10372). Sep 9 22:16:01.265475 containerd[1617]: time="2025-09-09T22:16:01.265097734Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hqktr,Uid:968c322d-47fa-4065-99b3-0c60d65ccbe3,Namespace:calico-system,Attempt:0,}" Sep 9 22:16:01.281105 containerd[1617]: time="2025-09-09T22:16:01.281007931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56d9fcb775-78pgr,Uid:efcf61b5-b781-4fd4-a290-db86c9cc791c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 22:16:01.284461 containerd[1617]: time="2025-09-09T22:16:01.284401979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56d9fcb775-xls5d,Uid:0ead3947-6dec-40c9-9eb3-9969c527f104,Namespace:calico-apiserver,Attempt:0,}" Sep 9 22:16:01.534452 containerd[1617]: time="2025-09-09T22:16:01.534397595Z" level=error msg="Failed to destroy network for sandbox \"8d5c31cc08e8d52efa32e509b9cf4b01d96b0b7e404e5669072ed0d9225fc126\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.540631 containerd[1617]: time="2025-09-09T22:16:01.537664032Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c5869dd9d-bl8z2,Uid:01ff96af-882d-4b97-9f66-cd447252b78e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d5c31cc08e8d52efa32e509b9cf4b01d96b0b7e404e5669072ed0d9225fc126\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.542243 kubelet[2917]: E0909 22:16:01.542164 2917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d5c31cc08e8d52efa32e509b9cf4b01d96b0b7e404e5669072ed0d9225fc126\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.544025 kubelet[2917]: E0909 22:16:01.542771 2917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d5c31cc08e8d52efa32e509b9cf4b01d96b0b7e404e5669072ed0d9225fc126\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-c5869dd9d-bl8z2" Sep 9 22:16:01.544025 kubelet[2917]: E0909 22:16:01.543363 2917 kuberuntime_manager.go:1237] "CreatePodSandbox for 
pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d5c31cc08e8d52efa32e509b9cf4b01d96b0b7e404e5669072ed0d9225fc126\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-c5869dd9d-bl8z2" Sep 9 22:16:01.544025 kubelet[2917]: E0909 22:16:01.543483 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-c5869dd9d-bl8z2_calico-system(01ff96af-882d-4b97-9f66-cd447252b78e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-c5869dd9d-bl8z2_calico-system(01ff96af-882d-4b97-9f66-cd447252b78e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d5c31cc08e8d52efa32e509b9cf4b01d96b0b7e404e5669072ed0d9225fc126\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-c5869dd9d-bl8z2" podUID="01ff96af-882d-4b97-9f66-cd447252b78e" Sep 9 22:16:01.567441 containerd[1617]: time="2025-09-09T22:16:01.567352832Z" level=error msg="Failed to destroy network for sandbox \"3f660bd8b24e873c15c88407c7ff3fb25848ded27e9c834d1ebb2f28ace334b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.570610 containerd[1617]: time="2025-09-09T22:16:01.570570111Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-545dfbd8d5-xv6qv,Uid:a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f660bd8b24e873c15c88407c7ff3fb25848ded27e9c834d1ebb2f28ace334b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.571199 kubelet[2917]: E0909 22:16:01.571148 2917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f660bd8b24e873c15c88407c7ff3fb25848ded27e9c834d1ebb2f28ace334b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.572665 kubelet[2917]: E0909 22:16:01.572264 2917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f660bd8b24e873c15c88407c7ff3fb25848ded27e9c834d1ebb2f28ace334b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-545dfbd8d5-xv6qv" Sep 9 22:16:01.572665 kubelet[2917]: E0909 22:16:01.572304 2917 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f660bd8b24e873c15c88407c7ff3fb25848ded27e9c834d1ebb2f28ace334b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/calico-kube-controllers-545dfbd8d5-xv6qv" Sep 9 22:16:01.572665 kubelet[2917]: E0909 22:16:01.572370 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-545dfbd8d5-xv6qv_calico-system(a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-545dfbd8d5-xv6qv_calico-system(a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f660bd8b24e873c15c88407c7ff3fb25848ded27e9c834d1ebb2f28ace334b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-545dfbd8d5-xv6qv" podUID="a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613" Sep 9 22:16:01.581998 containerd[1617]: time="2025-09-09T22:16:01.581954118Z" level=error msg="Failed to destroy network for sandbox \"b2c9689d10f9e09af585d15465e8a7f31815a238a10e72fcfcf4db6282d26c34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.583141 containerd[1617]: time="2025-09-09T22:16:01.582963660Z" level=error msg="Failed to destroy network for sandbox \"8fb4ac68a2158e8e1d48f7e1e887babc2569b9abd5a1528e9be6c177c892cb88\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.584058 containerd[1617]: time="2025-09-09T22:16:01.584009548Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hqktr,Uid:968c322d-47fa-4065-99b3-0c60d65ccbe3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2c9689d10f9e09af585d15465e8a7f31815a238a10e72fcfcf4db6282d26c34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.584469 kubelet[2917]: E0909 22:16:01.584390 2917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2c9689d10f9e09af585d15465e8a7f31815a238a10e72fcfcf4db6282d26c34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.584756 kubelet[2917]: E0909 22:16:01.584591 2917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2c9689d10f9e09af585d15465e8a7f31815a238a10e72fcfcf4db6282d26c34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-hqktr" Sep 9 22:16:01.584756 kubelet[2917]: E0909 22:16:01.584680 2917 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2c9689d10f9e09af585d15465e8a7f31815a238a10e72fcfcf4db6282d26c34\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-hqktr" Sep 9 22:16:01.585076 kubelet[2917]: E0909 22:16:01.584999 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-hqktr_calico-system(968c322d-47fa-4065-99b3-0c60d65ccbe3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-hqktr_calico-system(968c322d-47fa-4065-99b3-0c60d65ccbe3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2c9689d10f9e09af585d15465e8a7f31815a238a10e72fcfcf4db6282d26c34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-hqktr" podUID="968c322d-47fa-4065-99b3-0c60d65ccbe3" Sep 9 22:16:01.586501 containerd[1617]: time="2025-09-09T22:16:01.586452145Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56d9fcb775-xls5d,Uid:0ead3947-6dec-40c9-9eb3-9969c527f104,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fb4ac68a2158e8e1d48f7e1e887babc2569b9abd5a1528e9be6c177c892cb88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.586774 kubelet[2917]: E0909 22:16:01.586707 2917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fb4ac68a2158e8e1d48f7e1e887babc2569b9abd5a1528e9be6c177c892cb88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.586974 kubelet[2917]: E0909 22:16:01.586769 2917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fb4ac68a2158e8e1d48f7e1e887babc2569b9abd5a1528e9be6c177c892cb88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56d9fcb775-xls5d" Sep 9 22:16:01.587068 kubelet[2917]: E0909 22:16:01.586982 2917 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fb4ac68a2158e8e1d48f7e1e887babc2569b9abd5a1528e9be6c177c892cb88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56d9fcb775-xls5d" Sep 9 22:16:01.587068 kubelet[2917]: E0909 22:16:01.587048 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56d9fcb775-xls5d_calico-apiserver(0ead3947-6dec-40c9-9eb3-9969c527f104)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56d9fcb775-xls5d_calico-apiserver(0ead3947-6dec-40c9-9eb3-9969c527f104)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"8fb4ac68a2158e8e1d48f7e1e887babc2569b9abd5a1528e9be6c177c892cb88\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56d9fcb775-xls5d" podUID="0ead3947-6dec-40c9-9eb3-9969c527f104" Sep 9 22:16:01.595514 containerd[1617]: time="2025-09-09T22:16:01.595344530Z" level=error msg="Failed to destroy network for sandbox \"61e97b0bdff6f26a1a26eef85a1c78f8513e28bb369db69272ca37fdfa9e4c24\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.596710 containerd[1617]: time="2025-09-09T22:16:01.596640625Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56d9fcb775-78pgr,Uid:efcf61b5-b781-4fd4-a290-db86c9cc791c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"61e97b0bdff6f26a1a26eef85a1c78f8513e28bb369db69272ca37fdfa9e4c24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.597344 kubelet[2917]: E0909 22:16:01.596984 2917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61e97b0bdff6f26a1a26eef85a1c78f8513e28bb369db69272ca37fdfa9e4c24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.597344 kubelet[2917]: E0909 22:16:01.597062 2917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61e97b0bdff6f26a1a26eef85a1c78f8513e28bb369db69272ca37fdfa9e4c24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56d9fcb775-78pgr" Sep 9 22:16:01.597344 kubelet[2917]: E0909 22:16:01.597103 2917 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61e97b0bdff6f26a1a26eef85a1c78f8513e28bb369db69272ca37fdfa9e4c24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56d9fcb775-78pgr" Sep 9 22:16:01.597478 kubelet[2917]: E0909 22:16:01.597167 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56d9fcb775-78pgr_calico-apiserver(efcf61b5-b781-4fd4-a290-db86c9cc791c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56d9fcb775-78pgr_calico-apiserver(efcf61b5-b781-4fd4-a290-db86c9cc791c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"61e97b0bdff6f26a1a26eef85a1c78f8513e28bb369db69272ca37fdfa9e4c24\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-56d9fcb775-78pgr" podUID="efcf61b5-b781-4fd4-a290-db86c9cc791c" Sep 9 22:16:01.827035 sshd[3742]: Received disconnect from 85.209.134.43 port 10372:11: Bye Bye [preauth] Sep 9 22:16:01.827035 sshd[3742]: Disconnected from authenticating user root 85.209.134.43 port 10372 [preauth] Sep 9 22:16:01.829445 systemd[1]: sshd@14-10.230.51.18:22-85.209.134.43:10372.service: Deactivated successfully. Sep 9 22:16:01.878639 systemd[1]: Created slice kubepods-besteffort-podee89689e_04d0_4108_8d66_1a161973549c.slice - libcontainer container kubepods-besteffort-podee89689e_04d0_4108_8d66_1a161973549c.slice. Sep 9 22:16:01.886038 containerd[1617]: time="2025-09-09T22:16:01.885958040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nxmhd,Uid:ee89689e-04d0-4108-8d66-1a161973549c,Namespace:calico-system,Attempt:0,}" Sep 9 22:16:01.957766 kubelet[2917]: E0909 22:16:01.957609 2917 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Sep 9 22:16:01.959109 kubelet[2917]: E0909 22:16:01.958434 2917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae85377c-e3d4-4eb7-96d9-9c28e74f1766-config-volume podName:ae85377c-e3d4-4eb7-96d9-9c28e74f1766 nodeName:}" failed. No retries permitted until 2025-09-09 22:16:02.457964607 +0000 UTC m=+37.826121289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/ae85377c-e3d4-4eb7-96d9-9c28e74f1766-config-volume") pod "coredns-668d6bf9bc-66bzc" (UID: "ae85377c-e3d4-4eb7-96d9-9c28e74f1766") : failed to sync configmap cache: timed out waiting for the condition Sep 9 22:16:01.985827 containerd[1617]: time="2025-09-09T22:16:01.985669398Z" level=error msg="Failed to destroy network for sandbox \"902ed8a567cea1d724b626df47bccf506a6c6301b0da84f19b9e9d4972879efa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.990001 systemd[1]: run-netns-cni\x2dc0d1f0f7\x2d2dd9\x2dc389\x2dc969\x2d5cf1fed8055b.mount: Deactivated successfully. 
Sep 9 22:16:01.991350 containerd[1617]: time="2025-09-09T22:16:01.991194736Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nxmhd,Uid:ee89689e-04d0-4108-8d66-1a161973549c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"902ed8a567cea1d724b626df47bccf506a6c6301b0da84f19b9e9d4972879efa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.991642 kubelet[2917]: E0909 22:16:01.991566 2917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"902ed8a567cea1d724b626df47bccf506a6c6301b0da84f19b9e9d4972879efa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:01.991729 kubelet[2917]: E0909 22:16:01.991664 2917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"902ed8a567cea1d724b626df47bccf506a6c6301b0da84f19b9e9d4972879efa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nxmhd" Sep 9 22:16:01.991729 kubelet[2917]: E0909 22:16:01.991713 2917 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"902ed8a567cea1d724b626df47bccf506a6c6301b0da84f19b9e9d4972879efa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nxmhd" Sep 9 22:16:01.991877 kubelet[2917]: E0909 22:16:01.991814 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-nxmhd_calico-system(ee89689e-04d0-4108-8d66-1a161973549c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-nxmhd_calico-system(ee89689e-04d0-4108-8d66-1a161973549c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"902ed8a567cea1d724b626df47bccf506a6c6301b0da84f19b9e9d4972879efa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-nxmhd" podUID="ee89689e-04d0-4108-8d66-1a161973549c" Sep 9 22:16:02.068238 kubelet[2917]: E0909 22:16:02.068087 2917 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Sep 9 22:16:02.068238 kubelet[2917]: E0909 22:16:02.068228 2917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a8772658-fdec-4b6f-ba7d-ab8a7007aafd-config-volume podName:a8772658-fdec-4b6f-ba7d-ab8a7007aafd nodeName:}" failed. No retries permitted until 2025-09-09 22:16:02.568197032 +0000 UTC m=+37.936353726 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/a8772658-fdec-4b6f-ba7d-ab8a7007aafd-config-volume") pod "coredns-668d6bf9bc-hjrnt" (UID: "a8772658-fdec-4b6f-ba7d-ab8a7007aafd") : failed to sync configmap cache: timed out waiting for the condition Sep 9 22:16:02.667169 containerd[1617]: time="2025-09-09T22:16:02.667071414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-66bzc,Uid:ae85377c-e3d4-4eb7-96d9-9c28e74f1766,Namespace:kube-system,Attempt:0,}" Sep 9 22:16:02.721831 containerd[1617]: time="2025-09-09T22:16:02.721437833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hjrnt,Uid:a8772658-fdec-4b6f-ba7d-ab8a7007aafd,Namespace:kube-system,Attempt:0,}" Sep 9 22:16:02.818871 containerd[1617]: time="2025-09-09T22:16:02.818755911Z" level=error msg="Failed to destroy network for sandbox \"6636335cfbb01ca01eb640fda238cae5ad35a27a0d1c526171c6e169d9147eb2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:02.823699 containerd[1617]: time="2025-09-09T22:16:02.823657823Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-66bzc,Uid:ae85377c-e3d4-4eb7-96d9-9c28e74f1766,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6636335cfbb01ca01eb640fda238cae5ad35a27a0d1c526171c6e169d9147eb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:02.824241 kubelet[2917]: E0909 22:16:02.824178 2917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6636335cfbb01ca01eb640fda238cae5ad35a27a0d1c526171c6e169d9147eb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:02.824358 kubelet[2917]: E0909 22:16:02.824256 2917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6636335cfbb01ca01eb640fda238cae5ad35a27a0d1c526171c6e169d9147eb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-66bzc" Sep 9 22:16:02.824358 kubelet[2917]: E0909 22:16:02.824291 2917 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6636335cfbb01ca01eb640fda238cae5ad35a27a0d1c526171c6e169d9147eb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-66bzc" Sep 9 22:16:02.824516 kubelet[2917]: E0909 22:16:02.824354 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-66bzc_kube-system(ae85377c-e3d4-4eb7-96d9-9c28e74f1766)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-66bzc_kube-system(ae85377c-e3d4-4eb7-96d9-9c28e74f1766)\\\": rpc error: 
code = Unknown desc = failed to setup network for sandbox \\\"6636335cfbb01ca01eb640fda238cae5ad35a27a0d1c526171c6e169d9147eb2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-66bzc" podUID="ae85377c-e3d4-4eb7-96d9-9c28e74f1766" Sep 9 22:16:02.831930 containerd[1617]: time="2025-09-09T22:16:02.831853459Z" level=error msg="Failed to destroy network for sandbox \"37f63548d31f3a8f03a7a13dcfefe4bfd385bc004393d1bdcfecfab86a846372\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:02.833383 containerd[1617]: time="2025-09-09T22:16:02.833343173Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hjrnt,Uid:a8772658-fdec-4b6f-ba7d-ab8a7007aafd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"37f63548d31f3a8f03a7a13dcfefe4bfd385bc004393d1bdcfecfab86a846372\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:02.834818 kubelet[2917]: E0909 22:16:02.833624 2917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37f63548d31f3a8f03a7a13dcfefe4bfd385bc004393d1bdcfecfab86a846372\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:16:02.834818 kubelet[2917]: E0909 22:16:02.833689 2917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37f63548d31f3a8f03a7a13dcfefe4bfd385bc004393d1bdcfecfab86a846372\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hjrnt" Sep 9 22:16:02.834818 kubelet[2917]: E0909 22:16:02.833758 2917 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37f63548d31f3a8f03a7a13dcfefe4bfd385bc004393d1bdcfecfab86a846372\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hjrnt" Sep 9 22:16:02.835033 kubelet[2917]: E0909 22:16:02.833884 2917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-hjrnt_kube-system(a8772658-fdec-4b6f-ba7d-ab8a7007aafd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-hjrnt_kube-system(a8772658-fdec-4b6f-ba7d-ab8a7007aafd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"37f63548d31f3a8f03a7a13dcfefe4bfd385bc004393d1bdcfecfab86a846372\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-hjrnt" 
podUID="a8772658-fdec-4b6f-ba7d-ab8a7007aafd" Sep 9 22:16:02.843929 systemd[1]: run-netns-cni\x2d13951636\x2dd149\x2def93\x2d0f62\x2d48048b43926e.mount: Deactivated successfully. Sep 9 22:16:02.844071 systemd[1]: run-netns-cni\x2dc734694e\x2d1e0e\x2dcb75\x2dc70d\x2dc1f8003b7708.mount: Deactivated successfully. Sep 9 22:16:04.354237 systemd[1]: Started sshd@15-10.230.51.18:22-103.146.52.252:59458.service - OpenSSH per-connection server daemon (103.146.52.252:59458). Sep 9 22:16:05.321239 sshd[3946]: Received disconnect from 103.146.52.252 port 59458:11: Bye Bye [preauth] Sep 9 22:16:05.321239 sshd[3946]: Disconnected from authenticating user sshd 103.146.52.252 port 59458 [preauth] Sep 9 22:16:05.324488 systemd[1]: sshd@15-10.230.51.18:22-103.146.52.252:59458.service: Deactivated successfully. Sep 9 22:16:09.545071 systemd[1]: Started sshd@16-10.230.51.18:22-172.245.45.194:54496.service - OpenSSH per-connection server daemon (172.245.45.194:54496). Sep 9 22:16:10.483376 sshd[3956]: Invalid user slave from 172.245.45.194 port 54496 Sep 9 22:16:10.643396 sshd[3956]: Received disconnect from 172.245.45.194 port 54496:11: Bye Bye [preauth] Sep 9 22:16:10.643396 sshd[3956]: Disconnected from invalid user slave 172.245.45.194 port 54496 [preauth] Sep 9 22:16:10.648372 systemd[1]: sshd@16-10.230.51.18:22-172.245.45.194:54496.service: Deactivated successfully. Sep 9 22:16:11.994324 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount708819928.mount: Deactivated successfully. Sep 9 22:16:12.156717 containerd[1617]: time="2025-09-09T22:16:12.125245736Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 22:16:12.161384 containerd[1617]: time="2025-09-09T22:16:12.161334757Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:12.186285 containerd[1617]: time="2025-09-09T22:16:12.186211980Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:12.188913 containerd[1617]: time="2025-09-09T22:16:12.188858128Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:12.190483 containerd[1617]: time="2025-09-09T22:16:12.189884187Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 11.069827576s" Sep 9 22:16:12.190483 containerd[1617]: time="2025-09-09T22:16:12.189932889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 22:16:12.221001 containerd[1617]: time="2025-09-09T22:16:12.220928549Z" level=info msg="CreateContainer within sandbox \"4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 22:16:12.262774 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount636203957.mount: Deactivated successfully. 
Sep 9 22:16:12.263655 containerd[1617]: time="2025-09-09T22:16:12.263481105Z" level=info msg="Container cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:12.299646 containerd[1617]: time="2025-09-09T22:16:12.299550966Z" level=info msg="CreateContainer within sandbox \"4d08ce43e3f3a96ea00dbde202bf6108a585f6a8ac837b6accfbc8c48386c8cf\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\"" Sep 9 22:16:12.300648 containerd[1617]: time="2025-09-09T22:16:12.300329797Z" level=info msg="StartContainer for \"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\"" Sep 9 22:16:12.307747 containerd[1617]: time="2025-09-09T22:16:12.307506599Z" level=info msg="connecting to shim cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99" address="unix:///run/containerd/s/4e58339c1406230f49d9ffe76a393f434127ad8d535c374abcfc0586769aba3b" protocol=ttrpc version=3 Sep 9 22:16:12.467041 systemd[1]: Started cri-containerd-cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99.scope - libcontainer container cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99. Sep 9 22:16:12.546088 containerd[1617]: time="2025-09-09T22:16:12.546029463Z" level=info msg="StartContainer for \"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\" returns successfully" Sep 9 22:16:12.684608 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 22:16:12.685350 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 9 22:16:12.871667 containerd[1617]: time="2025-09-09T22:16:12.871096011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56d9fcb775-xls5d,Uid:0ead3947-6dec-40c9-9eb3-9969c527f104,Namespace:calico-apiserver,Attempt:0,}" Sep 9 22:16:12.872540 containerd[1617]: time="2025-09-09T22:16:12.872424595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nxmhd,Uid:ee89689e-04d0-4108-8d66-1a161973549c,Namespace:calico-system,Attempt:0,}" Sep 9 22:16:13.160821 kubelet[2917]: I0909 22:16:13.160327 2917 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ff96af-882d-4b97-9f66-cd447252b78e-whisker-ca-bundle\") pod \"01ff96af-882d-4b97-9f66-cd447252b78e\" (UID: \"01ff96af-882d-4b97-9f66-cd447252b78e\") " Sep 9 22:16:13.160821 kubelet[2917]: I0909 22:16:13.160413 2917 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/01ff96af-882d-4b97-9f66-cd447252b78e-whisker-backend-key-pair\") pod \"01ff96af-882d-4b97-9f66-cd447252b78e\" (UID: \"01ff96af-882d-4b97-9f66-cd447252b78e\") " Sep 9 22:16:13.160821 kubelet[2917]: I0909 22:16:13.160445 2917 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nshg8\" (UniqueName: \"kubernetes.io/projected/01ff96af-882d-4b97-9f66-cd447252b78e-kube-api-access-nshg8\") pod \"01ff96af-882d-4b97-9f66-cd447252b78e\" (UID: \"01ff96af-882d-4b97-9f66-cd447252b78e\") " Sep 9 22:16:13.162869 kubelet[2917]: I0909 22:16:13.162181 2917 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ff96af-882d-4b97-9f66-cd447252b78e-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "01ff96af-882d-4b97-9f66-cd447252b78e" 
(UID: "01ff96af-882d-4b97-9f66-cd447252b78e"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 22:16:13.179158 kubelet[2917]: I0909 22:16:13.178833 2917 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ff96af-882d-4b97-9f66-cd447252b78e-kube-api-access-nshg8" (OuterVolumeSpecName: "kube-api-access-nshg8") pod "01ff96af-882d-4b97-9f66-cd447252b78e" (UID: "01ff96af-882d-4b97-9f66-cd447252b78e"). InnerVolumeSpecName "kube-api-access-nshg8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 22:16:13.179478 kubelet[2917]: I0909 22:16:13.179042 2917 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ff96af-882d-4b97-9f66-cd447252b78e-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "01ff96af-882d-4b97-9f66-cd447252b78e" (UID: "01ff96af-882d-4b97-9f66-cd447252b78e"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 22:16:13.181449 systemd[1]: var-lib-kubelet-pods-01ff96af\x2d882d\x2d4b97\x2d9f66\x2dcd447252b78e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dnshg8.mount: Deactivated successfully. Sep 9 22:16:13.181618 systemd[1]: var-lib-kubelet-pods-01ff96af\x2d882d\x2d4b97\x2d9f66\x2dcd447252b78e-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 22:16:13.231792 systemd[1]: Removed slice kubepods-besteffort-pod01ff96af_882d_4b97_9f66_cd447252b78e.slice - libcontainer container kubepods-besteffort-pod01ff96af_882d_4b97_9f66_cd447252b78e.slice. Sep 9 22:16:13.261407 kubelet[2917]: I0909 22:16:13.261349 2917 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/01ff96af-882d-4b97-9f66-cd447252b78e-whisker-backend-key-pair\") on node \"srv-rokxy.gb1.brightbox.com\" DevicePath \"\"" Sep 9 22:16:13.261938 kubelet[2917]: I0909 22:16:13.261392 2917 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nshg8\" (UniqueName: \"kubernetes.io/projected/01ff96af-882d-4b97-9f66-cd447252b78e-kube-api-access-nshg8\") on node \"srv-rokxy.gb1.brightbox.com\" DevicePath \"\"" Sep 9 22:16:13.261938 kubelet[2917]: I0909 22:16:13.261867 2917 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01ff96af-882d-4b97-9f66-cd447252b78e-whisker-ca-bundle\") on node \"srv-rokxy.gb1.brightbox.com\" DevicePath \"\"" Sep 9 22:16:13.266670 kubelet[2917]: I0909 22:16:13.266207 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-fk2fc" podStartSLOduration=2.287578467 podStartE2EDuration="26.266181309s" podCreationTimestamp="2025-09-09 22:15:47 +0000 UTC" firstStartedPulling="2025-09-09 22:15:48.212642726 +0000 UTC m=+23.580799407" lastFinishedPulling="2025-09-09 22:16:12.191245568 +0000 UTC m=+47.559402249" observedRunningTime="2025-09-09 22:16:13.263580989 +0000 UTC m=+48.631737721" watchObservedRunningTime="2025-09-09 22:16:13.266181309 +0000 UTC m=+48.634338005" Sep 9 22:16:13.393483 systemd[1]: Created slice kubepods-besteffort-pod1dbecbb3_bfb5_435c_a27f_5c04e7bb44b8.slice - libcontainer container kubepods-besteffort-pod1dbecbb3_bfb5_435c_a27f_5c04e7bb44b8.slice. 
Sep 9 22:16:13.463225 kubelet[2917]: I0909 22:16:13.462600 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7749\" (UniqueName: \"kubernetes.io/projected/1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8-kube-api-access-b7749\") pod \"whisker-fcd8795b9-v7sjs\" (UID: \"1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8\") " pod="calico-system/whisker-fcd8795b9-v7sjs" Sep 9 22:16:13.463459 kubelet[2917]: I0909 22:16:13.463433 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8-whisker-backend-key-pair\") pod \"whisker-fcd8795b9-v7sjs\" (UID: \"1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8\") " pod="calico-system/whisker-fcd8795b9-v7sjs" Sep 9 22:16:13.463654 kubelet[2917]: I0909 22:16:13.463619 2917 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8-whisker-ca-bundle\") pod \"whisker-fcd8795b9-v7sjs\" (UID: \"1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8\") " pod="calico-system/whisker-fcd8795b9-v7sjs" Sep 9 22:16:13.644203 systemd-networkd[1513]: cali216f14f2fb4: Link UP Sep 9 22:16:13.645853 systemd-networkd[1513]: cali216f14f2fb4: Gained carrier Sep 9 22:16:13.682971 containerd[1617]: 2025-09-09 22:16:12.957 [INFO][4012] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 22:16:13.682971 containerd[1617]: 2025-09-09 22:16:13.050 [INFO][4012] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0 csi-node-driver- calico-system ee89689e-04d0-4108-8d66-1a161973549c 743 0 2025-09-09 22:15:47 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-rokxy.gb1.brightbox.com csi-node-driver-nxmhd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali216f14f2fb4 [] [] }} ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Namespace="calico-system" Pod="csi-node-driver-nxmhd" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-" Sep 9 22:16:13.682971 containerd[1617]: 2025-09-09 22:16:13.050 [INFO][4012] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Namespace="calico-system" Pod="csi-node-driver-nxmhd" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" Sep 9 22:16:13.682971 containerd[1617]: 2025-09-09 22:16:13.351 [INFO][4036] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" HandleID="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Workload="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.352 [INFO][4036] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" 
HandleID="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Workload="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103a90), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-rokxy.gb1.brightbox.com", "pod":"csi-node-driver-nxmhd", "timestamp":"2025-09-09 22:16:13.351047007 +0000 UTC"}, Hostname:"srv-rokxy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.352 [INFO][4036] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.352 [INFO][4036] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.353 [INFO][4036] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-rokxy.gb1.brightbox.com' Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.451 [INFO][4036] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.493 [INFO][4036] ipam/ipam.go 394: Looking up existing affinities for host host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.502 [INFO][4036] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.504 [INFO][4036] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.684431 containerd[1617]: 2025-09-09 22:16:13.513 [INFO][4036] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.684929 containerd[1617]: 2025-09-09 22:16:13.513 [INFO][4036] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.684929 containerd[1617]: 2025-09-09 22:16:13.516 [INFO][4036] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606 Sep 9 22:16:13.684929 containerd[1617]: 2025-09-09 22:16:13.522 [INFO][4036] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.685069 containerd[1617]: 2025-09-09 22:16:13.528 [ERROR][4036] ipam/customresource.go 184: Error updating resource Key=IPAMBlock(192-168-61-0-26) Name="192-168-61-0-26" Resource="IPAMBlocks" Value=&v3.IPAMBlock{TypeMeta:v1.TypeMeta{Kind:"IPAMBlock", APIVersion:"crd.projectcalico.org/v1"}, ObjectMeta:v1.ObjectMeta{Name:"192-168-61-0-26", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), 
ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.IPAMBlockSpec{CIDR:"192.168.61.0/26", Affinity:(*string)(0xc00031b920), Allocations:[]*int{(*int)(0xc000312158), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil)}, Unallocated:[]int{1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63}, Attributes:[]v3.AllocationAttribute{v3.AllocationAttribute{AttrPrimary:(*string)(0xc000103a90), AttrSecondary:map[string]string{"namespace":"calico-system", "node":"srv-rokxy.gb1.brightbox.com", "pod":"csi-node-driver-nxmhd", "timestamp":"2025-09-09 22:16:13.351047007 +0000 UTC"}}}, SequenceNumber:0x1863bd14105bf0f9, SequenceNumberForAllocation:map[string]uint64{"0":0x1863bd14105bf0f8}, Deleted:false, DeprecatedStrictAffinity:false}} error=Operation cannot be fulfilled on ipamblocks.crd.projectcalico.org "192-168-61-0-26": the object has been modified; please apply your changes to the latest version and try again Sep 9 22:16:13.685069 containerd[1617]: 2025-09-09 22:16:13.528 [INFO][4036] ipam/ipam.go 1247: Failed to update block block=192.168.61.0/26 error=update conflict: IPAMBlock(192-168-61-0-26) handle="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.685069 containerd[1617]: 2025-09-09 22:16:13.556 [INFO][4036] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.685069 containerd[1617]: 2025-09-09 22:16:13.558 [INFO][4036] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606 Sep 9 22:16:13.685069 containerd[1617]: 2025-09-09 22:16:13.566 [INFO][4036] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.685069 containerd[1617]: 2025-09-09 22:16:13.574 [INFO][4036] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.1/26] block=192.168.61.0/26 handle="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.685069 containerd[1617]: 2025-09-09 22:16:13.574 [INFO][4036] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.1/26] handle="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.685069 containerd[1617]: 
2025-09-09 22:16:13.574 [INFO][4036] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 22:16:13.685069 containerd[1617]: 2025-09-09 22:16:13.574 [INFO][4036] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.1/26] IPv6=[] ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" HandleID="k8s-pod-network.0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Workload="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" Sep 9 22:16:13.686058 containerd[1617]: 2025-09-09 22:16:13.590 [INFO][4012] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Namespace="calico-system" Pod="csi-node-driver-nxmhd" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ee89689e-04d0-4108-8d66-1a161973549c", ResourceVersion:"743", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-nxmhd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali216f14f2fb4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:13.686058 containerd[1617]: 2025-09-09 22:16:13.591 [INFO][4012] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.1/32] ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Namespace="calico-system" Pod="csi-node-driver-nxmhd" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" Sep 9 22:16:13.686058 containerd[1617]: 2025-09-09 22:16:13.591 [INFO][4012] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali216f14f2fb4 ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Namespace="calico-system" Pod="csi-node-driver-nxmhd" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" Sep 9 22:16:13.686058 containerd[1617]: 2025-09-09 22:16:13.648 [INFO][4012] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Namespace="calico-system" Pod="csi-node-driver-nxmhd" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" Sep 9 22:16:13.686058 containerd[1617]: 2025-09-09 22:16:13.649 [INFO][4012] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Namespace="calico-system" Pod="csi-node-driver-nxmhd" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ee89689e-04d0-4108-8d66-1a161973549c", ResourceVersion:"743", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606", Pod:"csi-node-driver-nxmhd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali216f14f2fb4", MAC:"76:69:a1:ca:3e:fd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:13.686058 containerd[1617]: 2025-09-09 22:16:13.677 [INFO][4012] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" Namespace="calico-system" Pod="csi-node-driver-nxmhd" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-csi--node--driver--nxmhd-eth0" Sep 9 22:16:13.703872 containerd[1617]: time="2025-09-09T22:16:13.703174967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fcd8795b9-v7sjs,Uid:1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8,Namespace:calico-system,Attempt:0,}" Sep 9 22:16:13.711940 systemd-networkd[1513]: cali5935bf87076: Link UP Sep 9 22:16:13.713810 systemd-networkd[1513]: cali5935bf87076: Gained carrier Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.022 [INFO][4010] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.080 [INFO][4010] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0 calico-apiserver-56d9fcb775- calico-apiserver 0ead3947-6dec-40c9-9eb3-9969c527f104 812 0 2025-09-09 22:15:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56d9fcb775 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-rokxy.gb1.brightbox.com calico-apiserver-56d9fcb775-xls5d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] 
cali5935bf87076 [] [] }} ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-xls5d" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.080 [INFO][4010] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-xls5d" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.350 [INFO][4038] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" HandleID="k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.352 [INFO][4038] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" HandleID="k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e7b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-rokxy.gb1.brightbox.com", "pod":"calico-apiserver-56d9fcb775-xls5d", "timestamp":"2025-09-09 22:16:13.350152745 +0000 UTC"}, Hostname:"srv-rokxy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.352 [INFO][4038] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.578 [INFO][4038] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.578 [INFO][4038] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-rokxy.gb1.brightbox.com' Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.634 [INFO][4038] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.654 [INFO][4038] ipam/ipam.go 394: Looking up existing affinities for host host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.661 [INFO][4038] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.664 [INFO][4038] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.667 [INFO][4038] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.668 [INFO][4038] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.672 [INFO][4038] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.683 [INFO][4038] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.700 [INFO][4038] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.2/26] block=192.168.61.0/26 handle="k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.700 [INFO][4038] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.2/26] handle="k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.700 [INFO][4038] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 22:16:13.751385 containerd[1617]: 2025-09-09 22:16:13.700 [INFO][4038] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.2/26] IPv6=[] ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" HandleID="k8s-pod-network.fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" Sep 9 22:16:13.752926 containerd[1617]: 2025-09-09 22:16:13.708 [INFO][4010] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-xls5d" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0", GenerateName:"calico-apiserver-56d9fcb775-", Namespace:"calico-apiserver", SelfLink:"", UID:"0ead3947-6dec-40c9-9eb3-9969c527f104", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56d9fcb775", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-56d9fcb775-xls5d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5935bf87076", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:13.752926 containerd[1617]: 2025-09-09 22:16:13.708 [INFO][4010] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.2/32] ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-xls5d" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" Sep 9 22:16:13.752926 containerd[1617]: 2025-09-09 22:16:13.708 [INFO][4010] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5935bf87076 ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-xls5d" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" Sep 9 22:16:13.752926 containerd[1617]: 2025-09-09 22:16:13.713 [INFO][4010] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-xls5d" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" Sep 9 22:16:13.752926 containerd[1617]: 2025-09-09 22:16:13.714 
[INFO][4010] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-xls5d" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0", GenerateName:"calico-apiserver-56d9fcb775-", Namespace:"calico-apiserver", SelfLink:"", UID:"0ead3947-6dec-40c9-9eb3-9969c527f104", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56d9fcb775", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da", Pod:"calico-apiserver-56d9fcb775-xls5d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5935bf87076", MAC:"66:ec:dd:da:7f:e3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:13.752926 containerd[1617]: 2025-09-09 22:16:13.740 [INFO][4010] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-xls5d" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--xls5d-eth0" Sep 9 22:16:13.871657 containerd[1617]: time="2025-09-09T22:16:13.871603373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-545dfbd8d5-xv6qv,Uid:a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613,Namespace:calico-system,Attempt:0,}" Sep 9 22:16:13.873160 containerd[1617]: time="2025-09-09T22:16:13.872892679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-66bzc,Uid:ae85377c-e3d4-4eb7-96d9-9c28e74f1766,Namespace:kube-system,Attempt:0,}" Sep 9 22:16:13.888332 containerd[1617]: time="2025-09-09T22:16:13.888270180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hqktr,Uid:968c322d-47fa-4065-99b3-0c60d65ccbe3,Namespace:calico-system,Attempt:0,}" Sep 9 22:16:13.929889 containerd[1617]: time="2025-09-09T22:16:13.929823877Z" level=info msg="connecting to shim fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da" address="unix:///run/containerd/s/b1fcd1f6f589cf5f4f413f1686eba6f7ae000d73b05be1ee62c3d9d140e3adca" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:16:13.950892 containerd[1617]: time="2025-09-09T22:16:13.950660858Z" level=info msg="connecting to shim 
0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606" address="unix:///run/containerd/s/36eb65b1dac9c3851af444a1378c4387442438c1c54d0608b43476794c4cd95c" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:16:14.079568 systemd[1]: Started cri-containerd-fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da.scope - libcontainer container fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da. Sep 9 22:16:14.138178 systemd[1]: Started cri-containerd-0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606.scope - libcontainer container 0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606. Sep 9 22:16:14.190640 systemd-networkd[1513]: cali12e3540d1e7: Link UP Sep 9 22:16:14.192193 systemd-networkd[1513]: cali12e3540d1e7: Gained carrier Sep 9 22:16:14.285143 kubelet[2917]: I0909 22:16:14.285069 2917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.784 [INFO][4074] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.814 [INFO][4074] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0 whisker-fcd8795b9- calico-system 1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8 880 0 2025-09-09 22:16:13 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:fcd8795b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-rokxy.gb1.brightbox.com whisker-fcd8795b9-v7sjs eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali12e3540d1e7 [] [] }} ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Namespace="calico-system" Pod="whisker-fcd8795b9-v7sjs" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.814 [INFO][4074] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Namespace="calico-system" Pod="whisker-fcd8795b9-v7sjs" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.929 [INFO][4093] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" HandleID="k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Workload="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.930 [INFO][4093] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" HandleID="k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Workload="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f730), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-rokxy.gb1.brightbox.com", "pod":"whisker-fcd8795b9-v7sjs", "timestamp":"2025-09-09 22:16:13.929473413 +0000 UTC"}, Hostname:"srv-rokxy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.930 [INFO][4093] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.930 [INFO][4093] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.930 [INFO][4093] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-rokxy.gb1.brightbox.com' Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.961 [INFO][4093] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:13.973 [INFO][4093] ipam/ipam.go 394: Looking up existing affinities for host host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.005 [INFO][4093] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.030 [INFO][4093] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.044 [INFO][4093] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.044 [INFO][4093] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.051 [INFO][4093] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.071 [INFO][4093] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.158 [INFO][4093] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.3/26] block=192.168.61.0/26 handle="k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.158 [INFO][4093] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.3/26] handle="k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.158 [INFO][4093] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 22:16:14.307086 containerd[1617]: 2025-09-09 22:16:14.158 [INFO][4093] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.3/26] IPv6=[] ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" HandleID="k8s-pod-network.5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Workload="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" Sep 9 22:16:14.309302 containerd[1617]: 2025-09-09 22:16:14.182 [INFO][4074] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Namespace="calico-system" Pod="whisker-fcd8795b9-v7sjs" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0", GenerateName:"whisker-fcd8795b9-", Namespace:"calico-system", SelfLink:"", UID:"1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 16, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fcd8795b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"", Pod:"whisker-fcd8795b9-v7sjs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali12e3540d1e7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:14.309302 containerd[1617]: 2025-09-09 22:16:14.182 [INFO][4074] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.3/32] ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Namespace="calico-system" Pod="whisker-fcd8795b9-v7sjs" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" Sep 9 22:16:14.309302 containerd[1617]: 2025-09-09 22:16:14.183 [INFO][4074] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali12e3540d1e7 ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Namespace="calico-system" Pod="whisker-fcd8795b9-v7sjs" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" Sep 9 22:16:14.309302 containerd[1617]: 2025-09-09 22:16:14.193 [INFO][4074] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Namespace="calico-system" Pod="whisker-fcd8795b9-v7sjs" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" Sep 9 22:16:14.309302 containerd[1617]: 2025-09-09 22:16:14.195 [INFO][4074] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Namespace="calico-system" 
Pod="whisker-fcd8795b9-v7sjs" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0", GenerateName:"whisker-fcd8795b9-", Namespace:"calico-system", SelfLink:"", UID:"1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 16, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fcd8795b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd", Pod:"whisker-fcd8795b9-v7sjs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali12e3540d1e7", MAC:"ca:aa:0a:62:b2:55", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:14.309302 containerd[1617]: 2025-09-09 22:16:14.283 [INFO][4074] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" Namespace="calico-system" Pod="whisker-fcd8795b9-v7sjs" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-whisker--fcd8795b9--v7sjs-eth0" Sep 9 22:16:14.430725 containerd[1617]: time="2025-09-09T22:16:14.430666042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nxmhd,Uid:ee89689e-04d0-4108-8d66-1a161973549c,Namespace:calico-system,Attempt:0,} returns sandbox id \"0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606\"" Sep 9 22:16:14.434673 containerd[1617]: time="2025-09-09T22:16:14.434402999Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 22:16:14.444617 containerd[1617]: time="2025-09-09T22:16:14.442412991Z" level=info msg="connecting to shim 5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd" address="unix:///run/containerd/s/dcc52075aec8305c2dd29a179a41b62abde363b420c94ffcc091fb2a5fcccf48" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:16:14.565984 systemd-networkd[1513]: calic94b42d9566: Link UP Sep 9 22:16:14.566580 systemd-networkd[1513]: calic94b42d9566: Gained carrier Sep 9 22:16:14.569252 systemd[1]: Started cri-containerd-5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd.scope - libcontainer container 5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd. 
Sep 9 22:16:14.631676 containerd[1617]: time="2025-09-09T22:16:14.630835101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56d9fcb775-xls5d,Uid:0ead3947-6dec-40c9-9eb3-9969c527f104,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da\"" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.033 [INFO][4111] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.076 [INFO][4111] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0 calico-kube-controllers-545dfbd8d5- calico-system a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613 813 0 2025-09-09 22:15:47 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:545dfbd8d5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-rokxy.gb1.brightbox.com calico-kube-controllers-545dfbd8d5-xv6qv eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic94b42d9566 [] [] }} ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Namespace="calico-system" Pod="calico-kube-controllers-545dfbd8d5-xv6qv" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.077 [INFO][4111] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Namespace="calico-system" Pod="calico-kube-controllers-545dfbd8d5-xv6qv" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.320 [INFO][4205] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" HandleID="k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.321 [INFO][4205] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" HandleID="k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000315850), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-rokxy.gb1.brightbox.com", "pod":"calico-kube-controllers-545dfbd8d5-xv6qv", "timestamp":"2025-09-09 22:16:14.320266737 +0000 UTC"}, Hostname:"srv-rokxy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.321 [INFO][4205] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.322 [INFO][4205] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.322 [INFO][4205] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-rokxy.gb1.brightbox.com' Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.355 [INFO][4205] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.399 [INFO][4205] ipam/ipam.go 394: Looking up existing affinities for host host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.438 [INFO][4205] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.447 [INFO][4205] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.456 [INFO][4205] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.456 [INFO][4205] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.465 [INFO][4205] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329 Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.488 [INFO][4205] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.531 [INFO][4205] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.4/26] block=192.168.61.0/26 handle="k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.531 [INFO][4205] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.4/26] handle="k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.531 [INFO][4205] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 22:16:14.652003 containerd[1617]: 2025-09-09 22:16:14.531 [INFO][4205] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.4/26] IPv6=[] ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" HandleID="k8s-pod-network.7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" Sep 9 22:16:14.654722 containerd[1617]: 2025-09-09 22:16:14.544 [INFO][4111] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Namespace="calico-system" Pod="calico-kube-controllers-545dfbd8d5-xv6qv" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0", GenerateName:"calico-kube-controllers-545dfbd8d5-", Namespace:"calico-system", SelfLink:"", UID:"a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"545dfbd8d5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-545dfbd8d5-xv6qv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic94b42d9566", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:14.654722 containerd[1617]: 2025-09-09 22:16:14.547 [INFO][4111] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.4/32] ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Namespace="calico-system" Pod="calico-kube-controllers-545dfbd8d5-xv6qv" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" Sep 9 22:16:14.654722 containerd[1617]: 2025-09-09 22:16:14.548 [INFO][4111] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic94b42d9566 ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Namespace="calico-system" Pod="calico-kube-controllers-545dfbd8d5-xv6qv" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" Sep 9 22:16:14.654722 containerd[1617]: 2025-09-09 22:16:14.576 [INFO][4111] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Namespace="calico-system" Pod="calico-kube-controllers-545dfbd8d5-xv6qv" 
WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" Sep 9 22:16:14.654722 containerd[1617]: 2025-09-09 22:16:14.596 [INFO][4111] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Namespace="calico-system" Pod="calico-kube-controllers-545dfbd8d5-xv6qv" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0", GenerateName:"calico-kube-controllers-545dfbd8d5-", Namespace:"calico-system", SelfLink:"", UID:"a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"545dfbd8d5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329", Pod:"calico-kube-controllers-545dfbd8d5-xv6qv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic94b42d9566", MAC:"02:46:32:25:26:1d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:14.654722 containerd[1617]: 2025-09-09 22:16:14.645 [INFO][4111] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" Namespace="calico-system" Pod="calico-kube-controllers-545dfbd8d5-xv6qv" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--kube--controllers--545dfbd8d5--xv6qv-eth0" Sep 9 22:16:14.699310 containerd[1617]: time="2025-09-09T22:16:14.699151903Z" level=info msg="connecting to shim 7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329" address="unix:///run/containerd/s/6c8d4af71f8607ce9bae63299e85573dcd71630f5dd71801bb6c9292edb9d7e9" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:16:14.782986 systemd[1]: Started cri-containerd-7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329.scope - libcontainer container 7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329. 
Sep 9 22:16:14.783548 systemd-networkd[1513]: cali5d3941f2d7d: Link UP Sep 9 22:16:14.785203 systemd-networkd[1513]: cali5d3941f2d7d: Gained carrier Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.191 [INFO][4144] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.291 [INFO][4144] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0 goldmane-54d579b49d- calico-system 968c322d-47fa-4065-99b3-0c60d65ccbe3 811 0 2025-09-09 22:15:46 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-rokxy.gb1.brightbox.com goldmane-54d579b49d-hqktr eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5d3941f2d7d [] [] }} ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Namespace="calico-system" Pod="goldmane-54d579b49d-hqktr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.291 [INFO][4144] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Namespace="calico-system" Pod="goldmane-54d579b49d-hqktr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.556 [INFO][4242] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" HandleID="k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Workload="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.565 [INFO][4242] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" HandleID="k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Workload="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000115900), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-rokxy.gb1.brightbox.com", "pod":"goldmane-54d579b49d-hqktr", "timestamp":"2025-09-09 22:16:14.556086526 +0000 UTC"}, Hostname:"srv-rokxy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.575 [INFO][4242] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.575 [INFO][4242] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.575 [INFO][4242] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-rokxy.gb1.brightbox.com' Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.651 [INFO][4242] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.703 [INFO][4242] ipam/ipam.go 394: Looking up existing affinities for host host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.720 [INFO][4242] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.726 [INFO][4242] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.735 [INFO][4242] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.735 [INFO][4242] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.741 [INFO][4242] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1 Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.753 [INFO][4242] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.764 [INFO][4242] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.5/26] block=192.168.61.0/26 handle="k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.767 [INFO][4242] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.5/26] handle="k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.767 [INFO][4242] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 22:16:14.834971 containerd[1617]: 2025-09-09 22:16:14.767 [INFO][4242] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.5/26] IPv6=[] ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" HandleID="k8s-pod-network.f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Workload="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" Sep 9 22:16:14.837477 containerd[1617]: 2025-09-09 22:16:14.774 [INFO][4144] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Namespace="calico-system" Pod="goldmane-54d579b49d-hqktr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"968c322d-47fa-4065-99b3-0c60d65ccbe3", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-54d579b49d-hqktr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5d3941f2d7d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:14.837477 containerd[1617]: 2025-09-09 22:16:14.774 [INFO][4144] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.5/32] ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Namespace="calico-system" Pod="goldmane-54d579b49d-hqktr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" Sep 9 22:16:14.837477 containerd[1617]: 2025-09-09 22:16:14.774 [INFO][4144] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5d3941f2d7d ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Namespace="calico-system" Pod="goldmane-54d579b49d-hqktr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" Sep 9 22:16:14.837477 containerd[1617]: 2025-09-09 22:16:14.790 [INFO][4144] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Namespace="calico-system" Pod="goldmane-54d579b49d-hqktr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" Sep 9 22:16:14.837477 containerd[1617]: 2025-09-09 22:16:14.795 [INFO][4144] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" 
Namespace="calico-system" Pod="goldmane-54d579b49d-hqktr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"968c322d-47fa-4065-99b3-0c60d65ccbe3", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1", Pod:"goldmane-54d579b49d-hqktr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5d3941f2d7d", MAC:"7a:ec:15:8c:d8:af", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:14.837477 containerd[1617]: 2025-09-09 22:16:14.824 [INFO][4144] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" Namespace="calico-system" Pod="goldmane-54d579b49d-hqktr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-goldmane--54d579b49d--hqktr-eth0" Sep 9 22:16:14.875151 containerd[1617]: time="2025-09-09T22:16:14.875089878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hjrnt,Uid:a8772658-fdec-4b6f-ba7d-ab8a7007aafd,Namespace:kube-system,Attempt:0,}" Sep 9 22:16:14.876291 kubelet[2917]: I0909 22:16:14.876010 2917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ff96af-882d-4b97-9f66-cd447252b78e" path="/var/lib/kubelet/pods/01ff96af-882d-4b97-9f66-cd447252b78e/volumes" Sep 9 22:16:14.877336 containerd[1617]: time="2025-09-09T22:16:14.876877689Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56d9fcb775-78pgr,Uid:efcf61b5-b781-4fd4-a290-db86c9cc791c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 22:16:14.961483 systemd-networkd[1513]: calicabafeea74e: Link UP Sep 9 22:16:14.967374 systemd-networkd[1513]: calicabafeea74e: Gained carrier Sep 9 22:16:14.975413 containerd[1617]: time="2025-09-09T22:16:14.974768642Z" level=info msg="connecting to shim f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1" address="unix:///run/containerd/s/db303a7554cf1a6fc29bd85aa9ac146346dd6a7cdb6f22a134342b85ad911b6e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.122 [INFO][4122] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.240 [INFO][4122] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0 coredns-668d6bf9bc- kube-system ae85377c-e3d4-4eb7-96d9-9c28e74f1766 805 0 2025-09-09 22:15:30 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-rokxy.gb1.brightbox.com coredns-668d6bf9bc-66bzc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calicabafeea74e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Namespace="kube-system" Pod="coredns-668d6bf9bc-66bzc" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.240 [INFO][4122] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Namespace="kube-system" Pod="coredns-668d6bf9bc-66bzc" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.626 [INFO][4225] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" HandleID="k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Workload="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.627 [INFO][4225] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" HandleID="k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Workload="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103950), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-rokxy.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-66bzc", "timestamp":"2025-09-09 22:16:14.626553142 +0000 UTC"}, Hostname:"srv-rokxy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.628 [INFO][4225] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.767 [INFO][4225] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.768 [INFO][4225] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-rokxy.gb1.brightbox.com' Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.791 [INFO][4225] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.821 [INFO][4225] ipam/ipam.go 394: Looking up existing affinities for host host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.844 [INFO][4225] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.856 [INFO][4225] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.861 [INFO][4225] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.861 [INFO][4225] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.871 [INFO][4225] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970 Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.891 [INFO][4225] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.923 [INFO][4225] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.6/26] block=192.168.61.0/26 handle="k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.923 [INFO][4225] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.6/26] handle="k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.924 [INFO][4225] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 22:16:15.045412 containerd[1617]: 2025-09-09 22:16:14.924 [INFO][4225] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.6/26] IPv6=[] ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" HandleID="k8s-pod-network.6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Workload="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" Sep 9 22:16:15.046397 containerd[1617]: 2025-09-09 22:16:14.938 [INFO][4122] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Namespace="kube-system" Pod="coredns-668d6bf9bc-66bzc" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ae85377c-e3d4-4eb7-96d9-9c28e74f1766", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-66bzc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicabafeea74e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:15.046397 containerd[1617]: 2025-09-09 22:16:14.943 [INFO][4122] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.6/32] ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Namespace="kube-system" Pod="coredns-668d6bf9bc-66bzc" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" Sep 9 22:16:15.046397 containerd[1617]: 2025-09-09 22:16:14.944 [INFO][4122] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicabafeea74e ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Namespace="kube-system" Pod="coredns-668d6bf9bc-66bzc" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" Sep 9 22:16:15.046397 containerd[1617]: 2025-09-09 22:16:14.968 [INFO][4122] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-66bzc" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" Sep 9 22:16:15.046397 containerd[1617]: 2025-09-09 22:16:14.979 [INFO][4122] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Namespace="kube-system" Pod="coredns-668d6bf9bc-66bzc" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ae85377c-e3d4-4eb7-96d9-9c28e74f1766", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970", Pod:"coredns-668d6bf9bc-66bzc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicabafeea74e", MAC:"7e:68:b6:0c:d7:e8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:15.046397 containerd[1617]: 2025-09-09 22:16:15.028 [INFO][4122] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" Namespace="kube-system" Pod="coredns-668d6bf9bc-66bzc" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--66bzc-eth0" Sep 9 22:16:15.062503 systemd-networkd[1513]: cali216f14f2fb4: Gained IPv6LL Sep 9 22:16:15.146413 containerd[1617]: time="2025-09-09T22:16:15.146363928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fcd8795b9-v7sjs,Uid:1dbecbb3-bfb5-435c-a27f-5c04e7bb44b8,Namespace:calico-system,Attempt:0,} returns sandbox id \"5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd\"" Sep 9 22:16:15.218140 systemd[1]: Started cri-containerd-f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1.scope - libcontainer container f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1. 
Sep 9 22:16:15.250308 containerd[1617]: time="2025-09-09T22:16:15.248852822Z" level=info msg="connecting to shim 6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970" address="unix:///run/containerd/s/52465044525da21a4df61707613891809d96a3d5f4a06e2cf274e7394519dd89" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:16:15.379020 systemd[1]: Started cri-containerd-6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970.scope - libcontainer container 6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970. Sep 9 22:16:15.510039 systemd-networkd[1513]: cali5935bf87076: Gained IPv6LL Sep 9 22:16:15.529924 containerd[1617]: time="2025-09-09T22:16:15.529586253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-545dfbd8d5-xv6qv,Uid:a3c2f6ec-3026-47d2-b3a9-9d8d93bd8613,Namespace:calico-system,Attempt:0,} returns sandbox id \"7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329\"" Sep 9 22:16:15.577392 containerd[1617]: time="2025-09-09T22:16:15.577249664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-66bzc,Uid:ae85377c-e3d4-4eb7-96d9-9c28e74f1766,Namespace:kube-system,Attempt:0,} returns sandbox id \"6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970\"" Sep 9 22:16:15.580922 containerd[1617]: time="2025-09-09T22:16:15.580824381Z" level=info msg="CreateContainer within sandbox \"6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 22:16:15.615959 systemd-networkd[1513]: calif60ce828f2b: Link UP Sep 9 22:16:15.628967 containerd[1617]: time="2025-09-09T22:16:15.628903853Z" level=info msg="Container c20cfd68f189dcb7570aed788d9bfcf1d7832e97600c72a282f7b77ea0d64d21: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:15.629594 systemd-networkd[1513]: calif60ce828f2b: Gained carrier Sep 9 22:16:15.637207 systemd-networkd[1513]: cali12e3540d1e7: Gained IPv6LL Sep 9 22:16:15.654563 containerd[1617]: time="2025-09-09T22:16:15.654211967Z" level=info msg="CreateContainer within sandbox \"6a6643dc9be4dff3fb92d1efcb2406ccf6ed1c67117c84a44cc56c6fcda32970\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c20cfd68f189dcb7570aed788d9bfcf1d7832e97600c72a282f7b77ea0d64d21\"" Sep 9 22:16:15.657807 containerd[1617]: time="2025-09-09T22:16:15.655965348Z" level=info msg="StartContainer for \"c20cfd68f189dcb7570aed788d9bfcf1d7832e97600c72a282f7b77ea0d64d21\"" Sep 9 22:16:15.658110 containerd[1617]: time="2025-09-09T22:16:15.658075244Z" level=info msg="connecting to shim c20cfd68f189dcb7570aed788d9bfcf1d7832e97600c72a282f7b77ea0d64d21" address="unix:///run/containerd/s/52465044525da21a4df61707613891809d96a3d5f4a06e2cf274e7394519dd89" protocol=ttrpc version=3 Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.200 [INFO][4375] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.279 [INFO][4375] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0 calico-apiserver-56d9fcb775- calico-apiserver efcf61b5-b781-4fd4-a290-db86c9cc791c 808 0 2025-09-09 22:15:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56d9fcb775 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-rokxy.gb1.brightbox.com calico-apiserver-56d9fcb775-78pgr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif60ce828f2b [] [] }} ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-78pgr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.279 [INFO][4375] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-78pgr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.468 [INFO][4490] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" HandleID="k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.468 [INFO][4490] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" HandleID="k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e550), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-rokxy.gb1.brightbox.com", "pod":"calico-apiserver-56d9fcb775-78pgr", "timestamp":"2025-09-09 22:16:15.467993826 +0000 UTC"}, Hostname:"srv-rokxy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.468 [INFO][4490] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.468 [INFO][4490] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.468 [INFO][4490] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-rokxy.gb1.brightbox.com' Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.494 [INFO][4490] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.517 [INFO][4490] ipam/ipam.go 394: Looking up existing affinities for host host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.540 [INFO][4490] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.548 [INFO][4490] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.552 [INFO][4490] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.552 [INFO][4490] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.559 [INFO][4490] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.569 [INFO][4490] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.581 [INFO][4490] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.7/26] block=192.168.61.0/26 handle="k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.581 [INFO][4490] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.7/26] handle="k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.581 [INFO][4490] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 22:16:15.675767 containerd[1617]: 2025-09-09 22:16:15.581 [INFO][4490] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.7/26] IPv6=[] ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" HandleID="k8s-pod-network.b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Workload="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" Sep 9 22:16:15.677550 containerd[1617]: 2025-09-09 22:16:15.588 [INFO][4375] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-78pgr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0", GenerateName:"calico-apiserver-56d9fcb775-", Namespace:"calico-apiserver", SelfLink:"", UID:"efcf61b5-b781-4fd4-a290-db86c9cc791c", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56d9fcb775", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-56d9fcb775-78pgr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif60ce828f2b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:15.677550 containerd[1617]: 2025-09-09 22:16:15.589 [INFO][4375] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.7/32] ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-78pgr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" Sep 9 22:16:15.677550 containerd[1617]: 2025-09-09 22:16:15.589 [INFO][4375] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif60ce828f2b ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-78pgr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" Sep 9 22:16:15.677550 containerd[1617]: 2025-09-09 22:16:15.642 [INFO][4375] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-78pgr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" Sep 9 22:16:15.677550 containerd[1617]: 2025-09-09 22:16:15.647 
[INFO][4375] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-78pgr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0", GenerateName:"calico-apiserver-56d9fcb775-", Namespace:"calico-apiserver", SelfLink:"", UID:"efcf61b5-b781-4fd4-a290-db86c9cc791c", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56d9fcb775", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa", Pod:"calico-apiserver-56d9fcb775-78pgr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif60ce828f2b", MAC:"d6:85:c9:ad:cd:23", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:15.677550 containerd[1617]: 2025-09-09 22:16:15.666 [INFO][4375] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" Namespace="calico-apiserver" Pod="calico-apiserver-56d9fcb775-78pgr" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-calico--apiserver--56d9fcb775--78pgr-eth0" Sep 9 22:16:15.752306 systemd-networkd[1513]: caliac528cc2401: Link UP Sep 9 22:16:15.755565 systemd-networkd[1513]: caliac528cc2401: Gained carrier Sep 9 22:16:15.788034 systemd[1]: Started cri-containerd-c20cfd68f189dcb7570aed788d9bfcf1d7832e97600c72a282f7b77ea0d64d21.scope - libcontainer container c20cfd68f189dcb7570aed788d9bfcf1d7832e97600c72a282f7b77ea0d64d21. 
Sep 9 22:16:15.800554 containerd[1617]: time="2025-09-09T22:16:15.800406732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-hqktr,Uid:968c322d-47fa-4065-99b3-0c60d65ccbe3,Namespace:calico-system,Attempt:0,} returns sandbox id \"f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1\"" Sep 9 22:16:15.807508 containerd[1617]: time="2025-09-09T22:16:15.807415877Z" level=info msg="connecting to shim b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa" address="unix:///run/containerd/s/8314ab342227146fd7a4fffcb24abf16e4a0c07ef4e837bcc351e4ed3e6bfb45" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.246 [INFO][4384] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.309 [INFO][4384] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0 coredns-668d6bf9bc- kube-system a8772658-fdec-4b6f-ba7d-ab8a7007aafd 810 0 2025-09-09 22:15:30 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-rokxy.gb1.brightbox.com coredns-668d6bf9bc-hjrnt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliac528cc2401 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-hjrnt" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.309 [INFO][4384] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-hjrnt" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.476 [INFO][4502] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" HandleID="k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Workload="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.477 [INFO][4502] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" HandleID="k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Workload="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000388010), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-rokxy.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-hjrnt", "timestamp":"2025-09-09 22:16:15.475301481 +0000 UTC"}, Hostname:"srv-rokxy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.477 [INFO][4502] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.581 [INFO][4502] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.582 [INFO][4502] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-rokxy.gb1.brightbox.com' Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.623 [INFO][4502] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.638 [INFO][4502] ipam/ipam.go 394: Looking up existing affinities for host host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.654 [INFO][4502] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.659 [INFO][4502] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.664 [INFO][4502] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.664 [INFO][4502] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.676 [INFO][4502] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.690 [INFO][4502] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.714 [INFO][4502] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.8/26] block=192.168.61.0/26 handle="k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.714 [INFO][4502] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.8/26] handle="k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" host="srv-rokxy.gb1.brightbox.com" Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.714 [INFO][4502] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
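The ipam_plugin trace above serializes assignment behind the host-wide IPAM lock, confirms the node's affinity for block 192.168.61.0/26, and claims 192.168.61.8 from it (the apiserver pod earlier in this section got 192.168.61.7 from the same /26). A small sketch of the containment check those entries imply, using only Go's net/netip — illustration only, not Calico's IPAM code:

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block and addresses taken from the ipam.go log entries above.
	block := netip.MustParsePrefix("192.168.61.0/26") // 64 addresses: .0 through .63
	for _, s := range []string{"192.168.61.7", "192.168.61.8"} {
		addr := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", s, block, block.Contains(addr))
	}
}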
Sep 9 22:16:15.818559 containerd[1617]: 2025-09-09 22:16:15.717 [INFO][4502] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.8/26] IPv6=[] ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" HandleID="k8s-pod-network.7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Workload="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" Sep 9 22:16:15.821544 containerd[1617]: 2025-09-09 22:16:15.729 [INFO][4384] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-hjrnt" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a8772658-fdec-4b6f-ba7d-ab8a7007aafd", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-hjrnt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliac528cc2401", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:15.821544 containerd[1617]: 2025-09-09 22:16:15.730 [INFO][4384] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.8/32] ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-hjrnt" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" Sep 9 22:16:15.821544 containerd[1617]: 2025-09-09 22:16:15.731 [INFO][4384] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliac528cc2401 ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-hjrnt" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" Sep 9 22:16:15.821544 containerd[1617]: 2025-09-09 22:16:15.759 [INFO][4384] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-hjrnt" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" Sep 9 22:16:15.821544 containerd[1617]: 2025-09-09 22:16:15.769 [INFO][4384] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-hjrnt" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a8772658-fdec-4b6f-ba7d-ab8a7007aafd", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 15, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-rokxy.gb1.brightbox.com", ContainerID:"7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c", Pod:"coredns-668d6bf9bc-hjrnt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliac528cc2401", MAC:"62:a3:d8:b3:bd:f5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:16:15.821544 containerd[1617]: 2025-09-09 22:16:15.808 [INFO][4384] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-hjrnt" WorkloadEndpoint="srv--rokxy.gb1.brightbox.com-k8s-coredns--668d6bf9bc--hjrnt-eth0" Sep 9 22:16:15.876635 systemd[1]: Started cri-containerd-b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa.scope - libcontainer container b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa. 
Sep 9 22:16:15.928411 containerd[1617]: time="2025-09-09T22:16:15.928056122Z" level=info msg="connecting to shim 7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c" address="unix:///run/containerd/s/80266d4eda7f8551ae8711e98453ef222ec30dd35ad72e30bfd293762d048e30" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:16:15.978773 containerd[1617]: time="2025-09-09T22:16:15.978643439Z" level=info msg="StartContainer for \"c20cfd68f189dcb7570aed788d9bfcf1d7832e97600c72a282f7b77ea0d64d21\" returns successfully" Sep 9 22:16:16.006935 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1289714845.mount: Deactivated successfully. Sep 9 22:16:16.051153 systemd[1]: Started cri-containerd-7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c.scope - libcontainer container 7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c. Sep 9 22:16:16.085019 systemd-networkd[1513]: cali5d3941f2d7d: Gained IPv6LL Sep 9 22:16:16.135788 containerd[1617]: time="2025-09-09T22:16:16.135739932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56d9fcb775-78pgr,Uid:efcf61b5-b781-4fd4-a290-db86c9cc791c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa\"" Sep 9 22:16:16.149023 systemd-networkd[1513]: calic94b42d9566: Gained IPv6LL Sep 9 22:16:16.213942 containerd[1617]: time="2025-09-09T22:16:16.213889046Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hjrnt,Uid:a8772658-fdec-4b6f-ba7d-ab8a7007aafd,Namespace:kube-system,Attempt:0,} returns sandbox id \"7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c\"" Sep 9 22:16:16.224380 containerd[1617]: time="2025-09-09T22:16:16.224336011Z" level=info msg="CreateContainer within sandbox \"7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 22:16:16.243396 containerd[1617]: time="2025-09-09T22:16:16.241284489Z" level=info msg="Container 3d8140b3336c58526290364ba1509d27d270ba1280bac017df7b66d2be6f46ab: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:16.246953 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3389187351.mount: Deactivated successfully. Sep 9 22:16:16.253409 containerd[1617]: time="2025-09-09T22:16:16.253351821Z" level=info msg="CreateContainer within sandbox \"7ce646ab1aa9309b6fa94959198a1d22525383dbdae20ac1242bf4f6fbd8ec1c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3d8140b3336c58526290364ba1509d27d270ba1280bac017df7b66d2be6f46ab\"" Sep 9 22:16:16.255797 containerd[1617]: time="2025-09-09T22:16:16.255744392Z" level=info msg="StartContainer for \"3d8140b3336c58526290364ba1509d27d270ba1280bac017df7b66d2be6f46ab\"" Sep 9 22:16:16.260803 containerd[1617]: time="2025-09-09T22:16:16.260742621Z" level=info msg="connecting to shim 3d8140b3336c58526290364ba1509d27d270ba1280bac017df7b66d2be6f46ab" address="unix:///run/containerd/s/80266d4eda7f8551ae8711e98453ef222ec30dd35ad72e30bfd293762d048e30" protocol=ttrpc version=3 Sep 9 22:16:16.337213 systemd[1]: Started cri-containerd-3d8140b3336c58526290364ba1509d27d270ba1280bac017df7b66d2be6f46ab.scope - libcontainer container 3d8140b3336c58526290364ba1509d27d270ba1280bac017df7b66d2be6f46ab. 
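The "connecting to shim" entries above give the shim's Unix socket (e.g. unix:///run/containerd/s/80266d4e…) and note the transport is ttrpc v3. A real client speaks ttrpc over that socket; the sketch below only probes that the socket path copied from the log accepts a connection, nothing more:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Socket path copied from the "connecting to shim" log entry above.
	path := "/run/containerd/s/80266d4eda7f8551ae8711e98453ef222ec30dd35ad72e30bfd293762d048e30"

	conn, err := net.DialTimeout("unix", path, 2*time.Second)
	if err != nil {
		fmt.Println("shim socket not reachable:", err)
		return
	}
	defer conn.Close()
	fmt.Println("shim socket accepts connections (actual RPC requires a ttrpc client)")
}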
Sep 9 22:16:16.405047 systemd-networkd[1513]: calicabafeea74e: Gained IPv6LL Sep 9 22:16:16.521624 kubelet[2917]: I0909 22:16:16.521108 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-66bzc" podStartSLOduration=46.521087481 podStartE2EDuration="46.521087481s" podCreationTimestamp="2025-09-09 22:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:16:16.520704315 +0000 UTC m=+51.888861045" watchObservedRunningTime="2025-09-09 22:16:16.521087481 +0000 UTC m=+51.889244177" Sep 9 22:16:16.550410 containerd[1617]: time="2025-09-09T22:16:16.550045175Z" level=info msg="StartContainer for \"3d8140b3336c58526290364ba1509d27d270ba1280bac017df7b66d2be6f46ab\" returns successfully" Sep 9 22:16:16.853035 systemd-networkd[1513]: caliac528cc2401: Gained IPv6LL Sep 9 22:16:17.008798 containerd[1617]: time="2025-09-09T22:16:17.008722781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:17.019623 containerd[1617]: time="2025-09-09T22:16:17.019542893Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 22:16:17.030700 containerd[1617]: time="2025-09-09T22:16:17.030587629Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:17.033728 kubelet[2917]: I0909 22:16:17.033688 2917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 22:16:17.038803 containerd[1617]: time="2025-09-09T22:16:17.038198354Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:17.040241 containerd[1617]: time="2025-09-09T22:16:17.040201015Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.605745224s" Sep 9 22:16:17.040320 containerd[1617]: time="2025-09-09T22:16:17.040247786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 22:16:17.052817 containerd[1617]: time="2025-09-09T22:16:17.051791889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 22:16:17.055561 containerd[1617]: time="2025-09-09T22:16:17.054171784Z" level=info msg="CreateContainer within sandbox \"0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 22:16:17.088120 containerd[1617]: time="2025-09-09T22:16:17.085053240Z" level=info msg="Container 624574ad6cdab29b5e60d50f23c6402bda14825dfe98e24f3ee49bb99629df2e: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:17.110545 containerd[1617]: time="2025-09-09T22:16:17.109879643Z" level=info msg="CreateContainer within sandbox \"0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns 
container id \"624574ad6cdab29b5e60d50f23c6402bda14825dfe98e24f3ee49bb99629df2e\"" Sep 9 22:16:17.111295 containerd[1617]: time="2025-09-09T22:16:17.111041125Z" level=info msg="StartContainer for \"624574ad6cdab29b5e60d50f23c6402bda14825dfe98e24f3ee49bb99629df2e\"" Sep 9 22:16:17.116359 containerd[1617]: time="2025-09-09T22:16:17.116207513Z" level=info msg="connecting to shim 624574ad6cdab29b5e60d50f23c6402bda14825dfe98e24f3ee49bb99629df2e" address="unix:///run/containerd/s/36eb65b1dac9c3851af444a1378c4387442438c1c54d0608b43476794c4cd95c" protocol=ttrpc version=3 Sep 9 22:16:17.194066 systemd[1]: Started cri-containerd-624574ad6cdab29b5e60d50f23c6402bda14825dfe98e24f3ee49bb99629df2e.scope - libcontainer container 624574ad6cdab29b5e60d50f23c6402bda14825dfe98e24f3ee49bb99629df2e. Sep 9 22:16:17.238989 systemd-networkd[1513]: calif60ce828f2b: Gained IPv6LL Sep 9 22:16:17.464696 containerd[1617]: time="2025-09-09T22:16:17.463707473Z" level=info msg="StartContainer for \"624574ad6cdab29b5e60d50f23c6402bda14825dfe98e24f3ee49bb99629df2e\" returns successfully" Sep 9 22:16:17.531501 kubelet[2917]: I0909 22:16:17.531431 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-hjrnt" podStartSLOduration=47.531402089 podStartE2EDuration="47.531402089s" podCreationTimestamp="2025-09-09 22:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:16:17.531079449 +0000 UTC m=+52.899236150" watchObservedRunningTime="2025-09-09 22:16:17.531402089 +0000 UTC m=+52.899558785" Sep 9 22:16:17.798734 containerd[1617]: time="2025-09-09T22:16:17.798608470Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\" id:\"57d3764248c2f20d6a0697dfd7fb9d1ab9e0b27cece2ee2b7abc7a1c80f9eb56\" pid:4792 exit_status:1 exited_at:{seconds:1757456177 nanos:787336864}" Sep 9 22:16:17.935999 containerd[1617]: time="2025-09-09T22:16:17.935937847Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\" id:\"8db3a55480b909a1672249f2c9f9f9067b2cf5a6ca016ed53a01d5758864aece\" pid:4867 exit_status:1 exited_at:{seconds:1757456177 nanos:935461123}" Sep 9 22:16:18.226580 systemd-networkd[1513]: vxlan.calico: Link UP Sep 9 22:16:18.226600 systemd-networkd[1513]: vxlan.calico: Gained carrier Sep 9 22:16:19.476984 systemd-networkd[1513]: vxlan.calico: Gained IPv6LL Sep 9 22:16:21.362971 containerd[1617]: time="2025-09-09T22:16:21.362916429Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:21.364278 containerd[1617]: time="2025-09-09T22:16:21.364248895Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 22:16:21.365383 containerd[1617]: time="2025-09-09T22:16:21.365352910Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:21.368769 containerd[1617]: time="2025-09-09T22:16:21.367769275Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:21.369112 
containerd[1617]: time="2025-09-09T22:16:21.368717822Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 4.316874387s" Sep 9 22:16:21.369112 containerd[1617]: time="2025-09-09T22:16:21.368972996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 22:16:21.371444 containerd[1617]: time="2025-09-09T22:16:21.371411024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 22:16:21.373426 containerd[1617]: time="2025-09-09T22:16:21.373380773Z" level=info msg="CreateContainer within sandbox \"fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 22:16:21.384816 containerd[1617]: time="2025-09-09T22:16:21.384549738Z" level=info msg="Container 6d95e0b99365613f103052dec7e85f819be248eee564f54fd366c1b8d1d0fefd: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:21.398987 containerd[1617]: time="2025-09-09T22:16:21.398839554Z" level=info msg="CreateContainer within sandbox \"fdf079e6464cf4c24ee12e30632aad27b56418213ce9e8e83b17fdf2dd3445da\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6d95e0b99365613f103052dec7e85f819be248eee564f54fd366c1b8d1d0fefd\"" Sep 9 22:16:21.400378 containerd[1617]: time="2025-09-09T22:16:21.400089286Z" level=info msg="StartContainer for \"6d95e0b99365613f103052dec7e85f819be248eee564f54fd366c1b8d1d0fefd\"" Sep 9 22:16:21.402383 containerd[1617]: time="2025-09-09T22:16:21.402335657Z" level=info msg="connecting to shim 6d95e0b99365613f103052dec7e85f819be248eee564f54fd366c1b8d1d0fefd" address="unix:///run/containerd/s/b1fcd1f6f589cf5f4f413f1686eba6f7ae000d73b05be1ee62c3d9d140e3adca" protocol=ttrpc version=3 Sep 9 22:16:21.481997 systemd[1]: Started cri-containerd-6d95e0b99365613f103052dec7e85f819be248eee564f54fd366c1b8d1d0fefd.scope - libcontainer container 6d95e0b99365613f103052dec7e85f819be248eee564f54fd366c1b8d1d0fefd. 
Sep 9 22:16:21.565268 containerd[1617]: time="2025-09-09T22:16:21.565168537Z" level=info msg="StartContainer for \"6d95e0b99365613f103052dec7e85f819be248eee564f54fd366c1b8d1d0fefd\" returns successfully" Sep 9 22:16:22.532774 kubelet[2917]: I0909 22:16:22.532192 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56d9fcb775-xls5d" podStartSLOduration=32.796715402 podStartE2EDuration="39.532155193s" podCreationTimestamp="2025-09-09 22:15:43 +0000 UTC" firstStartedPulling="2025-09-09 22:16:14.635245467 +0000 UTC m=+50.003402154" lastFinishedPulling="2025-09-09 22:16:21.370685251 +0000 UTC m=+56.738841945" observedRunningTime="2025-09-09 22:16:22.530948435 +0000 UTC m=+57.899105130" watchObservedRunningTime="2025-09-09 22:16:22.532155193 +0000 UTC m=+57.900311879" Sep 9 22:16:23.022888 containerd[1617]: time="2025-09-09T22:16:23.022841151Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:23.025034 containerd[1617]: time="2025-09-09T22:16:23.024829828Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 22:16:23.025208 containerd[1617]: time="2025-09-09T22:16:23.025172556Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:23.028798 containerd[1617]: time="2025-09-09T22:16:23.028481878Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:23.032674 containerd[1617]: time="2025-09-09T22:16:23.032103002Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.660651789s" Sep 9 22:16:23.032674 containerd[1617]: time="2025-09-09T22:16:23.032160627Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 22:16:23.035648 containerd[1617]: time="2025-09-09T22:16:23.035447586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 22:16:23.040010 containerd[1617]: time="2025-09-09T22:16:23.039965179Z" level=info msg="CreateContainer within sandbox \"5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 22:16:23.110512 containerd[1617]: time="2025-09-09T22:16:23.109861416Z" level=info msg="Container c0cccee008561adb310ca3b316967fa4a5de32cce1edf908fb6e8f9203b85aa0: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:23.123917 containerd[1617]: time="2025-09-09T22:16:23.123873191Z" level=info msg="CreateContainer within sandbox \"5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"c0cccee008561adb310ca3b316967fa4a5de32cce1edf908fb6e8f9203b85aa0\"" Sep 9 22:16:23.126071 containerd[1617]: time="2025-09-09T22:16:23.124943924Z" level=info msg="StartContainer for 
\"c0cccee008561adb310ca3b316967fa4a5de32cce1edf908fb6e8f9203b85aa0\"" Sep 9 22:16:23.126852 containerd[1617]: time="2025-09-09T22:16:23.126822756Z" level=info msg="connecting to shim c0cccee008561adb310ca3b316967fa4a5de32cce1edf908fb6e8f9203b85aa0" address="unix:///run/containerd/s/dcc52075aec8305c2dd29a179a41b62abde363b420c94ffcc091fb2a5fcccf48" protocol=ttrpc version=3 Sep 9 22:16:23.168033 systemd[1]: Started cri-containerd-c0cccee008561adb310ca3b316967fa4a5de32cce1edf908fb6e8f9203b85aa0.scope - libcontainer container c0cccee008561adb310ca3b316967fa4a5de32cce1edf908fb6e8f9203b85aa0. Sep 9 22:16:23.273914 containerd[1617]: time="2025-09-09T22:16:23.273709984Z" level=info msg="StartContainer for \"c0cccee008561adb310ca3b316967fa4a5de32cce1edf908fb6e8f9203b85aa0\" returns successfully" Sep 9 22:16:27.409521 containerd[1617]: time="2025-09-09T22:16:27.409422389Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:27.411437 containerd[1617]: time="2025-09-09T22:16:27.411331979Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 22:16:27.412561 containerd[1617]: time="2025-09-09T22:16:27.412516579Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:27.415310 containerd[1617]: time="2025-09-09T22:16:27.415275209Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:27.416846 containerd[1617]: time="2025-09-09T22:16:27.416742714Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.380536859s" Sep 9 22:16:27.417494 containerd[1617]: time="2025-09-09T22:16:27.416963626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 22:16:27.419320 containerd[1617]: time="2025-09-09T22:16:27.419261768Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 22:16:27.474694 containerd[1617]: time="2025-09-09T22:16:27.473997390Z" level=info msg="CreateContainer within sandbox \"7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 22:16:27.502912 containerd[1617]: time="2025-09-09T22:16:27.499861733Z" level=info msg="Container 34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:27.518374 containerd[1617]: time="2025-09-09T22:16:27.518250237Z" level=info msg="CreateContainer within sandbox \"7418a6e56a8422b1112e8bcef274a210e0cd363c922bd8fcf21801ec656c5329\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\"" Sep 9 22:16:27.521292 containerd[1617]: 
time="2025-09-09T22:16:27.521255263Z" level=info msg="StartContainer for \"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\"" Sep 9 22:16:27.523708 containerd[1617]: time="2025-09-09T22:16:27.523634504Z" level=info msg="connecting to shim 34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d" address="unix:///run/containerd/s/6c8d4af71f8607ce9bae63299e85573dcd71630f5dd71801bb6c9292edb9d7e9" protocol=ttrpc version=3 Sep 9 22:16:27.583994 systemd[1]: Started cri-containerd-34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d.scope - libcontainer container 34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d. Sep 9 22:16:27.670890 containerd[1617]: time="2025-09-09T22:16:27.670323340Z" level=info msg="StartContainer for \"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\" returns successfully" Sep 9 22:16:28.615234 kubelet[2917]: I0909 22:16:28.614417 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-545dfbd8d5-xv6qv" podStartSLOduration=29.72536164 podStartE2EDuration="41.611243309s" podCreationTimestamp="2025-09-09 22:15:47 +0000 UTC" firstStartedPulling="2025-09-09 22:16:15.532619734 +0000 UTC m=+50.900776424" lastFinishedPulling="2025-09-09 22:16:27.418501394 +0000 UTC m=+62.786658093" observedRunningTime="2025-09-09 22:16:28.60992206 +0000 UTC m=+63.978078783" watchObservedRunningTime="2025-09-09 22:16:28.611243309 +0000 UTC m=+63.979400016" Sep 9 22:16:28.701249 containerd[1617]: time="2025-09-09T22:16:28.700614440Z" level=info msg="TaskExit event in podsandbox handler container_id:\"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\" id:\"fd7a6e0a5c92eb468b64c74313be8c2aca2e8b466bc6fc8dd72567f08c45e3d0\" pid:5124 exited_at:{seconds:1757456188 nanos:681460107}" Sep 9 22:16:31.351145 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount81434585.mount: Deactivated successfully. 
Sep 9 22:16:32.255858 containerd[1617]: time="2025-09-09T22:16:32.255767867Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:32.257589 containerd[1617]: time="2025-09-09T22:16:32.257538755Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 22:16:32.258886 containerd[1617]: time="2025-09-09T22:16:32.257717148Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:32.266168 containerd[1617]: time="2025-09-09T22:16:32.266077127Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:32.267831 containerd[1617]: time="2025-09-09T22:16:32.267097757Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 4.847794603s" Sep 9 22:16:32.267831 containerd[1617]: time="2025-09-09T22:16:32.267143717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 22:16:32.268563 containerd[1617]: time="2025-09-09T22:16:32.268534906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 22:16:32.283904 containerd[1617]: time="2025-09-09T22:16:32.283841874Z" level=info msg="CreateContainer within sandbox \"f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 22:16:32.305089 containerd[1617]: time="2025-09-09T22:16:32.305039940Z" level=info msg="Container de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:32.314526 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3135879398.mount: Deactivated successfully. Sep 9 22:16:32.318540 containerd[1617]: time="2025-09-09T22:16:32.318464756Z" level=info msg="CreateContainer within sandbox \"f8fdeb7863cfe9d907207ce4e7da44c3be06afa6b3338261f1f0d0507f5a2eb1\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\"" Sep 9 22:16:32.319482 containerd[1617]: time="2025-09-09T22:16:32.319432719Z" level=info msg="StartContainer for \"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\"" Sep 9 22:16:32.321276 containerd[1617]: time="2025-09-09T22:16:32.321177971Z" level=info msg="connecting to shim de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf" address="unix:///run/containerd/s/db303a7554cf1a6fc29bd85aa9ac146346dd6a7cdb6f22a134342b85ad911b6e" protocol=ttrpc version=3 Sep 9 22:16:32.352981 systemd[1]: Started cri-containerd-de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf.scope - libcontainer container de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf. 
Sep 9 22:16:32.436352 containerd[1617]: time="2025-09-09T22:16:32.436297894Z" level=info msg="StartContainer for \"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\" returns successfully" Sep 9 22:16:32.621147 kubelet[2917]: I0909 22:16:32.620444 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-hqktr" podStartSLOduration=30.166151157 podStartE2EDuration="46.620421606s" podCreationTimestamp="2025-09-09 22:15:46 +0000 UTC" firstStartedPulling="2025-09-09 22:16:15.813859955 +0000 UTC m=+51.182016642" lastFinishedPulling="2025-09-09 22:16:32.268130402 +0000 UTC m=+67.636287091" observedRunningTime="2025-09-09 22:16:32.619374436 +0000 UTC m=+67.987531144" watchObservedRunningTime="2025-09-09 22:16:32.620421606 +0000 UTC m=+67.988578294" Sep 9 22:16:32.719381 containerd[1617]: time="2025-09-09T22:16:32.719316242Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:32.721283 containerd[1617]: time="2025-09-09T22:16:32.721247364Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 22:16:32.728736 containerd[1617]: time="2025-09-09T22:16:32.728667631Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 459.987664ms" Sep 9 22:16:32.728876 containerd[1617]: time="2025-09-09T22:16:32.728739407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 22:16:32.744718 containerd[1617]: time="2025-09-09T22:16:32.744406974Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 22:16:32.747563 containerd[1617]: time="2025-09-09T22:16:32.747528464Z" level=info msg="CreateContainer within sandbox \"b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 22:16:32.776290 containerd[1617]: time="2025-09-09T22:16:32.775978626Z" level=info msg="Container e8c7d876bd93fb3050c6948a73c7b7453653926b070c180b05633b37f6060d6d: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:32.786985 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1458011545.mount: Deactivated successfully. 
Sep 9 22:16:32.811600 containerd[1617]: time="2025-09-09T22:16:32.811524667Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\" id:\"1266277ba9047f8770c2285856dbf0e6f77df421f95ae8f9549a59763da4eb98\" pid:5201 exit_status:1 exited_at:{seconds:1757456192 nanos:804080892}" Sep 9 22:16:32.823150 containerd[1617]: time="2025-09-09T22:16:32.823107050Z" level=info msg="CreateContainer within sandbox \"b443e0c4d9eefd99c24055767acd24e744a42cd5855e7550f5fb874cf9df17fa\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e8c7d876bd93fb3050c6948a73c7b7453653926b070c180b05633b37f6060d6d\"" Sep 9 22:16:32.824958 containerd[1617]: time="2025-09-09T22:16:32.824918680Z" level=info msg="StartContainer for \"e8c7d876bd93fb3050c6948a73c7b7453653926b070c180b05633b37f6060d6d\"" Sep 9 22:16:32.826499 containerd[1617]: time="2025-09-09T22:16:32.826466911Z" level=info msg="connecting to shim e8c7d876bd93fb3050c6948a73c7b7453653926b070c180b05633b37f6060d6d" address="unix:///run/containerd/s/8314ab342227146fd7a4fffcb24abf16e4a0c07ef4e837bcc351e4ed3e6bfb45" protocol=ttrpc version=3 Sep 9 22:16:32.862021 systemd[1]: Started cri-containerd-e8c7d876bd93fb3050c6948a73c7b7453653926b070c180b05633b37f6060d6d.scope - libcontainer container e8c7d876bd93fb3050c6948a73c7b7453653926b070c180b05633b37f6060d6d. Sep 9 22:16:32.949687 containerd[1617]: time="2025-09-09T22:16:32.949568913Z" level=info msg="StartContainer for \"e8c7d876bd93fb3050c6948a73c7b7453653926b070c180b05633b37f6060d6d\" returns successfully" Sep 9 22:16:33.648502 kubelet[2917]: I0909 22:16:33.648435 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56d9fcb775-78pgr" podStartSLOduration=34.056618859 podStartE2EDuration="50.648406301s" podCreationTimestamp="2025-09-09 22:15:43 +0000 UTC" firstStartedPulling="2025-09-09 22:16:16.13835935 +0000 UTC m=+51.506516032" lastFinishedPulling="2025-09-09 22:16:32.730146778 +0000 UTC m=+68.098303474" observedRunningTime="2025-09-09 22:16:33.645173672 +0000 UTC m=+69.013330374" watchObservedRunningTime="2025-09-09 22:16:33.648406301 +0000 UTC m=+69.016562990" Sep 9 22:16:33.783876 containerd[1617]: time="2025-09-09T22:16:33.783819645Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\" id:\"acb39ef55fe89a0f658b471f869570f794a97d58cd1eb4e1295e8be8e282e5bd\" pid:5262 exit_status:1 exited_at:{seconds:1757456193 nanos:783324967}" Sep 9 22:16:34.626552 kubelet[2917]: I0909 22:16:34.625654 2917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 22:16:34.962210 containerd[1617]: time="2025-09-09T22:16:34.961900847Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:34.968538 containerd[1617]: time="2025-09-09T22:16:34.967401471Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 22:16:34.977162 containerd[1617]: time="2025-09-09T22:16:34.977105439Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:34.997233 containerd[1617]: time="2025-09-09T22:16:34.995766712Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:34.997233 containerd[1617]: time="2025-09-09T22:16:34.996765231Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.252302956s" Sep 9 22:16:34.997233 containerd[1617]: time="2025-09-09T22:16:34.996833640Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 22:16:35.004125 containerd[1617]: time="2025-09-09T22:16:35.004081152Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 22:16:35.005729 containerd[1617]: time="2025-09-09T22:16:35.005419121Z" level=info msg="CreateContainer within sandbox \"0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 22:16:35.026308 containerd[1617]: time="2025-09-09T22:16:35.025090017Z" level=info msg="Container d17e66cf039f0891809c4bff186ba8accf8cfbc4b91f0ea8d989ff0679273990: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:16:35.036791 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3956167170.mount: Deactivated successfully. Sep 9 22:16:35.079844 containerd[1617]: time="2025-09-09T22:16:35.079744986Z" level=info msg="CreateContainer within sandbox \"0038a1ae8eb87bd696a90c65db792e57b6591854e3f957bde81a1497aa637606\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d17e66cf039f0891809c4bff186ba8accf8cfbc4b91f0ea8d989ff0679273990\"" Sep 9 22:16:35.082813 containerd[1617]: time="2025-09-09T22:16:35.082361020Z" level=info msg="StartContainer for \"d17e66cf039f0891809c4bff186ba8accf8cfbc4b91f0ea8d989ff0679273990\"" Sep 9 22:16:35.084624 containerd[1617]: time="2025-09-09T22:16:35.084591845Z" level=info msg="connecting to shim d17e66cf039f0891809c4bff186ba8accf8cfbc4b91f0ea8d989ff0679273990" address="unix:///run/containerd/s/36eb65b1dac9c3851af444a1378c4387442438c1c54d0608b43476794c4cd95c" protocol=ttrpc version=3 Sep 9 22:16:35.166009 systemd[1]: Started cri-containerd-d17e66cf039f0891809c4bff186ba8accf8cfbc4b91f0ea8d989ff0679273990.scope - libcontainer container d17e66cf039f0891809c4bff186ba8accf8cfbc4b91f0ea8d989ff0679273990. 
Sep 9 22:16:35.395089 containerd[1617]: time="2025-09-09T22:16:35.394952846Z" level=info msg="StartContainer for \"d17e66cf039f0891809c4bff186ba8accf8cfbc4b91f0ea8d989ff0679273990\" returns successfully" Sep 9 22:16:35.762905 kubelet[2917]: I0909 22:16:35.762200 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-nxmhd" podStartSLOduration=28.194759942 podStartE2EDuration="48.762140953s" podCreationTimestamp="2025-09-09 22:15:47 +0000 UTC" firstStartedPulling="2025-09-09 22:16:14.434096748 +0000 UTC m=+49.802253441" lastFinishedPulling="2025-09-09 22:16:35.001477756 +0000 UTC m=+70.369634452" observedRunningTime="2025-09-09 22:16:35.735930724 +0000 UTC m=+71.104087418" watchObservedRunningTime="2025-09-09 22:16:35.762140953 +0000 UTC m=+71.130297650" Sep 9 22:16:36.182907 kubelet[2917]: I0909 22:16:36.182845 2917 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 22:16:36.186658 kubelet[2917]: I0909 22:16:36.186606 2917 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 22:16:37.970171 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1473879467.mount: Deactivated successfully. Sep 9 22:16:38.004719 containerd[1617]: time="2025-09-09T22:16:38.004618439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:38.006935 containerd[1617]: time="2025-09-09T22:16:38.006897718Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 22:16:38.007878 containerd[1617]: time="2025-09-09T22:16:38.007773284Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:38.010833 containerd[1617]: time="2025-09-09T22:16:38.010412915Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:16:38.011896 containerd[1617]: time="2025-09-09T22:16:38.011432576Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.006386645s" Sep 9 22:16:38.011896 containerd[1617]: time="2025-09-09T22:16:38.011474658Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 22:16:38.020703 containerd[1617]: time="2025-09-09T22:16:38.020627678Z" level=info msg="CreateContainer within sandbox \"5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 22:16:38.044060 containerd[1617]: time="2025-09-09T22:16:38.043995392Z" level=info msg="Container 2b3bec637e114b4b56428c3230a95c68185c553381d3ecd98c511574c309467d: CDI devices from CRI 
Config.CDIDevices: []" Sep 9 22:16:38.061328 containerd[1617]: time="2025-09-09T22:16:38.061272108Z" level=info msg="CreateContainer within sandbox \"5960586a90bae1faa6dc9175c7fa80ca62e9ea2a9bc11dd6b3b96271047623dd\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"2b3bec637e114b4b56428c3230a95c68185c553381d3ecd98c511574c309467d\"" Sep 9 22:16:38.062826 containerd[1617]: time="2025-09-09T22:16:38.062590114Z" level=info msg="StartContainer for \"2b3bec637e114b4b56428c3230a95c68185c553381d3ecd98c511574c309467d\"" Sep 9 22:16:38.064829 containerd[1617]: time="2025-09-09T22:16:38.064798850Z" level=info msg="connecting to shim 2b3bec637e114b4b56428c3230a95c68185c553381d3ecd98c511574c309467d" address="unix:///run/containerd/s/dcc52075aec8305c2dd29a179a41b62abde363b420c94ffcc091fb2a5fcccf48" protocol=ttrpc version=3 Sep 9 22:16:38.189983 systemd[1]: Started cri-containerd-2b3bec637e114b4b56428c3230a95c68185c553381d3ecd98c511574c309467d.scope - libcontainer container 2b3bec637e114b4b56428c3230a95c68185c553381d3ecd98c511574c309467d. Sep 9 22:16:38.328337 containerd[1617]: time="2025-09-09T22:16:38.328273417Z" level=info msg="StartContainer for \"2b3bec637e114b4b56428c3230a95c68185c553381d3ecd98c511574c309467d\" returns successfully" Sep 9 22:16:38.691182 kubelet[2917]: I0909 22:16:38.690289 2917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-fcd8795b9-v7sjs" podStartSLOduration=2.825789379 podStartE2EDuration="25.690235613s" podCreationTimestamp="2025-09-09 22:16:13 +0000 UTC" firstStartedPulling="2025-09-09 22:16:15.148650012 +0000 UTC m=+50.516806694" lastFinishedPulling="2025-09-09 22:16:38.013096246 +0000 UTC m=+73.381252928" observedRunningTime="2025-09-09 22:16:38.68839243 +0000 UTC m=+74.056549134" watchObservedRunningTime="2025-09-09 22:16:38.690235613 +0000 UTC m=+74.058392310" Sep 9 22:16:45.714022 containerd[1617]: time="2025-09-09T22:16:45.713946393Z" level=info msg="TaskExit event in podsandbox handler container_id:\"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\" id:\"241c47340aa8368624eb49ce7f31c21b7fe5a0f6d9be4eb56bcdd2fbd63f6f34\" pid:5376 exited_at:{seconds:1757456205 nanos:705157556}" Sep 9 22:16:48.261081 containerd[1617]: time="2025-09-09T22:16:48.261004198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\" id:\"b0c62bdb8d114103c5e14e4dce9d42166887b958e8445b78e3968d923a885b0d\" pid:5397 exited_at:{seconds:1757456208 nanos:259641345}" Sep 9 22:16:52.390503 kubelet[2917]: I0909 22:16:52.390150 2917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 22:16:52.646606 systemd[1]: Started sshd@17-10.230.51.18:22-14.194.76.134:29673.service - OpenSSH per-connection server daemon (14.194.76.134:29673). Sep 9 22:16:54.190800 sshd[5417]: Invalid user zabbix from 14.194.76.134 port 29673 Sep 9 22:16:54.454985 sshd[5417]: Received disconnect from 14.194.76.134 port 29673:11: Bye Bye [preauth] Sep 9 22:16:54.454985 sshd[5417]: Disconnected from invalid user zabbix 14.194.76.134 port 29673 [preauth] Sep 9 22:16:54.458601 systemd[1]: sshd@17-10.230.51.18:22-14.194.76.134:29673.service: Deactivated successfully. Sep 9 22:16:55.633063 systemd[1]: Started sshd@18-10.230.51.18:22-85.209.134.43:23392.service - OpenSSH per-connection server daemon (85.209.134.43:23392). 
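The TaskExit events above encode exit times as split Unix fields (e.g. exited_at:{seconds:1757456205 nanos:705157556}). Converting one of those pairs back into wall-clock time with the standard library, using values copied from the log:

package main

import (
	"fmt"
	"time"
)

func main() {
	// seconds/nanos copied from a TaskExit exited_at field above.
	t := time.Unix(1757456205, 705157556).UTC()
	fmt.Println(t) // 2025-09-09 22:16:45.705157556 +0000 UTC
}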
Sep 9 22:16:56.142737 sshd[5425]: Invalid user fish from 85.209.134.43 port 23392 Sep 9 22:16:56.217897 sshd[5425]: Received disconnect from 85.209.134.43 port 23392:11: Bye Bye [preauth] Sep 9 22:16:56.217897 sshd[5425]: Disconnected from invalid user fish 85.209.134.43 port 23392 [preauth] Sep 9 22:16:56.221444 systemd[1]: sshd@18-10.230.51.18:22-85.209.134.43:23392.service: Deactivated successfully. Sep 9 22:16:58.181190 systemd[1]: Started sshd@19-10.230.51.18:22-139.178.68.195:53734.service - OpenSSH per-connection server daemon (139.178.68.195:53734). Sep 9 22:16:59.091563 containerd[1617]: time="2025-09-09T22:16:59.091478277Z" level=info msg="TaskExit event in podsandbox handler container_id:\"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\" id:\"e594360b62f035c811ff765ff7e68413d85e698bf6c01460878219299cbcb457\" pid:5450 exited_at:{seconds:1757456219 nanos:90652444}" Sep 9 22:16:59.156910 sshd[5434]: Accepted publickey for core from 139.178.68.195 port 53734 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:16:59.160504 sshd-session[5434]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:16:59.180168 systemd-logind[1588]: New session 12 of user core. Sep 9 22:16:59.184951 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 22:17:00.478917 sshd[5467]: Connection closed by 139.178.68.195 port 53734 Sep 9 22:17:00.484211 sshd-session[5434]: pam_unix(sshd:session): session closed for user core Sep 9 22:17:00.497500 systemd[1]: sshd@19-10.230.51.18:22-139.178.68.195:53734.service: Deactivated successfully. Sep 9 22:17:00.503246 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 22:17:00.509710 systemd-logind[1588]: Session 12 logged out. Waiting for processes to exit. Sep 9 22:17:00.512981 systemd-logind[1588]: Removed session 12. Sep 9 22:17:01.893047 containerd[1617]: time="2025-09-09T22:17:01.892393365Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\" id:\"06886d9d0308b034911b1e8f6bc79295db56b975a75a2335a1989ac9d0ecd07b\" pid:5494 exited_at:{seconds:1757456221 nanos:891888745}" Sep 9 22:17:03.932662 containerd[1617]: time="2025-09-09T22:17:03.932477502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\" id:\"9546317f238d68b4f8c003d570f7dc734d9be8abd32bd1f77850efc2b00aeb9c\" pid:5522 exited_at:{seconds:1757456223 nanos:931643459}" Sep 9 22:17:05.641490 systemd[1]: Started sshd@20-10.230.51.18:22-139.178.68.195:49180.service - OpenSSH per-connection server daemon (139.178.68.195:49180). Sep 9 22:17:06.609804 sshd[5535]: Accepted publickey for core from 139.178.68.195 port 49180 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE Sep 9 22:17:06.612304 sshd-session[5535]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:17:06.622280 systemd-logind[1588]: New session 13 of user core. Sep 9 22:17:06.627989 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 22:17:07.352158 systemd[1]: Started sshd@21-10.230.51.18:22-103.146.52.252:55326.service - OpenSSH per-connection server daemon (103.146.52.252:55326). 
Sep 9 22:17:07.620299 sshd[5538]: Connection closed by 139.178.68.195 port 49180
Sep 9 22:17:07.621908 sshd-session[5535]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:07.634684 systemd[1]: sshd@20-10.230.51.18:22-139.178.68.195:49180.service: Deactivated successfully.
Sep 9 22:17:07.640614 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 22:17:07.642992 systemd-logind[1588]: Session 13 logged out. Waiting for processes to exit.
Sep 9 22:17:07.645445 systemd-logind[1588]: Removed session 13.
Sep 9 22:17:08.185867 sshd[5548]: Invalid user nagios from 103.146.52.252 port 55326
Sep 9 22:17:08.335859 sshd[5548]: Received disconnect from 103.146.52.252 port 55326:11: Bye Bye [preauth]
Sep 9 22:17:08.335859 sshd[5548]: Disconnected from invalid user nagios 103.146.52.252 port 55326 [preauth]
Sep 9 22:17:08.340281 systemd[1]: sshd@21-10.230.51.18:22-103.146.52.252:55326.service: Deactivated successfully.
Sep 9 22:17:12.777126 systemd[1]: Started sshd@22-10.230.51.18:22-139.178.68.195:44972.service - OpenSSH per-connection server daemon (139.178.68.195:44972).
Sep 9 22:17:13.698508 sshd[5557]: Accepted publickey for core from 139.178.68.195 port 44972 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:13.700588 sshd-session[5557]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:13.709334 systemd-logind[1588]: New session 14 of user core.
Sep 9 22:17:13.717998 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 22:17:14.431952 sshd[5560]: Connection closed by 139.178.68.195 port 44972
Sep 9 22:17:14.438995 sshd-session[5557]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:14.446893 systemd[1]: sshd@22-10.230.51.18:22-139.178.68.195:44972.service: Deactivated successfully.
Sep 9 22:17:14.450985 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 22:17:14.453233 systemd-logind[1588]: Session 14 logged out. Waiting for processes to exit.
Sep 9 22:17:14.456521 systemd-logind[1588]: Removed session 14.
Sep 9 22:17:14.589943 systemd[1]: Started sshd@23-10.230.51.18:22-139.178.68.195:44982.service - OpenSSH per-connection server daemon (139.178.68.195:44982).
Sep 9 22:17:15.528838 sshd[5574]: Accepted publickey for core from 139.178.68.195 port 44982 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:15.531356 sshd-session[5574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:15.544552 systemd-logind[1588]: New session 15 of user core.
Sep 9 22:17:15.550905 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 22:17:15.665097 systemd[1]: Started sshd@24-10.230.51.18:22-172.245.45.194:54606.service - OpenSSH per-connection server daemon (172.245.45.194:54606).
Sep 9 22:17:16.393257 sshd[5577]: Connection closed by 139.178.68.195 port 44982
Sep 9 22:17:16.395353 sshd-session[5574]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:16.406638 systemd[1]: sshd@23-10.230.51.18:22-139.178.68.195:44982.service: Deactivated successfully.
Sep 9 22:17:16.411845 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 22:17:16.415006 systemd-logind[1588]: Session 15 logged out. Waiting for processes to exit.
Sep 9 22:17:16.416635 systemd-logind[1588]: Removed session 15.
Sep 9 22:17:16.530536 sshd[5579]: Invalid user jupyterhub from 172.245.45.194 port 54606
Sep 9 22:17:16.558296 systemd[1]: Started sshd@25-10.230.51.18:22-139.178.68.195:44990.service - OpenSSH per-connection server daemon (139.178.68.195:44990).
Sep 9 22:17:16.677541 sshd[5579]: Received disconnect from 172.245.45.194 port 54606:11: Bye Bye [preauth]
Sep 9 22:17:16.677541 sshd[5579]: Disconnected from invalid user jupyterhub 172.245.45.194 port 54606 [preauth]
Sep 9 22:17:16.682079 systemd[1]: sshd@24-10.230.51.18:22-172.245.45.194:54606.service: Deactivated successfully.
Sep 9 22:17:17.514638 sshd[5591]: Accepted publickey for core from 139.178.68.195 port 44990 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:17.516909 sshd-session[5591]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:17.531689 systemd-logind[1588]: New session 16 of user core.
Sep 9 22:17:17.536438 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 22:17:18.327698 sshd[5596]: Connection closed by 139.178.68.195 port 44990
Sep 9 22:17:18.329079 sshd-session[5591]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:18.338213 systemd[1]: sshd@25-10.230.51.18:22-139.178.68.195:44990.service: Deactivated successfully.
Sep 9 22:17:18.343761 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 22:17:18.349679 systemd-logind[1588]: Session 16 logged out. Waiting for processes to exit.
Sep 9 22:17:18.352833 systemd-logind[1588]: Removed session 16.
Sep 9 22:17:18.493135 containerd[1617]: time="2025-09-09T22:17:18.493052146Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\" id:\"f2529fed733cbcfb76fec06284b85418d39b582ffd9c007ce32da221d3ad8342\" pid:5609 exited_at:{seconds:1757456238 nanos:492447176}"
Sep 9 22:17:23.490877 systemd[1]: Started sshd@26-10.230.51.18:22-139.178.68.195:43442.service - OpenSSH per-connection server daemon (139.178.68.195:43442).
Sep 9 22:17:24.547829 sshd[5634]: Accepted publickey for core from 139.178.68.195 port 43442 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:24.552396 sshd-session[5634]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:24.565403 systemd-logind[1588]: New session 17 of user core.
Sep 9 22:17:24.574447 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 22:17:25.496908 sshd[5638]: Connection closed by 139.178.68.195 port 43442
Sep 9 22:17:25.499406 sshd-session[5634]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:25.510952 systemd-logind[1588]: Session 17 logged out. Waiting for processes to exit.
Sep 9 22:17:25.511826 systemd[1]: sshd@26-10.230.51.18:22-139.178.68.195:43442.service: Deactivated successfully.
Sep 9 22:17:25.517664 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 22:17:25.521527 systemd-logind[1588]: Removed session 17.
Sep 9 22:17:28.733555 containerd[1617]: time="2025-09-09T22:17:28.733463073Z" level=info msg="TaskExit event in podsandbox handler container_id:\"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\" id:\"e46a06afeea3d09ac2ff8ff3935ba19306a80e0c338c4b04a3b76256eba8a369\" pid:5663 exited_at:{seconds:1757456248 nanos:733138926}"
Sep 9 22:17:30.666970 systemd[1]: Started sshd@27-10.230.51.18:22-139.178.68.195:55240.service - OpenSSH per-connection server daemon (139.178.68.195:55240).
Sep 9 22:17:31.594048 sshd[5673]: Accepted publickey for core from 139.178.68.195 port 55240 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:31.597125 sshd-session[5673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:31.608241 systemd-logind[1588]: New session 18 of user core.
Sep 9 22:17:31.616345 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 22:17:32.502561 sshd[5679]: Connection closed by 139.178.68.195 port 55240
Sep 9 22:17:32.505202 sshd-session[5673]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:32.514940 systemd[1]: sshd@27-10.230.51.18:22-139.178.68.195:55240.service: Deactivated successfully.
Sep 9 22:17:32.518357 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 22:17:32.522108 systemd-logind[1588]: Session 18 logged out. Waiting for processes to exit.
Sep 9 22:17:32.525383 systemd-logind[1588]: Removed session 18.
Sep 9 22:17:34.111683 containerd[1617]: time="2025-09-09T22:17:34.086639817Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\" id:\"f102e4bc0f3ab5061efe88447719edc7c8e4fb92ec93295742339d794bf75953\" pid:5703 exited_at:{seconds:1757456254 nanos:85336051}"
Sep 9 22:17:37.662090 systemd[1]: Started sshd@28-10.230.51.18:22-139.178.68.195:55252.service - OpenSSH per-connection server daemon (139.178.68.195:55252).
Sep 9 22:17:38.605316 sshd[5714]: Accepted publickey for core from 139.178.68.195 port 55252 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:38.609485 sshd-session[5714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:38.620115 systemd-logind[1588]: New session 19 of user core.
Sep 9 22:17:38.626020 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 22:17:39.504200 sshd[5717]: Connection closed by 139.178.68.195 port 55252
Sep 9 22:17:39.505109 sshd-session[5714]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:39.513613 systemd-logind[1588]: Session 19 logged out. Waiting for processes to exit.
Sep 9 22:17:39.514651 systemd[1]: sshd@28-10.230.51.18:22-139.178.68.195:55252.service: Deactivated successfully.
Sep 9 22:17:39.519271 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 22:17:39.524746 systemd-logind[1588]: Removed session 19.
Sep 9 22:17:44.673771 systemd[1]: Started sshd@29-10.230.51.18:22-139.178.68.195:56296.service - OpenSSH per-connection server daemon (139.178.68.195:56296).
Sep 9 22:17:45.681406 sshd[5737]: Accepted publickey for core from 139.178.68.195 port 56296 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:45.683744 sshd-session[5737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:45.695854 systemd-logind[1588]: New session 20 of user core.
Sep 9 22:17:45.699989 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 22:17:45.718978 containerd[1617]: time="2025-09-09T22:17:45.718908335Z" level=info msg="TaskExit event in podsandbox handler container_id:\"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\" id:\"71d48d76ea07fb402b8ee2bfe1db37c4acea804747cff840a16c769a533c6a17\" pid:5753 exited_at:{seconds:1757456265 nanos:718031631}"
Sep 9 22:17:46.516881 sshd[5760]: Connection closed by 139.178.68.195 port 56296
Sep 9 22:17:46.517358 sshd-session[5737]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:46.524488 systemd-logind[1588]: Session 20 logged out. Waiting for processes to exit.
Sep 9 22:17:46.526333 systemd[1]: sshd@29-10.230.51.18:22-139.178.68.195:56296.service: Deactivated successfully.
Sep 9 22:17:46.530429 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 22:17:46.534083 systemd-logind[1588]: Removed session 20.
Sep 9 22:17:46.674595 systemd[1]: Started sshd@30-10.230.51.18:22-139.178.68.195:56310.service - OpenSSH per-connection server daemon (139.178.68.195:56310).
Sep 9 22:17:47.660831 sshd[5775]: Accepted publickey for core from 139.178.68.195 port 56310 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:47.663293 sshd-session[5775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:47.671112 systemd-logind[1588]: New session 21 of user core.
Sep 9 22:17:47.678980 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 22:17:48.055828 containerd[1617]: time="2025-09-09T22:17:48.055750017Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\" id:\"fc29ba19d6bfb431bce97e533cadde0514691d9135ef5bb8f1b251e2d51b8af2\" pid:5792 exited_at:{seconds:1757456268 nanos:54949798}"
Sep 9 22:17:48.801913 sshd[5778]: Connection closed by 139.178.68.195 port 56310
Sep 9 22:17:48.805446 sshd-session[5775]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:48.822794 systemd[1]: sshd@30-10.230.51.18:22-139.178.68.195:56310.service: Deactivated successfully.
Sep 9 22:17:48.826350 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 22:17:48.856495 systemd-logind[1588]: Session 21 logged out. Waiting for processes to exit.
Sep 9 22:17:48.858703 systemd-logind[1588]: Removed session 21.
Sep 9 22:17:48.962032 systemd[1]: Started sshd@31-10.230.51.18:22-139.178.68.195:56318.service - OpenSSH per-connection server daemon (139.178.68.195:56318).
Sep 9 22:17:49.586886 systemd[1]: Started sshd@32-10.230.51.18:22-85.209.134.43:42394.service - OpenSSH per-connection server daemon (85.209.134.43:42394).
Sep 9 22:17:49.934809 sshd[5812]: Accepted publickey for core from 139.178.68.195 port 56318 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:49.938652 sshd-session[5812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:49.955613 systemd-logind[1588]: New session 22 of user core.
Sep 9 22:17:49.961007 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 22:17:50.058940 sshd[5816]: Invalid user install from 85.209.134.43 port 42394
Sep 9 22:17:50.136846 sshd[5816]: Received disconnect from 85.209.134.43 port 42394:11: Bye Bye [preauth]
Sep 9 22:17:50.137216 sshd[5816]: Disconnected from invalid user install 85.209.134.43 port 42394 [preauth]
Sep 9 22:17:50.139641 systemd[1]: sshd@32-10.230.51.18:22-85.209.134.43:42394.service: Deactivated successfully.
Sep 9 22:17:51.626045 sshd[5819]: Connection closed by 139.178.68.195 port 56318
Sep 9 22:17:51.627500 sshd-session[5812]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:51.652038 systemd[1]: sshd@31-10.230.51.18:22-139.178.68.195:56318.service: Deactivated successfully.
Sep 9 22:17:51.657722 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 22:17:51.661484 systemd-logind[1588]: Session 22 logged out. Waiting for processes to exit.
Sep 9 22:17:51.664951 systemd-logind[1588]: Removed session 22.
Sep 9 22:17:51.781580 systemd[1]: Started sshd@33-10.230.51.18:22-139.178.68.195:36956.service - OpenSSH per-connection server daemon (139.178.68.195:36956).
Sep 9 22:17:52.709859 sshd[5851]: Accepted publickey for core from 139.178.68.195 port 36956 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:52.713441 sshd-session[5851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:52.724461 systemd-logind[1588]: New session 23 of user core.
Sep 9 22:17:52.732980 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 22:17:54.299685 sshd[5854]: Connection closed by 139.178.68.195 port 36956
Sep 9 22:17:54.300808 sshd-session[5851]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:54.309002 systemd-logind[1588]: Session 23 logged out. Waiting for processes to exit.
Sep 9 22:17:54.311045 systemd[1]: sshd@33-10.230.51.18:22-139.178.68.195:36956.service: Deactivated successfully.
Sep 9 22:17:54.317114 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 22:17:54.322825 systemd-logind[1588]: Removed session 23.
Sep 9 22:17:54.456181 systemd[1]: Started sshd@34-10.230.51.18:22-139.178.68.195:36960.service - OpenSSH per-connection server daemon (139.178.68.195:36960).
Sep 9 22:17:55.431644 sshd[5864]: Accepted publickey for core from 139.178.68.195 port 36960 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:17:55.435210 sshd-session[5864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:17:55.445031 systemd-logind[1588]: New session 24 of user core.
Sep 9 22:17:55.454052 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 22:17:56.289767 sshd[5867]: Connection closed by 139.178.68.195 port 36960
Sep 9 22:17:56.290653 sshd-session[5864]: pam_unix(sshd:session): session closed for user core
Sep 9 22:17:56.298772 systemd[1]: sshd@34-10.230.51.18:22-139.178.68.195:36960.service: Deactivated successfully.
Sep 9 22:17:56.303284 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 22:17:56.309143 systemd-logind[1588]: Session 24 logged out. Waiting for processes to exit.
Sep 9 22:17:56.310891 systemd-logind[1588]: Removed session 24.
Sep 9 22:17:56.881366 systemd[1]: Started sshd@35-10.230.51.18:22-14.194.76.134:56143.service - OpenSSH per-connection server daemon (14.194.76.134:56143).
Sep 9 22:17:58.305637 sshd[5886]: Invalid user server from 14.194.76.134 port 56143
Sep 9 22:17:58.583826 sshd[5886]: Received disconnect from 14.194.76.134 port 56143:11: Bye Bye [preauth]
Sep 9 22:17:58.583826 sshd[5886]: Disconnected from invalid user server 14.194.76.134 port 56143 [preauth]
Sep 9 22:17:58.584044 systemd[1]: sshd@35-10.230.51.18:22-14.194.76.134:56143.service: Deactivated successfully.
Sep 9 22:17:58.848515 containerd[1617]: time="2025-09-09T22:17:58.847843305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"34bfdc9a548e552d216e8626586a3cccaf20f21e374f804843cc953149dc4c5d\" id:\"7c93cf83a372fcbebc914cdfb61adb8b58f1694f4681dc429cb3acdd43535e44\" pid:5903 exited_at:{seconds:1757456278 nanos:847223756}"
Sep 9 22:18:01.450406 systemd[1]: Started sshd@36-10.230.51.18:22-139.178.68.195:33840.service - OpenSSH per-connection server daemon (139.178.68.195:33840).
Sep 9 22:18:01.577930 containerd[1617]: time="2025-09-09T22:18:01.574169826Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\" id:\"2a820f0c800fc2144d34ecd9e601dd648ff44a73dfa5cf6f99e4c52263cd4c5b\" pid:5928 exited_at:{seconds:1757456281 nanos:573135050}"
Sep 9 22:18:02.393802 sshd[5938]: Accepted publickey for core from 139.178.68.195 port 33840 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:18:02.395204 sshd-session[5938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:18:02.407375 systemd-logind[1588]: New session 25 of user core.
Sep 9 22:18:02.412005 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 22:18:03.270362 sshd[5942]: Connection closed by 139.178.68.195 port 33840
Sep 9 22:18:03.275255 sshd-session[5938]: pam_unix(sshd:session): session closed for user core
Sep 9 22:18:03.282973 systemd[1]: sshd@36-10.230.51.18:22-139.178.68.195:33840.service: Deactivated successfully.
Sep 9 22:18:03.287538 systemd[1]: session-25.scope: Deactivated successfully.
Sep 9 22:18:03.294132 systemd-logind[1588]: Session 25 logged out. Waiting for processes to exit.
Sep 9 22:18:03.295978 systemd-logind[1588]: Removed session 25.
Sep 9 22:18:03.874730 containerd[1617]: time="2025-09-09T22:18:03.874629094Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de3a13bf38fa0ad94607bd3596e1e79da38a0edcadd180b5c91b48ce4d5c4abf\" id:\"8a5365f9f7193143e2c8a5a9f007d3c4ccac3ac75147e489f933eb0d05a82cbb\" pid:5965 exited_at:{seconds:1757456283 nanos:873724133}"
Sep 9 22:18:08.431877 systemd[1]: Started sshd@37-10.230.51.18:22-139.178.68.195:33854.service - OpenSSH per-connection server daemon (139.178.68.195:33854).
Sep 9 22:18:09.382881 sshd[5976]: Accepted publickey for core from 139.178.68.195 port 33854 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:18:09.386072 sshd-session[5976]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:18:09.397516 systemd-logind[1588]: New session 26 of user core.
Sep 9 22:18:09.405346 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 9 22:18:10.160914 sshd[5979]: Connection closed by 139.178.68.195 port 33854
Sep 9 22:18:10.161778 sshd-session[5976]: pam_unix(sshd:session): session closed for user core
Sep 9 22:18:10.169234 systemd-logind[1588]: Session 26 logged out. Waiting for processes to exit.
Sep 9 22:18:10.170403 systemd[1]: sshd@37-10.230.51.18:22-139.178.68.195:33854.service: Deactivated successfully.
Sep 9 22:18:10.174727 systemd[1]: session-26.scope: Deactivated successfully.
Sep 9 22:18:10.179699 systemd-logind[1588]: Removed session 26.
Sep 9 22:18:10.210976 systemd[1]: Started sshd@38-10.230.51.18:22-103.146.52.252:42956.service - OpenSSH per-connection server daemon (103.146.52.252:42956).
Sep 9 22:18:11.024446 sshd[5993]: Invalid user lenovo from 103.146.52.252 port 42956
Sep 9 22:18:11.170818 sshd[5993]: Received disconnect from 103.146.52.252 port 42956:11: Bye Bye [preauth]
Sep 9 22:18:11.170818 sshd[5993]: Disconnected from invalid user lenovo 103.146.52.252 port 42956 [preauth]
Sep 9 22:18:11.174313 systemd[1]: sshd@38-10.230.51.18:22-103.146.52.252:42956.service: Deactivated successfully.
Sep 9 22:18:15.321669 systemd[1]: Started sshd@39-10.230.51.18:22-139.178.68.195:50700.service - OpenSSH per-connection server daemon (139.178.68.195:50700).
Sep 9 22:18:16.271931 sshd[5999]: Accepted publickey for core from 139.178.68.195 port 50700 ssh2: RSA SHA256:+qALU9k/mo8UhykNHlDcmxrEaPMwkARJbTQwajD+DYE
Sep 9 22:18:16.274175 sshd-session[5999]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:18:16.284838 systemd-logind[1588]: New session 27 of user core.
Sep 9 22:18:16.287948 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 9 22:18:17.115010 sshd[6002]: Connection closed by 139.178.68.195 port 50700
Sep 9 22:18:17.115559 sshd-session[5999]: pam_unix(sshd:session): session closed for user core
Sep 9 22:18:17.125064 systemd[1]: sshd@39-10.230.51.18:22-139.178.68.195:50700.service: Deactivated successfully.
Sep 9 22:18:17.130632 systemd[1]: session-27.scope: Deactivated successfully.
Sep 9 22:18:17.134190 systemd-logind[1588]: Session 27 logged out. Waiting for processes to exit.
Sep 9 22:18:17.136449 systemd-logind[1588]: Removed session 27.
Sep 9 22:18:18.249869 containerd[1617]: time="2025-09-09T22:18:18.249717241Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cceeb7fd5a5c1c12276e999ecfe6736105686d2a5a1ca50b861d42320b9d5a99\" id:\"525dea5e717568b01e49d5084f860b1bff3c60f6cfa10fcef8fb4a0194c74e42\" pid:6026 exited_at:{seconds:1757456298 nanos:249117886}"
Sep 9 22:18:19.091190 systemd[1]: Started sshd@40-10.230.51.18:22-172.245.45.194:54716.service - OpenSSH per-connection server daemon (172.245.45.194:54716).
Sep 9 22:18:19.953070 sshd[6039]: Invalid user hammer from 172.245.45.194 port 54716
Sep 9 22:18:20.098071 sshd[6039]: Received disconnect from 172.245.45.194 port 54716:11: Bye Bye [preauth]
Sep 9 22:18:20.098071 sshd[6039]: Disconnected from invalid user hammer 172.245.45.194 port 54716 [preauth]
Sep 9 22:18:20.101382 systemd[1]: sshd@40-10.230.51.18:22-172.245.45.194:54716.service: Deactivated successfully.