Sep 13 02:25:54.927388 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 22:15:39 -00 2025 Sep 13 02:25:54.927432 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=21b29c6e420cf06e0546ff797fc1285d986af130e4ba1abb9f27cb6343b53294 Sep 13 02:25:54.927450 kernel: BIOS-provided physical RAM map: Sep 13 02:25:54.927460 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Sep 13 02:25:54.927470 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Sep 13 02:25:54.927479 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Sep 13 02:25:54.927490 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable Sep 13 02:25:54.927500 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved Sep 13 02:25:54.927510 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Sep 13 02:25:54.927520 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Sep 13 02:25:54.927534 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 13 02:25:54.927544 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Sep 13 02:25:54.927554 kernel: NX (Execute Disable) protection: active Sep 13 02:25:54.927564 kernel: APIC: Static calls initialized Sep 13 02:25:54.927575 kernel: SMBIOS 2.8 present. Sep 13 02:25:54.927590 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014 Sep 13 02:25:54.927601 kernel: DMI: Memory slots populated: 1/1 Sep 13 02:25:54.927612 kernel: Hypervisor detected: KVM Sep 13 02:25:54.927622 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 13 02:25:54.927633 kernel: kvm-clock: using sched offset of 5543942596 cycles Sep 13 02:25:54.927644 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 13 02:25:54.927670 kernel: tsc: Detected 2799.998 MHz processor Sep 13 02:25:54.927682 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 13 02:25:54.927693 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 13 02:25:54.927704 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000 Sep 13 02:25:54.927720 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Sep 13 02:25:54.927731 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 13 02:25:54.927742 kernel: Using GB pages for direct mapping Sep 13 02:25:54.927753 kernel: ACPI: Early table checksum verification disabled Sep 13 02:25:54.927763 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS ) Sep 13 02:25:54.927774 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 13 02:25:54.927785 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 13 02:25:54.927796 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 13 02:25:54.927807 kernel: ACPI: FACS 0x000000007FFDFD40 000040 Sep 13 02:25:54.927822 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 13 02:25:54.927833 
kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 13 02:25:54.927844 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 13 02:25:54.927855 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 13 02:25:54.927866 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480] Sep 13 02:25:54.927877 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c] Sep 13 02:25:54.927893 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f] Sep 13 02:25:54.927908 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570] Sep 13 02:25:54.927919 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740] Sep 13 02:25:54.927931 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c] Sep 13 02:25:54.927942 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4] Sep 13 02:25:54.927954 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Sep 13 02:25:54.927965 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Sep 13 02:25:54.927976 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug Sep 13 02:25:54.927992 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00001000-0x7ffdbfff] Sep 13 02:25:54.928004 kernel: NODE_DATA(0) allocated [mem 0x7ffd4dc0-0x7ffdbfff] Sep 13 02:25:54.928015 kernel: Zone ranges: Sep 13 02:25:54.928041 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 13 02:25:54.928054 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff] Sep 13 02:25:54.928066 kernel: Normal empty Sep 13 02:25:54.928077 kernel: Device empty Sep 13 02:25:54.928088 kernel: Movable zone start for each node Sep 13 02:25:54.928099 kernel: Early memory node ranges Sep 13 02:25:54.928115 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Sep 13 02:25:54.928127 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff] Sep 13 02:25:54.928139 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff] Sep 13 02:25:54.928150 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 13 02:25:54.928161 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Sep 13 02:25:54.928172 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges Sep 13 02:25:54.928184 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 13 02:25:54.928195 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 13 02:25:54.928206 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 13 02:25:54.928222 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 13 02:25:54.928233 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 13 02:25:54.928245 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 13 02:25:54.928256 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 13 02:25:54.928267 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 13 02:25:54.928278 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 13 02:25:54.928290 kernel: TSC deadline timer available Sep 13 02:25:54.928301 kernel: CPU topo: Max. logical packages: 16 Sep 13 02:25:54.928312 kernel: CPU topo: Max. logical dies: 16 Sep 13 02:25:54.928324 kernel: CPU topo: Max. dies per package: 1 Sep 13 02:25:54.928339 kernel: CPU topo: Max. threads per core: 1 Sep 13 02:25:54.928350 kernel: CPU topo: Num. 
cores per package: 1 Sep 13 02:25:54.928361 kernel: CPU topo: Num. threads per package: 1 Sep 13 02:25:54.928373 kernel: CPU topo: Allowing 2 present CPUs plus 14 hotplug CPUs Sep 13 02:25:54.928384 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 13 02:25:54.928395 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Sep 13 02:25:54.928407 kernel: Booting paravirtualized kernel on KVM Sep 13 02:25:54.928418 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 13 02:25:54.928430 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 13 02:25:54.928445 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 13 02:25:54.928457 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 13 02:25:54.928468 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 13 02:25:54.928479 kernel: kvm-guest: PV spinlocks enabled Sep 13 02:25:54.928490 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 13 02:25:54.928503 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=21b29c6e420cf06e0546ff797fc1285d986af130e4ba1abb9f27cb6343b53294 Sep 13 02:25:54.928515 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 13 02:25:54.928526 kernel: random: crng init done Sep 13 02:25:54.928542 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 13 02:25:54.928553 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Sep 13 02:25:54.928565 kernel: Fallback order for Node 0: 0 Sep 13 02:25:54.928576 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524154 Sep 13 02:25:54.928587 kernel: Policy zone: DMA32 Sep 13 02:25:54.928599 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 13 02:25:54.928610 kernel: software IO TLB: area num 16. Sep 13 02:25:54.928622 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 13 02:25:54.928633 kernel: Kernel/User page tables isolation: enabled Sep 13 02:25:54.928657 kernel: ftrace: allocating 40122 entries in 157 pages Sep 13 02:25:54.928670 kernel: ftrace: allocated 157 pages with 5 groups Sep 13 02:25:54.928682 kernel: Dynamic Preempt: voluntary Sep 13 02:25:54.928693 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 13 02:25:54.928705 kernel: rcu: RCU event tracing is enabled. Sep 13 02:25:54.928716 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 13 02:25:54.928728 kernel: Trampoline variant of Tasks RCU enabled. Sep 13 02:25:54.928739 kernel: Rude variant of Tasks RCU enabled. Sep 13 02:25:54.928751 kernel: Tracing variant of Tasks RCU enabled. Sep 13 02:25:54.928766 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 13 02:25:54.928778 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 13 02:25:54.928790 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 13 02:25:54.928801 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
Sep 13 02:25:54.928813 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 13 02:25:54.928824 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16 Sep 13 02:25:54.928836 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 13 02:25:54.928860 kernel: Console: colour VGA+ 80x25 Sep 13 02:25:54.928872 kernel: printk: legacy console [tty0] enabled Sep 13 02:25:54.928884 kernel: printk: legacy console [ttyS0] enabled Sep 13 02:25:54.928896 kernel: ACPI: Core revision 20240827 Sep 13 02:25:54.928907 kernel: APIC: Switch to symmetric I/O mode setup Sep 13 02:25:54.928923 kernel: x2apic enabled Sep 13 02:25:54.928936 kernel: APIC: Switched APIC routing to: physical x2apic Sep 13 02:25:54.928948 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Sep 13 02:25:54.928960 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998) Sep 13 02:25:54.928972 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 13 02:25:54.928988 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Sep 13 02:25:54.929000 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Sep 13 02:25:54.929011 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 13 02:25:54.929023 kernel: Spectre V2 : Mitigation: Retpolines Sep 13 02:25:54.930353 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 13 02:25:54.930367 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls Sep 13 02:25:54.930379 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 13 02:25:54.930391 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 13 02:25:54.930403 kernel: MDS: Mitigation: Clear CPU buffers Sep 13 02:25:54.930414 kernel: MMIO Stale Data: Unknown: No mitigations Sep 13 02:25:54.930432 kernel: SRBDS: Unknown: Dependent on hypervisor status Sep 13 02:25:54.930444 kernel: active return thunk: its_return_thunk Sep 13 02:25:54.930457 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 13 02:25:54.930469 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 13 02:25:54.930481 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 13 02:25:54.930493 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 13 02:25:54.930504 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 13 02:25:54.930516 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. Sep 13 02:25:54.930528 kernel: Freeing SMP alternatives memory: 32K Sep 13 02:25:54.930540 kernel: pid_max: default: 32768 minimum: 301 Sep 13 02:25:54.930552 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 13 02:25:54.930568 kernel: landlock: Up and running. Sep 13 02:25:54.930580 kernel: SELinux: Initializing. Sep 13 02:25:54.930592 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 13 02:25:54.930604 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 13 02:25:54.930616 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9) Sep 13 02:25:54.930628 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only. 
Sep 13 02:25:54.930640 kernel: signal: max sigframe size: 1776 Sep 13 02:25:54.930664 kernel: rcu: Hierarchical SRCU implementation. Sep 13 02:25:54.930678 kernel: rcu: Max phase no-delay instances is 400. Sep 13 02:25:54.930690 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 13 02:25:54.930706 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Sep 13 02:25:54.930719 kernel: smp: Bringing up secondary CPUs ... Sep 13 02:25:54.930731 kernel: smpboot: x86: Booting SMP configuration: Sep 13 02:25:54.930742 kernel: .... node #0, CPUs: #1 Sep 13 02:25:54.930754 kernel: smp: Brought up 1 node, 2 CPUs Sep 13 02:25:54.930766 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS) Sep 13 02:25:54.930779 kernel: Memory: 1897728K/2096616K available (14336K kernel code, 2432K rwdata, 9960K rodata, 53828K init, 1088K bss, 192880K reserved, 0K cma-reserved) Sep 13 02:25:54.930791 kernel: devtmpfs: initialized Sep 13 02:25:54.930803 kernel: x86/mm: Memory block size: 128MB Sep 13 02:25:54.930819 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 13 02:25:54.930831 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 13 02:25:54.930843 kernel: pinctrl core: initialized pinctrl subsystem Sep 13 02:25:54.930855 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 13 02:25:54.930867 kernel: audit: initializing netlink subsys (disabled) Sep 13 02:25:54.930879 kernel: audit: type=2000 audit(1757730351.293:1): state=initialized audit_enabled=0 res=1 Sep 13 02:25:54.930891 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 13 02:25:54.930903 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 13 02:25:54.930915 kernel: cpuidle: using governor menu Sep 13 02:25:54.930931 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 13 02:25:54.930943 kernel: dca service started, version 1.12.1 Sep 13 02:25:54.930955 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Sep 13 02:25:54.930967 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry Sep 13 02:25:54.930979 kernel: PCI: Using configuration type 1 for base access Sep 13 02:25:54.930991 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Sep 13 02:25:54.931003 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 13 02:25:54.931015 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 13 02:25:54.932232 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 13 02:25:54.932258 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 13 02:25:54.932271 kernel: ACPI: Added _OSI(Module Device) Sep 13 02:25:54.932284 kernel: ACPI: Added _OSI(Processor Device) Sep 13 02:25:54.932296 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 13 02:25:54.932308 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 13 02:25:54.932320 kernel: ACPI: Interpreter enabled Sep 13 02:25:54.932332 kernel: ACPI: PM: (supports S0 S5) Sep 13 02:25:54.932344 kernel: ACPI: Using IOAPIC for interrupt routing Sep 13 02:25:54.932356 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 13 02:25:54.932372 kernel: PCI: Using E820 reservations for host bridge windows Sep 13 02:25:54.932384 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 13 02:25:54.932396 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 13 02:25:54.932669 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 13 02:25:54.932826 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 13 02:25:54.932993 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 13 02:25:54.933012 kernel: PCI host bridge to bus 0000:00 Sep 13 02:25:54.933589 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 13 02:25:54.933752 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 13 02:25:54.933888 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 13 02:25:54.934020 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window] Sep 13 02:25:54.934177 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Sep 13 02:25:54.934310 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window] Sep 13 02:25:54.934443 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 13 02:25:54.934635 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 13 02:25:54.934830 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 conventional PCI endpoint Sep 13 02:25:54.934980 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfa000000-0xfbffffff pref] Sep 13 02:25:54.937174 kernel: pci 0000:00:01.0: BAR 1 [mem 0xfea50000-0xfea50fff] Sep 13 02:25:54.937335 kernel: pci 0000:00:01.0: ROM [mem 0xfea40000-0xfea4ffff pref] Sep 13 02:25:54.937487 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 13 02:25:54.937669 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 13 02:25:54.937831 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea51000-0xfea51fff] Sep 13 02:25:54.937992 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 13 02:25:54.938171 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Sep 13 02:25:54.938320 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Sep 13 02:25:54.938491 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 13 02:25:54.942093 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea52000-0xfea52fff] Sep 13 02:25:54.942274 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 13 
02:25:54.942437 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Sep 13 02:25:54.942592 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 13 02:25:54.942777 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 13 02:25:54.942929 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea53000-0xfea53fff] Sep 13 02:25:54.943099 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 13 02:25:54.943249 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Sep 13 02:25:54.943396 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 13 02:25:54.943559 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 13 02:25:54.943722 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea54000-0xfea54fff] Sep 13 02:25:54.943869 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 13 02:25:54.944014 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Sep 13 02:25:54.947216 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 13 02:25:54.947405 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 13 02:25:54.947579 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea55000-0xfea55fff] Sep 13 02:25:54.947752 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 13 02:25:54.947901 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Sep 13 02:25:54.948108 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 13 02:25:54.948281 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 13 02:25:54.948430 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea56000-0xfea56fff] Sep 13 02:25:54.948578 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 13 02:25:54.948741 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Sep 13 02:25:54.948896 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 13 02:25:54.953253 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 13 02:25:54.953420 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea57000-0xfea57fff] Sep 13 02:25:54.953596 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 13 02:25:54.953760 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Sep 13 02:25:54.953909 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 13 02:25:54.954093 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 13 02:25:54.954246 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea58000-0xfea58fff] Sep 13 02:25:54.954392 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 13 02:25:54.954539 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Sep 13 02:25:54.954700 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 13 02:25:54.954859 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 13 02:25:54.955020 kernel: pci 0000:00:03.0: BAR 0 [io 0xc0c0-0xc0df] Sep 13 02:25:54.955203 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfea59000-0xfea59fff] Sep 13 02:25:54.955350 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref] Sep 13 02:25:54.955497 kernel: pci 0000:00:03.0: ROM [mem 0xfea00000-0xfea3ffff pref] Sep 13 02:25:54.955662 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 13 02:25:54.955813 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f] Sep 13 02:25:54.955967 kernel: pci 0000:00:04.0: BAR 1 
[mem 0xfea5a000-0xfea5afff] Sep 13 02:25:54.956143 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfd004000-0xfd007fff 64bit pref] Sep 13 02:25:54.956312 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 13 02:25:54.956460 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 13 02:25:54.956615 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 13 02:25:54.956773 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc0e0-0xc0ff] Sep 13 02:25:54.956921 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea5b000-0xfea5bfff] Sep 13 02:25:54.958390 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 13 02:25:54.958572 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Sep 13 02:25:54.958772 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 13 02:25:54.958927 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfda00000-0xfda000ff 64bit] Sep 13 02:25:54.959100 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 13 02:25:54.959265 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Sep 13 02:25:54.959427 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 13 02:25:54.959597 kernel: pci_bus 0000:02: extended config space not accessible Sep 13 02:25:54.959785 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 conventional PCI endpoint Sep 13 02:25:54.959953 kernel: pci 0000:02:01.0: BAR 0 [mem 0xfd800000-0xfd80000f] Sep 13 02:25:54.962950 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 13 02:25:54.963149 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Sep 13 02:25:54.963311 kernel: pci 0000:03:00.0: BAR 0 [mem 0xfe800000-0xfe803fff 64bit] Sep 13 02:25:54.963464 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 13 02:25:54.963631 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Sep 13 02:25:54.963810 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref] Sep 13 02:25:54.963963 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 13 02:25:54.964130 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 13 02:25:54.964284 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 13 02:25:54.964458 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 13 02:25:54.964643 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 13 02:25:54.964806 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 13 02:25:54.964831 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 13 02:25:54.964844 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 13 02:25:54.964857 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 13 02:25:54.964869 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 13 02:25:54.964881 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 13 02:25:54.964893 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 13 02:25:54.964905 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 13 02:25:54.964918 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 13 02:25:54.964934 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 13 02:25:54.964946 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 13 02:25:54.964958 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 13 02:25:54.964970 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 13 02:25:54.964982 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 
13 02:25:54.964995 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 13 02:25:54.965007 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Sep 13 02:25:54.965019 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 13 02:25:54.968141 kernel: iommu: Default domain type: Translated Sep 13 02:25:54.968171 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 13 02:25:54.968184 kernel: PCI: Using ACPI for IRQ routing Sep 13 02:25:54.968197 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 13 02:25:54.968209 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Sep 13 02:25:54.968221 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff] Sep 13 02:25:54.968431 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 13 02:25:54.968607 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 13 02:25:54.968776 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 13 02:25:54.968795 kernel: vgaarb: loaded Sep 13 02:25:54.968815 kernel: clocksource: Switched to clocksource kvm-clock Sep 13 02:25:54.968827 kernel: VFS: Disk quotas dquot_6.6.0 Sep 13 02:25:54.968840 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 13 02:25:54.968852 kernel: pnp: PnP ACPI init Sep 13 02:25:54.969012 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved Sep 13 02:25:54.970055 kernel: pnp: PnP ACPI: found 5 devices Sep 13 02:25:54.970071 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 13 02:25:54.970084 kernel: NET: Registered PF_INET protocol family Sep 13 02:25:54.970103 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 13 02:25:54.970116 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 13 02:25:54.970128 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 13 02:25:54.970140 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 13 02:25:54.970152 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 13 02:25:54.970164 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 13 02:25:54.970177 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 13 02:25:54.970189 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 13 02:25:54.970205 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 13 02:25:54.970217 kernel: NET: Registered PF_XDP protocol family Sep 13 02:25:54.970377 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000 Sep 13 02:25:54.970550 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Sep 13 02:25:54.970734 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Sep 13 02:25:54.970886 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Sep 13 02:25:54.973077 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 13 02:25:54.973254 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 13 02:25:54.973410 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 13 02:25:54.973584 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 13 02:25:54.973763 kernel: pci 0000:00:02.0: bridge window [io 
0x1000-0x1fff]: assigned Sep 13 02:25:54.973917 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Sep 13 02:25:54.974631 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned Sep 13 02:25:54.974797 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Sep 13 02:25:54.974948 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Sep 13 02:25:54.975160 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Sep 13 02:25:54.975318 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Sep 13 02:25:54.975466 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Sep 13 02:25:54.975664 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 13 02:25:54.975857 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Sep 13 02:25:54.976006 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 13 02:25:54.978192 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Sep 13 02:25:54.978362 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Sep 13 02:25:54.978519 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Sep 13 02:25:54.978684 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 13 02:25:54.978843 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Sep 13 02:25:54.978994 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Sep 13 02:25:54.979179 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 13 02:25:54.979330 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 13 02:25:54.979479 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Sep 13 02:25:54.979627 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Sep 13 02:25:54.979787 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 13 02:25:54.979945 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 13 02:25:54.980116 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Sep 13 02:25:54.980267 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Sep 13 02:25:54.980416 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 13 02:25:54.980572 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 13 02:25:54.980734 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Sep 13 02:25:54.980885 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Sep 13 02:25:54.983053 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 13 02:25:54.983232 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 13 02:25:54.983390 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Sep 13 02:25:54.983546 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Sep 13 02:25:54.983711 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 13 02:25:54.983864 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 13 02:25:54.984014 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Sep 13 02:25:54.984190 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Sep 13 02:25:54.984340 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 13 02:25:54.984489 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 13 02:25:54.984638 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Sep 13 02:25:54.984802 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Sep 13 02:25:54.984951 kernel: pci 
0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 13 02:25:54.987133 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 13 02:25:54.987299 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 13 02:25:54.987458 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 13 02:25:54.987607 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window] Sep 13 02:25:54.987755 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Sep 13 02:25:54.987892 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window] Sep 13 02:25:54.988067 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Sep 13 02:25:54.988213 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff] Sep 13 02:25:54.988363 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Sep 13 02:25:54.988556 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff] Sep 13 02:25:54.988721 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff] Sep 13 02:25:54.988862 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff] Sep 13 02:25:54.989001 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 13 02:25:54.995087 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff] Sep 13 02:25:54.995249 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff] Sep 13 02:25:54.995403 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 13 02:25:54.995586 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Sep 13 02:25:54.995751 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff] Sep 13 02:25:54.995893 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 13 02:25:54.996090 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff] Sep 13 02:25:54.996234 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff] Sep 13 02:25:54.996373 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 13 02:25:54.996520 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff] Sep 13 02:25:54.996680 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff] Sep 13 02:25:54.996821 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 13 02:25:54.996969 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff] Sep 13 02:25:54.997144 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Sep 13 02:25:54.997286 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 13 02:25:54.997439 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff] Sep 13 02:25:54.997578 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Sep 13 02:25:54.997738 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 13 02:25:54.997759 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 13 02:25:54.997773 kernel: PCI: CLS 0 bytes, default 64 Sep 13 02:25:54.997786 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 13 02:25:54.997799 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Sep 13 02:25:54.997812 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 13 02:25:54.997825 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Sep 13 02:25:54.997844 kernel: Initialise system trusted keyrings Sep 13 02:25:54.997861 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 13 02:25:54.997874 
kernel: Key type asymmetric registered Sep 13 02:25:54.997886 kernel: Asymmetric key parser 'x509' registered Sep 13 02:25:54.997899 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 13 02:25:54.997911 kernel: io scheduler mq-deadline registered Sep 13 02:25:54.997924 kernel: io scheduler kyber registered Sep 13 02:25:54.997937 kernel: io scheduler bfq registered Sep 13 02:25:54.998107 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 13 02:25:54.998261 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 13 02:25:54.998419 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 02:25:54.998569 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 13 02:25:54.998733 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 13 02:25:54.998883 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 02:25:54.999047 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 13 02:25:54.999200 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 13 02:25:54.999357 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 02:25:54.999517 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 13 02:25:54.999686 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Sep 13 02:25:54.999839 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 02:25:54.999989 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Sep 13 02:25:55.004182 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 13 02:25:55.004351 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 02:25:55.004505 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 13 02:25:55.004669 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 13 02:25:55.004823 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 02:25:55.004973 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 13 02:25:55.005155 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 13 02:25:55.005313 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 02:25:55.005465 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 13 02:25:55.005613 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Sep 13 02:25:55.005775 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 02:25:55.005795 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 13 02:25:55.005809 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 13 02:25:55.005829 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 13 02:25:55.005841 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 13 02:25:55.005854 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 13 02:25:55.005867 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 13 
02:25:55.005880 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 13 02:25:55.005893 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 13 02:25:55.006059 kernel: rtc_cmos 00:03: RTC can wake from S4 Sep 13 02:25:55.006080 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 13 02:25:55.006225 kernel: rtc_cmos 00:03: registered as rtc0 Sep 13 02:25:55.006363 kernel: rtc_cmos 00:03: setting system clock to 2025-09-13T02:25:54 UTC (1757730354) Sep 13 02:25:55.006500 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Sep 13 02:25:55.006519 kernel: intel_pstate: CPU model not supported Sep 13 02:25:55.006532 kernel: NET: Registered PF_INET6 protocol family Sep 13 02:25:55.006545 kernel: Segment Routing with IPv6 Sep 13 02:25:55.006558 kernel: In-situ OAM (IOAM) with IPv6 Sep 13 02:25:55.006571 kernel: NET: Registered PF_PACKET protocol family Sep 13 02:25:55.006584 kernel: Key type dns_resolver registered Sep 13 02:25:55.006602 kernel: IPI shorthand broadcast: enabled Sep 13 02:25:55.006615 kernel: sched_clock: Marking stable (3366004013, 219480603)->(3709179140, -123694524) Sep 13 02:25:55.006628 kernel: registered taskstats version 1 Sep 13 02:25:55.006641 kernel: Loading compiled-in X.509 certificates Sep 13 02:25:55.006665 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: dd6b45f5ed9ac8d42d60bdb17f83ef06c8bcd8f6' Sep 13 02:25:55.006678 kernel: Demotion targets for Node 0: null Sep 13 02:25:55.006690 kernel: Key type .fscrypt registered Sep 13 02:25:55.006703 kernel: Key type fscrypt-provisioning registered Sep 13 02:25:55.006715 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 13 02:25:55.006734 kernel: ima: Allocated hash algorithm: sha1 Sep 13 02:25:55.006747 kernel: ima: No architecture policies found Sep 13 02:25:55.006763 kernel: clk: Disabling unused clocks Sep 13 02:25:55.006776 kernel: Warning: unable to open an initial console. Sep 13 02:25:55.006789 kernel: Freeing unused kernel image (initmem) memory: 53828K Sep 13 02:25:55.006802 kernel: Write protecting the kernel read-only data: 24576k Sep 13 02:25:55.006814 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 13 02:25:55.006831 kernel: Run /init as init process Sep 13 02:25:55.006843 kernel: with arguments: Sep 13 02:25:55.006860 kernel: /init Sep 13 02:25:55.006872 kernel: with environment: Sep 13 02:25:55.006885 kernel: HOME=/ Sep 13 02:25:55.006897 kernel: TERM=linux Sep 13 02:25:55.006909 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 13 02:25:55.006931 systemd[1]: Successfully made /usr/ read-only. Sep 13 02:25:55.006949 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 13 02:25:55.006969 systemd[1]: Detected virtualization kvm. Sep 13 02:25:55.006982 systemd[1]: Detected architecture x86-64. Sep 13 02:25:55.006995 systemd[1]: Running in initrd. Sep 13 02:25:55.007008 systemd[1]: No hostname configured, using default hostname. Sep 13 02:25:55.007021 systemd[1]: Hostname set to . Sep 13 02:25:55.007055 systemd[1]: Initializing machine ID from VM UUID. Sep 13 02:25:55.007068 systemd[1]: Queued start job for default target initrd.target. 
Sep 13 02:25:55.007082 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 13 02:25:55.007095 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 13 02:25:55.007115 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 13 02:25:55.007128 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 13 02:25:55.007142 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 13 02:25:55.007156 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 13 02:25:55.007171 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 13 02:25:55.007184 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 13 02:25:55.007202 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 13 02:25:55.007216 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 13 02:25:55.007230 systemd[1]: Reached target paths.target - Path Units. Sep 13 02:25:55.007243 systemd[1]: Reached target slices.target - Slice Units. Sep 13 02:25:55.007257 systemd[1]: Reached target swap.target - Swaps. Sep 13 02:25:55.007270 systemd[1]: Reached target timers.target - Timer Units. Sep 13 02:25:55.007284 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 13 02:25:55.007297 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 13 02:25:55.007310 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 13 02:25:55.007328 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 13 02:25:55.007341 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 13 02:25:55.007355 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 13 02:25:55.007369 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 13 02:25:55.007382 systemd[1]: Reached target sockets.target - Socket Units. Sep 13 02:25:55.007395 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 13 02:25:55.007409 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 13 02:25:55.007422 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 13 02:25:55.007440 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 13 02:25:55.007453 systemd[1]: Starting systemd-fsck-usr.service... Sep 13 02:25:55.007466 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 13 02:25:55.007480 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 13 02:25:55.007493 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 02:25:55.007557 systemd-journald[230]: Collecting audit messages is disabled. Sep 13 02:25:55.007595 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 13 02:25:55.007610 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 13 02:25:55.007624 systemd[1]: Finished systemd-fsck-usr.service. 
Sep 13 02:25:55.007642 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 13 02:25:55.007668 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 13 02:25:55.007683 systemd-journald[230]: Journal started Sep 13 02:25:55.007712 systemd-journald[230]: Runtime Journal (/run/log/journal/549e75df26ce436a88731fda20073c51) is 4.7M, max 38.2M, 33.4M free. Sep 13 02:25:54.965375 systemd-modules-load[232]: Inserted module 'overlay' Sep 13 02:25:55.046501 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 13 02:25:55.046534 kernel: Bridge firewalling registered Sep 13 02:25:55.046552 systemd[1]: Started systemd-journald.service - Journal Service. Sep 13 02:25:55.016066 systemd-modules-load[232]: Inserted module 'br_netfilter' Sep 13 02:25:55.048676 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 13 02:25:55.049734 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 02:25:55.054672 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 13 02:25:55.057197 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 13 02:25:55.061119 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 13 02:25:55.065170 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 13 02:25:55.079000 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 13 02:25:55.083404 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 13 02:25:55.087750 systemd-tmpfiles[251]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 13 02:25:55.095175 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 13 02:25:55.097886 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 13 02:25:55.098852 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 13 02:25:55.102229 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 13 02:25:55.136175 dracut-cmdline[269]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=21b29c6e420cf06e0546ff797fc1285d986af130e4ba1abb9f27cb6343b53294 Sep 13 02:25:55.159677 systemd-resolved[268]: Positive Trust Anchors: Sep 13 02:25:55.160620 systemd-resolved[268]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 13 02:25:55.160676 systemd-resolved[268]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 13 02:25:55.168573 systemd-resolved[268]: Defaulting to hostname 'linux'. Sep 13 02:25:55.171696 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 13 02:25:55.172505 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 13 02:25:55.242109 kernel: SCSI subsystem initialized Sep 13 02:25:55.253071 kernel: Loading iSCSI transport class v2.0-870. Sep 13 02:25:55.266146 kernel: iscsi: registered transport (tcp) Sep 13 02:25:55.291380 kernel: iscsi: registered transport (qla4xxx) Sep 13 02:25:55.291460 kernel: QLogic iSCSI HBA Driver Sep 13 02:25:55.314927 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 13 02:25:55.331813 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 13 02:25:55.333335 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 13 02:25:55.394409 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 13 02:25:55.397775 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 13 02:25:55.456082 kernel: raid6: sse2x4 gen() 14676 MB/s Sep 13 02:25:55.474069 kernel: raid6: sse2x2 gen() 10083 MB/s Sep 13 02:25:55.492527 kernel: raid6: sse2x1 gen() 10447 MB/s Sep 13 02:25:55.492589 kernel: raid6: using algorithm sse2x4 gen() 14676 MB/s Sep 13 02:25:55.511545 kernel: raid6: .... xor() 8337 MB/s, rmw enabled Sep 13 02:25:55.511628 kernel: raid6: using ssse3x2 recovery algorithm Sep 13 02:25:55.536071 kernel: xor: automatically using best checksumming function avx Sep 13 02:25:55.715080 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 13 02:25:55.722882 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 13 02:25:55.726304 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 13 02:25:55.763232 systemd-udevd[478]: Using default interface naming scheme 'v255'. Sep 13 02:25:55.773158 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 13 02:25:55.776452 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 13 02:25:55.808715 dracut-pre-trigger[483]: rd.md=0: removing MD RAID activation Sep 13 02:25:55.840973 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 13 02:25:55.843493 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 13 02:25:55.954847 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 13 02:25:55.961745 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Sep 13 02:25:56.073081 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Sep 13 02:25:56.083054 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Sep 13 02:25:56.097078 kernel: cryptd: max_cpu_qlen set to 1000 Sep 13 02:25:56.106374 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 13 02:25:56.106472 kernel: GPT:17805311 != 125829119 Sep 13 02:25:56.106491 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 13 02:25:56.106508 kernel: GPT:17805311 != 125829119 Sep 13 02:25:56.106523 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 13 02:25:56.106539 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 02:25:56.121080 kernel: AES CTR mode by8 optimization enabled Sep 13 02:25:56.140089 kernel: libata version 3.00 loaded. Sep 13 02:25:56.171113 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 13 02:25:56.175178 kernel: ahci 0000:00:1f.2: version 3.0 Sep 13 02:25:56.175498 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 13 02:25:56.176081 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 13 02:25:56.177422 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 02:25:56.179730 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 02:25:56.191512 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 13 02:25:56.191764 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 13 02:25:56.191973 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 13 02:25:56.192179 kernel: scsi host0: ahci Sep 13 02:25:56.186324 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 02:25:56.194136 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 13 02:25:56.200449 kernel: scsi host1: ahci Sep 13 02:25:56.200717 kernel: scsi host2: ahci Sep 13 02:25:56.200905 kernel: scsi host3: ahci Sep 13 02:25:56.202556 kernel: scsi host4: ahci Sep 13 02:25:56.202600 kernel: ACPI: bus type USB registered Sep 13 02:25:56.205059 kernel: usbcore: registered new interface driver usbfs Sep 13 02:25:56.209163 kernel: scsi host5: ahci Sep 13 02:25:56.214129 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 lpm-pol 1 Sep 13 02:25:56.214184 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 lpm-pol 1 Sep 13 02:25:56.214209 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 lpm-pol 1 Sep 13 02:25:56.214238 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 lpm-pol 1 Sep 13 02:25:56.214253 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 lpm-pol 1 Sep 13 02:25:56.214268 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 lpm-pol 1 Sep 13 02:25:56.228184 kernel: usbcore: registered new interface driver hub Sep 13 02:25:56.228244 kernel: usbcore: registered new device driver usb Sep 13 02:25:56.244733 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 13 02:25:56.288927 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 13 02:25:56.340602 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. 
Sep 13 02:25:56.341729 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 02:25:56.354661 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 13 02:25:56.367412 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 13 02:25:56.370229 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 13 02:25:56.390406 disk-uuid[630]: Primary Header is updated. Sep 13 02:25:56.390406 disk-uuid[630]: Secondary Entries is updated. Sep 13 02:25:56.390406 disk-uuid[630]: Secondary Header is updated. Sep 13 02:25:56.396070 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 02:25:56.403051 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 02:25:56.525597 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 13 02:25:56.525682 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 13 02:25:56.526062 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 13 02:25:56.528068 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 13 02:25:56.532421 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 13 02:25:56.532455 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 13 02:25:56.585396 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 13 02:25:56.585695 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Sep 13 02:25:56.589050 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 13 02:25:56.593302 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 13 02:25:56.593528 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Sep 13 02:25:56.595339 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Sep 13 02:25:56.596947 kernel: hub 1-0:1.0: USB hub found Sep 13 02:25:56.600887 kernel: hub 1-0:1.0: 4 ports detected Sep 13 02:25:56.601121 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Sep 13 02:25:56.603601 kernel: hub 2-0:1.0: USB hub found Sep 13 02:25:56.603840 kernel: hub 2-0:1.0: 4 ports detected Sep 13 02:25:56.620865 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 13 02:25:56.622465 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 13 02:25:56.623508 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 13 02:25:56.625200 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 13 02:25:56.627983 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 13 02:25:56.659835 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 13 02:25:56.838156 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 13 02:25:56.979064 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 13 02:25:56.986209 kernel: usbcore: registered new interface driver usbhid Sep 13 02:25:56.986257 kernel: usbhid: USB HID core driver Sep 13 02:25:56.993068 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Sep 13 02:25:56.997065 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Sep 13 02:25:57.406807 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 02:25:57.407477 disk-uuid[631]: The operation has completed successfully. 
Sep 13 02:25:57.454744 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 13 02:25:57.454904 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 13 02:25:57.508320 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 13 02:25:57.535754 sh[656]: Success Sep 13 02:25:57.560507 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 13 02:25:57.560595 kernel: device-mapper: uevent: version 1.0.3 Sep 13 02:25:57.561553 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 13 02:25:57.575073 kernel: device-mapper: verity: sha256 using shash "sha256-avx" Sep 13 02:25:57.635965 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 13 02:25:57.637671 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 13 02:25:57.651982 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 13 02:25:57.667074 kernel: BTRFS: device fsid ca815b72-c68a-4b5e-8622-cfb6842bab47 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (668) Sep 13 02:25:57.672090 kernel: BTRFS info (device dm-0): first mount of filesystem ca815b72-c68a-4b5e-8622-cfb6842bab47 Sep 13 02:25:57.672130 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 13 02:25:57.681047 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 13 02:25:57.681093 kernel: BTRFS info (device dm-0): enabling free space tree Sep 13 02:25:57.684683 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 13 02:25:57.686545 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 13 02:25:57.688258 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 13 02:25:57.690307 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 13 02:25:57.694159 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 13 02:25:57.728074 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (699) Sep 13 02:25:57.732381 kernel: BTRFS info (device vda6): first mount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 02:25:57.732416 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 02:25:57.739291 kernel: BTRFS info (device vda6): turning on async discard Sep 13 02:25:57.739338 kernel: BTRFS info (device vda6): enabling free space tree Sep 13 02:25:57.747061 kernel: BTRFS info (device vda6): last unmount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 02:25:57.748790 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 13 02:25:57.752210 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 13 02:25:57.831972 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 13 02:25:57.839353 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 13 02:25:57.919257 systemd-networkd[838]: lo: Link UP Sep 13 02:25:57.919269 systemd-networkd[838]: lo: Gained carrier Sep 13 02:25:57.921911 systemd-networkd[838]: Enumeration completed Sep 13 02:25:57.922070 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 13 02:25:57.923082 systemd[1]: Reached target network.target - Network. 
Sep 13 02:25:57.924321 systemd-networkd[838]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 02:25:57.924333 systemd-networkd[838]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 02:25:57.925554 systemd-networkd[838]: eth0: Link UP Sep 13 02:25:57.925895 systemd-networkd[838]: eth0: Gained carrier Sep 13 02:25:57.925909 systemd-networkd[838]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 02:25:57.942466 systemd-networkd[838]: eth0: DHCPv4 address 10.230.67.142/30, gateway 10.230.67.141 acquired from 10.230.67.141 Sep 13 02:25:57.965728 ignition[754]: Ignition 2.21.0 Sep 13 02:25:57.965754 ignition[754]: Stage: fetch-offline Sep 13 02:25:57.965820 ignition[754]: no configs at "/usr/lib/ignition/base.d" Sep 13 02:25:57.965837 ignition[754]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 02:25:57.968911 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 13 02:25:57.965996 ignition[754]: parsed url from cmdline: "" Sep 13 02:25:57.966002 ignition[754]: no config URL provided Sep 13 02:25:57.966011 ignition[754]: reading system config file "/usr/lib/ignition/user.ign" Sep 13 02:25:57.966042 ignition[754]: no config at "/usr/lib/ignition/user.ign" Sep 13 02:25:57.966061 ignition[754]: failed to fetch config: resource requires networking Sep 13 02:25:57.973257 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 13 02:25:57.966321 ignition[754]: Ignition finished successfully Sep 13 02:25:58.004453 ignition[847]: Ignition 2.21.0 Sep 13 02:25:58.004475 ignition[847]: Stage: fetch Sep 13 02:25:58.004683 ignition[847]: no configs at "/usr/lib/ignition/base.d" Sep 13 02:25:58.004700 ignition[847]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 02:25:58.004801 ignition[847]: parsed url from cmdline: "" Sep 13 02:25:58.004807 ignition[847]: no config URL provided Sep 13 02:25:58.004816 ignition[847]: reading system config file "/usr/lib/ignition/user.ign" Sep 13 02:25:58.004831 ignition[847]: no config at "/usr/lib/ignition/user.ign" Sep 13 02:25:58.004982 ignition[847]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Sep 13 02:25:58.006314 ignition[847]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Sep 13 02:25:58.006339 ignition[847]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Sep 13 02:25:58.037715 ignition[847]: GET result: OK Sep 13 02:25:58.038262 ignition[847]: parsing config with SHA512: 30c109dbfb2af3dfa2b78caf538a18ae733f4a19eec64e4bca68e9184417555c3645dd61ba7bcca3e7a29d8ecee43989c07b0a73c5cbd842cb240ec16be4a5c2 Sep 13 02:25:58.044307 unknown[847]: fetched base config from "system" Sep 13 02:25:58.044319 unknown[847]: fetched base config from "system" Sep 13 02:25:58.044702 ignition[847]: fetch: fetch complete Sep 13 02:25:58.044328 unknown[847]: fetched user config from "openstack" Sep 13 02:25:58.044709 ignition[847]: fetch: fetch passed Sep 13 02:25:58.046993 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 13 02:25:58.044770 ignition[847]: Ignition finished successfully Sep 13 02:25:58.051181 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
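Ignition fetched its config from the OpenStack metadata service and logged the SHA512 of what it parsed. A sketch of how that digest could be recomputed from inside the instance (URL taken from the log above; the timeout value is an assumption):

    import hashlib
    import urllib.request

    # Metadata URL from the Ignition "GET" line above; only reachable in-instance.
    url = "http://169.254.169.254/openstack/latest/user_data"
    data = urllib.request.urlopen(url, timeout=5).read()
    # Ignition logged: "parsing config with SHA512: 30c109..."; recompute for comparison.
    print(hashlib.sha512(data).hexdigest())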
Sep 13 02:25:58.099071 ignition[854]: Ignition 2.21.0 Sep 13 02:25:58.099093 ignition[854]: Stage: kargs Sep 13 02:25:58.099294 ignition[854]: no configs at "/usr/lib/ignition/base.d" Sep 13 02:25:58.099311 ignition[854]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 02:25:58.100510 ignition[854]: kargs: kargs passed Sep 13 02:25:58.103534 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 13 02:25:58.100577 ignition[854]: Ignition finished successfully Sep 13 02:25:58.106356 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 13 02:25:58.139338 ignition[861]: Ignition 2.21.0 Sep 13 02:25:58.140268 ignition[861]: Stage: disks Sep 13 02:25:58.140440 ignition[861]: no configs at "/usr/lib/ignition/base.d" Sep 13 02:25:58.140458 ignition[861]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 02:25:58.144666 ignition[861]: disks: disks passed Sep 13 02:25:58.145342 ignition[861]: Ignition finished successfully Sep 13 02:25:58.147311 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 13 02:25:58.148677 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 13 02:25:58.149466 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 13 02:25:58.151079 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 13 02:25:58.152626 systemd[1]: Reached target sysinit.target - System Initialization. Sep 13 02:25:58.154004 systemd[1]: Reached target basic.target - Basic System. Sep 13 02:25:58.156195 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 13 02:25:58.203852 systemd-fsck[870]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 13 02:25:58.207603 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 13 02:25:58.210214 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 13 02:25:58.340082 kernel: EXT4-fs (vda9): mounted filesystem 7f859ed0-e8c8-40c1-91d3-e1e964d8c4e8 r/w with ordered data mode. Quota mode: none. Sep 13 02:25:58.341775 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 13 02:25:58.343874 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 13 02:25:58.348136 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 13 02:25:58.352177 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 13 02:25:58.355112 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 13 02:25:58.357185 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Sep 13 02:25:58.361210 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 13 02:25:58.361258 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 13 02:25:58.367353 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 13 02:25:58.370236 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
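The systemd-fsck summary above reads as used/total inodes and blocks on the freshly checked ROOT filesystem; the utilization works out as follows (figures copied from the log):

    # "ROOT: clean, 15/1628000 files, 120826/1617920 blocks"
    inodes_used, inodes_total = 15, 1_628_000
    blocks_used, blocks_total = 120_826, 1_617_920
    print(f"inodes in use: {inodes_used / inodes_total:.4%}")  # ~0.0009%
    print(f"blocks in use: {blocks_used / blocks_total:.2%}")  # ~7.47%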
Sep 13 02:25:58.385277 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (878) Sep 13 02:25:58.389083 kernel: BTRFS info (device vda6): first mount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 02:25:58.392263 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 02:25:58.406775 kernel: BTRFS info (device vda6): turning on async discard Sep 13 02:25:58.406819 kernel: BTRFS info (device vda6): enabling free space tree Sep 13 02:25:58.412395 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 13 02:25:58.457682 initrd-setup-root[906]: cut: /sysroot/etc/passwd: No such file or directory Sep 13 02:25:58.467052 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:25:58.467871 initrd-setup-root[913]: cut: /sysroot/etc/group: No such file or directory Sep 13 02:25:58.486809 initrd-setup-root[921]: cut: /sysroot/etc/shadow: No such file or directory Sep 13 02:25:58.496269 initrd-setup-root[928]: cut: /sysroot/etc/gshadow: No such file or directory Sep 13 02:25:58.607496 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 13 02:25:58.610806 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 13 02:25:58.612446 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 13 02:25:58.634090 kernel: BTRFS info (device vda6): last unmount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 02:25:58.654699 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 13 02:25:58.668468 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 13 02:25:58.674583 ignition[996]: INFO : Ignition 2.21.0 Sep 13 02:25:58.674583 ignition[996]: INFO : Stage: mount Sep 13 02:25:58.676308 ignition[996]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 13 02:25:58.676308 ignition[996]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 02:25:58.676308 ignition[996]: INFO : mount: mount passed Sep 13 02:25:58.676308 ignition[996]: INFO : Ignition finished successfully Sep 13 02:25:58.677491 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 13 02:25:59.428396 systemd-networkd[838]: eth0: Gained IPv6LL Sep 13 02:25:59.514095 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:00.934723 systemd-networkd[838]: eth0: Ignoring DHCPv6 address 2a02:1348:179:90e3:24:19ff:fee6:438e/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:90e3:24:19ff:fee6:438e/64 assigned by NDisc. Sep 13 02:26:00.934739 systemd-networkd[838]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Sep 13 02:26:01.525062 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:05.537057 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:05.543451 coreos-metadata[880]: Sep 13 02:26:05.543 WARN failed to locate config-drive, using the metadata service API instead Sep 13 02:26:05.567820 coreos-metadata[880]: Sep 13 02:26:05.567 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 13 02:26:05.587299 coreos-metadata[880]: Sep 13 02:26:05.587 INFO Fetch successful Sep 13 02:26:05.588460 coreos-metadata[880]: Sep 13 02:26:05.588 INFO wrote hostname srv-1di1n.gb1.brightbox.com to /sysroot/etc/hostname Sep 13 02:26:05.590779 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. 
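The systemd-networkd lines above drop a DHCPv6 /128 lease because it falls inside the /64 prefix already handled via NDisc/SLAAC. The containment check is easy to verify (addresses copied from the log):

    import ipaddress

    # DHCPv6 address and the NDisc-assigned /64 from the messages above.
    dhcpv6_addr = ipaddress.ip_address("2a02:1348:179:90e3:24:19ff:fee6:438e")
    ndisc_prefix = ipaddress.ip_network("2a02:1348:179:90e3::/64")
    print(dhcpv6_addr in ndisc_prefix)  # True, hence networkd ignores the /128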
Sep 13 02:26:05.590971 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Sep 13 02:26:05.596109 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 13 02:26:05.624969 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 13 02:26:05.650074 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1013) Sep 13 02:26:05.655463 kernel: BTRFS info (device vda6): first mount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 02:26:05.655546 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 02:26:05.661912 kernel: BTRFS info (device vda6): turning on async discard Sep 13 02:26:05.661980 kernel: BTRFS info (device vda6): enabling free space tree Sep 13 02:26:05.665352 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 13 02:26:05.701714 ignition[1031]: INFO : Ignition 2.21.0 Sep 13 02:26:05.701714 ignition[1031]: INFO : Stage: files Sep 13 02:26:05.703457 ignition[1031]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 13 02:26:05.703457 ignition[1031]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 02:26:05.703457 ignition[1031]: DEBUG : files: compiled without relabeling support, skipping Sep 13 02:26:05.705991 ignition[1031]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 13 02:26:05.705991 ignition[1031]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 13 02:26:05.713354 ignition[1031]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 13 02:26:05.713354 ignition[1031]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 13 02:26:05.713354 ignition[1031]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 13 02:26:05.712534 unknown[1031]: wrote ssh authorized keys file for user: core Sep 13 02:26:05.717117 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 13 02:26:05.717117 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 13 02:26:05.950829 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 13 02:26:06.541600 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 13 02:26:06.543276 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 13 02:26:06.543276 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 13 02:26:06.543276 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 13 02:26:06.543276 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 13 02:26:06.543276 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 13 02:26:06.543276 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 13 02:26:06.543276 ignition[1031]: INFO : files: 
createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 13 02:26:06.543276 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 13 02:26:06.551893 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 13 02:26:06.551893 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 13 02:26:06.551893 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 13 02:26:06.551893 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 13 02:26:06.551893 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 13 02:26:06.551893 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 13 02:26:06.945087 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 13 02:26:09.163298 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 13 02:26:09.163298 ignition[1031]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 13 02:26:09.166461 ignition[1031]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 13 02:26:09.167863 ignition[1031]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 13 02:26:09.167863 ignition[1031]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 13 02:26:09.167863 ignition[1031]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 13 02:26:09.167863 ignition[1031]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 13 02:26:09.167863 ignition[1031]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 13 02:26:09.167863 ignition[1031]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 13 02:26:09.167863 ignition[1031]: INFO : files: files passed Sep 13 02:26:09.167863 ignition[1031]: INFO : Ignition finished successfully Sep 13 02:26:09.170161 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 13 02:26:09.174149 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 13 02:26:09.178175 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 13 02:26:09.197504 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 13 02:26:09.197669 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
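Among the files written by the Ignition files stage above is the Helm tarball fetched from get.helm.sh. A rough Python equivalent of that single fetch (URL and target path from the log; the /sysroot prefix applies only inside the initramfs, so it is dropped here):

    import urllib.request

    # URL and destination from the Ignition "createFiles" entries above.
    url = "https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz"
    dest = "/opt/helm-v3.13.2-linux-amd64.tar.gz"
    urllib.request.urlretrieve(url, dest)  # Ignition itself performed this download during boot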
Sep 13 02:26:09.207112 initrd-setup-root-after-ignition[1061]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 13 02:26:09.208624 initrd-setup-root-after-ignition[1061]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 13 02:26:09.210353 initrd-setup-root-after-ignition[1065]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 13 02:26:09.212611 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 13 02:26:09.214562 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 13 02:26:09.216383 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 13 02:26:09.270932 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 13 02:26:09.271195 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 13 02:26:09.272932 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 13 02:26:09.274225 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 13 02:26:09.275717 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 13 02:26:09.276844 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 13 02:26:09.305381 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 13 02:26:09.308071 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 13 02:26:09.334652 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 13 02:26:09.336486 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 13 02:26:09.337365 systemd[1]: Stopped target timers.target - Timer Units. Sep 13 02:26:09.338846 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 13 02:26:09.339115 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 13 02:26:09.340758 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 13 02:26:09.341726 systemd[1]: Stopped target basic.target - Basic System. Sep 13 02:26:09.343163 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 13 02:26:09.344490 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 13 02:26:09.345856 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 13 02:26:09.347182 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 13 02:26:09.348829 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 13 02:26:09.350272 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 13 02:26:09.352008 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 13 02:26:09.353346 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 13 02:26:09.354990 systemd[1]: Stopped target swap.target - Swaps. Sep 13 02:26:09.356259 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 13 02:26:09.356479 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 13 02:26:09.358232 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 13 02:26:09.359160 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 13 02:26:09.360441 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Sep 13 02:26:09.360614 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 13 02:26:09.362072 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 13 02:26:09.362323 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 13 02:26:09.364228 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 13 02:26:09.364437 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 13 02:26:09.366102 systemd[1]: ignition-files.service: Deactivated successfully. Sep 13 02:26:09.366316 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 13 02:26:09.375143 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 13 02:26:09.379309 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 13 02:26:09.379961 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 13 02:26:09.380204 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 13 02:26:09.382376 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 13 02:26:09.382626 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 13 02:26:09.393572 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 13 02:26:09.393725 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 13 02:26:09.408814 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 13 02:26:09.412977 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 13 02:26:09.413469 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 13 02:26:09.420096 ignition[1085]: INFO : Ignition 2.21.0 Sep 13 02:26:09.421045 ignition[1085]: INFO : Stage: umount Sep 13 02:26:09.421045 ignition[1085]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 13 02:26:09.421045 ignition[1085]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 02:26:09.424712 ignition[1085]: INFO : umount: umount passed Sep 13 02:26:09.424712 ignition[1085]: INFO : Ignition finished successfully Sep 13 02:26:09.422783 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 13 02:26:09.422941 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 13 02:26:09.424199 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 13 02:26:09.424335 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 13 02:26:09.425381 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 13 02:26:09.425469 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 13 02:26:09.426624 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 13 02:26:09.426686 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 13 02:26:09.427883 systemd[1]: Stopped target network.target - Network. Sep 13 02:26:09.429069 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 13 02:26:09.429159 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 13 02:26:09.430489 systemd[1]: Stopped target paths.target - Path Units. Sep 13 02:26:09.431675 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 13 02:26:09.434240 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 13 02:26:09.435225 systemd[1]: Stopped target slices.target - Slice Units. 
Sep 13 02:26:09.436470 systemd[1]: Stopped target sockets.target - Socket Units. Sep 13 02:26:09.437747 systemd[1]: iscsid.socket: Deactivated successfully. Sep 13 02:26:09.437821 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 13 02:26:09.439132 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 13 02:26:09.439191 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 13 02:26:09.440558 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 13 02:26:09.440633 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 13 02:26:09.441829 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 13 02:26:09.441902 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 13 02:26:09.443268 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 13 02:26:09.443335 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 13 02:26:09.445023 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 13 02:26:09.447182 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 13 02:26:09.450738 systemd-networkd[838]: eth0: DHCPv6 lease lost Sep 13 02:26:09.452532 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 13 02:26:09.452713 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 13 02:26:09.458824 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 13 02:26:09.459242 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 13 02:26:09.459430 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 13 02:26:09.461953 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 13 02:26:09.463227 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 13 02:26:09.464010 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 13 02:26:09.464194 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 13 02:26:09.466446 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 13 02:26:09.467769 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 13 02:26:09.467833 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 13 02:26:09.470174 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 13 02:26:09.470242 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 13 02:26:09.473152 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 13 02:26:09.473220 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 13 02:26:09.474223 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 13 02:26:09.474286 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 13 02:26:09.477217 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 13 02:26:09.480021 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 13 02:26:09.480679 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 13 02:26:09.495982 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 13 02:26:09.497375 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Sep 13 02:26:09.498538 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 13 02:26:09.498697 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 13 02:26:09.500437 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 13 02:26:09.500534 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 13 02:26:09.501906 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 13 02:26:09.501959 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 13 02:26:09.503437 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 13 02:26:09.503512 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 13 02:26:09.508573 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 13 02:26:09.508725 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 13 02:26:09.510332 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 13 02:26:09.510448 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 13 02:26:09.515361 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 13 02:26:09.517719 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 13 02:26:09.517816 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 13 02:26:09.521292 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 13 02:26:09.521376 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 13 02:26:09.529579 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 13 02:26:09.529661 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 13 02:26:09.531798 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 13 02:26:09.531878 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 13 02:26:09.532784 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 13 02:26:09.532862 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 02:26:09.538766 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 13 02:26:09.538852 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 13 02:26:09.538914 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 13 02:26:09.538976 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 13 02:26:09.546716 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 13 02:26:09.546914 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 13 02:26:09.548773 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 13 02:26:09.551131 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 13 02:26:09.570812 systemd[1]: Switching root. Sep 13 02:26:09.609050 systemd-journald[230]: Received SIGTERM from PID 1 (systemd). 
Sep 13 02:26:09.609145 systemd-journald[230]: Journal stopped Sep 13 02:26:11.081232 kernel: SELinux: policy capability network_peer_controls=1 Sep 13 02:26:11.081407 kernel: SELinux: policy capability open_perms=1 Sep 13 02:26:11.081431 kernel: SELinux: policy capability extended_socket_class=1 Sep 13 02:26:11.081477 kernel: SELinux: policy capability always_check_network=0 Sep 13 02:26:11.081502 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 13 02:26:11.081528 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 13 02:26:11.081555 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 13 02:26:11.081581 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 13 02:26:11.081612 kernel: SELinux: policy capability userspace_initial_context=0 Sep 13 02:26:11.081666 kernel: audit: type=1403 audit(1757730369.854:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 13 02:26:11.081702 systemd[1]: Successfully loaded SELinux policy in 55.396ms. Sep 13 02:26:11.081749 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 23.019ms. Sep 13 02:26:11.081779 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 13 02:26:11.081804 systemd[1]: Detected virtualization kvm. Sep 13 02:26:11.081822 systemd[1]: Detected architecture x86-64. Sep 13 02:26:11.081859 systemd[1]: Detected first boot. Sep 13 02:26:11.081882 systemd[1]: Hostname set to . Sep 13 02:26:11.081932 systemd[1]: Initializing machine ID from VM UUID. Sep 13 02:26:11.081951 zram_generator::config[1129]: No configuration found. Sep 13 02:26:11.081988 kernel: Guest personality initialized and is inactive Sep 13 02:26:11.082012 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 13 02:26:11.082061 kernel: Initialized host personality Sep 13 02:26:11.082095 kernel: NET: Registered PF_VSOCK protocol family Sep 13 02:26:11.082121 systemd[1]: Populated /etc with preset unit settings. Sep 13 02:26:11.082147 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 13 02:26:11.082169 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 13 02:26:11.082201 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 13 02:26:11.082234 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 13 02:26:11.082259 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 13 02:26:11.082292 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 13 02:26:11.082317 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 13 02:26:11.082337 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 13 02:26:11.082391 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 13 02:26:11.082427 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 13 02:26:11.082448 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 13 02:26:11.082473 systemd[1]: Created slice user.slice - User and Session Slice. Sep 13 02:26:11.082494 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 13 02:26:11.082514 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 13 02:26:11.082533 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 13 02:26:11.082552 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 13 02:26:11.082584 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 13 02:26:11.082605 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 13 02:26:11.082632 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 13 02:26:11.082670 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 13 02:26:11.082713 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 13 02:26:11.082733 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 13 02:26:11.082761 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 13 02:26:11.082782 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 13 02:26:11.082799 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 13 02:26:11.082823 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 13 02:26:11.082869 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 13 02:26:11.082895 systemd[1]: Reached target slices.target - Slice Units. Sep 13 02:26:11.082926 systemd[1]: Reached target swap.target - Swaps. Sep 13 02:26:11.082944 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 13 02:26:11.082984 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 13 02:26:11.083013 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 13 02:26:11.083048 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 13 02:26:11.083076 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 13 02:26:11.083101 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 13 02:26:11.083120 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 13 02:26:11.083145 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 13 02:26:11.083164 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 13 02:26:11.083195 systemd[1]: Mounting media.mount - External Media Directory... Sep 13 02:26:11.083214 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 02:26:11.083245 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 13 02:26:11.083273 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 13 02:26:11.083305 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 13 02:26:11.083332 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 13 02:26:11.083353 systemd[1]: Reached target machines.target - Containers. Sep 13 02:26:11.083386 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Sep 13 02:26:11.083408 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 13 02:26:11.083427 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 13 02:26:11.083453 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 13 02:26:11.083486 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 13 02:26:11.083506 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 13 02:26:11.083525 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 13 02:26:11.083544 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 13 02:26:11.083563 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 13 02:26:11.083582 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 13 02:26:11.083601 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 13 02:26:11.083619 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 13 02:26:11.083650 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 13 02:26:11.083678 systemd[1]: Stopped systemd-fsck-usr.service. Sep 13 02:26:11.084060 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 13 02:26:11.084101 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 13 02:26:11.084136 kernel: loop: module loaded Sep 13 02:26:11.084167 kernel: fuse: init (API version 7.41) Sep 13 02:26:11.084187 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 13 02:26:11.084213 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 13 02:26:11.084240 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 13 02:26:11.084273 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 13 02:26:11.084304 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 13 02:26:11.084331 systemd[1]: verity-setup.service: Deactivated successfully. Sep 13 02:26:11.084352 systemd[1]: Stopped verity-setup.service. Sep 13 02:26:11.084372 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 02:26:11.084409 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 13 02:26:11.084436 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 13 02:26:11.084457 systemd[1]: Mounted media.mount - External Media Directory. Sep 13 02:26:11.084483 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 13 02:26:11.084515 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 13 02:26:11.084536 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 13 02:26:11.084555 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 13 02:26:11.084581 systemd[1]: modprobe@configfs.service: Deactivated successfully. 
Sep 13 02:26:11.084607 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 13 02:26:11.084634 kernel: ACPI: bus type drm_connector registered Sep 13 02:26:11.084653 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 13 02:26:11.084673 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 13 02:26:11.084692 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 13 02:26:11.084726 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 13 02:26:11.084746 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 13 02:26:11.084766 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 13 02:26:11.084785 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 13 02:26:11.084804 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 13 02:26:11.084830 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 13 02:26:11.084856 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 13 02:26:11.084911 systemd-journald[1219]: Collecting audit messages is disabled. Sep 13 02:26:11.084977 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 13 02:26:11.085007 systemd-journald[1219]: Journal started Sep 13 02:26:11.088144 systemd-journald[1219]: Runtime Journal (/run/log/journal/549e75df26ce436a88731fda20073c51) is 4.7M, max 38.2M, 33.4M free. Sep 13 02:26:11.088212 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 13 02:26:10.668358 systemd[1]: Queued start job for default target multi-user.target. Sep 13 02:26:10.685546 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 13 02:26:10.686266 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 13 02:26:11.097050 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 13 02:26:11.103089 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 13 02:26:11.114524 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 13 02:26:11.121060 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 13 02:26:11.126055 systemd[1]: Started systemd-journald.service - Journal Service. Sep 13 02:26:11.128215 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 13 02:26:11.129395 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 13 02:26:11.136080 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 13 02:26:11.137420 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 13 02:26:11.138338 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 13 02:26:11.139228 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 13 02:26:11.161962 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 13 02:26:11.164566 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 13 02:26:11.164609 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 13 02:26:11.166824 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. 
Sep 13 02:26:11.169996 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 13 02:26:11.172514 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 13 02:26:11.174340 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 13 02:26:11.177554 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 13 02:26:11.178320 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 13 02:26:11.182001 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 13 02:26:11.187436 systemd-tmpfiles[1244]: ACLs are not supported, ignoring. Sep 13 02:26:11.187462 systemd-tmpfiles[1244]: ACLs are not supported, ignoring. Sep 13 02:26:11.188295 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 13 02:26:11.215721 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 13 02:26:11.224642 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 13 02:26:11.225472 systemd-journald[1219]: Time spent on flushing to /var/log/journal/549e75df26ce436a88731fda20073c51 is 52.109ms for 1171 entries. Sep 13 02:26:11.225472 systemd-journald[1219]: System Journal (/var/log/journal/549e75df26ce436a88731fda20073c51) is 8M, max 584.8M, 576.8M free. Sep 13 02:26:11.296740 systemd-journald[1219]: Received client request to flush runtime journal. Sep 13 02:26:11.296809 kernel: loop0: detected capacity change from 0 to 146240 Sep 13 02:26:11.229789 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 13 02:26:11.233158 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 13 02:26:11.234263 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 13 02:26:11.239069 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 13 02:26:11.302770 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 13 02:26:11.308552 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 13 02:26:11.326362 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 13 02:26:11.352060 kernel: loop1: detected capacity change from 0 to 221472 Sep 13 02:26:11.403058 kernel: loop2: detected capacity change from 0 to 113872 Sep 13 02:26:11.403547 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 13 02:26:11.412559 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 13 02:26:11.442121 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 13 02:26:11.447112 kernel: loop3: detected capacity change from 0 to 8 Sep 13 02:26:11.471308 kernel: loop4: detected capacity change from 0 to 146240 Sep 13 02:26:11.478869 systemd-tmpfiles[1288]: ACLs are not supported, ignoring. Sep 13 02:26:11.478894 systemd-tmpfiles[1288]: ACLs are not supported, ignoring. Sep 13 02:26:11.520565 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
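The journald flush message above gives enough data for a rough per-entry cost (figures from the log; the averaging is only a back-of-the-envelope estimate):

    # "Time spent on flushing ... is 52.109ms for 1171 entries."
    flush_ms, entries = 52.109, 1171
    print(f"~{flush_ms / entries * 1000:.1f} us per journal entry")  # ~44.5 us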
Sep 13 02:26:11.536477 kernel: loop5: detected capacity change from 0 to 221472 Sep 13 02:26:11.566401 kernel: loop6: detected capacity change from 0 to 113872 Sep 13 02:26:11.601636 kernel: loop7: detected capacity change from 0 to 8 Sep 13 02:26:11.605022 (sd-merge)[1293]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Sep 13 02:26:11.606976 (sd-merge)[1293]: Merged extensions into '/usr'. Sep 13 02:26:11.616754 systemd[1]: Reload requested from client PID 1272 ('systemd-sysext') (unit systemd-sysext.service)... Sep 13 02:26:11.617699 systemd[1]: Reloading... Sep 13 02:26:11.784107 zram_generator::config[1318]: No configuration found. Sep 13 02:26:11.945050 ldconfig[1268]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 13 02:26:12.039839 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 02:26:12.154657 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 13 02:26:12.155597 systemd[1]: Reloading finished in 537 ms. Sep 13 02:26:12.171410 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 13 02:26:12.179435 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 13 02:26:12.182824 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 13 02:26:12.190952 systemd[1]: Starting ensure-sysext.service... Sep 13 02:26:12.196225 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 13 02:26:12.198835 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 13 02:26:12.218433 systemd[1]: Reload requested from client PID 1377 ('systemctl') (unit ensure-sysext.service)... Sep 13 02:26:12.218456 systemd[1]: Reloading... Sep 13 02:26:12.229470 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 13 02:26:12.230174 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 13 02:26:12.230741 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 13 02:26:12.231262 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 13 02:26:12.232691 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 13 02:26:12.233211 systemd-tmpfiles[1378]: ACLs are not supported, ignoring. Sep 13 02:26:12.233433 systemd-tmpfiles[1378]: ACLs are not supported, ignoring. Sep 13 02:26:12.239513 systemd-tmpfiles[1378]: Detected autofs mount point /boot during canonicalization of boot. Sep 13 02:26:12.239651 systemd-tmpfiles[1378]: Skipping /boot Sep 13 02:26:12.276381 systemd-tmpfiles[1378]: Detected autofs mount point /boot during canonicalization of boot. Sep 13 02:26:12.276398 systemd-tmpfiles[1378]: Skipping /boot Sep 13 02:26:12.312489 systemd-udevd[1379]: Using default interface naming scheme 'v255'. Sep 13 02:26:12.324083 zram_generator::config[1417]: No configuration found. 
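The (sd-merge) lines above list the sysext images merged into /usr; the kubernetes one corresponds to the /etc/extensions/kubernetes.raw symlink the Ignition files stage created earlier in this log. A sketch that lists that directory the same way (path from the log; meant to be run on the booted system, and it only covers /etc/extensions, not the OEM extension directories):

    import os

    # /etc/extensions/kubernetes.raw was written as a symlink by Ignition above;
    # systemd-sysext merges whatever images it finds here.
    ext_dir = "/etc/extensions"
    for name in sorted(os.listdir(ext_dir)):
        path = os.path.join(ext_dir, name)
        target = os.readlink(path) if os.path.islink(path) else "(regular file)"
        print(name, "->", target)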
Sep 13 02:26:12.528106 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 02:26:12.700826 systemd[1]: Reloading finished in 481 ms. Sep 13 02:26:12.714270 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 13 02:26:12.717192 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 13 02:26:12.738118 kernel: mousedev: PS/2 mouse device common for all mice Sep 13 02:26:12.749871 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 13 02:26:12.775974 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 13 02:26:12.814641 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 02:26:12.820144 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 13 02:26:12.824084 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 13 02:26:12.824698 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 13 02:26:12.825687 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 13 02:26:12.828144 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 13 02:26:12.831495 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 13 02:26:12.837544 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 13 02:26:12.838499 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 13 02:26:12.840420 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 13 02:26:12.841739 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 13 02:26:12.848420 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 13 02:26:12.857804 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 13 02:26:12.868147 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 13 02:26:12.874461 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 13 02:26:12.875248 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 02:26:12.885738 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 02:26:12.886713 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 13 02:26:12.891077 kernel: ACPI: button: Power Button [PWRF] Sep 13 02:26:12.891412 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 13 02:26:12.892772 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 13 02:26:12.892946 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 13 02:26:12.893171 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 02:26:12.894214 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 13 02:26:12.896103 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 13 02:26:12.906724 systemd[1]: Finished ensure-sysext.service. Sep 13 02:26:12.918373 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 13 02:26:12.935569 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 13 02:26:12.936758 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 13 02:26:12.937886 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 13 02:26:12.942777 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 13 02:26:12.953286 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 13 02:26:12.955509 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 13 02:26:12.955869 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 13 02:26:12.959310 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 13 02:26:12.964184 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 13 02:26:12.966569 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 13 02:26:12.969273 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 13 02:26:12.979840 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 13 02:26:12.985110 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 13 02:26:12.990574 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 13 02:26:12.990920 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 13 02:26:13.017050 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 13 02:26:13.021084 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 13 02:26:13.036512 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 13 02:26:13.056055 augenrules[1540]: No rules Sep 13 02:26:13.057169 systemd[1]: audit-rules.service: Deactivated successfully. Sep 13 02:26:13.059141 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 13 02:26:13.103302 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 13 02:26:13.199158 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 02:26:13.445631 systemd-networkd[1500]: lo: Link UP Sep 13 02:26:13.446313 systemd-networkd[1500]: lo: Gained carrier Sep 13 02:26:13.450770 systemd-networkd[1500]: Enumeration completed Sep 13 02:26:13.450920 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Sep 13 02:26:13.451292 systemd-networkd[1500]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 02:26:13.451298 systemd-networkd[1500]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 02:26:13.452519 systemd-networkd[1500]: eth0: Link UP Sep 13 02:26:13.452743 systemd-networkd[1500]: eth0: Gained carrier Sep 13 02:26:13.452764 systemd-networkd[1500]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 02:26:13.478156 systemd-networkd[1500]: eth0: DHCPv4 address 10.230.67.142/30, gateway 10.230.67.141 acquired from 10.230.67.141 Sep 13 02:26:13.512741 systemd-resolved[1501]: Positive Trust Anchors: Sep 13 02:26:13.514063 systemd-resolved[1501]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 13 02:26:13.514215 systemd-resolved[1501]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 13 02:26:13.522473 systemd-resolved[1501]: Using system hostname 'srv-1di1n.gb1.brightbox.com'. Sep 13 02:26:13.523506 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 13 02:26:13.526133 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 13 02:26:13.527443 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 02:26:13.529407 systemd[1]: Reached target network.target - Network. Sep 13 02:26:13.530212 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 13 02:26:13.531127 systemd[1]: Reached target sysinit.target - System Initialization. Sep 13 02:26:13.532236 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 13 02:26:13.533183 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 13 02:26:13.534077 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 13 02:26:13.534909 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 13 02:26:13.535771 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 13 02:26:13.535824 systemd[1]: Reached target paths.target - Path Units. Sep 13 02:26:13.536640 systemd[1]: Reached target time-set.target - System Time Set. Sep 13 02:26:13.537655 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 13 02:26:13.538575 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 13 02:26:13.539459 systemd[1]: Reached target timers.target - Timer Units. Sep 13 02:26:13.542282 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 13 02:26:13.545464 systemd[1]: Starting docker.socket - Docker Socket for the API... 
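The DHCPv4 lease recorded above, 10.230.67.142/30 with gateway 10.230.67.141, is a two-host subnet, so the gateway and the instance are the only usable addresses. A small illustrative check with Python's standard ipaddress module, using just the values from the systemd-networkd line:

    import ipaddress

    # Address, prefix and gateway as reported by systemd-networkd above.
    iface = ipaddress.ip_interface("10.230.67.142/30")
    gateway = ipaddress.ip_address("10.230.67.141")

    print(iface.network)                  # 10.230.67.140/30
    print(list(iface.network.hosts()))    # [10.230.67.141, 10.230.67.142]
    print(gateway in iface.network)       # True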
Sep 13 02:26:13.552481 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 13 02:26:13.553669 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 13 02:26:13.555651 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 13 02:26:13.559712 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 13 02:26:13.561447 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 13 02:26:13.564380 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 13 02:26:13.569115 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 13 02:26:13.571982 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 13 02:26:13.573640 systemd[1]: Reached target sockets.target - Socket Units. Sep 13 02:26:13.575218 systemd[1]: Reached target basic.target - Basic System. Sep 13 02:26:13.575919 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 13 02:26:13.575968 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 13 02:26:13.579163 systemd[1]: Starting containerd.service - containerd container runtime... Sep 13 02:26:13.582936 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 13 02:26:13.589822 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 13 02:26:13.592914 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 13 02:26:13.596239 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 13 02:26:13.602272 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 13 02:26:13.602977 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 13 02:26:13.607078 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:13.608328 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 13 02:26:13.614124 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 13 02:26:13.622324 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 13 02:26:13.629431 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 13 02:26:13.632926 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 13 02:26:13.641622 jq[1580]: false Sep 13 02:26:13.644321 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 13 02:26:13.648230 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 13 02:26:13.648947 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 13 02:26:13.652876 systemd[1]: Starting update-engine.service - Update Engine... Sep 13 02:26:13.659189 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 13 02:26:13.663820 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. 
Sep 13 02:26:13.664961 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 13 02:26:13.666277 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 13 02:26:13.678410 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Refreshing passwd entry cache Sep 13 02:26:13.677400 oslogin_cache_refresh[1583]: Refreshing passwd entry cache Sep 13 02:26:13.690766 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 13 02:26:13.691184 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 13 02:26:13.696531 extend-filesystems[1581]: Found /dev/vda6 Sep 13 02:26:13.716159 jq[1594]: true Sep 13 02:26:13.727492 extend-filesystems[1581]: Found /dev/vda9 Sep 13 02:26:13.744072 update_engine[1593]: I20250913 02:26:13.737560 1593 main.cc:92] Flatcar Update Engine starting Sep 13 02:26:13.743248 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 13 02:26:13.738975 oslogin_cache_refresh[1583]: Failure getting users, quitting Sep 13 02:26:13.744724 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Failure getting users, quitting Sep 13 02:26:13.744724 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 13 02:26:13.744724 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Refreshing group entry cache Sep 13 02:26:13.744724 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Failure getting groups, quitting Sep 13 02:26:13.744724 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 13 02:26:13.743617 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 13 02:26:13.739004 oslogin_cache_refresh[1583]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 13 02:26:13.739137 oslogin_cache_refresh[1583]: Refreshing group entry cache Sep 13 02:26:13.740345 oslogin_cache_refresh[1583]: Failure getting groups, quitting Sep 13 02:26:13.740360 oslogin_cache_refresh[1583]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 13 02:26:14.161319 tar[1600]: linux-amd64/helm Sep 13 02:26:14.158977 systemd-timesyncd[1507]: Contacted time server 176.58.124.166:123 (0.flatcar.pool.ntp.org). Sep 13 02:26:14.162012 extend-filesystems[1581]: Checking size of /dev/vda9 Sep 13 02:26:14.159056 systemd-timesyncd[1507]: Initial clock synchronization to Sat 2025-09-13 02:26:14.158810 UTC. Sep 13 02:26:14.159385 systemd-resolved[1501]: Clock change detected. Flushing caches. Sep 13 02:26:14.164980 (ntainerd)[1607]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 13 02:26:14.186267 systemd[1]: motdgen.service: Deactivated successfully. Sep 13 02:26:14.191654 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 13 02:26:14.202934 dbus-daemon[1578]: [system] SELinux support is enabled Sep 13 02:26:14.203470 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 13 02:26:14.210346 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
Sep 13 02:26:14.213799 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 13 02:26:14.213857 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 13 02:26:14.215714 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 13 02:26:14.220511 jq[1613]: true Sep 13 02:26:14.215753 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 13 02:26:14.231966 extend-filesystems[1581]: Resized partition /dev/vda9 Sep 13 02:26:14.236717 extend-filesystems[1627]: resize2fs 1.47.2 (1-Jan-2025) Sep 13 02:26:14.236678 dbus-daemon[1578]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1500 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 13 02:26:14.243745 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Sep 13 02:26:14.247190 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 13 02:26:14.249419 systemd[1]: Started update-engine.service - Update Engine. Sep 13 02:26:14.252090 update_engine[1593]: I20250913 02:26:14.251998 1593 update_check_scheduler.cc:74] Next update check in 3m19s Sep 13 02:26:14.303656 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 13 02:26:14.493959 bash[1644]: Updated "/home/core/.ssh/authorized_keys" Sep 13 02:26:14.494895 systemd-logind[1589]: Watching system buttons on /dev/input/event3 (Power Button) Sep 13 02:26:14.494937 systemd-logind[1589]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 13 02:26:14.495282 systemd-logind[1589]: New seat seat0. Sep 13 02:26:14.496921 systemd[1]: Started systemd-logind.service - User Login Management. Sep 13 02:26:14.505585 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 13 02:26:14.512722 systemd[1]: Starting sshkeys.service... Sep 13 02:26:14.552981 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Sep 13 02:26:14.585301 extend-filesystems[1627]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 13 02:26:14.585301 extend-filesystems[1627]: old_desc_blocks = 1, new_desc_blocks = 8 Sep 13 02:26:14.585301 extend-filesystems[1627]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Sep 13 02:26:14.597516 extend-filesystems[1581]: Resized filesystem in /dev/vda9 Sep 13 02:26:14.587759 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 13 02:26:14.588243 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 13 02:26:14.617625 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 13 02:26:14.622715 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
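The on-line resize above grows /dev/vda9 from 1617920 to 15121403 blocks of 4 KiB. Converting those block counts to sizes is straightforward; a short sketch using only the numbers from the EXT4 and resize2fs messages:

    # Block size and block counts as reported by the EXT4 driver / resize2fs above.
    BLOCK = 4096                     # "(4k)" blocks
    old_blocks, new_blocks = 1_617_920, 15_121_403

    to_gib = lambda blocks: blocks * BLOCK / 2**30
    print(f"before: {to_gib(old_blocks):.2f} GiB")   # ~6.17 GiB
    print(f"after:  {to_gib(new_blocks):.2f} GiB")   # ~57.68 GiB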
Sep 13 02:26:14.642292 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:14.644466 locksmithd[1629]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 13 02:26:14.724741 dbus-daemon[1578]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 13 02:26:14.730517 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 13 02:26:14.725812 dbus-daemon[1578]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1628 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 13 02:26:14.736513 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 13 02:26:14.740283 systemd[1]: Starting polkit.service - Authorization Manager... Sep 13 02:26:14.854060 polkitd[1664]: Started polkitd version 126 Sep 13 02:26:14.865637 containerd[1607]: time="2025-09-13T02:26:14Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 13 02:26:14.871727 polkitd[1664]: Loading rules from directory /etc/polkit-1/rules.d Sep 13 02:26:14.872170 polkitd[1664]: Loading rules from directory /run/polkit-1/rules.d Sep 13 02:26:14.872253 polkitd[1664]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 13 02:26:14.874620 containerd[1607]: time="2025-09-13T02:26:14.874567795Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Sep 13 02:26:14.875353 polkitd[1664]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 13 02:26:14.875413 polkitd[1664]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 13 02:26:14.875485 polkitd[1664]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 13 02:26:14.878698 polkitd[1664]: Finished loading, compiling and executing 2 rules Sep 13 02:26:14.879055 systemd[1]: Started polkit.service - Authorization Manager. 
Sep 13 02:26:14.884099 dbus-daemon[1578]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 13 02:26:14.885325 polkitd[1664]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 13 02:26:14.914285 containerd[1607]: time="2025-09-13T02:26:14.913036863Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="16.506µs" Sep 13 02:26:14.917681 containerd[1607]: time="2025-09-13T02:26:14.917612210Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 13 02:26:14.917830 containerd[1607]: time="2025-09-13T02:26:14.917804623Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 13 02:26:14.918714 containerd[1607]: time="2025-09-13T02:26:14.918686856Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 13 02:26:14.918833 containerd[1607]: time="2025-09-13T02:26:14.918808500Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 13 02:26:14.918984 containerd[1607]: time="2025-09-13T02:26:14.918951734Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 13 02:26:14.921762 containerd[1607]: time="2025-09-13T02:26:14.921379265Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 13 02:26:14.921762 containerd[1607]: time="2025-09-13T02:26:14.921420101Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 13 02:26:14.923027 containerd[1607]: time="2025-09-13T02:26:14.922017048Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 13 02:26:14.923027 containerd[1607]: time="2025-09-13T02:26:14.922302565Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 13 02:26:14.923027 containerd[1607]: time="2025-09-13T02:26:14.922326209Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 13 02:26:14.923027 containerd[1607]: time="2025-09-13T02:26:14.922340486Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 13 02:26:14.923027 containerd[1607]: time="2025-09-13T02:26:14.922527270Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 13 02:26:14.924374 containerd[1607]: time="2025-09-13T02:26:14.924187078Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 13 02:26:14.924374 containerd[1607]: time="2025-09-13T02:26:14.924289929Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 13 02:26:14.924374 containerd[1607]: time="2025-09-13T02:26:14.924312219Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 13 02:26:14.924579 
containerd[1607]: time="2025-09-13T02:26:14.924543279Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 13 02:26:14.925198 systemd-hostnamed[1628]: Hostname set to (static) Sep 13 02:26:14.928287 containerd[1607]: time="2025-09-13T02:26:14.926182514Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 13 02:26:14.931013 containerd[1607]: time="2025-09-13T02:26:14.929384567Z" level=info msg="metadata content store policy set" policy=shared Sep 13 02:26:14.934029 containerd[1607]: time="2025-09-13T02:26:14.933995932Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936338051Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936391712Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936413632Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936465922Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936488173Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936507418Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936540001Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936571249Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936590927Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936606340Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936633375Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936809883Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936850839Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 13 02:26:14.937289 containerd[1607]: time="2025-09-13T02:26:14.936875037Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.936893374Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.936922800Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.936942725Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.936962402Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.936979182Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.936997283Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.937013611Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.937031996Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.937147340Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.937177496Z" level=info msg="Start snapshots syncer" Sep 13 02:26:14.937809 containerd[1607]: time="2025-09-13T02:26:14.937214773Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 13 02:26:14.938636 containerd[1607]: time="2025-09-13T02:26:14.938552073Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 13 02:26:14.938983 containerd[1607]: time="2025-09-13T02:26:14.938940462Z" level=info msg="loading plugin" 
id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 13 02:26:14.939230 containerd[1607]: time="2025-09-13T02:26:14.939202078Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941440057Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941482364Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941503583Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941520258Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941593102Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941618376Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941656371Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941698192Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941718527Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941738120Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941778323Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941803400Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 13 02:26:14.942736 containerd[1607]: time="2025-09-13T02:26:14.941834971Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.941851651Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.941865324Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.941889904Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.941910478Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.941951483Z" level=info msg="runtime interface 
created" Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.941963560Z" level=info msg="created NRI interface" Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.941976191Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.941993816Z" level=info msg="Connect containerd service" Sep 13 02:26:14.943218 containerd[1607]: time="2025-09-13T02:26:14.942064994Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 13 02:26:14.946282 containerd[1607]: time="2025-09-13T02:26:14.944045033Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 13 02:26:14.998477 sshd_keygen[1618]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 13 02:26:15.033279 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 13 02:26:15.040540 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 13 02:26:15.059899 systemd[1]: Started sshd@0-10.230.67.142:22-139.178.89.65:47274.service - OpenSSH per-connection server daemon (139.178.89.65:47274). Sep 13 02:26:15.076299 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:15.109539 systemd[1]: issuegen.service: Deactivated successfully. Sep 13 02:26:15.111367 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 13 02:26:15.117787 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 13 02:26:15.174979 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 13 02:26:15.181064 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 13 02:26:15.185124 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 13 02:26:15.186972 systemd[1]: Reached target getty.target - Login Prompts. Sep 13 02:26:15.207478 containerd[1607]: time="2025-09-13T02:26:15.207418443Z" level=info msg="Start subscribing containerd event" Sep 13 02:26:15.207616 containerd[1607]: time="2025-09-13T02:26:15.207488427Z" level=info msg="Start recovering state" Sep 13 02:26:15.207713 containerd[1607]: time="2025-09-13T02:26:15.207674123Z" level=info msg="Start event monitor" Sep 13 02:26:15.207713 containerd[1607]: time="2025-09-13T02:26:15.207708694Z" level=info msg="Start cni network conf syncer for default" Sep 13 02:26:15.207850 containerd[1607]: time="2025-09-13T02:26:15.207743593Z" level=info msg="Start streaming server" Sep 13 02:26:15.207850 containerd[1607]: time="2025-09-13T02:26:15.207775212Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 13 02:26:15.207850 containerd[1607]: time="2025-09-13T02:26:15.207788803Z" level=info msg="runtime interface starting up..." Sep 13 02:26:15.207850 containerd[1607]: time="2025-09-13T02:26:15.207801191Z" level=info msg="starting plugins..." Sep 13 02:26:15.207850 containerd[1607]: time="2025-09-13T02:26:15.207836203Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 13 02:26:15.209389 containerd[1607]: time="2025-09-13T02:26:15.209356593Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 13 02:26:15.209602 containerd[1607]: time="2025-09-13T02:26:15.209577873Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 13 02:26:15.211397 containerd[1607]: time="2025-09-13T02:26:15.211368260Z" level=info msg="containerd successfully booted in 0.347217s" Sep 13 02:26:15.211605 systemd[1]: Started containerd.service - containerd container runtime. Sep 13 02:26:15.322710 tar[1600]: linux-amd64/LICENSE Sep 13 02:26:15.323237 tar[1600]: linux-amd64/README.md Sep 13 02:26:15.343075 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 13 02:26:15.383499 systemd-networkd[1500]: eth0: Gained IPv6LL Sep 13 02:26:15.387285 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 13 02:26:15.389025 systemd[1]: Reached target network-online.target - Network is Online. Sep 13 02:26:15.392213 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:26:15.396646 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 13 02:26:15.431421 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 13 02:26:15.660318 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:16.022847 sshd[1690]: Accepted publickey for core from 139.178.89.65 port 47274 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:16.025326 sshd-session[1690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:16.039149 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 13 02:26:16.043019 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 13 02:26:16.061712 systemd-logind[1589]: New session 1 of user core. Sep 13 02:26:16.079997 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 13 02:26:16.087444 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 13 02:26:16.105045 (systemd)[1724]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 13 02:26:16.111576 systemd-logind[1589]: New session c1 of user core. Sep 13 02:26:16.303549 systemd[1724]: Queued start job for default target default.target. Sep 13 02:26:16.310763 systemd[1724]: Created slice app.slice - User Application Slice. Sep 13 02:26:16.310964 systemd[1724]: Reached target paths.target - Paths. Sep 13 02:26:16.311060 systemd[1724]: Reached target timers.target - Timers. Sep 13 02:26:16.314384 systemd[1724]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 13 02:26:16.338451 systemd[1724]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 13 02:26:16.338661 systemd[1724]: Reached target sockets.target - Sockets. Sep 13 02:26:16.338736 systemd[1724]: Reached target basic.target - Basic System. Sep 13 02:26:16.338819 systemd[1724]: Reached target default.target - Main User Target. Sep 13 02:26:16.338877 systemd[1724]: Startup finished in 214ms. Sep 13 02:26:16.338977 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 13 02:26:16.349879 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 13 02:26:16.450212 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 13 02:26:16.466287 (kubelet)[1738]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 02:26:16.892000 systemd-networkd[1500]: eth0: Ignoring DHCPv6 address 2a02:1348:179:90e3:24:19ff:fee6:438e/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:90e3:24:19ff:fee6:438e/64 assigned by NDisc. Sep 13 02:26:16.892012 systemd-networkd[1500]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Sep 13 02:26:16.980226 systemd[1]: Started sshd@1-10.230.67.142:22-139.178.89.65:47280.service - OpenSSH per-connection server daemon (139.178.89.65:47280). Sep 13 02:26:17.096295 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:17.115211 kubelet[1738]: E0913 02:26:17.115131 1738 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 02:26:17.118480 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 02:26:17.118715 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 02:26:17.119556 systemd[1]: kubelet.service: Consumed 1.018s CPU time, 266.4M memory peak. Sep 13 02:26:17.679306 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:17.896201 sshd[1747]: Accepted publickey for core from 139.178.89.65 port 47280 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:17.899015 sshd-session[1747]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:17.907587 systemd-logind[1589]: New session 2 of user core. Sep 13 02:26:17.918702 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 13 02:26:18.525036 sshd[1752]: Connection closed by 139.178.89.65 port 47280 Sep 13 02:26:18.526111 sshd-session[1747]: pam_unix(sshd:session): session closed for user core Sep 13 02:26:18.531754 systemd[1]: sshd@1-10.230.67.142:22-139.178.89.65:47280.service: Deactivated successfully. Sep 13 02:26:18.534125 systemd[1]: session-2.scope: Deactivated successfully. Sep 13 02:26:18.535577 systemd-logind[1589]: Session 2 logged out. Waiting for processes to exit. Sep 13 02:26:18.537834 systemd-logind[1589]: Removed session 2. Sep 13 02:26:18.686000 systemd[1]: Started sshd@2-10.230.67.142:22-139.178.89.65:47296.service - OpenSSH per-connection server daemon (139.178.89.65:47296). Sep 13 02:26:20.000154 sshd[1758]: Accepted publickey for core from 139.178.89.65 port 47296 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:20.002641 sshd-session[1758]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:20.010081 systemd-logind[1589]: New session 3 of user core. Sep 13 02:26:20.023819 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 13 02:26:20.268171 login[1705]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 13 02:26:20.276105 systemd-logind[1589]: New session 4 of user core. Sep 13 02:26:20.281987 login[1704]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 13 02:26:20.289184 systemd[1]: Started session-4.scope - Session 4 of User core. 
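The DHCPv6/NDisc conflict logged above occurs because the /128 address offered over DHCPv6 already falls inside the /64 prefix assigned via router advertisements, so networkd ignores the DHCPv6 address, as the message states. A minimal illustrative check with Python's ipaddress module, using the address and prefix lengths from that message:

    import ipaddress

    # The same address appears as a DHCPv6 /128 and inside an NDisc-assigned /64 above.
    dhcpv6 = ipaddress.ip_interface("2a02:1348:179:90e3:24:19ff:fee6:438e/128")
    ndisc = ipaddress.ip_interface("2a02:1348:179:90e3:24:19ff:fee6:438e/64")

    print(ndisc.network)               # 2a02:1348:179:90e3::/64
    print(dhcpv6.ip in ndisc.network)  # True, hence the "conflicts with" warning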
Sep 13 02:26:20.302484 systemd-logind[1589]: New session 5 of user core. Sep 13 02:26:20.307598 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 13 02:26:20.635907 sshd[1760]: Connection closed by 139.178.89.65 port 47296 Sep 13 02:26:20.636245 sshd-session[1758]: pam_unix(sshd:session): session closed for user core Sep 13 02:26:20.641049 systemd[1]: sshd@2-10.230.67.142:22-139.178.89.65:47296.service: Deactivated successfully. Sep 13 02:26:20.643537 systemd[1]: session-3.scope: Deactivated successfully. Sep 13 02:26:20.646197 systemd-logind[1589]: Session 3 logged out. Waiting for processes to exit. Sep 13 02:26:20.647707 systemd-logind[1589]: Removed session 3. Sep 13 02:26:21.112313 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:21.122846 coreos-metadata[1577]: Sep 13 02:26:21.122 WARN failed to locate config-drive, using the metadata service API instead Sep 13 02:26:21.148418 coreos-metadata[1577]: Sep 13 02:26:21.148 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Sep 13 02:26:21.155633 coreos-metadata[1577]: Sep 13 02:26:21.155 INFO Fetch failed with 404: resource not found Sep 13 02:26:21.155707 coreos-metadata[1577]: Sep 13 02:26:21.155 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 13 02:26:21.156487 coreos-metadata[1577]: Sep 13 02:26:21.156 INFO Fetch successful Sep 13 02:26:21.156605 coreos-metadata[1577]: Sep 13 02:26:21.156 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Sep 13 02:26:21.172548 coreos-metadata[1577]: Sep 13 02:26:21.172 INFO Fetch successful Sep 13 02:26:21.172632 coreos-metadata[1577]: Sep 13 02:26:21.172 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Sep 13 02:26:21.187247 coreos-metadata[1577]: Sep 13 02:26:21.187 INFO Fetch successful Sep 13 02:26:21.187334 coreos-metadata[1577]: Sep 13 02:26:21.187 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Sep 13 02:26:21.207093 coreos-metadata[1577]: Sep 13 02:26:21.207 INFO Fetch successful Sep 13 02:26:21.207166 coreos-metadata[1577]: Sep 13 02:26:21.207 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Sep 13 02:26:21.228114 coreos-metadata[1577]: Sep 13 02:26:21.228 INFO Fetch successful Sep 13 02:26:21.262255 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 13 02:26:21.263380 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Sep 13 02:26:21.700327 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 13 02:26:21.711976 coreos-metadata[1659]: Sep 13 02:26:21.711 WARN failed to locate config-drive, using the metadata service API instead Sep 13 02:26:21.732861 coreos-metadata[1659]: Sep 13 02:26:21.732 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Sep 13 02:26:21.820932 coreos-metadata[1659]: Sep 13 02:26:21.820 INFO Fetch successful Sep 13 02:26:21.821422 coreos-metadata[1659]: Sep 13 02:26:21.821 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 13 02:26:21.852231 coreos-metadata[1659]: Sep 13 02:26:21.852 INFO Fetch successful Sep 13 02:26:21.854713 unknown[1659]: wrote ssh authorized keys file for user: core Sep 13 02:26:21.892977 update-ssh-keys[1800]: Updated "/home/core/.ssh/authorized_keys" Sep 13 02:26:21.894835 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 13 02:26:21.897507 systemd[1]: Finished sshkeys.service. Sep 13 02:26:21.900874 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 13 02:26:21.901566 systemd[1]: Startup finished in 3.467s (kernel) + 15.184s (initrd) + 11.698s (userspace) = 30.350s. Sep 13 02:26:27.369416 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 13 02:26:27.371890 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:26:27.550475 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:26:27.564864 (kubelet)[1812]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 02:26:27.667477 kubelet[1812]: E0913 02:26:27.667255 1812 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 02:26:27.672227 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 02:26:27.672704 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 02:26:27.673780 systemd[1]: kubelet.service: Consumed 217ms CPU time, 109.1M memory peak. Sep 13 02:26:30.795430 systemd[1]: Started sshd@3-10.230.67.142:22-139.178.89.65:37106.service - OpenSSH per-connection server daemon (139.178.89.65:37106). Sep 13 02:26:31.695960 sshd[1819]: Accepted publickey for core from 139.178.89.65 port 37106 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:31.697862 sshd-session[1819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:31.704776 systemd-logind[1589]: New session 6 of user core. Sep 13 02:26:31.720669 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 13 02:26:32.313291 sshd[1821]: Connection closed by 139.178.89.65 port 37106 Sep 13 02:26:32.314067 sshd-session[1819]: pam_unix(sshd:session): session closed for user core Sep 13 02:26:32.319023 systemd[1]: sshd@3-10.230.67.142:22-139.178.89.65:37106.service: Deactivated successfully. Sep 13 02:26:32.321060 systemd[1]: session-6.scope: Deactivated successfully. Sep 13 02:26:32.322460 systemd-logind[1589]: Session 6 logged out. Waiting for processes to exit. Sep 13 02:26:32.324185 systemd-logind[1589]: Removed session 6. 
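The boot summary above splits the 30.350 s total into kernel, initrd and userspace phases; the displayed components are rounded to milliseconds, so their sum lands within a millisecond of the reported total:

    # Phase durations as printed in the "Startup finished" line above.
    phases = {"kernel": 3.467, "initrd": 15.184, "userspace": 11.698}
    print(f"{sum(phases.values()):.3f}s")   # 30.349s vs. the reported 30.350s (display rounding)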
Sep 13 02:26:32.472751 systemd[1]: Started sshd@4-10.230.67.142:22-139.178.89.65:37120.service - OpenSSH per-connection server daemon (139.178.89.65:37120). Sep 13 02:26:33.375200 sshd[1827]: Accepted publickey for core from 139.178.89.65 port 37120 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:33.377139 sshd-session[1827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:33.384978 systemd-logind[1589]: New session 7 of user core. Sep 13 02:26:33.391479 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 13 02:26:33.990994 sshd[1829]: Connection closed by 139.178.89.65 port 37120 Sep 13 02:26:33.989998 sshd-session[1827]: pam_unix(sshd:session): session closed for user core Sep 13 02:26:33.994487 systemd[1]: sshd@4-10.230.67.142:22-139.178.89.65:37120.service: Deactivated successfully. Sep 13 02:26:33.996529 systemd[1]: session-7.scope: Deactivated successfully. Sep 13 02:26:33.997693 systemd-logind[1589]: Session 7 logged out. Waiting for processes to exit. Sep 13 02:26:33.999536 systemd-logind[1589]: Removed session 7. Sep 13 02:26:34.150202 systemd[1]: Started sshd@5-10.230.67.142:22-139.178.89.65:37132.service - OpenSSH per-connection server daemon (139.178.89.65:37132). Sep 13 02:26:35.072740 sshd[1835]: Accepted publickey for core from 139.178.89.65 port 37132 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:35.074736 sshd-session[1835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:35.083547 systemd-logind[1589]: New session 8 of user core. Sep 13 02:26:35.089517 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 13 02:26:35.703830 sshd[1837]: Connection closed by 139.178.89.65 port 37132 Sep 13 02:26:35.703063 sshd-session[1835]: pam_unix(sshd:session): session closed for user core Sep 13 02:26:35.707547 systemd[1]: sshd@5-10.230.67.142:22-139.178.89.65:37132.service: Deactivated successfully. Sep 13 02:26:35.710121 systemd[1]: session-8.scope: Deactivated successfully. Sep 13 02:26:35.712847 systemd-logind[1589]: Session 8 logged out. Waiting for processes to exit. Sep 13 02:26:35.715258 systemd-logind[1589]: Removed session 8. Sep 13 02:26:35.860999 systemd[1]: Started sshd@6-10.230.67.142:22-139.178.89.65:37134.service - OpenSSH per-connection server daemon (139.178.89.65:37134). Sep 13 02:26:36.763907 sshd[1843]: Accepted publickey for core from 139.178.89.65 port 37134 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:36.765612 sshd-session[1843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:36.774835 systemd-logind[1589]: New session 9 of user core. Sep 13 02:26:36.788606 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 13 02:26:37.257385 sudo[1846]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 13 02:26:37.257820 sudo[1846]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 02:26:37.275011 sudo[1846]: pam_unix(sudo:session): session closed for user root Sep 13 02:26:37.418290 sshd[1845]: Connection closed by 139.178.89.65 port 37134 Sep 13 02:26:37.419218 sshd-session[1843]: pam_unix(sshd:session): session closed for user core Sep 13 02:26:37.424364 systemd[1]: sshd@6-10.230.67.142:22-139.178.89.65:37134.service: Deactivated successfully. Sep 13 02:26:37.426541 systemd[1]: session-9.scope: Deactivated successfully. 
Sep 13 02:26:37.427621 systemd-logind[1589]: Session 9 logged out. Waiting for processes to exit. Sep 13 02:26:37.429700 systemd-logind[1589]: Removed session 9. Sep 13 02:26:37.573525 systemd[1]: Started sshd@7-10.230.67.142:22-139.178.89.65:37138.service - OpenSSH per-connection server daemon (139.178.89.65:37138). Sep 13 02:26:37.923124 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 13 02:26:37.926101 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:26:38.092726 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:26:38.105701 (kubelet)[1862]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 02:26:38.192546 kubelet[1862]: E0913 02:26:38.192388 1862 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 02:26:38.195426 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 02:26:38.195680 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 02:26:38.196377 systemd[1]: kubelet.service: Consumed 184ms CPU time, 110.2M memory peak. Sep 13 02:26:38.479392 sshd[1852]: Accepted publickey for core from 139.178.89.65 port 37138 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:38.481348 sshd-session[1852]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:38.488510 systemd-logind[1589]: New session 10 of user core. Sep 13 02:26:38.498492 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 13 02:26:38.958303 sudo[1871]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 13 02:26:38.959605 sudo[1871]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 02:26:38.966282 sudo[1871]: pam_unix(sudo:session): session closed for user root Sep 13 02:26:38.973742 sudo[1870]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 13 02:26:38.974125 sudo[1870]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 02:26:38.986524 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 13 02:26:39.043828 augenrules[1893]: No rules Sep 13 02:26:39.044677 systemd[1]: audit-rules.service: Deactivated successfully. Sep 13 02:26:39.044997 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 13 02:26:39.046538 sudo[1870]: pam_unix(sudo:session): session closed for user root Sep 13 02:26:39.191411 sshd[1869]: Connection closed by 139.178.89.65 port 37138 Sep 13 02:26:39.192482 sshd-session[1852]: pam_unix(sshd:session): session closed for user core Sep 13 02:26:39.196954 systemd[1]: sshd@7-10.230.67.142:22-139.178.89.65:37138.service: Deactivated successfully. Sep 13 02:26:39.199769 systemd[1]: session-10.scope: Deactivated successfully. Sep 13 02:26:39.201357 systemd-logind[1589]: Session 10 logged out. Waiting for processes to exit. Sep 13 02:26:39.203977 systemd-logind[1589]: Removed session 10. 
Sep 13 02:26:39.347534 systemd[1]: Started sshd@8-10.230.67.142:22-139.178.89.65:37154.service - OpenSSH per-connection server daemon (139.178.89.65:37154). Sep 13 02:26:40.252312 sshd[1902]: Accepted publickey for core from 139.178.89.65 port 37154 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:26:40.254138 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:26:40.260661 systemd-logind[1589]: New session 11 of user core. Sep 13 02:26:40.271468 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 13 02:26:40.730373 sudo[1905]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 13 02:26:40.731689 sudo[1905]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 02:26:41.221032 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 13 02:26:41.250073 (dockerd)[1924]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 13 02:26:41.563058 dockerd[1924]: time="2025-09-13T02:26:41.562577838Z" level=info msg="Starting up" Sep 13 02:26:41.567209 dockerd[1924]: time="2025-09-13T02:26:41.566912339Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 13 02:26:41.604453 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport323614327-merged.mount: Deactivated successfully. Sep 13 02:26:41.638133 dockerd[1924]: time="2025-09-13T02:26:41.638059448Z" level=info msg="Loading containers: start." Sep 13 02:26:41.652412 kernel: Initializing XFRM netlink socket Sep 13 02:26:41.958340 systemd-networkd[1500]: docker0: Link UP Sep 13 02:26:41.962364 dockerd[1924]: time="2025-09-13T02:26:41.962321589Z" level=info msg="Loading containers: done." Sep 13 02:26:41.983169 dockerd[1924]: time="2025-09-13T02:26:41.982646954Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 13 02:26:41.983169 dockerd[1924]: time="2025-09-13T02:26:41.982742394Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Sep 13 02:26:41.983169 dockerd[1924]: time="2025-09-13T02:26:41.982897134Z" level=info msg="Initializing buildkit" Sep 13 02:26:41.983301 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2565275469-merged.mount: Deactivated successfully. Sep 13 02:26:42.011604 dockerd[1924]: time="2025-09-13T02:26:42.011321097Z" level=info msg="Completed buildkit initialization" Sep 13 02:26:42.021253 dockerd[1924]: time="2025-09-13T02:26:42.021218092Z" level=info msg="Daemon has completed initialization" Sep 13 02:26:42.021576 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 13 02:26:42.022572 dockerd[1924]: time="2025-09-13T02:26:42.022233706Z" level=info msg="API listen on /run/docker.sock" Sep 13 02:26:43.111632 containerd[1607]: time="2025-09-13T02:26:43.111487316Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\"" Sep 13 02:26:43.856130 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2236358407.mount: Deactivated successfully. 
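dockerd's overlay2 warning above is informational: with CONFIG_OVERLAY_FS_REDIRECT_DIR enabled in this kernel, Docker skips the native diff path, which the message itself limits to degraded performance when building images. If one wanted to confirm the active storage driver and daemon version from the host (standard docker CLI queries, nothing specific to this log), the checks would look like:

    docker info --format '{{.Driver}}'              # expected to print: overlay2
    docker version --format '{{.Server.Version}}'   # the daemon above reports 28.0.1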
Sep 13 02:26:45.505594 containerd[1607]: time="2025-09-13T02:26:45.505516342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:45.507196 containerd[1607]: time="2025-09-13T02:26:45.507158350Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117132" Sep 13 02:26:45.507606 containerd[1607]: time="2025-09-13T02:26:45.507572907Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:45.510969 containerd[1607]: time="2025-09-13T02:26:45.510932974Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:45.512500 containerd[1607]: time="2025-09-13T02:26:45.512454856Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 2.400823392s" Sep 13 02:26:45.512582 containerd[1607]: time="2025-09-13T02:26:45.512516755Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\"" Sep 13 02:26:45.513874 containerd[1607]: time="2025-09-13T02:26:45.513832135Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\"" Sep 13 02:26:46.937755 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
Sep 13 02:26:47.840319 containerd[1607]: time="2025-09-13T02:26:47.839060694Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:47.840319 containerd[1607]: time="2025-09-13T02:26:47.840291566Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716640" Sep 13 02:26:47.841619 containerd[1607]: time="2025-09-13T02:26:47.841131326Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:47.845662 containerd[1607]: time="2025-09-13T02:26:47.844229614Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:47.845662 containerd[1607]: time="2025-09-13T02:26:47.845563708Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 2.331691584s" Sep 13 02:26:47.845662 containerd[1607]: time="2025-09-13T02:26:47.845614148Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\"" Sep 13 02:26:47.846999 containerd[1607]: time="2025-09-13T02:26:47.846948634Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\"" Sep 13 02:26:48.273200 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 13 02:26:48.276923 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:26:48.572419 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:26:48.583131 (kubelet)[2204]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 02:26:48.634838 kubelet[2204]: E0913 02:26:48.634785 2204 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 02:26:48.637291 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 02:26:48.637604 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 02:26:48.638046 systemd[1]: kubelet.service: Consumed 196ms CPU time, 108.7M memory peak. 
Sep 13 02:26:49.918536 containerd[1607]: time="2025-09-13T02:26:49.918463736Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:49.919716 containerd[1607]: time="2025-09-13T02:26:49.919649911Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787706" Sep 13 02:26:49.921291 containerd[1607]: time="2025-09-13T02:26:49.920361399Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:49.925630 containerd[1607]: time="2025-09-13T02:26:49.925597981Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:49.926868 containerd[1607]: time="2025-09-13T02:26:49.926834619Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 2.079833809s" Sep 13 02:26:49.926988 containerd[1607]: time="2025-09-13T02:26:49.926963763Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\"" Sep 13 02:26:49.928252 containerd[1607]: time="2025-09-13T02:26:49.928182874Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\"" Sep 13 02:26:52.086007 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount544489529.mount: Deactivated successfully. 
Sep 13 02:26:53.230343 containerd[1607]: time="2025-09-13T02:26:53.230260543Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:53.233287 containerd[1607]: time="2025-09-13T02:26:53.232849529Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410260" Sep 13 02:26:53.234746 containerd[1607]: time="2025-09-13T02:26:53.234711753Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:53.238504 containerd[1607]: time="2025-09-13T02:26:53.238466193Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:53.239104 containerd[1607]: time="2025-09-13T02:26:53.239070344Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 3.310819765s" Sep 13 02:26:53.239235 containerd[1607]: time="2025-09-13T02:26:53.239209889Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\"" Sep 13 02:26:53.239967 containerd[1607]: time="2025-09-13T02:26:53.239938958Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 13 02:26:53.886254 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2395736290.mount: Deactivated successfully. 
Sep 13 02:26:55.137283 containerd[1607]: time="2025-09-13T02:26:55.137178203Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:55.138972 containerd[1607]: time="2025-09-13T02:26:55.138694171Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Sep 13 02:26:55.139700 containerd[1607]: time="2025-09-13T02:26:55.139659951Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:55.143561 containerd[1607]: time="2025-09-13T02:26:55.143519479Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:26:55.145280 containerd[1607]: time="2025-09-13T02:26:55.145229839Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.905256911s" Sep 13 02:26:55.145361 containerd[1607]: time="2025-09-13T02:26:55.145285828Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 13 02:26:55.146505 containerd[1607]: time="2025-09-13T02:26:55.146433326Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 13 02:26:55.767225 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3223800401.mount: Deactivated successfully. 
Sep 13 02:26:55.773190 containerd[1607]: time="2025-09-13T02:26:55.773085556Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 02:26:55.774119 containerd[1607]: time="2025-09-13T02:26:55.774087843Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Sep 13 02:26:55.775306 containerd[1607]: time="2025-09-13T02:26:55.774823447Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 02:26:55.777208 containerd[1607]: time="2025-09-13T02:26:55.777140150Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 02:26:55.778781 containerd[1607]: time="2025-09-13T02:26:55.778221413Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 631.748558ms" Sep 13 02:26:55.778781 containerd[1607]: time="2025-09-13T02:26:55.778281088Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 13 02:26:55.779870 containerd[1607]: time="2025-09-13T02:26:55.779609470Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 13 02:26:56.557378 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount411809189.mount: Deactivated successfully. Sep 13 02:26:58.772882 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 13 02:26:58.777938 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:26:59.078505 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:26:59.090730 (kubelet)[2340]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 02:26:59.391649 kubelet[2340]: E0913 02:26:59.390448 2340 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 02:26:59.394372 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 02:26:59.394624 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 02:26:59.395116 systemd[1]: kubelet.service: Consumed 228ms CPU time, 105.6M memory peak. Sep 13 02:26:59.551675 update_engine[1593]: I20250913 02:26:59.551530 1593 update_attempter.cc:509] Updating boot flags... 
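The pulls in this stretch of the log (kube-apiserver, kube-controller-manager, kube-scheduler and kube-proxy at v1.31.13, coredns v1.11.3, pause 3.10, plus the etcd 3.5.15-0 pull that completes just below) match the control-plane image set a kubeadm bootstrap for Kubernetes v1.31 pre-pulls. Assuming install.sh is driving a kubeadm-style setup (the script itself is not shown in this log), the equivalent manual pre-pull and verification would be along these lines:

    # Assumption: kubeadm is installed and the target version is v1.31.13,
    # matching the image tags pulled above.
    kubeadm config images list --kubernetes-version v1.31.13
    kubeadm config images pull --kubernetes-version v1.31.13
    crictl images | grep registry.k8s.io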
Sep 13 02:27:00.790301 containerd[1607]: time="2025-09-13T02:27:00.789623780Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:00.790858 containerd[1607]: time="2025-09-13T02:27:00.790460334Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910717" Sep 13 02:27:00.792297 containerd[1607]: time="2025-09-13T02:27:00.791699859Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:00.795119 containerd[1607]: time="2025-09-13T02:27:00.795088087Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:00.796684 containerd[1607]: time="2025-09-13T02:27:00.796585895Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 5.016936723s" Sep 13 02:27:00.796684 containerd[1607]: time="2025-09-13T02:27:00.796628450Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 13 02:27:05.025185 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:27:05.025955 systemd[1]: kubelet.service: Consumed 228ms CPU time, 105.6M memory peak. Sep 13 02:27:05.029048 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:27:05.062151 systemd[1]: Reload requested from client PID 2395 ('systemctl') (unit session-11.scope)... Sep 13 02:27:05.062214 systemd[1]: Reloading... Sep 13 02:27:05.203346 zram_generator::config[2436]: No configuration found. Sep 13 02:27:05.357514 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 02:27:05.526148 systemd[1]: Reloading finished in 463 ms. Sep 13 02:27:05.599108 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 13 02:27:05.599502 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 13 02:27:05.600065 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:27:05.600137 systemd[1]: kubelet.service: Consumed 127ms CPU time, 98.5M memory peak. Sep 13 02:27:05.602309 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:27:05.884543 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:27:05.895675 (kubelet)[2507]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 13 02:27:05.956482 kubelet[2507]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 02:27:05.957293 kubelet[2507]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. Sep 13 02:27:05.957293 kubelet[2507]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 02:27:05.957293 kubelet[2507]: I0913 02:27:05.957105 2507 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 13 02:27:06.309248 kubelet[2507]: I0913 02:27:06.309176 2507 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 13 02:27:06.309248 kubelet[2507]: I0913 02:27:06.309219 2507 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 13 02:27:06.309622 kubelet[2507]: I0913 02:27:06.309578 2507 server.go:934] "Client rotation is on, will bootstrap in background" Sep 13 02:27:06.341345 kubelet[2507]: E0913 02:27:06.340312 2507 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.67.142:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:06.341345 kubelet[2507]: I0913 02:27:06.341120 2507 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 13 02:27:06.354474 kubelet[2507]: I0913 02:27:06.354444 2507 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 13 02:27:06.362924 kubelet[2507]: I0913 02:27:06.362871 2507 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 13 02:27:06.364916 kubelet[2507]: I0913 02:27:06.364783 2507 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 13 02:27:06.365152 kubelet[2507]: I0913 02:27:06.365069 2507 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 13 02:27:06.365386 kubelet[2507]: I0913 02:27:06.365121 2507 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-1di1n.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 13 02:27:06.365627 kubelet[2507]: I0913 02:27:06.365409 2507 topology_manager.go:138] "Creating topology manager with none policy" Sep 13 02:27:06.365627 kubelet[2507]: I0913 02:27:06.365425 2507 container_manager_linux.go:300] "Creating device plugin manager" Sep 13 02:27:06.365627 kubelet[2507]: I0913 02:27:06.365590 2507 state_mem.go:36] "Initialized new in-memory state store" Sep 13 02:27:06.368354 kubelet[2507]: I0913 02:27:06.368316 2507 kubelet.go:408] "Attempting to sync node with API server" Sep 13 02:27:06.368354 kubelet[2507]: I0913 02:27:06.368354 2507 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 13 02:27:06.370734 kubelet[2507]: I0913 02:27:06.370666 2507 kubelet.go:314] "Adding apiserver pod source" Sep 13 02:27:06.370734 kubelet[2507]: I0913 02:27:06.370723 2507 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 13 02:27:06.372706 kubelet[2507]: W0913 02:27:06.372452 2507 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.67.142:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-1di1n.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.67.142:6443: connect: connection refused Sep 13 02:27:06.372706 kubelet[2507]: E0913 02:27:06.372526 2507 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://10.230.67.142:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-1di1n.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:06.374373 kubelet[2507]: W0913 02:27:06.374330 2507 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.67.142:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.67.142:6443: connect: connection refused Sep 13 02:27:06.374525 kubelet[2507]: E0913 02:27:06.374495 2507 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.67.142:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:06.374712 kubelet[2507]: I0913 02:27:06.374678 2507 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 13 02:27:06.380286 kubelet[2507]: I0913 02:27:06.379565 2507 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 13 02:27:06.380286 kubelet[2507]: W0913 02:27:06.379674 2507 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 13 02:27:06.383211 kubelet[2507]: I0913 02:27:06.383189 2507 server.go:1274] "Started kubelet" Sep 13 02:27:06.386369 kubelet[2507]: I0913 02:27:06.386347 2507 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 13 02:27:06.398569 kubelet[2507]: I0913 02:27:06.398510 2507 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 13 02:27:06.400100 kubelet[2507]: I0913 02:27:06.400078 2507 server.go:449] "Adding debug handlers to kubelet server" Sep 13 02:27:06.403666 kubelet[2507]: I0913 02:27:06.403629 2507 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 13 02:27:06.404028 kubelet[2507]: I0913 02:27:06.404006 2507 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 13 02:27:06.404471 kubelet[2507]: I0913 02:27:06.404447 2507 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 13 02:27:06.408216 kubelet[2507]: E0913 02:27:06.404841 2507 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.67.142:6443/api/v1/namespaces/default/events\": dial tcp 10.230.67.142:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-1di1n.gb1.brightbox.com.1864b6828b76b01d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-1di1n.gb1.brightbox.com,UID:srv-1di1n.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-1di1n.gb1.brightbox.com,},FirstTimestamp:2025-09-13 02:27:06.383151133 +0000 UTC m=+0.482722328,LastTimestamp:2025-09-13 02:27:06.383151133 +0000 UTC m=+0.482722328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-1di1n.gb1.brightbox.com,}" Sep 13 02:27:06.408216 kubelet[2507]: E0913 
02:27:06.407918 2507 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-1di1n.gb1.brightbox.com\" not found" Sep 13 02:27:06.408216 kubelet[2507]: I0913 02:27:06.407981 2507 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 13 02:27:06.408644 kubelet[2507]: I0913 02:27:06.408620 2507 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 13 02:27:06.416205 kubelet[2507]: I0913 02:27:06.414062 2507 reconciler.go:26] "Reconciler: start to sync state" Sep 13 02:27:06.416205 kubelet[2507]: W0913 02:27:06.414818 2507 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.67.142:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.67.142:6443: connect: connection refused Sep 13 02:27:06.416205 kubelet[2507]: E0913 02:27:06.414872 2507 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.67.142:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:06.417735 kubelet[2507]: I0913 02:27:06.416645 2507 factory.go:221] Registration of the systemd container factory successfully Sep 13 02:27:06.417735 kubelet[2507]: I0913 02:27:06.416801 2507 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 13 02:27:06.418541 kubelet[2507]: E0913 02:27:06.418487 2507 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.67.142:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-1di1n.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.67.142:6443: connect: connection refused" interval="200ms" Sep 13 02:27:06.422834 kubelet[2507]: I0913 02:27:06.422796 2507 factory.go:221] Registration of the containerd container factory successfully Sep 13 02:27:06.438109 kubelet[2507]: I0913 02:27:06.437920 2507 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 13 02:27:06.439700 kubelet[2507]: I0913 02:27:06.439402 2507 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 13 02:27:06.439700 kubelet[2507]: I0913 02:27:06.439451 2507 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 13 02:27:06.439700 kubelet[2507]: I0913 02:27:06.439488 2507 kubelet.go:2321] "Starting kubelet main sync loop" Sep 13 02:27:06.439700 kubelet[2507]: E0913 02:27:06.439554 2507 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 13 02:27:06.448854 kubelet[2507]: W0913 02:27:06.448732 2507 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.67.142:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.67.142:6443: connect: connection refused Sep 13 02:27:06.448854 kubelet[2507]: E0913 02:27:06.448803 2507 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.67.142:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:06.453851 kubelet[2507]: E0913 02:27:06.453819 2507 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 13 02:27:06.469195 kubelet[2507]: I0913 02:27:06.469152 2507 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 13 02:27:06.469195 kubelet[2507]: I0913 02:27:06.469188 2507 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 13 02:27:06.469397 kubelet[2507]: I0913 02:27:06.469216 2507 state_mem.go:36] "Initialized new in-memory state store" Sep 13 02:27:06.471074 kubelet[2507]: I0913 02:27:06.471051 2507 policy_none.go:49] "None policy: Start" Sep 13 02:27:06.472081 kubelet[2507]: I0913 02:27:06.472037 2507 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 13 02:27:06.472186 kubelet[2507]: I0913 02:27:06.472094 2507 state_mem.go:35] "Initializing new in-memory state store" Sep 13 02:27:06.482954 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 13 02:27:06.495754 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 13 02:27:06.500725 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Sep 13 02:27:06.508219 kubelet[2507]: E0913 02:27:06.508150 2507 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-1di1n.gb1.brightbox.com\" not found" Sep 13 02:27:06.511549 kubelet[2507]: I0913 02:27:06.511296 2507 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 13 02:27:06.512297 kubelet[2507]: I0913 02:27:06.512261 2507 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 13 02:27:06.512468 kubelet[2507]: I0913 02:27:06.512416 2507 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 13 02:27:06.512934 kubelet[2507]: I0913 02:27:06.512913 2507 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 13 02:27:06.516818 kubelet[2507]: E0913 02:27:06.516650 2507 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-1di1n.gb1.brightbox.com\" not found" Sep 13 02:27:06.557815 systemd[1]: Created slice kubepods-burstable-pod2ab7b6318cbdf40b6f6cdbb00b2ab37d.slice - libcontainer container kubepods-burstable-pod2ab7b6318cbdf40b6f6cdbb00b2ab37d.slice. Sep 13 02:27:06.573848 systemd[1]: Created slice kubepods-burstable-pod87d83fbae50a4a45d8ba2fd00c500225.slice - libcontainer container kubepods-burstable-pod87d83fbae50a4a45d8ba2fd00c500225.slice. Sep 13 02:27:06.588232 systemd[1]: Created slice kubepods-burstable-pod5e4355412329407efdee23ffb45447a2.slice - libcontainer container kubepods-burstable-pod5e4355412329407efdee23ffb45447a2.slice. Sep 13 02:27:06.615067 kubelet[2507]: I0913 02:27:06.615004 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-flexvolume-dir\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615067 kubelet[2507]: I0913 02:27:06.615068 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615329 kubelet[2507]: I0913 02:27:06.615100 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5e4355412329407efdee23ffb45447a2-kubeconfig\") pod \"kube-scheduler-srv-1di1n.gb1.brightbox.com\" (UID: \"5e4355412329407efdee23ffb45447a2\") " pod="kube-system/kube-scheduler-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615329 kubelet[2507]: I0913 02:27:06.615125 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2ab7b6318cbdf40b6f6cdbb00b2ab37d-ca-certs\") pod \"kube-apiserver-srv-1di1n.gb1.brightbox.com\" (UID: \"2ab7b6318cbdf40b6f6cdbb00b2ab37d\") " pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615329 kubelet[2507]: I0913 02:27:06.615174 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-ca-certs\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615329 kubelet[2507]: I0913 02:27:06.615201 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-k8s-certs\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615329 kubelet[2507]: I0913 02:27:06.615225 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-kubeconfig\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615546 kubelet[2507]: I0913 02:27:06.615248 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2ab7b6318cbdf40b6f6cdbb00b2ab37d-k8s-certs\") pod \"kube-apiserver-srv-1di1n.gb1.brightbox.com\" (UID: \"2ab7b6318cbdf40b6f6cdbb00b2ab37d\") " pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615546 kubelet[2507]: I0913 02:27:06.615296 2507 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2ab7b6318cbdf40b6f6cdbb00b2ab37d-usr-share-ca-certificates\") pod \"kube-apiserver-srv-1di1n.gb1.brightbox.com\" (UID: \"2ab7b6318cbdf40b6f6cdbb00b2ab37d\") " pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.615718 kubelet[2507]: I0913 02:27:06.615691 2507 kubelet_node_status.go:72] "Attempting to register node" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.616282 kubelet[2507]: E0913 02:27:06.616233 2507 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.67.142:6443/api/v1/nodes\": dial tcp 10.230.67.142:6443: connect: connection refused" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.619639 kubelet[2507]: E0913 02:27:06.619603 2507 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.67.142:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-1di1n.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.67.142:6443: connect: connection refused" interval="400ms" Sep 13 02:27:06.819861 kubelet[2507]: I0913 02:27:06.819820 2507 kubelet_node_status.go:72] "Attempting to register node" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.820579 kubelet[2507]: E0913 02:27:06.820541 2507 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.67.142:6443/api/v1/nodes\": dial tcp 10.230.67.142:6443: connect: connection refused" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:06.871412 containerd[1607]: time="2025-09-13T02:27:06.871243353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-1di1n.gb1.brightbox.com,Uid:2ab7b6318cbdf40b6f6cdbb00b2ab37d,Namespace:kube-system,Attempt:0,}" Sep 13 02:27:06.886341 containerd[1607]: time="2025-09-13T02:27:06.886296563Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-1di1n.gb1.brightbox.com,Uid:87d83fbae50a4a45d8ba2fd00c500225,Namespace:kube-system,Attempt:0,}" Sep 13 02:27:06.896526 containerd[1607]: time="2025-09-13T02:27:06.896424655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-1di1n.gb1.brightbox.com,Uid:5e4355412329407efdee23ffb45447a2,Namespace:kube-system,Attempt:0,}" Sep 13 02:27:07.021145 kubelet[2507]: E0913 02:27:07.021052 2507 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.67.142:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-1di1n.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.67.142:6443: connect: connection refused" interval="800ms" Sep 13 02:27:07.032694 containerd[1607]: time="2025-09-13T02:27:07.032615966Z" level=info msg="connecting to shim 6ace00b3710e6b87766d18d6b71c916ba0f4d588a93fc786019e49eaa8b90e27" address="unix:///run/containerd/s/950e81f3132e7730f45a566c87a01d9e36af10072fed03aa2e554f0044e4b40c" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:27:07.037300 containerd[1607]: time="2025-09-13T02:27:07.037212437Z" level=info msg="connecting to shim 41e6118736f1f4b431149009ebf5d5622e26fd10e8bd69666bbbeede8a36b24c" address="unix:///run/containerd/s/77af3095fed9744aee108dda285844d946774c2141da2710f74e76eac8ea9127" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:27:07.040942 containerd[1607]: time="2025-09-13T02:27:07.040851723Z" level=info msg="connecting to shim cc3b1adbc6f2ea233f1bf24757ce7b9c38930338c2cdf6cf5f0e84f5bfd4cbde" address="unix:///run/containerd/s/af09060253715214e40c1196e3bede4dc3f10a4e06512fdf467c5cab384dcc5f" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:27:07.154494 systemd[1]: Started cri-containerd-41e6118736f1f4b431149009ebf5d5622e26fd10e8bd69666bbbeede8a36b24c.scope - libcontainer container 41e6118736f1f4b431149009ebf5d5622e26fd10e8bd69666bbbeede8a36b24c. Sep 13 02:27:07.169614 systemd[1]: Started cri-containerd-6ace00b3710e6b87766d18d6b71c916ba0f4d588a93fc786019e49eaa8b90e27.scope - libcontainer container 6ace00b3710e6b87766d18d6b71c916ba0f4d588a93fc786019e49eaa8b90e27. Sep 13 02:27:07.172890 systemd[1]: Started cri-containerd-cc3b1adbc6f2ea233f1bf24757ce7b9c38930338c2cdf6cf5f0e84f5bfd4cbde.scope - libcontainer container cc3b1adbc6f2ea233f1bf24757ce7b9c38930338c2cdf6cf5f0e84f5bfd4cbde. 
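The repeated "dial tcp 10.230.67.142:6443: connect: connection refused" errors from the kubelet's reflectors, lease controller and node registration are expected at this stage of bootstrap: the kubelet itself is what starts the kube-apiserver static pod, so nothing answers on port 6443 until the sandboxes started above have their containers created and running a few entries below. A simple host-side way to watch for the API server coming up during this window (assuming curl is available on the host) is a loop such as:

    # Poll the apiserver health endpoint until it responds; -k skips TLS
    # verification, which is acceptable for a bootstrap liveness check.
    until curl -ksf https://10.230.67.142:6443/healthz >/dev/null; do
        sleep 2
    done
    echo "kube-apiserver is answering on 6443"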
Sep 13 02:27:07.189784 kubelet[2507]: W0913 02:27:07.189555 2507 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.67.142:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.67.142:6443: connect: connection refused Sep 13 02:27:07.189784 kubelet[2507]: E0913 02:27:07.189638 2507 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.67.142:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:07.225931 kubelet[2507]: I0913 02:27:07.225818 2507 kubelet_node_status.go:72] "Attempting to register node" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:07.227075 kubelet[2507]: E0913 02:27:07.226942 2507 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.67.142:6443/api/v1/nodes\": dial tcp 10.230.67.142:6443: connect: connection refused" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:07.246372 kubelet[2507]: W0913 02:27:07.246128 2507 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.67.142:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-1di1n.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.67.142:6443: connect: connection refused Sep 13 02:27:07.246749 kubelet[2507]: E0913 02:27:07.246488 2507 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.67.142:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-1di1n.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:07.258941 containerd[1607]: time="2025-09-13T02:27:07.258779954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-1di1n.gb1.brightbox.com,Uid:87d83fbae50a4a45d8ba2fd00c500225,Namespace:kube-system,Attempt:0,} returns sandbox id \"41e6118736f1f4b431149009ebf5d5622e26fd10e8bd69666bbbeede8a36b24c\"" Sep 13 02:27:07.266204 containerd[1607]: time="2025-09-13T02:27:07.266124267Z" level=info msg="CreateContainer within sandbox \"41e6118736f1f4b431149009ebf5d5622e26fd10e8bd69666bbbeede8a36b24c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 13 02:27:07.276032 containerd[1607]: time="2025-09-13T02:27:07.275970564Z" level=info msg="Container f528be087ce585603aef059b694d9d061391a8b1db26bd00668f8018823077fa: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:07.286589 containerd[1607]: time="2025-09-13T02:27:07.286445071Z" level=info msg="CreateContainer within sandbox \"41e6118736f1f4b431149009ebf5d5622e26fd10e8bd69666bbbeede8a36b24c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f528be087ce585603aef059b694d9d061391a8b1db26bd00668f8018823077fa\"" Sep 13 02:27:07.288094 containerd[1607]: time="2025-09-13T02:27:07.288060585Z" level=info msg="StartContainer for \"f528be087ce585603aef059b694d9d061391a8b1db26bd00668f8018823077fa\"" Sep 13 02:27:07.294654 containerd[1607]: time="2025-09-13T02:27:07.294620582Z" level=info msg="connecting to shim f528be087ce585603aef059b694d9d061391a8b1db26bd00668f8018823077fa" 
address="unix:///run/containerd/s/77af3095fed9744aee108dda285844d946774c2141da2710f74e76eac8ea9127" protocol=ttrpc version=3 Sep 13 02:27:07.338934 systemd[1]: Started cri-containerd-f528be087ce585603aef059b694d9d061391a8b1db26bd00668f8018823077fa.scope - libcontainer container f528be087ce585603aef059b694d9d061391a8b1db26bd00668f8018823077fa. Sep 13 02:27:07.361903 containerd[1607]: time="2025-09-13T02:27:07.361754650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-1di1n.gb1.brightbox.com,Uid:2ab7b6318cbdf40b6f6cdbb00b2ab37d,Namespace:kube-system,Attempt:0,} returns sandbox id \"6ace00b3710e6b87766d18d6b71c916ba0f4d588a93fc786019e49eaa8b90e27\"" Sep 13 02:27:07.362933 containerd[1607]: time="2025-09-13T02:27:07.362814071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-1di1n.gb1.brightbox.com,Uid:5e4355412329407efdee23ffb45447a2,Namespace:kube-system,Attempt:0,} returns sandbox id \"cc3b1adbc6f2ea233f1bf24757ce7b9c38930338c2cdf6cf5f0e84f5bfd4cbde\"" Sep 13 02:27:07.369961 containerd[1607]: time="2025-09-13T02:27:07.369924465Z" level=info msg="CreateContainer within sandbox \"6ace00b3710e6b87766d18d6b71c916ba0f4d588a93fc786019e49eaa8b90e27\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 13 02:27:07.370552 containerd[1607]: time="2025-09-13T02:27:07.370256402Z" level=info msg="CreateContainer within sandbox \"cc3b1adbc6f2ea233f1bf24757ce7b9c38930338c2cdf6cf5f0e84f5bfd4cbde\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 13 02:27:07.382346 containerd[1607]: time="2025-09-13T02:27:07.382309857Z" level=info msg="Container 0a3ff25853ab6f8c7760144da82767fae6f7cc7c8cc202c96cac9f6cdcbf1664: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:07.383049 containerd[1607]: time="2025-09-13T02:27:07.383006058Z" level=info msg="Container c33ddad3315171bb2535e88783212336b4ac9d42981b2bda194dd0557b730032: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:07.390423 containerd[1607]: time="2025-09-13T02:27:07.390378763Z" level=info msg="CreateContainer within sandbox \"6ace00b3710e6b87766d18d6b71c916ba0f4d588a93fc786019e49eaa8b90e27\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"0a3ff25853ab6f8c7760144da82767fae6f7cc7c8cc202c96cac9f6cdcbf1664\"" Sep 13 02:27:07.392367 containerd[1607]: time="2025-09-13T02:27:07.392255653Z" level=info msg="CreateContainer within sandbox \"cc3b1adbc6f2ea233f1bf24757ce7b9c38930338c2cdf6cf5f0e84f5bfd4cbde\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c33ddad3315171bb2535e88783212336b4ac9d42981b2bda194dd0557b730032\"" Sep 13 02:27:07.392828 containerd[1607]: time="2025-09-13T02:27:07.392334762Z" level=info msg="StartContainer for \"0a3ff25853ab6f8c7760144da82767fae6f7cc7c8cc202c96cac9f6cdcbf1664\"" Sep 13 02:27:07.392828 containerd[1607]: time="2025-09-13T02:27:07.392723443Z" level=info msg="StartContainer for \"c33ddad3315171bb2535e88783212336b4ac9d42981b2bda194dd0557b730032\"" Sep 13 02:27:07.393939 containerd[1607]: time="2025-09-13T02:27:07.393873825Z" level=info msg="connecting to shim c33ddad3315171bb2535e88783212336b4ac9d42981b2bda194dd0557b730032" address="unix:///run/containerd/s/af09060253715214e40c1196e3bede4dc3f10a4e06512fdf467c5cab384dcc5f" protocol=ttrpc version=3 Sep 13 02:27:07.399734 containerd[1607]: time="2025-09-13T02:27:07.399035773Z" level=info msg="connecting to shim 0a3ff25853ab6f8c7760144da82767fae6f7cc7c8cc202c96cac9f6cdcbf1664" 
address="unix:///run/containerd/s/950e81f3132e7730f45a566c87a01d9e36af10072fed03aa2e554f0044e4b40c" protocol=ttrpc version=3 Sep 13 02:27:07.427679 systemd[1]: Started cri-containerd-c33ddad3315171bb2535e88783212336b4ac9d42981b2bda194dd0557b730032.scope - libcontainer container c33ddad3315171bb2535e88783212336b4ac9d42981b2bda194dd0557b730032. Sep 13 02:27:07.454593 systemd[1]: Started cri-containerd-0a3ff25853ab6f8c7760144da82767fae6f7cc7c8cc202c96cac9f6cdcbf1664.scope - libcontainer container 0a3ff25853ab6f8c7760144da82767fae6f7cc7c8cc202c96cac9f6cdcbf1664. Sep 13 02:27:07.475184 containerd[1607]: time="2025-09-13T02:27:07.474805866Z" level=info msg="StartContainer for \"f528be087ce585603aef059b694d9d061391a8b1db26bd00668f8018823077fa\" returns successfully" Sep 13 02:27:07.546589 kubelet[2507]: W0913 02:27:07.546451 2507 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.67.142:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.67.142:6443: connect: connection refused Sep 13 02:27:07.547367 kubelet[2507]: E0913 02:27:07.546902 2507 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.67.142:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:07.556302 containerd[1607]: time="2025-09-13T02:27:07.556212731Z" level=info msg="StartContainer for \"c33ddad3315171bb2535e88783212336b4ac9d42981b2bda194dd0557b730032\" returns successfully" Sep 13 02:27:07.594833 containerd[1607]: time="2025-09-13T02:27:07.594781107Z" level=info msg="StartContainer for \"0a3ff25853ab6f8c7760144da82767fae6f7cc7c8cc202c96cac9f6cdcbf1664\" returns successfully" Sep 13 02:27:07.822070 kubelet[2507]: E0913 02:27:07.821974 2507 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.67.142:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-1di1n.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.67.142:6443: connect: connection refused" interval="1.6s" Sep 13 02:27:07.843772 kubelet[2507]: W0913 02:27:07.843666 2507 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.67.142:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.67.142:6443: connect: connection refused Sep 13 02:27:07.843772 kubelet[2507]: E0913 02:27:07.843741 2507 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.67.142:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.67.142:6443: connect: connection refused" logger="UnhandledError" Sep 13 02:27:08.030301 kubelet[2507]: I0913 02:27:08.030243 2507 kubelet_node_status.go:72] "Attempting to register node" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:10.069199 kubelet[2507]: I0913 02:27:10.068951 2507 kubelet_node_status.go:75] "Successfully registered node" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:10.069199 kubelet[2507]: E0913 02:27:10.068994 2507 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"srv-1di1n.gb1.brightbox.com\": node \"srv-1di1n.gb1.brightbox.com\" not found" Sep 13 02:27:10.377298 kubelet[2507]: I0913 02:27:10.377151 2507 
apiserver.go:52] "Watching apiserver" Sep 13 02:27:10.409187 kubelet[2507]: I0913 02:27:10.409085 2507 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 13 02:27:10.521542 kubelet[2507]: E0913 02:27:10.521456 2507 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-1di1n.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:12.354680 systemd[1]: Reload requested from client PID 2782 ('systemctl') (unit session-11.scope)... Sep 13 02:27:12.355138 systemd[1]: Reloading... Sep 13 02:27:12.485311 zram_generator::config[2826]: No configuration found. Sep 13 02:27:12.641843 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 02:27:12.832839 systemd[1]: Reloading finished in 477 ms. Sep 13 02:27:12.866302 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:27:12.879373 systemd[1]: kubelet.service: Deactivated successfully. Sep 13 02:27:12.880350 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:27:12.880457 systemd[1]: kubelet.service: Consumed 953ms CPU time, 128.4M memory peak. Sep 13 02:27:12.884571 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 02:27:13.204778 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 02:27:13.215188 (kubelet)[2891]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 13 02:27:13.288369 kubelet[2891]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 02:27:13.288369 kubelet[2891]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 13 02:27:13.288369 kubelet[2891]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 02:27:13.288944 kubelet[2891]: I0913 02:27:13.288445 2891 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 13 02:27:13.298225 kubelet[2891]: I0913 02:27:13.298167 2891 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 13 02:27:13.298225 kubelet[2891]: I0913 02:27:13.298202 2891 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 13 02:27:13.298550 kubelet[2891]: I0913 02:27:13.298527 2891 server.go:934] "Client rotation is on, will bootstrap in background" Sep 13 02:27:13.300161 kubelet[2891]: I0913 02:27:13.300129 2891 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
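Unlike the first kubelet start, this one (kubelet[2891], after the systemd reload) finds a client certificate at /var/lib/kubelet/pki/kubelet-client-current.pem; with the API server now up and a credential already on disk, the certificate-signing-request failure seen on the first start does not recur. A routine host-side check of that rotated credential (not something the log performs itself) is to print its subject and expiry with openssl:

    # kubelet-client-current.pem points at the most recently issued client
    # cert/key bundle; show the identity it presents and when it expires.
    openssl x509 -in /var/lib/kubelet/pki/kubelet-client-current.pem \
        -noout -subject -enddate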
Sep 13 02:27:13.302644 kubelet[2891]: I0913 02:27:13.302596 2891 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 13 02:27:13.309582 kubelet[2891]: I0913 02:27:13.309521 2891 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 13 02:27:13.315292 kubelet[2891]: I0913 02:27:13.314677 2891 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 13 02:27:13.315292 kubelet[2891]: I0913 02:27:13.314835 2891 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 13 02:27:13.315292 kubelet[2891]: I0913 02:27:13.315042 2891 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 13 02:27:13.315996 kubelet[2891]: I0913 02:27:13.315080 2891 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-1di1n.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 13 02:27:13.316309 kubelet[2891]: I0913 02:27:13.316288 2891 topology_manager.go:138] "Creating topology manager with none policy" Sep 13 02:27:13.316550 kubelet[2891]: I0913 02:27:13.316430 2891 container_manager_linux.go:300] "Creating device plugin manager" Sep 13 02:27:13.316694 kubelet[2891]: I0913 02:27:13.316674 2891 state_mem.go:36] "Initialized new in-memory state store" Sep 13 02:27:13.316915 kubelet[2891]: I0913 02:27:13.316897 2891 kubelet.go:408] "Attempting to sync node with API server" Sep 13 02:27:13.317012 kubelet[2891]: I0913 02:27:13.316996 2891 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 13 02:27:13.317141 kubelet[2891]: I0913 02:27:13.317123 2891 kubelet.go:314] "Adding apiserver pod source" Sep 13 02:27:13.317245 kubelet[2891]: I0913 02:27:13.317227 2891 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 13 02:27:13.324291 kubelet[2891]: I0913 02:27:13.323223 2891 kuberuntime_manager.go:262] "Container runtime initialized" 
containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 13 02:27:13.324291 kubelet[2891]: I0913 02:27:13.323744 2891 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 13 02:27:13.324500 kubelet[2891]: I0913 02:27:13.324309 2891 server.go:1274] "Started kubelet" Sep 13 02:27:13.327513 kubelet[2891]: I0913 02:27:13.326513 2891 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 13 02:27:13.336221 kubelet[2891]: I0913 02:27:13.335098 2891 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 13 02:27:13.338388 kubelet[2891]: I0913 02:27:13.337501 2891 server.go:449] "Adding debug handlers to kubelet server" Sep 13 02:27:13.338937 kubelet[2891]: I0913 02:27:13.338902 2891 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 13 02:27:13.339279 kubelet[2891]: I0913 02:27:13.339184 2891 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 13 02:27:13.340292 kubelet[2891]: I0913 02:27:13.339735 2891 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 13 02:27:13.344260 kubelet[2891]: I0913 02:27:13.343930 2891 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 13 02:27:13.346332 kubelet[2891]: E0913 02:27:13.344667 2891 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-1di1n.gb1.brightbox.com\" not found" Sep 13 02:27:13.364328 kubelet[2891]: I0913 02:27:13.364116 2891 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 13 02:27:13.364467 kubelet[2891]: I0913 02:27:13.364360 2891 reconciler.go:26] "Reconciler: start to sync state" Sep 13 02:27:13.370350 kubelet[2891]: I0913 02:27:13.370291 2891 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 13 02:27:13.370738 kubelet[2891]: I0913 02:27:13.370706 2891 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 13 02:27:13.372517 kubelet[2891]: I0913 02:27:13.372495 2891 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 13 02:27:13.372654 kubelet[2891]: I0913 02:27:13.372636 2891 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 13 02:27:13.372783 kubelet[2891]: I0913 02:27:13.372766 2891 kubelet.go:2321] "Starting kubelet main sync loop" Sep 13 02:27:13.372921 kubelet[2891]: E0913 02:27:13.372896 2891 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 13 02:27:13.383570 kubelet[2891]: I0913 02:27:13.383531 2891 factory.go:221] Registration of the containerd container factory successfully Sep 13 02:27:13.383570 kubelet[2891]: I0913 02:27:13.383561 2891 factory.go:221] Registration of the systemd container factory successfully Sep 13 02:27:13.384850 kubelet[2891]: E0913 02:27:13.384825 2891 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 13 02:27:13.461919 kubelet[2891]: I0913 02:27:13.459074 2891 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 13 02:27:13.461919 kubelet[2891]: I0913 02:27:13.460440 2891 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 13 02:27:13.461919 kubelet[2891]: I0913 02:27:13.460493 2891 state_mem.go:36] "Initialized new in-memory state store" Sep 13 02:27:13.461919 kubelet[2891]: I0913 02:27:13.460715 2891 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 13 02:27:13.461919 kubelet[2891]: I0913 02:27:13.460733 2891 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 13 02:27:13.461919 kubelet[2891]: I0913 02:27:13.460760 2891 policy_none.go:49] "None policy: Start" Sep 13 02:27:13.463375 kubelet[2891]: I0913 02:27:13.463318 2891 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 13 02:27:13.463448 kubelet[2891]: I0913 02:27:13.463385 2891 state_mem.go:35] "Initializing new in-memory state store" Sep 13 02:27:13.464093 kubelet[2891]: I0913 02:27:13.463567 2891 state_mem.go:75] "Updated machine memory state" Sep 13 02:27:13.471735 kubelet[2891]: I0913 02:27:13.471150 2891 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 13 02:27:13.471735 kubelet[2891]: I0913 02:27:13.471389 2891 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 13 02:27:13.471735 kubelet[2891]: I0913 02:27:13.471405 2891 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 13 02:27:13.476303 kubelet[2891]: I0913 02:27:13.475327 2891 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 13 02:27:13.499570 kubelet[2891]: W0913 02:27:13.499523 2891 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 02:27:13.501888 kubelet[2891]: W0913 02:27:13.500651 2891 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 02:27:13.504596 kubelet[2891]: W0913 02:27:13.503654 2891 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 02:27:13.565436 kubelet[2891]: I0913 02:27:13.565337 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.565436 kubelet[2891]: I0913 02:27:13.565402 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2ab7b6318cbdf40b6f6cdbb00b2ab37d-ca-certs\") pod \"kube-apiserver-srv-1di1n.gb1.brightbox.com\" (UID: \"2ab7b6318cbdf40b6f6cdbb00b2ab37d\") " pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.566137 kubelet[2891]: I0913 02:27:13.565445 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/2ab7b6318cbdf40b6f6cdbb00b2ab37d-k8s-certs\") pod \"kube-apiserver-srv-1di1n.gb1.brightbox.com\" (UID: \"2ab7b6318cbdf40b6f6cdbb00b2ab37d\") " pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.566137 kubelet[2891]: I0913 02:27:13.565498 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2ab7b6318cbdf40b6f6cdbb00b2ab37d-usr-share-ca-certificates\") pod \"kube-apiserver-srv-1di1n.gb1.brightbox.com\" (UID: \"2ab7b6318cbdf40b6f6cdbb00b2ab37d\") " pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.566137 kubelet[2891]: I0913 02:27:13.565535 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-flexvolume-dir\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.566137 kubelet[2891]: I0913 02:27:13.565564 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-k8s-certs\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.566137 kubelet[2891]: I0913 02:27:13.565590 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-kubeconfig\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.566827 kubelet[2891]: I0913 02:27:13.565633 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/87d83fbae50a4a45d8ba2fd00c500225-ca-certs\") pod \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" (UID: \"87d83fbae50a4a45d8ba2fd00c500225\") " pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.566827 kubelet[2891]: I0913 02:27:13.565674 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5e4355412329407efdee23ffb45447a2-kubeconfig\") pod \"kube-scheduler-srv-1di1n.gb1.brightbox.com\" (UID: \"5e4355412329407efdee23ffb45447a2\") " pod="kube-system/kube-scheduler-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.615375 kubelet[2891]: I0913 02:27:13.615335 2891 kubelet_node_status.go:72] "Attempting to register node" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.626056 kubelet[2891]: I0913 02:27:13.625706 2891 kubelet_node_status.go:111] "Node was previously registered" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:13.626056 kubelet[2891]: I0913 02:27:13.625809 2891 kubelet_node_status.go:75] "Successfully registered node" node="srv-1di1n.gb1.brightbox.com" Sep 13 02:27:14.321301 kubelet[2891]: I0913 02:27:14.320300 2891 apiserver.go:52] "Watching apiserver" Sep 13 02:27:14.364742 kubelet[2891]: I0913 02:27:14.364645 2891 desired_state_of_world_populator.go:155] "Finished populating 
initial desired state of world" Sep 13 02:27:14.439361 kubelet[2891]: W0913 02:27:14.438972 2891 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 02:27:14.439361 kubelet[2891]: E0913 02:27:14.439046 2891 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-1di1n.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:14.441559 kubelet[2891]: W0913 02:27:14.441536 2891 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 02:27:14.441707 kubelet[2891]: E0913 02:27:14.441684 2891 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-srv-1di1n.gb1.brightbox.com\" already exists" pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" Sep 13 02:27:14.483856 kubelet[2891]: I0913 02:27:14.483056 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-1di1n.gb1.brightbox.com" podStartSLOduration=1.483019787 podStartE2EDuration="1.483019787s" podCreationTimestamp="2025-09-13 02:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 02:27:14.471555599 +0000 UTC m=+1.238866388" watchObservedRunningTime="2025-09-13 02:27:14.483019787 +0000 UTC m=+1.250330564" Sep 13 02:27:14.495202 kubelet[2891]: I0913 02:27:14.495136 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-1di1n.gb1.brightbox.com" podStartSLOduration=1.495113847 podStartE2EDuration="1.495113847s" podCreationTimestamp="2025-09-13 02:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 02:27:14.483611667 +0000 UTC m=+1.250922483" watchObservedRunningTime="2025-09-13 02:27:14.495113847 +0000 UTC m=+1.262424618" Sep 13 02:27:14.512021 kubelet[2891]: I0913 02:27:14.511666 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-1di1n.gb1.brightbox.com" podStartSLOduration=1.511647212 podStartE2EDuration="1.511647212s" podCreationTimestamp="2025-09-13 02:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 02:27:14.497588712 +0000 UTC m=+1.264899499" watchObservedRunningTime="2025-09-13 02:27:14.511647212 +0000 UTC m=+1.278957992" Sep 13 02:27:18.672931 kubelet[2891]: I0913 02:27:18.672817 2891 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 13 02:27:18.674330 containerd[1607]: time="2025-09-13T02:27:18.673763389Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 13 02:27:18.675301 kubelet[2891]: I0913 02:27:18.674750 2891 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 13 02:27:19.619324 systemd[1]: Created slice kubepods-besteffort-pod8d309c7a_9bb7_4f03_b9ad_4f584652088c.slice - libcontainer container kubepods-besteffort-pod8d309c7a_9bb7_4f03_b9ad_4f584652088c.slice. 
Sep 13 02:27:19.701681 kubelet[2891]: I0913 02:27:19.701623 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8d309c7a-9bb7-4f03-b9ad-4f584652088c-kube-proxy\") pod \"kube-proxy-mfpbp\" (UID: \"8d309c7a-9bb7-4f03-b9ad-4f584652088c\") " pod="kube-system/kube-proxy-mfpbp" Sep 13 02:27:19.701681 kubelet[2891]: I0913 02:27:19.701688 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dhjk\" (UniqueName: \"kubernetes.io/projected/8d309c7a-9bb7-4f03-b9ad-4f584652088c-kube-api-access-7dhjk\") pod \"kube-proxy-mfpbp\" (UID: \"8d309c7a-9bb7-4f03-b9ad-4f584652088c\") " pod="kube-system/kube-proxy-mfpbp" Sep 13 02:27:19.702467 kubelet[2891]: I0913 02:27:19.701722 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8d309c7a-9bb7-4f03-b9ad-4f584652088c-xtables-lock\") pod \"kube-proxy-mfpbp\" (UID: \"8d309c7a-9bb7-4f03-b9ad-4f584652088c\") " pod="kube-system/kube-proxy-mfpbp" Sep 13 02:27:19.702467 kubelet[2891]: I0913 02:27:19.701747 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d309c7a-9bb7-4f03-b9ad-4f584652088c-lib-modules\") pod \"kube-proxy-mfpbp\" (UID: \"8d309c7a-9bb7-4f03-b9ad-4f584652088c\") " pod="kube-system/kube-proxy-mfpbp" Sep 13 02:27:19.756413 systemd[1]: Created slice kubepods-besteffort-pode320b53e_e70f_423d_b7a1_9ece708ac185.slice - libcontainer container kubepods-besteffort-pode320b53e_e70f_423d_b7a1_9ece708ac185.slice. Sep 13 02:27:19.802707 kubelet[2891]: I0913 02:27:19.802648 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e320b53e-e70f-423d-b7a1-9ece708ac185-var-lib-calico\") pod \"tigera-operator-58fc44c59b-22w4l\" (UID: \"e320b53e-e70f-423d-b7a1-9ece708ac185\") " pod="tigera-operator/tigera-operator-58fc44c59b-22w4l" Sep 13 02:27:19.803090 kubelet[2891]: I0913 02:27:19.803056 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kvcp\" (UniqueName: \"kubernetes.io/projected/e320b53e-e70f-423d-b7a1-9ece708ac185-kube-api-access-4kvcp\") pod \"tigera-operator-58fc44c59b-22w4l\" (UID: \"e320b53e-e70f-423d-b7a1-9ece708ac185\") " pod="tigera-operator/tigera-operator-58fc44c59b-22w4l" Sep 13 02:27:19.930372 containerd[1607]: time="2025-09-13T02:27:19.930225828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mfpbp,Uid:8d309c7a-9bb7-4f03-b9ad-4f584652088c,Namespace:kube-system,Attempt:0,}" Sep 13 02:27:19.960260 containerd[1607]: time="2025-09-13T02:27:19.960170253Z" level=info msg="connecting to shim b8e1a9be913c44cee82068644b962230129f5852c1ee91d35f94f2d89c2ca571" address="unix:///run/containerd/s/f8a198d42adc875ea09f20947fab75abc0e1a65c25ff49b88ac996997c64f4cf" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:27:20.002580 systemd[1]: Started cri-containerd-b8e1a9be913c44cee82068644b962230129f5852c1ee91d35f94f2d89c2ca571.scope - libcontainer container b8e1a9be913c44cee82068644b962230129f5852c1ee91d35f94f2d89c2ca571. 
Sep 13 02:27:20.048206 containerd[1607]: time="2025-09-13T02:27:20.048108055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mfpbp,Uid:8d309c7a-9bb7-4f03-b9ad-4f584652088c,Namespace:kube-system,Attempt:0,} returns sandbox id \"b8e1a9be913c44cee82068644b962230129f5852c1ee91d35f94f2d89c2ca571\"" Sep 13 02:27:20.053645 containerd[1607]: time="2025-09-13T02:27:20.053610111Z" level=info msg="CreateContainer within sandbox \"b8e1a9be913c44cee82068644b962230129f5852c1ee91d35f94f2d89c2ca571\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 13 02:27:20.067080 containerd[1607]: time="2025-09-13T02:27:20.066934344Z" level=info msg="Container 4cd8571bfb598dbaf31b25f635a096129f198da45943ef4c3ce88a89cb67b420: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:20.068713 containerd[1607]: time="2025-09-13T02:27:20.068648750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-22w4l,Uid:e320b53e-e70f-423d-b7a1-9ece708ac185,Namespace:tigera-operator,Attempt:0,}" Sep 13 02:27:20.093797 containerd[1607]: time="2025-09-13T02:27:20.093749264Z" level=info msg="CreateContainer within sandbox \"b8e1a9be913c44cee82068644b962230129f5852c1ee91d35f94f2d89c2ca571\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"4cd8571bfb598dbaf31b25f635a096129f198da45943ef4c3ce88a89cb67b420\"" Sep 13 02:27:20.094790 containerd[1607]: time="2025-09-13T02:27:20.094586025Z" level=info msg="StartContainer for \"4cd8571bfb598dbaf31b25f635a096129f198da45943ef4c3ce88a89cb67b420\"" Sep 13 02:27:20.097114 containerd[1607]: time="2025-09-13T02:27:20.097080755Z" level=info msg="connecting to shim 4cd8571bfb598dbaf31b25f635a096129f198da45943ef4c3ce88a89cb67b420" address="unix:///run/containerd/s/f8a198d42adc875ea09f20947fab75abc0e1a65c25ff49b88ac996997c64f4cf" protocol=ttrpc version=3 Sep 13 02:27:20.115726 containerd[1607]: time="2025-09-13T02:27:20.115667041Z" level=info msg="connecting to shim e695d443826895cc9ef727e0b5db7353db7d0b23ce7e0fa4d81018be00413d2f" address="unix:///run/containerd/s/aebfe25286807fc77713950ad16c8337646a13d766ae21a78f9a4fb5c7450f8c" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:27:20.125490 systemd[1]: Started cri-containerd-4cd8571bfb598dbaf31b25f635a096129f198da45943ef4c3ce88a89cb67b420.scope - libcontainer container 4cd8571bfb598dbaf31b25f635a096129f198da45943ef4c3ce88a89cb67b420. Sep 13 02:27:20.158522 systemd[1]: Started cri-containerd-e695d443826895cc9ef727e0b5db7353db7d0b23ce7e0fa4d81018be00413d2f.scope - libcontainer container e695d443826895cc9ef727e0b5db7353db7d0b23ce7e0fa4d81018be00413d2f. 
Sep 13 02:27:20.228526 containerd[1607]: time="2025-09-13T02:27:20.228403323Z" level=info msg="StartContainer for \"4cd8571bfb598dbaf31b25f635a096129f198da45943ef4c3ce88a89cb67b420\" returns successfully" Sep 13 02:27:20.284141 containerd[1607]: time="2025-09-13T02:27:20.284089241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-22w4l,Uid:e320b53e-e70f-423d-b7a1-9ece708ac185,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e695d443826895cc9ef727e0b5db7353db7d0b23ce7e0fa4d81018be00413d2f\"" Sep 13 02:27:20.288365 containerd[1607]: time="2025-09-13T02:27:20.288260431Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 13 02:27:20.462709 kubelet[2891]: I0913 02:27:20.462608 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mfpbp" podStartSLOduration=1.462442916 podStartE2EDuration="1.462442916s" podCreationTimestamp="2025-09-13 02:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 02:27:20.462356192 +0000 UTC m=+7.229667005" watchObservedRunningTime="2025-09-13 02:27:20.462442916 +0000 UTC m=+7.229753694" Sep 13 02:27:20.827595 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1087698185.mount: Deactivated successfully. Sep 13 02:27:22.484465 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3141184719.mount: Deactivated successfully. Sep 13 02:27:23.475946 containerd[1607]: time="2025-09-13T02:27:23.475878413Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:23.477230 containerd[1607]: time="2025-09-13T02:27:23.477027586Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 13 02:27:23.478009 containerd[1607]: time="2025-09-13T02:27:23.477949396Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:23.481071 containerd[1607]: time="2025-09-13T02:27:23.481036681Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:23.482276 containerd[1607]: time="2025-09-13T02:27:23.482017780Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.193680811s" Sep 13 02:27:23.482276 containerd[1607]: time="2025-09-13T02:27:23.482055321Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 13 02:27:23.485255 containerd[1607]: time="2025-09-13T02:27:23.485151890Z" level=info msg="CreateContainer within sandbox \"e695d443826895cc9ef727e0b5db7353db7d0b23ce7e0fa4d81018be00413d2f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 13 02:27:23.499096 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount45524970.mount: Deactivated successfully. 
Sep 13 02:27:23.505574 containerd[1607]: time="2025-09-13T02:27:23.505526838Z" level=info msg="Container d666ee20ec5e05b6d072ac886c98dc9ed5c0f43d959cb52abcd377bfaef10a04: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:23.510958 containerd[1607]: time="2025-09-13T02:27:23.510915680Z" level=info msg="CreateContainer within sandbox \"e695d443826895cc9ef727e0b5db7353db7d0b23ce7e0fa4d81018be00413d2f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d666ee20ec5e05b6d072ac886c98dc9ed5c0f43d959cb52abcd377bfaef10a04\"" Sep 13 02:27:23.512396 containerd[1607]: time="2025-09-13T02:27:23.512339454Z" level=info msg="StartContainer for \"d666ee20ec5e05b6d072ac886c98dc9ed5c0f43d959cb52abcd377bfaef10a04\"" Sep 13 02:27:23.514661 containerd[1607]: time="2025-09-13T02:27:23.514551737Z" level=info msg="connecting to shim d666ee20ec5e05b6d072ac886c98dc9ed5c0f43d959cb52abcd377bfaef10a04" address="unix:///run/containerd/s/aebfe25286807fc77713950ad16c8337646a13d766ae21a78f9a4fb5c7450f8c" protocol=ttrpc version=3 Sep 13 02:27:23.551497 systemd[1]: Started cri-containerd-d666ee20ec5e05b6d072ac886c98dc9ed5c0f43d959cb52abcd377bfaef10a04.scope - libcontainer container d666ee20ec5e05b6d072ac886c98dc9ed5c0f43d959cb52abcd377bfaef10a04. Sep 13 02:27:23.610430 containerd[1607]: time="2025-09-13T02:27:23.610322103Z" level=info msg="StartContainer for \"d666ee20ec5e05b6d072ac886c98dc9ed5c0f43d959cb52abcd377bfaef10a04\" returns successfully" Sep 13 02:27:24.474323 kubelet[2891]: I0913 02:27:24.474056 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-22w4l" podStartSLOduration=2.277249168 podStartE2EDuration="5.474037194s" podCreationTimestamp="2025-09-13 02:27:19 +0000 UTC" firstStartedPulling="2025-09-13 02:27:20.286568619 +0000 UTC m=+7.053879385" lastFinishedPulling="2025-09-13 02:27:23.483356638 +0000 UTC m=+10.250667411" observedRunningTime="2025-09-13 02:27:24.473512696 +0000 UTC m=+11.240823502" watchObservedRunningTime="2025-09-13 02:27:24.474037194 +0000 UTC m=+11.241347985" Sep 13 02:27:28.593486 sudo[1905]: pam_unix(sudo:session): session closed for user root Sep 13 02:27:28.738237 sshd[1904]: Connection closed by 139.178.89.65 port 37154 Sep 13 02:27:28.746956 sshd-session[1902]: pam_unix(sshd:session): session closed for user core Sep 13 02:27:28.755146 systemd[1]: sshd@8-10.230.67.142:22-139.178.89.65:37154.service: Deactivated successfully. Sep 13 02:27:28.760037 systemd[1]: session-11.scope: Deactivated successfully. Sep 13 02:27:28.760399 systemd[1]: session-11.scope: Consumed 6.440s CPU time, 152.8M memory peak. Sep 13 02:27:28.765944 systemd-logind[1589]: Session 11 logged out. Waiting for processes to exit. Sep 13 02:27:28.768793 systemd-logind[1589]: Removed session 11. Sep 13 02:27:32.913055 systemd[1]: Created slice kubepods-besteffort-pod8f678dc9_90ca_49cf_875e_139ebcb1cb80.slice - libcontainer container kubepods-besteffort-pod8f678dc9_90ca_49cf_875e_139ebcb1cb80.slice. 
Sep 13 02:27:32.996681 kubelet[2891]: I0913 02:27:32.996595 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f678dc9-90ca-49cf-875e-139ebcb1cb80-tigera-ca-bundle\") pod \"calico-typha-57b685b94f-cfjpz\" (UID: \"8f678dc9-90ca-49cf-875e-139ebcb1cb80\") " pod="calico-system/calico-typha-57b685b94f-cfjpz" Sep 13 02:27:32.997938 kubelet[2891]: I0913 02:27:32.997558 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/8f678dc9-90ca-49cf-875e-139ebcb1cb80-typha-certs\") pod \"calico-typha-57b685b94f-cfjpz\" (UID: \"8f678dc9-90ca-49cf-875e-139ebcb1cb80\") " pod="calico-system/calico-typha-57b685b94f-cfjpz" Sep 13 02:27:32.997938 kubelet[2891]: I0913 02:27:32.997654 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szx2b\" (UniqueName: \"kubernetes.io/projected/8f678dc9-90ca-49cf-875e-139ebcb1cb80-kube-api-access-szx2b\") pod \"calico-typha-57b685b94f-cfjpz\" (UID: \"8f678dc9-90ca-49cf-875e-139ebcb1cb80\") " pod="calico-system/calico-typha-57b685b94f-cfjpz" Sep 13 02:27:33.225226 containerd[1607]: time="2025-09-13T02:27:33.224816494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57b685b94f-cfjpz,Uid:8f678dc9-90ca-49cf-875e-139ebcb1cb80,Namespace:calico-system,Attempt:0,}" Sep 13 02:27:33.261101 containerd[1607]: time="2025-09-13T02:27:33.260973618Z" level=info msg="connecting to shim b793392b4d06663902128b66556f00b8140d8f7119e4e6c05e0d063c8a422a6d" address="unix:///run/containerd/s/3708b5bfbac0f42f213617fb118fe46c274e0379e6a2673c8498f048ef535664" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:27:33.320686 systemd[1]: Started cri-containerd-b793392b4d06663902128b66556f00b8140d8f7119e4e6c05e0d063c8a422a6d.scope - libcontainer container b793392b4d06663902128b66556f00b8140d8f7119e4e6c05e0d063c8a422a6d. Sep 13 02:27:33.376890 systemd[1]: Created slice kubepods-besteffort-podbaa1ad0a_aa9a_40fb_9891_0610f0cbb498.slice - libcontainer container kubepods-besteffort-podbaa1ad0a_aa9a_40fb_9891_0610f0cbb498.slice. 
Sep 13 02:27:33.401411 kubelet[2891]: I0913 02:27:33.400444 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-xtables-lock\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401411 kubelet[2891]: I0913 02:27:33.400512 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-node-certs\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401411 kubelet[2891]: I0913 02:27:33.400543 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-cni-net-dir\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401411 kubelet[2891]: I0913 02:27:33.400569 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2qm\" (UniqueName: \"kubernetes.io/projected/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-kube-api-access-bm2qm\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401411 kubelet[2891]: I0913 02:27:33.400604 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-policysync\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401774 kubelet[2891]: I0913 02:27:33.400630 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-flexvol-driver-host\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401774 kubelet[2891]: I0913 02:27:33.400656 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-cni-log-dir\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401774 kubelet[2891]: I0913 02:27:33.400680 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-cni-bin-dir\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401774 kubelet[2891]: I0913 02:27:33.400704 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-tigera-ca-bundle\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401774 kubelet[2891]: I0913 02:27:33.400759 2891 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-lib-modules\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401988 kubelet[2891]: I0913 02:27:33.400796 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-var-lib-calico\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.401988 kubelet[2891]: I0913 02:27:33.400824 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/baa1ad0a-aa9a-40fb-9891-0610f0cbb498-var-run-calico\") pod \"calico-node-djrws\" (UID: \"baa1ad0a-aa9a-40fb-9891-0610f0cbb498\") " pod="calico-system/calico-node-djrws" Sep 13 02:27:33.509455 kubelet[2891]: E0913 02:27:33.508849 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.509455 kubelet[2891]: W0913 02:27:33.509320 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.510498 kubelet[2891]: E0913 02:27:33.510466 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.513792 kubelet[2891]: E0913 02:27:33.513441 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.513792 kubelet[2891]: W0913 02:27:33.513464 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.513792 kubelet[2891]: E0913 02:27:33.513481 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.514329 kubelet[2891]: E0913 02:27:33.514243 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.514329 kubelet[2891]: W0913 02:27:33.514277 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.514329 kubelet[2891]: E0913 02:27:33.514294 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.516392 kubelet[2891]: E0913 02:27:33.516357 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.516392 kubelet[2891]: W0913 02:27:33.516379 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.516503 kubelet[2891]: E0913 02:27:33.516395 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.522671 kubelet[2891]: E0913 02:27:33.522642 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.522965 kubelet[2891]: W0913 02:27:33.522939 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.523432 kubelet[2891]: E0913 02:27:33.523250 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.543295 kubelet[2891]: E0913 02:27:33.542355 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.543552 kubelet[2891]: W0913 02:27:33.543473 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.543552 kubelet[2891]: E0913 02:27:33.543509 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.552591 containerd[1607]: time="2025-09-13T02:27:33.552546403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57b685b94f-cfjpz,Uid:8f678dc9-90ca-49cf-875e-139ebcb1cb80,Namespace:calico-system,Attempt:0,} returns sandbox id \"b793392b4d06663902128b66556f00b8140d8f7119e4e6c05e0d063c8a422a6d\"" Sep 13 02:27:33.554815 containerd[1607]: time="2025-09-13T02:27:33.554716484Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 13 02:27:33.617516 kubelet[2891]: E0913 02:27:33.617459 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:33.683926 containerd[1607]: time="2025-09-13T02:27:33.683876564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-djrws,Uid:baa1ad0a-aa9a-40fb-9891-0610f0cbb498,Namespace:calico-system,Attempt:0,}" Sep 13 02:27:33.698253 kubelet[2891]: E0913 02:27:33.696884 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.698253 kubelet[2891]: W0913 02:27:33.696918 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.698253 kubelet[2891]: E0913 02:27:33.696945 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.700151 kubelet[2891]: E0913 02:27:33.699344 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.700151 kubelet[2891]: W0913 02:27:33.699364 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.700151 kubelet[2891]: E0913 02:27:33.699379 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.700377 kubelet[2891]: E0913 02:27:33.700183 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.700377 kubelet[2891]: W0913 02:27:33.700197 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.700377 kubelet[2891]: E0913 02:27:33.700211 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.700803 kubelet[2891]: E0913 02:27:33.700762 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.700803 kubelet[2891]: W0913 02:27:33.700780 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.700803 kubelet[2891]: E0913 02:27:33.700795 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.702794 kubelet[2891]: E0913 02:27:33.702382 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.702794 kubelet[2891]: W0913 02:27:33.702403 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.702794 kubelet[2891]: E0913 02:27:33.702418 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.702794 kubelet[2891]: E0913 02:27:33.702658 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.702794 kubelet[2891]: W0913 02:27:33.702671 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.704491 kubelet[2891]: E0913 02:27:33.703755 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.704491 kubelet[2891]: E0913 02:27:33.704013 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.704491 kubelet[2891]: W0913 02:27:33.704027 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.704491 kubelet[2891]: E0913 02:27:33.704158 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.705373 kubelet[2891]: E0913 02:27:33.704839 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.705373 kubelet[2891]: W0913 02:27:33.704870 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.705373 kubelet[2891]: E0913 02:27:33.704890 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.706422 kubelet[2891]: E0913 02:27:33.706396 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.706422 kubelet[2891]: W0913 02:27:33.706416 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.706955 kubelet[2891]: E0913 02:27:33.706431 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.706955 kubelet[2891]: E0913 02:27:33.706657 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.706955 kubelet[2891]: W0913 02:27:33.706670 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.706955 kubelet[2891]: E0913 02:27:33.706684 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.707193 kubelet[2891]: E0913 02:27:33.706955 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.707193 kubelet[2891]: W0913 02:27:33.706968 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.707375 kubelet[2891]: E0913 02:27:33.707329 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.707626 kubelet[2891]: E0913 02:27:33.707606 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.707626 kubelet[2891]: W0913 02:27:33.707624 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.707896 kubelet[2891]: E0913 02:27:33.707639 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.709475 kubelet[2891]: E0913 02:27:33.709440 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.709475 kubelet[2891]: W0913 02:27:33.709460 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.709475 kubelet[2891]: E0913 02:27:33.709476 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.709974 kubelet[2891]: E0913 02:27:33.709752 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.709974 kubelet[2891]: W0913 02:27:33.709766 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.709974 kubelet[2891]: E0913 02:27:33.709780 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.710447 kubelet[2891]: E0913 02:27:33.710360 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.710447 kubelet[2891]: W0913 02:27:33.710380 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.710447 kubelet[2891]: E0913 02:27:33.710396 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.710890 kubelet[2891]: E0913 02:27:33.710721 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.710890 kubelet[2891]: W0913 02:27:33.710749 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.710890 kubelet[2891]: E0913 02:27:33.710766 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.712071 kubelet[2891]: E0913 02:27:33.712038 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.712071 kubelet[2891]: W0913 02:27:33.712059 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.712199 kubelet[2891]: E0913 02:27:33.712074 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.713552 kubelet[2891]: E0913 02:27:33.713496 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.713552 kubelet[2891]: W0913 02:27:33.713528 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.713552 kubelet[2891]: E0913 02:27:33.713547 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.714045 kubelet[2891]: E0913 02:27:33.713890 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.714045 kubelet[2891]: W0913 02:27:33.713913 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.714045 kubelet[2891]: E0913 02:27:33.713933 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.716842 kubelet[2891]: E0913 02:27:33.715352 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.716842 kubelet[2891]: W0913 02:27:33.715392 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.716842 kubelet[2891]: E0913 02:27:33.715421 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.716842 kubelet[2891]: E0913 02:27:33.716013 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.716842 kubelet[2891]: W0913 02:27:33.716027 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.716842 kubelet[2891]: E0913 02:27:33.716041 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.716842 kubelet[2891]: I0913 02:27:33.716069 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6c998d1d-8056-4958-bc44-d35b2d7c5300-registration-dir\") pod \"csi-node-driver-f4w47\" (UID: \"6c998d1d-8056-4958-bc44-d35b2d7c5300\") " pod="calico-system/csi-node-driver-f4w47" Sep 13 02:27:33.717441 kubelet[2891]: E0913 02:27:33.717416 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.717441 kubelet[2891]: W0913 02:27:33.717441 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.717684 kubelet[2891]: E0913 02:27:33.717487 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.717684 kubelet[2891]: I0913 02:27:33.717512 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6c998d1d-8056-4958-bc44-d35b2d7c5300-socket-dir\") pod \"csi-node-driver-f4w47\" (UID: \"6c998d1d-8056-4958-bc44-d35b2d7c5300\") " pod="calico-system/csi-node-driver-f4w47" Sep 13 02:27:33.717926 kubelet[2891]: E0913 02:27:33.717819 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.718347 kubelet[2891]: W0913 02:27:33.718298 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.718347 kubelet[2891]: E0913 02:27:33.718335 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.719551 kubelet[2891]: E0913 02:27:33.719524 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.719551 kubelet[2891]: W0913 02:27:33.719546 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.719667 kubelet[2891]: E0913 02:27:33.719568 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.721551 kubelet[2891]: E0913 02:27:33.721368 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.721551 kubelet[2891]: W0913 02:27:33.721388 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.721551 kubelet[2891]: E0913 02:27:33.721424 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.721551 kubelet[2891]: I0913 02:27:33.721452 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmqbz\" (UniqueName: \"kubernetes.io/projected/6c998d1d-8056-4958-bc44-d35b2d7c5300-kube-api-access-xmqbz\") pod \"csi-node-driver-f4w47\" (UID: \"6c998d1d-8056-4958-bc44-d35b2d7c5300\") " pod="calico-system/csi-node-driver-f4w47" Sep 13 02:27:33.722954 kubelet[2891]: E0913 02:27:33.722833 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.722954 kubelet[2891]: W0913 02:27:33.722848 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.722954 kubelet[2891]: E0913 02:27:33.722886 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.723149 kubelet[2891]: E0913 02:27:33.723126 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.723149 kubelet[2891]: W0913 02:27:33.723147 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.723256 kubelet[2891]: E0913 02:27:33.723162 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.724486 kubelet[2891]: E0913 02:27:33.724465 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.724486 kubelet[2891]: W0913 02:27:33.724484 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.724612 kubelet[2891]: E0913 02:27:33.724529 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.724612 kubelet[2891]: I0913 02:27:33.724552 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c998d1d-8056-4958-bc44-d35b2d7c5300-kubelet-dir\") pod \"csi-node-driver-f4w47\" (UID: \"6c998d1d-8056-4958-bc44-d35b2d7c5300\") " pod="calico-system/csi-node-driver-f4w47" Sep 13 02:27:33.725543 kubelet[2891]: E0913 02:27:33.725518 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.725543 kubelet[2891]: W0913 02:27:33.725539 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.725803 kubelet[2891]: E0913 02:27:33.725696 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.725803 kubelet[2891]: I0913 02:27:33.725735 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6c998d1d-8056-4958-bc44-d35b2d7c5300-varrun\") pod \"csi-node-driver-f4w47\" (UID: \"6c998d1d-8056-4958-bc44-d35b2d7c5300\") " pod="calico-system/csi-node-driver-f4w47" Sep 13 02:27:33.725803 kubelet[2891]: E0913 02:27:33.725791 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.725803 kubelet[2891]: W0913 02:27:33.725804 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.726363 kubelet[2891]: E0913 02:27:33.726324 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.727461 kubelet[2891]: E0913 02:27:33.727430 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.727461 kubelet[2891]: W0913 02:27:33.727450 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.727581 kubelet[2891]: E0913 02:27:33.727472 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.727711 kubelet[2891]: E0913 02:27:33.727693 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.727711 kubelet[2891]: W0913 02:27:33.727710 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.728725 kubelet[2891]: E0913 02:27:33.728699 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.728998 kubelet[2891]: E0913 02:27:33.728969 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.728998 kubelet[2891]: W0913 02:27:33.728989 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.729086 kubelet[2891]: E0913 02:27:33.729004 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.730033 kubelet[2891]: E0913 02:27:33.730009 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.730120 kubelet[2891]: W0913 02:27:33.730028 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.730120 kubelet[2891]: E0913 02:27:33.730050 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.733294 kubelet[2891]: E0913 02:27:33.732703 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.733294 kubelet[2891]: W0913 02:27:33.732742 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.733294 kubelet[2891]: E0913 02:27:33.732784 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.742991 containerd[1607]: time="2025-09-13T02:27:33.742938260Z" level=info msg="connecting to shim ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e" address="unix:///run/containerd/s/5e44cd1bbd754495f10fcea0531c2648549263cde425a6dfa348be0ba75b8eb1" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:27:33.798065 systemd[1]: Started cri-containerd-ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e.scope - libcontainer container ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e. Sep 13 02:27:33.826929 kubelet[2891]: E0913 02:27:33.826891 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.827195 kubelet[2891]: W0913 02:27:33.826938 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.827195 kubelet[2891]: E0913 02:27:33.826964 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.827498 kubelet[2891]: E0913 02:27:33.827334 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.828438 kubelet[2891]: W0913 02:27:33.828344 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.828641 kubelet[2891]: E0913 02:27:33.828617 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.828733 kubelet[2891]: W0913 02:27:33.828655 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.828733 kubelet[2891]: E0913 02:27:33.828672 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.828966 kubelet[2891]: E0913 02:27:33.828388 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.829029 kubelet[2891]: E0913 02:27:33.829006 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.829029 kubelet[2891]: W0913 02:27:33.829020 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.829365 kubelet[2891]: E0913 02:27:33.829047 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.829450 kubelet[2891]: E0913 02:27:33.829316 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.829503 kubelet[2891]: W0913 02:27:33.829449 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.829503 kubelet[2891]: E0913 02:27:33.829467 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.829962 kubelet[2891]: E0913 02:27:33.829708 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.829962 kubelet[2891]: W0913 02:27:33.829728 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.830073 kubelet[2891]: E0913 02:27:33.829997 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.830073 kubelet[2891]: W0913 02:27:33.830010 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.830073 kubelet[2891]: E0913 02:27:33.830024 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.830463 kubelet[2891]: E0913 02:27:33.829755 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.832246 kubelet[2891]: E0913 02:27:33.832222 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.832246 kubelet[2891]: W0913 02:27:33.832243 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.832382 kubelet[2891]: E0913 02:27:33.832303 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.832815 kubelet[2891]: E0913 02:27:33.832723 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.832815 kubelet[2891]: W0913 02:27:33.832745 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.833245 kubelet[2891]: E0913 02:27:33.833205 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.833737 kubelet[2891]: E0913 02:27:33.833546 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.833737 kubelet[2891]: W0913 02:27:33.833560 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.835103 kubelet[2891]: E0913 02:27:33.835074 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.835676 kubelet[2891]: E0913 02:27:33.835655 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.835759 kubelet[2891]: W0913 02:27:33.835726 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.835918 kubelet[2891]: E0913 02:27:33.835884 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.836447 kubelet[2891]: E0913 02:27:33.836421 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.836447 kubelet[2891]: W0913 02:27:33.836441 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.837631 kubelet[2891]: E0913 02:27:33.837600 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.837785 kubelet[2891]: E0913 02:27:33.837761 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.837785 kubelet[2891]: W0913 02:27:33.837782 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.837913 kubelet[2891]: E0913 02:27:33.837846 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.838403 kubelet[2891]: E0913 02:27:33.838381 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.838403 kubelet[2891]: W0913 02:27:33.838400 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.838665 kubelet[2891]: E0913 02:27:33.838471 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.839171 kubelet[2891]: E0913 02:27:33.839122 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.839171 kubelet[2891]: W0913 02:27:33.839142 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.840291 kubelet[2891]: E0913 02:27:33.839171 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.840370 kubelet[2891]: E0913 02:27:33.840344 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.840370 kubelet[2891]: W0913 02:27:33.840358 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.840443 kubelet[2891]: E0913 02:27:33.840374 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.840646 kubelet[2891]: E0913 02:27:33.840622 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.840646 kubelet[2891]: W0913 02:27:33.840641 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.844432 kubelet[2891]: E0913 02:27:33.840670 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.844562 kubelet[2891]: E0913 02:27:33.844507 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.844562 kubelet[2891]: W0913 02:27:33.844522 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.844646 kubelet[2891]: E0913 02:27:33.844565 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.846171 kubelet[2891]: E0913 02:27:33.846143 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.846171 kubelet[2891]: W0913 02:27:33.846165 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.846539 kubelet[2891]: E0913 02:27:33.846491 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.848131 kubelet[2891]: E0913 02:27:33.848102 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.848131 kubelet[2891]: W0913 02:27:33.848122 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.848340 kubelet[2891]: E0913 02:27:33.848313 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.849649 kubelet[2891]: E0913 02:27:33.849611 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.849649 kubelet[2891]: W0913 02:27:33.849632 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.849936 kubelet[2891]: E0913 02:27:33.849909 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.849936 kubelet[2891]: W0913 02:27:33.849929 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.850056 kubelet[2891]: E0913 02:27:33.849944 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.850403 kubelet[2891]: E0913 02:27:33.850376 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.855918 kubelet[2891]: E0913 02:27:33.855829 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.855918 kubelet[2891]: W0913 02:27:33.855858 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.855918 kubelet[2891]: E0913 02:27:33.855880 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.858191 kubelet[2891]: E0913 02:27:33.858154 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.858191 kubelet[2891]: W0913 02:27:33.858187 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.858370 kubelet[2891]: E0913 02:27:33.858220 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:33.861058 kubelet[2891]: E0913 02:27:33.861030 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.861058 kubelet[2891]: W0913 02:27:33.861051 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.861193 kubelet[2891]: E0913 02:27:33.861067 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.883863 kubelet[2891]: E0913 02:27:33.883765 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:33.884434 kubelet[2891]: W0913 02:27:33.884238 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:33.884434 kubelet[2891]: E0913 02:27:33.884306 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:33.983580 containerd[1607]: time="2025-09-13T02:27:33.983434588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-djrws,Uid:baa1ad0a-aa9a-40fb-9891-0610f0cbb498,Namespace:calico-system,Attempt:0,} returns sandbox id \"ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e\"" Sep 13 02:27:35.214805 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2357124755.mount: Deactivated successfully. 
Sep 13 02:27:35.373868 kubelet[2891]: E0913 02:27:35.373785 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:37.374323 kubelet[2891]: E0913 02:27:37.373876 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:38.555853 containerd[1607]: time="2025-09-13T02:27:38.554750873Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:38.555853 containerd[1607]: time="2025-09-13T02:27:38.555808315Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 13 02:27:38.556850 containerd[1607]: time="2025-09-13T02:27:38.556539301Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:38.558557 containerd[1607]: time="2025-09-13T02:27:38.558519409Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:38.559512 containerd[1607]: time="2025-09-13T02:27:38.559472937Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 5.004316556s" Sep 13 02:27:38.559825 containerd[1607]: time="2025-09-13T02:27:38.559619820Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 13 02:27:38.562124 containerd[1607]: time="2025-09-13T02:27:38.562085921Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 13 02:27:38.586545 containerd[1607]: time="2025-09-13T02:27:38.586496871Z" level=info msg="CreateContainer within sandbox \"b793392b4d06663902128b66556f00b8140d8f7119e4e6c05e0d063c8a422a6d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 13 02:27:38.596531 containerd[1607]: time="2025-09-13T02:27:38.596489650Z" level=info msg="Container b893994c2f2a49551494b8d1546f779f27e03003c18d79ff845904a5052d56ca: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:38.613532 containerd[1607]: time="2025-09-13T02:27:38.613480878Z" level=info msg="CreateContainer within sandbox \"b793392b4d06663902128b66556f00b8140d8f7119e4e6c05e0d063c8a422a6d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b893994c2f2a49551494b8d1546f779f27e03003c18d79ff845904a5052d56ca\"" Sep 13 02:27:38.615283 containerd[1607]: time="2025-09-13T02:27:38.614404571Z" level=info msg="StartContainer for 
\"b893994c2f2a49551494b8d1546f779f27e03003c18d79ff845904a5052d56ca\"" Sep 13 02:27:38.618067 containerd[1607]: time="2025-09-13T02:27:38.617962332Z" level=info msg="connecting to shim b893994c2f2a49551494b8d1546f779f27e03003c18d79ff845904a5052d56ca" address="unix:///run/containerd/s/3708b5bfbac0f42f213617fb118fe46c274e0379e6a2673c8498f048ef535664" protocol=ttrpc version=3 Sep 13 02:27:38.654502 systemd[1]: Started cri-containerd-b893994c2f2a49551494b8d1546f779f27e03003c18d79ff845904a5052d56ca.scope - libcontainer container b893994c2f2a49551494b8d1546f779f27e03003c18d79ff845904a5052d56ca. Sep 13 02:27:38.729041 containerd[1607]: time="2025-09-13T02:27:38.728984494Z" level=info msg="StartContainer for \"b893994c2f2a49551494b8d1546f779f27e03003c18d79ff845904a5052d56ca\" returns successfully" Sep 13 02:27:39.375623 kubelet[2891]: E0913 02:27:39.375355 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:39.551001 kubelet[2891]: I0913 02:27:39.549951 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-57b685b94f-cfjpz" podStartSLOduration=2.543245376 podStartE2EDuration="7.549924033s" podCreationTimestamp="2025-09-13 02:27:32 +0000 UTC" firstStartedPulling="2025-09-13 02:27:33.554403107 +0000 UTC m=+20.321713873" lastFinishedPulling="2025-09-13 02:27:38.561081746 +0000 UTC m=+25.328392530" observedRunningTime="2025-09-13 02:27:39.549097814 +0000 UTC m=+26.316408630" watchObservedRunningTime="2025-09-13 02:27:39.549924033 +0000 UTC m=+26.317234810" Sep 13 02:27:39.557077 kubelet[2891]: E0913 02:27:39.556967 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.557077 kubelet[2891]: W0913 02:27:39.557001 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.557077 kubelet[2891]: E0913 02:27:39.557039 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.557685 kubelet[2891]: E0913 02:27:39.557421 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.557685 kubelet[2891]: W0913 02:27:39.557436 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.557685 kubelet[2891]: E0913 02:27:39.557459 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:39.557864 kubelet[2891]: E0913 02:27:39.557710 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.557864 kubelet[2891]: W0913 02:27:39.557724 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.557864 kubelet[2891]: E0913 02:27:39.557738 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.558389 kubelet[2891]: E0913 02:27:39.557997 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.558389 kubelet[2891]: W0913 02:27:39.558010 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.558389 kubelet[2891]: E0913 02:27:39.558024 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.558389 kubelet[2891]: E0913 02:27:39.558267 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.558389 kubelet[2891]: W0913 02:27:39.558280 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.558389 kubelet[2891]: E0913 02:27:39.558292 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.559029 kubelet[2891]: E0913 02:27:39.558511 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.559029 kubelet[2891]: W0913 02:27:39.558523 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.559029 kubelet[2891]: E0913 02:27:39.558536 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.559029 kubelet[2891]: E0913 02:27:39.558784 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.559029 kubelet[2891]: W0913 02:27:39.558797 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.559029 kubelet[2891]: E0913 02:27:39.558810 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:39.559029 kubelet[2891]: E0913 02:27:39.559011 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.559029 kubelet[2891]: W0913 02:27:39.559024 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.560051 kubelet[2891]: E0913 02:27:39.559037 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.560051 kubelet[2891]: E0913 02:27:39.559249 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.560051 kubelet[2891]: W0913 02:27:39.559261 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.560051 kubelet[2891]: E0913 02:27:39.559299 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.560051 kubelet[2891]: E0913 02:27:39.559534 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.560051 kubelet[2891]: W0913 02:27:39.559546 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.560051 kubelet[2891]: E0913 02:27:39.559560 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.560051 kubelet[2891]: E0913 02:27:39.559818 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.560051 kubelet[2891]: W0913 02:27:39.559859 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.560051 kubelet[2891]: E0913 02:27:39.559873 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.561678 kubelet[2891]: E0913 02:27:39.560214 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.561678 kubelet[2891]: W0913 02:27:39.560227 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.561678 kubelet[2891]: E0913 02:27:39.560298 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:39.561678 kubelet[2891]: E0913 02:27:39.560565 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.561678 kubelet[2891]: W0913 02:27:39.560578 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.561678 kubelet[2891]: E0913 02:27:39.560618 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.561678 kubelet[2891]: E0913 02:27:39.560925 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.561678 kubelet[2891]: W0913 02:27:39.560964 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.561678 kubelet[2891]: E0913 02:27:39.560980 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.561678 kubelet[2891]: E0913 02:27:39.561262 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.562330 kubelet[2891]: W0913 02:27:39.561310 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.562330 kubelet[2891]: E0913 02:27:39.561325 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.587312 kubelet[2891]: E0913 02:27:39.587140 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.587312 kubelet[2891]: W0913 02:27:39.587179 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.587312 kubelet[2891]: E0913 02:27:39.587212 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.588203 kubelet[2891]: E0913 02:27:39.588171 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.588203 kubelet[2891]: W0913 02:27:39.588194 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.588345 kubelet[2891]: E0913 02:27:39.588219 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:39.589345 kubelet[2891]: E0913 02:27:39.589317 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.589345 kubelet[2891]: W0913 02:27:39.589337 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.589491 kubelet[2891]: E0913 02:27:39.589371 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.589909 kubelet[2891]: E0913 02:27:39.589886 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.589909 kubelet[2891]: W0913 02:27:39.589905 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.590487 kubelet[2891]: E0913 02:27:39.590099 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.590680 kubelet[2891]: E0913 02:27:39.590659 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.590680 kubelet[2891]: W0913 02:27:39.590677 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.591346 kubelet[2891]: E0913 02:27:39.590700 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.591581 kubelet[2891]: E0913 02:27:39.591556 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.591581 kubelet[2891]: W0913 02:27:39.591576 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.591709 kubelet[2891]: E0913 02:27:39.591599 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.592482 kubelet[2891]: E0913 02:27:39.592452 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.592482 kubelet[2891]: W0913 02:27:39.592482 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.592702 kubelet[2891]: E0913 02:27:39.592560 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:39.593067 kubelet[2891]: E0913 02:27:39.593042 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.593067 kubelet[2891]: W0913 02:27:39.593062 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.593566 kubelet[2891]: E0913 02:27:39.593310 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.593829 kubelet[2891]: E0913 02:27:39.593797 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.593829 kubelet[2891]: W0913 02:27:39.593816 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.594476 kubelet[2891]: E0913 02:27:39.594441 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.594476 kubelet[2891]: W0913 02:27:39.594461 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.595199 kubelet[2891]: E0913 02:27:39.594477 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.595199 kubelet[2891]: E0913 02:27:39.594582 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.595199 kubelet[2891]: E0913 02:27:39.595152 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.595199 kubelet[2891]: W0913 02:27:39.595167 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.595877 kubelet[2891]: E0913 02:27:39.595407 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.596220 kubelet[2891]: E0913 02:27:39.596189 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.596220 kubelet[2891]: W0913 02:27:39.596208 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.596430 kubelet[2891]: E0913 02:27:39.596402 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:39.597001 kubelet[2891]: E0913 02:27:39.596976 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.597001 kubelet[2891]: W0913 02:27:39.596996 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.597129 kubelet[2891]: E0913 02:27:39.597035 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.597974 kubelet[2891]: E0913 02:27:39.597948 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.597974 kubelet[2891]: W0913 02:27:39.597968 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.598099 kubelet[2891]: E0913 02:27:39.597990 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.598484 kubelet[2891]: E0913 02:27:39.598453 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.598484 kubelet[2891]: W0913 02:27:39.598472 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.598718 kubelet[2891]: E0913 02:27:39.598687 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.599196 kubelet[2891]: E0913 02:27:39.599176 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.599196 kubelet[2891]: W0913 02:27:39.599194 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.599482 kubelet[2891]: E0913 02:27:39.599451 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:39.602458 kubelet[2891]: E0913 02:27:39.602430 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.602458 kubelet[2891]: W0913 02:27:39.602451 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.602598 kubelet[2891]: E0913 02:27:39.602473 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 02:27:39.603572 kubelet[2891]: E0913 02:27:39.603538 2891 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 02:27:39.603572 kubelet[2891]: W0913 02:27:39.603564 2891 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 02:27:39.603684 kubelet[2891]: E0913 02:27:39.603580 2891 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 02:27:40.194308 containerd[1607]: time="2025-09-13T02:27:40.193461463Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:40.196312 containerd[1607]: time="2025-09-13T02:27:40.194337744Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 13 02:27:40.196312 containerd[1607]: time="2025-09-13T02:27:40.195194798Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:40.197705 containerd[1607]: time="2025-09-13T02:27:40.197579329Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:40.199071 containerd[1607]: time="2025-09-13T02:27:40.198487042Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.636347874s" Sep 13 02:27:40.199071 containerd[1607]: time="2025-09-13T02:27:40.198530049Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 13 02:27:40.201564 containerd[1607]: time="2025-09-13T02:27:40.201517583Z" level=info msg="CreateContainer within sandbox \"ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 13 02:27:40.222457 containerd[1607]: time="2025-09-13T02:27:40.222421259Z" level=info msg="Container dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:40.230624 containerd[1607]: time="2025-09-13T02:27:40.230509625Z" level=info msg="CreateContainer within sandbox \"ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa\"" Sep 13 02:27:40.231493 containerd[1607]: time="2025-09-13T02:27:40.231459754Z" level=info msg="StartContainer for \"dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa\"" Sep 13 02:27:40.233566 containerd[1607]: time="2025-09-13T02:27:40.233534018Z" level=info msg="connecting to 
shim dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa" address="unix:///run/containerd/s/5e44cd1bbd754495f10fcea0531c2648549263cde425a6dfa348be0ba75b8eb1" protocol=ttrpc version=3 Sep 13 02:27:40.276459 systemd[1]: Started cri-containerd-dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa.scope - libcontainer container dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa. Sep 13 02:27:40.343519 containerd[1607]: time="2025-09-13T02:27:40.343387608Z" level=info msg="StartContainer for \"dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa\" returns successfully" Sep 13 02:27:40.363091 systemd[1]: cri-containerd-dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa.scope: Deactivated successfully. Sep 13 02:27:40.455148 containerd[1607]: time="2025-09-13T02:27:40.454856903Z" level=info msg="received exit event container_id:\"dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa\" id:\"dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa\" pid:3566 exited_at:{seconds:1757730460 nanos:369480818}" Sep 13 02:27:40.456300 containerd[1607]: time="2025-09-13T02:27:40.456200771Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa\" id:\"dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa\" pid:3566 exited_at:{seconds:1757730460 nanos:369480818}" Sep 13 02:27:40.491989 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dfdb551299f4c009c3306415a9c3d9b9479ebf03c70b9c4c4b5a30a8ed5accfa-rootfs.mount: Deactivated successfully. Sep 13 02:27:41.374332 kubelet[2891]: E0913 02:27:41.374153 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:41.547434 containerd[1607]: time="2025-09-13T02:27:41.546926829Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 13 02:27:43.386298 kubelet[2891]: E0913 02:27:43.383493 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:45.374469 kubelet[2891]: E0913 02:27:45.374308 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:47.373820 kubelet[2891]: E0913 02:27:47.373376 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:48.014040 containerd[1607]: time="2025-09-13T02:27:48.013932984Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:48.015584 
containerd[1607]: time="2025-09-13T02:27:48.015537127Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 13 02:27:48.016328 containerd[1607]: time="2025-09-13T02:27:48.016287087Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:48.019302 containerd[1607]: time="2025-09-13T02:27:48.019144496Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:48.020191 containerd[1607]: time="2025-09-13T02:27:48.020048742Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 6.472964812s" Sep 13 02:27:48.020191 containerd[1607]: time="2025-09-13T02:27:48.020086390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 13 02:27:48.033073 containerd[1607]: time="2025-09-13T02:27:48.031939701Z" level=info msg="CreateContainer within sandbox \"ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 13 02:27:48.045240 containerd[1607]: time="2025-09-13T02:27:48.045193322Z" level=info msg="Container f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:48.060685 containerd[1607]: time="2025-09-13T02:27:48.060612536Z" level=info msg="CreateContainer within sandbox \"ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11\"" Sep 13 02:27:48.067737 containerd[1607]: time="2025-09-13T02:27:48.067700533Z" level=info msg="StartContainer for \"f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11\"" Sep 13 02:27:48.073987 containerd[1607]: time="2025-09-13T02:27:48.073934159Z" level=info msg="connecting to shim f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11" address="unix:///run/containerd/s/5e44cd1bbd754495f10fcea0531c2648549263cde425a6dfa348be0ba75b8eb1" protocol=ttrpc version=3 Sep 13 02:27:48.111902 systemd[1]: Started cri-containerd-f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11.scope - libcontainer container f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11. Sep 13 02:27:48.195505 containerd[1607]: time="2025-09-13T02:27:48.195365774Z" level=info msg="StartContainer for \"f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11\" returns successfully" Sep 13 02:27:49.132831 systemd[1]: cri-containerd-f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11.scope: Deactivated successfully. Sep 13 02:27:49.133670 systemd[1]: cri-containerd-f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11.scope: Consumed 667ms CPU time, 162.5M memory peak, 5.8M read from disk, 171.3M written to disk. 
Sep 13 02:27:49.213569 kubelet[2891]: I0913 02:27:49.213520 2891 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 13 02:27:49.224111 containerd[1607]: time="2025-09-13T02:27:49.224045617Z" level=info msg="received exit event container_id:\"f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11\" id:\"f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11\" pid:3628 exited_at:{seconds:1757730469 nanos:223801150}" Sep 13 02:27:49.225339 containerd[1607]: time="2025-09-13T02:27:49.224839332Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11\" id:\"f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11\" pid:3628 exited_at:{seconds:1757730469 nanos:223801150}" Sep 13 02:27:49.321091 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f8c0390aeaa849692ad5fc27bf8590562238160f17b060763cdd3c1b88166a11-rootfs.mount: Deactivated successfully. Sep 13 02:27:49.326694 systemd[1]: Created slice kubepods-burstable-pod8e4d431c_d01f_4945_ba50_358a55b9fce8.slice - libcontainer container kubepods-burstable-pod8e4d431c_d01f_4945_ba50_358a55b9fce8.slice. Sep 13 02:27:49.356214 systemd[1]: Created slice kubepods-besteffort-podea298ee1_7ac7_4d0b_b9d1_ba1d4e4a4647.slice - libcontainer container kubepods-besteffort-podea298ee1_7ac7_4d0b_b9d1_ba1d4e4a4647.slice. Sep 13 02:27:49.369930 systemd[1]: Created slice kubepods-besteffort-pod67566ce3_7b09_4cb1_952c_0ee8a7201943.slice - libcontainer container kubepods-besteffort-pod67566ce3_7b09_4cb1_952c_0ee8a7201943.slice. Sep 13 02:27:49.376174 kubelet[2891]: I0913 02:27:49.376126 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e4d431c-d01f-4945-ba50-358a55b9fce8-config-volume\") pod \"coredns-7c65d6cfc9-d994l\" (UID: \"8e4d431c-d01f-4945-ba50-358a55b9fce8\") " pod="kube-system/coredns-7c65d6cfc9-d994l" Sep 13 02:27:49.376624 kubelet[2891]: I0913 02:27:49.376427 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdhln\" (UniqueName: \"kubernetes.io/projected/ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647-kube-api-access-pdhln\") pod \"calico-kube-controllers-7b6cb57bfd-zmfgm\" (UID: \"ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647\") " pod="calico-system/calico-kube-controllers-7b6cb57bfd-zmfgm" Sep 13 02:27:49.376986 kubelet[2891]: I0913 02:27:49.376727 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/67566ce3-7b09-4cb1-952c-0ee8a7201943-calico-apiserver-certs\") pod \"calico-apiserver-68dfd7d7bf-8jk84\" (UID: \"67566ce3-7b09-4cb1-952c-0ee8a7201943\") " pod="calico-apiserver/calico-apiserver-68dfd7d7bf-8jk84" Sep 13 02:27:49.376986 kubelet[2891]: I0913 02:27:49.376915 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647-tigera-ca-bundle\") pod \"calico-kube-controllers-7b6cb57bfd-zmfgm\" (UID: \"ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647\") " pod="calico-system/calico-kube-controllers-7b6cb57bfd-zmfgm" Sep 13 02:27:49.377354 kubelet[2891]: I0913 02:27:49.376954 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7xn\" (UniqueName: 
\"kubernetes.io/projected/67566ce3-7b09-4cb1-952c-0ee8a7201943-kube-api-access-gn7xn\") pod \"calico-apiserver-68dfd7d7bf-8jk84\" (UID: \"67566ce3-7b09-4cb1-952c-0ee8a7201943\") " pod="calico-apiserver/calico-apiserver-68dfd7d7bf-8jk84" Sep 13 02:27:49.377354 kubelet[2891]: I0913 02:27:49.377120 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v568\" (UniqueName: \"kubernetes.io/projected/cea724f8-4020-4ab6-998d-91057a769d18-kube-api-access-7v568\") pod \"coredns-7c65d6cfc9-4ljgx\" (UID: \"cea724f8-4020-4ab6-998d-91057a769d18\") " pod="kube-system/coredns-7c65d6cfc9-4ljgx" Sep 13 02:27:49.378038 kubelet[2891]: I0913 02:27:49.377416 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6jtq\" (UniqueName: \"kubernetes.io/projected/8e4d431c-d01f-4945-ba50-358a55b9fce8-kube-api-access-v6jtq\") pod \"coredns-7c65d6cfc9-d994l\" (UID: \"8e4d431c-d01f-4945-ba50-358a55b9fce8\") " pod="kube-system/coredns-7c65d6cfc9-d994l" Sep 13 02:27:49.378038 kubelet[2891]: I0913 02:27:49.377464 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cea724f8-4020-4ab6-998d-91057a769d18-config-volume\") pod \"coredns-7c65d6cfc9-4ljgx\" (UID: \"cea724f8-4020-4ab6-998d-91057a769d18\") " pod="kube-system/coredns-7c65d6cfc9-4ljgx" Sep 13 02:27:49.379383 kubelet[2891]: W0913 02:27:49.379205 2891 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:srv-1di1n.gb1.brightbox.com" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'srv-1di1n.gb1.brightbox.com' and this object Sep 13 02:27:49.380215 kubelet[2891]: E0913 02:27:49.380171 2891 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:srv-1di1n.gb1.brightbox.com\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'srv-1di1n.gb1.brightbox.com' and this object" logger="UnhandledError" Sep 13 02:27:49.381311 kubelet[2891]: W0913 02:27:49.380261 2891 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-1di1n.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'srv-1di1n.gb1.brightbox.com' and this object Sep 13 02:27:49.381394 kubelet[2891]: E0913 02:27:49.381322 2891 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:srv-1di1n.gb1.brightbox.com\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'srv-1di1n.gb1.brightbox.com' and this object" logger="UnhandledError" Sep 13 02:27:49.395857 systemd[1]: Created slice kubepods-burstable-podcea724f8_4020_4ab6_998d_91057a769d18.slice - libcontainer container kubepods-burstable-podcea724f8_4020_4ab6_998d_91057a769d18.slice. 
Sep 13 02:27:49.410634 systemd[1]: Created slice kubepods-besteffort-pod6c998d1d_8056_4958_bc44_d35b2d7c5300.slice - libcontainer container kubepods-besteffort-pod6c998d1d_8056_4958_bc44_d35b2d7c5300.slice. Sep 13 02:27:49.432601 systemd[1]: Created slice kubepods-besteffort-pod8fd46d61_8757_4454_9256_75911979d5bd.slice - libcontainer container kubepods-besteffort-pod8fd46d61_8757_4454_9256_75911979d5bd.slice. Sep 13 02:27:49.447509 containerd[1607]: time="2025-09-13T02:27:49.447455900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-f4w47,Uid:6c998d1d-8056-4958-bc44-d35b2d7c5300,Namespace:calico-system,Attempt:0,}" Sep 13 02:27:49.447877 systemd[1]: Created slice kubepods-besteffort-podbeafbeb8_f50a_4f44_a6b9_f78f0837f465.slice - libcontainer container kubepods-besteffort-podbeafbeb8_f50a_4f44_a6b9_f78f0837f465.slice. Sep 13 02:27:49.465341 systemd[1]: Created slice kubepods-besteffort-pod3a8e3276_5493_4f4c_985f_4d4f3efec5dd.slice - libcontainer container kubepods-besteffort-pod3a8e3276_5493_4f4c_985f_4d4f3efec5dd.slice. Sep 13 02:27:49.480458 kubelet[2891]: I0913 02:27:49.478537 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fd46d61-8757-4454-9256-75911979d5bd-whisker-ca-bundle\") pod \"whisker-648f7c5fcc-fptpz\" (UID: \"8fd46d61-8757-4454-9256-75911979d5bd\") " pod="calico-system/whisker-648f7c5fcc-fptpz" Sep 13 02:27:49.480458 kubelet[2891]: I0913 02:27:49.478622 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/3a8e3276-5493-4f4c-985f-4d4f3efec5dd-goldmane-key-pair\") pod \"goldmane-7988f88666-dzch9\" (UID: \"3a8e3276-5493-4f4c-985f-4d4f3efec5dd\") " pod="calico-system/goldmane-7988f88666-dzch9" Sep 13 02:27:49.480458 kubelet[2891]: I0913 02:27:49.478672 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8fd46d61-8757-4454-9256-75911979d5bd-whisker-backend-key-pair\") pod \"whisker-648f7c5fcc-fptpz\" (UID: \"8fd46d61-8757-4454-9256-75911979d5bd\") " pod="calico-system/whisker-648f7c5fcc-fptpz" Sep 13 02:27:49.480458 kubelet[2891]: I0913 02:27:49.478697 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8lgb\" (UniqueName: \"kubernetes.io/projected/8fd46d61-8757-4454-9256-75911979d5bd-kube-api-access-v8lgb\") pod \"whisker-648f7c5fcc-fptpz\" (UID: \"8fd46d61-8757-4454-9256-75911979d5bd\") " pod="calico-system/whisker-648f7c5fcc-fptpz" Sep 13 02:27:49.480458 kubelet[2891]: I0913 02:27:49.478740 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a8e3276-5493-4f4c-985f-4d4f3efec5dd-goldmane-ca-bundle\") pod \"goldmane-7988f88666-dzch9\" (UID: \"3a8e3276-5493-4f4c-985f-4d4f3efec5dd\") " pod="calico-system/goldmane-7988f88666-dzch9" Sep 13 02:27:49.480798 kubelet[2891]: I0913 02:27:49.478807 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jw7m\" (UniqueName: \"kubernetes.io/projected/3a8e3276-5493-4f4c-985f-4d4f3efec5dd-kube-api-access-9jw7m\") pod \"goldmane-7988f88666-dzch9\" (UID: \"3a8e3276-5493-4f4c-985f-4d4f3efec5dd\") " pod="calico-system/goldmane-7988f88666-dzch9" Sep 13 
02:27:49.480798 kubelet[2891]: I0913 02:27:49.478856 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/beafbeb8-f50a-4f44-a6b9-f78f0837f465-calico-apiserver-certs\") pod \"calico-apiserver-68dfd7d7bf-cfxb9\" (UID: \"beafbeb8-f50a-4f44-a6b9-f78f0837f465\") " pod="calico-apiserver/calico-apiserver-68dfd7d7bf-cfxb9" Sep 13 02:27:49.480798 kubelet[2891]: I0913 02:27:49.478907 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a8e3276-5493-4f4c-985f-4d4f3efec5dd-config\") pod \"goldmane-7988f88666-dzch9\" (UID: \"3a8e3276-5493-4f4c-985f-4d4f3efec5dd\") " pod="calico-system/goldmane-7988f88666-dzch9" Sep 13 02:27:49.480798 kubelet[2891]: I0913 02:27:49.478933 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dgn5\" (UniqueName: \"kubernetes.io/projected/beafbeb8-f50a-4f44-a6b9-f78f0837f465-kube-api-access-8dgn5\") pod \"calico-apiserver-68dfd7d7bf-cfxb9\" (UID: \"beafbeb8-f50a-4f44-a6b9-f78f0837f465\") " pod="calico-apiserver/calico-apiserver-68dfd7d7bf-cfxb9" Sep 13 02:27:49.623541 containerd[1607]: time="2025-09-13T02:27:49.622343853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 13 02:27:49.673853 containerd[1607]: time="2025-09-13T02:27:49.673683964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-d994l,Uid:8e4d431c-d01f-4945-ba50-358a55b9fce8,Namespace:kube-system,Attempt:0,}" Sep 13 02:27:49.683024 containerd[1607]: time="2025-09-13T02:27:49.680790228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b6cb57bfd-zmfgm,Uid:ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647,Namespace:calico-system,Attempt:0,}" Sep 13 02:27:49.706707 containerd[1607]: time="2025-09-13T02:27:49.706661234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-4ljgx,Uid:cea724f8-4020-4ab6-998d-91057a769d18,Namespace:kube-system,Attempt:0,}" Sep 13 02:27:49.744187 containerd[1607]: time="2025-09-13T02:27:49.744136420Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-648f7c5fcc-fptpz,Uid:8fd46d61-8757-4454-9256-75911979d5bd,Namespace:calico-system,Attempt:0,}" Sep 13 02:27:49.785914 containerd[1607]: time="2025-09-13T02:27:49.785855562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-dzch9,Uid:3a8e3276-5493-4f4c-985f-4d4f3efec5dd,Namespace:calico-system,Attempt:0,}" Sep 13 02:27:49.828556 containerd[1607]: time="2025-09-13T02:27:49.828491229Z" level=error msg="Failed to destroy network for sandbox \"7eddcb650485c211216c65fc043c40beebfc9364d3ba504541ace056d762d837\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.830221 containerd[1607]: time="2025-09-13T02:27:49.830172665Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-f4w47,Uid:6c998d1d-8056-4958-bc44-d35b2d7c5300,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7eddcb650485c211216c65fc043c40beebfc9364d3ba504541ace056d762d837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Sep 13 02:27:49.831606 kubelet[2891]: E0913 02:27:49.831085 2891 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7eddcb650485c211216c65fc043c40beebfc9364d3ba504541ace056d762d837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.831606 kubelet[2891]: E0913 02:27:49.831199 2891 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7eddcb650485c211216c65fc043c40beebfc9364d3ba504541ace056d762d837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-f4w47" Sep 13 02:27:49.831606 kubelet[2891]: E0913 02:27:49.831234 2891 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7eddcb650485c211216c65fc043c40beebfc9364d3ba504541ace056d762d837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-f4w47" Sep 13 02:27:49.831967 kubelet[2891]: E0913 02:27:49.831324 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-f4w47_calico-system(6c998d1d-8056-4958-bc44-d35b2d7c5300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-f4w47_calico-system(6c998d1d-8056-4958-bc44-d35b2d7c5300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7eddcb650485c211216c65fc043c40beebfc9364d3ba504541ace056d762d837\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-f4w47" podUID="6c998d1d-8056-4958-bc44-d35b2d7c5300" Sep 13 02:27:49.884672 containerd[1607]: time="2025-09-13T02:27:49.884595148Z" level=error msg="Failed to destroy network for sandbox \"892262a233ac30c01c44b90a567be9a6766e7251d07908fa5de7aa891be76cb1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.888851 containerd[1607]: time="2025-09-13T02:27:49.888687945Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b6cb57bfd-zmfgm,Uid:ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"892262a233ac30c01c44b90a567be9a6766e7251d07908fa5de7aa891be76cb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.890211 kubelet[2891]: E0913 02:27:49.890162 2891 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"892262a233ac30c01c44b90a567be9a6766e7251d07908fa5de7aa891be76cb1\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.890395 kubelet[2891]: E0913 02:27:49.890364 2891 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"892262a233ac30c01c44b90a567be9a6766e7251d07908fa5de7aa891be76cb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b6cb57bfd-zmfgm" Sep 13 02:27:49.891212 kubelet[2891]: E0913 02:27:49.890495 2891 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"892262a233ac30c01c44b90a567be9a6766e7251d07908fa5de7aa891be76cb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b6cb57bfd-zmfgm" Sep 13 02:27:49.891338 kubelet[2891]: E0913 02:27:49.890961 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b6cb57bfd-zmfgm_calico-system(ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b6cb57bfd-zmfgm_calico-system(ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"892262a233ac30c01c44b90a567be9a6766e7251d07908fa5de7aa891be76cb1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b6cb57bfd-zmfgm" podUID="ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647" Sep 13 02:27:49.899233 containerd[1607]: time="2025-09-13T02:27:49.899181719Z" level=error msg="Failed to destroy network for sandbox \"4540f72e4eb06917ce80576d15aa5b370a8002549c345ff263ba6bf45ebbdbb2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.900789 containerd[1607]: time="2025-09-13T02:27:49.900747087Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-d994l,Uid:8e4d431c-d01f-4945-ba50-358a55b9fce8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4540f72e4eb06917ce80576d15aa5b370a8002549c345ff263ba6bf45ebbdbb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.901216 kubelet[2891]: E0913 02:27:49.901169 2891 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4540f72e4eb06917ce80576d15aa5b370a8002549c345ff263ba6bf45ebbdbb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.901759 kubelet[2891]: E0913 02:27:49.901401 2891 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"4540f72e4eb06917ce80576d15aa5b370a8002549c345ff263ba6bf45ebbdbb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-d994l" Sep 13 02:27:49.901759 kubelet[2891]: E0913 02:27:49.901437 2891 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4540f72e4eb06917ce80576d15aa5b370a8002549c345ff263ba6bf45ebbdbb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-d994l" Sep 13 02:27:49.901759 kubelet[2891]: E0913 02:27:49.901501 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-d994l_kube-system(8e4d431c-d01f-4945-ba50-358a55b9fce8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-d994l_kube-system(8e4d431c-d01f-4945-ba50-358a55b9fce8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4540f72e4eb06917ce80576d15aa5b370a8002549c345ff263ba6bf45ebbdbb2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-d994l" podUID="8e4d431c-d01f-4945-ba50-358a55b9fce8" Sep 13 02:27:49.943068 containerd[1607]: time="2025-09-13T02:27:49.942772908Z" level=error msg="Failed to destroy network for sandbox \"387c8ca26dff0e78644785583209248b77fe59836818387998e082a8e8cbb76c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.944923 containerd[1607]: time="2025-09-13T02:27:49.944878710Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-4ljgx,Uid:cea724f8-4020-4ab6-998d-91057a769d18,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"387c8ca26dff0e78644785583209248b77fe59836818387998e082a8e8cbb76c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.945772 kubelet[2891]: E0913 02:27:49.945134 2891 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"387c8ca26dff0e78644785583209248b77fe59836818387998e082a8e8cbb76c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.945772 kubelet[2891]: E0913 02:27:49.945205 2891 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"387c8ca26dff0e78644785583209248b77fe59836818387998e082a8e8cbb76c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-4ljgx" Sep 13 
02:27:49.945772 kubelet[2891]: E0913 02:27:49.945231 2891 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"387c8ca26dff0e78644785583209248b77fe59836818387998e082a8e8cbb76c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-4ljgx" Sep 13 02:27:49.945971 kubelet[2891]: E0913 02:27:49.945309 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-4ljgx_kube-system(cea724f8-4020-4ab6-998d-91057a769d18)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-4ljgx_kube-system(cea724f8-4020-4ab6-998d-91057a769d18)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"387c8ca26dff0e78644785583209248b77fe59836818387998e082a8e8cbb76c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-4ljgx" podUID="cea724f8-4020-4ab6-998d-91057a769d18" Sep 13 02:27:49.978580 containerd[1607]: time="2025-09-13T02:27:49.978422198Z" level=error msg="Failed to destroy network for sandbox \"6a11769a72ec30c484431a171725e1cba0f7ecf808d6b9d2cb98143c66760eae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.984979 containerd[1607]: time="2025-09-13T02:27:49.984850156Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-648f7c5fcc-fptpz,Uid:8fd46d61-8757-4454-9256-75911979d5bd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a11769a72ec30c484431a171725e1cba0f7ecf808d6b9d2cb98143c66760eae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.986019 kubelet[2891]: E0913 02:27:49.985493 2891 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a11769a72ec30c484431a171725e1cba0f7ecf808d6b9d2cb98143c66760eae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:49.986019 kubelet[2891]: E0913 02:27:49.985560 2891 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a11769a72ec30c484431a171725e1cba0f7ecf808d6b9d2cb98143c66760eae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-648f7c5fcc-fptpz" Sep 13 02:27:49.986019 kubelet[2891]: E0913 02:27:49.985693 2891 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a11769a72ec30c484431a171725e1cba0f7ecf808d6b9d2cb98143c66760eae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-648f7c5fcc-fptpz" Sep 13 02:27:49.986360 kubelet[2891]: E0913 02:27:49.985757 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-648f7c5fcc-fptpz_calico-system(8fd46d61-8757-4454-9256-75911979d5bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-648f7c5fcc-fptpz_calico-system(8fd46d61-8757-4454-9256-75911979d5bd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a11769a72ec30c484431a171725e1cba0f7ecf808d6b9d2cb98143c66760eae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-648f7c5fcc-fptpz" podUID="8fd46d61-8757-4454-9256-75911979d5bd" Sep 13 02:27:49.999851 containerd[1607]: time="2025-09-13T02:27:49.999741417Z" level=error msg="Failed to destroy network for sandbox \"54213d79c46b366c74cbe7e403858be5a0e21265db52ba579473b9a77782513e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:50.001335 containerd[1607]: time="2025-09-13T02:27:50.001290053Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-dzch9,Uid:3a8e3276-5493-4f4c-985f-4d4f3efec5dd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"54213d79c46b366c74cbe7e403858be5a0e21265db52ba579473b9a77782513e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:50.002106 kubelet[2891]: E0913 02:27:50.001538 2891 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54213d79c46b366c74cbe7e403858be5a0e21265db52ba579473b9a77782513e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:50.002106 kubelet[2891]: E0913 02:27:50.001623 2891 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54213d79c46b366c74cbe7e403858be5a0e21265db52ba579473b9a77782513e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-dzch9" Sep 13 02:27:50.002106 kubelet[2891]: E0913 02:27:50.001659 2891 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54213d79c46b366c74cbe7e403858be5a0e21265db52ba579473b9a77782513e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-dzch9" Sep 13 02:27:50.002410 kubelet[2891]: E0913 02:27:50.001712 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"goldmane-7988f88666-dzch9_calico-system(3a8e3276-5493-4f4c-985f-4d4f3efec5dd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-dzch9_calico-system(3a8e3276-5493-4f4c-985f-4d4f3efec5dd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"54213d79c46b366c74cbe7e403858be5a0e21265db52ba579473b9a77782513e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-dzch9" podUID="3a8e3276-5493-4f4c-985f-4d4f3efec5dd" Sep 13 02:27:50.332700 systemd[1]: run-netns-cni\x2d6cc10536\x2d1408\x2d8b34\x2dcb68\x2d4ce726d46a35.mount: Deactivated successfully. Sep 13 02:27:50.483632 kubelet[2891]: E0913 02:27:50.483557 2891 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Sep 13 02:27:50.484353 kubelet[2891]: E0913 02:27:50.483711 2891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67566ce3-7b09-4cb1-952c-0ee8a7201943-calico-apiserver-certs podName:67566ce3-7b09-4cb1-952c-0ee8a7201943 nodeName:}" failed. No retries permitted until 2025-09-13 02:27:50.983675487 +0000 UTC m=+37.750986253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/67566ce3-7b09-4cb1-952c-0ee8a7201943-calico-apiserver-certs") pod "calico-apiserver-68dfd7d7bf-8jk84" (UID: "67566ce3-7b09-4cb1-952c-0ee8a7201943") : failed to sync secret cache: timed out waiting for the condition Sep 13 02:27:50.592717 kubelet[2891]: E0913 02:27:50.592465 2891 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Sep 13 02:27:50.592717 kubelet[2891]: E0913 02:27:50.592599 2891 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/beafbeb8-f50a-4f44-a6b9-f78f0837f465-calico-apiserver-certs podName:beafbeb8-f50a-4f44-a6b9-f78f0837f465 nodeName:}" failed. No retries permitted until 2025-09-13 02:27:51.092549079 +0000 UTC m=+37.859859846 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/beafbeb8-f50a-4f44-a6b9-f78f0837f465-calico-apiserver-certs") pod "calico-apiserver-68dfd7d7bf-cfxb9" (UID: "beafbeb8-f50a-4f44-a6b9-f78f0837f465") : failed to sync secret cache: timed out waiting for the condition Sep 13 02:27:51.187554 containerd[1607]: time="2025-09-13T02:27:51.187502226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68dfd7d7bf-8jk84,Uid:67566ce3-7b09-4cb1-952c-0ee8a7201943,Namespace:calico-apiserver,Attempt:0,}" Sep 13 02:27:51.255736 containerd[1607]: time="2025-09-13T02:27:51.255629077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68dfd7d7bf-cfxb9,Uid:beafbeb8-f50a-4f44-a6b9-f78f0837f465,Namespace:calico-apiserver,Attempt:0,}" Sep 13 02:27:51.330539 containerd[1607]: time="2025-09-13T02:27:51.330457966Z" level=error msg="Failed to destroy network for sandbox \"9b9c8b85bc6f2effcdeaf785ff3291ddc817cbc2b643eeb25ce0beac15124569\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:51.333172 containerd[1607]: time="2025-09-13T02:27:51.333108016Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68dfd7d7bf-8jk84,Uid:67566ce3-7b09-4cb1-952c-0ee8a7201943,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b9c8b85bc6f2effcdeaf785ff3291ddc817cbc2b643eeb25ce0beac15124569\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:51.333951 systemd[1]: run-netns-cni\x2daa178fdc\x2dd69c\x2d8d86\x2d355e\x2da519ec46fc5c.mount: Deactivated successfully. 
Sep 13 02:27:51.335535 kubelet[2891]: E0913 02:27:51.334423 2891 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b9c8b85bc6f2effcdeaf785ff3291ddc817cbc2b643eeb25ce0beac15124569\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:51.335535 kubelet[2891]: E0913 02:27:51.334498 2891 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b9c8b85bc6f2effcdeaf785ff3291ddc817cbc2b643eeb25ce0beac15124569\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-68dfd7d7bf-8jk84" Sep 13 02:27:51.335535 kubelet[2891]: E0913 02:27:51.334524 2891 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b9c8b85bc6f2effcdeaf785ff3291ddc817cbc2b643eeb25ce0beac15124569\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-68dfd7d7bf-8jk84" Sep 13 02:27:51.335948 kubelet[2891]: E0913 02:27:51.334577 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-68dfd7d7bf-8jk84_calico-apiserver(67566ce3-7b09-4cb1-952c-0ee8a7201943)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-68dfd7d7bf-8jk84_calico-apiserver(67566ce3-7b09-4cb1-952c-0ee8a7201943)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b9c8b85bc6f2effcdeaf785ff3291ddc817cbc2b643eeb25ce0beac15124569\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-68dfd7d7bf-8jk84" podUID="67566ce3-7b09-4cb1-952c-0ee8a7201943" Sep 13 02:27:51.407334 containerd[1607]: time="2025-09-13T02:27:51.407217613Z" level=error msg="Failed to destroy network for sandbox \"77622d961075708ffa3b1c46eabd47ed95056ee76acdfa232e4ed00562cb93ea\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:51.411842 containerd[1607]: time="2025-09-13T02:27:51.410763593Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68dfd7d7bf-cfxb9,Uid:beafbeb8-f50a-4f44-a6b9-f78f0837f465,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"77622d961075708ffa3b1c46eabd47ed95056ee76acdfa232e4ed00562cb93ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:51.412022 kubelet[2891]: E0913 02:27:51.411058 2891 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77622d961075708ffa3b1c46eabd47ed95056ee76acdfa232e4ed00562cb93ea\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 02:27:51.412022 kubelet[2891]: E0913 02:27:51.411133 2891 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77622d961075708ffa3b1c46eabd47ed95056ee76acdfa232e4ed00562cb93ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-68dfd7d7bf-cfxb9" Sep 13 02:27:51.412022 kubelet[2891]: E0913 02:27:51.411158 2891 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77622d961075708ffa3b1c46eabd47ed95056ee76acdfa232e4ed00562cb93ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-68dfd7d7bf-cfxb9" Sep 13 02:27:51.410977 systemd[1]: run-netns-cni\x2d2b452abb\x2d2a1d\x2d5bdf\x2dbeb0\x2d86fc54d735cd.mount: Deactivated successfully. Sep 13 02:27:51.412575 kubelet[2891]: E0913 02:27:51.411218 2891 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-68dfd7d7bf-cfxb9_calico-apiserver(beafbeb8-f50a-4f44-a6b9-f78f0837f465)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-68dfd7d7bf-cfxb9_calico-apiserver(beafbeb8-f50a-4f44-a6b9-f78f0837f465)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"77622d961075708ffa3b1c46eabd47ed95056ee76acdfa232e4ed00562cb93ea\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-68dfd7d7bf-cfxb9" podUID="beafbeb8-f50a-4f44-a6b9-f78f0837f465" Sep 13 02:27:59.121007 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount228673350.mount: Deactivated successfully. 
Sep 13 02:27:59.227996 containerd[1607]: time="2025-09-13T02:27:59.198434927Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:59.238231 containerd[1607]: time="2025-09-13T02:27:59.238044215Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 13 02:27:59.242759 containerd[1607]: time="2025-09-13T02:27:59.242666663Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:59.244508 containerd[1607]: time="2025-09-13T02:27:59.244473516Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:27:59.248455 containerd[1607]: time="2025-09-13T02:27:59.248407356Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 9.624976227s" Sep 13 02:27:59.248540 containerd[1607]: time="2025-09-13T02:27:59.248462836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 13 02:27:59.276760 containerd[1607]: time="2025-09-13T02:27:59.276706842Z" level=info msg="CreateContainer within sandbox \"ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 13 02:27:59.308328 containerd[1607]: time="2025-09-13T02:27:59.306650243Z" level=info msg="Container de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:27:59.309222 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount681050598.mount: Deactivated successfully. Sep 13 02:27:59.360321 containerd[1607]: time="2025-09-13T02:27:59.360193808Z" level=info msg="CreateContainer within sandbox \"ea7717106fb5ca554b776fae16cee75dcdb1f5b2aa8c5343bc544b867510573e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\"" Sep 13 02:27:59.361349 containerd[1607]: time="2025-09-13T02:27:59.361154163Z" level=info msg="StartContainer for \"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\"" Sep 13 02:27:59.373128 containerd[1607]: time="2025-09-13T02:27:59.373019152Z" level=info msg="connecting to shim de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca" address="unix:///run/containerd/s/5e44cd1bbd754495f10fcea0531c2648549263cde425a6dfa348be0ba75b8eb1" protocol=ttrpc version=3 Sep 13 02:27:59.512691 systemd[1]: Started cri-containerd-de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca.scope - libcontainer container de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca. Sep 13 02:27:59.612525 containerd[1607]: time="2025-09-13T02:27:59.612478237Z" level=info msg="StartContainer for \"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\" returns successfully" Sep 13 02:27:59.801633 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Sep 13 02:27:59.804378 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 13 02:27:59.823168 kubelet[2891]: I0913 02:27:59.822489 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-djrws" podStartSLOduration=1.558750096 podStartE2EDuration="26.822423209s" podCreationTimestamp="2025-09-13 02:27:33 +0000 UTC" firstStartedPulling="2025-09-13 02:27:33.985680559 +0000 UTC m=+20.752991324" lastFinishedPulling="2025-09-13 02:27:59.249353672 +0000 UTC m=+46.016664437" observedRunningTime="2025-09-13 02:27:59.814745118 +0000 UTC m=+46.582055909" watchObservedRunningTime="2025-09-13 02:27:59.822423209 +0000 UTC m=+46.589733999" Sep 13 02:28:00.269409 kubelet[2891]: I0913 02:28:00.269002 2891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8lgb\" (UniqueName: \"kubernetes.io/projected/8fd46d61-8757-4454-9256-75911979d5bd-kube-api-access-v8lgb\") pod \"8fd46d61-8757-4454-9256-75911979d5bd\" (UID: \"8fd46d61-8757-4454-9256-75911979d5bd\") " Sep 13 02:28:00.269409 kubelet[2891]: I0913 02:28:00.269067 2891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fd46d61-8757-4454-9256-75911979d5bd-whisker-ca-bundle\") pod \"8fd46d61-8757-4454-9256-75911979d5bd\" (UID: \"8fd46d61-8757-4454-9256-75911979d5bd\") " Sep 13 02:28:00.269409 kubelet[2891]: I0913 02:28:00.269101 2891 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8fd46d61-8757-4454-9256-75911979d5bd-whisker-backend-key-pair\") pod \"8fd46d61-8757-4454-9256-75911979d5bd\" (UID: \"8fd46d61-8757-4454-9256-75911979d5bd\") " Sep 13 02:28:00.280258 kubelet[2891]: I0913 02:28:00.280168 2891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd46d61-8757-4454-9256-75911979d5bd-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "8fd46d61-8757-4454-9256-75911979d5bd" (UID: "8fd46d61-8757-4454-9256-75911979d5bd"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 13 02:28:00.285479 kubelet[2891]: I0913 02:28:00.282823 2891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd46d61-8757-4454-9256-75911979d5bd-kube-api-access-v8lgb" (OuterVolumeSpecName: "kube-api-access-v8lgb") pod "8fd46d61-8757-4454-9256-75911979d5bd" (UID: "8fd46d61-8757-4454-9256-75911979d5bd"). InnerVolumeSpecName "kube-api-access-v8lgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 13 02:28:00.283984 systemd[1]: var-lib-kubelet-pods-8fd46d61\x2d8757\x2d4454\x2d9256\x2d75911979d5bd-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dv8lgb.mount: Deactivated successfully. Sep 13 02:28:00.291557 systemd[1]: var-lib-kubelet-pods-8fd46d61\x2d8757\x2d4454\x2d9256\x2d75911979d5bd-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 13 02:28:00.292917 kubelet[2891]: I0913 02:28:00.292854 2891 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd46d61-8757-4454-9256-75911979d5bd-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "8fd46d61-8757-4454-9256-75911979d5bd" (UID: "8fd46d61-8757-4454-9256-75911979d5bd"). 
InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 13 02:28:00.304852 containerd[1607]: time="2025-09-13T02:28:00.304798309Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\" id:\"07446a854c18359e184e395ebafee727266c13032cc3286acc019d47caeb448f\" pid:3941 exit_status:1 exited_at:{seconds:1757730480 nanos:298616601}" Sep 13 02:28:00.370031 kubelet[2891]: I0913 02:28:00.369985 2891 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fd46d61-8757-4454-9256-75911979d5bd-whisker-ca-bundle\") on node \"srv-1di1n.gb1.brightbox.com\" DevicePath \"\"" Sep 13 02:28:00.370643 kubelet[2891]: I0913 02:28:00.370560 2891 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8fd46d61-8757-4454-9256-75911979d5bd-whisker-backend-key-pair\") on node \"srv-1di1n.gb1.brightbox.com\" DevicePath \"\"" Sep 13 02:28:00.370643 kubelet[2891]: I0913 02:28:00.370589 2891 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8lgb\" (UniqueName: \"kubernetes.io/projected/8fd46d61-8757-4454-9256-75911979d5bd-kube-api-access-v8lgb\") on node \"srv-1di1n.gb1.brightbox.com\" DevicePath \"\"" Sep 13 02:28:00.775002 systemd[1]: Removed slice kubepods-besteffort-pod8fd46d61_8757_4454_9256_75911979d5bd.slice - libcontainer container kubepods-besteffort-pod8fd46d61_8757_4454_9256_75911979d5bd.slice. Sep 13 02:28:00.935777 systemd[1]: Created slice kubepods-besteffort-pod3e296407_fde4_4a37_b69c_6ccd8c194171.slice - libcontainer container kubepods-besteffort-pod3e296407_fde4_4a37_b69c_6ccd8c194171.slice. Sep 13 02:28:00.976587 kubelet[2891]: I0913 02:28:00.976532 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e296407-fde4-4a37-b69c-6ccd8c194171-whisker-ca-bundle\") pod \"whisker-5754459988-gpp8n\" (UID: \"3e296407-fde4-4a37-b69c-6ccd8c194171\") " pod="calico-system/whisker-5754459988-gpp8n" Sep 13 02:28:00.976587 kubelet[2891]: I0913 02:28:00.976594 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsmrb\" (UniqueName: \"kubernetes.io/projected/3e296407-fde4-4a37-b69c-6ccd8c194171-kube-api-access-lsmrb\") pod \"whisker-5754459988-gpp8n\" (UID: \"3e296407-fde4-4a37-b69c-6ccd8c194171\") " pod="calico-system/whisker-5754459988-gpp8n" Sep 13 02:28:00.977236 kubelet[2891]: I0913 02:28:00.976640 2891 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e296407-fde4-4a37-b69c-6ccd8c194171-whisker-backend-key-pair\") pod \"whisker-5754459988-gpp8n\" (UID: \"3e296407-fde4-4a37-b69c-6ccd8c194171\") " pod="calico-system/whisker-5754459988-gpp8n" Sep 13 02:28:01.135132 containerd[1607]: time="2025-09-13T02:28:01.134696631Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\" id:\"209f381675e39abeddfcafc8354152b31b1a9d8d465782a936952ce55d6b7dae\" pid:3989 exit_status:1 exited_at:{seconds:1757730481 nanos:134016409}" Sep 13 02:28:01.247874 containerd[1607]: time="2025-09-13T02:28:01.247791525Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-5754459988-gpp8n,Uid:3e296407-fde4-4a37-b69c-6ccd8c194171,Namespace:calico-system,Attempt:0,}" Sep 13 02:28:01.380668 kubelet[2891]: I0913 02:28:01.380618 2891 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd46d61-8757-4454-9256-75911979d5bd" path="/var/lib/kubelet/pods/8fd46d61-8757-4454-9256-75911979d5bd/volumes" Sep 13 02:28:01.745989 systemd-networkd[1500]: cali346558e1ef7: Link UP Sep 13 02:28:01.747469 systemd-networkd[1500]: cali346558e1ef7: Gained carrier Sep 13 02:28:01.776348 containerd[1607]: 2025-09-13 02:28:01.295 [INFO][4005] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 02:28:01.776348 containerd[1607]: 2025-09-13 02:28:01.336 [INFO][4005] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0 whisker-5754459988- calico-system 3e296407-fde4-4a37-b69c-6ccd8c194171 894 0 2025-09-13 02:28:00 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5754459988 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-1di1n.gb1.brightbox.com whisker-5754459988-gpp8n eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali346558e1ef7 [] [] }} ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Namespace="calico-system" Pod="whisker-5754459988-gpp8n" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-" Sep 13 02:28:01.776348 containerd[1607]: 2025-09-13 02:28:01.336 [INFO][4005] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Namespace="calico-system" Pod="whisker-5754459988-gpp8n" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" Sep 13 02:28:01.776348 containerd[1607]: 2025-09-13 02:28:01.544 [INFO][4014] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" HandleID="k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Workload="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.547 [INFO][4014] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" HandleID="k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Workload="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000212220), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1di1n.gb1.brightbox.com", "pod":"whisker-5754459988-gpp8n", "timestamp":"2025-09-13 02:28:01.544697309 +0000 UTC"}, Hostname:"srv-1di1n.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.547 [INFO][4014] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.548 [INFO][4014] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
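The kubelet pod_startup_latency_tracker entry above for calico-node-djrws reports a podStartE2EDuration of 26.822423209s but a much smaller podStartSLOduration of 1.558750096. The two figures are consistent with E2E being measured from podCreationTimestamp to the watch-observed running time, while the SLO figure additionally excludes the image-pull window (firstStartedPulling to lastFinishedPulling). A minimal Go sketch of that arithmetic, using the timestamps from the log entry (illustrative only, not kubelet code):

// Reproduces the "Observed pod startup duration" arithmetic, assuming E2E is
// creation -> watch-observed running time and SLO additionally subtracts the
// image-pull window. Not kubelet code; timestamps copied from the log above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST" // Go's default time.Time format, as printed in the log

	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-09-13 02:27:33 +0000 UTC")
	firstPull := parse("2025-09-13 02:27:33.985680559 +0000 UTC")
	lastPull := parse("2025-09-13 02:27:59.249353672 +0000 UTC")
	running := parse("2025-09-13 02:27:59.822423209 +0000 UTC")

	e2e := running.Sub(created)          // 26.822423209s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 1.558750096s, matches podStartSLOduration
	fmt.Println(e2e, slo)
}

Running it prints 26.822423209s 1.558750096s, matching the logged values.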
Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.553 [INFO][4014] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1di1n.gb1.brightbox.com' Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.681 [INFO][4014] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.694 [INFO][4014] ipam/ipam.go 394: Looking up existing affinities for host host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.700 [INFO][4014] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.704 [INFO][4014] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.777017 containerd[1607]: 2025-09-13 02:28:01.708 [INFO][4014] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.779000 containerd[1607]: 2025-09-13 02:28:01.708 [INFO][4014] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.779000 containerd[1607]: 2025-09-13 02:28:01.710 [INFO][4014] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21 Sep 13 02:28:01.779000 containerd[1607]: 2025-09-13 02:28:01.715 [INFO][4014] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.779000 containerd[1607]: 2025-09-13 02:28:01.724 [INFO][4014] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.65/26] block=192.168.56.64/26 handle="k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.779000 containerd[1607]: 2025-09-13 02:28:01.724 [INFO][4014] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.65/26] handle="k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:01.779000 containerd[1607]: 2025-09-13 02:28:01.724 [INFO][4014] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
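The containerd/Calico lines above trace one complete IPAM round for the whisker pod: the plugin takes the host-wide IPAM lock, confirms this node's affinity for the block 192.168.56.64/26, loads the block, claims a free address (192.168.56.65), records a handle named after the sandbox ID, writes the block back, and releases the lock. A toy in-memory Go model of that sequence (a sketch only, not libcalico-go; the real IPAM persists the block against the datastore rather than a local map):

// Toy model of the allocation sequence logged above. Assumes a single affine
// block and a process-local mutex standing in for the host-wide IPAM lock.
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

type block struct {
	cidr      netip.Prefix
	allocated map[netip.Addr]string // address -> handle ID
}

var (
	hostWideLock sync.Mutex // stands in for the datastore-backed host-wide IPAM lock
	affineBlock  = block{
		cidr:      netip.MustParsePrefix("192.168.56.64/26"),
		allocated: map[netip.Addr]string{},
	}
)

func autoAssign(handleID string) (netip.Addr, error) {
	hostWideLock.Lock()         // "Acquired host-wide IPAM lock."
	defer hostWideLock.Unlock() // "Released host-wide IPAM lock."

	// Claim the first free address. The log's first claimed address is .65,
	// so the sketch starts just past the block's base address.
	for a := affineBlock.cidr.Addr().Next(); affineBlock.cidr.Contains(a); a = a.Next() {
		if _, used := affineBlock.allocated[a]; !used {
			affineBlock.allocated[a] = handleID // "Writing block in order to claim IPs"
			return a, nil
		}
	}
	return netip.Addr{}, fmt.Errorf("block %s exhausted", affineBlock.cidr)
}

func main() {
	// Handles in the log are named "k8s-pod-network." plus the sandbox ID.
	ip, _ := autoAssign("k8s-pod-network.<sandbox id>")
	fmt.Println(ip) // 192.168.56.65
}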
Sep 13 02:28:01.779000 containerd[1607]: 2025-09-13 02:28:01.725 [INFO][4014] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.65/26] IPv6=[] ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" HandleID="k8s-pod-network.6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Workload="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" Sep 13 02:28:01.779635 containerd[1607]: 2025-09-13 02:28:01.729 [INFO][4005] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Namespace="calico-system" Pod="whisker-5754459988-gpp8n" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0", GenerateName:"whisker-5754459988-", Namespace:"calico-system", SelfLink:"", UID:"3e296407-fde4-4a37-b69c-6ccd8c194171", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 28, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5754459988", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"", Pod:"whisker-5754459988-gpp8n", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.56.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali346558e1ef7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:01.779635 containerd[1607]: 2025-09-13 02:28:01.730 [INFO][4005] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.65/32] ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Namespace="calico-system" Pod="whisker-5754459988-gpp8n" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" Sep 13 02:28:01.779804 containerd[1607]: 2025-09-13 02:28:01.730 [INFO][4005] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali346558e1ef7 ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Namespace="calico-system" Pod="whisker-5754459988-gpp8n" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" Sep 13 02:28:01.779804 containerd[1607]: 2025-09-13 02:28:01.748 [INFO][4005] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Namespace="calico-system" Pod="whisker-5754459988-gpp8n" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" Sep 13 02:28:01.779918 containerd[1607]: 2025-09-13 02:28:01.749 [INFO][4005] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Namespace="calico-system" 
Pod="whisker-5754459988-gpp8n" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0", GenerateName:"whisker-5754459988-", Namespace:"calico-system", SelfLink:"", UID:"3e296407-fde4-4a37-b69c-6ccd8c194171", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 28, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5754459988", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21", Pod:"whisker-5754459988-gpp8n", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.56.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali346558e1ef7", MAC:"52:df:e5:b5:56:7c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:01.780021 containerd[1607]: 2025-09-13 02:28:01.767 [INFO][4005] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" Namespace="calico-system" Pod="whisker-5754459988-gpp8n" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0" Sep 13 02:28:02.031293 containerd[1607]: time="2025-09-13T02:28:02.031055212Z" level=info msg="connecting to shim 6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21" address="unix:///run/containerd/s/b410d8422fce08f44b30c15436d48f045ffb6941926f31c5ddb4a3ec6bbc3032" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:28:02.093801 systemd[1]: Started cri-containerd-6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21.scope - libcontainer container 6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21. 
Sep 13 02:28:02.233224 containerd[1607]: time="2025-09-13T02:28:02.233162295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5754459988-gpp8n,Uid:3e296407-fde4-4a37-b69c-6ccd8c194171,Namespace:calico-system,Attempt:0,} returns sandbox id \"6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21\"" Sep 13 02:28:02.244353 containerd[1607]: time="2025-09-13T02:28:02.244133038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 13 02:28:02.403406 containerd[1607]: time="2025-09-13T02:28:02.402619778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-d994l,Uid:8e4d431c-d01f-4945-ba50-358a55b9fce8,Namespace:kube-system,Attempt:0,}" Sep 13 02:28:02.404289 containerd[1607]: time="2025-09-13T02:28:02.402623887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68dfd7d7bf-cfxb9,Uid:beafbeb8-f50a-4f44-a6b9-f78f0837f465,Namespace:calico-apiserver,Attempt:0,}" Sep 13 02:28:02.405031 containerd[1607]: time="2025-09-13T02:28:02.402967010Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-f4w47,Uid:6c998d1d-8056-4958-bc44-d35b2d7c5300,Namespace:calico-system,Attempt:0,}" Sep 13 02:28:02.822194 systemd-networkd[1500]: cali4f2175a1488: Link UP Sep 13 02:28:02.825512 systemd-networkd[1500]: cali4f2175a1488: Gained carrier Sep 13 02:28:02.891388 containerd[1607]: 2025-09-13 02:28:02.612 [INFO][4176] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0 csi-node-driver- calico-system 6c998d1d-8056-4958-bc44-d35b2d7c5300 692 0 2025-09-13 02:27:33 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-1di1n.gb1.brightbox.com csi-node-driver-f4w47 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali4f2175a1488 [] [] }} ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Namespace="calico-system" Pod="csi-node-driver-f4w47" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-" Sep 13 02:28:02.891388 containerd[1607]: 2025-09-13 02:28:02.612 [INFO][4176] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Namespace="calico-system" Pod="csi-node-driver-f4w47" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" Sep 13 02:28:02.891388 containerd[1607]: 2025-09-13 02:28:02.715 [INFO][4217] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" HandleID="k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Workload="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.716 [INFO][4217] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" HandleID="k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Workload="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0003482a0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1di1n.gb1.brightbox.com", "pod":"csi-node-driver-f4w47", "timestamp":"2025-09-13 02:28:02.715774797 +0000 UTC"}, Hostname:"srv-1di1n.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.718 [INFO][4217] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.718 [INFO][4217] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.718 [INFO][4217] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1di1n.gb1.brightbox.com' Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.739 [INFO][4217] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.750 [INFO][4217] ipam/ipam.go 394: Looking up existing affinities for host host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.760 [INFO][4217] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.765 [INFO][4217] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892086 containerd[1607]: 2025-09-13 02:28:02.771 [INFO][4217] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892545 containerd[1607]: 2025-09-13 02:28:02.771 [INFO][4217] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892545 containerd[1607]: 2025-09-13 02:28:02.773 [INFO][4217] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1 Sep 13 02:28:02.892545 containerd[1607]: 2025-09-13 02:28:02.780 [INFO][4217] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892545 containerd[1607]: 2025-09-13 02:28:02.791 [INFO][4217] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.66/26] block=192.168.56.64/26 handle="k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892545 containerd[1607]: 2025-09-13 02:28:02.792 [INFO][4217] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.66/26] handle="k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:02.892545 containerd[1607]: 2025-09-13 02:28:02.792 [INFO][4217] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
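The same allocation path runs again here for csi-node-driver-f4w47 and yields the next address, 192.168.56.66. Every pod on this node draws from the affine block 192.168.56.64/26, which spans 2^(32-26) = 64 addresses (192.168.56.64 through 192.168.56.127). A quick containment check with the standard library, using the addresses that appear in this log (illustrative only):

// Confirms the per-node addresses all sit inside the affine /26 block.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.56.64/26")
	// Addresses handed out on this node, in the order they appear in the log.
	for _, s := range []string{"192.168.56.65", "192.168.56.66", "192.168.56.67", "192.168.56.68", "192.168.56.69"} {
		fmt.Println(s, block.Contains(netip.MustParseAddr(s))) // all true
	}
}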
Sep 13 02:28:02.892545 containerd[1607]: 2025-09-13 02:28:02.792 [INFO][4217] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.66/26] IPv6=[] ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" HandleID="k8s-pod-network.8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Workload="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" Sep 13 02:28:02.893331 containerd[1607]: 2025-09-13 02:28:02.798 [INFO][4176] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Namespace="calico-system" Pod="csi-node-driver-f4w47" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6c998d1d-8056-4958-bc44-d35b2d7c5300", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-f4w47", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.56.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4f2175a1488", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:02.893434 containerd[1607]: 2025-09-13 02:28:02.799 [INFO][4176] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.66/32] ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Namespace="calico-system" Pod="csi-node-driver-f4w47" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" Sep 13 02:28:02.893434 containerd[1607]: 2025-09-13 02:28:02.799 [INFO][4176] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4f2175a1488 ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Namespace="calico-system" Pod="csi-node-driver-f4w47" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" Sep 13 02:28:02.893434 containerd[1607]: 2025-09-13 02:28:02.830 [INFO][4176] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Namespace="calico-system" Pod="csi-node-driver-f4w47" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" Sep 13 02:28:02.893572 containerd[1607]: 2025-09-13 02:28:02.833 [INFO][4176] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Namespace="calico-system" Pod="csi-node-driver-f4w47" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6c998d1d-8056-4958-bc44-d35b2d7c5300", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1", Pod:"csi-node-driver-f4w47", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.56.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4f2175a1488", MAC:"0a:cd:da:21:c5:5a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:02.893660 containerd[1607]: 2025-09-13 02:28:02.866 [INFO][4176] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" Namespace="calico-system" Pod="csi-node-driver-f4w47" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-csi--node--driver--f4w47-eth0" Sep 13 02:28:02.949565 containerd[1607]: time="2025-09-13T02:28:02.949480309Z" level=info msg="connecting to shim 8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1" address="unix:///run/containerd/s/49c4a009abc1085a6a0f5890a1d9c56a9cf8fd461ceacc86a9aa03ac448bb36b" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:28:02.968936 systemd-networkd[1500]: calif8c344e8fc4: Link UP Sep 13 02:28:02.972605 systemd-networkd[1500]: calif8c344e8fc4: Gained carrier Sep 13 02:28:03.008502 containerd[1607]: 2025-09-13 02:28:02.619 [INFO][4179] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0 calico-apiserver-68dfd7d7bf- calico-apiserver beafbeb8-f50a-4f44-a6b9-f78f0837f465 823 0 2025-09-13 02:27:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:68dfd7d7bf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-1di1n.gb1.brightbox.com calico-apiserver-68dfd7d7bf-cfxb9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif8c344e8fc4 [] [] }} ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" 
Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-cfxb9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-" Sep 13 02:28:03.008502 containerd[1607]: 2025-09-13 02:28:02.620 [INFO][4179] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-cfxb9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" Sep 13 02:28:03.008502 containerd[1607]: 2025-09-13 02:28:02.746 [INFO][4223] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" HandleID="k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.746 [INFO][4223] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" HandleID="k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000389d60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-1di1n.gb1.brightbox.com", "pod":"calico-apiserver-68dfd7d7bf-cfxb9", "timestamp":"2025-09-13 02:28:02.746512647 +0000 UTC"}, Hostname:"srv-1di1n.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.746 [INFO][4223] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.793 [INFO][4223] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.793 [INFO][4223] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1di1n.gb1.brightbox.com' Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.853 [INFO][4223] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.881 [INFO][4223] ipam/ipam.go 394: Looking up existing affinities for host host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.900 [INFO][4223] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.905 [INFO][4223] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.008791 containerd[1607]: 2025-09-13 02:28:02.909 [INFO][4223] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.009213 containerd[1607]: 2025-09-13 02:28:02.909 [INFO][4223] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.009213 containerd[1607]: 2025-09-13 02:28:02.913 [INFO][4223] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492 Sep 13 02:28:03.009213 containerd[1607]: 2025-09-13 02:28:02.927 [INFO][4223] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.009213 containerd[1607]: 2025-09-13 02:28:02.940 [INFO][4223] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.67/26] block=192.168.56.64/26 handle="k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.009213 containerd[1607]: 2025-09-13 02:28:02.940 [INFO][4223] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.67/26] handle="k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.009213 containerd[1607]: 2025-09-13 02:28:02.941 [INFO][4223] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
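The WorkloadEndpoint object names that keep recurring in these lines, such as srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0, follow a scheme the log is consistent with: node name, the literal segment k8s, pod name, and interface name joined by single dashes, with any dash inside the node or pod name doubled so the separators stay unambiguous. A short Go sketch that reproduces the names seen here (an observation from this log, not Calico source):

// Rebuilds a WorkloadEndpoint name from node, pod, and interface, doubling
// embedded dashes so the single-dash separators remain unambiguous.
package main

import (
	"fmt"
	"strings"
)

func workloadEndpointName(node, pod, iface string) string {
	esc := func(s string) string { return strings.ReplaceAll(s, "-", "--") }
	return esc(node) + "-k8s-" + esc(pod) + "-" + esc(iface)
}

func main() {
	fmt.Println(workloadEndpointName("srv-1di1n.gb1.brightbox.com", "calico-apiserver-68dfd7d7bf-cfxb9", "eth0"))
	// srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0
}

The same doubling explains srv--1di1n.gb1.brightbox.com-k8s-whisker--5754459988--gpp8n-eth0 above and the coredns endpoint name further down.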
Sep 13 02:28:03.009213 containerd[1607]: 2025-09-13 02:28:02.941 [INFO][4223] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.67/26] IPv6=[] ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" HandleID="k8s-pod-network.f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" Sep 13 02:28:03.009918 containerd[1607]: 2025-09-13 02:28:02.951 [INFO][4179] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-cfxb9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0", GenerateName:"calico-apiserver-68dfd7d7bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"beafbeb8-f50a-4f44-a6b9-f78f0837f465", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"68dfd7d7bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-68dfd7d7bf-cfxb9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif8c344e8fc4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:03.010027 containerd[1607]: 2025-09-13 02:28:02.951 [INFO][4179] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.67/32] ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-cfxb9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" Sep 13 02:28:03.010027 containerd[1607]: 2025-09-13 02:28:02.951 [INFO][4179] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif8c344e8fc4 ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-cfxb9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" Sep 13 02:28:03.010027 containerd[1607]: 2025-09-13 02:28:02.981 [INFO][4179] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-cfxb9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" Sep 13 02:28:03.010198 containerd[1607]: 2025-09-13 02:28:02.982 
[INFO][4179] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-cfxb9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0", GenerateName:"calico-apiserver-68dfd7d7bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"beafbeb8-f50a-4f44-a6b9-f78f0837f465", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"68dfd7d7bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492", Pod:"calico-apiserver-68dfd7d7bf-cfxb9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif8c344e8fc4", MAC:"e6:ff:a8:11:d5:8b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:03.010319 containerd[1607]: 2025-09-13 02:28:03.000 [INFO][4179] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-cfxb9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--cfxb9-eth0" Sep 13 02:28:03.015623 systemd[1]: Started cri-containerd-8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1.scope - libcontainer container 8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1. 
Sep 13 02:28:03.078350 systemd-networkd[1500]: cali30f162bf4e8: Link UP Sep 13 02:28:03.086517 systemd-networkd[1500]: cali30f162bf4e8: Gained carrier Sep 13 02:28:03.093699 containerd[1607]: time="2025-09-13T02:28:03.093175002Z" level=info msg="connecting to shim f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492" address="unix:///run/containerd/s/189e3681f18deb8626b69c86967ded6dc7587abb74e35b90803df1ead521e5dc" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:28:03.123499 containerd[1607]: 2025-09-13 02:28:02.640 [INFO][4168] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0 coredns-7c65d6cfc9- kube-system 8e4d431c-d01f-4945-ba50-358a55b9fce8 810 0 2025-09-13 02:27:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-1di1n.gb1.brightbox.com coredns-7c65d6cfc9-d994l eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali30f162bf4e8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d994l" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-" Sep 13 02:28:03.123499 containerd[1607]: 2025-09-13 02:28:02.640 [INFO][4168] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d994l" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" Sep 13 02:28:03.123499 containerd[1607]: 2025-09-13 02:28:02.765 [INFO][4230] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" HandleID="k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Workload="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:02.766 [INFO][4230] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" HandleID="k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Workload="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fe30), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-1di1n.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-d994l", "timestamp":"2025-09-13 02:28:02.7659498 +0000 UTC"}, Hostname:"srv-1di1n.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:02.766 [INFO][4230] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:02.941 [INFO][4230] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:02.943 [INFO][4230] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1di1n.gb1.brightbox.com' Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:02.963 [INFO][4230] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:02.991 [INFO][4230] ipam/ipam.go 394: Looking up existing affinities for host host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:03.016 [INFO][4230] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:03.021 [INFO][4230] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.124124 containerd[1607]: 2025-09-13 02:28:03.026 [INFO][4230] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.125361 containerd[1607]: 2025-09-13 02:28:03.026 [INFO][4230] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.125361 containerd[1607]: 2025-09-13 02:28:03.030 [INFO][4230] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2 Sep 13 02:28:03.125361 containerd[1607]: 2025-09-13 02:28:03.041 [INFO][4230] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.125361 containerd[1607]: 2025-09-13 02:28:03.059 [INFO][4230] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.68/26] block=192.168.56.64/26 handle="k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.125361 containerd[1607]: 2025-09-13 02:28:03.059 [INFO][4230] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.68/26] handle="k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:03.125361 containerd[1607]: 2025-09-13 02:28:03.059 [INFO][4230] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 13 02:28:03.125361 containerd[1607]: 2025-09-13 02:28:03.059 [INFO][4230] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.68/26] IPv6=[] ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" HandleID="k8s-pod-network.153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Workload="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" Sep 13 02:28:03.125947 containerd[1607]: 2025-09-13 02:28:03.065 [INFO][4168] cni-plugin/k8s.go 418: Populated endpoint ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d994l" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"8e4d431c-d01f-4945-ba50-358a55b9fce8", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-d994l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.56.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali30f162bf4e8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:03.125947 containerd[1607]: 2025-09-13 02:28:03.066 [INFO][4168] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.68/32] ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d994l" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" Sep 13 02:28:03.125947 containerd[1607]: 2025-09-13 02:28:03.067 [INFO][4168] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali30f162bf4e8 ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d994l" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" Sep 13 02:28:03.125947 containerd[1607]: 2025-09-13 02:28:03.087 [INFO][4168] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-d994l" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" Sep 13 02:28:03.125947 containerd[1607]: 2025-09-13 02:28:03.091 [INFO][4168] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d994l" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"8e4d431c-d01f-4945-ba50-358a55b9fce8", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2", Pod:"coredns-7c65d6cfc9-d994l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.56.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali30f162bf4e8", MAC:"b2:da:1b:80:50:1b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:03.125947 containerd[1607]: 2025-09-13 02:28:03.115 [INFO][4168] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-d994l" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--d994l-eth0" Sep 13 02:28:03.154804 containerd[1607]: time="2025-09-13T02:28:03.154748287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-f4w47,Uid:6c998d1d-8056-4958-bc44-d35b2d7c5300,Namespace:calico-system,Attempt:0,} returns sandbox id \"8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1\"" Sep 13 02:28:03.177873 systemd[1]: Started cri-containerd-f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492.scope - libcontainer container f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492. 
Sep 13 02:28:03.195317 containerd[1607]: time="2025-09-13T02:28:03.194848235Z" level=info msg="connecting to shim 153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2" address="unix:///run/containerd/s/34438091f907b041ea3a56d20fe441fdd30e5e9fc4638a51fb29adabd12bb223" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:28:03.229386 systemd-networkd[1500]: cali346558e1ef7: Gained IPv6LL Sep 13 02:28:03.241540 systemd[1]: Started cri-containerd-153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2.scope - libcontainer container 153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2. Sep 13 02:28:03.363352 containerd[1607]: time="2025-09-13T02:28:03.362393324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-d994l,Uid:8e4d431c-d01f-4945-ba50-358a55b9fce8,Namespace:kube-system,Attempt:0,} returns sandbox id \"153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2\"" Sep 13 02:28:03.368675 containerd[1607]: time="2025-09-13T02:28:03.368570816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68dfd7d7bf-cfxb9,Uid:beafbeb8-f50a-4f44-a6b9-f78f0837f465,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492\"" Sep 13 02:28:03.384303 containerd[1607]: time="2025-09-13T02:28:03.384017094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b6cb57bfd-zmfgm,Uid:ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647,Namespace:calico-system,Attempt:0,}" Sep 13 02:28:03.386213 containerd[1607]: time="2025-09-13T02:28:03.386090906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-dzch9,Uid:3a8e3276-5493-4f4c-985f-4d4f3efec5dd,Namespace:calico-system,Attempt:0,}" Sep 13 02:28:03.398495 containerd[1607]: time="2025-09-13T02:28:03.398448498Z" level=info msg="CreateContainer within sandbox \"153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 02:28:03.483538 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4225497407.mount: Deactivated successfully. Sep 13 02:28:03.488093 containerd[1607]: time="2025-09-13T02:28:03.487947625Z" level=info msg="Container 7afc902036b25c6aad630caf005b5b4b114d85c1dfb93c617ea427b5162778f9: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:03.497471 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1048847056.mount: Deactivated successfully. Sep 13 02:28:03.540421 containerd[1607]: time="2025-09-13T02:28:03.540375004Z" level=info msg="CreateContainer within sandbox \"153f7591c839498701cd619c4373ed94ce42880925a5ece28254f993ce616fe2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7afc902036b25c6aad630caf005b5b4b114d85c1dfb93c617ea427b5162778f9\"" Sep 13 02:28:03.543521 containerd[1607]: time="2025-09-13T02:28:03.543481161Z" level=info msg="StartContainer for \"7afc902036b25c6aad630caf005b5b4b114d85c1dfb93c617ea427b5162778f9\"" Sep 13 02:28:03.554170 containerd[1607]: time="2025-09-13T02:28:03.553669774Z" level=info msg="connecting to shim 7afc902036b25c6aad630caf005b5b4b114d85c1dfb93c617ea427b5162778f9" address="unix:///run/containerd/s/34438091f907b041ea3a56d20fe441fdd30e5e9fc4638a51fb29adabd12bb223" protocol=ttrpc version=3 Sep 13 02:28:03.616655 systemd[1]: Started cri-containerd-7afc902036b25c6aad630caf005b5b4b114d85c1dfb93c617ea427b5162778f9.scope - libcontainer container 7afc902036b25c6aad630caf005b5b4b114d85c1dfb93c617ea427b5162778f9. 
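The mount unit names systemd reports as deactivated here and earlier (for example var-lib-containerd-tmpmounts-containerd\x2dmount4225497407.mount) are derived from the mount path by systemd's unit-name escaping: drop the leading slash, turn the remaining slashes into dashes, and hex-escape characters that would otherwise collide with that convention, so a literal '-' becomes \x2d and '~' becomes \x7e (which is how kubernetes.io~projected ends up as kubernetes.io\x7eprojected in the kubelet volume units above). A rough Go sketch covering just the characters seen in this log (systemd-escape --path implements the full rule set):

// Rough re-implementation of systemd's path-to-mount-unit escaping for the
// characters that appear in this log; not a complete or authoritative rule set.
package main

import (
	"fmt"
	"strings"
)

func mountUnitName(path string) string {
	p := strings.TrimPrefix(path, "/")
	var b strings.Builder
	for _, c := range p {
		switch {
		case c == '/':
			b.WriteByte('-')
		case c >= 'a' && c <= 'z', c >= 'A' && c <= 'Z', c >= '0' && c <= '9', c == '.', c == '_':
			b.WriteRune(c)
		default:
			fmt.Fprintf(&b, `\x%02x`, c)
		}
	}
	return b.String() + ".mount"
}

func main() {
	fmt.Println(mountUnitName("/var/lib/containerd/tmpmounts/containerd-mount4225497407"))
	// var-lib-containerd-tmpmounts-containerd\x2dmount4225497407.mount
}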
Sep 13 02:28:03.697155 containerd[1607]: time="2025-09-13T02:28:03.697014818Z" level=info msg="StartContainer for \"7afc902036b25c6aad630caf005b5b4b114d85c1dfb93c617ea427b5162778f9\" returns successfully" Sep 13 02:28:03.726461 systemd-networkd[1500]: vxlan.calico: Link UP Sep 13 02:28:03.726493 systemd-networkd[1500]: vxlan.calico: Gained carrier Sep 13 02:28:03.885580 systemd-networkd[1500]: cali8ccb48357a7: Link UP Sep 13 02:28:03.889799 systemd-networkd[1500]: cali8ccb48357a7: Gained carrier Sep 13 02:28:03.988383 kubelet[2891]: I0913 02:28:03.988190 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-d994l" podStartSLOduration=44.975409555 podStartE2EDuration="44.975409555s" podCreationTimestamp="2025-09-13 02:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 02:28:03.974244304 +0000 UTC m=+50.741555096" watchObservedRunningTime="2025-09-13 02:28:03.975409555 +0000 UTC m=+50.742720328" Sep 13 02:28:04.056442 systemd-networkd[1500]: calif8c344e8fc4: Gained IPv6LL Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.553 [INFO][4423] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0 goldmane-7988f88666- calico-system 3a8e3276-5493-4f4c-985f-4d4f3efec5dd 822 0 2025-09-13 02:27:32 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-1di1n.gb1.brightbox.com goldmane-7988f88666-dzch9 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali8ccb48357a7 [] [] }} ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Namespace="calico-system" Pod="goldmane-7988f88666-dzch9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.555 [INFO][4423] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Namespace="calico-system" Pod="goldmane-7988f88666-dzch9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.691 [INFO][4459] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" HandleID="k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Workload="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.691 [INFO][4459] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" HandleID="k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Workload="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000336eb0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1di1n.gb1.brightbox.com", "pod":"goldmane-7988f88666-dzch9", "timestamp":"2025-09-13 02:28:03.691479364 +0000 UTC"}, Hostname:"srv-1di1n.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.691 [INFO][4459] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.692 [INFO][4459] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.692 [INFO][4459] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1di1n.gb1.brightbox.com' Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.739 [INFO][4459] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.769 [INFO][4459] ipam/ipam.go 394: Looking up existing affinities for host host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.788 [INFO][4459] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.796 [INFO][4459] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.803 [INFO][4459] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.804 [INFO][4459] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.808 [INFO][4459] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9 Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.823 [INFO][4459] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.839 [INFO][4459] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.69/26] block=192.168.56.64/26 handle="k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.839 [INFO][4459] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.69/26] handle="k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.839 [INFO][4459] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
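Note that the pod_startup_latency_tracker entry above for coredns-7c65d6cfc9-d994l reports identical podStartSLOduration and podStartE2EDuration (44.975409555s): its firstStartedPulling and lastFinishedPulling are the zero time (0001-01-01), so there is no pull window to subtract, and both figures reduce to the watch-observed running time minus podCreationTimestamp (02:28:03.975409555 minus 02:27:19 = 44.975409555s), the same arithmetic as in the earlier sketch.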
Sep 13 02:28:04.082958 containerd[1607]: 2025-09-13 02:28:03.840 [INFO][4459] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.69/26] IPv6=[] ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" HandleID="k8s-pod-network.0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Workload="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" Sep 13 02:28:04.086521 containerd[1607]: 2025-09-13 02:28:03.857 [INFO][4423] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Namespace="calico-system" Pod="goldmane-7988f88666-dzch9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"3a8e3276-5493-4f4c-985f-4d4f3efec5dd", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-7988f88666-dzch9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.56.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali8ccb48357a7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:04.086521 containerd[1607]: 2025-09-13 02:28:03.858 [INFO][4423] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.69/32] ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Namespace="calico-system" Pod="goldmane-7988f88666-dzch9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" Sep 13 02:28:04.086521 containerd[1607]: 2025-09-13 02:28:03.859 [INFO][4423] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8ccb48357a7 ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Namespace="calico-system" Pod="goldmane-7988f88666-dzch9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" Sep 13 02:28:04.086521 containerd[1607]: 2025-09-13 02:28:03.895 [INFO][4423] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Namespace="calico-system" Pod="goldmane-7988f88666-dzch9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" Sep 13 02:28:04.086521 containerd[1607]: 2025-09-13 02:28:03.919 [INFO][4423] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" 
Namespace="calico-system" Pod="goldmane-7988f88666-dzch9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"3a8e3276-5493-4f4c-985f-4d4f3efec5dd", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9", Pod:"goldmane-7988f88666-dzch9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.56.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali8ccb48357a7", MAC:"06:94:d3:f9:2d:14", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:04.086521 containerd[1607]: 2025-09-13 02:28:03.988 [INFO][4423] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" Namespace="calico-system" Pod="goldmane-7988f88666-dzch9" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-goldmane--7988f88666--dzch9-eth0" Sep 13 02:28:04.129495 systemd-networkd[1500]: cali8dba808d5ac: Link UP Sep 13 02:28:04.132837 systemd-networkd[1500]: cali8dba808d5ac: Gained carrier Sep 13 02:28:04.183538 systemd-networkd[1500]: cali30f162bf4e8: Gained IPv6LL Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.592 [INFO][4418] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0 calico-kube-controllers-7b6cb57bfd- calico-system ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647 819 0 2025-09-13 02:27:33 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7b6cb57bfd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-1di1n.gb1.brightbox.com calico-kube-controllers-7b6cb57bfd-zmfgm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8dba808d5ac [] [] }} ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Namespace="calico-system" Pod="calico-kube-controllers-7b6cb57bfd-zmfgm" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.593 [INFO][4418] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Namespace="calico-system" Pod="calico-kube-controllers-7b6cb57bfd-zmfgm" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.771 [INFO][4475] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" HandleID="k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.773 [INFO][4475] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" HandleID="k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000362d60), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1di1n.gb1.brightbox.com", "pod":"calico-kube-controllers-7b6cb57bfd-zmfgm", "timestamp":"2025-09-13 02:28:03.771118758 +0000 UTC"}, Hostname:"srv-1di1n.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.774 [INFO][4475] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.840 [INFO][4475] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
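The two CNI ADDs in flight here ([4459] for goldmane-7988f88666-dzch9 and [4475] for calico-kube-controllers-7b6cb57bfd-zmfgm) contend for the same host-wide IPAM lock, so the second request waits: it logs "About to acquire" at 02:28:03.774 but only acquires at 02:28:03.840, one millisecond after the first released at 02:28:03.839. A small sketch of that arithmetic from the timestamps above (illustration only):

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const layout = "2006-01-02 15:04:05.000"
    	parse := func(s string) time.Time {
    		t, err := time.Parse(layout, s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}
    	aboutToAcquire := parse("2025-09-13 02:28:03.774") // [4475] "About to acquire"
    	firstReleased := parse("2025-09-13 02:28:03.839")  // [4459] "Released"
    	acquired := parse("2025-09-13 02:28:03.840")       // [4475] "Acquired"

    	fmt.Println("waited for lock:", acquired.Sub(aboutToAcquire)) // 66ms
    	fmt.Println("handoff gap:    ", acquired.Sub(firstReleased))  // 1ms
    }
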
Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.840 [INFO][4475] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1di1n.gb1.brightbox.com' Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.917 [INFO][4475] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:03.992 [INFO][4475] ipam/ipam.go 394: Looking up existing affinities for host host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.026 [INFO][4475] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.035 [INFO][4475] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.052 [INFO][4475] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.052 [INFO][4475] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.075 [INFO][4475] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056 Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.085 [INFO][4475] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.112 [INFO][4475] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.70/26] block=192.168.56.64/26 handle="k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.112 [INFO][4475] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.70/26] handle="k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.112 [INFO][4475] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
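For scale, the affine block is a /26, so it covers 64 addresses (192.168.56.64 through 192.168.56.127); the assignments in this section (.69 and .70 so far, .71 and .72 further below) appear to be drawn from it sequentially. A quick way to compute the block bounds, standard library only:

    package main

    import (
    	"fmt"
    	"net/netip"
    )

    func main() {
    	block := netip.MustParsePrefix("192.168.56.64/26")
    	size := 1 << (32 - block.Bits()) // 2^(32-26) = 64 addresses

    	first := block.Masked().Addr()
    	last := first
    	for i := 0; i < size-1; i++ {
    		last = last.Next()
    	}
    	fmt.Println(size, first, last) // 64 192.168.56.64 192.168.56.127
    }
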
Sep 13 02:28:04.200284 containerd[1607]: 2025-09-13 02:28:04.112 [INFO][4475] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.70/26] IPv6=[] ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" HandleID="k8s-pod-network.f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" Sep 13 02:28:04.201750 containerd[1607]: 2025-09-13 02:28:04.122 [INFO][4418] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Namespace="calico-system" Pod="calico-kube-controllers-7b6cb57bfd-zmfgm" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0", GenerateName:"calico-kube-controllers-7b6cb57bfd-", Namespace:"calico-system", SelfLink:"", UID:"ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b6cb57bfd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-7b6cb57bfd-zmfgm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.56.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8dba808d5ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:04.201750 containerd[1607]: 2025-09-13 02:28:04.122 [INFO][4418] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.70/32] ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Namespace="calico-system" Pod="calico-kube-controllers-7b6cb57bfd-zmfgm" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" Sep 13 02:28:04.201750 containerd[1607]: 2025-09-13 02:28:04.122 [INFO][4418] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8dba808d5ac ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Namespace="calico-system" Pod="calico-kube-controllers-7b6cb57bfd-zmfgm" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" Sep 13 02:28:04.201750 containerd[1607]: 2025-09-13 02:28:04.135 [INFO][4418] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Namespace="calico-system" Pod="calico-kube-controllers-7b6cb57bfd-zmfgm" 
WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" Sep 13 02:28:04.201750 containerd[1607]: 2025-09-13 02:28:04.136 [INFO][4418] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Namespace="calico-system" Pod="calico-kube-controllers-7b6cb57bfd-zmfgm" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0", GenerateName:"calico-kube-controllers-7b6cb57bfd-", Namespace:"calico-system", SelfLink:"", UID:"ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b6cb57bfd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056", Pod:"calico-kube-controllers-7b6cb57bfd-zmfgm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.56.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8dba808d5ac", MAC:"e2:7d:95:1a:5a:51", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:04.201750 containerd[1607]: 2025-09-13 02:28:04.171 [INFO][4418] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" Namespace="calico-system" Pod="calico-kube-controllers-7b6cb57bfd-zmfgm" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--kube--controllers--7b6cb57bfd--zmfgm-eth0" Sep 13 02:28:04.267745 containerd[1607]: time="2025-09-13T02:28:04.267341112Z" level=info msg="connecting to shim 0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9" address="unix:///run/containerd/s/af2f5cfb431f36280f7e62515d1a0d6de543f8c73cbbccab434953e0e37d1889" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:28:04.279646 containerd[1607]: time="2025-09-13T02:28:04.279593837Z" level=info msg="connecting to shim f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056" address="unix:///run/containerd/s/fe1d490620715e40a37ed65484c3b6e1cbe2ef76ed495243f1dc4eedb5126c29" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:28:04.379204 containerd[1607]: time="2025-09-13T02:28:04.379058756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-4ljgx,Uid:cea724f8-4020-4ab6-998d-91057a769d18,Namespace:kube-system,Attempt:0,}" Sep 13 02:28:04.381482 systemd[1]: Started 
cri-containerd-f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056.scope - libcontainer container f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056. Sep 13 02:28:04.392770 containerd[1607]: time="2025-09-13T02:28:04.392509327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68dfd7d7bf-8jk84,Uid:67566ce3-7b09-4cb1-952c-0ee8a7201943,Namespace:calico-apiserver,Attempt:0,}" Sep 13 02:28:04.394622 systemd[1]: Started cri-containerd-0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9.scope - libcontainer container 0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9. Sep 13 02:28:04.696384 systemd-networkd[1500]: cali4f2175a1488: Gained IPv6LL Sep 13 02:28:04.844682 containerd[1607]: time="2025-09-13T02:28:04.844629892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-dzch9,Uid:3a8e3276-5493-4f4c-985f-4d4f3efec5dd,Namespace:calico-system,Attempt:0,} returns sandbox id \"0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9\"" Sep 13 02:28:05.011387 containerd[1607]: time="2025-09-13T02:28:05.007256304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b6cb57bfd-zmfgm,Uid:ea298ee1-7ac7-4d0b-b9d1-ba1d4e4a4647,Namespace:calico-system,Attempt:0,} returns sandbox id \"f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056\"" Sep 13 02:28:05.015561 systemd-networkd[1500]: vxlan.calico: Gained IPv6LL Sep 13 02:28:05.048719 systemd-networkd[1500]: calic69f8a54aaa: Link UP Sep 13 02:28:05.053684 systemd-networkd[1500]: calic69f8a54aaa: Gained carrier Sep 13 02:28:05.092685 containerd[1607]: time="2025-09-13T02:28:05.092611956Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:05.094902 containerd[1607]: time="2025-09-13T02:28:05.094155094Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 13 02:28:05.096302 containerd[1607]: time="2025-09-13T02:28:05.095908058Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:05.105831 containerd[1607]: time="2025-09-13T02:28:05.105429895Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:05.111189 containerd[1607]: time="2025-09-13T02:28:05.110526495Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.865762213s" Sep 13 02:28:05.111189 containerd[1607]: time="2025-09-13T02:28:05.111121111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 13 02:28:05.115705 containerd[1607]: time="2025-09-13T02:28:05.115398393Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.777 [INFO][4629] cni-plugin/plugin.go 340: Calico CNI found existing 
endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0 coredns-7c65d6cfc9- kube-system cea724f8-4020-4ab6-998d-91057a769d18 820 0 2025-09-13 02:27:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-1di1n.gb1.brightbox.com coredns-7c65d6cfc9-4ljgx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic69f8a54aaa [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Namespace="kube-system" Pod="coredns-7c65d6cfc9-4ljgx" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.777 [INFO][4629] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Namespace="kube-system" Pod="coredns-7c65d6cfc9-4ljgx" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.932 [INFO][4662] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" HandleID="k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Workload="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.932 [INFO][4662] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" HandleID="k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Workload="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4130), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-1di1n.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-4ljgx", "timestamp":"2025-09-13 02:28:04.932330526 +0000 UTC"}, Hostname:"srv-1di1n.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.932 [INFO][4662] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.933 [INFO][4662] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
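The WorkloadEndpoint names printed throughout this log (e.g. srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0 just above) are consistent with doubling every "-" in the node and pod names and joining the pieces with "-k8s-" and "-". The sketch below only reproduces that observed pattern; it is not Calico's code, and the escaping rule is inferred from the names in this log:

    package main

    import (
    	"fmt"
    	"strings"
    )

    func main() {
    	node := "srv-1di1n.gb1.brightbox.com"
    	pod := "coredns-7c65d6cfc9-4ljgx"
    	iface := "eth0"
    	// Observed pattern: each "-" in node and pod names becomes "--".
    	esc := func(s string) string { return strings.ReplaceAll(s, "-", "--") }
    	fmt.Println(esc(node) + "-k8s-" + esc(pod) + "-" + iface)
    	// srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0
    }
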
Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.933 [INFO][4662] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1di1n.gb1.brightbox.com' Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.946 [INFO][4662] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.956 [INFO][4662] ipam/ipam.go 394: Looking up existing affinities for host host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.971 [INFO][4662] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.980 [INFO][4662] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.985 [INFO][4662] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.986 [INFO][4662] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:04.990 [INFO][4662] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064 Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:05.000 [INFO][4662] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:05.025 [INFO][4662] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.71/26] block=192.168.56.64/26 handle="k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:05.027 [INFO][4662] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.71/26] handle="k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:05.027 [INFO][4662] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
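The "Successfully claimed IPs" messages above and below follow a fixed format, so the assigned addresses can be pulled out of a captured log with a simple pattern. A rough extraction sketch (the regular expression is based on the wording as it appears in this log and may need adjusting for other Calico versions):

    package main

    import (
    	"fmt"
    	"regexp"
    )

    func main() {
    	line := `ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.71/26] block=192.168.56.64/26`
    	re := regexp.MustCompile(`Successfully claimed IPs: \[([0-9./]+)\]`)
    	if m := re.FindStringSubmatch(line); m != nil {
    		fmt.Println(m[1]) // 192.168.56.71/26
    	}
    }
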
Sep 13 02:28:05.124650 containerd[1607]: 2025-09-13 02:28:05.028 [INFO][4662] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.71/26] IPv6=[] ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" HandleID="k8s-pod-network.c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Workload="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" Sep 13 02:28:05.128229 containerd[1607]: 2025-09-13 02:28:05.040 [INFO][4629] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Namespace="kube-system" Pod="coredns-7c65d6cfc9-4ljgx" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"cea724f8-4020-4ab6-998d-91057a769d18", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-4ljgx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.56.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic69f8a54aaa", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:05.128229 containerd[1607]: 2025-09-13 02:28:05.040 [INFO][4629] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.71/32] ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Namespace="kube-system" Pod="coredns-7c65d6cfc9-4ljgx" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" Sep 13 02:28:05.128229 containerd[1607]: 2025-09-13 02:28:05.040 [INFO][4629] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic69f8a54aaa ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Namespace="kube-system" Pod="coredns-7c65d6cfc9-4ljgx" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" Sep 13 02:28:05.128229 containerd[1607]: 2025-09-13 02:28:05.072 [INFO][4629] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-4ljgx" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" Sep 13 02:28:05.128229 containerd[1607]: 2025-09-13 02:28:05.080 [INFO][4629] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Namespace="kube-system" Pod="coredns-7c65d6cfc9-4ljgx" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"cea724f8-4020-4ab6-998d-91057a769d18", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064", Pod:"coredns-7c65d6cfc9-4ljgx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.56.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic69f8a54aaa", MAC:"86:13:87:98:78:16", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:05.128229 containerd[1607]: 2025-09-13 02:28:05.116 [INFO][4629] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" Namespace="kube-system" Pod="coredns-7c65d6cfc9-4ljgx" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--4ljgx-eth0" Sep 13 02:28:05.130111 containerd[1607]: time="2025-09-13T02:28:05.129155111Z" level=info msg="CreateContainer within sandbox \"6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 13 02:28:05.157669 containerd[1607]: time="2025-09-13T02:28:05.155719597Z" level=info msg="Container 7b805a652afccb4069df13b3528662133e0634f7e5f04e1dccdb92cbf3c5ec14: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:05.191112 containerd[1607]: time="2025-09-13T02:28:05.190476729Z" level=info msg="CreateContainer within sandbox \"6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"7b805a652afccb4069df13b3528662133e0634f7e5f04e1dccdb92cbf3c5ec14\"" Sep 13 
02:28:05.194516 containerd[1607]: time="2025-09-13T02:28:05.194444562Z" level=info msg="StartContainer for \"7b805a652afccb4069df13b3528662133e0634f7e5f04e1dccdb92cbf3c5ec14\"" Sep 13 02:28:05.204844 containerd[1607]: time="2025-09-13T02:28:05.204800692Z" level=info msg="connecting to shim 7b805a652afccb4069df13b3528662133e0634f7e5f04e1dccdb92cbf3c5ec14" address="unix:///run/containerd/s/b410d8422fce08f44b30c15436d48f045ffb6941926f31c5ddb4a3ec6bbc3032" protocol=ttrpc version=3 Sep 13 02:28:05.212480 containerd[1607]: time="2025-09-13T02:28:05.211323947Z" level=info msg="connecting to shim c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064" address="unix:///run/containerd/s/0294cf1d11b2860ea7ec9d45a2a7790a2fc9451a48b39fb0c8ea7f94532746e0" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:28:05.233094 systemd-networkd[1500]: cali9ceaef818a2: Link UP Sep 13 02:28:05.235571 systemd-networkd[1500]: cali9ceaef818a2: Gained carrier Sep 13 02:28:05.271686 systemd-networkd[1500]: cali8ccb48357a7: Gained IPv6LL Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:04.791 [INFO][4638] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0 calico-apiserver-68dfd7d7bf- calico-apiserver 67566ce3-7b09-4cb1-952c-0ee8a7201943 818 0 2025-09-13 02:27:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:68dfd7d7bf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-1di1n.gb1.brightbox.com calico-apiserver-68dfd7d7bf-8jk84 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9ceaef818a2 [] [] }} ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-8jk84" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:04.791 [INFO][4638] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-8jk84" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.038 [INFO][4667] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" HandleID="k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.041 [INFO][4667] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" HandleID="k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000321ac0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-1di1n.gb1.brightbox.com", "pod":"calico-apiserver-68dfd7d7bf-8jk84", "timestamp":"2025-09-13 02:28:05.038806483 +0000 UTC"}, 
Hostname:"srv-1di1n.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.041 [INFO][4667] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.041 [INFO][4667] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.041 [INFO][4667] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1di1n.gb1.brightbox.com' Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.099 [INFO][4667] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.132 [INFO][4667] ipam/ipam.go 394: Looking up existing affinities for host host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.150 [INFO][4667] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.160 [INFO][4667] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.170 [INFO][4667] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.170 [INFO][4667] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.177 [INFO][4667] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537 Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.194 [INFO][4667] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.212 [INFO][4667] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.72/26] block=192.168.56.64/26 handle="k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.213 [INFO][4667] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.72/26] handle="k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" host="srv-1di1n.gb1.brightbox.com" Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.213 [INFO][4667] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
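The coredns-7c65d6cfc9-4ljgx endpoint dumps above print the port numbers in hex (Port:0x35, Port:0x23c1), while the earlier plugin.go 340 line lists the same ports in decimal (dns UDP 53, dns-tcp TCP 53, metrics TCP 9153). Converting confirms they agree:

    package main

    import "fmt"

    func main() {
    	fmt.Println(0x35)   // 53   (dns and dns-tcp)
    	fmt.Println(0x23c1) // 9153 (metrics)
    }
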
Sep 13 02:28:05.298374 containerd[1607]: 2025-09-13 02:28:05.213 [INFO][4667] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.72/26] IPv6=[] ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" HandleID="k8s-pod-network.d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Workload="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" Sep 13 02:28:05.299820 containerd[1607]: 2025-09-13 02:28:05.220 [INFO][4638] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-8jk84" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0", GenerateName:"calico-apiserver-68dfd7d7bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"67566ce3-7b09-4cb1-952c-0ee8a7201943", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"68dfd7d7bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-68dfd7d7bf-8jk84", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9ceaef818a2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:05.299820 containerd[1607]: 2025-09-13 02:28:05.220 [INFO][4638] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.72/32] ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-8jk84" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" Sep 13 02:28:05.299820 containerd[1607]: 2025-09-13 02:28:05.221 [INFO][4638] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9ceaef818a2 ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-8jk84" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" Sep 13 02:28:05.299820 containerd[1607]: 2025-09-13 02:28:05.237 [INFO][4638] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-8jk84" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" Sep 13 02:28:05.299820 containerd[1607]: 2025-09-13 02:28:05.244 
[INFO][4638] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-8jk84" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0", GenerateName:"calico-apiserver-68dfd7d7bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"67566ce3-7b09-4cb1-952c-0ee8a7201943", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 2, 27, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"68dfd7d7bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1di1n.gb1.brightbox.com", ContainerID:"d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537", Pod:"calico-apiserver-68dfd7d7bf-8jk84", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9ceaef818a2", MAC:"b6:8a:b9:ea:a6:88", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 02:28:05.299820 containerd[1607]: 2025-09-13 02:28:05.271 [INFO][4638] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" Namespace="calico-apiserver" Pod="calico-apiserver-68dfd7d7bf-8jk84" WorkloadEndpoint="srv--1di1n.gb1.brightbox.com-k8s-calico--apiserver--68dfd7d7bf--8jk84-eth0" Sep 13 02:28:05.326477 systemd[1]: Started cri-containerd-7b805a652afccb4069df13b3528662133e0634f7e5f04e1dccdb92cbf3c5ec14.scope - libcontainer container 7b805a652afccb4069df13b3528662133e0634f7e5f04e1dccdb92cbf3c5ec14. Sep 13 02:28:05.355797 systemd[1]: Started cri-containerd-c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064.scope - libcontainer container c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064. Sep 13 02:28:05.398386 containerd[1607]: time="2025-09-13T02:28:05.397082364Z" level=info msg="connecting to shim d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537" address="unix:///run/containerd/s/845d7a3fd67574b5998ddb2e5876db699aa72b8a985f4bcf12394121e259a303" namespace=k8s.io protocol=ttrpc version=3 Sep 13 02:28:05.463508 systemd-networkd[1500]: cali8dba808d5ac: Gained IPv6LL Sep 13 02:28:05.479500 systemd[1]: Started cri-containerd-d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537.scope - libcontainer container d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537. 
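The systemd-networkd entries in this section show the per-pod veth devices named in the endpoint dumps (cali8ccb48357a7, cali8dba808d5ac, calic69f8a54aaa, cali9ceaef818a2) and the vxlan.calico overlay device gaining carrier and IPv6 link-local addresses. A small node-side sketch that lists such interfaces with the Go standard library (run on the node itself; purely illustrative):

    package main

    import (
    	"fmt"
    	"net"
    	"strings"
    )

    func main() {
    	ifaces, err := net.Interfaces()
    	if err != nil {
    		panic(err)
    	}
    	for _, ifc := range ifaces {
    		// Calico's host-side veths carry the "cali" prefix seen in this log.
    		if strings.HasPrefix(ifc.Name, "cali") || ifc.Name == "vxlan.calico" {
    			fmt.Printf("%-16s flags=%v\n", ifc.Name, ifc.Flags)
    		}
    	}
    }
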
Sep 13 02:28:05.556482 containerd[1607]: time="2025-09-13T02:28:05.554873621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-4ljgx,Uid:cea724f8-4020-4ab6-998d-91057a769d18,Namespace:kube-system,Attempt:0,} returns sandbox id \"c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064\"" Sep 13 02:28:05.561633 containerd[1607]: time="2025-09-13T02:28:05.560960405Z" level=info msg="CreateContainer within sandbox \"c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 02:28:05.581666 containerd[1607]: time="2025-09-13T02:28:05.580763072Z" level=info msg="Container a06448f1a012ac80cc78d34b3ee18f6738c8d92f898dac60f4040b75bb897a4b: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:05.612742 containerd[1607]: time="2025-09-13T02:28:05.612610167Z" level=info msg="CreateContainer within sandbox \"c1e247c12dbdbc703299a3fa47cb14989da47ecf38034610f686f6122fed5064\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a06448f1a012ac80cc78d34b3ee18f6738c8d92f898dac60f4040b75bb897a4b\"" Sep 13 02:28:05.619132 containerd[1607]: time="2025-09-13T02:28:05.619068286Z" level=info msg="StartContainer for \"a06448f1a012ac80cc78d34b3ee18f6738c8d92f898dac60f4040b75bb897a4b\"" Sep 13 02:28:05.623321 containerd[1607]: time="2025-09-13T02:28:05.622997739Z" level=info msg="connecting to shim a06448f1a012ac80cc78d34b3ee18f6738c8d92f898dac60f4040b75bb897a4b" address="unix:///run/containerd/s/0294cf1d11b2860ea7ec9d45a2a7790a2fc9451a48b39fb0c8ea7f94532746e0" protocol=ttrpc version=3 Sep 13 02:28:05.706671 systemd[1]: Started cri-containerd-a06448f1a012ac80cc78d34b3ee18f6738c8d92f898dac60f4040b75bb897a4b.scope - libcontainer container a06448f1a012ac80cc78d34b3ee18f6738c8d92f898dac60f4040b75bb897a4b. 
Sep 13 02:28:05.715692 containerd[1607]: time="2025-09-13T02:28:05.715559504Z" level=info msg="StartContainer for \"7b805a652afccb4069df13b3528662133e0634f7e5f04e1dccdb92cbf3c5ec14\" returns successfully" Sep 13 02:28:05.788170 containerd[1607]: time="2025-09-13T02:28:05.788111801Z" level=info msg="StartContainer for \"a06448f1a012ac80cc78d34b3ee18f6738c8d92f898dac60f4040b75bb897a4b\" returns successfully" Sep 13 02:28:05.890798 containerd[1607]: time="2025-09-13T02:28:05.890339791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-68dfd7d7bf-8jk84,Uid:67566ce3-7b09-4cb1-952c-0ee8a7201943,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537\"" Sep 13 02:28:06.058599 kubelet[2891]: I0913 02:28:06.058447 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-4ljgx" podStartSLOduration=47.058418483 podStartE2EDuration="47.058418483s" podCreationTimestamp="2025-09-13 02:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 02:28:06.056042223 +0000 UTC m=+52.823353011" watchObservedRunningTime="2025-09-13 02:28:06.058418483 +0000 UTC m=+52.825729268" Sep 13 02:28:06.168123 systemd-networkd[1500]: calic69f8a54aaa: Gained IPv6LL Sep 13 02:28:06.615894 systemd-networkd[1500]: cali9ceaef818a2: Gained IPv6LL Sep 13 02:28:07.176429 containerd[1607]: time="2025-09-13T02:28:07.176370983Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:07.177567 containerd[1607]: time="2025-09-13T02:28:07.177383567Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 13 02:28:07.178509 containerd[1607]: time="2025-09-13T02:28:07.178469755Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:07.181737 containerd[1607]: time="2025-09-13T02:28:07.181700688Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:07.182623 containerd[1607]: time="2025-09-13T02:28:07.182541992Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.067103975s" Sep 13 02:28:07.182623 containerd[1607]: time="2025-09-13T02:28:07.182588017Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 13 02:28:07.184548 containerd[1607]: time="2025-09-13T02:28:07.184451609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 02:28:07.188537 containerd[1607]: time="2025-09-13T02:28:07.188501190Z" level=info msg="CreateContainer within sandbox \"8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 13 02:28:07.204859 containerd[1607]: 
time="2025-09-13T02:28:07.204060188Z" level=info msg="Container cbeac5998cdd7e3c03f206ff77fe0ca6b8af0899266fbf051270781b6bf85dd6: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:07.211489 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount466102730.mount: Deactivated successfully. Sep 13 02:28:07.225797 containerd[1607]: time="2025-09-13T02:28:07.225744839Z" level=info msg="CreateContainer within sandbox \"8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"cbeac5998cdd7e3c03f206ff77fe0ca6b8af0899266fbf051270781b6bf85dd6\"" Sep 13 02:28:07.227607 containerd[1607]: time="2025-09-13T02:28:07.227579650Z" level=info msg="StartContainer for \"cbeac5998cdd7e3c03f206ff77fe0ca6b8af0899266fbf051270781b6bf85dd6\"" Sep 13 02:28:07.232105 containerd[1607]: time="2025-09-13T02:28:07.232065419Z" level=info msg="connecting to shim cbeac5998cdd7e3c03f206ff77fe0ca6b8af0899266fbf051270781b6bf85dd6" address="unix:///run/containerd/s/49c4a009abc1085a6a0f5890a1d9c56a9cf8fd461ceacc86a9aa03ac448bb36b" protocol=ttrpc version=3 Sep 13 02:28:07.274468 systemd[1]: Started cri-containerd-cbeac5998cdd7e3c03f206ff77fe0ca6b8af0899266fbf051270781b6bf85dd6.scope - libcontainer container cbeac5998cdd7e3c03f206ff77fe0ca6b8af0899266fbf051270781b6bf85dd6. Sep 13 02:28:07.334362 containerd[1607]: time="2025-09-13T02:28:07.334302178Z" level=info msg="StartContainer for \"cbeac5998cdd7e3c03f206ff77fe0ca6b8af0899266fbf051270781b6bf85dd6\" returns successfully" Sep 13 02:28:08.958666 containerd[1607]: time="2025-09-13T02:28:08.958583218Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\" id:\"ba8e3ab92a93f6666ffaf181d8126eba9ce0740d41c7909052ec749d6373bcb1\" pid:4950 exited_at:{seconds:1757730488 nanos:957194083}" Sep 13 02:28:11.594185 containerd[1607]: time="2025-09-13T02:28:11.594054179Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:11.595923 containerd[1607]: time="2025-09-13T02:28:11.595886052Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 13 02:28:11.596651 containerd[1607]: time="2025-09-13T02:28:11.596589746Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:11.599540 containerd[1607]: time="2025-09-13T02:28:11.599098781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:11.600144 containerd[1607]: time="2025-09-13T02:28:11.600107447Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 4.415421299s" Sep 13 02:28:11.600223 containerd[1607]: time="2025-09-13T02:28:11.600148174Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference 
\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 13 02:28:11.601782 containerd[1607]: time="2025-09-13T02:28:11.601747908Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 13 02:28:11.606141 containerd[1607]: time="2025-09-13T02:28:11.606104858Z" level=info msg="CreateContainer within sandbox \"f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 13 02:28:11.616484 containerd[1607]: time="2025-09-13T02:28:11.616436529Z" level=info msg="Container de714ba170fd416568fa5b0af6244e0c937a2ec01e844c88b8b5638d11d8fb5b: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:11.627370 containerd[1607]: time="2025-09-13T02:28:11.627314165Z" level=info msg="CreateContainer within sandbox \"f33f3a76b8367b7c927cd7da133f74946b8d9e07f98fdb9afdb765f3e727b492\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"de714ba170fd416568fa5b0af6244e0c937a2ec01e844c88b8b5638d11d8fb5b\"" Sep 13 02:28:11.628906 containerd[1607]: time="2025-09-13T02:28:11.628844896Z" level=info msg="StartContainer for \"de714ba170fd416568fa5b0af6244e0c937a2ec01e844c88b8b5638d11d8fb5b\"" Sep 13 02:28:11.630221 containerd[1607]: time="2025-09-13T02:28:11.630174708Z" level=info msg="connecting to shim de714ba170fd416568fa5b0af6244e0c937a2ec01e844c88b8b5638d11d8fb5b" address="unix:///run/containerd/s/189e3681f18deb8626b69c86967ded6dc7587abb74e35b90803df1ead521e5dc" protocol=ttrpc version=3 Sep 13 02:28:11.677663 systemd[1]: Started cri-containerd-de714ba170fd416568fa5b0af6244e0c937a2ec01e844c88b8b5638d11d8fb5b.scope - libcontainer container de714ba170fd416568fa5b0af6244e0c937a2ec01e844c88b8b5638d11d8fb5b. Sep 13 02:28:11.764428 containerd[1607]: time="2025-09-13T02:28:11.764381500Z" level=info msg="StartContainer for \"de714ba170fd416568fa5b0af6244e0c937a2ec01e844c88b8b5638d11d8fb5b\" returns successfully" Sep 13 02:28:13.627439 kubelet[2891]: I0913 02:28:13.624607 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-68dfd7d7bf-cfxb9" podStartSLOduration=36.396197468 podStartE2EDuration="44.624583455s" podCreationTimestamp="2025-09-13 02:27:29 +0000 UTC" firstStartedPulling="2025-09-13 02:28:03.373010619 +0000 UTC m=+50.140321391" lastFinishedPulling="2025-09-13 02:28:11.601396591 +0000 UTC m=+58.368707378" observedRunningTime="2025-09-13 02:28:12.096075355 +0000 UTC m=+58.863386152" watchObservedRunningTime="2025-09-13 02:28:13.624583455 +0000 UTC m=+60.391894236" Sep 13 02:28:15.856920 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3908826114.mount: Deactivated successfully. 
Sep 13 02:28:17.253841 containerd[1607]: time="2025-09-13T02:28:17.253760485Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:17.268426 containerd[1607]: time="2025-09-13T02:28:17.268373758Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 13 02:28:17.324359 containerd[1607]: time="2025-09-13T02:28:17.322344946Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:17.325459 containerd[1607]: time="2025-09-13T02:28:17.325422952Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:17.328082 containerd[1607]: time="2025-09-13T02:28:17.328044568Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 5.725268408s" Sep 13 02:28:17.328195 containerd[1607]: time="2025-09-13T02:28:17.328085457Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 13 02:28:17.330257 containerd[1607]: time="2025-09-13T02:28:17.330224451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 13 02:28:17.358539 containerd[1607]: time="2025-09-13T02:28:17.358261523Z" level=info msg="CreateContainer within sandbox \"0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 13 02:28:17.455303 containerd[1607]: time="2025-09-13T02:28:17.453408295Z" level=info msg="Container fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:17.538579 containerd[1607]: time="2025-09-13T02:28:17.538429171Z" level=info msg="CreateContainer within sandbox \"0845f5b44e56c4c7f9fc6be8852ebcaf1b1dea97d2cc03cd65c22ed781d3c7b9\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\"" Sep 13 02:28:17.539624 containerd[1607]: time="2025-09-13T02:28:17.539487782Z" level=info msg="StartContainer for \"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\"" Sep 13 02:28:17.546922 containerd[1607]: time="2025-09-13T02:28:17.546854968Z" level=info msg="connecting to shim fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6" address="unix:///run/containerd/s/af2f5cfb431f36280f7e62515d1a0d6de543f8c73cbbccab434953e0e37d1889" protocol=ttrpc version=3 Sep 13 02:28:17.734820 systemd[1]: Started cri-containerd-fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6.scope - libcontainer container fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6. 
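
The "Pulled image ... in 5.725268408s" entry for goldmane also gives enough to estimate the effective transfer rate, since containerd prints the image size alongside the wall time (plain arithmetic on the logged values, nothing more):

    size_bytes = 66357372       # size reported for ghcr.io/flatcar/calico/goldmane:v3.30.3
    elapsed_s  = 5.725268408    # "... in 5.725268408s"
    print(f"{size_bytes / elapsed_s / 1e6:.1f} MB/s")   # ~11.6 MB/s
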
Sep 13 02:28:18.037258 containerd[1607]: time="2025-09-13T02:28:18.037212156Z" level=info msg="StartContainer for \"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" returns successfully" Sep 13 02:28:18.245295 kubelet[2891]: I0913 02:28:18.244527 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-dzch9" podStartSLOduration=33.758979443 podStartE2EDuration="46.229962331s" podCreationTimestamp="2025-09-13 02:27:32 +0000 UTC" firstStartedPulling="2025-09-13 02:28:04.858089316 +0000 UTC m=+51.625400082" lastFinishedPulling="2025-09-13 02:28:17.329072177 +0000 UTC m=+64.096382970" observedRunningTime="2025-09-13 02:28:18.217576122 +0000 UTC m=+64.984886931" watchObservedRunningTime="2025-09-13 02:28:18.229962331 +0000 UTC m=+64.997273110" Sep 13 02:28:18.604923 containerd[1607]: time="2025-09-13T02:28:18.604871523Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" id:\"cc14df4e984082ecad5fb679ab603bdd21cb6feb2ba0fd089666dec9f0288cbf\" pid:5083 exit_status:1 exited_at:{seconds:1757730498 nanos:566876556}" Sep 13 02:28:19.430320 containerd[1607]: time="2025-09-13T02:28:19.428566049Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" id:\"8a970e228bdbdf9d7b524ff19db6475b7c7f69c903513d3c16eb76fd9796b554\" pid:5110 exit_status:1 exited_at:{seconds:1757730499 nanos:427900846}" Sep 13 02:28:20.176751 containerd[1607]: time="2025-09-13T02:28:20.176476085Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" id:\"62b2b5453436d98ed21a109065788b43ea31eef7c1ad8ad6c020aa9ce74f1adf\" pid:5136 exit_status:1 exited_at:{seconds:1757730500 nanos:173366074}" Sep 13 02:28:23.556968 containerd[1607]: time="2025-09-13T02:28:23.556901021Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:23.559836 containerd[1607]: time="2025-09-13T02:28:23.559797805Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 13 02:28:23.577288 containerd[1607]: time="2025-09-13T02:28:23.576136015Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:23.583898 containerd[1607]: time="2025-09-13T02:28:23.583465723Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:23.586283 containerd[1607]: time="2025-09-13T02:28:23.584560344Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 6.254296254s" Sep 13 02:28:23.586283 containerd[1607]: time="2025-09-13T02:28:23.584610065Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference 
\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 13 02:28:23.599940 containerd[1607]: time="2025-09-13T02:28:23.599876661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 13 02:28:23.693090 containerd[1607]: time="2025-09-13T02:28:23.692996046Z" level=info msg="CreateContainer within sandbox \"f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 13 02:28:23.719588 containerd[1607]: time="2025-09-13T02:28:23.719521291Z" level=info msg="Container 0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:23.736650 containerd[1607]: time="2025-09-13T02:28:23.736596147Z" level=info msg="CreateContainer within sandbox \"f62fcac2abcfd99471df61132c10616227b8c9ecab09149eabac401d23982056\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918\"" Sep 13 02:28:23.739494 containerd[1607]: time="2025-09-13T02:28:23.738588202Z" level=info msg="StartContainer for \"0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918\"" Sep 13 02:28:23.756675 containerd[1607]: time="2025-09-13T02:28:23.756585980Z" level=info msg="connecting to shim 0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918" address="unix:///run/containerd/s/fe1d490620715e40a37ed65484c3b6e1cbe2ef76ed495243f1dc4eedb5126c29" protocol=ttrpc version=3 Sep 13 02:28:23.895536 systemd[1]: Started cri-containerd-0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918.scope - libcontainer container 0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918. Sep 13 02:28:24.069952 containerd[1607]: time="2025-09-13T02:28:24.069794963Z" level=info msg="StartContainer for \"0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918\" returns successfully" Sep 13 02:28:25.651472 containerd[1607]: time="2025-09-13T02:28:25.651113228Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918\" id:\"913b46e7d74d872d97493f86e709866204ce2be9979afaae77bfecf3e3c1ccc3\" pid:5213 exited_at:{seconds:1757730505 nanos:647752846}" Sep 13 02:28:25.724181 kubelet[2891]: I0913 02:28:25.720422 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7b6cb57bfd-zmfgm" podStartSLOduration=34.126165801 podStartE2EDuration="52.711075091s" podCreationTimestamp="2025-09-13 02:27:33 +0000 UTC" firstStartedPulling="2025-09-13 02:28:05.01460367 +0000 UTC m=+51.781914444" lastFinishedPulling="2025-09-13 02:28:23.599512968 +0000 UTC m=+70.366823734" observedRunningTime="2025-09-13 02:28:24.510532815 +0000 UTC m=+71.277843624" watchObservedRunningTime="2025-09-13 02:28:25.711075091 +0000 UTC m=+72.478385873" Sep 13 02:28:28.716953 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1678076113.mount: Deactivated successfully. 
Sep 13 02:28:28.749701 containerd[1607]: time="2025-09-13T02:28:28.749633871Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:28.751505 containerd[1607]: time="2025-09-13T02:28:28.751413909Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 13 02:28:28.754287 containerd[1607]: time="2025-09-13T02:28:28.752894320Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:28.756938 containerd[1607]: time="2025-09-13T02:28:28.756253635Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:28.757321 containerd[1607]: time="2025-09-13T02:28:28.757013214Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 5.157073714s" Sep 13 02:28:28.757407 containerd[1607]: time="2025-09-13T02:28:28.757328352Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 13 02:28:28.761377 containerd[1607]: time="2025-09-13T02:28:28.761153870Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 02:28:28.766765 containerd[1607]: time="2025-09-13T02:28:28.766725800Z" level=info msg="CreateContainer within sandbox \"6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 13 02:28:28.786767 containerd[1607]: time="2025-09-13T02:28:28.786425762Z" level=info msg="Container 0e25c1de23e921be4385c9931efeea3315c0d183b1ac4e08b90e9e854b85e6b0: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:28.802512 containerd[1607]: time="2025-09-13T02:28:28.802444855Z" level=info msg="CreateContainer within sandbox \"6012e3eaae86e89015e07c55c8829ed8df261fdb86ab72a7a1ce097bdd851a21\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"0e25c1de23e921be4385c9931efeea3315c0d183b1ac4e08b90e9e854b85e6b0\"" Sep 13 02:28:28.803812 containerd[1607]: time="2025-09-13T02:28:28.803538532Z" level=info msg="StartContainer for \"0e25c1de23e921be4385c9931efeea3315c0d183b1ac4e08b90e9e854b85e6b0\"" Sep 13 02:28:28.805733 containerd[1607]: time="2025-09-13T02:28:28.805600299Z" level=info msg="connecting to shim 0e25c1de23e921be4385c9931efeea3315c0d183b1ac4e08b90e9e854b85e6b0" address="unix:///run/containerd/s/b410d8422fce08f44b30c15436d48f045ffb6941926f31c5ddb4a3ec6bbc3032" protocol=ttrpc version=3 Sep 13 02:28:28.859789 systemd[1]: Started cri-containerd-0e25c1de23e921be4385c9931efeea3315c0d183b1ac4e08b90e9e854b85e6b0.scope - libcontainer container 0e25c1de23e921be4385c9931efeea3315c0d183b1ac4e08b90e9e854b85e6b0. 
Sep 13 02:28:29.033733 containerd[1607]: time="2025-09-13T02:28:29.033455815Z" level=info msg="StartContainer for \"0e25c1de23e921be4385c9931efeea3315c0d183b1ac4e08b90e9e854b85e6b0\" returns successfully" Sep 13 02:28:29.310863 containerd[1607]: time="2025-09-13T02:28:29.310690636Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:29.320302 containerd[1607]: time="2025-09-13T02:28:29.319518107Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 13 02:28:29.323692 containerd[1607]: time="2025-09-13T02:28:29.323483772Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 562.290161ms" Sep 13 02:28:29.323692 containerd[1607]: time="2025-09-13T02:28:29.323563426Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 13 02:28:29.325455 containerd[1607]: time="2025-09-13T02:28:29.325418871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 13 02:28:29.327445 containerd[1607]: time="2025-09-13T02:28:29.327006641Z" level=info msg="CreateContainer within sandbox \"d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 13 02:28:29.358630 containerd[1607]: time="2025-09-13T02:28:29.358567817Z" level=info msg="Container d1b33c7729835b18bb0799769e8cc623485f057c8948db084d845a0d85668f65: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:29.380733 containerd[1607]: time="2025-09-13T02:28:29.380681654Z" level=info msg="CreateContainer within sandbox \"d2711e5dfe7ea15ee0ff5323c9596561d74c6b86e37461938530ad61ffe9c537\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d1b33c7729835b18bb0799769e8cc623485f057c8948db084d845a0d85668f65\"" Sep 13 02:28:29.383498 containerd[1607]: time="2025-09-13T02:28:29.383467552Z" level=info msg="StartContainer for \"d1b33c7729835b18bb0799769e8cc623485f057c8948db084d845a0d85668f65\"" Sep 13 02:28:29.387828 containerd[1607]: time="2025-09-13T02:28:29.387788431Z" level=info msg="connecting to shim d1b33c7729835b18bb0799769e8cc623485f057c8948db084d845a0d85668f65" address="unix:///run/containerd/s/845d7a3fd67574b5998ddb2e5876db699aa72b8a985f4bcf12394121e259a303" protocol=ttrpc version=3 Sep 13 02:28:29.436791 systemd[1]: Started cri-containerd-d1b33c7729835b18bb0799769e8cc623485f057c8948db084d845a0d85668f65.scope - libcontainer container d1b33c7729835b18bb0799769e8cc623485f057c8948db084d845a0d85668f65. 
Sep 13 02:28:29.792483 containerd[1607]: time="2025-09-13T02:28:29.792395886Z" level=info msg="StartContainer for \"d1b33c7729835b18bb0799769e8cc623485f057c8948db084d845a0d85668f65\" returns successfully" Sep 13 02:28:30.595194 kubelet[2891]: I0913 02:28:30.585682 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5754459988-gpp8n" podStartSLOduration=4.054656283 podStartE2EDuration="30.571302143s" podCreationTimestamp="2025-09-13 02:28:00 +0000 UTC" firstStartedPulling="2025-09-13 02:28:02.243819386 +0000 UTC m=+49.011130152" lastFinishedPulling="2025-09-13 02:28:28.76046524 +0000 UTC m=+75.527776012" observedRunningTime="2025-09-13 02:28:29.546813207 +0000 UTC m=+76.314124007" watchObservedRunningTime="2025-09-13 02:28:30.571302143 +0000 UTC m=+77.338612915" Sep 13 02:28:32.516418 kubelet[2891]: I0913 02:28:32.516372 2891 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 13 02:28:32.532865 containerd[1607]: time="2025-09-13T02:28:32.532803262Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:32.535635 containerd[1607]: time="2025-09-13T02:28:32.535372819Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 13 02:28:32.536770 containerd[1607]: time="2025-09-13T02:28:32.536408422Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:32.541309 containerd[1607]: time="2025-09-13T02:28:32.540159934Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 02:28:32.542893 containerd[1607]: time="2025-09-13T02:28:32.542843725Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 3.217383824s" Sep 13 02:28:32.545401 containerd[1607]: time="2025-09-13T02:28:32.542890910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 13 02:28:32.557643 containerd[1607]: time="2025-09-13T02:28:32.557579364Z" level=info msg="CreateContainer within sandbox \"8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 13 02:28:32.620621 containerd[1607]: time="2025-09-13T02:28:32.620542310Z" level=info msg="Container cc0aa626e0ed41f44630b3dc2a2fbb5ce4084f2e9674b1e14faf1d241d377d58: CDI devices from CRI Config.CDIDevices: []" Sep 13 02:28:32.637514 containerd[1607]: time="2025-09-13T02:28:32.637459966Z" level=info msg="CreateContainer within sandbox \"8a259e64de4cae6486546eabfba6be123a573332a4b49e88d4ae0a77a9748ad1\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"cc0aa626e0ed41f44630b3dc2a2fbb5ce4084f2e9674b1e14faf1d241d377d58\"" Sep 
13 02:28:32.640576 containerd[1607]: time="2025-09-13T02:28:32.640543740Z" level=info msg="StartContainer for \"cc0aa626e0ed41f44630b3dc2a2fbb5ce4084f2e9674b1e14faf1d241d377d58\"" Sep 13 02:28:32.644673 containerd[1607]: time="2025-09-13T02:28:32.644436526Z" level=info msg="connecting to shim cc0aa626e0ed41f44630b3dc2a2fbb5ce4084f2e9674b1e14faf1d241d377d58" address="unix:///run/containerd/s/49c4a009abc1085a6a0f5890a1d9c56a9cf8fd461ceacc86a9aa03ac448bb36b" protocol=ttrpc version=3 Sep 13 02:28:32.722753 systemd[1]: Started cri-containerd-cc0aa626e0ed41f44630b3dc2a2fbb5ce4084f2e9674b1e14faf1d241d377d58.scope - libcontainer container cc0aa626e0ed41f44630b3dc2a2fbb5ce4084f2e9674b1e14faf1d241d377d58. Sep 13 02:28:33.133572 containerd[1607]: time="2025-09-13T02:28:33.133496245Z" level=info msg="StartContainer for \"cc0aa626e0ed41f44630b3dc2a2fbb5ce4084f2e9674b1e14faf1d241d377d58\" returns successfully" Sep 13 02:28:33.717079 kubelet[2891]: I0913 02:28:33.717004 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-68dfd7d7bf-8jk84" podStartSLOduration=41.285382162 podStartE2EDuration="1m4.716982404s" podCreationTimestamp="2025-09-13 02:27:29 +0000 UTC" firstStartedPulling="2025-09-13 02:28:05.8932297 +0000 UTC m=+52.660540474" lastFinishedPulling="2025-09-13 02:28:29.324829948 +0000 UTC m=+76.092140716" observedRunningTime="2025-09-13 02:28:30.597592161 +0000 UTC m=+77.364902941" watchObservedRunningTime="2025-09-13 02:28:33.716982404 +0000 UTC m=+80.484293180" Sep 13 02:28:33.717712 kubelet[2891]: I0913 02:28:33.717580 2891 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-f4w47" podStartSLOduration=31.329931491 podStartE2EDuration="1m0.717263236s" podCreationTimestamp="2025-09-13 02:27:33 +0000 UTC" firstStartedPulling="2025-09-13 02:28:03.159464578 +0000 UTC m=+49.926775345" lastFinishedPulling="2025-09-13 02:28:32.546796316 +0000 UTC m=+79.314107090" observedRunningTime="2025-09-13 02:28:33.716239058 +0000 UTC m=+80.483549877" watchObservedRunningTime="2025-09-13 02:28:33.717263236 +0000 UTC m=+80.484574016" Sep 13 02:28:33.869170 kubelet[2891]: I0913 02:28:33.865884 2891 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 13 02:28:33.880834 kubelet[2891]: I0913 02:28:33.880697 2891 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 13 02:28:39.184772 containerd[1607]: time="2025-09-13T02:28:39.184691704Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\" id:\"9a6b2431d507f653cf7e12bbcba9c5ec4a06d205a0c3c140847901c12727f394\" pid:5361 exited_at:{seconds:1757730519 nanos:176130585}" Sep 13 02:28:49.818347 containerd[1607]: time="2025-09-13T02:28:49.818245102Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918\" id:\"a20cc508622cc31746d9c6c0129e05ea0c9370adaec76d3936ed50aad1e2ecfd\" pid:5416 exited_at:{seconds:1757730529 nanos:817419868}" Sep 13 02:28:50.309653 containerd[1607]: time="2025-09-13T02:28:50.309538867Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" 
id:\"7f752027623293850f66fcbd8adcef8020254b5b578921f036d680e7a7f23051\" pid:5437 exited_at:{seconds:1757730530 nanos:308198630}" Sep 13 02:28:51.082518 systemd[1]: Started sshd@9-10.230.67.142:22-139.178.89.65:36696.service - OpenSSH per-connection server daemon (139.178.89.65:36696). Sep 13 02:28:52.179598 sshd[5455]: Accepted publickey for core from 139.178.89.65 port 36696 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:28:52.185040 sshd-session[5455]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:28:52.197246 systemd-logind[1589]: New session 12 of user core. Sep 13 02:28:52.205316 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 13 02:28:53.492290 sshd[5457]: Connection closed by 139.178.89.65 port 36696 Sep 13 02:28:53.492037 sshd-session[5455]: pam_unix(sshd:session): session closed for user core Sep 13 02:28:53.506898 systemd[1]: sshd@9-10.230.67.142:22-139.178.89.65:36696.service: Deactivated successfully. Sep 13 02:28:53.510207 systemd[1]: session-12.scope: Deactivated successfully. Sep 13 02:28:53.512050 systemd-logind[1589]: Session 12 logged out. Waiting for processes to exit. Sep 13 02:28:53.514203 systemd-logind[1589]: Removed session 12. Sep 13 02:28:56.346236 containerd[1607]: time="2025-09-13T02:28:56.346177982Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" id:\"56e50c9d3e1f87840035fd259b1c6fef130714fc733cc473ee6280a2f2ea0c93\" pid:5481 exited_at:{seconds:1757730536 nanos:343663549}" Sep 13 02:28:58.653550 systemd[1]: Started sshd@10-10.230.67.142:22-139.178.89.65:36708.service - OpenSSH per-connection server daemon (139.178.89.65:36708). Sep 13 02:28:59.664907 sshd[5492]: Accepted publickey for core from 139.178.89.65 port 36708 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:28:59.667905 sshd-session[5492]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:28:59.678256 systemd-logind[1589]: New session 13 of user core. Sep 13 02:28:59.685648 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 13 02:29:00.557098 sshd[5494]: Connection closed by 139.178.89.65 port 36708 Sep 13 02:29:00.559409 sshd-session[5492]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:00.564927 systemd-logind[1589]: Session 13 logged out. Waiting for processes to exit. Sep 13 02:29:00.566352 systemd[1]: sshd@10-10.230.67.142:22-139.178.89.65:36708.service: Deactivated successfully. Sep 13 02:29:00.571434 systemd[1]: session-13.scope: Deactivated successfully. Sep 13 02:29:00.576395 systemd-logind[1589]: Removed session 13. Sep 13 02:29:05.715642 systemd[1]: Started sshd@11-10.230.67.142:22-139.178.89.65:42248.service - OpenSSH per-connection server daemon (139.178.89.65:42248). Sep 13 02:29:06.707295 sshd[5508]: Accepted publickey for core from 139.178.89.65 port 42248 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:06.710130 sshd-session[5508]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:06.723390 systemd-logind[1589]: New session 14 of user core. Sep 13 02:29:06.732024 systemd[1]: Started session-14.scope - Session 14 of User core. 
Sep 13 02:29:07.501923 sshd[5510]: Connection closed by 139.178.89.65 port 42248 Sep 13 02:29:07.503208 sshd-session[5508]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:07.508943 systemd[1]: sshd@11-10.230.67.142:22-139.178.89.65:42248.service: Deactivated successfully. Sep 13 02:29:07.512900 systemd[1]: session-14.scope: Deactivated successfully. Sep 13 02:29:07.515429 systemd-logind[1589]: Session 14 logged out. Waiting for processes to exit. Sep 13 02:29:07.518750 systemd-logind[1589]: Removed session 14. Sep 13 02:29:07.657573 systemd[1]: Started sshd@12-10.230.67.142:22-139.178.89.65:42262.service - OpenSSH per-connection server daemon (139.178.89.65:42262). Sep 13 02:29:08.568165 sshd[5523]: Accepted publickey for core from 139.178.89.65 port 42262 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:08.570107 sshd-session[5523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:08.578793 systemd-logind[1589]: New session 15 of user core. Sep 13 02:29:08.586489 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 13 02:29:09.281349 containerd[1607]: time="2025-09-13T02:29:09.281282073Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\" id:\"c9aeebd6cd8238e7066eb56c50b4474859f9aba38ca6783dec2ffdff18d026cc\" pid:5539 exited_at:{seconds:1757730549 nanos:280312856}" Sep 13 02:29:09.510779 sshd[5525]: Connection closed by 139.178.89.65 port 42262 Sep 13 02:29:09.515659 sshd-session[5523]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:09.536927 systemd[1]: sshd@12-10.230.67.142:22-139.178.89.65:42262.service: Deactivated successfully. Sep 13 02:29:09.541889 systemd[1]: session-15.scope: Deactivated successfully. Sep 13 02:29:09.547354 systemd-logind[1589]: Session 15 logged out. Waiting for processes to exit. Sep 13 02:29:09.549799 systemd-logind[1589]: Removed session 15. Sep 13 02:29:09.668629 systemd[1]: Started sshd@13-10.230.67.142:22-139.178.89.65:42278.service - OpenSSH per-connection server daemon (139.178.89.65:42278). Sep 13 02:29:10.646951 sshd[5563]: Accepted publickey for core from 139.178.89.65 port 42278 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:10.649718 sshd-session[5563]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:10.657665 systemd-logind[1589]: New session 16 of user core. Sep 13 02:29:10.665678 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 13 02:29:11.357401 sshd[5565]: Connection closed by 139.178.89.65 port 42278 Sep 13 02:29:11.357451 sshd-session[5563]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:11.362535 systemd[1]: sshd@13-10.230.67.142:22-139.178.89.65:42278.service: Deactivated successfully. Sep 13 02:29:11.364939 systemd[1]: session-16.scope: Deactivated successfully. Sep 13 02:29:11.366789 systemd-logind[1589]: Session 16 logged out. Waiting for processes to exit. Sep 13 02:29:11.368767 systemd-logind[1589]: Removed session 16. Sep 13 02:29:16.512221 systemd[1]: Started sshd@14-10.230.67.142:22-139.178.89.65:46760.service - OpenSSH per-connection server daemon (139.178.89.65:46760). 
Sep 13 02:29:17.432975 sshd[5580]: Accepted publickey for core from 139.178.89.65 port 46760 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:17.434921 sshd-session[5580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:17.443343 systemd-logind[1589]: New session 17 of user core. Sep 13 02:29:17.448460 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 13 02:29:18.145912 sshd[5582]: Connection closed by 139.178.89.65 port 46760 Sep 13 02:29:18.147376 sshd-session[5580]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:18.153610 systemd[1]: sshd@14-10.230.67.142:22-139.178.89.65:46760.service: Deactivated successfully. Sep 13 02:29:18.158244 systemd[1]: session-17.scope: Deactivated successfully. Sep 13 02:29:18.159823 systemd-logind[1589]: Session 17 logged out. Waiting for processes to exit. Sep 13 02:29:18.161913 systemd-logind[1589]: Removed session 17. Sep 13 02:29:18.363966 containerd[1607]: time="2025-09-13T02:29:18.354405902Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918\" id:\"f67c295b74e271a5283334e772134c62b3bff50871a854f0dff675125d9008df\" pid:5606 exited_at:{seconds:1757730558 nanos:353825446}" Sep 13 02:29:19.743976 containerd[1607]: time="2025-09-13T02:29:19.743905513Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918\" id:\"bafd821e02244c32a917cc629802783ec8da742a4511e3162cdb7805cc33b3a5\" pid:5635 exited_at:{seconds:1757730559 nanos:743522176}" Sep 13 02:29:19.921254 containerd[1607]: time="2025-09-13T02:29:19.921196253Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" id:\"58376b36087e3ff04c00d30529e6fde1a5d40c463a541487b9c7b5ff7bf7518e\" pid:5656 exited_at:{seconds:1757730559 nanos:920423473}" Sep 13 02:29:23.309061 systemd[1]: Started sshd@15-10.230.67.142:22-139.178.89.65:53028.service - OpenSSH per-connection server daemon (139.178.89.65:53028). Sep 13 02:29:24.311308 sshd[5670]: Accepted publickey for core from 139.178.89.65 port 53028 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:24.316323 sshd-session[5670]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:24.327584 systemd-logind[1589]: New session 18 of user core. Sep 13 02:29:24.336800 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 13 02:29:25.395656 sshd[5672]: Connection closed by 139.178.89.65 port 53028 Sep 13 02:29:25.396489 sshd-session[5670]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:25.403658 systemd[1]: sshd@15-10.230.67.142:22-139.178.89.65:53028.service: Deactivated successfully. Sep 13 02:29:25.406998 systemd[1]: session-18.scope: Deactivated successfully. Sep 13 02:29:25.410885 systemd-logind[1589]: Session 18 logged out. Waiting for processes to exit. Sep 13 02:29:25.414407 systemd-logind[1589]: Removed session 18. Sep 13 02:29:30.551913 systemd[1]: Started sshd@16-10.230.67.142:22-139.178.89.65:34828.service - OpenSSH per-connection server daemon (139.178.89.65:34828). 
Sep 13 02:29:31.496975 sshd[5693]: Accepted publickey for core from 139.178.89.65 port 34828 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:31.500979 sshd-session[5693]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:31.516462 systemd-logind[1589]: New session 19 of user core. Sep 13 02:29:31.521450 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 13 02:29:32.446290 sshd[5695]: Connection closed by 139.178.89.65 port 34828 Sep 13 02:29:32.450031 sshd-session[5693]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:32.460498 systemd[1]: sshd@16-10.230.67.142:22-139.178.89.65:34828.service: Deactivated successfully. Sep 13 02:29:32.462352 systemd-logind[1589]: Session 19 logged out. Waiting for processes to exit. Sep 13 02:29:32.463873 systemd[1]: session-19.scope: Deactivated successfully. Sep 13 02:29:32.468146 systemd-logind[1589]: Removed session 19. Sep 13 02:29:32.606536 systemd[1]: Started sshd@17-10.230.67.142:22-139.178.89.65:34830.service - OpenSSH per-connection server daemon (139.178.89.65:34830). Sep 13 02:29:33.519708 sshd[5707]: Accepted publickey for core from 139.178.89.65 port 34830 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:33.522372 sshd-session[5707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:33.532568 systemd-logind[1589]: New session 20 of user core. Sep 13 02:29:33.539473 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 13 02:29:33.638944 update_engine[1593]: I20250913 02:29:33.638832 1593 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 13 02:29:33.638944 update_engine[1593]: I20250913 02:29:33.638942 1593 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 13 02:29:33.640958 update_engine[1593]: I20250913 02:29:33.640920 1593 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 13 02:29:33.642365 update_engine[1593]: I20250913 02:29:33.642333 1593 omaha_request_params.cc:62] Current group set to beta Sep 13 02:29:33.642594 update_engine[1593]: I20250913 02:29:33.642549 1593 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 13 02:29:33.642594 update_engine[1593]: I20250913 02:29:33.642571 1593 update_attempter.cc:643] Scheduling an action processor start. 
Sep 13 02:29:33.643055 update_engine[1593]: I20250913 02:29:33.642599 1593 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 13 02:29:33.644988 update_engine[1593]: I20250913 02:29:33.644026 1593 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 13 02:29:33.644988 update_engine[1593]: I20250913 02:29:33.644125 1593 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 13 02:29:33.644988 update_engine[1593]: I20250913 02:29:33.644142 1593 omaha_request_action.cc:272] Request: Sep 13 02:29:33.644988 update_engine[1593]: Sep 13 02:29:33.644988 update_engine[1593]: Sep 13 02:29:33.644988 update_engine[1593]: Sep 13 02:29:33.644988 update_engine[1593]: Sep 13 02:29:33.644988 update_engine[1593]: Sep 13 02:29:33.644988 update_engine[1593]: Sep 13 02:29:33.644988 update_engine[1593]: Sep 13 02:29:33.644988 update_engine[1593]: Sep 13 02:29:33.644988 update_engine[1593]: I20250913 02:29:33.644153 1593 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 02:29:33.655486 update_engine[1593]: I20250913 02:29:33.655156 1593 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 02:29:33.656173 update_engine[1593]: I20250913 02:29:33.655582 1593 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 13 02:29:33.676107 update_engine[1593]: E20250913 02:29:33.676038 1593 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 02:29:33.676282 update_engine[1593]: I20250913 02:29:33.676165 1593 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 13 02:29:33.693554 locksmithd[1629]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 13 02:29:34.654589 sshd[5710]: Connection closed by 139.178.89.65 port 34830 Sep 13 02:29:34.656593 sshd-session[5707]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:34.680173 systemd[1]: sshd@17-10.230.67.142:22-139.178.89.65:34830.service: Deactivated successfully. Sep 13 02:29:34.687608 systemd[1]: session-20.scope: Deactivated successfully. Sep 13 02:29:34.689583 systemd-logind[1589]: Session 20 logged out. Waiting for processes to exit. Sep 13 02:29:34.693312 systemd-logind[1589]: Removed session 20. Sep 13 02:29:34.814544 systemd[1]: Started sshd@18-10.230.67.142:22-139.178.89.65:34836.service - OpenSSH per-connection server daemon (139.178.89.65:34836). Sep 13 02:29:35.790852 sshd[5721]: Accepted publickey for core from 139.178.89.65 port 34836 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:35.793676 sshd-session[5721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:35.801000 systemd-logind[1589]: New session 21 of user core. Sep 13 02:29:35.809451 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 13 02:29:39.919732 sshd[5736]: Connection closed by 139.178.89.65 port 34836 Sep 13 02:29:39.927306 sshd-session[5721]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:39.956054 systemd-logind[1589]: Session 21 logged out. Waiting for processes to exit. Sep 13 02:29:39.965980 systemd[1]: sshd@18-10.230.67.142:22-139.178.89.65:34836.service: Deactivated successfully. Sep 13 02:29:39.970752 systemd[1]: session-21.scope: Deactivated successfully. Sep 13 02:29:39.972787 systemd[1]: session-21.scope: Consumed 850ms CPU time, 84.3M memory peak. Sep 13 02:29:39.981129 systemd-logind[1589]: Removed session 21. 
Sep 13 02:29:40.092222 systemd[1]: Started sshd@19-10.230.67.142:22-139.178.89.65:34848.service - OpenSSH per-connection server daemon (139.178.89.65:34848). Sep 13 02:29:40.860508 containerd[1607]: time="2025-09-13T02:29:40.860433825Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\" id:\"45ab959674fca5c967008e4aeb4d7eaac88528941b8e299102f2103aca9fe653\" pid:5767 exited_at:{seconds:1757730580 nanos:614971484}" Sep 13 02:29:41.150300 sshd[5785]: Accepted publickey for core from 139.178.89.65 port 34848 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:41.154443 sshd-session[5785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:41.184578 systemd-logind[1589]: New session 22 of user core. Sep 13 02:29:41.189516 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 13 02:29:42.796312 sshd[5788]: Connection closed by 139.178.89.65 port 34848 Sep 13 02:29:42.798026 sshd-session[5785]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:42.808971 systemd[1]: sshd@19-10.230.67.142:22-139.178.89.65:34848.service: Deactivated successfully. Sep 13 02:29:42.809034 systemd-logind[1589]: Session 22 logged out. Waiting for processes to exit. Sep 13 02:29:42.813275 systemd[1]: session-22.scope: Deactivated successfully. Sep 13 02:29:42.815379 systemd[1]: session-22.scope: Consumed 594ms CPU time, 70M memory peak. Sep 13 02:29:42.819887 systemd-logind[1589]: Removed session 22. Sep 13 02:29:42.963602 systemd[1]: Started sshd@20-10.230.67.142:22-139.178.89.65:43774.service - OpenSSH per-connection server daemon (139.178.89.65:43774). Sep 13 02:29:43.568118 update_engine[1593]: I20250913 02:29:43.563390 1593 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 02:29:43.576019 update_engine[1593]: I20250913 02:29:43.575969 1593 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 02:29:43.576482 update_engine[1593]: I20250913 02:29:43.576432 1593 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 13 02:29:43.576927 update_engine[1593]: E20250913 02:29:43.576885 1593 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 02:29:43.577023 update_engine[1593]: I20250913 02:29:43.576987 1593 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 13 02:29:43.928306 sshd[5798]: Accepted publickey for core from 139.178.89.65 port 43774 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:43.929022 sshd-session[5798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:43.936230 systemd-logind[1589]: New session 23 of user core. Sep 13 02:29:43.944865 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 13 02:29:44.715923 sshd[5800]: Connection closed by 139.178.89.65 port 43774 Sep 13 02:29:44.717079 sshd-session[5798]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:44.726374 systemd[1]: sshd@20-10.230.67.142:22-139.178.89.65:43774.service: Deactivated successfully. Sep 13 02:29:44.731156 systemd[1]: session-23.scope: Deactivated successfully. Sep 13 02:29:44.734461 systemd-logind[1589]: Session 23 logged out. Waiting for processes to exit. Sep 13 02:29:44.739682 systemd-logind[1589]: Removed session 23. 
Sep 13 02:29:49.889380 systemd[1]: Started sshd@21-10.230.67.142:22-139.178.89.65:43776.service - OpenSSH per-connection server daemon (139.178.89.65:43776). Sep 13 02:29:50.100613 containerd[1607]: time="2025-09-13T02:29:50.094185713Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d0a1638bd2a8dbb61636340dec2ba2b05b3c3a29743fc59ca8d5a1bf762d918\" id:\"1b4232fac502ae2bf1253f7654ac55a56fc5abcfd5d2f18c22d3595cef3104ae\" pid:5837 exited_at:{seconds:1757730590 nanos:18080566}" Sep 13 02:29:50.297506 containerd[1607]: time="2025-09-13T02:29:50.297439615Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" id:\"abe0de5301111466ccd22106dc4766ef5631147335b2b3f33e0c30a6afbaf641\" pid:5848 exited_at:{seconds:1757730590 nanos:296847391}" Sep 13 02:29:50.952106 sshd[5831]: Accepted publickey for core from 139.178.89.65 port 43776 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:50.953932 sshd-session[5831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:50.967977 systemd-logind[1589]: New session 24 of user core. Sep 13 02:29:50.980640 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 13 02:29:52.002618 sshd[5866]: Connection closed by 139.178.89.65 port 43776 Sep 13 02:29:52.002450 sshd-session[5831]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:52.008866 systemd[1]: sshd@21-10.230.67.142:22-139.178.89.65:43776.service: Deactivated successfully. Sep 13 02:29:52.011900 systemd[1]: session-24.scope: Deactivated successfully. Sep 13 02:29:52.014112 systemd-logind[1589]: Session 24 logged out. Waiting for processes to exit. Sep 13 02:29:52.016849 systemd-logind[1589]: Removed session 24. Sep 13 02:29:53.553486 update_engine[1593]: I20250913 02:29:53.552551 1593 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 02:29:53.553486 update_engine[1593]: I20250913 02:29:53.552979 1593 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 02:29:53.553486 update_engine[1593]: I20250913 02:29:53.553417 1593 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 13 02:29:53.556496 update_engine[1593]: E20250913 02:29:53.556377 1593 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 02:29:53.556496 update_engine[1593]: I20250913 02:29:53.556452 1593 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 13 02:29:56.442477 containerd[1607]: time="2025-09-13T02:29:56.442314193Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb6383658f08c103829adf23b0e284c155de395d6970b28b928807773ddfe6b6\" id:\"1af7c008961f8787ca24bf6a08597fae1490df9fb682d27fd4ca98ceddde3f36\" pid:5888 exited_at:{seconds:1757730596 nanos:441729256}" Sep 13 02:29:57.157859 systemd[1]: Started sshd@22-10.230.67.142:22-139.178.89.65:54342.service - OpenSSH per-connection server daemon (139.178.89.65:54342). Sep 13 02:29:58.099522 sshd[5899]: Accepted publickey for core from 139.178.89.65 port 54342 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:29:58.102323 sshd-session[5899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:29:58.113430 systemd-logind[1589]: New session 25 of user core. Sep 13 02:29:58.120417 systemd[1]: Started session-25.scope - Session 25 of User core. 
Sep 13 02:29:58.976117 sshd[5901]: Connection closed by 139.178.89.65 port 54342 Sep 13 02:29:58.976595 sshd-session[5899]: pam_unix(sshd:session): session closed for user core Sep 13 02:29:58.992226 systemd[1]: sshd@22-10.230.67.142:22-139.178.89.65:54342.service: Deactivated successfully. Sep 13 02:29:58.996749 systemd[1]: session-25.scope: Deactivated successfully. Sep 13 02:29:59.000535 systemd-logind[1589]: Session 25 logged out. Waiting for processes to exit. Sep 13 02:29:59.003052 systemd-logind[1589]: Removed session 25. Sep 13 02:30:03.555339 update_engine[1593]: I20250913 02:30:03.555042 1593 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 02:30:03.556682 update_engine[1593]: I20250913 02:30:03.556051 1593 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 02:30:03.556682 update_engine[1593]: I20250913 02:30:03.556602 1593 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 13 02:30:03.557109 update_engine[1593]: E20250913 02:30:03.557077 1593 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 02:30:03.557350 update_engine[1593]: I20250913 02:30:03.557319 1593 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 13 02:30:03.562445 update_engine[1593]: I20250913 02:30:03.562296 1593 omaha_request_action.cc:617] Omaha request response: Sep 13 02:30:03.562964 update_engine[1593]: E20250913 02:30:03.562932 1593 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 13 02:30:03.637226 update_engine[1593]: I20250913 02:30:03.637144 1593 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 13 02:30:03.637449 update_engine[1593]: I20250913 02:30:03.637419 1593 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 13 02:30:03.637537 update_engine[1593]: I20250913 02:30:03.637509 1593 update_attempter.cc:306] Processing Done. Sep 13 02:30:03.638687 update_engine[1593]: E20250913 02:30:03.637689 1593 update_attempter.cc:619] Update failed. Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.637714 1593 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.637725 1593 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.637737 1593 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.637861 1593 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.637913 1593 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.637928 1593 omaha_request_action.cc:272] Request: Sep 13 02:30:03.638687 update_engine[1593]: Sep 13 02:30:03.638687 update_engine[1593]: Sep 13 02:30:03.638687 update_engine[1593]: Sep 13 02:30:03.638687 update_engine[1593]: Sep 13 02:30:03.638687 update_engine[1593]: Sep 13 02:30:03.638687 update_engine[1593]: Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.637938 1593 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.638187 1593 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 02:30:03.638687 update_engine[1593]: I20250913 02:30:03.638566 1593 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 13 02:30:03.640644 update_engine[1593]: E20250913 02:30:03.640293 1593 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 02:30:03.640644 update_engine[1593]: I20250913 02:30:03.640352 1593 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 13 02:30:03.640644 update_engine[1593]: I20250913 02:30:03.640368 1593 omaha_request_action.cc:617] Omaha request response: Sep 13 02:30:03.640644 update_engine[1593]: I20250913 02:30:03.640378 1593 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 13 02:30:03.640644 update_engine[1593]: I20250913 02:30:03.640388 1593 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 13 02:30:03.640644 update_engine[1593]: I20250913 02:30:03.640395 1593 update_attempter.cc:306] Processing Done. Sep 13 02:30:03.640644 update_engine[1593]: I20250913 02:30:03.640405 1593 update_attempter.cc:310] Error event sent. Sep 13 02:30:03.640644 update_engine[1593]: I20250913 02:30:03.640425 1593 update_check_scheduler.cc:74] Next update check in 47m59s Sep 13 02:30:03.687064 locksmithd[1629]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 13 02:30:03.687064 locksmithd[1629]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 13 02:30:04.134521 systemd[1]: Started sshd@23-10.230.67.142:22-139.178.89.65:34310.service - OpenSSH per-connection server daemon (139.178.89.65:34310). Sep 13 02:30:05.104757 sshd[5913]: Accepted publickey for core from 139.178.89.65 port 34310 ssh2: RSA SHA256:TXj6QjDDe3r+3pW1NiK6yCZ5ZwfeWHeexMio95l14ss Sep 13 02:30:05.110858 sshd-session[5913]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 02:30:05.121468 systemd-logind[1589]: New session 26 of user core. Sep 13 02:30:05.127789 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 13 02:30:06.138484 sshd[5916]: Connection closed by 139.178.89.65 port 34310 Sep 13 02:30:06.139011 sshd-session[5913]: pam_unix(sshd:session): session closed for user core Sep 13 02:30:06.158238 systemd[1]: sshd@23-10.230.67.142:22-139.178.89.65:34310.service: Deactivated successfully. Sep 13 02:30:06.163137 systemd[1]: session-26.scope: Deactivated successfully. 
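
The update_engine excerpt above follows one fixed pattern: each Omaha check posts to what looks like a deliberately disabled server name ("Posting an Omaha request to disabled"), curl fails with "Could not resolve host: disabled", the fetcher retries a few times and then gives up, reports the error event, and schedules the next check ("Next update check in 47m59s"). Reducing the attempt timestamps to their spacing makes the retry cadence visible (plain arithmetic on the logged times, not update_engine internals):

    # wall-clock times of the four transfer attempts, seconds past 02:29:00
    attempts = [33.655, 43.576, 53.553, 63.557]   # 02:29:33.6, 02:29:43.5, 02:29:53.5, 02:30:03.5
    gaps = [round(b - a, 1) for a, b in zip(attempts, attempts[1:])]
    print(gaps)   # ~[9.9, 10.0, 10.0] -> roughly 10 s between retries
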
Sep 13 02:30:06.177400 systemd-logind[1589]: Session 26 logged out. Waiting for processes to exit. Sep 13 02:30:06.180364 systemd-logind[1589]: Removed session 26. Sep 13 02:30:09.360763 containerd[1607]: time="2025-09-13T02:30:09.360704935Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de298953a2bbd4c5a1f31771f4620618dcd9a38b79e3e652948790d93aef6fca\" id:\"3164ce2a409931df24c66168f4b6b1e3deebca523dbdcb72316566aba36ce2ce\" pid:5940 exited_at:{seconds:1757730609 nanos:358626505}"
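
The recurring TaskExit events for container de298953a2bb... (and, on their own schedules, fb6383658f08... and 0d0a1638bd2a...) fire at a steady cadence, which is consistent with periodic exec health checks running inside those containers, though the log itself does not name the command being executed. The exited_at seconds from this section spell out the interval:

    exits = [1757730488, 1757730519, 1757730549, 1757730580, 1757730609]  # de298953... exited_at seconds
    print([b - a for a, b in zip(exits, exits[1:])])   # [31, 30, 31, 29] -> roughly every 30 s
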