Sep 9 05:38:31.877008 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Sep 8 22:13:49 -00 2025 Sep 9 05:38:31.877039 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=34d704fb26999c645221adf783007b0add8c1672b7c5860358d83aa19335714a Sep 9 05:38:31.878097 kernel: BIOS-provided physical RAM map: Sep 9 05:38:31.878117 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Sep 9 05:38:31.878124 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Sep 9 05:38:31.878132 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Sep 9 05:38:31.878140 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable Sep 9 05:38:31.878148 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved Sep 9 05:38:31.878155 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Sep 9 05:38:31.878163 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Sep 9 05:38:31.878175 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 9 05:38:31.878182 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Sep 9 05:38:31.878189 kernel: NX (Execute Disable) protection: active Sep 9 05:38:31.878197 kernel: APIC: Static calls initialized Sep 9 05:38:31.878206 kernel: SMBIOS 2.8 present. Sep 9 05:38:31.878215 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014 Sep 9 05:38:31.878226 kernel: DMI: Memory slots populated: 1/1 Sep 9 05:38:31.878234 kernel: Hypervisor detected: KVM Sep 9 05:38:31.878243 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 9 05:38:31.878251 kernel: kvm-clock: using sched offset of 5218010424 cycles Sep 9 05:38:31.878261 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 9 05:38:31.878269 kernel: tsc: Detected 2294.576 MHz processor Sep 9 05:38:31.878278 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 9 05:38:31.878287 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 9 05:38:31.878296 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000 Sep 9 05:38:31.878307 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Sep 9 05:38:31.878315 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 9 05:38:31.878324 kernel: Using GB pages for direct mapping Sep 9 05:38:31.878332 kernel: ACPI: Early table checksum verification disabled Sep 9 05:38:31.878340 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS ) Sep 9 05:38:31.878349 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:38:31.878357 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:38:31.878366 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:38:31.878374 kernel: ACPI: FACS 0x000000007FFDFD40 000040 Sep 9 05:38:31.878385 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:38:31.878393 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS 
BXPC 00000001 BXPC 00000001) Sep 9 05:38:31.878401 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:38:31.878409 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:38:31.878418 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480] Sep 9 05:38:31.878426 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c] Sep 9 05:38:31.878439 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f] Sep 9 05:38:31.878450 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570] Sep 9 05:38:31.878458 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740] Sep 9 05:38:31.878468 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c] Sep 9 05:38:31.878476 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4] Sep 9 05:38:31.878485 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Sep 9 05:38:31.878494 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Sep 9 05:38:31.878503 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug Sep 9 05:38:31.878514 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00001000-0x7ffdbfff] Sep 9 05:38:31.878523 kernel: NODE_DATA(0) allocated [mem 0x7ffd4dc0-0x7ffdbfff] Sep 9 05:38:31.878532 kernel: Zone ranges: Sep 9 05:38:31.878541 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 9 05:38:31.878550 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff] Sep 9 05:38:31.878559 kernel: Normal empty Sep 9 05:38:31.878567 kernel: Device empty Sep 9 05:38:31.878576 kernel: Movable zone start for each node Sep 9 05:38:31.878585 kernel: Early memory node ranges Sep 9 05:38:31.878596 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Sep 9 05:38:31.878604 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff] Sep 9 05:38:31.878613 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff] Sep 9 05:38:31.878622 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 9 05:38:31.878631 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Sep 9 05:38:31.878640 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges Sep 9 05:38:31.878649 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 9 05:38:31.878657 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 9 05:38:31.878666 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 9 05:38:31.878675 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 9 05:38:31.878686 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 9 05:38:31.878695 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 9 05:38:31.878704 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 9 05:38:31.878713 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 9 05:38:31.878721 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 9 05:38:31.878730 kernel: TSC deadline timer available Sep 9 05:38:31.878739 kernel: CPU topo: Max. logical packages: 16 Sep 9 05:38:31.878748 kernel: CPU topo: Max. logical dies: 16 Sep 9 05:38:31.878756 kernel: CPU topo: Max. dies per package: 1 Sep 9 05:38:31.878768 kernel: CPU topo: Max. threads per core: 1 Sep 9 05:38:31.878776 kernel: CPU topo: Num. cores per package: 1 Sep 9 05:38:31.878785 kernel: CPU topo: Num. 
threads per package: 1 Sep 9 05:38:31.878794 kernel: CPU topo: Allowing 2 present CPUs plus 14 hotplug CPUs Sep 9 05:38:31.878802 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 9 05:38:31.878811 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Sep 9 05:38:31.878820 kernel: Booting paravirtualized kernel on KVM Sep 9 05:38:31.878829 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 9 05:38:31.878838 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Sep 9 05:38:31.878850 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Sep 9 05:38:31.878859 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Sep 9 05:38:31.878876 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Sep 9 05:38:31.878885 kernel: kvm-guest: PV spinlocks enabled Sep 9 05:38:31.878894 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 9 05:38:31.878904 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=34d704fb26999c645221adf783007b0add8c1672b7c5860358d83aa19335714a Sep 9 05:38:31.878914 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 9 05:38:31.878922 kernel: random: crng init done Sep 9 05:38:31.878934 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 9 05:38:31.878943 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Sep 9 05:38:31.878952 kernel: Fallback order for Node 0: 0 Sep 9 05:38:31.878961 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524154 Sep 9 05:38:31.878969 kernel: Policy zone: DMA32 Sep 9 05:38:31.878978 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 9 05:38:31.878987 kernel: software IO TLB: area num 16. Sep 9 05:38:31.878996 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Sep 9 05:38:31.879005 kernel: ftrace: allocating 40102 entries in 157 pages Sep 9 05:38:31.879016 kernel: ftrace: allocated 157 pages with 5 groups Sep 9 05:38:31.879025 kernel: Dynamic Preempt: voluntary Sep 9 05:38:31.879034 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 9 05:38:31.879044 kernel: rcu: RCU event tracing is enabled. Sep 9 05:38:31.879063 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Sep 9 05:38:31.879073 kernel: Trampoline variant of Tasks RCU enabled. Sep 9 05:38:31.879082 kernel: Rude variant of Tasks RCU enabled. Sep 9 05:38:31.879091 kernel: Tracing variant of Tasks RCU enabled. Sep 9 05:38:31.879100 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 9 05:38:31.879112 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Sep 9 05:38:31.879121 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 9 05:38:31.879130 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Sep 9 05:38:31.879139 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
Sep 9 05:38:31.879148 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16 Sep 9 05:38:31.879157 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 9 05:38:31.879166 kernel: Console: colour VGA+ 80x25 Sep 9 05:38:31.879185 kernel: printk: legacy console [tty0] enabled Sep 9 05:38:31.879194 kernel: printk: legacy console [ttyS0] enabled Sep 9 05:38:31.879204 kernel: ACPI: Core revision 20240827 Sep 9 05:38:31.879213 kernel: APIC: Switch to symmetric I/O mode setup Sep 9 05:38:31.879222 kernel: x2apic enabled Sep 9 05:38:31.879234 kernel: APIC: Switched APIC routing to: physical x2apic Sep 9 05:38:31.879244 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2113312ac93, max_idle_ns: 440795244843 ns Sep 9 05:38:31.879253 kernel: Calibrating delay loop (skipped) preset value.. 4589.15 BogoMIPS (lpj=2294576) Sep 9 05:38:31.879263 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 9 05:38:31.879272 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Sep 9 05:38:31.879284 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Sep 9 05:38:31.879293 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 9 05:38:31.879302 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Sep 9 05:38:31.879312 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Sep 9 05:38:31.879321 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Sep 9 05:38:31.879330 kernel: RETBleed: Mitigation: Enhanced IBRS Sep 9 05:38:31.879339 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 9 05:38:31.879349 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 9 05:38:31.879358 kernel: TAA: Mitigation: Clear CPU buffers Sep 9 05:38:31.879367 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Sep 9 05:38:31.879376 kernel: GDS: Unknown: Dependent on hypervisor status Sep 9 05:38:31.879388 kernel: active return thunk: its_return_thunk Sep 9 05:38:31.879397 kernel: ITS: Mitigation: Aligned branch/return thunks Sep 9 05:38:31.879407 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 9 05:38:31.879416 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 9 05:38:31.879426 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 9 05:38:31.879435 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Sep 9 05:38:31.879444 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Sep 9 05:38:31.879454 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Sep 9 05:38:31.879463 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Sep 9 05:38:31.879472 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 9 05:38:31.879481 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Sep 9 05:38:31.879493 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Sep 9 05:38:31.879502 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Sep 9 05:38:31.879512 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Sep 9 05:38:31.879521 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format. 
Sep 9 05:38:31.879530 kernel: Freeing SMP alternatives memory: 32K Sep 9 05:38:31.879539 kernel: pid_max: default: 32768 minimum: 301 Sep 9 05:38:31.879548 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 9 05:38:31.879558 kernel: landlock: Up and running. Sep 9 05:38:31.879567 kernel: SELinux: Initializing. Sep 9 05:38:31.879576 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 9 05:38:31.879586 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Sep 9 05:38:31.879597 kernel: smpboot: CPU0: Intel Xeon Processor (Cascadelake) (family: 0x6, model: 0x55, stepping: 0x6) Sep 9 05:38:31.879607 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only. Sep 9 05:38:31.879616 kernel: signal: max sigframe size: 3632 Sep 9 05:38:31.879626 kernel: rcu: Hierarchical SRCU implementation. Sep 9 05:38:31.879635 kernel: rcu: Max phase no-delay instances is 400. Sep 9 05:38:31.879645 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Sep 9 05:38:31.879654 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Sep 9 05:38:31.879664 kernel: smp: Bringing up secondary CPUs ... Sep 9 05:38:31.879673 kernel: smpboot: x86: Booting SMP configuration: Sep 9 05:38:31.879682 kernel: .... node #0, CPUs: #1 Sep 9 05:38:31.879694 kernel: smp: Brought up 1 node, 2 CPUs Sep 9 05:38:31.879703 kernel: smpboot: Total of 2 processors activated (9178.30 BogoMIPS) Sep 9 05:38:31.879713 kernel: Memory: 1895704K/2096616K available (14336K kernel code, 2428K rwdata, 9960K rodata, 54036K init, 2932K bss, 194928K reserved, 0K cma-reserved) Sep 9 05:38:31.879723 kernel: devtmpfs: initialized Sep 9 05:38:31.879732 kernel: x86/mm: Memory block size: 128MB Sep 9 05:38:31.879742 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 9 05:38:31.879751 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Sep 9 05:38:31.879761 kernel: pinctrl core: initialized pinctrl subsystem Sep 9 05:38:31.879773 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 9 05:38:31.879782 kernel: audit: initializing netlink subsys (disabled) Sep 9 05:38:31.879791 kernel: audit: type=2000 audit(1757396308.310:1): state=initialized audit_enabled=0 res=1 Sep 9 05:38:31.879801 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 9 05:38:31.879810 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 9 05:38:31.879819 kernel: cpuidle: using governor menu Sep 9 05:38:31.879829 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 9 05:38:31.879838 kernel: dca service started, version 1.12.1 Sep 9 05:38:31.879848 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Sep 9 05:38:31.879869 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry Sep 9 05:38:31.879879 kernel: PCI: Using configuration type 1 for base access Sep 9 05:38:31.879888 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Sep 9 05:38:31.879898 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 9 05:38:31.879907 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 9 05:38:31.879916 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 9 05:38:31.879926 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 9 05:38:31.879935 kernel: ACPI: Added _OSI(Module Device) Sep 9 05:38:31.879944 kernel: ACPI: Added _OSI(Processor Device) Sep 9 05:38:31.879956 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 9 05:38:31.879966 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 9 05:38:31.879975 kernel: ACPI: Interpreter enabled Sep 9 05:38:31.879985 kernel: ACPI: PM: (supports S0 S5) Sep 9 05:38:31.879994 kernel: ACPI: Using IOAPIC for interrupt routing Sep 9 05:38:31.880003 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 9 05:38:31.880013 kernel: PCI: Using E820 reservations for host bridge windows Sep 9 05:38:31.880022 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 9 05:38:31.880032 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 9 05:38:31.880320 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 9 05:38:31.880420 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 9 05:38:31.880510 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 9 05:38:31.880523 kernel: PCI host bridge to bus 0000:00 Sep 9 05:38:31.880616 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 9 05:38:31.880698 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 9 05:38:31.880778 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 9 05:38:31.880870 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window] Sep 9 05:38:31.880950 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Sep 9 05:38:31.881029 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window] Sep 9 05:38:31.881143 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 9 05:38:31.881249 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 9 05:38:31.881356 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 conventional PCI endpoint Sep 9 05:38:31.881452 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfa000000-0xfbffffff pref] Sep 9 05:38:31.881542 kernel: pci 0000:00:01.0: BAR 1 [mem 0xfea50000-0xfea50fff] Sep 9 05:38:31.881631 kernel: pci 0000:00:01.0: ROM [mem 0xfea40000-0xfea4ffff pref] Sep 9 05:38:31.881719 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 9 05:38:31.881816 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 05:38:31.881917 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea51000-0xfea51fff] Sep 9 05:38:31.882007 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 05:38:31.883394 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Sep 9 05:38:31.883504 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 05:38:31.883608 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 05:38:31.883704 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea52000-0xfea52fff] Sep 9 05:38:31.883797 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 05:38:31.883896 kernel: pci 0000:00:02.1: 
bridge window [mem 0xfe800000-0xfe9fffff] Sep 9 05:38:31.883986 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 05:38:31.884113 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 05:38:31.884207 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea53000-0xfea53fff] Sep 9 05:38:31.884297 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 05:38:31.884387 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Sep 9 05:38:31.884477 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 05:38:31.884576 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 05:38:31.884708 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea54000-0xfea54fff] Sep 9 05:38:31.884804 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 05:38:31.884901 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Sep 9 05:38:31.884991 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 05:38:31.886121 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 05:38:31.886229 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea55000-0xfea55fff] Sep 9 05:38:31.886323 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 05:38:31.886416 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Sep 9 05:38:31.886538 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 05:38:31.887035 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 05:38:31.887178 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea56000-0xfea56fff] Sep 9 05:38:31.887276 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 05:38:31.887367 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Sep 9 05:38:31.887458 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 05:38:31.887555 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 05:38:31.887657 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea57000-0xfea57fff] Sep 9 05:38:31.887746 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 05:38:31.887836 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Sep 9 05:38:31.887937 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 05:38:31.888038 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Sep 9 05:38:31.888142 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea58000-0xfea58fff] Sep 9 05:38:31.888236 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 05:38:31.888355 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Sep 9 05:38:31.888508 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 05:38:31.888607 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 9 05:38:31.888698 kernel: pci 0000:00:03.0: BAR 0 [io 0xc0c0-0xc0df] Sep 9 05:38:31.888789 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfea59000-0xfea59fff] Sep 9 05:38:31.888885 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref] Sep 9 05:38:31.890131 kernel: pci 0000:00:03.0: ROM [mem 0xfea00000-0xfea3ffff pref] Sep 9 05:38:31.890259 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 9 05:38:31.890358 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f] Sep 9 05:38:31.890464 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfea5a000-0xfea5afff] Sep 9 05:38:31.890570 kernel: pci 0000:00:04.0: BAR 4 
[mem 0xfd004000-0xfd007fff 64bit pref] Sep 9 05:38:31.890668 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 9 05:38:31.890759 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 9 05:38:31.890867 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 9 05:38:31.890959 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc0e0-0xc0ff] Sep 9 05:38:31.891048 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea5b000-0xfea5bfff] Sep 9 05:38:31.892231 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 9 05:38:31.892332 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Sep 9 05:38:31.892437 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 9 05:38:31.892532 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfda00000-0xfda000ff 64bit] Sep 9 05:38:31.892631 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 05:38:31.892724 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Sep 9 05:38:31.892818 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 05:38:31.892930 kernel: pci_bus 0000:02: extended config space not accessible Sep 9 05:38:31.893036 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 conventional PCI endpoint Sep 9 05:38:31.893181 kernel: pci 0000:02:01.0: BAR 0 [mem 0xfd800000-0xfd80000f] Sep 9 05:38:31.893277 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 05:38:31.893383 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Sep 9 05:38:31.893478 kernel: pci 0000:03:00.0: BAR 0 [mem 0xfe800000-0xfe803fff 64bit] Sep 9 05:38:31.893571 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 05:38:31.893673 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Sep 9 05:38:31.893767 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref] Sep 9 05:38:31.893867 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 05:38:31.893963 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 05:38:31.896077 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 05:38:31.896204 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 05:38:31.896305 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 05:38:31.896402 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 05:38:31.896416 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 9 05:38:31.896426 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 9 05:38:31.896440 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 9 05:38:31.896450 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 9 05:38:31.896460 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 9 05:38:31.896470 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 9 05:38:31.896480 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 9 05:38:31.896489 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 9 05:38:31.896499 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 9 05:38:31.896509 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 9 05:38:31.896518 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 9 05:38:31.896530 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 9 05:38:31.896540 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 9 05:38:31.896550 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 9 05:38:31.896559 kernel: ACPI: PCI: 
Interrupt link GSIG configured for IRQ 22 Sep 9 05:38:31.896569 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 9 05:38:31.896579 kernel: iommu: Default domain type: Translated Sep 9 05:38:31.896589 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 9 05:38:31.896598 kernel: PCI: Using ACPI for IRQ routing Sep 9 05:38:31.896608 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 9 05:38:31.896620 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Sep 9 05:38:31.896629 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff] Sep 9 05:38:31.896721 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 9 05:38:31.896813 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 9 05:38:31.896912 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 9 05:38:31.896925 kernel: vgaarb: loaded Sep 9 05:38:31.896935 kernel: clocksource: Switched to clocksource kvm-clock Sep 9 05:38:31.896945 kernel: VFS: Disk quotas dquot_6.6.0 Sep 9 05:38:31.896955 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 9 05:38:31.896968 kernel: pnp: PnP ACPI init Sep 9 05:38:31.897074 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved Sep 9 05:38:31.897089 kernel: pnp: PnP ACPI: found 5 devices Sep 9 05:38:31.897098 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 9 05:38:31.897109 kernel: NET: Registered PF_INET protocol family Sep 9 05:38:31.897118 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 9 05:38:31.897129 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Sep 9 05:38:31.897138 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 9 05:38:31.897151 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Sep 9 05:38:31.897161 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 9 05:38:31.897171 kernel: TCP: Hash tables configured (established 16384 bind 16384) Sep 9 05:38:31.897180 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 9 05:38:31.897190 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Sep 9 05:38:31.897199 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 9 05:38:31.897209 kernel: NET: Registered PF_XDP protocol family Sep 9 05:38:31.897300 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000 Sep 9 05:38:31.897396 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Sep 9 05:38:31.897489 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Sep 9 05:38:31.897580 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Sep 9 05:38:31.897670 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Sep 9 05:38:31.897759 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Sep 9 05:38:31.897848 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Sep 9 05:38:31.897948 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Sep 9 05:38:31.898038 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Sep 9 05:38:31.898168 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Sep 9 05:38:31.898262 kernel: pci 0000:00:02.2: bridge 
window [io 0x3000-0x3fff]: assigned Sep 9 05:38:31.898352 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Sep 9 05:38:31.898442 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Sep 9 05:38:31.898531 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Sep 9 05:38:31.898621 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Sep 9 05:38:31.898711 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Sep 9 05:38:31.898804 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Sep 9 05:38:31.898909 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Sep 9 05:38:31.898999 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Sep 9 05:38:31.899117 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Sep 9 05:38:31.899210 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Sep 9 05:38:31.899300 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 05:38:31.899393 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Sep 9 05:38:31.899485 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Sep 9 05:38:31.899574 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Sep 9 05:38:31.899664 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 05:38:31.899754 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Sep 9 05:38:31.899844 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Sep 9 05:38:31.899941 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Sep 9 05:38:31.900031 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 05:38:31.902167 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Sep 9 05:38:31.902267 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Sep 9 05:38:31.902364 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Sep 9 05:38:31.902453 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 05:38:31.902545 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Sep 9 05:38:31.902634 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Sep 9 05:38:31.902725 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Sep 9 05:38:31.902819 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 05:38:31.902917 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Sep 9 05:38:31.903012 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Sep 9 05:38:31.903119 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Sep 9 05:38:31.903210 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 05:38:31.903302 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Sep 9 05:38:31.903396 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Sep 9 05:38:31.903486 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Sep 9 05:38:31.903575 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 05:38:31.903666 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Sep 9 05:38:31.903759 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Sep 9 05:38:31.903849 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Sep 9 05:38:31.903948 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 05:38:31.904035 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 9 05:38:31.905049 kernel: pci_bus 0000:00: resource 5 [io 
0x0d00-0xffff window] Sep 9 05:38:31.905157 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 9 05:38:31.905240 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window] Sep 9 05:38:31.905320 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Sep 9 05:38:31.905406 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window] Sep 9 05:38:31.905500 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Sep 9 05:38:31.905585 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff] Sep 9 05:38:31.905669 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Sep 9 05:38:31.905761 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff] Sep 9 05:38:31.905852 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff] Sep 9 05:38:31.905947 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff] Sep 9 05:38:31.906037 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Sep 9 05:38:31.908092 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff] Sep 9 05:38:31.908178 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff] Sep 9 05:38:31.908255 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Sep 9 05:38:31.908337 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff] Sep 9 05:38:31.908414 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff] Sep 9 05:38:31.908494 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Sep 9 05:38:31.908580 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff] Sep 9 05:38:31.908657 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff] Sep 9 05:38:31.908733 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Sep 9 05:38:31.908814 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff] Sep 9 05:38:31.908899 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff] Sep 9 05:38:31.908975 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Sep 9 05:38:31.909075 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff] Sep 9 05:38:31.909154 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Sep 9 05:38:31.909230 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Sep 9 05:38:31.909311 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff] Sep 9 05:38:31.909388 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Sep 9 05:38:31.909463 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Sep 9 05:38:31.909476 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 9 05:38:31.909488 kernel: PCI: CLS 0 bytes, default 64 Sep 9 05:38:31.909498 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 9 05:38:31.909507 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Sep 9 05:38:31.909517 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 9 05:38:31.909526 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2113312ac93, max_idle_ns: 440795244843 ns Sep 9 05:38:31.909535 kernel: Initialise system trusted keyrings Sep 9 05:38:31.909545 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 9 05:38:31.909554 kernel: Key type asymmetric registered Sep 9 05:38:31.909563 kernel: Asymmetric key parser 'x509' registered Sep 9 05:38:31.909575 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 9 05:38:31.909592 kernel: io scheduler 
mq-deadline registered Sep 9 05:38:31.909608 kernel: io scheduler kyber registered Sep 9 05:38:31.909623 kernel: io scheduler bfq registered Sep 9 05:38:31.909757 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 9 05:38:31.909842 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 9 05:38:31.909958 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 05:38:31.910051 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 9 05:38:31.910165 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 9 05:38:31.910265 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 05:38:31.910348 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 9 05:38:31.910431 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 9 05:38:31.910530 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 05:38:31.910622 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 9 05:38:31.910718 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Sep 9 05:38:31.910812 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 05:38:31.910910 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Sep 9 05:38:31.911000 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 9 05:38:31.913119 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 05:38:31.913217 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 9 05:38:31.913313 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 9 05:38:31.913405 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 05:38:31.913497 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 9 05:38:31.913586 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 9 05:38:31.913678 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 05:38:31.913769 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 9 05:38:31.913870 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Sep 9 05:38:31.913962 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 05:38:31.913975 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 9 05:38:31.913987 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 9 05:38:31.913998 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 9 05:38:31.914008 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 05:38:31.914018 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 9 05:38:31.914031 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 9 05:38:31.914042 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 9 05:38:31.914052 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 9 05:38:31.914071 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 9 05:38:31.914169 kernel: rtc_cmos 
00:03: RTC can wake from S4 Sep 9 05:38:31.914255 kernel: rtc_cmos 00:03: registered as rtc0 Sep 9 05:38:31.914338 kernel: rtc_cmos 00:03: setting system clock to 2025-09-09T05:38:31 UTC (1757396311) Sep 9 05:38:31.914420 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Sep 9 05:38:31.914436 kernel: intel_pstate: CPU model not supported Sep 9 05:38:31.914447 kernel: NET: Registered PF_INET6 protocol family Sep 9 05:38:31.914457 kernel: Segment Routing with IPv6 Sep 9 05:38:31.914467 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 05:38:31.914477 kernel: NET: Registered PF_PACKET protocol family Sep 9 05:38:31.914488 kernel: Key type dns_resolver registered Sep 9 05:38:31.914498 kernel: IPI shorthand broadcast: enabled Sep 9 05:38:31.914508 kernel: sched_clock: Marking stable (3329002373, 121340738)->(3689977683, -239634572) Sep 9 05:38:31.914519 kernel: registered taskstats version 1 Sep 9 05:38:31.914531 kernel: Loading compiled-in X.509 certificates Sep 9 05:38:31.914541 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: f610abecf8d2943295243a86f7aa958542b6f677' Sep 9 05:38:31.914552 kernel: Demotion targets for Node 0: null Sep 9 05:38:31.914562 kernel: Key type .fscrypt registered Sep 9 05:38:31.914571 kernel: Key type fscrypt-provisioning registered Sep 9 05:38:31.914582 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 9 05:38:31.914592 kernel: ima: Allocated hash algorithm: sha1 Sep 9 05:38:31.914602 kernel: ima: No architecture policies found Sep 9 05:38:31.914612 kernel: clk: Disabling unused clocks Sep 9 05:38:31.914625 kernel: Warning: unable to open an initial console. Sep 9 05:38:31.914635 kernel: Freeing unused kernel image (initmem) memory: 54036K Sep 9 05:38:31.914645 kernel: Write protecting the kernel read-only data: 24576k Sep 9 05:38:31.914656 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 9 05:38:31.914666 kernel: Run /init as init process Sep 9 05:38:31.914676 kernel: with arguments: Sep 9 05:38:31.914686 kernel: /init Sep 9 05:38:31.914696 kernel: with environment: Sep 9 05:38:31.914706 kernel: HOME=/ Sep 9 05:38:31.914718 kernel: TERM=linux Sep 9 05:38:31.914728 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 05:38:31.914740 systemd[1]: Successfully made /usr/ read-only. Sep 9 05:38:31.914753 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 05:38:31.914765 systemd[1]: Detected virtualization kvm. Sep 9 05:38:31.914775 systemd[1]: Detected architecture x86-64. Sep 9 05:38:31.914788 systemd[1]: Running in initrd. Sep 9 05:38:31.914800 systemd[1]: No hostname configured, using default hostname. Sep 9 05:38:31.914811 systemd[1]: Hostname set to . Sep 9 05:38:31.914821 systemd[1]: Initializing machine ID from VM UUID. Sep 9 05:38:31.914831 systemd[1]: Queued start job for default target initrd.target. Sep 9 05:38:31.914842 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:38:31.914852 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:38:31.914871 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Sep 9 05:38:31.914882 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 05:38:31.914895 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 9 05:38:31.914906 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 9 05:38:31.914918 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 9 05:38:31.914929 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 9 05:38:31.914940 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:38:31.914951 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:38:31.914961 systemd[1]: Reached target paths.target - Path Units. Sep 9 05:38:31.914974 systemd[1]: Reached target slices.target - Slice Units. Sep 9 05:38:31.914985 systemd[1]: Reached target swap.target - Swaps. Sep 9 05:38:31.914995 systemd[1]: Reached target timers.target - Timer Units. Sep 9 05:38:31.915006 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 05:38:31.915017 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 05:38:31.915027 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 9 05:38:31.915038 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 9 05:38:31.915048 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:38:31.915074 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 05:38:31.915087 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:38:31.915098 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 05:38:31.915108 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 9 05:38:31.915119 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 05:38:31.915130 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 9 05:38:31.915141 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 9 05:38:31.915151 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 05:38:31.915162 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 05:38:31.915175 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 05:38:31.915185 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:38:31.915196 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 05:38:31.915207 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:38:31.915220 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 05:38:31.915231 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 05:38:31.915270 systemd-journald[229]: Collecting audit messages is disabled. Sep 9 05:38:31.915296 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Sep 9 05:38:31.915311 systemd-journald[229]: Journal started Sep 9 05:38:31.915335 systemd-journald[229]: Runtime Journal (/run/log/journal/296251331e714f6eac2080412db7d6ba) is 4.7M, max 38.2M, 33.4M free. Sep 9 05:38:31.887112 systemd-modules-load[231]: Inserted module 'overlay' Sep 9 05:38:31.950352 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 9 05:38:31.950378 kernel: Bridge firewalling registered Sep 9 05:38:31.950392 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:38:31.918521 systemd-modules-load[231]: Inserted module 'br_netfilter' Sep 9 05:38:31.953920 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 05:38:31.954979 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:38:31.959158 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 05:38:31.960342 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:38:31.963214 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 05:38:31.966160 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:38:31.983267 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:38:31.985598 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:38:31.988469 systemd-tmpfiles[250]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 05:38:31.992888 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:38:31.994751 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 05:38:31.995386 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:38:31.996875 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 05:38:32.032016 dracut-cmdline[268]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=34d704fb26999c645221adf783007b0add8c1672b7c5860358d83aa19335714a Sep 9 05:38:32.044571 systemd-resolved[267]: Positive Trust Anchors: Sep 9 05:38:32.044589 systemd-resolved[267]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:38:32.044630 systemd-resolved[267]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:38:32.048331 systemd-resolved[267]: Defaulting to hostname 'linux'. Sep 9 05:38:32.049417 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Sep 9 05:38:32.050098 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:38:32.155131 kernel: SCSI subsystem initialized Sep 9 05:38:32.169085 kernel: Loading iSCSI transport class v2.0-870. Sep 9 05:38:32.181100 kernel: iscsi: registered transport (tcp) Sep 9 05:38:32.207153 kernel: iscsi: registered transport (qla4xxx) Sep 9 05:38:32.207218 kernel: QLogic iSCSI HBA Driver Sep 9 05:38:32.237695 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:38:32.256119 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:38:32.258785 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:38:32.331807 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 05:38:32.333902 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 05:38:32.397124 kernel: raid6: avx512x4 gen() 17523 MB/s Sep 9 05:38:32.414085 kernel: raid6: avx512x2 gen() 17477 MB/s Sep 9 05:38:32.431117 kernel: raid6: avx512x1 gen() 17449 MB/s Sep 9 05:38:32.448122 kernel: raid6: avx2x4 gen() 17399 MB/s Sep 9 05:38:32.465113 kernel: raid6: avx2x2 gen() 17334 MB/s Sep 9 05:38:32.482138 kernel: raid6: avx2x1 gen() 13353 MB/s Sep 9 05:38:32.482242 kernel: raid6: using algorithm avx512x4 gen() 17523 MB/s Sep 9 05:38:32.500193 kernel: raid6: .... xor() 7475 MB/s, rmw enabled Sep 9 05:38:32.500297 kernel: raid6: using avx512x2 recovery algorithm Sep 9 05:38:32.531122 kernel: xor: automatically using best checksumming function avx Sep 9 05:38:32.710099 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 05:38:32.717617 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 05:38:32.719698 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:38:32.752119 systemd-udevd[478]: Using default interface naming scheme 'v255'. Sep 9 05:38:32.758890 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:38:32.765127 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 05:38:32.806026 dracut-pre-trigger[485]: rd.md=0: removing MD RAID activation Sep 9 05:38:32.839406 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 05:38:32.842185 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:38:32.926964 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:38:32.931432 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 05:38:33.012079 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Sep 9 05:38:33.016443 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Sep 9 05:38:33.030674 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 05:38:33.030725 kernel: GPT:17805311 != 125829119 Sep 9 05:38:33.030739 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 05:38:33.030751 kernel: GPT:17805311 != 125829119 Sep 9 05:38:33.030763 kernel: GPT: Use GNU Parted to correct GPT errors. 
Sep 9 05:38:33.030776 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:38:33.044073 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 05:38:33.055081 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 9 05:38:33.057073 kernel: ACPI: bus type USB registered Sep 9 05:38:33.059087 kernel: usbcore: registered new interface driver usbfs Sep 9 05:38:33.063674 kernel: usbcore: registered new interface driver hub Sep 9 05:38:33.063708 kernel: usbcore: registered new device driver usb Sep 9 05:38:33.063910 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:38:33.064034 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:38:33.066623 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:38:33.072615 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:38:33.074601 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 05:38:33.111529 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 9 05:38:33.206470 kernel: AES CTR mode by8 optimization enabled Sep 9 05:38:33.206520 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 9 05:38:33.206869 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Sep 9 05:38:33.207123 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 9 05:38:33.207363 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 9 05:38:33.207585 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Sep 9 05:38:33.207814 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Sep 9 05:38:33.208033 kernel: hub 1-0:1.0: USB hub found Sep 9 05:38:33.208329 kernel: hub 1-0:1.0: 4 ports detected Sep 9 05:38:33.208572 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Sep 9 05:38:33.208984 kernel: hub 2-0:1.0: USB hub found Sep 9 05:38:33.209305 kernel: hub 2-0:1.0: 4 ports detected Sep 9 05:38:33.209561 kernel: libata version 3.00 loaded. 
Sep 9 05:38:33.209590 kernel: ahci 0000:00:1f.2: version 3.0 Sep 9 05:38:33.209838 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 9 05:38:33.209869 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 9 05:38:33.210118 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 9 05:38:33.210354 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 9 05:38:33.210582 kernel: scsi host0: ahci Sep 9 05:38:33.210826 kernel: scsi host1: ahci Sep 9 05:38:33.211043 kernel: scsi host2: ahci Sep 9 05:38:33.211305 kernel: scsi host3: ahci Sep 9 05:38:33.211519 kernel: scsi host4: ahci Sep 9 05:38:33.211745 kernel: scsi host5: ahci Sep 9 05:38:33.211976 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41 lpm-pol 1 Sep 9 05:38:33.212006 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41 lpm-pol 1 Sep 9 05:38:33.212033 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41 lpm-pol 1 Sep 9 05:38:33.212075 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41 lpm-pol 1 Sep 9 05:38:33.212102 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41 lpm-pol 1 Sep 9 05:38:33.212127 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41 lpm-pol 1 Sep 9 05:38:33.214538 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 05:38:33.215381 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:38:33.236719 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 9 05:38:33.237258 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 9 05:38:33.255367 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 9 05:38:33.257737 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 05:38:33.284853 disk-uuid[632]: Primary Header is updated. Sep 9 05:38:33.284853 disk-uuid[632]: Secondary Entries is updated. Sep 9 05:38:33.284853 disk-uuid[632]: Secondary Header is updated. Sep 9 05:38:33.290931 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:38:33.365108 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 9 05:38:33.475098 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 9 05:38:33.478120 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 9 05:38:33.478207 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 9 05:38:33.482351 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 9 05:38:33.484162 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 9 05:38:33.486144 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 9 05:38:33.511081 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 9 05:38:33.516134 kernel: usbcore: registered new interface driver usbhid Sep 9 05:38:33.516211 kernel: usbhid: USB HID core driver Sep 9 05:38:33.522122 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Sep 9 05:38:33.522180 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Sep 9 05:38:33.545588 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Sep 9 05:38:33.547928 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:38:33.549841 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:38:33.552238 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 05:38:33.555080 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 05:38:33.602650 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 05:38:34.308119 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:38:34.313431 disk-uuid[633]: The operation has completed successfully. Sep 9 05:38:34.356636 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 05:38:34.356763 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 05:38:34.389591 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 05:38:34.416874 sh[658]: Success Sep 9 05:38:34.446140 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 05:38:34.446219 kernel: device-mapper: uevent: version 1.0.3 Sep 9 05:38:34.447123 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 05:38:34.458103 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 9 05:38:34.502129 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 05:38:34.504175 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 05:38:34.513753 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 05:38:34.522374 kernel: BTRFS: device fsid eee400a1-88b9-480b-9c0c-54d171140f9a devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (670) Sep 9 05:38:34.522419 kernel: BTRFS info (device dm-0): first mount of filesystem eee400a1-88b9-480b-9c0c-54d171140f9a Sep 9 05:38:34.524444 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:38:34.530158 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 05:38:34.530196 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 05:38:34.532623 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 05:38:34.534385 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:38:34.536474 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 05:38:34.538285 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 05:38:34.541285 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 05:38:34.566118 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (699) Sep 9 05:38:34.568126 kernel: BTRFS info (device vda6): first mount of filesystem df6b516e-a914-4199-9bb5-7fc056237ce5 Sep 9 05:38:34.568198 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:38:34.574249 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:38:34.574314 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:38:34.583091 kernel: BTRFS info (device vda6): last unmount of filesystem df6b516e-a914-4199-9bb5-7fc056237ce5 Sep 9 05:38:34.585599 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Sep 9 05:38:34.587964 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 05:38:34.730274 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:38:34.736202 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:38:34.749909 ignition[744]: Ignition 2.21.0 Sep 9 05:38:34.750088 ignition[744]: Stage: fetch-offline Sep 9 05:38:34.750150 ignition[744]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:38:34.750160 ignition[744]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 05:38:34.750290 ignition[744]: parsed url from cmdline: "" Sep 9 05:38:34.753552 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:38:34.750294 ignition[744]: no config URL provided Sep 9 05:38:34.750300 ignition[744]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 05:38:34.750307 ignition[744]: no config at "/usr/lib/ignition/user.ign" Sep 9 05:38:34.750312 ignition[744]: failed to fetch config: resource requires networking Sep 9 05:38:34.750510 ignition[744]: Ignition finished successfully Sep 9 05:38:34.782827 systemd-networkd[845]: lo: Link UP Sep 9 05:38:34.782840 systemd-networkd[845]: lo: Gained carrier Sep 9 05:38:34.784516 systemd-networkd[845]: Enumeration completed Sep 9 05:38:34.784628 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:38:34.785177 systemd-networkd[845]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:38:34.785181 systemd-networkd[845]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 05:38:34.785742 systemd[1]: Reached target network.target - Network. Sep 9 05:38:34.786615 systemd-networkd[845]: eth0: Link UP Sep 9 05:38:34.786751 systemd-networkd[845]: eth0: Gained carrier Sep 9 05:38:34.786762 systemd-networkd[845]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:38:34.789496 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 9 05:38:34.797155 systemd-networkd[845]: eth0: DHCPv4 address 10.244.98.182/30, gateway 10.244.98.181 acquired from 10.244.98.181 Sep 9 05:38:34.817676 ignition[849]: Ignition 2.21.0 Sep 9 05:38:34.817693 ignition[849]: Stage: fetch Sep 9 05:38:34.817926 ignition[849]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:38:34.817937 ignition[849]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 05:38:34.818048 ignition[849]: parsed url from cmdline: "" Sep 9 05:38:34.818073 ignition[849]: no config URL provided Sep 9 05:38:34.818080 ignition[849]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 05:38:34.818088 ignition[849]: no config at "/usr/lib/ignition/user.ign" Sep 9 05:38:34.818398 ignition[849]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Sep 9 05:38:34.820213 ignition[849]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Sep 9 05:38:34.820233 ignition[849]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Sep 9 05:38:34.836088 ignition[849]: GET result: OK Sep 9 05:38:34.836214 ignition[849]: parsing config with SHA512: 7ce25eea6b08db4c1e153c8d22400a370060eae017e6e1d8cef59f9c216de078bd6f223403b2a7bbd76f65fe1b0a3d9a72c926116014dd746b642f7562dba87b Sep 9 05:38:34.841113 unknown[849]: fetched base config from "system" Sep 9 05:38:34.841127 unknown[849]: fetched base config from "system" Sep 9 05:38:34.841538 ignition[849]: fetch: fetch complete Sep 9 05:38:34.841134 unknown[849]: fetched user config from "openstack" Sep 9 05:38:34.841545 ignition[849]: fetch: fetch passed Sep 9 05:38:34.841598 ignition[849]: Ignition finished successfully Sep 9 05:38:34.844775 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 9 05:38:34.846292 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 05:38:34.882827 ignition[856]: Ignition 2.21.0 Sep 9 05:38:34.882843 ignition[856]: Stage: kargs Sep 9 05:38:34.883021 ignition[856]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:38:34.883032 ignition[856]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 05:38:34.884078 ignition[856]: kargs: kargs passed Sep 9 05:38:34.887324 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 05:38:34.884127 ignition[856]: Ignition finished successfully Sep 9 05:38:34.890259 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 05:38:34.926500 ignition[863]: Ignition 2.21.0 Sep 9 05:38:34.926517 ignition[863]: Stage: disks Sep 9 05:38:34.926685 ignition[863]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:38:34.926695 ignition[863]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 05:38:34.927443 ignition[863]: disks: disks passed Sep 9 05:38:34.927488 ignition[863]: Ignition finished successfully Sep 9 05:38:34.929167 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 05:38:34.930360 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 05:38:34.931284 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 05:38:34.931727 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 05:38:34.932559 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:38:34.933317 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:38:34.935107 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 9 05:38:34.981439 systemd-fsck[871]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 9 05:38:34.985328 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 05:38:34.988978 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 05:38:35.112080 kernel: EXT4-fs (vda9): mounted filesystem 91c315eb-0fc3-4e95-bf9b-06acc06be6bc r/w with ordered data mode. Quota mode: none. Sep 9 05:38:35.112979 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 05:38:35.114651 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 05:38:35.117287 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 05:38:35.120138 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 05:38:35.121233 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 05:38:35.134610 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... 
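The Ignition fetch stage recorded above pulls the instance user data from the OpenStack metadata endpoint it logs (http://169.254.169.254/openstack/latest/user_data) and then reports a SHA512 for the parsed config. The following is a minimal, illustrative Python sketch of that fetch-and-digest pattern only; it is not Ignition's implementation, and the digest here is computed over the raw response, which may differ from exactly what Ignition hashes.

```python
# Illustrative sketch only -- not Ignition's code. It mirrors what the log
# records: GET the OpenStack user_data from the link-local metadata service
# and compute a SHA-512 digest comparable in spirit to the one Ignition logs.
import hashlib
import urllib.request

METADATA_URL = "http://169.254.169.254/openstack/latest/user_data"  # endpoint taken from the log


def fetch_user_data(url: str = METADATA_URL, timeout: float = 5.0) -> bytes:
    """Fetch raw user data from the metadata service ("attempt #1" in the log)."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read()


def sha512_hex(data: bytes) -> str:
    """Hex SHA-512 of the fetched payload."""
    return hashlib.sha512(data).hexdigest()


if __name__ == "__main__":
    data = fetch_user_data()
    print("GET result: OK,", len(data), "bytes")
    print("sha512:", sha512_hex(data))
```

Run inside a guest with a reachable metadata service; elsewhere the request simply times out, which is the same failure mode the earlier fetch-offline stage reports as "resource requires networking".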
Sep 9 05:38:35.137858 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 05:38:35.138708 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:38:35.140894 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 05:38:35.147763 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 9 05:38:35.148893 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (879) Sep 9 05:38:35.152829 kernel: BTRFS info (device vda6): first mount of filesystem df6b516e-a914-4199-9bb5-7fc056237ce5 Sep 9 05:38:35.152865 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:38:35.160162 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:38:35.160201 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:38:35.162398 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 05:38:35.210102 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:35.210442 initrd-setup-root[906]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 05:38:35.218838 initrd-setup-root[914]: cut: /sysroot/etc/group: No such file or directory Sep 9 05:38:35.227660 initrd-setup-root[921]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 05:38:35.233789 initrd-setup-root[928]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 05:38:35.356316 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 05:38:35.360321 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 05:38:35.362286 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 05:38:35.378095 kernel: BTRFS info (device vda6): last unmount of filesystem df6b516e-a914-4199-9bb5-7fc056237ce5 Sep 9 05:38:35.393629 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 9 05:38:35.405984 ignition[997]: INFO : Ignition 2.21.0 Sep 9 05:38:35.408153 ignition[997]: INFO : Stage: mount Sep 9 05:38:35.408153 ignition[997]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:38:35.408153 ignition[997]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 05:38:35.409490 ignition[997]: INFO : mount: mount passed Sep 9 05:38:35.409490 ignition[997]: INFO : Ignition finished successfully Sep 9 05:38:35.411114 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 05:38:35.524593 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 05:38:36.232126 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:36.267537 systemd-networkd[845]: eth0: Gained IPv6LL Sep 9 05:38:37.776424 systemd-networkd[845]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:18ad:24:19ff:fef4:62b6/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:18ad:24:19ff:fef4:62b6/64 assigned by NDisc. Sep 9 05:38:37.776457 systemd-networkd[845]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. 
Sep 9 05:38:38.240135 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:42.253098 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:42.262804 coreos-metadata[881]: Sep 09 05:38:42.262 WARN failed to locate config-drive, using the metadata service API instead Sep 9 05:38:42.286934 coreos-metadata[881]: Sep 09 05:38:42.286 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 9 05:38:42.300818 coreos-metadata[881]: Sep 09 05:38:42.300 INFO Fetch successful Sep 9 05:38:42.302398 coreos-metadata[881]: Sep 09 05:38:42.301 INFO wrote hostname srv-dlh9b.gb1.brightbox.com to /sysroot/etc/hostname Sep 9 05:38:42.304480 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Sep 9 05:38:42.304716 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Sep 9 05:38:42.310923 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 05:38:42.355697 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 05:38:42.396086 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1012) Sep 9 05:38:42.396215 kernel: BTRFS info (device vda6): first mount of filesystem df6b516e-a914-4199-9bb5-7fc056237ce5 Sep 9 05:38:42.396254 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:38:42.405773 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:38:42.405846 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:38:42.409929 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 05:38:42.441741 ignition[1030]: INFO : Ignition 2.21.0 Sep 9 05:38:42.441741 ignition[1030]: INFO : Stage: files Sep 9 05:38:42.442929 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:38:42.442929 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 05:38:42.447091 ignition[1030]: DEBUG : files: compiled without relabeling support, skipping Sep 9 05:38:42.447999 ignition[1030]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 05:38:42.448686 ignition[1030]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 05:38:42.453036 ignition[1030]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 05:38:42.453745 ignition[1030]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 05:38:42.454549 unknown[1030]: wrote ssh authorized keys file for user: core Sep 9 05:38:42.455230 ignition[1030]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 05:38:42.457763 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 9 05:38:42.457763 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Sep 9 05:38:42.812843 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 05:38:44.988979 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 9 05:38:44.988979 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): 
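The flatcar-openstack-hostname unit above first probes for a config drive (the repeated "/dev/disk/by-label/config-2: Can't lookup blockdev" lines), falls back to the metadata API, fetches the hostname, and writes it to /sysroot/etc/hostname. A hedged Python sketch of that fallback-and-write pattern follows; it is illustrative only, not the coreos-metadata implementation, with the device label, URL, and paths taken from the log.

```python
# Illustrative sketch of the pattern coreos-metadata logs above: prefer the
# config drive, fall back to the metadata service, then persist the hostname.
# Not the actual coreos-metadata implementation.
import os
import urllib.request

CONFIG_DRIVE = "/dev/disk/by-label/config-2"                        # label probed in the log
HOSTNAME_URL = "http://169.254.169.254/latest/meta-data/hostname"   # URL from the log


def fetch_hostname(timeout: float = 5.0) -> str:
    if not os.path.exists(CONFIG_DRIVE):
        # "failed to locate config-drive, using the metadata service API instead"
        with urllib.request.urlopen(HOSTNAME_URL, timeout=timeout) as resp:
            return resp.read().decode().strip()
    raise NotImplementedError("reading the config drive is out of scope for this sketch")


def write_hostname(hostname: str, root: str = "/sysroot") -> None:
    # coreos-metadata logs writing the hostname to /sysroot/etc/hostname
    with open(os.path.join(root, "etc", "hostname"), "w") as f:
        f.write(hostname + "\n")


if __name__ == "__main__":
    name = fetch_hostname()
    print("Fetch successful:", name)
    write_hostname(name)
```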
[finished] writing file "/sysroot/home/core/install.sh" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 9 05:38:44.995298 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Sep 9 05:38:45.362875 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 05:38:47.009023 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 9 05:38:47.009023 ignition[1030]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 05:38:47.012443 ignition[1030]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:38:47.015387 ignition[1030]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:38:47.015387 ignition[1030]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 05:38:47.015387 ignition[1030]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 9 05:38:47.015387 ignition[1030]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 05:38:47.015387 ignition[1030]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:38:47.015387 ignition[1030]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:38:47.015387 ignition[1030]: INFO : files: files 
passed Sep 9 05:38:47.015387 ignition[1030]: INFO : Ignition finished successfully Sep 9 05:38:47.018078 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 05:38:47.023187 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 05:38:47.025431 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 9 05:38:47.040148 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 05:38:47.040276 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 05:38:47.047657 initrd-setup-root-after-ignition[1060]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:38:47.047657 initrd-setup-root-after-ignition[1060]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:38:47.050096 initrd-setup-root-after-ignition[1064]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:38:47.052041 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 05:38:47.052934 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 05:38:47.054404 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 05:38:47.114664 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 05:38:47.114835 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 05:38:47.116402 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 05:38:47.117491 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 05:38:47.118815 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 9 05:38:47.120010 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 05:38:47.156284 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 05:38:47.159244 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 9 05:38:47.207833 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:38:47.209587 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:38:47.210388 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 05:38:47.212344 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 05:38:47.212502 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 05:38:47.214028 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 05:38:47.214949 systemd[1]: Stopped target basic.target - Basic System. Sep 9 05:38:47.215806 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 05:38:47.216325 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:38:47.217715 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 05:38:47.219124 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:38:47.220460 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 05:38:47.221724 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:38:47.223138 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 05:38:47.224161 systemd[1]: Stopped target local-fs.target - Local File Systems. 
Sep 9 05:38:47.225182 systemd[1]: Stopped target swap.target - Swaps. Sep 9 05:38:47.226090 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 05:38:47.226226 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 9 05:38:47.227334 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:38:47.227924 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:38:47.233734 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 9 05:38:47.233852 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:38:47.234533 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 9 05:38:47.234661 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 9 05:38:47.235641 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 9 05:38:47.235806 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 05:38:47.236799 systemd[1]: ignition-files.service: Deactivated successfully. Sep 9 05:38:47.236928 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 9 05:38:47.239476 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 9 05:38:47.241147 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 9 05:38:47.242158 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 9 05:38:47.242286 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:38:47.244226 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 9 05:38:47.244366 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 05:38:47.249909 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 9 05:38:47.252865 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 9 05:38:47.265955 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 9 05:38:47.271597 ignition[1084]: INFO : Ignition 2.21.0 Sep 9 05:38:47.271597 ignition[1084]: INFO : Stage: umount Sep 9 05:38:47.273831 ignition[1084]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:38:47.273831 ignition[1084]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 05:38:47.275743 ignition[1084]: INFO : umount: umount passed Sep 9 05:38:47.276992 ignition[1084]: INFO : Ignition finished successfully Sep 9 05:38:47.277357 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 9 05:38:47.277470 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 9 05:38:47.279408 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 9 05:38:47.279467 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 9 05:38:47.280288 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 9 05:38:47.280331 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 9 05:38:47.280972 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 9 05:38:47.281009 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 9 05:38:47.281723 systemd[1]: Stopped target network.target - Network. Sep 9 05:38:47.282409 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 9 05:38:47.282451 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:38:47.283208 systemd[1]: Stopped target paths.target - Path Units. 
Sep 9 05:38:47.283878 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 9 05:38:47.287097 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:38:47.287992 systemd[1]: Stopped target slices.target - Slice Units. Sep 9 05:38:47.288381 systemd[1]: Stopped target sockets.target - Socket Units. Sep 9 05:38:47.289313 systemd[1]: iscsid.socket: Deactivated successfully. Sep 9 05:38:47.289359 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 05:38:47.290024 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 9 05:38:47.290074 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 05:38:47.290711 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 9 05:38:47.290772 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 9 05:38:47.292297 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 9 05:38:47.292339 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 9 05:38:47.293555 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 9 05:38:47.297037 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 9 05:38:47.302335 systemd-networkd[845]: eth0: DHCPv6 lease lost Sep 9 05:38:47.305265 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 9 05:38:47.305799 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 9 05:38:47.309155 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 9 05:38:47.309735 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 9 05:38:47.309829 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:38:47.314290 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 9 05:38:47.314512 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 9 05:38:47.314611 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 9 05:38:47.317183 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 9 05:38:47.317795 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 9 05:38:47.319560 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 9 05:38:47.319688 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:38:47.323100 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 9 05:38:47.323990 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 9 05:38:47.324096 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:38:47.324960 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 9 05:38:47.325028 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:38:47.326153 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 9 05:38:47.326230 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 9 05:38:47.327206 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:38:47.333703 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 9 05:38:47.335383 systemd[1]: systemd-udevd.service: Deactivated successfully. 
Sep 9 05:38:47.335514 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:38:47.337946 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 9 05:38:47.338544 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 9 05:38:47.339476 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 9 05:38:47.339508 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:38:47.340448 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 9 05:38:47.340490 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 9 05:38:47.341515 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 9 05:38:47.341604 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 9 05:38:47.342708 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 9 05:38:47.342793 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:38:47.345170 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 9 05:38:47.345586 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 9 05:38:47.345637 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:38:47.347009 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 9 05:38:47.347068 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:38:47.350171 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 9 05:38:47.350214 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:38:47.353573 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 9 05:38:47.353619 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:38:47.354121 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:38:47.354160 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:38:47.355718 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 9 05:38:47.358142 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 9 05:38:47.359772 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 9 05:38:47.359875 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 9 05:38:47.364876 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 9 05:38:47.375492 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 9 05:38:47.379189 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 9 05:38:47.379501 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 9 05:38:47.381604 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 9 05:38:47.385009 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 9 05:38:47.404790 systemd[1]: Switching root. Sep 9 05:38:47.451189 systemd-journald[229]: Journal stopped Sep 9 05:38:48.601743 systemd-journald[229]: Received SIGTERM from PID 1 (systemd). 
Sep 9 05:38:48.601873 kernel: SELinux: policy capability network_peer_controls=1 Sep 9 05:38:48.601892 kernel: SELinux: policy capability open_perms=1 Sep 9 05:38:48.601912 kernel: SELinux: policy capability extended_socket_class=1 Sep 9 05:38:48.601925 kernel: SELinux: policy capability always_check_network=0 Sep 9 05:38:48.601942 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 9 05:38:48.601955 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 9 05:38:48.601967 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 9 05:38:48.601988 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 9 05:38:48.602000 kernel: SELinux: policy capability userspace_initial_context=0 Sep 9 05:38:48.602015 kernel: audit: type=1403 audit(1757396327.616:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 9 05:38:48.602041 systemd[1]: Successfully loaded SELinux policy in 58.565ms. Sep 9 05:38:48.610397 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.024ms. Sep 9 05:38:48.610437 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 05:38:48.610455 systemd[1]: Detected virtualization kvm. Sep 9 05:38:48.610471 systemd[1]: Detected architecture x86-64. Sep 9 05:38:48.613078 systemd[1]: Detected first boot. Sep 9 05:38:48.613118 systemd[1]: Hostname set to . Sep 9 05:38:48.613144 systemd[1]: Initializing machine ID from VM UUID. Sep 9 05:38:48.613167 zram_generator::config[1133]: No configuration found. Sep 9 05:38:48.613196 kernel: Guest personality initialized and is inactive Sep 9 05:38:48.613211 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 9 05:38:48.613223 kernel: Initialized host personality Sep 9 05:38:48.613235 kernel: NET: Registered PF_VSOCK protocol family Sep 9 05:38:48.613256 systemd[1]: Populated /etc with preset unit settings. Sep 9 05:38:48.613272 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 9 05:38:48.613298 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 9 05:38:48.613322 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 9 05:38:48.613342 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 9 05:38:48.613356 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 9 05:38:48.613370 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 9 05:38:48.613383 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 9 05:38:48.613404 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 9 05:38:48.613418 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 9 05:38:48.613431 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 9 05:38:48.613449 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 9 05:38:48.613462 systemd[1]: Created slice user.slice - User and Session Slice. Sep 9 05:38:48.613479 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 9 05:38:48.613493 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:38:48.613506 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 9 05:38:48.613526 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 9 05:38:48.613545 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 9 05:38:48.613563 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 05:38:48.613576 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 9 05:38:48.613591 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:38:48.613605 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:38:48.613622 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 9 05:38:48.613638 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 9 05:38:48.613653 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 9 05:38:48.613673 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 9 05:38:48.613686 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:38:48.613700 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 05:38:48.613713 systemd[1]: Reached target slices.target - Slice Units. Sep 9 05:38:48.613727 systemd[1]: Reached target swap.target - Swaps. Sep 9 05:38:48.613741 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 9 05:38:48.613754 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 9 05:38:48.613770 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 9 05:38:48.613784 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:38:48.613797 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 05:38:48.613811 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:38:48.613825 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 9 05:38:48.613840 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 9 05:38:48.613853 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 9 05:38:48.613871 systemd[1]: Mounting media.mount - External Media Directory... Sep 9 05:38:48.613889 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:38:48.613905 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 9 05:38:48.613918 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 9 05:38:48.613932 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 9 05:38:48.613947 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 9 05:38:48.613960 systemd[1]: Reached target machines.target - Containers. Sep 9 05:38:48.613973 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Sep 9 05:38:48.613988 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:38:48.614001 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 05:38:48.614018 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 9 05:38:48.614033 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:38:48.614048 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 05:38:48.614080 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:38:48.614094 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 9 05:38:48.614107 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:38:48.614125 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 9 05:38:48.614138 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 9 05:38:48.614152 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 9 05:38:48.614168 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 9 05:38:48.614181 systemd[1]: Stopped systemd-fsck-usr.service. Sep 9 05:38:48.614195 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:38:48.614216 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 05:38:48.614235 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 05:38:48.614249 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:38:48.614263 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 9 05:38:48.614280 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 9 05:38:48.614293 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:38:48.614314 systemd[1]: verity-setup.service: Deactivated successfully. Sep 9 05:38:48.614329 systemd[1]: Stopped verity-setup.service. Sep 9 05:38:48.614343 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:38:48.614358 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 9 05:38:48.614371 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 9 05:38:48.614385 systemd[1]: Mounted media.mount - External Media Directory. Sep 9 05:38:48.614399 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 9 05:38:48.614412 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 9 05:38:48.614429 kernel: loop: module loaded Sep 9 05:38:48.614443 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 9 05:38:48.614456 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:38:48.614470 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 9 05:38:48.614483 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. 
Sep 9 05:38:48.614497 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:38:48.614514 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:38:48.614528 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:38:48.614541 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:38:48.614557 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:38:48.614571 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:38:48.614584 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 05:38:48.614598 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 9 05:38:48.614612 kernel: fuse: init (API version 7.41) Sep 9 05:38:48.614630 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 05:38:48.614644 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:38:48.614658 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 05:38:48.614678 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 9 05:38:48.614697 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 9 05:38:48.614711 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 9 05:38:48.614727 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 9 05:38:48.614741 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 9 05:38:48.614755 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:38:48.614768 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:38:48.614817 systemd-journald[1217]: Collecting audit messages is disabled. Sep 9 05:38:48.614853 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 9 05:38:48.614867 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 9 05:38:48.619777 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 05:38:48.619805 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 9 05:38:48.619820 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 9 05:38:48.619835 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:38:48.619849 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 9 05:38:48.619863 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 05:38:48.619881 systemd-journald[1217]: Journal started Sep 9 05:38:48.619922 systemd-journald[1217]: Runtime Journal (/run/log/journal/296251331e714f6eac2080412db7d6ba) is 4.7M, max 38.2M, 33.4M free. Sep 9 05:38:48.257021 systemd[1]: Queued start job for default target multi-user.target. Sep 9 05:38:48.277491 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 9 05:38:48.278134 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 9 05:38:48.639141 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Sep 9 05:38:48.643078 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 9 05:38:48.646072 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:38:48.650206 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 9 05:38:48.651020 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:38:48.658160 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 9 05:38:48.665450 systemd-tmpfiles[1244]: ACLs are not supported, ignoring. Sep 9 05:38:48.665469 systemd-tmpfiles[1244]: ACLs are not supported, ignoring. Sep 9 05:38:48.682079 kernel: ACPI: bus type drm_connector registered Sep 9 05:38:48.683149 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 05:38:48.683505 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 05:38:48.695464 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 9 05:38:48.696216 kernel: loop0: detected capacity change from 0 to 111000 Sep 9 05:38:48.697538 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:38:48.703501 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 9 05:38:48.707796 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 9 05:38:48.712494 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 9 05:38:48.724082 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 9 05:38:48.723374 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 9 05:38:48.747505 kernel: loop1: detected capacity change from 0 to 8 Sep 9 05:38:48.766079 kernel: loop2: detected capacity change from 0 to 229808 Sep 9 05:38:48.786597 systemd-journald[1217]: Time spent on flushing to /var/log/journal/296251331e714f6eac2080412db7d6ba is 65.945ms for 1185 entries. Sep 9 05:38:48.786597 systemd-journald[1217]: System Journal (/var/log/journal/296251331e714f6eac2080412db7d6ba) is 8M, max 584.8M, 576.8M free. Sep 9 05:38:48.867249 systemd-journald[1217]: Received client request to flush runtime journal. Sep 9 05:38:48.867292 kernel: loop3: detected capacity change from 0 to 128016 Sep 9 05:38:48.785539 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 9 05:38:48.871788 kernel: loop4: detected capacity change from 0 to 111000 Sep 9 05:38:48.797420 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 9 05:38:48.800302 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 05:38:48.801682 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:38:48.865287 systemd-tmpfiles[1287]: ACLs are not supported, ignoring. Sep 9 05:38:48.865304 systemd-tmpfiles[1287]: ACLs are not supported, ignoring. Sep 9 05:38:48.871970 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 9 05:38:48.881422 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Sep 9 05:38:48.888376 kernel: loop5: detected capacity change from 0 to 8 Sep 9 05:38:48.893074 kernel: loop6: detected capacity change from 0 to 229808 Sep 9 05:38:48.914080 kernel: loop7: detected capacity change from 0 to 128016 Sep 9 05:38:48.937853 (sd-merge)[1292]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Sep 9 05:38:48.938381 (sd-merge)[1292]: Merged extensions into '/usr'. Sep 9 05:38:48.944443 systemd[1]: Reload requested from client PID 1253 ('systemd-sysext') (unit systemd-sysext.service)... Sep 9 05:38:48.944462 systemd[1]: Reloading... Sep 9 05:38:49.095069 zram_generator::config[1320]: No configuration found. Sep 9 05:38:49.198929 ldconfig[1249]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 05:38:49.356101 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 9 05:38:49.356568 systemd[1]: Reloading finished in 411 ms. Sep 9 05:38:49.385269 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 05:38:49.386311 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 9 05:38:49.405232 systemd[1]: Starting ensure-sysext.service... Sep 9 05:38:49.409320 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:38:49.442154 systemd[1]: Reload requested from client PID 1376 ('systemctl') (unit ensure-sysext.service)... Sep 9 05:38:49.442173 systemd[1]: Reloading... Sep 9 05:38:49.457667 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 9 05:38:49.457700 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 9 05:38:49.457985 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 9 05:38:49.458837 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 9 05:38:49.460950 systemd-tmpfiles[1377]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 9 05:38:49.461787 systemd-tmpfiles[1377]: ACLs are not supported, ignoring. Sep 9 05:38:49.461852 systemd-tmpfiles[1377]: ACLs are not supported, ignoring. Sep 9 05:38:49.467644 systemd-tmpfiles[1377]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 05:38:49.467656 systemd-tmpfiles[1377]: Skipping /boot Sep 9 05:38:49.483413 systemd-tmpfiles[1377]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 05:38:49.483427 systemd-tmpfiles[1377]: Skipping /boot Sep 9 05:38:49.557135 zram_generator::config[1408]: No configuration found. Sep 9 05:38:49.777429 systemd[1]: Reloading finished in 334 ms. Sep 9 05:38:49.793708 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 9 05:38:49.807522 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:38:49.816563 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:38:49.821319 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 9 05:38:49.831564 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 9 05:38:49.838453 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
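The (sd-merge) entries above show systemd-sysext overlaying the staged extension images ('containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack') into /usr. A small illustrative sketch follows, assuming only the /etc/extensions directory that the Ignition files stage populated earlier (kubernetes.raw linked to /opt/extensions/kubernetes/...); real systemd-sysext also consults other extension directories, which this sketch omits.

```python
# Illustrative sketch: list sysext images staged under /etc/extensions, the
# directory the Ignition "files" stage populated above. Not systemd-sysext
# itself, which performs the actual merge into /usr and /opt.
import os

EXT_DIR = "/etc/extensions"  # directory seen in the log; other search dirs omitted


def staged_extensions(ext_dir: str = EXT_DIR) -> dict[str, str]:
    """Map extension name -> resolved image path for *.raw entries."""
    result: dict[str, str] = {}
    if not os.path.isdir(ext_dir):
        return result
    for entry in os.listdir(ext_dir):
        if entry.endswith(".raw"):
            path = os.path.join(ext_dir, entry)
            result[entry[: -len(".raw")]] = os.path.realpath(path)
    return result


if __name__ == "__main__":
    for name, image in staged_extensions().items():
        print(f"{name}: {image}")
```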
Sep 9 05:38:49.843921 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:38:49.847616 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 9 05:38:49.851947 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:38:49.852164 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:38:49.857804 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:38:49.863649 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:38:49.866373 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:38:49.868226 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:38:49.868365 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:38:49.868468 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:38:49.872304 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:38:49.872513 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:38:49.872696 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:38:49.872785 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:38:49.872874 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:38:49.878328 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:38:49.878602 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:38:49.887143 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 05:38:49.888250 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:38:49.888378 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:38:49.888527 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:38:49.893758 systemd[1]: Finished ensure-sysext.service. Sep 9 05:38:49.895109 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 05:38:49.900539 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
Sep 9 05:38:49.916983 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 9 05:38:49.919286 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 05:38:49.923598 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 9 05:38:49.924051 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 05:38:49.924686 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:38:49.926140 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:38:49.926898 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 05:38:49.930285 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 05:38:49.941023 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:38:49.941256 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:38:49.942402 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 05:38:49.943638 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:38:49.944302 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:38:49.945999 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 05:38:49.947722 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 05:38:49.953430 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 05:38:49.970668 systemd-udevd[1466]: Using default interface naming scheme 'v255'. Sep 9 05:38:49.990418 augenrules[1503]: No rules Sep 9 05:38:49.992672 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 05:38:49.993079 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:38:49.998464 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:38:50.002931 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:38:50.013092 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 9 05:38:50.137008 systemd-resolved[1465]: Positive Trust Anchors: Sep 9 05:38:50.138092 systemd-resolved[1465]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:38:50.138140 systemd-resolved[1465]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:38:50.145202 systemd-resolved[1465]: Using system hostname 'srv-dlh9b.gb1.brightbox.com'. Sep 9 05:38:50.147142 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:38:50.150732 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Sep 9 05:38:50.193184 systemd-networkd[1514]: lo: Link UP Sep 9 05:38:50.193193 systemd-networkd[1514]: lo: Gained carrier Sep 9 05:38:50.195418 systemd-networkd[1514]: Enumeration completed Sep 9 05:38:50.195518 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:38:50.202192 systemd[1]: Reached target network.target - Network. Sep 9 05:38:50.205195 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 05:38:50.206731 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 05:38:50.228179 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 9 05:38:50.228715 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:38:50.229190 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 9 05:38:50.230141 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 05:38:50.230587 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 9 05:38:50.230985 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 05:38:50.231420 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 05:38:50.231451 systemd[1]: Reached target paths.target - Path Units. Sep 9 05:38:50.231798 systemd[1]: Reached target time-set.target - System Time Set. Sep 9 05:38:50.232320 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 05:38:50.232803 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 05:38:50.233254 systemd[1]: Reached target timers.target - Timer Units. Sep 9 05:38:50.235019 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 05:38:50.236939 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 05:38:50.241471 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 05:38:50.242125 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 05:38:50.242568 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 05:38:50.249741 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 05:38:50.250502 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 05:38:50.251791 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 05:38:50.253370 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 05:38:50.254471 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:38:50.254909 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 05:38:50.254939 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 05:38:50.256216 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 05:38:50.260279 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 9 05:38:50.264327 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 05:38:50.270210 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... 
Sep 9 05:38:50.275216 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 05:38:50.282144 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 05:38:50.283134 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 05:38:50.289260 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 9 05:38:50.295311 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 05:38:50.300954 extend-filesystems[1553]: Found /dev/vda6 Sep 9 05:38:50.304258 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 05:38:50.311123 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 05:38:50.313904 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 05:38:50.316074 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:50.322673 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 05:38:50.326110 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 05:38:50.326767 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 05:38:50.327812 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 05:38:50.335230 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 05:38:50.338129 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 05:38:50.354252 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 05:38:50.356785 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Refreshing passwd entry cache Sep 9 05:38:50.359756 oslogin_cache_refresh[1554]: Refreshing passwd entry cache Sep 9 05:38:50.366295 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Failure getting users, quitting Sep 9 05:38:50.368069 oslogin_cache_refresh[1554]: Failure getting users, quitting Sep 9 05:38:50.369197 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 05:38:50.369197 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Refreshing group entry cache Sep 9 05:38:50.369197 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Failure getting groups, quitting Sep 9 05:38:50.369197 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 05:38:50.368095 oslogin_cache_refresh[1554]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 05:38:50.368143 oslogin_cache_refresh[1554]: Refreshing group entry cache Sep 9 05:38:50.368647 oslogin_cache_refresh[1554]: Failure getting groups, quitting Sep 9 05:38:50.368656 oslogin_cache_refresh[1554]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 05:38:50.372627 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 9 05:38:50.373244 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. 
Sep 9 05:38:50.378218 jq[1552]: false Sep 9 05:38:50.384794 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 05:38:50.386126 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 05:38:50.392611 extend-filesystems[1553]: Found /dev/vda9 Sep 9 05:38:50.401296 extend-filesystems[1553]: Checking size of /dev/vda9 Sep 9 05:38:50.401798 jq[1568]: true Sep 9 05:38:50.409555 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 05:38:50.409782 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 05:38:50.412808 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 9 05:38:50.419660 update_engine[1567]: I20250909 05:38:50.418166 1567 main.cc:92] Flatcar Update Engine starting Sep 9 05:38:50.430434 dbus-daemon[1549]: [system] SELinux support is enabled Sep 9 05:38:50.430956 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 05:38:50.435412 (ntainerd)[1587]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 05:38:50.436468 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 05:38:50.436499 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 05:38:50.438155 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 05:38:50.438180 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 05:38:50.438729 systemd[1]: Started update-engine.service - Update Engine. Sep 9 05:38:50.440995 update_engine[1567]: I20250909 05:38:50.440915 1567 update_check_scheduler.cc:74] Next update check in 10m42s Sep 9 05:38:50.447273 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 05:38:50.459349 tar[1578]: linux-amd64/LICENSE Sep 9 05:38:50.459349 tar[1578]: linux-amd64/helm Sep 9 05:38:50.460954 jq[1590]: true Sep 9 05:38:50.464599 extend-filesystems[1553]: Resized partition /dev/vda9 Sep 9 05:38:50.468226 extend-filesystems[1598]: resize2fs 1.47.2 (1-Jan-2025) Sep 9 05:38:50.484467 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Sep 9 05:38:50.508989 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 05:38:50.509300 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 05:38:50.601114 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Sep 9 05:38:50.616148 extend-filesystems[1598]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 9 05:38:50.616148 extend-filesystems[1598]: old_desc_blocks = 1, new_desc_blocks = 8 Sep 9 05:38:50.616148 extend-filesystems[1598]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Sep 9 05:38:50.618972 extend-filesystems[1553]: Resized filesystem in /dev/vda9 Sep 9 05:38:50.616531 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 05:38:50.618098 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
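For scale, the online resize recorded above works out to growing the root filesystem from 1617920 x 4 KiB blocks (roughly 6.2 GiB) to 15121403 x 4 KiB blocks (roughly 57.7 GiB), which is consistent with the old_desc_blocks = 1 to new_desc_blocks = 8 change reported by resize2fs.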
Sep 9 05:38:50.624178 bash[1622]: Updated "/home/core/.ssh/authorized_keys" Sep 9 05:38:50.625308 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 05:38:50.633316 systemd[1]: Starting sshkeys.service... Sep 9 05:38:50.650089 kernel: mousedev: PS/2 mouse device common for all mice Sep 9 05:38:50.700962 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 05:38:50.707326 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 05:38:50.720233 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 9 05:38:50.725126 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 9 05:38:50.739676 systemd-networkd[1514]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:38:50.739686 systemd-networkd[1514]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 05:38:50.743376 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 05:38:50.748849 systemd-networkd[1514]: eth0: Link UP Sep 9 05:38:50.749024 systemd-networkd[1514]: eth0: Gained carrier Sep 9 05:38:50.749047 systemd-networkd[1514]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:38:50.756660 systemd-logind[1565]: New seat seat0. Sep 9 05:38:50.759709 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 05:38:50.764710 locksmithd[1595]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 05:38:50.781070 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:50.789496 sshd_keygen[1572]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 05:38:50.803154 systemd-networkd[1514]: eth0: DHCPv4 address 10.244.98.182/30, gateway 10.244.98.181 acquired from 10.244.98.181 Sep 9 05:38:50.803328 dbus-daemon[1549]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1514 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 9 05:38:50.807505 systemd-timesyncd[1483]: Network configuration changed, trying to establish connection. Sep 9 05:38:50.808406 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... 
Sep 9 05:38:50.819246 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 9 05:38:50.819380 containerd[1587]: time="2025-09-09T05:38:50Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 05:38:50.826024 containerd[1587]: time="2025-09-09T05:38:50.824887895Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 05:38:50.839076 kernel: ACPI: button: Power Button [PWRF] Sep 9 05:38:50.854933 containerd[1587]: time="2025-09-09T05:38:50.854880080Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="18.225µs" Sep 9 05:38:50.854933 containerd[1587]: time="2025-09-09T05:38:50.854921404Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 05:38:50.855042 containerd[1587]: time="2025-09-09T05:38:50.854948604Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 05:38:50.857476 containerd[1587]: time="2025-09-09T05:38:50.857198725Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 05:38:50.857476 containerd[1587]: time="2025-09-09T05:38:50.857234447Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 05:38:50.857476 containerd[1587]: time="2025-09-09T05:38:50.857266000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:38:50.857476 containerd[1587]: time="2025-09-09T05:38:50.857327673Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:38:50.857476 containerd[1587]: time="2025-09-09T05:38:50.857345590Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:38:50.858750 containerd[1587]: time="2025-09-09T05:38:50.858689047Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:38:50.858750 containerd[1587]: time="2025-09-09T05:38:50.858718048Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:38:50.858750 containerd[1587]: time="2025-09-09T05:38:50.858737299Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:38:50.858750 containerd[1587]: time="2025-09-09T05:38:50.858750951Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 05:38:50.858899 containerd[1587]: time="2025-09-09T05:38:50.858840518Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 05:38:50.859722 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
Sep 9 05:38:50.860158 containerd[1587]: time="2025-09-09T05:38:50.859050005Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:38:50.860206 containerd[1587]: time="2025-09-09T05:38:50.860182279Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:38:50.860206 containerd[1587]: time="2025-09-09T05:38:50.860196876Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 05:38:50.861270 containerd[1587]: time="2025-09-09T05:38:50.861216364Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 05:38:50.864228 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 05:38:50.864866 containerd[1587]: time="2025-09-09T05:38:50.864815407Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 05:38:50.864934 containerd[1587]: time="2025-09-09T05:38:50.864917691Z" level=info msg="metadata content store policy set" policy=shared Sep 9 05:38:50.868330 containerd[1587]: time="2025-09-09T05:38:50.868300371Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 05:38:50.868406 containerd[1587]: time="2025-09-09T05:38:50.868364497Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 05:38:50.868406 containerd[1587]: time="2025-09-09T05:38:50.868386112Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 05:38:50.868454 containerd[1587]: time="2025-09-09T05:38:50.868433177Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 05:38:50.868478 containerd[1587]: time="2025-09-09T05:38:50.868452345Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 05:38:50.868478 containerd[1587]: time="2025-09-09T05:38:50.868466843Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 05:38:50.868537 containerd[1587]: time="2025-09-09T05:38:50.868483526Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 05:38:50.868537 containerd[1587]: time="2025-09-09T05:38:50.868499664Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 05:38:50.868537 containerd[1587]: time="2025-09-09T05:38:50.868526809Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 05:38:50.868609 containerd[1587]: time="2025-09-09T05:38:50.868538513Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 05:38:50.868609 containerd[1587]: time="2025-09-09T05:38:50.868552376Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 05:38:50.868609 containerd[1587]: time="2025-09-09T05:38:50.868570355Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869256306Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service 
type=io.containerd.service.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869307421Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869332407Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869346503Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869370235Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869389447Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869404135Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869419223Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869435056Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869455288Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869468794Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869574900Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869595945Z" level=info msg="Start snapshots syncer" Sep 9 05:38:50.870524 containerd[1587]: time="2025-09-09T05:38:50.869627444Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 05:38:50.870892 containerd[1587]: time="2025-09-09T05:38:50.869976576Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 05:38:50.870892 containerd[1587]: time="2025-09-09T05:38:50.870043582Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 05:38:50.874948 containerd[1587]: time="2025-09-09T05:38:50.874919353Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 05:38:50.875079 containerd[1587]: time="2025-09-09T05:38:50.875048009Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 05:38:50.875113 containerd[1587]: time="2025-09-09T05:38:50.875099062Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 05:38:50.875144 containerd[1587]: time="2025-09-09T05:38:50.875114592Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 05:38:50.875144 containerd[1587]: time="2025-09-09T05:38:50.875129733Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 05:38:50.875189 containerd[1587]: time="2025-09-09T05:38:50.875143259Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 05:38:50.875189 containerd[1587]: time="2025-09-09T05:38:50.875155323Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 05:38:50.875189 containerd[1587]: time="2025-09-09T05:38:50.875172648Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 05:38:50.875270 containerd[1587]: time="2025-09-09T05:38:50.875213106Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 05:38:50.875270 containerd[1587]: 
time="2025-09-09T05:38:50.875231608Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 05:38:50.875270 containerd[1587]: time="2025-09-09T05:38:50.875245371Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 05:38:50.875335 containerd[1587]: time="2025-09-09T05:38:50.875279758Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:38:50.875335 containerd[1587]: time="2025-09-09T05:38:50.875296887Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:38:50.875335 containerd[1587]: time="2025-09-09T05:38:50.875306381Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:38:50.875335 containerd[1587]: time="2025-09-09T05:38:50.875315258Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:38:50.875335 containerd[1587]: time="2025-09-09T05:38:50.875323447Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 05:38:50.875335 containerd[1587]: time="2025-09-09T05:38:50.875332182Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 05:38:50.875468 containerd[1587]: time="2025-09-09T05:38:50.875357225Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 05:38:50.875468 containerd[1587]: time="2025-09-09T05:38:50.875388414Z" level=info msg="runtime interface created" Sep 9 05:38:50.875468 containerd[1587]: time="2025-09-09T05:38:50.875394322Z" level=info msg="created NRI interface" Sep 9 05:38:50.875468 containerd[1587]: time="2025-09-09T05:38:50.875403049Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 05:38:50.875468 containerd[1587]: time="2025-09-09T05:38:50.875419122Z" level=info msg="Connect containerd service" Sep 9 05:38:50.875468 containerd[1587]: time="2025-09-09T05:38:50.875454253Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 05:38:50.881565 containerd[1587]: time="2025-09-09T05:38:50.879371372Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 05:38:50.893393 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 05:38:50.893772 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 05:38:50.901418 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 05:38:50.927710 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 05:38:50.938431 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 05:38:50.949436 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 9 05:38:50.950163 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 9 05:38:51.039082 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 9 05:38:51.044077 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 9 05:38:51.064452 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 9 05:38:51.067263 dbus-daemon[1549]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 9 05:38:51.068086 dbus-daemon[1549]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1642 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 9 05:38:51.073472 systemd[1]: Starting polkit.service - Authorization Manager... Sep 9 05:38:51.182040 containerd[1587]: time="2025-09-09T05:38:51.181993008Z" level=info msg="Start subscribing containerd event" Sep 9 05:38:51.182262 containerd[1587]: time="2025-09-09T05:38:51.182222325Z" level=info msg="Start recovering state" Sep 9 05:38:51.182354 containerd[1587]: time="2025-09-09T05:38:51.182330610Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 05:38:51.182397 containerd[1587]: time="2025-09-09T05:38:51.182383739Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 05:38:51.182682 containerd[1587]: time="2025-09-09T05:38:51.182657666Z" level=info msg="Start event monitor" Sep 9 05:38:51.182871 containerd[1587]: time="2025-09-09T05:38:51.182858145Z" level=info msg="Start cni network conf syncer for default" Sep 9 05:38:51.182942 containerd[1587]: time="2025-09-09T05:38:51.182932938Z" level=info msg="Start streaming server" Sep 9 05:38:51.183037 containerd[1587]: time="2025-09-09T05:38:51.183027035Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 05:38:51.183133 containerd[1587]: time="2025-09-09T05:38:51.183123765Z" level=info msg="runtime interface starting up..." Sep 9 05:38:51.183278 containerd[1587]: time="2025-09-09T05:38:51.183206885Z" level=info msg="starting plugins..." Sep 9 05:38:51.185691 containerd[1587]: time="2025-09-09T05:38:51.183403844Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 05:38:51.185691 containerd[1587]: time="2025-09-09T05:38:51.183570827Z" level=info msg="containerd successfully booted in 0.365845s" Sep 9 05:38:51.183851 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 05:38:51.683094 systemd-resolved[1465]: Clock change detected. Flushing caches. Sep 9 05:38:51.683252 systemd-timesyncd[1483]: Contacted time server 178.62.68.79:123 (0.flatcar.pool.ntp.org). Sep 9 05:38:51.683302 systemd-timesyncd[1483]: Initial clock synchronization to Tue 2025-09-09 05:38:51.683038 UTC. Sep 9 05:38:51.685739 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:38:51.698770 tar[1578]: linux-amd64/README.md Sep 9 05:38:51.710932 systemd-logind[1565]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 05:38:51.744529 polkitd[1670]: Started polkitd version 126 Sep 9 05:38:51.745765 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Sep 9 05:38:51.759554 polkitd[1670]: Loading rules from directory /etc/polkit-1/rules.d Sep 9 05:38:51.760045 polkitd[1670]: Loading rules from directory /run/polkit-1/rules.d Sep 9 05:38:51.760105 polkitd[1670]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 05:38:51.760420 polkitd[1670]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 9 05:38:51.760441 polkitd[1670]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 05:38:51.760475 polkitd[1670]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 9 05:38:51.762932 polkitd[1670]: Finished loading, compiling and executing 2 rules Sep 9 05:38:51.764706 systemd[1]: Started polkit.service - Authorization Manager. Sep 9 05:38:51.769194 dbus-daemon[1549]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 9 05:38:51.772835 polkitd[1670]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 9 05:38:51.805464 systemd-hostnamed[1642]: Hostname set to (static) Sep 9 05:38:51.931770 systemd-logind[1565]: Watching system buttons on /dev/input/event3 (Power Button) Sep 9 05:38:52.001738 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:38:52.749024 systemd-networkd[1514]: eth0: Gained IPv6LL Sep 9 05:38:52.752631 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 05:38:52.756365 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 05:38:52.759514 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:38:52.761191 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 05:38:52.799024 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 05:38:53.042721 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:53.042857 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:53.274688 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 05:38:53.281339 systemd[1]: Started sshd@0-10.244.98.182:22-139.178.89.65:52544.service - OpenSSH per-connection server daemon (139.178.89.65:52544). Sep 9 05:38:53.716020 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:38:53.732311 (kubelet)[1726]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:38:54.214883 sshd[1718]: Accepted publickey for core from 139.178.89.65 port 52544 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:38:54.215967 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:54.238018 systemd-logind[1565]: New session 1 of user core. Sep 9 05:38:54.238547 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 05:38:54.245610 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 05:38:54.263436 systemd-networkd[1514]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:18ad:24:19ff:fef4:62b6/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:18ad:24:19ff:fef4:62b6/64 assigned by NDisc. Sep 9 05:38:54.263804 systemd-networkd[1514]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. 
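The NDisc hint logged just above names two systemd.network settings. As a minimal sketch of a drop-in that would act on it, assuming eth0 keeps matching the zz-default.network file mentioned earlier (the drop-in path and token value are illustrative, not taken from this host):

  # /etc/systemd/network/zz-default.network.d/10-ipv6-token.conf  (hypothetical path)
  [Network]
  # fixed interface identifier, so the NDisc-derived address no longer collides (example value)
  IPv6Token=::1a:2b:3c:4d

  # the other option named in the hint: stop forming addresses from advertised autonomous prefixes
  #[IPv6AcceptRA]
  #UseAutonomousPrefix=no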
Sep 9 05:38:54.286218 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 05:38:54.291015 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 05:38:54.306359 (systemd)[1734]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 05:38:54.309819 systemd-logind[1565]: New session c1 of user core. Sep 9 05:38:54.339580 kubelet[1726]: E0909 05:38:54.339504 1726 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:38:54.343367 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:38:54.344053 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:38:54.344931 systemd[1]: kubelet.service: Consumed 1.219s CPU time, 266.3M memory peak. Sep 9 05:38:54.459799 systemd[1734]: Queued start job for default target default.target. Sep 9 05:38:54.468441 systemd[1734]: Created slice app.slice - User Application Slice. Sep 9 05:38:54.468700 systemd[1734]: Reached target paths.target - Paths. Sep 9 05:38:54.468819 systemd[1734]: Reached target timers.target - Timers. Sep 9 05:38:54.470541 systemd[1734]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 05:38:54.515551 systemd[1734]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 05:38:54.515893 systemd[1734]: Reached target sockets.target - Sockets. Sep 9 05:38:54.516162 systemd[1734]: Reached target basic.target - Basic System. Sep 9 05:38:54.516356 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 05:38:54.517740 systemd[1734]: Reached target default.target - Main User Target. Sep 9 05:38:54.517791 systemd[1734]: Startup finished in 195ms. Sep 9 05:38:54.526170 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 05:38:55.057762 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:55.058022 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:55.163941 systemd[1]: Started sshd@1-10.244.98.182:22-139.178.89.65:52560.service - OpenSSH per-connection server daemon (139.178.89.65:52560). Sep 9 05:38:56.083165 sshd[1749]: Accepted publickey for core from 139.178.89.65 port 52560 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:38:56.086480 sshd-session[1749]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:56.096486 systemd-logind[1565]: New session 2 of user core. Sep 9 05:38:56.110042 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 05:38:56.450959 login[1660]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 05:38:56.454589 login[1659]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 05:38:56.457919 systemd-logind[1565]: New session 3 of user core. Sep 9 05:38:56.466002 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 05:38:56.470553 systemd-logind[1565]: New session 4 of user core. Sep 9 05:38:56.480031 systemd[1]: Started session-4.scope - Session 4 of User core. 
Sep 9 05:38:56.703800 sshd[1752]: Connection closed by 139.178.89.65 port 52560 Sep 9 05:38:56.703078 sshd-session[1749]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:56.713399 systemd[1]: sshd@1-10.244.98.182:22-139.178.89.65:52560.service: Deactivated successfully. Sep 9 05:38:56.716419 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 05:38:56.717782 systemd-logind[1565]: Session 2 logged out. Waiting for processes to exit. Sep 9 05:38:56.720443 systemd-logind[1565]: Removed session 2. Sep 9 05:38:56.868207 systemd[1]: Started sshd@2-10.244.98.182:22-139.178.89.65:52574.service - OpenSSH per-connection server daemon (139.178.89.65:52574). Sep 9 05:38:57.777198 sshd[1787]: Accepted publickey for core from 139.178.89.65 port 52574 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:38:57.779814 sshd-session[1787]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:38:57.790715 systemd-logind[1565]: New session 5 of user core. Sep 9 05:38:57.804134 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 05:38:58.401063 sshd[1790]: Connection closed by 139.178.89.65 port 52574 Sep 9 05:38:58.402212 sshd-session[1787]: pam_unix(sshd:session): session closed for user core Sep 9 05:38:58.410563 systemd[1]: sshd@2-10.244.98.182:22-139.178.89.65:52574.service: Deactivated successfully. Sep 9 05:38:58.413385 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 05:38:58.415436 systemd-logind[1565]: Session 5 logged out. Waiting for processes to exit. Sep 9 05:38:58.417001 systemd-logind[1565]: Removed session 5. Sep 9 05:38:59.078694 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:59.078809 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Sep 9 05:38:59.092948 coreos-metadata[1548]: Sep 09 05:38:59.092 WARN failed to locate config-drive, using the metadata service API instead Sep 9 05:38:59.093301 coreos-metadata[1631]: Sep 09 05:38:59.092 WARN failed to locate config-drive, using the metadata service API instead Sep 9 05:38:59.111264 coreos-metadata[1631]: Sep 09 05:38:59.111 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Sep 9 05:38:59.111719 coreos-metadata[1548]: Sep 09 05:38:59.111 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Sep 9 05:38:59.119168 coreos-metadata[1548]: Sep 09 05:38:59.119 INFO Fetch failed with 404: resource not found Sep 9 05:38:59.119168 coreos-metadata[1548]: Sep 09 05:38:59.119 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 9 05:38:59.120071 coreos-metadata[1548]: Sep 09 05:38:59.120 INFO Fetch successful Sep 9 05:38:59.120242 coreos-metadata[1548]: Sep 09 05:38:59.120 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Sep 9 05:38:59.131632 coreos-metadata[1631]: Sep 09 05:38:59.131 INFO Fetch successful Sep 9 05:38:59.131632 coreos-metadata[1631]: Sep 09 05:38:59.131 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 9 05:38:59.135197 coreos-metadata[1548]: Sep 09 05:38:59.135 INFO Fetch successful Sep 9 05:38:59.135485 coreos-metadata[1548]: Sep 09 05:38:59.135 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Sep 9 05:38:59.151812 coreos-metadata[1548]: Sep 09 05:38:59.151 INFO Fetch successful Sep 9 05:38:59.151812 coreos-metadata[1548]: Sep 09 05:38:59.151 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt 
#1 Sep 9 05:38:59.167124 coreos-metadata[1548]: Sep 09 05:38:59.167 INFO Fetch successful Sep 9 05:38:59.167398 coreos-metadata[1548]: Sep 09 05:38:59.167 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Sep 9 05:38:59.173380 coreos-metadata[1631]: Sep 09 05:38:59.173 INFO Fetch successful Sep 9 05:38:59.175639 unknown[1631]: wrote ssh authorized keys file for user: core Sep 9 05:38:59.184682 coreos-metadata[1548]: Sep 09 05:38:59.182 INFO Fetch successful Sep 9 05:38:59.197208 update-ssh-keys[1800]: Updated "/home/core/.ssh/authorized_keys" Sep 9 05:38:59.198656 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 9 05:38:59.202948 systemd[1]: Finished sshkeys.service. Sep 9 05:38:59.220957 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 9 05:38:59.222175 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 05:38:59.222396 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 05:38:59.224947 systemd[1]: Startup finished in 3.413s (kernel) + 15.967s (initrd) + 11.246s (userspace) = 30.627s. Sep 9 05:39:04.561487 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 05:39:04.564844 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:39:04.750944 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:39:04.763335 (kubelet)[1815]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:39:04.837176 kubelet[1815]: E0909 05:39:04.836861 1815 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:39:04.843135 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:39:04.843359 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:39:04.843853 systemd[1]: kubelet.service: Consumed 222ms CPU time, 108.2M memory peak. Sep 9 05:39:08.561836 systemd[1]: Started sshd@3-10.244.98.182:22-139.178.89.65:36830.service - OpenSSH per-connection server daemon (139.178.89.65:36830). Sep 9 05:39:09.499263 sshd[1824]: Accepted publickey for core from 139.178.89.65 port 36830 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:39:09.502341 sshd-session[1824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:39:09.511787 systemd-logind[1565]: New session 6 of user core. Sep 9 05:39:09.520116 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 05:39:10.124778 sshd[1827]: Connection closed by 139.178.89.65 port 36830 Sep 9 05:39:10.126470 sshd-session[1824]: pam_unix(sshd:session): session closed for user core Sep 9 05:39:10.133794 systemd[1]: sshd@3-10.244.98.182:22-139.178.89.65:36830.service: Deactivated successfully. Sep 9 05:39:10.136144 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 05:39:10.137631 systemd-logind[1565]: Session 6 logged out. Waiting for processes to exit. Sep 9 05:39:10.138939 systemd-logind[1565]: Removed session 6. 
Sep 9 05:39:10.288054 systemd[1]: Started sshd@4-10.244.98.182:22-139.178.89.65:57244.service - OpenSSH per-connection server daemon (139.178.89.65:57244). Sep 9 05:39:11.206804 sshd[1833]: Accepted publickey for core from 139.178.89.65 port 57244 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:39:11.208843 sshd-session[1833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:39:11.216579 systemd-logind[1565]: New session 7 of user core. Sep 9 05:39:11.226207 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 05:39:11.823737 sshd[1836]: Connection closed by 139.178.89.65 port 57244 Sep 9 05:39:11.824441 sshd-session[1833]: pam_unix(sshd:session): session closed for user core Sep 9 05:39:11.830749 systemd-logind[1565]: Session 7 logged out. Waiting for processes to exit. Sep 9 05:39:11.831297 systemd[1]: sshd@4-10.244.98.182:22-139.178.89.65:57244.service: Deactivated successfully. Sep 9 05:39:11.834257 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 05:39:11.837817 systemd-logind[1565]: Removed session 7. Sep 9 05:39:11.976111 systemd[1]: Started sshd@5-10.244.98.182:22-139.178.89.65:57256.service - OpenSSH per-connection server daemon (139.178.89.65:57256). Sep 9 05:39:12.905457 sshd[1842]: Accepted publickey for core from 139.178.89.65 port 57256 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:39:12.908112 sshd-session[1842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:39:12.921733 systemd-logind[1565]: New session 8 of user core. Sep 9 05:39:12.929958 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 05:39:13.528986 sshd[1845]: Connection closed by 139.178.89.65 port 57256 Sep 9 05:39:13.529782 sshd-session[1842]: pam_unix(sshd:session): session closed for user core Sep 9 05:39:13.535003 systemd-logind[1565]: Session 8 logged out. Waiting for processes to exit. Sep 9 05:39:13.535409 systemd[1]: sshd@5-10.244.98.182:22-139.178.89.65:57256.service: Deactivated successfully. Sep 9 05:39:13.538572 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 05:39:13.541077 systemd-logind[1565]: Removed session 8. Sep 9 05:39:13.694420 systemd[1]: Started sshd@6-10.244.98.182:22-139.178.89.65:57258.service - OpenSSH per-connection server daemon (139.178.89.65:57258). Sep 9 05:39:14.691486 sshd[1851]: Accepted publickey for core from 139.178.89.65 port 57258 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:39:14.693777 sshd-session[1851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:39:14.702705 systemd-logind[1565]: New session 9 of user core. Sep 9 05:39:14.708951 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 9 05:39:15.061751 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 9 05:39:15.066333 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:39:15.222731 sudo[1858]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 05:39:15.223054 sudo[1858]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:39:15.243106 sudo[1858]: pam_unix(sudo:session): session closed for user root Sep 9 05:39:15.268826 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 05:39:15.283028 (kubelet)[1865]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:39:15.347007 kubelet[1865]: E0909 05:39:15.346375 1865 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:39:15.351292 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:39:15.351553 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:39:15.352883 systemd[1]: kubelet.service: Consumed 215ms CPU time, 108.9M memory peak. Sep 9 05:39:15.397768 sshd[1854]: Connection closed by 139.178.89.65 port 57258 Sep 9 05:39:15.399047 sshd-session[1851]: pam_unix(sshd:session): session closed for user core Sep 9 05:39:15.406966 systemd[1]: sshd@6-10.244.98.182:22-139.178.89.65:57258.service: Deactivated successfully. Sep 9 05:39:15.410891 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 05:39:15.412477 systemd-logind[1565]: Session 9 logged out. Waiting for processes to exit. Sep 9 05:39:15.414732 systemd-logind[1565]: Removed session 9. Sep 9 05:39:15.559468 systemd[1]: Started sshd@7-10.244.98.182:22-139.178.89.65:57264.service - OpenSSH per-connection server daemon (139.178.89.65:57264). Sep 9 05:39:16.483144 sshd[1877]: Accepted publickey for core from 139.178.89.65 port 57264 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:39:16.485267 sshd-session[1877]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:39:16.493278 systemd-logind[1565]: New session 10 of user core. Sep 9 05:39:16.499849 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 05:39:16.962281 sudo[1882]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 05:39:16.962566 sudo[1882]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:39:16.968122 sudo[1882]: pam_unix(sudo:session): session closed for user root Sep 9 05:39:16.982698 sudo[1881]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 05:39:16.983031 sudo[1881]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:39:16.998996 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:39:17.050856 augenrules[1904]: No rules Sep 9 05:39:17.052403 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 05:39:17.052639 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:39:17.055014 sudo[1881]: pam_unix(sudo:session): session closed for user root Sep 9 05:39:17.198858 sshd[1880]: Connection closed by 139.178.89.65 port 57264 Sep 9 05:39:17.200050 sshd-session[1877]: pam_unix(sshd:session): session closed for user core Sep 9 05:39:17.207142 systemd[1]: sshd@7-10.244.98.182:22-139.178.89.65:57264.service: Deactivated successfully. Sep 9 05:39:17.207713 systemd-logind[1565]: Session 10 logged out. Waiting for processes to exit. Sep 9 05:39:17.209586 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 05:39:17.212065 systemd-logind[1565]: Removed session 10. 
Sep 9 05:39:17.362086 systemd[1]: Started sshd@8-10.244.98.182:22-139.178.89.65:57272.service - OpenSSH per-connection server daemon (139.178.89.65:57272). Sep 9 05:39:18.271280 sshd[1913]: Accepted publickey for core from 139.178.89.65 port 57272 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:39:18.273727 sshd-session[1913]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:39:18.282222 systemd-logind[1565]: New session 11 of user core. Sep 9 05:39:18.289889 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 05:39:18.752362 sudo[1917]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 05:39:18.753251 sudo[1917]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:39:19.182938 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 05:39:19.210436 (dockerd)[1934]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 05:39:19.516614 dockerd[1934]: time="2025-09-09T05:39:19.516296084Z" level=info msg="Starting up" Sep 9 05:39:19.519803 dockerd[1934]: time="2025-09-09T05:39:19.519621194Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 05:39:19.536714 dockerd[1934]: time="2025-09-09T05:39:19.536647042Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 05:39:19.551371 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3080977088-merged.mount: Deactivated successfully. Sep 9 05:39:19.576142 dockerd[1934]: time="2025-09-09T05:39:19.576069824Z" level=info msg="Loading containers: start." Sep 9 05:39:19.595710 kernel: Initializing XFRM netlink socket Sep 9 05:39:19.920032 systemd-networkd[1514]: docker0: Link UP Sep 9 05:39:19.922387 dockerd[1934]: time="2025-09-09T05:39:19.921795999Z" level=info msg="Loading containers: done." Sep 9 05:39:19.937195 dockerd[1934]: time="2025-09-09T05:39:19.937124282Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 05:39:19.937705 dockerd[1934]: time="2025-09-09T05:39:19.937624401Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 05:39:19.938046 dockerd[1934]: time="2025-09-09T05:39:19.938011580Z" level=info msg="Initializing buildkit" Sep 9 05:39:19.961357 dockerd[1934]: time="2025-09-09T05:39:19.961120596Z" level=info msg="Completed buildkit initialization" Sep 9 05:39:19.973411 dockerd[1934]: time="2025-09-09T05:39:19.973366018Z" level=info msg="Daemon has completed initialization" Sep 9 05:39:19.974106 dockerd[1934]: time="2025-09-09T05:39:19.973625521Z" level=info msg="API listen on /run/docker.sock" Sep 9 05:39:19.973683 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 05:39:20.547012 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2028860221-merged.mount: Deactivated successfully. Sep 9 05:39:21.194686 containerd[1587]: time="2025-09-09T05:39:21.194513461Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Sep 9 05:39:22.071846 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3459709525.mount: Deactivated successfully. 
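dockerd stamps its own messages with RFC 3339 timestamps in the time="..." field, so the daemon's startup cost can be read directly from this log: "Starting up" at 05:39:19.516 and "API listen on /run/docker.sock" at 05:39:19.973 put initialization at roughly 0.46 s. A rough sketch of that calculation (the timestamps are copied from the entries above; the helper name is mine):

from datetime import datetime, timezone

def parse_docker_ts(ts):
    """Parse dockerd's RFC 3339 timestamps (nanosecond precision); Python's
    datetime only keeps microseconds, so the extra digits are trimmed."""
    base, frac = ts.rstrip("Z").split(".")
    return datetime.fromisoformat(base).replace(
        microsecond=int(frac[:6].ljust(6, "0")), tzinfo=timezone.utc)

start = parse_docker_ts("2025-09-09T05:39:19.516296084Z")   # "Starting up"
ready = parse_docker_ts("2025-09-09T05:39:19.973625521Z")   # "API listen on /run/docker.sock"
print((ready - start).total_seconds())                      # ~0.457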
Sep 9 05:39:24.285052 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Sep 9 05:39:24.629586 containerd[1587]: time="2025-09-09T05:39:24.628653645Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:24.629586 containerd[1587]: time="2025-09-09T05:39:24.629450570Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=30078672" Sep 9 05:39:24.629586 containerd[1587]: time="2025-09-09T05:39:24.629538905Z" level=info msg="ImageCreate event name:\"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:24.631971 containerd[1587]: time="2025-09-09T05:39:24.631940726Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:24.632944 containerd[1587]: time="2025-09-09T05:39:24.632921088Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"30075464\" in 3.438330831s" Sep 9 05:39:24.633049 containerd[1587]: time="2025-09-09T05:39:24.633036171Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\"" Sep 9 05:39:24.634021 containerd[1587]: time="2025-09-09T05:39:24.633975668Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Sep 9 05:39:25.560373 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 9 05:39:25.562747 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:39:25.729078 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:39:25.740304 (kubelet)[2213]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:39:25.787421 kubelet[2213]: E0909 05:39:25.787380 2213 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:39:25.790957 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:39:25.791197 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:39:25.791714 systemd[1]: kubelet.service: Consumed 179ms CPU time, 108.3M memory peak. 
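Each containerd "Pulled image" record carries both the image size in bytes and the wall-clock pull time, so effective registry throughput falls straight out of the log: the kube-apiserver image above is 30,075,464 bytes in 3.438 s, about 8.7 MB/s. A hedged parsing sketch, tailored to the message format shown here (quotes arrive backslash-escaped in the journal text):

import re

PULLED_RE = re.compile(
    r'Pulled image "(?P<image>[^"]+)".*size "(?P<size>\d+)" in (?P<dur>[\d.]+)(?P<unit>ms|s)\b')

def pull_throughput(raw_line):
    """Return (image, MB/s) for a 'Pulled image ... size "N" in D' line, or None."""
    line = raw_line.replace('\\"', '"')
    m = PULLED_RE.search(line)
    if not m:
        return None
    secs = float(m.group("dur")) / (1000.0 if m.group("unit") == "ms" else 1.0)
    return m.group("image"), int(m.group("size")) / secs / 1e6

sample = 'Pulled image \\"registry.k8s.io/kube-apiserver:v1.33.4\\" ... size \\"30075464\\" in 3.438330831s'
print(pull_throughput(sample))   # ('registry.k8s.io/kube-apiserver:v1.33.4', ~8.7)

The same arithmetic applies to the controller-manager, scheduler, kube-proxy, coredns, pause and etcd pulls that follow.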
Sep 9 05:39:26.979455 containerd[1587]: time="2025-09-09T05:39:26.979333104Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:26.982159 containerd[1587]: time="2025-09-09T05:39:26.982087192Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=26018074" Sep 9 05:39:26.984818 containerd[1587]: time="2025-09-09T05:39:26.984718873Z" level=info msg="ImageCreate event name:\"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:26.989347 containerd[1587]: time="2025-09-09T05:39:26.988306761Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:26.989822 containerd[1587]: time="2025-09-09T05:39:26.989296720Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"27646961\" in 2.355172535s" Sep 9 05:39:26.989822 containerd[1587]: time="2025-09-09T05:39:26.989812637Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\"" Sep 9 05:39:26.990686 containerd[1587]: time="2025-09-09T05:39:26.990410128Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\"" Sep 9 05:39:28.751431 containerd[1587]: time="2025-09-09T05:39:28.750449407Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:28.751431 containerd[1587]: time="2025-09-09T05:39:28.751130690Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=20153919" Sep 9 05:39:28.751431 containerd[1587]: time="2025-09-09T05:39:28.751379352Z" level=info msg="ImageCreate event name:\"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:28.753737 containerd[1587]: time="2025-09-09T05:39:28.753710712Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:28.754732 containerd[1587]: time="2025-09-09T05:39:28.754704390Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"21782824\" in 1.764260718s" Sep 9 05:39:28.754789 containerd[1587]: time="2025-09-09T05:39:28.754737448Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\"" Sep 9 05:39:28.755341 containerd[1587]: 
time="2025-09-09T05:39:28.755320239Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\"" Sep 9 05:39:29.984233 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1359872215.mount: Deactivated successfully. Sep 9 05:39:30.536376 containerd[1587]: time="2025-09-09T05:39:30.536287082Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:30.537309 containerd[1587]: time="2025-09-09T05:39:30.537256209Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=31899634" Sep 9 05:39:30.537625 containerd[1587]: time="2025-09-09T05:39:30.537569703Z" level=info msg="ImageCreate event name:\"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:30.539637 containerd[1587]: time="2025-09-09T05:39:30.539582176Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:30.540487 containerd[1587]: time="2025-09-09T05:39:30.540427020Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"31898645\" in 1.784988531s" Sep 9 05:39:30.540487 containerd[1587]: time="2025-09-09T05:39:30.540473081Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\"" Sep 9 05:39:30.541261 containerd[1587]: time="2025-09-09T05:39:30.541200531Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 9 05:39:31.223416 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2336365509.mount: Deactivated successfully. 
Sep 9 05:39:32.214064 containerd[1587]: time="2025-09-09T05:39:32.214016130Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:32.215881 containerd[1587]: time="2025-09-09T05:39:32.215852530Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942246" Sep 9 05:39:32.216089 containerd[1587]: time="2025-09-09T05:39:32.216057623Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:32.218979 containerd[1587]: time="2025-09-09T05:39:32.218943850Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:32.220203 containerd[1587]: time="2025-09-09T05:39:32.220057545Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.678831212s" Sep 9 05:39:32.220203 containerd[1587]: time="2025-09-09T05:39:32.220089270Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 9 05:39:32.221117 containerd[1587]: time="2025-09-09T05:39:32.220951012Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 05:39:32.819025 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2321222618.mount: Deactivated successfully. 
Sep 9 05:39:32.822706 containerd[1587]: time="2025-09-09T05:39:32.822174495Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:39:32.822867 containerd[1587]: time="2025-09-09T05:39:32.822843714Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Sep 9 05:39:32.824181 containerd[1587]: time="2025-09-09T05:39:32.823123618Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:39:32.824641 containerd[1587]: time="2025-09-09T05:39:32.824618426Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:39:32.825417 containerd[1587]: time="2025-09-09T05:39:32.825389423Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 604.165294ms" Sep 9 05:39:32.825509 containerd[1587]: time="2025-09-09T05:39:32.825494029Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 9 05:39:32.826008 containerd[1587]: time="2025-09-09T05:39:32.825989803Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 9 05:39:33.515610 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2437268351.mount: Deactivated successfully. 
Sep 9 05:39:35.491930 containerd[1587]: time="2025-09-09T05:39:35.491808746Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:35.492963 containerd[1587]: time="2025-09-09T05:39:35.492925876Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58377879" Sep 9 05:39:35.493696 containerd[1587]: time="2025-09-09T05:39:35.493373012Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:35.497417 containerd[1587]: time="2025-09-09T05:39:35.496123600Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:35.497417 containerd[1587]: time="2025-09-09T05:39:35.497156201Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.671021812s" Sep 9 05:39:35.497417 containerd[1587]: time="2025-09-09T05:39:35.497217382Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Sep 9 05:39:35.812059 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 9 05:39:35.816431 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:39:36.034464 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:39:36.049558 (kubelet)[2361]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:39:36.139517 kubelet[2361]: E0909 05:39:36.139315 2361 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:39:36.145786 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:39:36.146086 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:39:36.146619 systemd[1]: kubelet.service: Consumed 227ms CPU time, 110.4M memory peak. Sep 9 05:39:36.611057 update_engine[1567]: I20250909 05:39:36.609804 1567 update_attempter.cc:509] Updating boot flags... Sep 9 05:39:39.439148 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:39:39.439872 systemd[1]: kubelet.service: Consumed 227ms CPU time, 110.4M memory peak. Sep 9 05:39:39.443039 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:39:39.476506 systemd[1]: Reload requested from client PID 2405 ('systemctl') (unit session-11.scope)... Sep 9 05:39:39.476690 systemd[1]: Reloading... Sep 9 05:39:39.612751 zram_generator::config[2449]: No configuration found. Sep 9 05:39:39.876706 systemd[1]: Reloading finished in 399 ms. Sep 9 05:39:39.956549 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
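This reload was requested from session-11.scope, i.e. from the install.sh run earlier in the log: the installer has dropped new unit configuration (presumably kubelet drop-ins, given the flags that appear on the next start), reloads systemd, and stops the still-crashlooping kubelet so it can come back with real settings. Whether a unit is running stale configuration, and how many times systemd has restarted it, can also be queried directly; a small sketch using two real unit properties, NeedDaemonReload and NRestarts:

import subprocess

def unit_property(unit, prop):
    """Read one systemd unit property via `systemctl show -p ... --value`."""
    return subprocess.run(
        ["systemctl", "show", unit, "-p", prop, "--value"],
        capture_output=True, text=True, check=True).stdout.strip()

if __name__ == "__main__":
    unit = "kubelet.service"
    print("restarts so far:    ", unit_property(unit, "NRestarts"))
    print("needs daemon-reload:", unit_property(unit, "NeedDaemonReload"))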
Sep 9 05:39:39.960019 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 05:39:39.960358 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:39:39.960430 systemd[1]: kubelet.service: Consumed 123ms CPU time, 98M memory peak. Sep 9 05:39:39.962639 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:39:40.133755 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:39:40.150106 (kubelet)[2519]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:39:40.204935 kubelet[2519]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:39:40.206108 kubelet[2519]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 05:39:40.206108 kubelet[2519]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:39:40.206108 kubelet[2519]: I0909 05:39:40.205396 2519 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:39:40.565739 kubelet[2519]: I0909 05:39:40.565687 2519 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 9 05:39:40.565978 kubelet[2519]: I0909 05:39:40.565960 2519 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:39:40.566557 kubelet[2519]: I0909 05:39:40.566533 2519 server.go:956] "Client rotation is on, will bootstrap in background" Sep 9 05:39:40.621018 kubelet[2519]: I0909 05:39:40.620988 2519 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:39:40.622632 kubelet[2519]: E0909 05:39:40.622602 2519 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.244.98.182:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.98.182:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 9 05:39:40.638181 kubelet[2519]: I0909 05:39:40.638145 2519 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:39:40.645627 kubelet[2519]: I0909 05:39:40.645098 2519 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 05:39:40.650253 kubelet[2519]: I0909 05:39:40.650191 2519 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:39:40.654527 kubelet[2519]: I0909 05:39:40.650485 2519 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-dlh9b.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 05:39:40.654838 kubelet[2519]: I0909 05:39:40.654823 2519 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 05:39:40.654912 kubelet[2519]: I0909 05:39:40.654903 2519 container_manager_linux.go:303] "Creating device plugin manager" Sep 9 05:39:40.655115 kubelet[2519]: I0909 05:39:40.655104 2519 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:39:40.658219 kubelet[2519]: I0909 05:39:40.657933 2519 kubelet.go:480] "Attempting to sync node with API server" Sep 9 05:39:40.658219 kubelet[2519]: I0909 05:39:40.657968 2519 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 05:39:40.658219 kubelet[2519]: I0909 05:39:40.658008 2519 kubelet.go:386] "Adding apiserver pod source" Sep 9 05:39:40.658219 kubelet[2519]: I0909 05:39:40.658031 2519 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 05:39:40.672311 kubelet[2519]: I0909 05:39:40.671820 2519 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 05:39:40.674681 kubelet[2519]: I0909 05:39:40.673189 2519 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 9 05:39:40.678143 kubelet[2519]: W0909 05:39:40.677760 2519 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
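The nodeConfig dump above is dense but worth unpacking: the kubelet is using the systemd cgroup driver on cgroup v2, the CPU, memory and topology managers are all at their none/None defaults, and eviction is governed by the standard hard thresholds. Restated as plain data (values copied from the HardEvictionThresholds list above):

# Hard eviction thresholds carried in the kubelet's nodeConfig above.
HARD_EVICTION_THRESHOLDS = {
    "memory.available":   "100Mi",   # absolute quantity
    "nodefs.available":   "10%",
    "nodefs.inodesFree":  "5%",
    "imagefs.available":  "15%",
    "imagefs.inodesFree": "5%",
}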
Sep 9 05:39:40.685702 kubelet[2519]: E0909 05:39:40.685632 2519 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.244.98.182:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-dlh9b.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.98.182:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 9 05:39:40.689547 kubelet[2519]: I0909 05:39:40.689518 2519 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 05:39:40.689865 kubelet[2519]: I0909 05:39:40.689851 2519 server.go:1289] "Started kubelet" Sep 9 05:39:40.694142 kubelet[2519]: E0909 05:39:40.693993 2519 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.244.98.182:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.98.182:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 9 05:39:40.702832 kubelet[2519]: I0909 05:39:40.702619 2519 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 05:39:40.705185 kubelet[2519]: E0909 05:39:40.699640 2519 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.244.98.182:6443/api/v1/namespaces/default/events\": dial tcp 10.244.98.182:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-dlh9b.gb1.brightbox.com.186386b278911e13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-dlh9b.gb1.brightbox.com,UID:srv-dlh9b.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-dlh9b.gb1.brightbox.com,},FirstTimestamp:2025-09-09 05:39:40.689702419 +0000 UTC m=+0.534875746,LastTimestamp:2025-09-09 05:39:40.689702419 +0000 UTC m=+0.534875746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-dlh9b.gb1.brightbox.com,}" Sep 9 05:39:40.711524 kubelet[2519]: I0909 05:39:40.711460 2519 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 05:39:40.713695 kubelet[2519]: I0909 05:39:40.713564 2519 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 05:39:40.714056 kubelet[2519]: E0909 05:39:40.714035 2519 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" Sep 9 05:39:40.714262 kubelet[2519]: I0909 05:39:40.714243 2519 server.go:317] "Adding debug handlers to kubelet server" Sep 9 05:39:40.720157 kubelet[2519]: I0909 05:39:40.719947 2519 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 05:39:40.720157 kubelet[2519]: I0909 05:39:40.720032 2519 reconciler.go:26] "Reconciler: start to sync state" Sep 9 05:39:40.720786 kubelet[2519]: I0909 05:39:40.720733 2519 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 05:39:40.720982 kubelet[2519]: I0909 05:39:40.720964 2519 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 05:39:40.721173 kubelet[2519]: I0909 05:39:40.721157 2519 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 05:39:40.721750 
kubelet[2519]: I0909 05:39:40.721705 2519 factory.go:223] Registration of the systemd container factory successfully Sep 9 05:39:40.722090 kubelet[2519]: I0909 05:39:40.721784 2519 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 05:39:40.725884 kubelet[2519]: I0909 05:39:40.723889 2519 factory.go:223] Registration of the containerd container factory successfully Sep 9 05:39:40.731763 kubelet[2519]: E0909 05:39:40.730310 2519 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.98.182:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-dlh9b.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.98.182:6443: connect: connection refused" interval="200ms" Sep 9 05:39:40.736248 kubelet[2519]: I0909 05:39:40.736209 2519 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 9 05:39:40.739165 kubelet[2519]: I0909 05:39:40.739144 2519 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 9 05:39:40.739294 kubelet[2519]: I0909 05:39:40.739286 2519 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 9 05:39:40.739369 kubelet[2519]: I0909 05:39:40.739360 2519 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 9 05:39:40.739414 kubelet[2519]: I0909 05:39:40.739409 2519 kubelet.go:2436] "Starting kubelet main sync loop" Sep 9 05:39:40.739523 kubelet[2519]: E0909 05:39:40.739497 2519 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:39:40.752761 kubelet[2519]: E0909 05:39:40.752733 2519 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.244.98.182:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.98.182:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 9 05:39:40.754699 kubelet[2519]: E0909 05:39:40.754635 2519 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.244.98.182:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.98.182:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 9 05:39:40.754807 kubelet[2519]: E0909 05:39:40.754687 2519 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 05:39:40.760988 kubelet[2519]: I0909 05:39:40.760963 2519 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 05:39:40.761408 kubelet[2519]: I0909 05:39:40.760981 2519 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 05:39:40.761461 kubelet[2519]: I0909 05:39:40.761418 2519 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:39:40.762402 kubelet[2519]: I0909 05:39:40.762384 2519 policy_none.go:49] "None policy: Start" Sep 9 05:39:40.762472 kubelet[2519]: I0909 05:39:40.762414 2519 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 05:39:40.762472 kubelet[2519]: I0909 05:39:40.762430 2519 state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:39:40.768227 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 05:39:40.784055 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 05:39:40.788609 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 05:39:40.800507 kubelet[2519]: E0909 05:39:40.799900 2519 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 9 05:39:40.800507 kubelet[2519]: I0909 05:39:40.800108 2519 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:39:40.800507 kubelet[2519]: I0909 05:39:40.800122 2519 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:39:40.801634 kubelet[2519]: I0909 05:39:40.801172 2519 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:39:40.803104 kubelet[2519]: E0909 05:39:40.802796 2519 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 05:39:40.803104 kubelet[2519]: E0909 05:39:40.802871 2519 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-dlh9b.gb1.brightbox.com\" not found" Sep 9 05:39:40.861180 systemd[1]: Created slice kubepods-burstable-pod57a85f26c4568eb74535e295a97b7549.slice - libcontainer container kubepods-burstable-pod57a85f26c4568eb74535e295a97b7549.slice. Sep 9 05:39:40.874858 kubelet[2519]: E0909 05:39:40.874802 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.876441 systemd[1]: Created slice kubepods-burstable-pod93b13c2d8116cafd4c821e3765cb43eb.slice - libcontainer container kubepods-burstable-pod93b13c2d8116cafd4c821e3765cb43eb.slice. Sep 9 05:39:40.883615 kubelet[2519]: E0909 05:39:40.883576 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.885282 systemd[1]: Created slice kubepods-burstable-pode280dfff4430c8a4ee56085d9ee6c011.slice - libcontainer container kubepods-burstable-pode280dfff4430c8a4ee56085d9ee6c011.slice. 
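The kubepods-burstable-pod<hash>.slice units created above reflect the kubelet's systemd cgroup layout: a kubepods.slice root, per-QoS child slices (burstable, besteffort), and one slice per pod whose suffix is the pod UID, which for these static pods is the hash of the pod manifest (57a85f26..., 93b13c2d..., e280dfff...). A hedged helper reproducing the naming; the dash-to-underscore handling for ordinary pod UIDs is my assumption, since systemd reserves '-' as its hierarchy separator:

def pod_slice_name(pod_uid, qos_class="burstable"):
    """Systemd slice name the kubelet uses for a pod cgroup (sketch).
    qos_class is 'guaranteed', 'burstable' or 'besteffort'."""
    uid = pod_uid.replace("-", "_")        # assumption for regular (dashed) pod UIDs
    if qos_class == "guaranteed":
        return f"kubepods-pod{uid}.slice"  # guaranteed pods sit directly under kubepods.slice
    return f"kubepods-{qos_class}-pod{uid}.slice"

print(pod_slice_name("57a85f26c4568eb74535e295a97b7549"))
# kubepods-burstable-pod57a85f26c4568eb74535e295a97b7549.slice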
Sep 9 05:39:40.887750 kubelet[2519]: E0909 05:39:40.887725 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.902612 kubelet[2519]: I0909 05:39:40.902270 2519 kubelet_node_status.go:75] "Attempting to register node" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.902892 kubelet[2519]: E0909 05:39:40.902867 2519 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.98.182:6443/api/v1/nodes\": dial tcp 10.244.98.182:6443: connect: connection refused" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.921817 kubelet[2519]: I0909 05:39:40.921726 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-flexvolume-dir\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.921817 kubelet[2519]: I0909 05:39:40.921821 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-k8s-certs\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.922158 kubelet[2519]: I0909 05:39:40.921879 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/57a85f26c4568eb74535e295a97b7549-ca-certs\") pod \"kube-apiserver-srv-dlh9b.gb1.brightbox.com\" (UID: \"57a85f26c4568eb74535e295a97b7549\") " pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.922158 kubelet[2519]: I0909 05:39:40.921923 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/57a85f26c4568eb74535e295a97b7549-k8s-certs\") pod \"kube-apiserver-srv-dlh9b.gb1.brightbox.com\" (UID: \"57a85f26c4568eb74535e295a97b7549\") " pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.922158 kubelet[2519]: I0909 05:39:40.921969 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-kubeconfig\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.922158 kubelet[2519]: I0909 05:39:40.922013 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.922158 kubelet[2519]: I0909 05:39:40.922057 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/e280dfff4430c8a4ee56085d9ee6c011-kubeconfig\") pod \"kube-scheduler-srv-dlh9b.gb1.brightbox.com\" (UID: \"e280dfff4430c8a4ee56085d9ee6c011\") " pod="kube-system/kube-scheduler-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.922569 kubelet[2519]: I0909 05:39:40.922099 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/57a85f26c4568eb74535e295a97b7549-usr-share-ca-certificates\") pod \"kube-apiserver-srv-dlh9b.gb1.brightbox.com\" (UID: \"57a85f26c4568eb74535e295a97b7549\") " pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.922569 kubelet[2519]: I0909 05:39:40.922137 2519 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-ca-certs\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:40.931709 kubelet[2519]: E0909 05:39:40.931572 2519 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.98.182:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-dlh9b.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.98.182:6443: connect: connection refused" interval="400ms" Sep 9 05:39:41.106690 kubelet[2519]: I0909 05:39:41.106606 2519 kubelet_node_status.go:75] "Attempting to register node" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:41.107584 kubelet[2519]: E0909 05:39:41.107526 2519 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.98.182:6443/api/v1/nodes\": dial tcp 10.244.98.182:6443: connect: connection refused" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:41.181523 containerd[1587]: time="2025-09-09T05:39:41.180907597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-dlh9b.gb1.brightbox.com,Uid:57a85f26c4568eb74535e295a97b7549,Namespace:kube-system,Attempt:0,}" Sep 9 05:39:41.192851 containerd[1587]: time="2025-09-09T05:39:41.192804198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-dlh9b.gb1.brightbox.com,Uid:93b13c2d8116cafd4c821e3765cb43eb,Namespace:kube-system,Attempt:0,}" Sep 9 05:39:41.193206 containerd[1587]: time="2025-09-09T05:39:41.193052706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-dlh9b.gb1.brightbox.com,Uid:e280dfff4430c8a4ee56085d9ee6c011,Namespace:kube-system,Attempt:0,}" Sep 9 05:39:41.300684 containerd[1587]: time="2025-09-09T05:39:41.299877375Z" level=info msg="connecting to shim e778bab5714e357dcc277aa1b4573b432823aa501c16a9bbf72da632c726a0d4" address="unix:///run/containerd/s/9a2528f0512fffd111e0cbe71455205c8ccda2f0153564042313fda22a793644" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:39:41.302480 containerd[1587]: time="2025-09-09T05:39:41.302438044Z" level=info msg="connecting to shim 08bad4788f543f86af42f380a0fe96d7a1f95c3a30e0a83e5a8673f0450b07f3" address="unix:///run/containerd/s/c2ed28394c1cd81f5299f277a61f8ebdea9b825c25b78e9f77363760dc0bfe1c" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:39:41.304482 containerd[1587]: time="2025-09-09T05:39:41.304417051Z" level=info msg="connecting to shim f47010c7e2a7e7a0b04e760051c71c0b20ff45e23a5de38ae8660952b57032ad" 
address="unix:///run/containerd/s/ba3c0008c925837a18b16fd15a5e45477eb247af8e7a04d96a5991b1aec17f8d" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:39:41.332798 kubelet[2519]: E0909 05:39:41.332728 2519 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.98.182:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-dlh9b.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.98.182:6443: connect: connection refused" interval="800ms" Sep 9 05:39:41.420067 systemd[1]: Started cri-containerd-f47010c7e2a7e7a0b04e760051c71c0b20ff45e23a5de38ae8660952b57032ad.scope - libcontainer container f47010c7e2a7e7a0b04e760051c71c0b20ff45e23a5de38ae8660952b57032ad. Sep 9 05:39:41.432871 systemd[1]: Started cri-containerd-08bad4788f543f86af42f380a0fe96d7a1f95c3a30e0a83e5a8673f0450b07f3.scope - libcontainer container 08bad4788f543f86af42f380a0fe96d7a1f95c3a30e0a83e5a8673f0450b07f3. Sep 9 05:39:41.449128 systemd[1]: Started cri-containerd-e778bab5714e357dcc277aa1b4573b432823aa501c16a9bbf72da632c726a0d4.scope - libcontainer container e778bab5714e357dcc277aa1b4573b432823aa501c16a9bbf72da632c726a0d4. Sep 9 05:39:41.509926 kubelet[2519]: I0909 05:39:41.509892 2519 kubelet_node_status.go:75] "Attempting to register node" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:41.510509 kubelet[2519]: E0909 05:39:41.510484 2519 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.98.182:6443/api/v1/nodes\": dial tcp 10.244.98.182:6443: connect: connection refused" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:41.544777 containerd[1587]: time="2025-09-09T05:39:41.544736868Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-dlh9b.gb1.brightbox.com,Uid:e280dfff4430c8a4ee56085d9ee6c011,Namespace:kube-system,Attempt:0,} returns sandbox id \"08bad4788f543f86af42f380a0fe96d7a1f95c3a30e0a83e5a8673f0450b07f3\"" Sep 9 05:39:41.545295 containerd[1587]: time="2025-09-09T05:39:41.544857077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-dlh9b.gb1.brightbox.com,Uid:57a85f26c4568eb74535e295a97b7549,Namespace:kube-system,Attempt:0,} returns sandbox id \"f47010c7e2a7e7a0b04e760051c71c0b20ff45e23a5de38ae8660952b57032ad\"" Sep 9 05:39:41.555287 containerd[1587]: time="2025-09-09T05:39:41.555058199Z" level=info msg="CreateContainer within sandbox \"08bad4788f543f86af42f380a0fe96d7a1f95c3a30e0a83e5a8673f0450b07f3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 05:39:41.556454 containerd[1587]: time="2025-09-09T05:39:41.556415031Z" level=info msg="CreateContainer within sandbox \"f47010c7e2a7e7a0b04e760051c71c0b20ff45e23a5de38ae8660952b57032ad\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 05:39:41.562221 containerd[1587]: time="2025-09-09T05:39:41.562116309Z" level=info msg="Container 51e45207968ff96e5bc005d14fea87221b380534ab2ceed1e961d2204578e696: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:39:41.568904 containerd[1587]: time="2025-09-09T05:39:41.568845591Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-dlh9b.gb1.brightbox.com,Uid:93b13c2d8116cafd4c821e3765cb43eb,Namespace:kube-system,Attempt:0,} returns sandbox id \"e778bab5714e357dcc277aa1b4573b432823aa501c16a9bbf72da632c726a0d4\"" Sep 9 05:39:41.574763 containerd[1587]: time="2025-09-09T05:39:41.574511887Z" level=info msg="CreateContainer within sandbox \"e778bab5714e357dcc277aa1b4573b432823aa501c16a9bbf72da632c726a0d4\" for container 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 05:39:41.575855 containerd[1587]: time="2025-09-09T05:39:41.575803623Z" level=info msg="Container ef88e0775fef7d642f23474963f896ab11d3752e384bfbc8d098048521bd21a2: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:39:41.584436 containerd[1587]: time="2025-09-09T05:39:41.584143594Z" level=info msg="CreateContainer within sandbox \"08bad4788f543f86af42f380a0fe96d7a1f95c3a30e0a83e5a8673f0450b07f3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"51e45207968ff96e5bc005d14fea87221b380534ab2ceed1e961d2204578e696\"" Sep 9 05:39:41.585725 containerd[1587]: time="2025-09-09T05:39:41.585700207Z" level=info msg="StartContainer for \"51e45207968ff96e5bc005d14fea87221b380534ab2ceed1e961d2204578e696\"" Sep 9 05:39:41.588056 containerd[1587]: time="2025-09-09T05:39:41.588021976Z" level=info msg="connecting to shim 51e45207968ff96e5bc005d14fea87221b380534ab2ceed1e961d2204578e696" address="unix:///run/containerd/s/c2ed28394c1cd81f5299f277a61f8ebdea9b825c25b78e9f77363760dc0bfe1c" protocol=ttrpc version=3 Sep 9 05:39:41.592285 containerd[1587]: time="2025-09-09T05:39:41.592248775Z" level=info msg="CreateContainer within sandbox \"f47010c7e2a7e7a0b04e760051c71c0b20ff45e23a5de38ae8660952b57032ad\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ef88e0775fef7d642f23474963f896ab11d3752e384bfbc8d098048521bd21a2\"" Sep 9 05:39:41.594701 containerd[1587]: time="2025-09-09T05:39:41.593648848Z" level=info msg="StartContainer for \"ef88e0775fef7d642f23474963f896ab11d3752e384bfbc8d098048521bd21a2\"" Sep 9 05:39:41.594701 containerd[1587]: time="2025-09-09T05:39:41.594491739Z" level=info msg="Container a48931617fd427084262da07541cc3a08fed37078456c9bdf56df47f15451a6b: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:39:41.594701 containerd[1587]: time="2025-09-09T05:39:41.594618015Z" level=info msg="connecting to shim ef88e0775fef7d642f23474963f896ab11d3752e384bfbc8d098048521bd21a2" address="unix:///run/containerd/s/ba3c0008c925837a18b16fd15a5e45477eb247af8e7a04d96a5991b1aec17f8d" protocol=ttrpc version=3 Sep 9 05:39:41.600236 containerd[1587]: time="2025-09-09T05:39:41.600204697Z" level=info msg="CreateContainer within sandbox \"e778bab5714e357dcc277aa1b4573b432823aa501c16a9bbf72da632c726a0d4\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a48931617fd427084262da07541cc3a08fed37078456c9bdf56df47f15451a6b\"" Sep 9 05:39:41.600947 containerd[1587]: time="2025-09-09T05:39:41.600924736Z" level=info msg="StartContainer for \"a48931617fd427084262da07541cc3a08fed37078456c9bdf56df47f15451a6b\"" Sep 9 05:39:41.602051 containerd[1587]: time="2025-09-09T05:39:41.602024445Z" level=info msg="connecting to shim a48931617fd427084262da07541cc3a08fed37078456c9bdf56df47f15451a6b" address="unix:///run/containerd/s/9a2528f0512fffd111e0cbe71455205c8ccda2f0153564042313fda22a793644" protocol=ttrpc version=3 Sep 9 05:39:41.630167 systemd[1]: Started cri-containerd-51e45207968ff96e5bc005d14fea87221b380534ab2ceed1e961d2204578e696.scope - libcontainer container 51e45207968ff96e5bc005d14fea87221b380534ab2ceed1e961d2204578e696. Sep 9 05:39:41.641305 systemd[1]: Started cri-containerd-ef88e0775fef7d642f23474963f896ab11d3752e384bfbc8d098048521bd21a2.scope - libcontainer container ef88e0775fef7d642f23474963f896ab11d3752e384bfbc8d098048521bd21a2. 
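Up to this point every kubelet call against https://10.244.98.182:6443 (list/watch, lease, CSR, event posting) has failed with "connection refused"; that only clears once the kube-apiserver container started above is actually serving on 6443. A minimal wait-for-the-port sketch in the same spirit as those retry loops (address and port are taken from this log; the function itself is illustrative):

import socket, time

def wait_for_port(host, port, timeout=60.0, interval=2.0):
    """Poll host:port until a TCP connect succeeds or the deadline passes."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:        # refused, timed out, unreachable, ...
            time.sleep(interval)
    return False

print(wait_for_port("10.244.98.182", 6443))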
Sep 9 05:39:41.653895 systemd[1]: Started cri-containerd-a48931617fd427084262da07541cc3a08fed37078456c9bdf56df47f15451a6b.scope - libcontainer container a48931617fd427084262da07541cc3a08fed37078456c9bdf56df47f15451a6b. Sep 9 05:39:41.722086 containerd[1587]: time="2025-09-09T05:39:41.721955729Z" level=info msg="StartContainer for \"ef88e0775fef7d642f23474963f896ab11d3752e384bfbc8d098048521bd21a2\" returns successfully" Sep 9 05:39:41.754164 containerd[1587]: time="2025-09-09T05:39:41.753724923Z" level=info msg="StartContainer for \"a48931617fd427084262da07541cc3a08fed37078456c9bdf56df47f15451a6b\" returns successfully" Sep 9 05:39:41.774548 kubelet[2519]: E0909 05:39:41.774495 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:41.784746 containerd[1587]: time="2025-09-09T05:39:41.784637492Z" level=info msg="StartContainer for \"51e45207968ff96e5bc005d14fea87221b380534ab2ceed1e961d2204578e696\" returns successfully" Sep 9 05:39:41.785901 kubelet[2519]: E0909 05:39:41.785735 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:41.837932 kubelet[2519]: E0909 05:39:41.837693 2519 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.244.98.182:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.98.182:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 9 05:39:42.312686 kubelet[2519]: I0909 05:39:42.312641 2519 kubelet_node_status.go:75] "Attempting to register node" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:42.794467 kubelet[2519]: E0909 05:39:42.794203 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:42.795537 kubelet[2519]: E0909 05:39:42.795184 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:43.794855 kubelet[2519]: E0909 05:39:43.794685 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:43.794855 kubelet[2519]: E0909 05:39:43.794770 2519 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:43.927521 kubelet[2519]: E0909 05:39:43.927483 2519 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-dlh9b.gb1.brightbox.com\" not found" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.010265 kubelet[2519]: I0909 05:39:44.009954 2519 kubelet_node_status.go:78] "Successfully registered node" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.016973 kubelet[2519]: I0909 05:39:44.016741 2519 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.085929 kubelet[2519]: E0909 05:39:44.085717 2519 kubelet.go:3311] 
"Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.085929 kubelet[2519]: I0909 05:39:44.085749 2519 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.089183 kubelet[2519]: E0909 05:39:44.088070 2519 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-dlh9b.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.089183 kubelet[2519]: I0909 05:39:44.089057 2519 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.091810 kubelet[2519]: E0909 05:39:44.091777 2519 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-dlh9b.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.280684 kubelet[2519]: I0909 05:39:44.280012 2519 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.282586 kubelet[2519]: E0909 05:39:44.282545 2519 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.680907 kubelet[2519]: I0909 05:39:44.680789 2519 apiserver.go:52] "Watching apiserver" Sep 9 05:39:44.720230 kubelet[2519]: I0909 05:39:44.720142 2519 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 05:39:44.795703 kubelet[2519]: I0909 05:39:44.795307 2519 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:44.806010 kubelet[2519]: I0909 05:39:44.805802 2519 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 05:39:46.116898 systemd[1]: Reload requested from client PID 2799 ('systemctl') (unit session-11.scope)... Sep 9 05:39:46.116916 systemd[1]: Reloading... Sep 9 05:39:46.231802 zram_generator::config[2840]: No configuration found. Sep 9 05:39:46.540593 systemd[1]: Reloading finished in 423 ms. Sep 9 05:39:46.578148 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:39:46.589990 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 05:39:46.590297 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:39:46.590373 systemd[1]: kubelet.service: Consumed 1.047s CPU time, 128.9M memory peak. Sep 9 05:39:46.592415 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:39:46.778893 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 05:39:46.792567 (kubelet)[2908]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:39:46.909945 kubelet[2908]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:39:46.909945 kubelet[2908]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 05:39:46.909945 kubelet[2908]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:39:46.910385 kubelet[2908]: I0909 05:39:46.910024 2908 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:39:46.925717 kubelet[2908]: I0909 05:39:46.925085 2908 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 9 05:39:46.925717 kubelet[2908]: I0909 05:39:46.925127 2908 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:39:46.925717 kubelet[2908]: I0909 05:39:46.925417 2908 server.go:956] "Client rotation is on, will bootstrap in background" Sep 9 05:39:46.926932 kubelet[2908]: I0909 05:39:46.926905 2908 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 9 05:39:46.931568 kubelet[2908]: I0909 05:39:46.931238 2908 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:39:46.939625 kubelet[2908]: I0909 05:39:46.939600 2908 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:39:46.945485 kubelet[2908]: I0909 05:39:46.945452 2908 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 05:39:46.945960 kubelet[2908]: I0909 05:39:46.945931 2908 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:39:46.946348 kubelet[2908]: I0909 05:39:46.946048 2908 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-dlh9b.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 05:39:46.946507 kubelet[2908]: I0909 05:39:46.946497 2908 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 05:39:46.946580 kubelet[2908]: I0909 05:39:46.946559 2908 container_manager_linux.go:303] "Creating device plugin manager" Sep 9 05:39:46.946715 kubelet[2908]: I0909 05:39:46.946704 2908 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:39:46.947009 kubelet[2908]: I0909 05:39:46.946996 2908 kubelet.go:480] "Attempting to sync node with API server" Sep 9 05:39:46.947083 kubelet[2908]: I0909 05:39:46.947075 2908 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 05:39:46.947163 kubelet[2908]: I0909 05:39:46.947156 2908 kubelet.go:386] "Adding apiserver pod source" Sep 9 05:39:46.947218 kubelet[2908]: I0909 05:39:46.947212 2908 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 05:39:46.953140 kubelet[2908]: I0909 05:39:46.953108 2908 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 05:39:46.953828 kubelet[2908]: I0909 05:39:46.953811 2908 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 9 05:39:46.959542 kubelet[2908]: I0909 05:39:46.959520 2908 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 05:39:46.959714 kubelet[2908]: I0909 05:39:46.959705 2908 server.go:1289] "Started kubelet" Sep 9 05:39:46.963691 kubelet[2908]: I0909 05:39:46.963655 2908 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 05:39:46.968787 kubelet[2908]: I0909 05:39:46.967953 
2908 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 05:39:46.968787 kubelet[2908]: I0909 05:39:46.968098 2908 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 05:39:46.970216 kubelet[2908]: I0909 05:39:46.970180 2908 server.go:317] "Adding debug handlers to kubelet server" Sep 9 05:39:46.973745 kubelet[2908]: I0909 05:39:46.973692 2908 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 05:39:46.973920 kubelet[2908]: I0909 05:39:46.973896 2908 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 05:39:46.974719 kubelet[2908]: I0909 05:39:46.974486 2908 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 05:39:46.981303 kubelet[2908]: I0909 05:39:46.981279 2908 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 05:39:46.981722 kubelet[2908]: I0909 05:39:46.981708 2908 reconciler.go:26] "Reconciler: start to sync state" Sep 9 05:39:46.988699 kubelet[2908]: I0909 05:39:46.987494 2908 factory.go:223] Registration of the systemd container factory successfully Sep 9 05:39:46.988977 kubelet[2908]: I0909 05:39:46.988955 2908 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 05:39:46.992813 kubelet[2908]: I0909 05:39:46.992788 2908 factory.go:223] Registration of the containerd container factory successfully Sep 9 05:39:46.995322 kubelet[2908]: I0909 05:39:46.995284 2908 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 9 05:39:46.996847 kubelet[2908]: I0909 05:39:46.996771 2908 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 9 05:39:46.996847 kubelet[2908]: I0909 05:39:46.996813 2908 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 9 05:39:46.996847 kubelet[2908]: I0909 05:39:46.996842 2908 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
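The deprecation warnings at kubelet start-up (--container-runtime-endpoint, --volume-plugin-dir) point at the kubelet config file, and the "Creating Container Manager object based on Node Config" dump above already shows the effective settings: systemd cgroup driver, the default hard-eviction thresholds, and /etc/kubernetes/manifests as the static pod path. A minimal sketch of the equivalent kubelet.config.k8s.io/v1beta1 file follows; the containerd socket path is an assumption (the standard one), while the volume-plugin directory is the /opt path this node probes later for FlexVolume drivers. Note that --pod-infra-container-image has no config-file equivalent: per the warning it is simply being removed, with the sandbox image reported by the CRI runtime instead.

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    staticPodPath: /etc/kubernetes/manifests
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock   # assumed; standard containerd socket
    volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec
    evictionHard:                       # mirrors the HardEvictionThresholds in the NodeConfig dump
      memory.available: "100Mi"
      nodefs.available: "10%"
      nodefs.inodesFree: "5%"
      imagefs.available: "15%"
      imagefs.inodesFree: "5%"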
Sep 9 05:39:46.996990 kubelet[2908]: I0909 05:39:46.996861 2908 kubelet.go:2436] "Starting kubelet main sync loop" Sep 9 05:39:46.996990 kubelet[2908]: E0909 05:39:46.996911 2908 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:39:47.074671 kubelet[2908]: I0909 05:39:47.074624 2908 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 05:39:47.074671 kubelet[2908]: I0909 05:39:47.074644 2908 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 05:39:47.074671 kubelet[2908]: I0909 05:39:47.074680 2908 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:39:47.074886 kubelet[2908]: I0909 05:39:47.074847 2908 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 05:39:47.074886 kubelet[2908]: I0909 05:39:47.074857 2908 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 05:39:47.074886 kubelet[2908]: I0909 05:39:47.074873 2908 policy_none.go:49] "None policy: Start" Sep 9 05:39:47.074964 kubelet[2908]: I0909 05:39:47.074891 2908 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 05:39:47.074964 kubelet[2908]: I0909 05:39:47.074905 2908 state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:39:47.075023 kubelet[2908]: I0909 05:39:47.075012 2908 state_mem.go:75] "Updated machine memory state" Sep 9 05:39:47.085769 kubelet[2908]: E0909 05:39:47.085023 2908 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 9 05:39:47.085769 kubelet[2908]: I0909 05:39:47.085639 2908 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:39:47.086855 kubelet[2908]: I0909 05:39:47.085655 2908 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:39:47.092919 kubelet[2908]: I0909 05:39:47.091569 2908 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:39:47.092919 kubelet[2908]: E0909 05:39:47.092419 2908 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 05:39:47.100927 kubelet[2908]: I0909 05:39:47.100037 2908 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.107939 kubelet[2908]: I0909 05:39:47.107386 2908 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.109374 kubelet[2908]: I0909 05:39:47.108315 2908 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.121785 kubelet[2908]: I0909 05:39:47.120675 2908 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 05:39:47.121785 kubelet[2908]: E0909 05:39:47.120750 2908 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-dlh9b.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.125893 kubelet[2908]: I0909 05:39:47.125452 2908 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 05:39:47.125893 kubelet[2908]: I0909 05:39:47.125509 2908 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 05:39:47.214623 kubelet[2908]: I0909 05:39:47.214198 2908 kubelet_node_status.go:75] "Attempting to register node" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.226149 kubelet[2908]: I0909 05:39:47.225533 2908 kubelet_node_status.go:124] "Node was previously registered" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.226149 kubelet[2908]: I0909 05:39:47.225745 2908 kubelet_node_status.go:78] "Successfully registered node" node="srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.282812 kubelet[2908]: I0909 05:39:47.282496 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-ca-certs\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.282812 kubelet[2908]: I0909 05:39:47.282551 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-k8s-certs\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.282812 kubelet[2908]: I0909 05:39:47.282572 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-kubeconfig\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.282812 kubelet[2908]: I0909 05:39:47.282592 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.282812 kubelet[2908]: I0909 05:39:47.282613 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/57a85f26c4568eb74535e295a97b7549-ca-certs\") pod \"kube-apiserver-srv-dlh9b.gb1.brightbox.com\" (UID: \"57a85f26c4568eb74535e295a97b7549\") " pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.283079 kubelet[2908]: I0909 05:39:47.282628 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/57a85f26c4568eb74535e295a97b7549-k8s-certs\") pod \"kube-apiserver-srv-dlh9b.gb1.brightbox.com\" (UID: \"57a85f26c4568eb74535e295a97b7549\") " pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.283079 kubelet[2908]: I0909 05:39:47.282676 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/57a85f26c4568eb74535e295a97b7549-usr-share-ca-certificates\") pod \"kube-apiserver-srv-dlh9b.gb1.brightbox.com\" (UID: \"57a85f26c4568eb74535e295a97b7549\") " pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.283079 kubelet[2908]: I0909 05:39:47.282693 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/93b13c2d8116cafd4c821e3765cb43eb-flexvolume-dir\") pod \"kube-controller-manager-srv-dlh9b.gb1.brightbox.com\" (UID: \"93b13c2d8116cafd4c821e3765cb43eb\") " pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.283079 kubelet[2908]: I0909 05:39:47.282709 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e280dfff4430c8a4ee56085d9ee6c011-kubeconfig\") pod \"kube-scheduler-srv-dlh9b.gb1.brightbox.com\" (UID: \"e280dfff4430c8a4ee56085d9ee6c011\") " pod="kube-system/kube-scheduler-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:47.949962 kubelet[2908]: I0909 05:39:47.949908 2908 apiserver.go:52] "Watching apiserver" Sep 9 05:39:47.982222 kubelet[2908]: I0909 05:39:47.982162 2908 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 05:39:48.049163 kubelet[2908]: I0909 05:39:48.048200 2908 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:48.058570 kubelet[2908]: I0909 05:39:48.057835 2908 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 05:39:48.058570 kubelet[2908]: E0909 05:39:48.057907 2908 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-dlh9b.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" Sep 9 05:39:48.088681 kubelet[2908]: I0909 05:39:48.087841 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-dlh9b.gb1.brightbox.com" podStartSLOduration=4.087814639 
podStartE2EDuration="4.087814639s" podCreationTimestamp="2025-09-09 05:39:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:39:48.077011466 +0000 UTC m=+1.261115742" watchObservedRunningTime="2025-09-09 05:39:48.087814639 +0000 UTC m=+1.271918890" Sep 9 05:39:48.099651 kubelet[2908]: I0909 05:39:48.099171 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-dlh9b.gb1.brightbox.com" podStartSLOduration=1.099152864 podStartE2EDuration="1.099152864s" podCreationTimestamp="2025-09-09 05:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:39:48.089143145 +0000 UTC m=+1.273247417" watchObservedRunningTime="2025-09-09 05:39:48.099152864 +0000 UTC m=+1.283257115" Sep 9 05:39:48.099651 kubelet[2908]: I0909 05:39:48.099287 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-dlh9b.gb1.brightbox.com" podStartSLOduration=1.099281438 podStartE2EDuration="1.099281438s" podCreationTimestamp="2025-09-09 05:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:39:48.098957241 +0000 UTC m=+1.283061515" watchObservedRunningTime="2025-09-09 05:39:48.099281438 +0000 UTC m=+1.283385713" Sep 9 05:39:51.870339 kubelet[2908]: I0909 05:39:51.870228 2908 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 05:39:51.871640 containerd[1587]: time="2025-09-09T05:39:51.871574730Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 05:39:51.872365 kubelet[2908]: I0909 05:39:51.871918 2908 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 05:39:52.857383 systemd[1]: Created slice kubepods-besteffort-pod83ec95f7_d02f_4569_9d94_41bc503bcf9b.slice - libcontainer container kubepods-besteffort-pod83ec95f7_d02f_4569_9d94_41bc503bcf9b.slice. 
Sep 9 05:39:52.918955 kubelet[2908]: I0909 05:39:52.918793 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/83ec95f7-d02f-4569-9d94-41bc503bcf9b-kube-proxy\") pod \"kube-proxy-lgj5l\" (UID: \"83ec95f7-d02f-4569-9d94-41bc503bcf9b\") " pod="kube-system/kube-proxy-lgj5l" Sep 9 05:39:52.918955 kubelet[2908]: I0909 05:39:52.918837 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/83ec95f7-d02f-4569-9d94-41bc503bcf9b-xtables-lock\") pod \"kube-proxy-lgj5l\" (UID: \"83ec95f7-d02f-4569-9d94-41bc503bcf9b\") " pod="kube-system/kube-proxy-lgj5l" Sep 9 05:39:52.918955 kubelet[2908]: I0909 05:39:52.918854 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83ec95f7-d02f-4569-9d94-41bc503bcf9b-lib-modules\") pod \"kube-proxy-lgj5l\" (UID: \"83ec95f7-d02f-4569-9d94-41bc503bcf9b\") " pod="kube-system/kube-proxy-lgj5l" Sep 9 05:39:52.918955 kubelet[2908]: I0909 05:39:52.918873 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9hk\" (UniqueName: \"kubernetes.io/projected/83ec95f7-d02f-4569-9d94-41bc503bcf9b-kube-api-access-tw9hk\") pod \"kube-proxy-lgj5l\" (UID: \"83ec95f7-d02f-4569-9d94-41bc503bcf9b\") " pod="kube-system/kube-proxy-lgj5l" Sep 9 05:39:53.098541 systemd[1]: Created slice kubepods-besteffort-pod933a6288_05c4_4dd1_b25b_e958f712bf82.slice - libcontainer container kubepods-besteffort-pod933a6288_05c4_4dd1_b25b_e958f712bf82.slice. Sep 9 05:39:53.121107 kubelet[2908]: I0909 05:39:53.120922 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7nlv\" (UniqueName: \"kubernetes.io/projected/933a6288-05c4-4dd1-b25b-e958f712bf82-kube-api-access-w7nlv\") pod \"tigera-operator-755d956888-gtxlr\" (UID: \"933a6288-05c4-4dd1-b25b-e958f712bf82\") " pod="tigera-operator/tigera-operator-755d956888-gtxlr" Sep 9 05:39:53.121107 kubelet[2908]: I0909 05:39:53.120991 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/933a6288-05c4-4dd1-b25b-e958f712bf82-var-lib-calico\") pod \"tigera-operator-755d956888-gtxlr\" (UID: \"933a6288-05c4-4dd1-b25b-e958f712bf82\") " pod="tigera-operator/tigera-operator-755d956888-gtxlr" Sep 9 05:39:53.168455 containerd[1587]: time="2025-09-09T05:39:53.168363540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lgj5l,Uid:83ec95f7-d02f-4569-9d94-41bc503bcf9b,Namespace:kube-system,Attempt:0,}" Sep 9 05:39:53.208919 containerd[1587]: time="2025-09-09T05:39:53.208828724Z" level=info msg="connecting to shim b0d242bedf598309b7f9379cda2e55df6c8912ac77f45b7ca0b2b7913ed0e7e5" address="unix:///run/containerd/s/ee91c2701b95b6bd3a90342f5a293d3074da0b261bd3882c39630c81910cc177" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:39:53.248896 systemd[1]: Started cri-containerd-b0d242bedf598309b7f9379cda2e55df6c8912ac77f45b7ca0b2b7913ed0e7e5.scope - libcontainer container b0d242bedf598309b7f9379cda2e55df6c8912ac77f45b7ca0b2b7913ed0e7e5. 
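The four VerifyControllerAttachedVolume entries for kube-proxy-lgj5l map to one ConfigMap volume, two hostPath volumes and the kubelet-injected service-account token. A sketch of the matching volumes stanza from a kubeadm-style kube-proxy DaemonSet is below; the volume names are taken from the log, while the ConfigMap name and hostPath paths are the usual kubeadm defaults and are assumptions here.

    volumes:
      - name: kube-proxy
        configMap:
          name: kube-proxy          # assumed: kubeadm's default ConfigMap name
      - name: xtables-lock
        hostPath:
          path: /run/xtables.lock   # assumed default path
          type: FileOrCreate
      - name: lib-modules
        hostPath:
          path: /lib/modules        # assumed default path
    # kube-api-access-tw9hk is the projected token volume the kubelet adds
    # automatically; it does not appear in the DaemonSet manifest itself.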
Sep 9 05:39:53.287481 containerd[1587]: time="2025-09-09T05:39:53.287444355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lgj5l,Uid:83ec95f7-d02f-4569-9d94-41bc503bcf9b,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0d242bedf598309b7f9379cda2e55df6c8912ac77f45b7ca0b2b7913ed0e7e5\"" Sep 9 05:39:53.295065 containerd[1587]: time="2025-09-09T05:39:53.295028797Z" level=info msg="CreateContainer within sandbox \"b0d242bedf598309b7f9379cda2e55df6c8912ac77f45b7ca0b2b7913ed0e7e5\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 05:39:53.311696 containerd[1587]: time="2025-09-09T05:39:53.309805667Z" level=info msg="Container bc8d51f422c973a65c1b17ea3bafb2175c61a6cf9d835f27ba0a244d059b0ea9: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:39:53.321524 containerd[1587]: time="2025-09-09T05:39:53.321464282Z" level=info msg="CreateContainer within sandbox \"b0d242bedf598309b7f9379cda2e55df6c8912ac77f45b7ca0b2b7913ed0e7e5\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"bc8d51f422c973a65c1b17ea3bafb2175c61a6cf9d835f27ba0a244d059b0ea9\"" Sep 9 05:39:53.323558 containerd[1587]: time="2025-09-09T05:39:53.322466340Z" level=info msg="StartContainer for \"bc8d51f422c973a65c1b17ea3bafb2175c61a6cf9d835f27ba0a244d059b0ea9\"" Sep 9 05:39:53.325553 containerd[1587]: time="2025-09-09T05:39:53.325521255Z" level=info msg="connecting to shim bc8d51f422c973a65c1b17ea3bafb2175c61a6cf9d835f27ba0a244d059b0ea9" address="unix:///run/containerd/s/ee91c2701b95b6bd3a90342f5a293d3074da0b261bd3882c39630c81910cc177" protocol=ttrpc version=3 Sep 9 05:39:53.350900 systemd[1]: Started cri-containerd-bc8d51f422c973a65c1b17ea3bafb2175c61a6cf9d835f27ba0a244d059b0ea9.scope - libcontainer container bc8d51f422c973a65c1b17ea3bafb2175c61a6cf9d835f27ba0a244d059b0ea9. Sep 9 05:39:53.394735 containerd[1587]: time="2025-09-09T05:39:53.394623195Z" level=info msg="StartContainer for \"bc8d51f422c973a65c1b17ea3bafb2175c61a6cf9d835f27ba0a244d059b0ea9\" returns successfully" Sep 9 05:39:53.404912 containerd[1587]: time="2025-09-09T05:39:53.404772178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-gtxlr,Uid:933a6288-05c4-4dd1-b25b-e958f712bf82,Namespace:tigera-operator,Attempt:0,}" Sep 9 05:39:53.428460 containerd[1587]: time="2025-09-09T05:39:53.428398467Z" level=info msg="connecting to shim 0709574632ea6f5d2007442e62d7c427c7962a074f5eb8047e04ce55858d4615" address="unix:///run/containerd/s/42f3093b985c5592cb457298b2bfd3d03b7202dd7bdb6ece1bca3491aecaec9e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:39:53.471304 systemd[1]: Started cri-containerd-0709574632ea6f5d2007442e62d7c427c7962a074f5eb8047e04ce55858d4615.scope - libcontainer container 0709574632ea6f5d2007442e62d7c427c7962a074f5eb8047e04ce55858d4615. Sep 9 05:39:53.551095 containerd[1587]: time="2025-09-09T05:39:53.551054251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-gtxlr,Uid:933a6288-05c4-4dd1-b25b-e958f712bf82,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0709574632ea6f5d2007442e62d7c427c7962a074f5eb8047e04ce55858d4615\"" Sep 9 05:39:53.554431 containerd[1587]: time="2025-09-09T05:39:53.554396961Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 05:39:54.042475 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount838779722.mount: Deactivated successfully. 
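With the kube-proxy container running, the kubelet turns to the tigera-operator pod: containerd sets up its sandbox and the operator image quay.io/tigera/operator:v1.38.6 is pulled next. A sketch of the corresponding container spec in the operator Deployment follows; only the image tag and the var-lib-calico volume name come from this log, and the mount path and hostPath follow the operator's published manifest, so they may differ in this cluster.

    containers:
      - name: tigera-operator
        image: quay.io/tigera/operator:v1.38.6
        volumeMounts:
          - name: var-lib-calico
            mountPath: /var/lib/calico   # assumed, per the upstream manifest
            readOnly: true
    volumes:
      - name: var-lib-calico
        hostPath:
          path: /var/lib/calico          # assumed, per the upstream manifest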
Sep 9 05:39:54.090473 kubelet[2908]: I0909 05:39:54.090330 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-lgj5l" podStartSLOduration=2.090308353 podStartE2EDuration="2.090308353s" podCreationTimestamp="2025-09-09 05:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:39:54.089746835 +0000 UTC m=+7.273851120" watchObservedRunningTime="2025-09-09 05:39:54.090308353 +0000 UTC m=+7.274412605" Sep 9 05:39:55.429638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1662643558.mount: Deactivated successfully. Sep 9 05:39:56.065045 containerd[1587]: time="2025-09-09T05:39:56.064908412Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:56.066685 containerd[1587]: time="2025-09-09T05:39:56.066623046Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 9 05:39:56.067548 containerd[1587]: time="2025-09-09T05:39:56.067081023Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:56.069607 containerd[1587]: time="2025-09-09T05:39:56.069560557Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:39:56.071069 containerd[1587]: time="2025-09-09T05:39:56.071038931Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.516607328s" Sep 9 05:39:56.071278 containerd[1587]: time="2025-09-09T05:39:56.071181856Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 9 05:39:56.075679 containerd[1587]: time="2025-09-09T05:39:56.075617144Z" level=info msg="CreateContainer within sandbox \"0709574632ea6f5d2007442e62d7c427c7962a074f5eb8047e04ce55858d4615\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 05:39:56.084798 containerd[1587]: time="2025-09-09T05:39:56.084204325Z" level=info msg="Container 64b869f2123377db12a9c3765144b7c327a4d8417a7f9dbb0a70981375ca707a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:39:56.102765 containerd[1587]: time="2025-09-09T05:39:56.102697182Z" level=info msg="CreateContainer within sandbox \"0709574632ea6f5d2007442e62d7c427c7962a074f5eb8047e04ce55858d4615\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"64b869f2123377db12a9c3765144b7c327a4d8417a7f9dbb0a70981375ca707a\"" Sep 9 05:39:56.104856 containerd[1587]: time="2025-09-09T05:39:56.104809861Z" level=info msg="StartContainer for \"64b869f2123377db12a9c3765144b7c327a4d8417a7f9dbb0a70981375ca707a\"" Sep 9 05:39:56.106902 containerd[1587]: time="2025-09-09T05:39:56.106868903Z" level=info msg="connecting to shim 64b869f2123377db12a9c3765144b7c327a4d8417a7f9dbb0a70981375ca707a" address="unix:///run/containerd/s/42f3093b985c5592cb457298b2bfd3d03b7202dd7bdb6ece1bca3491aecaec9e" 
protocol=ttrpc version=3 Sep 9 05:39:56.130919 systemd[1]: Started cri-containerd-64b869f2123377db12a9c3765144b7c327a4d8417a7f9dbb0a70981375ca707a.scope - libcontainer container 64b869f2123377db12a9c3765144b7c327a4d8417a7f9dbb0a70981375ca707a. Sep 9 05:39:56.165545 containerd[1587]: time="2025-09-09T05:39:56.165454517Z" level=info msg="StartContainer for \"64b869f2123377db12a9c3765144b7c327a4d8417a7f9dbb0a70981375ca707a\" returns successfully" Sep 9 05:40:02.961519 sudo[1917]: pam_unix(sudo:session): session closed for user root Sep 9 05:40:03.105084 sshd[1916]: Connection closed by 139.178.89.65 port 57272 Sep 9 05:40:03.110555 sshd-session[1913]: pam_unix(sshd:session): session closed for user core Sep 9 05:40:03.117555 systemd[1]: sshd@8-10.244.98.182:22-139.178.89.65:57272.service: Deactivated successfully. Sep 9 05:40:03.117712 systemd-logind[1565]: Session 11 logged out. Waiting for processes to exit. Sep 9 05:40:03.120589 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 05:40:03.123825 systemd[1]: session-11.scope: Consumed 6.097s CPU time, 152.5M memory peak. Sep 9 05:40:03.127785 systemd-logind[1565]: Removed session 11. Sep 9 05:40:07.409173 kubelet[2908]: I0909 05:40:07.408460 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-gtxlr" podStartSLOduration=11.888661504 podStartE2EDuration="14.408400271s" podCreationTimestamp="2025-09-09 05:39:53 +0000 UTC" firstStartedPulling="2025-09-09 05:39:53.552846044 +0000 UTC m=+6.736950296" lastFinishedPulling="2025-09-09 05:39:56.072584807 +0000 UTC m=+9.256689063" observedRunningTime="2025-09-09 05:39:57.104808314 +0000 UTC m=+10.288912589" watchObservedRunningTime="2025-09-09 05:40:07.408400271 +0000 UTC m=+20.592504549" Sep 9 05:40:07.422072 kubelet[2908]: I0909 05:40:07.421882 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33723457-39be-4fe2-8db7-9172c95f1c9c-tigera-ca-bundle\") pod \"calico-typha-5fd6b5c47c-t9958\" (UID: \"33723457-39be-4fe2-8db7-9172c95f1c9c\") " pod="calico-system/calico-typha-5fd6b5c47c-t9958" Sep 9 05:40:07.422072 kubelet[2908]: I0909 05:40:07.421921 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/33723457-39be-4fe2-8db7-9172c95f1c9c-typha-certs\") pod \"calico-typha-5fd6b5c47c-t9958\" (UID: \"33723457-39be-4fe2-8db7-9172c95f1c9c\") " pod="calico-system/calico-typha-5fd6b5c47c-t9958" Sep 9 05:40:07.422072 kubelet[2908]: I0909 05:40:07.421941 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-477n8\" (UniqueName: \"kubernetes.io/projected/33723457-39be-4fe2-8db7-9172c95f1c9c-kube-api-access-477n8\") pod \"calico-typha-5fd6b5c47c-t9958\" (UID: \"33723457-39be-4fe2-8db7-9172c95f1c9c\") " pod="calico-system/calico-typha-5fd6b5c47c-t9958" Sep 9 05:40:07.431602 systemd[1]: Created slice kubepods-besteffort-pod33723457_39be_4fe2_8db7_9172c95f1c9c.slice - libcontainer container kubepods-besteffort-pod33723457_39be_4fe2_8db7_9172c95f1c9c.slice. 
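The pod_startup_latency_tracker entry for tigera-operator shows how podStartSLOduration relates to podStartE2EDuration: the SLO figure excludes image-pull time. Working it out from the timestamps printed in the entry itself:

    podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling)
                        = 14.408400271 s - (05:39:56.072584807 - 05:39:53.552846044)
                        = 14.408400271 s - 2.519738763 s
                        ≈ 11.888661504 s

For the static pods and kube-proxy earlier in the log nothing was pulled (both pull timestamps are the zero time), which is why their two durations come out identical.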
Sep 9 05:40:07.743639 containerd[1587]: time="2025-09-09T05:40:07.743012565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5fd6b5c47c-t9958,Uid:33723457-39be-4fe2-8db7-9172c95f1c9c,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:07.786785 containerd[1587]: time="2025-09-09T05:40:07.786283658Z" level=info msg="connecting to shim ed3920dd48fd719a6383fb0a70b282dcd24030101a5c1f64a3abdf8b4ccf1467" address="unix:///run/containerd/s/a3c4a7afa7e19625874791b9ffb50ee2732bd5e7732b70ead6501751616514b6" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:07.833364 systemd[1]: Started cri-containerd-ed3920dd48fd719a6383fb0a70b282dcd24030101a5c1f64a3abdf8b4ccf1467.scope - libcontainer container ed3920dd48fd719a6383fb0a70b282dcd24030101a5c1f64a3abdf8b4ccf1467. Sep 9 05:40:07.888236 systemd[1]: Created slice kubepods-besteffort-podca54b2ac_d9d6_49d9_9f04_1ba00135f4f9.slice - libcontainer container kubepods-besteffort-podca54b2ac_d9d6_49d9_9f04_1ba00135f4f9.slice. Sep 9 05:40:07.926852 kubelet[2908]: I0909 05:40:07.926715 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-xtables-lock\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.927384 kubelet[2908]: I0909 05:40:07.926943 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-cni-net-dir\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.927384 kubelet[2908]: I0909 05:40:07.926976 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-flexvol-driver-host\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.927384 kubelet[2908]: I0909 05:40:07.927297 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-policysync\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.928232 kubelet[2908]: I0909 05:40:07.927458 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-tigera-ca-bundle\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.928232 kubelet[2908]: I0909 05:40:07.927610 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-var-run-calico\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.928232 kubelet[2908]: I0909 05:40:07.927642 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-lib-modules\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.928232 kubelet[2908]: I0909 05:40:07.927828 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdj2f\" (UniqueName: \"kubernetes.io/projected/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-kube-api-access-xdj2f\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.928735 kubelet[2908]: I0909 05:40:07.928512 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-var-lib-calico\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.929034 kubelet[2908]: I0909 05:40:07.928870 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-cni-bin-dir\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.929034 kubelet[2908]: I0909 05:40:07.928902 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-node-certs\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.929034 kubelet[2908]: I0909 05:40:07.928997 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9-cni-log-dir\") pod \"calico-node-k8kln\" (UID: \"ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9\") " pod="calico-system/calico-node-k8kln" Sep 9 05:40:07.984372 containerd[1587]: time="2025-09-09T05:40:07.984247861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5fd6b5c47c-t9958,Uid:33723457-39be-4fe2-8db7-9172c95f1c9c,Namespace:calico-system,Attempt:0,} returns sandbox id \"ed3920dd48fd719a6383fb0a70b282dcd24030101a5c1f64a3abdf8b4ccf1467\"" Sep 9 05:40:07.987802 containerd[1587]: time="2025-09-09T05:40:07.987739786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 05:40:08.045216 kubelet[2908]: E0909 05:40:08.043822 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.045651 kubelet[2908]: W0909 05:40:08.045412 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.045651 kubelet[2908]: E0909 05:40:08.045474 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.058970 kubelet[2908]: E0909 05:40:08.058945 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.058970 kubelet[2908]: W0909 05:40:08.058963 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.059147 kubelet[2908]: E0909 05:40:08.058983 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.075242 kubelet[2908]: E0909 05:40:08.075194 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czgjp" podUID="27cc7ce2-3e74-49cc-83d6-1559ba07fa7b" Sep 9 05:40:08.125791 kubelet[2908]: E0909 05:40:08.125719 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.125791 kubelet[2908]: W0909 05:40:08.125789 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.126014 kubelet[2908]: E0909 05:40:08.125828 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.126369 kubelet[2908]: E0909 05:40:08.126332 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.126487 kubelet[2908]: W0909 05:40:08.126362 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.126487 kubelet[2908]: E0909 05:40:08.126407 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.126806 kubelet[2908]: E0909 05:40:08.126778 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.126861 kubelet[2908]: W0909 05:40:08.126805 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.126861 kubelet[2908]: E0909 05:40:08.126823 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.127375 kubelet[2908]: E0909 05:40:08.127346 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.127424 kubelet[2908]: W0909 05:40:08.127373 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.127461 kubelet[2908]: E0909 05:40:08.127398 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.127853 kubelet[2908]: E0909 05:40:08.127826 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.127924 kubelet[2908]: W0909 05:40:08.127852 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.127924 kubelet[2908]: E0909 05:40:08.127894 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.128328 kubelet[2908]: E0909 05:40:08.128273 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.128328 kubelet[2908]: W0909 05:40:08.128303 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.128456 kubelet[2908]: E0909 05:40:08.128380 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.128828 kubelet[2908]: E0909 05:40:08.128807 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.128879 kubelet[2908]: W0909 05:40:08.128831 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.128879 kubelet[2908]: E0909 05:40:08.128850 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.129251 kubelet[2908]: E0909 05:40:08.129212 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.129318 kubelet[2908]: W0909 05:40:08.129260 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.129318 kubelet[2908]: E0909 05:40:08.129279 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.129908 kubelet[2908]: E0909 05:40:08.129871 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.129908 kubelet[2908]: W0909 05:40:08.129895 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.130201 kubelet[2908]: E0909 05:40:08.129917 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.130284 kubelet[2908]: E0909 05:40:08.130242 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.130284 kubelet[2908]: W0909 05:40:08.130252 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.130284 kubelet[2908]: E0909 05:40:08.130264 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.130631 kubelet[2908]: E0909 05:40:08.130524 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.130631 kubelet[2908]: W0909 05:40:08.130537 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.130631 kubelet[2908]: E0909 05:40:08.130551 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.131123 kubelet[2908]: E0909 05:40:08.130767 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.131123 kubelet[2908]: W0909 05:40:08.130776 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.131123 kubelet[2908]: E0909 05:40:08.130785 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.131123 kubelet[2908]: E0909 05:40:08.131039 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.131123 kubelet[2908]: W0909 05:40:08.131056 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.131123 kubelet[2908]: E0909 05:40:08.131065 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.131535 kubelet[2908]: E0909 05:40:08.131228 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.131535 kubelet[2908]: W0909 05:40:08.131240 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.131535 kubelet[2908]: E0909 05:40:08.131251 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.131535 kubelet[2908]: E0909 05:40:08.131402 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.131535 kubelet[2908]: W0909 05:40:08.131409 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.131535 kubelet[2908]: E0909 05:40:08.131439 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.132243 kubelet[2908]: E0909 05:40:08.131597 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.132243 kubelet[2908]: W0909 05:40:08.131606 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.132243 kubelet[2908]: E0909 05:40:08.131614 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.132243 kubelet[2908]: E0909 05:40:08.131938 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.132243 kubelet[2908]: W0909 05:40:08.131948 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.132243 kubelet[2908]: E0909 05:40:08.131962 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.132243 kubelet[2908]: E0909 05:40:08.132144 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.132243 kubelet[2908]: W0909 05:40:08.132152 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.132243 kubelet[2908]: E0909 05:40:08.132162 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.132600 kubelet[2908]: E0909 05:40:08.132308 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.132600 kubelet[2908]: W0909 05:40:08.132314 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.132600 kubelet[2908]: E0909 05:40:08.132322 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.132600 kubelet[2908]: E0909 05:40:08.132450 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.132600 kubelet[2908]: W0909 05:40:08.132457 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.132600 kubelet[2908]: E0909 05:40:08.132465 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.132887 kubelet[2908]: E0909 05:40:08.132721 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.132887 kubelet[2908]: W0909 05:40:08.132728 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.132887 kubelet[2908]: E0909 05:40:08.132737 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.132887 kubelet[2908]: I0909 05:40:08.132777 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27cc7ce2-3e74-49cc-83d6-1559ba07fa7b-kubelet-dir\") pod \"csi-node-driver-czgjp\" (UID: \"27cc7ce2-3e74-49cc-83d6-1559ba07fa7b\") " pod="calico-system/csi-node-driver-czgjp" Sep 9 05:40:08.133066 kubelet[2908]: E0909 05:40:08.132931 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.133066 kubelet[2908]: W0909 05:40:08.132939 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.133066 kubelet[2908]: E0909 05:40:08.132948 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.133066 kubelet[2908]: I0909 05:40:08.132973 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/27cc7ce2-3e74-49cc-83d6-1559ba07fa7b-socket-dir\") pod \"csi-node-driver-czgjp\" (UID: \"27cc7ce2-3e74-49cc-83d6-1559ba07fa7b\") " pod="calico-system/csi-node-driver-czgjp" Sep 9 05:40:08.133703 kubelet[2908]: E0909 05:40:08.133161 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.133703 kubelet[2908]: W0909 05:40:08.133169 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.133703 kubelet[2908]: E0909 05:40:08.133179 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.133703 kubelet[2908]: I0909 05:40:08.133199 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnt7m\" (UniqueName: \"kubernetes.io/projected/27cc7ce2-3e74-49cc-83d6-1559ba07fa7b-kube-api-access-gnt7m\") pod \"csi-node-driver-czgjp\" (UID: \"27cc7ce2-3e74-49cc-83d6-1559ba07fa7b\") " pod="calico-system/csi-node-driver-czgjp" Sep 9 05:40:08.133703 kubelet[2908]: E0909 05:40:08.133480 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.133703 kubelet[2908]: W0909 05:40:08.133498 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.133703 kubelet[2908]: E0909 05:40:08.133515 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.134206 kubelet[2908]: E0909 05:40:08.134031 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.134206 kubelet[2908]: W0909 05:40:08.134057 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.134206 kubelet[2908]: E0909 05:40:08.134072 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.134404 kubelet[2908]: E0909 05:40:08.134392 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.134474 kubelet[2908]: W0909 05:40:08.134463 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.134631 kubelet[2908]: E0909 05:40:08.134527 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.134771 kubelet[2908]: E0909 05:40:08.134759 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.134829 kubelet[2908]: W0909 05:40:08.134819 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.134898 kubelet[2908]: E0909 05:40:08.134888 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.135264 kubelet[2908]: E0909 05:40:08.135132 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.135264 kubelet[2908]: W0909 05:40:08.135147 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.135264 kubelet[2908]: E0909 05:40:08.135159 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.135264 kubelet[2908]: I0909 05:40:08.135197 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/27cc7ce2-3e74-49cc-83d6-1559ba07fa7b-registration-dir\") pod \"csi-node-driver-czgjp\" (UID: \"27cc7ce2-3e74-49cc-83d6-1559ba07fa7b\") " pod="calico-system/csi-node-driver-czgjp" Sep 9 05:40:08.135519 kubelet[2908]: E0909 05:40:08.135508 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.135585 kubelet[2908]: W0909 05:40:08.135575 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.135649 kubelet[2908]: E0909 05:40:08.135635 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.136060 kubelet[2908]: E0909 05:40:08.135917 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.136060 kubelet[2908]: W0909 05:40:08.135930 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.136060 kubelet[2908]: E0909 05:40:08.135942 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.136261 kubelet[2908]: E0909 05:40:08.136249 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.136333 kubelet[2908]: W0909 05:40:08.136322 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.136479 kubelet[2908]: E0909 05:40:08.136389 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.136479 kubelet[2908]: I0909 05:40:08.136427 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/27cc7ce2-3e74-49cc-83d6-1559ba07fa7b-varrun\") pod \"csi-node-driver-czgjp\" (UID: \"27cc7ce2-3e74-49cc-83d6-1559ba07fa7b\") " pod="calico-system/csi-node-driver-czgjp" Sep 9 05:40:08.136718 kubelet[2908]: E0909 05:40:08.136700 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.136718 kubelet[2908]: W0909 05:40:08.136715 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.136815 kubelet[2908]: E0909 05:40:08.136727 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.136882 kubelet[2908]: E0909 05:40:08.136869 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.136882 kubelet[2908]: W0909 05:40:08.136880 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.136979 kubelet[2908]: E0909 05:40:08.136888 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.137086 kubelet[2908]: E0909 05:40:08.137073 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.137086 kubelet[2908]: W0909 05:40:08.137083 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.137161 kubelet[2908]: E0909 05:40:08.137092 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.137240 kubelet[2908]: E0909 05:40:08.137229 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.137348 kubelet[2908]: W0909 05:40:08.137241 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.137348 kubelet[2908]: E0909 05:40:08.137249 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.198194 containerd[1587]: time="2025-09-09T05:40:08.198157317Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-k8kln,Uid:ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:08.220233 containerd[1587]: time="2025-09-09T05:40:08.219872449Z" level=info msg="connecting to shim 7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d" address="unix:///run/containerd/s/5d6565286909f2e243dc909bf9e53a94f45be71fcbfce7b47cb4ab1446a6e0b4" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:08.238174 kubelet[2908]: E0909 05:40:08.238142 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.238325 kubelet[2908]: W0909 05:40:08.238311 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.238435 kubelet[2908]: E0909 05:40:08.238422 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.239329 kubelet[2908]: E0909 05:40:08.239309 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.239780 kubelet[2908]: W0909 05:40:08.239499 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.239780 kubelet[2908]: E0909 05:40:08.239763 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.241160 kubelet[2908]: E0909 05:40:08.241096 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.241160 kubelet[2908]: W0909 05:40:08.241115 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.241160 kubelet[2908]: E0909 05:40:08.241132 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.243839 kubelet[2908]: E0909 05:40:08.243818 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.244122 kubelet[2908]: W0909 05:40:08.243938 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.244122 kubelet[2908]: E0909 05:40:08.243962 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.244914 kubelet[2908]: E0909 05:40:08.244809 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.245142 kubelet[2908]: W0909 05:40:08.244995 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.245142 kubelet[2908]: E0909 05:40:08.245016 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.246466 kubelet[2908]: E0909 05:40:08.246268 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.247764 kubelet[2908]: W0909 05:40:08.246597 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.247764 kubelet[2908]: E0909 05:40:08.246619 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.248273 kubelet[2908]: E0909 05:40:08.248146 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.248273 kubelet[2908]: W0909 05:40:08.248243 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.248273 kubelet[2908]: E0909 05:40:08.248258 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.249685 kubelet[2908]: E0909 05:40:08.249162 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.249685 kubelet[2908]: W0909 05:40:08.249179 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.249685 kubelet[2908]: E0909 05:40:08.249191 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.250154 kubelet[2908]: E0909 05:40:08.250025 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.250690 kubelet[2908]: W0909 05:40:08.250536 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.250690 kubelet[2908]: E0909 05:40:08.250558 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.251939 kubelet[2908]: E0909 05:40:08.251924 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.252291 kubelet[2908]: W0909 05:40:08.252181 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.252291 kubelet[2908]: E0909 05:40:08.252202 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.254255 kubelet[2908]: E0909 05:40:08.254041 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.254255 kubelet[2908]: W0909 05:40:08.254057 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.254255 kubelet[2908]: E0909 05:40:08.254074 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.254652 kubelet[2908]: E0909 05:40:08.254418 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.254652 kubelet[2908]: W0909 05:40:08.254625 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.254652 kubelet[2908]: E0909 05:40:08.254638 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.256055 kubelet[2908]: E0909 05:40:08.255930 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.256055 kubelet[2908]: W0909 05:40:08.255944 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.256055 kubelet[2908]: E0909 05:40:08.255956 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.256688 kubelet[2908]: E0909 05:40:08.256413 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.256688 kubelet[2908]: W0909 05:40:08.256599 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.256688 kubelet[2908]: E0909 05:40:08.256613 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.257753 kubelet[2908]: E0909 05:40:08.257721 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.257920 kubelet[2908]: W0909 05:40:08.257907 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.258202 kubelet[2908]: E0909 05:40:08.258015 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.259813 kubelet[2908]: E0909 05:40:08.259283 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.259813 kubelet[2908]: W0909 05:40:08.259778 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.259813 kubelet[2908]: E0909 05:40:08.259799 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.262727 kubelet[2908]: E0909 05:40:08.260345 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.262727 kubelet[2908]: W0909 05:40:08.262687 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.262727 kubelet[2908]: E0909 05:40:08.262709 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.264950 kubelet[2908]: E0909 05:40:08.264850 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.265268 kubelet[2908]: W0909 05:40:08.265043 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.265268 kubelet[2908]: E0909 05:40:08.265063 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.266139 kubelet[2908]: E0909 05:40:08.266124 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.266642 kubelet[2908]: W0909 05:40:08.266483 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.266642 kubelet[2908]: E0909 05:40:08.266505 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.268064 kubelet[2908]: E0909 05:40:08.267415 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.268064 kubelet[2908]: W0909 05:40:08.267494 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.268064 kubelet[2908]: E0909 05:40:08.267507 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.268683 kubelet[2908]: E0909 05:40:08.268510 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.269355 kubelet[2908]: W0909 05:40:08.268764 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.269355 kubelet[2908]: E0909 05:40:08.268784 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.270681 kubelet[2908]: E0909 05:40:08.270444 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.270681 kubelet[2908]: W0909 05:40:08.270459 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.270681 kubelet[2908]: E0909 05:40:08.270471 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.271769 kubelet[2908]: E0909 05:40:08.271650 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.272387 kubelet[2908]: W0909 05:40:08.272291 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.272387 kubelet[2908]: E0909 05:40:08.272311 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:08.273092 kubelet[2908]: E0909 05:40:08.273046 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.273730 kubelet[2908]: W0909 05:40:08.273293 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.273730 kubelet[2908]: E0909 05:40:08.273312 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.274940 kubelet[2908]: E0909 05:40:08.274924 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.275098 kubelet[2908]: W0909 05:40:08.275022 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.275098 kubelet[2908]: E0909 05:40:08.275051 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.299646 systemd[1]: Started cri-containerd-7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d.scope - libcontainer container 7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d. Sep 9 05:40:08.329912 kubelet[2908]: E0909 05:40:08.329826 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:08.329912 kubelet[2908]: W0909 05:40:08.329850 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:08.329912 kubelet[2908]: E0909 05:40:08.329872 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:08.431079 containerd[1587]: time="2025-09-09T05:40:08.430979299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-k8kln,Uid:ca54b2ac-d9d6-49d9-9f04-1ba00135f4f9,Namespace:calico-system,Attempt:0,} returns sandbox id \"7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d\"" Sep 9 05:40:09.802198 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2103232963.mount: Deactivated successfully. 
Sep 9 05:40:09.998205 kubelet[2908]: E0909 05:40:09.997700 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czgjp" podUID="27cc7ce2-3e74-49cc-83d6-1559ba07fa7b" Sep 9 05:40:11.667611 containerd[1587]: time="2025-09-09T05:40:11.667495277Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:11.669921 containerd[1587]: time="2025-09-09T05:40:11.669396674Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 9 05:40:11.669921 containerd[1587]: time="2025-09-09T05:40:11.669561650Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:11.673707 containerd[1587]: time="2025-09-09T05:40:11.673650048Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:11.677654 containerd[1587]: time="2025-09-09T05:40:11.677607658Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.68980275s" Sep 9 05:40:11.677885 containerd[1587]: time="2025-09-09T05:40:11.677744985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 9 05:40:11.711965 containerd[1587]: time="2025-09-09T05:40:11.710669136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 05:40:11.733712 containerd[1587]: time="2025-09-09T05:40:11.733643292Z" level=info msg="CreateContainer within sandbox \"ed3920dd48fd719a6383fb0a70b282dcd24030101a5c1f64a3abdf8b4ccf1467\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 05:40:11.742685 containerd[1587]: time="2025-09-09T05:40:11.741932908Z" level=info msg="Container 83b4919ffb7b8328b79b966b44f783f6ab056cb7651fa5615b2d965c696fc2ed: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:11.763312 containerd[1587]: time="2025-09-09T05:40:11.763239935Z" level=info msg="CreateContainer within sandbox \"ed3920dd48fd719a6383fb0a70b282dcd24030101a5c1f64a3abdf8b4ccf1467\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"83b4919ffb7b8328b79b966b44f783f6ab056cb7651fa5615b2d965c696fc2ed\"" Sep 9 05:40:11.767013 containerd[1587]: time="2025-09-09T05:40:11.766981882Z" level=info msg="StartContainer for \"83b4919ffb7b8328b79b966b44f783f6ab056cb7651fa5615b2d965c696fc2ed\"" Sep 9 05:40:11.769732 containerd[1587]: time="2025-09-09T05:40:11.769670178Z" level=info msg="connecting to shim 83b4919ffb7b8328b79b966b44f783f6ab056cb7651fa5615b2d965c696fc2ed" address="unix:///run/containerd/s/a3c4a7afa7e19625874791b9ffb50ee2732bd5e7732b70ead6501751616514b6" protocol=ttrpc version=3 Sep 9 05:40:11.802132 systemd[1]: Started 
cri-containerd-83b4919ffb7b8328b79b966b44f783f6ab056cb7651fa5615b2d965c696fc2ed.scope - libcontainer container 83b4919ffb7b8328b79b966b44f783f6ab056cb7651fa5615b2d965c696fc2ed. Sep 9 05:40:11.880030 containerd[1587]: time="2025-09-09T05:40:11.879973797Z" level=info msg="StartContainer for \"83b4919ffb7b8328b79b966b44f783f6ab056cb7651fa5615b2d965c696fc2ed\" returns successfully" Sep 9 05:40:11.998779 kubelet[2908]: E0909 05:40:11.997728 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czgjp" podUID="27cc7ce2-3e74-49cc-83d6-1559ba07fa7b" Sep 9 05:40:12.161835 kubelet[2908]: E0909 05:40:12.161652 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.161835 kubelet[2908]: W0909 05:40:12.161828 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.162042 kubelet[2908]: E0909 05:40:12.161851 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.162839 kubelet[2908]: E0909 05:40:12.162817 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.162839 kubelet[2908]: W0909 05:40:12.162834 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.162964 kubelet[2908]: E0909 05:40:12.162849 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.163019 kubelet[2908]: E0909 05:40:12.163009 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.163052 kubelet[2908]: W0909 05:40:12.163019 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.163052 kubelet[2908]: E0909 05:40:12.163027 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.163207 kubelet[2908]: E0909 05:40:12.163180 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.163207 kubelet[2908]: W0909 05:40:12.163187 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.163207 kubelet[2908]: E0909 05:40:12.163203 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:12.163368 kubelet[2908]: E0909 05:40:12.163358 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.163368 kubelet[2908]: W0909 05:40:12.163368 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.163440 kubelet[2908]: E0909 05:40:12.163375 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.163583 kubelet[2908]: E0909 05:40:12.163570 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.163583 kubelet[2908]: W0909 05:40:12.163582 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.163678 kubelet[2908]: E0909 05:40:12.163593 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.163765 kubelet[2908]: E0909 05:40:12.163755 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.163765 kubelet[2908]: W0909 05:40:12.163764 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.164049 kubelet[2908]: E0909 05:40:12.163782 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.164049 kubelet[2908]: E0909 05:40:12.163970 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.164049 kubelet[2908]: W0909 05:40:12.163979 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.164049 kubelet[2908]: E0909 05:40:12.163989 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.164714 kubelet[2908]: E0909 05:40:12.164699 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.164714 kubelet[2908]: W0909 05:40:12.164713 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.164896 kubelet[2908]: E0909 05:40:12.164724 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:12.164896 kubelet[2908]: E0909 05:40:12.164875 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.164896 kubelet[2908]: W0909 05:40:12.164882 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.164896 kubelet[2908]: E0909 05:40:12.164889 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.165315 kubelet[2908]: E0909 05:40:12.165302 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.165357 kubelet[2908]: W0909 05:40:12.165315 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.165357 kubelet[2908]: E0909 05:40:12.165339 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.165601 kubelet[2908]: E0909 05:40:12.165577 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.165601 kubelet[2908]: W0909 05:40:12.165588 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.165601 kubelet[2908]: E0909 05:40:12.165597 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.166053 kubelet[2908]: E0909 05:40:12.166039 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.166100 kubelet[2908]: W0909 05:40:12.166064 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.166100 kubelet[2908]: E0909 05:40:12.166074 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.166248 kubelet[2908]: E0909 05:40:12.166235 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.166248 kubelet[2908]: W0909 05:40:12.166245 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.166320 kubelet[2908]: E0909 05:40:12.166253 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:12.167757 kubelet[2908]: E0909 05:40:12.167732 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.167757 kubelet[2908]: W0909 05:40:12.167747 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.167870 kubelet[2908]: E0909 05:40:12.167771 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.188045 kubelet[2908]: E0909 05:40:12.187943 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.189829 kubelet[2908]: W0909 05:40:12.187966 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.189829 kubelet[2908]: E0909 05:40:12.189731 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.191367 kubelet[2908]: E0909 05:40:12.191336 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.192114 kubelet[2908]: W0909 05:40:12.191412 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.192114 kubelet[2908]: E0909 05:40:12.191519 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.192615 kubelet[2908]: E0909 05:40:12.192599 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.193546 kubelet[2908]: W0909 05:40:12.192708 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.193546 kubelet[2908]: E0909 05:40:12.192745 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.195461 kubelet[2908]: E0909 05:40:12.195445 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.195950 kubelet[2908]: W0909 05:40:12.195928 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.196010 kubelet[2908]: E0909 05:40:12.195954 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:12.196991 kubelet[2908]: E0909 05:40:12.196971 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.196991 kubelet[2908]: W0909 05:40:12.196987 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.198023 kubelet[2908]: E0909 05:40:12.197210 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.199680 kubelet[2908]: E0909 05:40:12.198260 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.199680 kubelet[2908]: W0909 05:40:12.198376 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.199680 kubelet[2908]: E0909 05:40:12.198390 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.199931 kubelet[2908]: E0909 05:40:12.199917 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.200148 kubelet[2908]: W0909 05:40:12.199979 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.200148 kubelet[2908]: E0909 05:40:12.199998 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.200148 kubelet[2908]: I0909 05:40:12.200037 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5fd6b5c47c-t9958" podStartSLOduration=1.476223425 podStartE2EDuration="5.20001816s" podCreationTimestamp="2025-09-09 05:40:07 +0000 UTC" firstStartedPulling="2025-09-09 05:40:07.985876912 +0000 UTC m=+21.169981168" lastFinishedPulling="2025-09-09 05:40:11.709671652 +0000 UTC m=+24.893775903" observedRunningTime="2025-09-09 05:40:12.195721739 +0000 UTC m=+25.379826010" watchObservedRunningTime="2025-09-09 05:40:12.20001816 +0000 UTC m=+25.384125249" Sep 9 05:40:12.200353 kubelet[2908]: E0909 05:40:12.200343 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.200412 kubelet[2908]: W0909 05:40:12.200402 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.200488 kubelet[2908]: E0909 05:40:12.200476 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:12.202400 kubelet[2908]: E0909 05:40:12.202305 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.202400 kubelet[2908]: W0909 05:40:12.202399 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.202772 kubelet[2908]: E0909 05:40:12.202420 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.203975 kubelet[2908]: E0909 05:40:12.203935 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.203975 kubelet[2908]: W0909 05:40:12.203952 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.204101 kubelet[2908]: E0909 05:40:12.203965 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.204209 kubelet[2908]: E0909 05:40:12.204200 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.204247 kubelet[2908]: W0909 05:40:12.204209 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.204385 kubelet[2908]: E0909 05:40:12.204373 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.204690 kubelet[2908]: E0909 05:40:12.204571 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.204690 kubelet[2908]: W0909 05:40:12.204594 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.204690 kubelet[2908]: E0909 05:40:12.204605 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.206101 kubelet[2908]: E0909 05:40:12.206079 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.206101 kubelet[2908]: W0909 05:40:12.206094 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.206197 kubelet[2908]: E0909 05:40:12.206113 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:12.209532 kubelet[2908]: E0909 05:40:12.208925 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.209532 kubelet[2908]: W0909 05:40:12.208961 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.209532 kubelet[2908]: E0909 05:40:12.208976 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.209532 kubelet[2908]: E0909 05:40:12.209141 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.209532 kubelet[2908]: W0909 05:40:12.209148 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.209532 kubelet[2908]: E0909 05:40:12.209170 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.209532 kubelet[2908]: E0909 05:40:12.209319 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.209532 kubelet[2908]: W0909 05:40:12.209327 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.209532 kubelet[2908]: E0909 05:40:12.209336 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.210724 kubelet[2908]: E0909 05:40:12.210573 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.210724 kubelet[2908]: W0909 05:40:12.210591 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.210724 kubelet[2908]: E0909 05:40:12.210605 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:12.211222 kubelet[2908]: E0909 05:40:12.211052 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:12.211222 kubelet[2908]: W0909 05:40:12.211066 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:12.211222 kubelet[2908]: E0909 05:40:12.211078 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:13.178316 kubelet[2908]: E0909 05:40:13.178245 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.178316 kubelet[2908]: W0909 05:40:13.178273 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.178316 kubelet[2908]: E0909 05:40:13.178298 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.179631 kubelet[2908]: E0909 05:40:13.179432 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.179631 kubelet[2908]: W0909 05:40:13.179619 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.179871 kubelet[2908]: E0909 05:40:13.179640 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.180480 kubelet[2908]: E0909 05:40:13.180463 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.180523 kubelet[2908]: W0909 05:40:13.180482 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.180872 kubelet[2908]: E0909 05:40:13.180581 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.181049 kubelet[2908]: E0909 05:40:13.181033 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.181167 kubelet[2908]: W0909 05:40:13.181052 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.181167 kubelet[2908]: E0909 05:40:13.181066 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.181461 kubelet[2908]: E0909 05:40:13.181280 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.181461 kubelet[2908]: W0909 05:40:13.181295 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.181461 kubelet[2908]: E0909 05:40:13.181305 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:13.181586 kubelet[2908]: E0909 05:40:13.181493 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.181586 kubelet[2908]: W0909 05:40:13.181551 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.182015 kubelet[2908]: E0909 05:40:13.181564 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.182701 kubelet[2908]: E0909 05:40:13.182682 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.182701 kubelet[2908]: W0909 05:40:13.182699 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.182805 kubelet[2908]: E0909 05:40:13.182713 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.183682 kubelet[2908]: E0909 05:40:13.183645 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.183682 kubelet[2908]: W0909 05:40:13.183676 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.184102 kubelet[2908]: E0909 05:40:13.183691 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.184483 kubelet[2908]: E0909 05:40:13.184457 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.184483 kubelet[2908]: W0909 05:40:13.184474 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.184793 kubelet[2908]: E0909 05:40:13.184488 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.184953 kubelet[2908]: E0909 05:40:13.184937 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.185003 kubelet[2908]: W0909 05:40:13.184953 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.185003 kubelet[2908]: E0909 05:40:13.184970 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:13.185682 kubelet[2908]: E0909 05:40:13.185648 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.185682 kubelet[2908]: W0909 05:40:13.185677 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.185783 kubelet[2908]: E0909 05:40:13.185691 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.185925 kubelet[2908]: E0909 05:40:13.185913 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.185966 kubelet[2908]: W0909 05:40:13.185926 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.185966 kubelet[2908]: E0909 05:40:13.185936 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.186546 kubelet[2908]: E0909 05:40:13.186531 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.186631 kubelet[2908]: W0909 05:40:13.186547 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.186631 kubelet[2908]: E0909 05:40:13.186560 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.187787 kubelet[2908]: E0909 05:40:13.187557 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.187787 kubelet[2908]: W0909 05:40:13.187783 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.187877 kubelet[2908]: E0909 05:40:13.187798 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.188543 kubelet[2908]: E0909 05:40:13.188496 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.188543 kubelet[2908]: W0909 05:40:13.188513 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.188543 kubelet[2908]: E0909 05:40:13.188526 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:13.200429 kubelet[2908]: E0909 05:40:13.200142 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.200429 kubelet[2908]: W0909 05:40:13.200175 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.200429 kubelet[2908]: E0909 05:40:13.200203 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.201553 kubelet[2908]: E0909 05:40:13.201236 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.201553 kubelet[2908]: W0909 05:40:13.201254 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.201553 kubelet[2908]: E0909 05:40:13.201275 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.201855 kubelet[2908]: E0909 05:40:13.201838 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.202184 kubelet[2908]: W0909 05:40:13.202030 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.202184 kubelet[2908]: E0909 05:40:13.202058 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.202752 kubelet[2908]: E0909 05:40:13.202701 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.202832 kubelet[2908]: W0909 05:40:13.202752 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.202832 kubelet[2908]: E0909 05:40:13.202778 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.203141 kubelet[2908]: E0909 05:40:13.203123 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.203195 kubelet[2908]: W0909 05:40:13.203142 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.203195 kubelet[2908]: E0909 05:40:13.203160 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:13.203554 kubelet[2908]: E0909 05:40:13.203542 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.203589 kubelet[2908]: W0909 05:40:13.203554 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.203589 kubelet[2908]: E0909 05:40:13.203565 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.204127 kubelet[2908]: E0909 05:40:13.203999 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.204127 kubelet[2908]: W0909 05:40:13.204012 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.204127 kubelet[2908]: E0909 05:40:13.204022 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.204449 kubelet[2908]: E0909 05:40:13.204436 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.204449 kubelet[2908]: W0909 05:40:13.204449 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.204519 kubelet[2908]: E0909 05:40:13.204459 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.206571 kubelet[2908]: E0909 05:40:13.206550 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.206571 kubelet[2908]: W0909 05:40:13.206568 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.206727 kubelet[2908]: E0909 05:40:13.206582 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.207216 kubelet[2908]: E0909 05:40:13.207201 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.207216 kubelet[2908]: W0909 05:40:13.207215 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.207290 kubelet[2908]: E0909 05:40:13.207228 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:13.207895 kubelet[2908]: E0909 05:40:13.207654 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.207895 kubelet[2908]: W0909 05:40:13.207739 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.207895 kubelet[2908]: E0909 05:40:13.207778 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.208199 kubelet[2908]: E0909 05:40:13.208185 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.208199 kubelet[2908]: W0909 05:40:13.208200 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.208564 kubelet[2908]: E0909 05:40:13.208217 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.208845 kubelet[2908]: E0909 05:40:13.208828 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.208845 kubelet[2908]: W0909 05:40:13.208843 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.208941 kubelet[2908]: E0909 05:40:13.208854 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.209346 kubelet[2908]: E0909 05:40:13.209320 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.209346 kubelet[2908]: W0909 05:40:13.209344 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.209470 kubelet[2908]: E0909 05:40:13.209355 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.209599 kubelet[2908]: E0909 05:40:13.209538 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.209599 kubelet[2908]: W0909 05:40:13.209545 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.209599 kubelet[2908]: E0909 05:40:13.209553 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:13.209943 kubelet[2908]: E0909 05:40:13.209929 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.209943 kubelet[2908]: W0909 05:40:13.209942 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.210016 kubelet[2908]: E0909 05:40:13.209956 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.210430 kubelet[2908]: E0909 05:40:13.210334 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.210430 kubelet[2908]: W0909 05:40:13.210346 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.210430 kubelet[2908]: E0909 05:40:13.210357 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:40:13.210680 kubelet[2908]: E0909 05:40:13.210642 2908 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:40:13.210818 kubelet[2908]: W0909 05:40:13.210654 2908 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:40:13.210818 kubelet[2908]: E0909 05:40:13.210735 2908 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:40:13.263944 containerd[1587]: time="2025-09-09T05:40:13.263903693Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:13.265095 containerd[1587]: time="2025-09-09T05:40:13.265062247Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 05:40:13.265462 containerd[1587]: time="2025-09-09T05:40:13.265422883Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:13.269225 containerd[1587]: time="2025-09-09T05:40:13.269171934Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:13.270103 containerd[1587]: time="2025-09-09T05:40:13.269995540Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.559278251s" Sep 9 05:40:13.270103 containerd[1587]: time="2025-09-09T05:40:13.270030718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 05:40:13.275891 containerd[1587]: time="2025-09-09T05:40:13.275837080Z" level=info msg="CreateContainer within sandbox \"7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 05:40:13.303528 containerd[1587]: time="2025-09-09T05:40:13.303282693Z" level=info msg="Container eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:13.330211 containerd[1587]: time="2025-09-09T05:40:13.330171579Z" level=info msg="CreateContainer within sandbox \"7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9\"" Sep 9 05:40:13.331937 containerd[1587]: time="2025-09-09T05:40:13.331904998Z" level=info msg="StartContainer for \"eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9\"" Sep 9 05:40:13.333601 containerd[1587]: time="2025-09-09T05:40:13.333550757Z" level=info msg="connecting to shim eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9" address="unix:///run/containerd/s/5d6565286909f2e243dc909bf9e53a94f45be71fcbfce7b47cb4ab1446a6e0b4" protocol=ttrpc version=3 Sep 9 05:40:13.366923 systemd[1]: Started cri-containerd-eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9.scope - libcontainer container eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9. 
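The repeated "unexpected end of JSON input" / "executable file not found in $PATH" pairs above come from the kubelet's FlexVolume prober: it invokes each driver under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ with the init argument, and because the nodeagent~uds/uds binary has not been installed yet (installing it is what the flexvol-driver container created above is for), the call produces empty output, which the kubelet then fails to unmarshal as JSON. A minimal sketch of what a FlexVolume driver is expected to print for init, assuming the standard FlexVolume calling convention (illustration only, not the real uds driver):

    #!/usr/bin/env python3
    # Minimal sketch of a FlexVolume driver's "init" handler (illustration only;
    # the real nodeagent~uds/uds binary is installed by the flexvol-driver
    # container, not by this script).
    import json
    import sys

    def main() -> int:
        op = sys.argv[1] if len(sys.argv) > 1 else ""
        if op == "init":
            # The kubelet unmarshals stdout as JSON; empty stdout is exactly the
            # "unexpected end of JSON input" seen in the log above.
            print(json.dumps({"status": "Success",
                              "capabilities": {"attach": False}}))
            return 0
        print(json.dumps({"status": "Not supported"}))
        return 1

    if __name__ == "__main__":
        sys.exit(main())

Once a driver binary exists at that path and prints JSON like this, the prober stops logging these errors.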
Sep 9 05:40:13.417684 containerd[1587]: time="2025-09-09T05:40:13.417619215Z" level=info msg="StartContainer for \"eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9\" returns successfully" Sep 9 05:40:13.434022 systemd[1]: cri-containerd-eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9.scope: Deactivated successfully. Sep 9 05:40:13.482970 containerd[1587]: time="2025-09-09T05:40:13.482908668Z" level=info msg="received exit event container_id:\"eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9\" id:\"eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9\" pid:3615 exited_at:{seconds:1757396413 nanos:437132674}" Sep 9 05:40:13.503582 containerd[1587]: time="2025-09-09T05:40:13.502515102Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9\" id:\"eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9\" pid:3615 exited_at:{seconds:1757396413 nanos:437132674}" Sep 9 05:40:13.532868 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eb2579019984bd1dc72dc7f3192ba7ca58eca9898238ec27fbe5172ec662a3c9-rootfs.mount: Deactivated successfully. Sep 9 05:40:13.998253 kubelet[2908]: E0909 05:40:13.998131 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czgjp" podUID="27cc7ce2-3e74-49cc-83d6-1559ba07fa7b" Sep 9 05:40:14.163373 containerd[1587]: time="2025-09-09T05:40:14.163250212Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 05:40:16.000794 kubelet[2908]: E0909 05:40:15.999947 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czgjp" podUID="27cc7ce2-3e74-49cc-83d6-1559ba07fa7b" Sep 9 05:40:17.998564 kubelet[2908]: E0909 05:40:17.998495 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czgjp" podUID="27cc7ce2-3e74-49cc-83d6-1559ba07fa7b" Sep 9 05:40:18.987711 containerd[1587]: time="2025-09-09T05:40:18.987590871Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:18.990245 containerd[1587]: time="2025-09-09T05:40:18.990196549Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 05:40:18.990805 containerd[1587]: time="2025-09-09T05:40:18.990749780Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:18.993857 containerd[1587]: time="2025-09-09T05:40:18.993803958Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:18.995692 containerd[1587]: time="2025-09-09T05:40:18.995560399Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.832191882s" Sep 9 05:40:18.995692 containerd[1587]: time="2025-09-09T05:40:18.995599529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 05:40:19.002845 containerd[1587]: time="2025-09-09T05:40:19.002816816Z" level=info msg="CreateContainer within sandbox \"7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 05:40:19.012917 containerd[1587]: time="2025-09-09T05:40:19.012818569Z" level=info msg="Container eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:19.023553 containerd[1587]: time="2025-09-09T05:40:19.023410371Z" level=info msg="CreateContainer within sandbox \"7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb\"" Sep 9 05:40:19.024570 containerd[1587]: time="2025-09-09T05:40:19.024543422Z" level=info msg="StartContainer for \"eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb\"" Sep 9 05:40:19.027594 containerd[1587]: time="2025-09-09T05:40:19.027211509Z" level=info msg="connecting to shim eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb" address="unix:///run/containerd/s/5d6565286909f2e243dc909bf9e53a94f45be71fcbfce7b47cb4ab1446a6e0b4" protocol=ttrpc version=3 Sep 9 05:40:19.058904 systemd[1]: Started cri-containerd-eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb.scope - libcontainer container eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb. Sep 9 05:40:19.151966 containerd[1587]: time="2025-09-09T05:40:19.151908318Z" level=info msg="StartContainer for \"eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb\" returns successfully" Sep 9 05:40:19.731278 systemd[1]: cri-containerd-eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb.scope: Deactivated successfully. Sep 9 05:40:19.732835 systemd[1]: cri-containerd-eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb.scope: Consumed 579ms CPU time, 166.4M memory peak, 10.5M read from disk, 171.3M written to disk. 
Sep 9 05:40:19.760676 containerd[1587]: time="2025-09-09T05:40:19.760623468Z" level=info msg="received exit event container_id:\"eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb\" id:\"eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb\" pid:3678 exited_at:{seconds:1757396419 nanos:760129983}" Sep 9 05:40:19.763191 containerd[1587]: time="2025-09-09T05:40:19.763137067Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb\" id:\"eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb\" pid:3678 exited_at:{seconds:1757396419 nanos:760129983}" Sep 9 05:40:19.793966 kubelet[2908]: I0909 05:40:19.791605 2908 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 05:40:19.877583 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eab948794bb5b72c859c7285c8a7161a770e745d3c8454144fb94e44af4dc6eb-rootfs.mount: Deactivated successfully. Sep 9 05:40:19.884952 systemd[1]: Created slice kubepods-besteffort-podc286ba84_712e_4b9f_82a6_8f9e50268837.slice - libcontainer container kubepods-besteffort-podc286ba84_712e_4b9f_82a6_8f9e50268837.slice. Sep 9 05:40:19.918985 systemd[1]: Created slice kubepods-burstable-pod9ed1e42f_a14e_4222_8edd_72d601806d5a.slice - libcontainer container kubepods-burstable-pod9ed1e42f_a14e_4222_8edd_72d601806d5a.slice. Sep 9 05:40:19.937838 systemd[1]: Created slice kubepods-besteffort-podb61265df_84eb_42d3_bb4e_3993302b5b2c.slice - libcontainer container kubepods-besteffort-podb61265df_84eb_42d3_bb4e_3993302b5b2c.slice. Sep 9 05:40:19.953930 systemd[1]: Created slice kubepods-besteffort-pod343f6693_fe0c_4929_a714_bd0106a3b358.slice - libcontainer container kubepods-besteffort-pod343f6693_fe0c_4929_a714_bd0106a3b358.slice. Sep 9 05:40:19.965073 systemd[1]: Created slice kubepods-besteffort-pod5238b60f_81c4_4831_b073_cc699bd0b55a.slice - libcontainer container kubepods-besteffort-pod5238b60f_81c4_4831_b073_cc699bd0b55a.slice. 
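The "Created slice kubepods-..." lines show the kubelet's systemd cgroup driver creating one slice per pod, named from the pod's QoS class plus its UID with dashes replaced by underscores. A minimal sketch of that mapping, assuming the usual systemd cgroup driver naming and using pod UIDs taken from the log:

    # Sketch of how a pod UID maps to the slice names created above, assuming
    # the kubelet's systemd cgroup driver convention
    # kubepods-<qos>-pod<uid with '-' replaced by '_'>.slice.
    def pod_slice(uid: str, qos: str = "besteffort") -> str:
        return f"kubepods-{qos}-pod{uid.replace('-', '_')}.slice"

    print(pod_slice("c286ba84-712e-4b9f-82a6-8f9e50268837"))
    # -> kubepods-besteffort-podc286ba84_712e_4b9f_82a6_8f9e50268837.slice
    print(pod_slice("9ed1e42f-a14e-4222-8edd-72d601806d5a", qos="burstable"))
    # -> kubepods-burstable-pod9ed1e42f_a14e_4222_8edd_72d601806d5a.slice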
Sep 9 05:40:19.967745 kubelet[2908]: I0909 05:40:19.967717 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsx2x\" (UniqueName: \"kubernetes.io/projected/04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7-kube-api-access-hsx2x\") pod \"coredns-674b8bbfcf-d4mz8\" (UID: \"04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7\") " pod="kube-system/coredns-674b8bbfcf-d4mz8" Sep 9 05:40:19.967972 kubelet[2908]: I0909 05:40:19.967926 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5238b60f-81c4-4831-b073-cc699bd0b55a-calico-apiserver-certs\") pod \"calico-apiserver-867ddcbfff-ngl89\" (UID: \"5238b60f-81c4-4831-b073-cc699bd0b55a\") " pod="calico-apiserver/calico-apiserver-867ddcbfff-ngl89" Sep 9 05:40:19.967972 kubelet[2908]: I0909 05:40:19.967954 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b61265df-84eb-42d3-bb4e-3993302b5b2c-calico-apiserver-certs\") pod \"calico-apiserver-867ddcbfff-tcnvd\" (UID: \"b61265df-84eb-42d3-bb4e-3993302b5b2c\") " pod="calico-apiserver/calico-apiserver-867ddcbfff-tcnvd" Sep 9 05:40:19.968119 kubelet[2908]: I0909 05:40:19.968058 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/51584f79-8718-43e1-9ab4-f95d38f910f1-goldmane-key-pair\") pod \"goldmane-54d579b49d-jg28r\" (UID: \"51584f79-8718-43e1-9ab4-f95d38f910f1\") " pod="calico-system/goldmane-54d579b49d-jg28r" Sep 9 05:40:19.968119 kubelet[2908]: I0909 05:40:19.968080 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcj4t\" (UniqueName: \"kubernetes.io/projected/5238b60f-81c4-4831-b073-cc699bd0b55a-kube-api-access-pcj4t\") pod \"calico-apiserver-867ddcbfff-ngl89\" (UID: \"5238b60f-81c4-4831-b073-cc699bd0b55a\") " pod="calico-apiserver/calico-apiserver-867ddcbfff-ngl89" Sep 9 05:40:19.968119 kubelet[2908]: I0909 05:40:19.968100 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c286ba84-712e-4b9f-82a6-8f9e50268837-whisker-ca-bundle\") pod \"whisker-695bd4b7b8-jdxn7\" (UID: \"c286ba84-712e-4b9f-82a6-8f9e50268837\") " pod="calico-system/whisker-695bd4b7b8-jdxn7" Sep 9 05:40:19.968559 kubelet[2908]: I0909 05:40:19.968265 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51584f79-8718-43e1-9ab4-f95d38f910f1-config\") pod \"goldmane-54d579b49d-jg28r\" (UID: \"51584f79-8718-43e1-9ab4-f95d38f910f1\") " pod="calico-system/goldmane-54d579b49d-jg28r" Sep 9 05:40:19.968559 kubelet[2908]: I0909 05:40:19.968513 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g676n\" (UniqueName: \"kubernetes.io/projected/51584f79-8718-43e1-9ab4-f95d38f910f1-kube-api-access-g676n\") pod \"goldmane-54d579b49d-jg28r\" (UID: \"51584f79-8718-43e1-9ab4-f95d38f910f1\") " pod="calico-system/goldmane-54d579b49d-jg28r" Sep 9 05:40:19.968559 kubelet[2908]: I0909 05:40:19.968536 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7-config-volume\") pod \"coredns-674b8bbfcf-d4mz8\" (UID: \"04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7\") " pod="kube-system/coredns-674b8bbfcf-d4mz8" Sep 9 05:40:19.968897 kubelet[2908]: I0909 05:40:19.968772 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hcbf\" (UniqueName: \"kubernetes.io/projected/9ed1e42f-a14e-4222-8edd-72d601806d5a-kube-api-access-7hcbf\") pod \"coredns-674b8bbfcf-vtps6\" (UID: \"9ed1e42f-a14e-4222-8edd-72d601806d5a\") " pod="kube-system/coredns-674b8bbfcf-vtps6" Sep 9 05:40:19.969033 kubelet[2908]: I0909 05:40:19.969013 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/343f6693-fe0c-4929-a714-bd0106a3b358-tigera-ca-bundle\") pod \"calico-kube-controllers-586648bd4c-qvfkx\" (UID: \"343f6693-fe0c-4929-a714-bd0106a3b358\") " pod="calico-system/calico-kube-controllers-586648bd4c-qvfkx" Sep 9 05:40:19.969134 kubelet[2908]: I0909 05:40:19.969123 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed1e42f-a14e-4222-8edd-72d601806d5a-config-volume\") pod \"coredns-674b8bbfcf-vtps6\" (UID: \"9ed1e42f-a14e-4222-8edd-72d601806d5a\") " pod="kube-system/coredns-674b8bbfcf-vtps6" Sep 9 05:40:19.969298 kubelet[2908]: I0909 05:40:19.969226 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6q2q\" (UniqueName: \"kubernetes.io/projected/c286ba84-712e-4b9f-82a6-8f9e50268837-kube-api-access-s6q2q\") pod \"whisker-695bd4b7b8-jdxn7\" (UID: \"c286ba84-712e-4b9f-82a6-8f9e50268837\") " pod="calico-system/whisker-695bd4b7b8-jdxn7" Sep 9 05:40:19.969298 kubelet[2908]: I0909 05:40:19.969253 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c286ba84-712e-4b9f-82a6-8f9e50268837-whisker-backend-key-pair\") pod \"whisker-695bd4b7b8-jdxn7\" (UID: \"c286ba84-712e-4b9f-82a6-8f9e50268837\") " pod="calico-system/whisker-695bd4b7b8-jdxn7" Sep 9 05:40:19.969736 kubelet[2908]: I0909 05:40:19.969400 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8hk\" (UniqueName: \"kubernetes.io/projected/b61265df-84eb-42d3-bb4e-3993302b5b2c-kube-api-access-xf8hk\") pod \"calico-apiserver-867ddcbfff-tcnvd\" (UID: \"b61265df-84eb-42d3-bb4e-3993302b5b2c\") " pod="calico-apiserver/calico-apiserver-867ddcbfff-tcnvd" Sep 9 05:40:19.969736 kubelet[2908]: I0909 05:40:19.969473 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51584f79-8718-43e1-9ab4-f95d38f910f1-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-jg28r\" (UID: \"51584f79-8718-43e1-9ab4-f95d38f910f1\") " pod="calico-system/goldmane-54d579b49d-jg28r" Sep 9 05:40:19.969736 kubelet[2908]: I0909 05:40:19.969554 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fsfq\" (UniqueName: \"kubernetes.io/projected/343f6693-fe0c-4929-a714-bd0106a3b358-kube-api-access-5fsfq\") pod \"calico-kube-controllers-586648bd4c-qvfkx\" (UID: \"343f6693-fe0c-4929-a714-bd0106a3b358\") " 
pod="calico-system/calico-kube-controllers-586648bd4c-qvfkx" Sep 9 05:40:19.975268 systemd[1]: Created slice kubepods-burstable-pod04f1a8dd_8e48_4ae0_b634_feaea7fa7cc7.slice - libcontainer container kubepods-burstable-pod04f1a8dd_8e48_4ae0_b634_feaea7fa7cc7.slice. Sep 9 05:40:19.984086 systemd[1]: Created slice kubepods-besteffort-pod51584f79_8718_43e1_9ab4_f95d38f910f1.slice - libcontainer container kubepods-besteffort-pod51584f79_8718_43e1_9ab4_f95d38f910f1.slice. Sep 9 05:40:20.009843 systemd[1]: Created slice kubepods-besteffort-pod27cc7ce2_3e74_49cc_83d6_1559ba07fa7b.slice - libcontainer container kubepods-besteffort-pod27cc7ce2_3e74_49cc_83d6_1559ba07fa7b.slice. Sep 9 05:40:20.022104 containerd[1587]: time="2025-09-09T05:40:20.022054857Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-czgjp,Uid:27cc7ce2-3e74-49cc-83d6-1559ba07fa7b,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:20.214125 containerd[1587]: time="2025-09-09T05:40:20.211332455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-695bd4b7b8-jdxn7,Uid:c286ba84-712e-4b9f-82a6-8f9e50268837,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:20.231127 containerd[1587]: time="2025-09-09T05:40:20.230776032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vtps6,Uid:9ed1e42f-a14e-4222-8edd-72d601806d5a,Namespace:kube-system,Attempt:0,}" Sep 9 05:40:20.256005 containerd[1587]: time="2025-09-09T05:40:20.255333015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867ddcbfff-tcnvd,Uid:b61265df-84eb-42d3-bb4e-3993302b5b2c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:40:20.259296 containerd[1587]: time="2025-09-09T05:40:20.259112819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 05:40:20.274536 containerd[1587]: time="2025-09-09T05:40:20.272289568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586648bd4c-qvfkx,Uid:343f6693-fe0c-4929-a714-bd0106a3b358,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:20.290335 containerd[1587]: time="2025-09-09T05:40:20.289858836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867ddcbfff-ngl89,Uid:5238b60f-81c4-4831-b073-cc699bd0b55a,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:40:20.290335 containerd[1587]: time="2025-09-09T05:40:20.290329982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-d4mz8,Uid:04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7,Namespace:kube-system,Attempt:0,}" Sep 9 05:40:20.331478 containerd[1587]: time="2025-09-09T05:40:20.331400363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-jg28r,Uid:51584f79-8718-43e1-9ab4-f95d38f910f1,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:20.629419 containerd[1587]: time="2025-09-09T05:40:20.629340774Z" level=error msg="Failed to destroy network for sandbox \"5c735c4235b440d8961ac5d744900de6d5bdfca63b6aab1a4d12314ce58702b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.641685 containerd[1587]: time="2025-09-09T05:40:20.641170839Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vtps6,Uid:9ed1e42f-a14e-4222-8edd-72d601806d5a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5c735c4235b440d8961ac5d744900de6d5bdfca63b6aab1a4d12314ce58702b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.643815 kubelet[2908]: E0909 05:40:20.643589 2908 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c735c4235b440d8961ac5d744900de6d5bdfca63b6aab1a4d12314ce58702b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.643815 kubelet[2908]: E0909 05:40:20.643712 2908 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c735c4235b440d8961ac5d744900de6d5bdfca63b6aab1a4d12314ce58702b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vtps6" Sep 9 05:40:20.643815 kubelet[2908]: E0909 05:40:20.643740 2908 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c735c4235b440d8961ac5d744900de6d5bdfca63b6aab1a4d12314ce58702b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vtps6" Sep 9 05:40:20.646940 kubelet[2908]: E0909 05:40:20.646724 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-vtps6_kube-system(9ed1e42f-a14e-4222-8edd-72d601806d5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-vtps6_kube-system(9ed1e42f-a14e-4222-8edd-72d601806d5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5c735c4235b440d8961ac5d744900de6d5bdfca63b6aab1a4d12314ce58702b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-vtps6" podUID="9ed1e42f-a14e-4222-8edd-72d601806d5a" Sep 9 05:40:20.654008 containerd[1587]: time="2025-09-09T05:40:20.653891038Z" level=error msg="Failed to destroy network for sandbox \"a3a87ebb9d45e4d60289243c47d275530f0a391062c785247f154f135d429cdc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.655483 containerd[1587]: time="2025-09-09T05:40:20.655389256Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-czgjp,Uid:27cc7ce2-3e74-49cc-83d6-1559ba07fa7b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3a87ebb9d45e4d60289243c47d275530f0a391062c785247f154f135d429cdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.668144 kubelet[2908]: E0909 05:40:20.668010 2908 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3a87ebb9d45e4d60289243c47d275530f0a391062c785247f154f135d429cdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.669275 kubelet[2908]: E0909 05:40:20.669208 2908 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3a87ebb9d45e4d60289243c47d275530f0a391062c785247f154f135d429cdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-czgjp" Sep 9 05:40:20.669275 kubelet[2908]: E0909 05:40:20.669270 2908 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3a87ebb9d45e4d60289243c47d275530f0a391062c785247f154f135d429cdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-czgjp" Sep 9 05:40:20.669471 kubelet[2908]: E0909 05:40:20.669352 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-czgjp_calico-system(27cc7ce2-3e74-49cc-83d6-1559ba07fa7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-czgjp_calico-system(27cc7ce2-3e74-49cc-83d6-1559ba07fa7b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3a87ebb9d45e4d60289243c47d275530f0a391062c785247f154f135d429cdc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-czgjp" podUID="27cc7ce2-3e74-49cc-83d6-1559ba07fa7b" Sep 9 05:40:20.690118 containerd[1587]: time="2025-09-09T05:40:20.689900417Z" level=error msg="Failed to destroy network for sandbox \"d8526853c1709a24f5b86e8376f3f12f69c92754c0f09b5b14166b03cc715ac7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.691301 containerd[1587]: time="2025-09-09T05:40:20.690970173Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-jg28r,Uid:51584f79-8718-43e1-9ab4-f95d38f910f1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8526853c1709a24f5b86e8376f3f12f69c92754c0f09b5b14166b03cc715ac7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.692017 kubelet[2908]: E0909 05:40:20.691968 2908 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8526853c1709a24f5b86e8376f3f12f69c92754c0f09b5b14166b03cc715ac7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.692128 kubelet[2908]: E0909 
05:40:20.692058 2908 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8526853c1709a24f5b86e8376f3f12f69c92754c0f09b5b14166b03cc715ac7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-jg28r" Sep 9 05:40:20.692128 kubelet[2908]: E0909 05:40:20.692088 2908 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8526853c1709a24f5b86e8376f3f12f69c92754c0f09b5b14166b03cc715ac7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-jg28r" Sep 9 05:40:20.693369 kubelet[2908]: E0909 05:40:20.692179 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-jg28r_calico-system(51584f79-8718-43e1-9ab4-f95d38f910f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-jg28r_calico-system(51584f79-8718-43e1-9ab4-f95d38f910f1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d8526853c1709a24f5b86e8376f3f12f69c92754c0f09b5b14166b03cc715ac7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-jg28r" podUID="51584f79-8718-43e1-9ab4-f95d38f910f1" Sep 9 05:40:20.731682 containerd[1587]: time="2025-09-09T05:40:20.731600515Z" level=error msg="Failed to destroy network for sandbox \"2b111b57cc9b061536149459f01ab4c1525e50e154950bf90e0e21e261ea9fd2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.732419 containerd[1587]: time="2025-09-09T05:40:20.732345668Z" level=error msg="Failed to destroy network for sandbox \"1e5c8b3c96760add050e3f6bb6b233158a5bee2c67974ac98d3a80d65da89ab2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.732882 containerd[1587]: time="2025-09-09T05:40:20.732721022Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867ddcbfff-ngl89,Uid:5238b60f-81c4-4831-b073-cc699bd0b55a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b111b57cc9b061536149459f01ab4c1525e50e154950bf90e0e21e261ea9fd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.733171 kubelet[2908]: E0909 05:40:20.733122 2908 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b111b57cc9b061536149459f01ab4c1525e50e154950bf90e0e21e261ea9fd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 9 05:40:20.733278 kubelet[2908]: E0909 05:40:20.733209 2908 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b111b57cc9b061536149459f01ab4c1525e50e154950bf90e0e21e261ea9fd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867ddcbfff-ngl89" Sep 9 05:40:20.733278 kubelet[2908]: E0909 05:40:20.733260 2908 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b111b57cc9b061536149459f01ab4c1525e50e154950bf90e0e21e261ea9fd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867ddcbfff-ngl89" Sep 9 05:40:20.733630 kubelet[2908]: E0909 05:40:20.733336 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867ddcbfff-ngl89_calico-apiserver(5238b60f-81c4-4831-b073-cc699bd0b55a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867ddcbfff-ngl89_calico-apiserver(5238b60f-81c4-4831-b073-cc699bd0b55a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b111b57cc9b061536149459f01ab4c1525e50e154950bf90e0e21e261ea9fd2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867ddcbfff-ngl89" podUID="5238b60f-81c4-4831-b073-cc699bd0b55a" Sep 9 05:40:20.735390 kubelet[2908]: E0909 05:40:20.734767 2908 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e5c8b3c96760add050e3f6bb6b233158a5bee2c67974ac98d3a80d65da89ab2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.735390 kubelet[2908]: E0909 05:40:20.734823 2908 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e5c8b3c96760add050e3f6bb6b233158a5bee2c67974ac98d3a80d65da89ab2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-d4mz8" Sep 9 05:40:20.735390 kubelet[2908]: E0909 05:40:20.734847 2908 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e5c8b3c96760add050e3f6bb6b233158a5bee2c67974ac98d3a80d65da89ab2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-d4mz8" Sep 9 05:40:20.735521 containerd[1587]: time="2025-09-09T05:40:20.733960310Z" level=error msg="Failed to destroy network for sandbox \"7e9a9b2685f90363660bac75617543df79929395dbb9fe42d548e1a0d36ba17e\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.735521 containerd[1587]: time="2025-09-09T05:40:20.734273341Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-d4mz8,Uid:04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e5c8b3c96760add050e3f6bb6b233158a5bee2c67974ac98d3a80d65da89ab2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.735521 containerd[1587]: time="2025-09-09T05:40:20.735332539Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-695bd4b7b8-jdxn7,Uid:c286ba84-712e-4b9f-82a6-8f9e50268837,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e9a9b2685f90363660bac75617543df79929395dbb9fe42d548e1a0d36ba17e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.735688 kubelet[2908]: E0909 05:40:20.734902 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-d4mz8_kube-system(04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-d4mz8_kube-system(04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1e5c8b3c96760add050e3f6bb6b233158a5bee2c67974ac98d3a80d65da89ab2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-d4mz8" podUID="04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7" Sep 9 05:40:20.735688 kubelet[2908]: E0909 05:40:20.735585 2908 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e9a9b2685f90363660bac75617543df79929395dbb9fe42d548e1a0d36ba17e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.736296 kubelet[2908]: E0909 05:40:20.736148 2908 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e9a9b2685f90363660bac75617543df79929395dbb9fe42d548e1a0d36ba17e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-695bd4b7b8-jdxn7" Sep 9 05:40:20.736296 kubelet[2908]: E0909 05:40:20.736189 2908 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e9a9b2685f90363660bac75617543df79929395dbb9fe42d548e1a0d36ba17e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-695bd4b7b8-jdxn7" Sep 9 05:40:20.736569 kubelet[2908]: E0909 
05:40:20.736257 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-695bd4b7b8-jdxn7_calico-system(c286ba84-712e-4b9f-82a6-8f9e50268837)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-695bd4b7b8-jdxn7_calico-system(c286ba84-712e-4b9f-82a6-8f9e50268837)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7e9a9b2685f90363660bac75617543df79929395dbb9fe42d548e1a0d36ba17e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-695bd4b7b8-jdxn7" podUID="c286ba84-712e-4b9f-82a6-8f9e50268837" Sep 9 05:40:20.757262 containerd[1587]: time="2025-09-09T05:40:20.757115786Z" level=error msg="Failed to destroy network for sandbox \"3d32184f976d77b5dca29be6ec08f849c251ee695440f7fad50980fa4be799f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.757412 containerd[1587]: time="2025-09-09T05:40:20.757362123Z" level=error msg="Failed to destroy network for sandbox \"441f6f51a858011240540034c12254970c181631dd11b71b7087e2f2a009be50\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.758338 containerd[1587]: time="2025-09-09T05:40:20.758231838Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867ddcbfff-tcnvd,Uid:b61265df-84eb-42d3-bb4e-3993302b5b2c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d32184f976d77b5dca29be6ec08f849c251ee695440f7fad50980fa4be799f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.759780 kubelet[2908]: E0909 05:40:20.758822 2908 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d32184f976d77b5dca29be6ec08f849c251ee695440f7fad50980fa4be799f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.759918 kubelet[2908]: E0909 05:40:20.759847 2908 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d32184f976d77b5dca29be6ec08f849c251ee695440f7fad50980fa4be799f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-867ddcbfff-tcnvd" Sep 9 05:40:20.759918 kubelet[2908]: E0909 05:40:20.759877 2908 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d32184f976d77b5dca29be6ec08f849c251ee695440f7fad50980fa4be799f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-867ddcbfff-tcnvd" Sep 9 05:40:20.760107 kubelet[2908]: E0909 05:40:20.759964 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-867ddcbfff-tcnvd_calico-apiserver(b61265df-84eb-42d3-bb4e-3993302b5b2c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-867ddcbfff-tcnvd_calico-apiserver(b61265df-84eb-42d3-bb4e-3993302b5b2c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3d32184f976d77b5dca29be6ec08f849c251ee695440f7fad50980fa4be799f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-867ddcbfff-tcnvd" podUID="b61265df-84eb-42d3-bb4e-3993302b5b2c" Sep 9 05:40:20.760181 containerd[1587]: time="2025-09-09T05:40:20.760095005Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586648bd4c-qvfkx,Uid:343f6693-fe0c-4929-a714-bd0106a3b358,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"441f6f51a858011240540034c12254970c181631dd11b71b7087e2f2a009be50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.760409 kubelet[2908]: E0909 05:40:20.760289 2908 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"441f6f51a858011240540034c12254970c181631dd11b71b7087e2f2a009be50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:40:20.760409 kubelet[2908]: E0909 05:40:20.760326 2908 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"441f6f51a858011240540034c12254970c181631dd11b71b7087e2f2a009be50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586648bd4c-qvfkx" Sep 9 05:40:20.760752 kubelet[2908]: E0909 05:40:20.760721 2908 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"441f6f51a858011240540034c12254970c181631dd11b71b7087e2f2a009be50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586648bd4c-qvfkx" Sep 9 05:40:20.760964 kubelet[2908]: E0909 05:40:20.760783 2908 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-586648bd4c-qvfkx_calico-system(343f6693-fe0c-4929-a714-bd0106a3b358)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-586648bd4c-qvfkx_calico-system(343f6693-fe0c-4929-a714-bd0106a3b358)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"441f6f51a858011240540034c12254970c181631dd11b71b7087e2f2a009be50\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-586648bd4c-qvfkx" podUID="343f6693-fe0c-4929-a714-bd0106a3b358" Sep 9 05:40:21.049814 systemd[1]: run-netns-cni\x2db073895c\x2dd53a\x2d9d42\x2d1dfb\x2da6dff21858a3.mount: Deactivated successfully. Sep 9 05:40:28.786791 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3567719657.mount: Deactivated successfully. Sep 9 05:40:28.837683 containerd[1587]: time="2025-09-09T05:40:28.826393496Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:28.838794 containerd[1587]: time="2025-09-09T05:40:28.838753110Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 05:40:28.839769 containerd[1587]: time="2025-09-09T05:40:28.839733485Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:28.840713 containerd[1587]: time="2025-09-09T05:40:28.840655422Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:28.843604 containerd[1587]: time="2025-09-09T05:40:28.843563464Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 8.582165083s" Sep 9 05:40:28.843840 containerd[1587]: time="2025-09-09T05:40:28.843738653Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 05:40:28.878947 containerd[1587]: time="2025-09-09T05:40:28.878910029Z" level=info msg="CreateContainer within sandbox \"7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 05:40:28.923209 containerd[1587]: time="2025-09-09T05:40:28.922821863Z" level=info msg="Container b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:28.924117 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4250214832.mount: Deactivated successfully. 
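Annotation: every sandbox failure above reduces to the same missing file. The Calico CNI plugin stats /var/lib/calico/nodename and refuses to wire up pod networking until it exists, and it only appears once the calico/node container (whose image pull and CreateContainer are logged just above) has started and written it. A minimal Go sketch of that existence check, using only the path taken from the log; this is illustrative, not Calico's actual plugin code:

```go
// check_nodename.go - sketch of the check implied by the repeated
// "stat /var/lib/calico/nodename: no such file or directory" errors above.
package main

import (
	"fmt"
	"os"
)

func main() {
	const nodenameFile = "/var/lib/calico/nodename" // path copied from the log

	data, err := os.ReadFile(nodenameFile)
	if os.IsNotExist(err) {
		// This is the state the CNI plugin keeps reporting until calico/node
		// has started and written its node name into the shared host volume.
		fmt.Printf("%s is missing: calico/node has not written it yet\n", nodenameFile)
		return
	}
	if err != nil {
		fmt.Printf("unexpected error: %v\n", err)
		return
	}
	fmt.Printf("calico node name: %q\n", string(data))
}
```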
Sep 9 05:40:28.942182 containerd[1587]: time="2025-09-09T05:40:28.942140307Z" level=info msg="CreateContainer within sandbox \"7ed7708ecb9d13a1365062fbbc4ff2c9f7d16475ed95b3c8b96f1b63b463e86d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\"" Sep 9 05:40:28.943056 containerd[1587]: time="2025-09-09T05:40:28.943029253Z" level=info msg="StartContainer for \"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\"" Sep 9 05:40:28.950229 containerd[1587]: time="2025-09-09T05:40:28.950162370Z" level=info msg="connecting to shim b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a" address="unix:///run/containerd/s/5d6565286909f2e243dc909bf9e53a94f45be71fcbfce7b47cb4ab1446a6e0b4" protocol=ttrpc version=3 Sep 9 05:40:29.110881 systemd[1]: Started cri-containerd-b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a.scope - libcontainer container b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a. Sep 9 05:40:29.180944 containerd[1587]: time="2025-09-09T05:40:29.180900144Z" level=info msg="StartContainer for \"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\" returns successfully" Sep 9 05:40:29.343569 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 05:40:29.344423 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 9 05:40:29.367902 kubelet[2908]: I0909 05:40:29.364528 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-k8kln" podStartSLOduration=1.952565376 podStartE2EDuration="22.364508714s" podCreationTimestamp="2025-09-09 05:40:07 +0000 UTC" firstStartedPulling="2025-09-09 05:40:08.432604949 +0000 UTC m=+21.616709202" lastFinishedPulling="2025-09-09 05:40:28.844548289 +0000 UTC m=+42.028652540" observedRunningTime="2025-09-09 05:40:29.364174394 +0000 UTC m=+42.548278673" watchObservedRunningTime="2025-09-09 05:40:29.364508714 +0000 UTC m=+42.548612988" Sep 9 05:40:29.730916 containerd[1587]: time="2025-09-09T05:40:29.730540836Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\" id:\"023ed30f0655601ce4458ddad864c18bf73d10998b7f103995377ca74d6031a1\" pid:3990 exit_status:1 exited_at:{seconds:1757396429 nanos:729877142}" Sep 9 05:40:29.763966 kubelet[2908]: I0909 05:40:29.763868 2908 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6q2q\" (UniqueName: \"kubernetes.io/projected/c286ba84-712e-4b9f-82a6-8f9e50268837-kube-api-access-s6q2q\") pod \"c286ba84-712e-4b9f-82a6-8f9e50268837\" (UID: \"c286ba84-712e-4b9f-82a6-8f9e50268837\") " Sep 9 05:40:29.763966 kubelet[2908]: I0909 05:40:29.763921 2908 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c286ba84-712e-4b9f-82a6-8f9e50268837-whisker-backend-key-pair\") pod \"c286ba84-712e-4b9f-82a6-8f9e50268837\" (UID: \"c286ba84-712e-4b9f-82a6-8f9e50268837\") " Sep 9 05:40:29.763966 kubelet[2908]: I0909 05:40:29.763946 2908 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c286ba84-712e-4b9f-82a6-8f9e50268837-whisker-ca-bundle\") pod \"c286ba84-712e-4b9f-82a6-8f9e50268837\" (UID: \"c286ba84-712e-4b9f-82a6-8f9e50268837\") " Sep 9 05:40:29.771149 kubelet[2908]: I0909 
05:40:29.771104 2908 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c286ba84-712e-4b9f-82a6-8f9e50268837-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "c286ba84-712e-4b9f-82a6-8f9e50268837" (UID: "c286ba84-712e-4b9f-82a6-8f9e50268837"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 05:40:29.780821 kubelet[2908]: I0909 05:40:29.780738 2908 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c286ba84-712e-4b9f-82a6-8f9e50268837-kube-api-access-s6q2q" (OuterVolumeSpecName: "kube-api-access-s6q2q") pod "c286ba84-712e-4b9f-82a6-8f9e50268837" (UID: "c286ba84-712e-4b9f-82a6-8f9e50268837"). InnerVolumeSpecName "kube-api-access-s6q2q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 05:40:29.781110 kubelet[2908]: I0909 05:40:29.780957 2908 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c286ba84-712e-4b9f-82a6-8f9e50268837-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "c286ba84-712e-4b9f-82a6-8f9e50268837" (UID: "c286ba84-712e-4b9f-82a6-8f9e50268837"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 05:40:29.785423 systemd[1]: var-lib-kubelet-pods-c286ba84\x2d712e\x2d4b9f\x2d82a6\x2d8f9e50268837-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ds6q2q.mount: Deactivated successfully. Sep 9 05:40:29.785539 systemd[1]: var-lib-kubelet-pods-c286ba84\x2d712e\x2d4b9f\x2d82a6\x2d8f9e50268837-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 05:40:29.864672 kubelet[2908]: I0909 05:40:29.864580 2908 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6q2q\" (UniqueName: \"kubernetes.io/projected/c286ba84-712e-4b9f-82a6-8f9e50268837-kube-api-access-s6q2q\") on node \"srv-dlh9b.gb1.brightbox.com\" DevicePath \"\"" Sep 9 05:40:29.864672 kubelet[2908]: I0909 05:40:29.864622 2908 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c286ba84-712e-4b9f-82a6-8f9e50268837-whisker-backend-key-pair\") on node \"srv-dlh9b.gb1.brightbox.com\" DevicePath \"\"" Sep 9 05:40:29.864672 kubelet[2908]: I0909 05:40:29.864634 2908 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c286ba84-712e-4b9f-82a6-8f9e50268837-whisker-ca-bundle\") on node \"srv-dlh9b.gb1.brightbox.com\" DevicePath \"\"" Sep 9 05:40:30.296853 systemd[1]: Removed slice kubepods-besteffort-podc286ba84_712e_4b9f_82a6_8f9e50268837.slice - libcontainer container kubepods-besteffort-podc286ba84_712e_4b9f_82a6_8f9e50268837.slice. Sep 9 05:40:30.394105 systemd[1]: Created slice kubepods-besteffort-pod0d171813_85d3_405a_a6e3_aa2a33931c89.slice - libcontainer container kubepods-besteffort-pod0d171813_85d3_405a_a6e3_aa2a33931c89.slice. 
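Annotation: the pod_startup_latency_tracker entry above reports podStartE2EDuration and podStartSLOduration for calico-node-k8kln; the second figure is roughly the first minus the image-pull window (firstStartedPulling to lastFinishedPulling). A short sketch that redoes that arithmetic from the timestamps copied out of the log; the tracker itself may use slightly different internal clock readings, so the last digits differ:

```go
// startup_latency.go - reproduces the arithmetic behind the
// pod_startup_latency_tracker line for calico-node-k8kln above.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied verbatim from the log entry.
	created := mustParse("2025-09-09 05:40:07 +0000 UTC")
	firstPull := mustParse("2025-09-09 05:40:08.432604949 +0000 UTC")
	lastPull := mustParse("2025-09-09 05:40:28.844548289 +0000 UTC")
	running := mustParse("2025-09-09 05:40:29.364508714 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration ≈ 22.36s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration ≈ 1.95s (pull time excluded)

	fmt.Printf("E2E startup: %v, SLO-relevant startup: %v\n", e2e, slo)
}
```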
Sep 9 05:40:30.470497 kubelet[2908]: I0909 05:40:30.470404 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0d171813-85d3-405a-a6e3-aa2a33931c89-whisker-backend-key-pair\") pod \"whisker-64d755cd59-mwrgh\" (UID: \"0d171813-85d3-405a-a6e3-aa2a33931c89\") " pod="calico-system/whisker-64d755cd59-mwrgh" Sep 9 05:40:30.471368 kubelet[2908]: I0909 05:40:30.470641 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d171813-85d3-405a-a6e3-aa2a33931c89-whisker-ca-bundle\") pod \"whisker-64d755cd59-mwrgh\" (UID: \"0d171813-85d3-405a-a6e3-aa2a33931c89\") " pod="calico-system/whisker-64d755cd59-mwrgh" Sep 9 05:40:30.471368 kubelet[2908]: I0909 05:40:30.471040 2908 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx57v\" (UniqueName: \"kubernetes.io/projected/0d171813-85d3-405a-a6e3-aa2a33931c89-kube-api-access-wx57v\") pod \"whisker-64d755cd59-mwrgh\" (UID: \"0d171813-85d3-405a-a6e3-aa2a33931c89\") " pod="calico-system/whisker-64d755cd59-mwrgh" Sep 9 05:40:30.491432 containerd[1587]: time="2025-09-09T05:40:30.491390569Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\" id:\"d6cf1c956b8ec63a6916cfd83226535815f6664f6f8e1c30e614429f52aab6a1\" pid:4032 exit_status:1 exited_at:{seconds:1757396430 nanos:490964541}" Sep 9 05:40:30.699959 containerd[1587]: time="2025-09-09T05:40:30.699895210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64d755cd59-mwrgh,Uid:0d171813-85d3-405a-a6e3-aa2a33931c89,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:31.008938 kubelet[2908]: I0909 05:40:31.007883 2908 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c286ba84-712e-4b9f-82a6-8f9e50268837" path="/var/lib/kubelet/pods/c286ba84-712e-4b9f-82a6-8f9e50268837/volumes" Sep 9 05:40:31.033336 systemd-networkd[1514]: cali488a47dc3d2: Link UP Sep 9 05:40:31.038193 systemd-networkd[1514]: cali488a47dc3d2: Gained carrier Sep 9 05:40:31.062756 containerd[1587]: 2025-09-09 05:40:30.757 [INFO][4049] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:40:31.062756 containerd[1587]: 2025-09-09 05:40:30.790 [INFO][4049] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0 whisker-64d755cd59- calico-system 0d171813-85d3-405a-a6e3-aa2a33931c89 897 0 2025-09-09 05:40:30 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:64d755cd59 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-dlh9b.gb1.brightbox.com whisker-64d755cd59-mwrgh eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali488a47dc3d2 [] [] }} ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Namespace="calico-system" Pod="whisker-64d755cd59-mwrgh" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-" Sep 9 05:40:31.062756 containerd[1587]: 2025-09-09 05:40:30.790 [INFO][4049] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Namespace="calico-system" 
Pod="whisker-64d755cd59-mwrgh" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" Sep 9 05:40:31.062756 containerd[1587]: 2025-09-09 05:40:30.945 [INFO][4057] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" HandleID="k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Workload="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.947 [INFO][4057] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" HandleID="k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Workload="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00034db30), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-dlh9b.gb1.brightbox.com", "pod":"whisker-64d755cd59-mwrgh", "timestamp":"2025-09-09 05:40:30.945378247 +0000 UTC"}, Hostname:"srv-dlh9b.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.947 [INFO][4057] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.947 [INFO][4057] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.947 [INFO][4057] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-dlh9b.gb1.brightbox.com' Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.963 [INFO][4057] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.971 [INFO][4057] ipam/ipam.go 394: Looking up existing affinities for host host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.978 [INFO][4057] ipam/ipam.go 511: Trying affinity for 192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.980 [INFO][4057] ipam/ipam.go 158: Attempting to load block cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.063962 containerd[1587]: 2025-09-09 05:40:30.983 [INFO][4057] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.064248 containerd[1587]: 2025-09-09 05:40:30.983 [INFO][4057] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.104.192/26 handle="k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.064248 containerd[1587]: 2025-09-09 05:40:30.985 [INFO][4057] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1 Sep 9 05:40:31.064248 containerd[1587]: 2025-09-09 05:40:30.997 [INFO][4057] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.104.192/26 handle="k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" 
host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.064248 containerd[1587]: 2025-09-09 05:40:31.004 [INFO][4057] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.104.193/26] block=192.168.104.192/26 handle="k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.064248 containerd[1587]: 2025-09-09 05:40:31.004 [INFO][4057] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.104.193/26] handle="k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:31.064248 containerd[1587]: 2025-09-09 05:40:31.004 [INFO][4057] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:40:31.064248 containerd[1587]: 2025-09-09 05:40:31.004 [INFO][4057] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.193/26] IPv6=[] ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" HandleID="k8s-pod-network.15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Workload="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" Sep 9 05:40:31.064435 containerd[1587]: 2025-09-09 05:40:31.017 [INFO][4049] cni-plugin/k8s.go 418: Populated endpoint ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Namespace="calico-system" Pod="whisker-64d755cd59-mwrgh" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0", GenerateName:"whisker-64d755cd59-", Namespace:"calico-system", SelfLink:"", UID:"0d171813-85d3-405a-a6e3-aa2a33931c89", ResourceVersion:"897", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"64d755cd59", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"", Pod:"whisker-64d755cd59-mwrgh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.104.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali488a47dc3d2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:31.064435 containerd[1587]: 2025-09-09 05:40:31.017 [INFO][4049] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.104.193/32] ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Namespace="calico-system" Pod="whisker-64d755cd59-mwrgh" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" Sep 9 05:40:31.070914 containerd[1587]: 2025-09-09 05:40:31.017 [INFO][4049] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali488a47dc3d2 ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Namespace="calico-system" 
Pod="whisker-64d755cd59-mwrgh" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" Sep 9 05:40:31.070914 containerd[1587]: 2025-09-09 05:40:31.031 [INFO][4049] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Namespace="calico-system" Pod="whisker-64d755cd59-mwrgh" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" Sep 9 05:40:31.071268 containerd[1587]: 2025-09-09 05:40:31.031 [INFO][4049] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Namespace="calico-system" Pod="whisker-64d755cd59-mwrgh" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0", GenerateName:"whisker-64d755cd59-", Namespace:"calico-system", SelfLink:"", UID:"0d171813-85d3-405a-a6e3-aa2a33931c89", ResourceVersion:"897", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"64d755cd59", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1", Pod:"whisker-64d755cd59-mwrgh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.104.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali488a47dc3d2", MAC:"62:68:64:5f:bd:e0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:31.071367 containerd[1587]: 2025-09-09 05:40:31.047 [INFO][4049] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" Namespace="calico-system" Pod="whisker-64d755cd59-mwrgh" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-whisker--64d755cd59--mwrgh-eth0" Sep 9 05:40:31.240897 containerd[1587]: time="2025-09-09T05:40:31.240838046Z" level=info msg="connecting to shim 15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1" address="unix:///run/containerd/s/e4aa748816c897971b56d08b0308006053c17374b206883653c49047e8cd10bd" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:31.297905 systemd[1]: Started cri-containerd-15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1.scope - libcontainer container 15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1. 
Sep 9 05:40:31.438706 containerd[1587]: time="2025-09-09T05:40:31.437382530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64d755cd59-mwrgh,Uid:0d171813-85d3-405a-a6e3-aa2a33931c89,Namespace:calico-system,Attempt:0,} returns sandbox id \"15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1\"" Sep 9 05:40:31.446003 containerd[1587]: time="2025-09-09T05:40:31.445958445Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 05:40:31.697913 containerd[1587]: time="2025-09-09T05:40:31.697852836Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\" id:\"9ca02b203989c71e370e8a97c5f92aba8796f4acaf9da644a93e9155d36d801c\" pid:4208 exit_status:1 exited_at:{seconds:1757396431 nanos:697366450}" Sep 9 05:40:32.013120 systemd-networkd[1514]: vxlan.calico: Link UP Sep 9 05:40:32.013131 systemd-networkd[1514]: vxlan.calico: Gained carrier Sep 9 05:40:32.020648 containerd[1587]: time="2025-09-09T05:40:32.020583620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-d4mz8,Uid:04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7,Namespace:kube-system,Attempt:0,}" Sep 9 05:40:32.027241 containerd[1587]: time="2025-09-09T05:40:32.027193926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586648bd4c-qvfkx,Uid:343f6693-fe0c-4929-a714-bd0106a3b358,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:32.268831 systemd-networkd[1514]: cali488a47dc3d2: Gained IPv6LL Sep 9 05:40:32.335709 systemd-networkd[1514]: cali7beee9b2d4e: Link UP Sep 9 05:40:32.339711 systemd-networkd[1514]: cali7beee9b2d4e: Gained carrier Sep 9 05:40:32.375239 containerd[1587]: 2025-09-09 05:40:32.164 [INFO][4288] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0 coredns-674b8bbfcf- kube-system 04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7 824 0 2025-09-09 05:39:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-dlh9b.gb1.brightbox.com coredns-674b8bbfcf-d4mz8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7beee9b2d4e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Namespace="kube-system" Pod="coredns-674b8bbfcf-d4mz8" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-" Sep 9 05:40:32.375239 containerd[1587]: 2025-09-09 05:40:32.164 [INFO][4288] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Namespace="kube-system" Pod="coredns-674b8bbfcf-d4mz8" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" Sep 9 05:40:32.375239 containerd[1587]: 2025-09-09 05:40:32.230 [INFO][4333] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" HandleID="k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Workload="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.230 [INFO][4333] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" HandleID="k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Workload="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5020), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-dlh9b.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-d4mz8", "timestamp":"2025-09-09 05:40:32.230207902 +0000 UTC"}, Hostname:"srv-dlh9b.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.230 [INFO][4333] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.230 [INFO][4333] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.230 [INFO][4333] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-dlh9b.gb1.brightbox.com' Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.245 [INFO][4333] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.256 [INFO][4333] ipam/ipam.go 394: Looking up existing affinities for host host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.272 [INFO][4333] ipam/ipam.go 511: Trying affinity for 192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.277 [INFO][4333] ipam/ipam.go 158: Attempting to load block cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.375492 containerd[1587]: 2025-09-09 05:40:32.281 [INFO][4333] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.376057 containerd[1587]: 2025-09-09 05:40:32.282 [INFO][4333] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.104.192/26 handle="k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.376057 containerd[1587]: 2025-09-09 05:40:32.285 [INFO][4333] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121 Sep 9 05:40:32.376057 containerd[1587]: 2025-09-09 05:40:32.303 [INFO][4333] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.104.192/26 handle="k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.376057 containerd[1587]: 2025-09-09 05:40:32.314 [INFO][4333] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.104.194/26] block=192.168.104.192/26 handle="k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.376057 containerd[1587]: 2025-09-09 05:40:32.314 [INFO][4333] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.104.194/26] handle="k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.376057 containerd[1587]: 2025-09-09 
05:40:32.314 [INFO][4333] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:40:32.376057 containerd[1587]: 2025-09-09 05:40:32.314 [INFO][4333] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.194/26] IPv6=[] ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" HandleID="k8s-pod-network.164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Workload="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" Sep 9 05:40:32.376263 containerd[1587]: 2025-09-09 05:40:32.324 [INFO][4288] cni-plugin/k8s.go 418: Populated endpoint ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Namespace="kube-system" Pod="coredns-674b8bbfcf-d4mz8" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 39, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"", Pod:"coredns-674b8bbfcf-d4mz8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.104.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7beee9b2d4e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:32.376263 containerd[1587]: 2025-09-09 05:40:32.324 [INFO][4288] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.104.194/32] ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Namespace="kube-system" Pod="coredns-674b8bbfcf-d4mz8" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" Sep 9 05:40:32.376263 containerd[1587]: 2025-09-09 05:40:32.324 [INFO][4288] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7beee9b2d4e ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Namespace="kube-system" Pod="coredns-674b8bbfcf-d4mz8" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" Sep 9 05:40:32.376263 containerd[1587]: 2025-09-09 05:40:32.339 [INFO][4288] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Namespace="kube-system" Pod="coredns-674b8bbfcf-d4mz8" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" Sep 9 05:40:32.376263 containerd[1587]: 2025-09-09 05:40:32.342 [INFO][4288] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Namespace="kube-system" Pod="coredns-674b8bbfcf-d4mz8" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 39, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121", Pod:"coredns-674b8bbfcf-d4mz8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.104.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7beee9b2d4e", MAC:"ba:d5:38:30:29:5a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:32.376263 containerd[1587]: 2025-09-09 05:40:32.365 [INFO][4288] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" Namespace="kube-system" Pod="coredns-674b8bbfcf-d4mz8" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--d4mz8-eth0" Sep 9 05:40:32.411219 containerd[1587]: time="2025-09-09T05:40:32.411158732Z" level=info msg="connecting to shim 164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121" address="unix:///run/containerd/s/ee6df10c64af56718d053ff9b346831de111c46b4503353720bc010638f63f04" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:32.432994 systemd-networkd[1514]: cali3ad8b7ae6b8: Link UP Sep 9 05:40:32.434063 systemd-networkd[1514]: cali3ad8b7ae6b8: Gained carrier Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.195 [INFO][4297] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0 calico-kube-controllers-586648bd4c- calico-system 343f6693-fe0c-4929-a714-bd0106a3b358 823 0 2025-09-09 05:40:08 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:586648bd4c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-dlh9b.gb1.brightbox.com calico-kube-controllers-586648bd4c-qvfkx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3ad8b7ae6b8 [] [] }} ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Namespace="calico-system" Pod="calico-kube-controllers-586648bd4c-qvfkx" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.195 [INFO][4297] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Namespace="calico-system" Pod="calico-kube-controllers-586648bd4c-qvfkx" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.256 [INFO][4339] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" HandleID="k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.257 [INFO][4339] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" HandleID="k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f7f0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-dlh9b.gb1.brightbox.com", "pod":"calico-kube-controllers-586648bd4c-qvfkx", "timestamp":"2025-09-09 05:40:32.256542214 +0000 UTC"}, Hostname:"srv-dlh9b.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.257 [INFO][4339] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.314 [INFO][4339] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.315 [INFO][4339] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-dlh9b.gb1.brightbox.com' Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.348 [INFO][4339] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.365 [INFO][4339] ipam/ipam.go 394: Looking up existing affinities for host host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.386 [INFO][4339] ipam/ipam.go 511: Trying affinity for 192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.390 [INFO][4339] ipam/ipam.go 158: Attempting to load block cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.397 [INFO][4339] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.397 [INFO][4339] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.104.192/26 handle="k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.400 [INFO][4339] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647 Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.405 [INFO][4339] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.104.192/26 handle="k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.417 [INFO][4339] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.104.195/26] block=192.168.104.192/26 handle="k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.417 [INFO][4339] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.104.195/26] handle="k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.417 [INFO][4339] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
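Annotation: each IPAM sequence above follows the same pattern: acquire the host-wide lock, confirm this node's affinity for block 192.168.104.192/26, then claim the next free address (.193 for whisker, .194 for coredns, .195 here). A /26 holds 64 addresses; the sketch below simply walks that block with the standard library to show where those addresses come from. It does not reproduce Calico's allocator, which skips the network address and records reservations in the datastore:

```go
// ipam_block.go - walk the IPAM block named in the log entries above.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.104.192/26") // block from the log

	// Print the first few addresses; per the log, workloads start at .193.
	count := 0
	for addr := block.Addr(); block.Contains(addr) && count < 5; addr = addr.Next() {
		fmt.Println(addr)
		count++
	}
}
```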
Sep 9 05:40:32.462329 containerd[1587]: 2025-09-09 05:40:32.417 [INFO][4339] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.195/26] IPv6=[] ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" HandleID="k8s-pod-network.c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" Sep 9 05:40:32.464306 containerd[1587]: 2025-09-09 05:40:32.424 [INFO][4297] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Namespace="calico-system" Pod="calico-kube-controllers-586648bd4c-qvfkx" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0", GenerateName:"calico-kube-controllers-586648bd4c-", Namespace:"calico-system", SelfLink:"", UID:"343f6693-fe0c-4929-a714-bd0106a3b358", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"586648bd4c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-586648bd4c-qvfkx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.104.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3ad8b7ae6b8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:32.464306 containerd[1587]: 2025-09-09 05:40:32.425 [INFO][4297] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.104.195/32] ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Namespace="calico-system" Pod="calico-kube-controllers-586648bd4c-qvfkx" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" Sep 9 05:40:32.464306 containerd[1587]: 2025-09-09 05:40:32.425 [INFO][4297] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3ad8b7ae6b8 ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Namespace="calico-system" Pod="calico-kube-controllers-586648bd4c-qvfkx" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" Sep 9 05:40:32.464306 containerd[1587]: 2025-09-09 05:40:32.435 [INFO][4297] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Namespace="calico-system" Pod="calico-kube-controllers-586648bd4c-qvfkx" 
WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" Sep 9 05:40:32.464306 containerd[1587]: 2025-09-09 05:40:32.437 [INFO][4297] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Namespace="calico-system" Pod="calico-kube-controllers-586648bd4c-qvfkx" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0", GenerateName:"calico-kube-controllers-586648bd4c-", Namespace:"calico-system", SelfLink:"", UID:"343f6693-fe0c-4929-a714-bd0106a3b358", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"586648bd4c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647", Pod:"calico-kube-controllers-586648bd4c-qvfkx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.104.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3ad8b7ae6b8", MAC:"f2:48:b6:b5:b9:91", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:32.464306 containerd[1587]: 2025-09-09 05:40:32.456 [INFO][4297] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" Namespace="calico-system" Pod="calico-kube-controllers-586648bd4c-qvfkx" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--kube--controllers--586648bd4c--qvfkx-eth0" Sep 9 05:40:32.478866 systemd[1]: Started cri-containerd-164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121.scope - libcontainer container 164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121. Sep 9 05:40:32.518424 containerd[1587]: time="2025-09-09T05:40:32.517417721Z" level=info msg="connecting to shim c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647" address="unix:///run/containerd/s/33e74e9c88df9a147f91fbb0f3f888c2b1d51a16d3a78c80cdbbd5da21bc48e0" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:32.560526 systemd[1]: Started cri-containerd-c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647.scope - libcontainer container c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647. 
Sep 9 05:40:32.584632 containerd[1587]: time="2025-09-09T05:40:32.584563679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-d4mz8,Uid:04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7,Namespace:kube-system,Attempt:0,} returns sandbox id \"164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121\"" Sep 9 05:40:32.596567 containerd[1587]: time="2025-09-09T05:40:32.596531396Z" level=info msg="CreateContainer within sandbox \"164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:40:32.608070 containerd[1587]: time="2025-09-09T05:40:32.607825147Z" level=info msg="Container 993542d0802991418a14b903c97ba7ed78b6315a359be606984d72b231ee53ad: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:32.630122 containerd[1587]: time="2025-09-09T05:40:32.630059626Z" level=info msg="CreateContainer within sandbox \"164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"993542d0802991418a14b903c97ba7ed78b6315a359be606984d72b231ee53ad\"" Sep 9 05:40:32.631157 containerd[1587]: time="2025-09-09T05:40:32.631130891Z" level=info msg="StartContainer for \"993542d0802991418a14b903c97ba7ed78b6315a359be606984d72b231ee53ad\"" Sep 9 05:40:32.632564 containerd[1587]: time="2025-09-09T05:40:32.632506445Z" level=info msg="connecting to shim 993542d0802991418a14b903c97ba7ed78b6315a359be606984d72b231ee53ad" address="unix:///run/containerd/s/ee6df10c64af56718d053ff9b346831de111c46b4503353720bc010638f63f04" protocol=ttrpc version=3 Sep 9 05:40:32.662288 systemd[1]: Started cri-containerd-993542d0802991418a14b903c97ba7ed78b6315a359be606984d72b231ee53ad.scope - libcontainer container 993542d0802991418a14b903c97ba7ed78b6315a359be606984d72b231ee53ad. 
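Annotation: note that the coredns container above connects to the same shim socket as its sandbox (unix:///run/containerd/s/ee6df10c…), since all containers in a pod share the sandbox's shim. The containerd entries themselves are key=value pairs with quoted values; below is a rough extraction of the time/level/msg fields from one of them. The quoting handling is simplified, good enough for these lines but not a general logfmt parser, and escaped quotes are left unescaped:

```go
// parse_containerd_line.go - pull the fields out of one containerd entry
// copied from the log above.
package main

import (
	"fmt"
	"regexp"
)

func main() {
	line := `time="2025-09-09T05:40:32.584563679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-d4mz8,Uid:04f1a8dd-8e48-4ae0-b634-feaea7fa7cc7,Namespace:kube-system,Attempt:0,} returns sandbox id \"164bebab4b027b86698ffc473274cd8ed39a97c613acb0d5319577704a1f6121\""`

	// key=value, where value is either a quoted string (with \" escapes) or a bare token.
	re := regexp.MustCompile(`(\w+)=(?:"((?:[^"\\]|\\.)*)"|(\S+))`)
	for _, m := range re.FindAllStringSubmatch(line, -1) {
		key, quoted, bare := m[1], m[2], m[3]
		val := quoted
		if bare != "" {
			val = bare
		}
		fmt.Printf("%s = %s\n", key, val)
	}
}
```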
Sep 9 05:40:32.698775 containerd[1587]: time="2025-09-09T05:40:32.698699854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586648bd4c-qvfkx,Uid:343f6693-fe0c-4929-a714-bd0106a3b358,Namespace:calico-system,Attempt:0,} returns sandbox id \"c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647\"" Sep 9 05:40:32.746047 containerd[1587]: time="2025-09-09T05:40:32.745983688Z" level=info msg="StartContainer for \"993542d0802991418a14b903c97ba7ed78b6315a359be606984d72b231ee53ad\" returns successfully" Sep 9 05:40:32.998773 containerd[1587]: time="2025-09-09T05:40:32.998340473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vtps6,Uid:9ed1e42f-a14e-4222-8edd-72d601806d5a,Namespace:kube-system,Attempt:0,}" Sep 9 05:40:33.000127 containerd[1587]: time="2025-09-09T05:40:33.000034136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-czgjp,Uid:27cc7ce2-3e74-49cc-83d6-1559ba07fa7b,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:33.001174 containerd[1587]: time="2025-09-09T05:40:33.001134035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867ddcbfff-tcnvd,Uid:b61265df-84eb-42d3-bb4e-3993302b5b2c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:40:33.010701 containerd[1587]: time="2025-09-09T05:40:33.010532926Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:33.010852 containerd[1587]: time="2025-09-09T05:40:33.010646583Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 05:40:33.013748 containerd[1587]: time="2025-09-09T05:40:33.013644192Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:33.024951 containerd[1587]: time="2025-09-09T05:40:33.024206125Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:33.028387 containerd[1587]: time="2025-09-09T05:40:33.028338280Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.582326254s" Sep 9 05:40:33.028812 containerd[1587]: time="2025-09-09T05:40:33.028780693Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 05:40:33.038260 containerd[1587]: time="2025-09-09T05:40:33.038037637Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 05:40:33.050175 containerd[1587]: time="2025-09-09T05:40:33.050124219Z" level=info msg="CreateContainer within sandbox \"15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 05:40:33.064480 containerd[1587]: time="2025-09-09T05:40:33.064336759Z" level=info msg="Container e40e1b8a7130111d8b0d717a092d47707a28ea6822363654c6981e19b2841c11: CDI devices from CRI Config.CDIDevices: []" Sep 9 
05:40:33.083993 containerd[1587]: time="2025-09-09T05:40:33.083944558Z" level=info msg="CreateContainer within sandbox \"15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"e40e1b8a7130111d8b0d717a092d47707a28ea6822363654c6981e19b2841c11\"" Sep 9 05:40:33.086405 containerd[1587]: time="2025-09-09T05:40:33.086362568Z" level=info msg="StartContainer for \"e40e1b8a7130111d8b0d717a092d47707a28ea6822363654c6981e19b2841c11\"" Sep 9 05:40:33.093574 containerd[1587]: time="2025-09-09T05:40:33.093530257Z" level=info msg="connecting to shim e40e1b8a7130111d8b0d717a092d47707a28ea6822363654c6981e19b2841c11" address="unix:///run/containerd/s/e4aa748816c897971b56d08b0308006053c17374b206883653c49047e8cd10bd" protocol=ttrpc version=3 Sep 9 05:40:33.146965 systemd[1]: Started cri-containerd-e40e1b8a7130111d8b0d717a092d47707a28ea6822363654c6981e19b2841c11.scope - libcontainer container e40e1b8a7130111d8b0d717a092d47707a28ea6822363654c6981e19b2841c11. Sep 9 05:40:33.279890 systemd-networkd[1514]: cali91594eb397e: Link UP Sep 9 05:40:33.282355 systemd-networkd[1514]: cali91594eb397e: Gained carrier Sep 9 05:40:33.293688 containerd[1587]: time="2025-09-09T05:40:33.293552648Z" level=info msg="StartContainer for \"e40e1b8a7130111d8b0d717a092d47707a28ea6822363654c6981e19b2841c11\" returns successfully" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.076 [INFO][4530] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0 calico-apiserver-867ddcbfff- calico-apiserver b61265df-84eb-42d3-bb4e-3993302b5b2c 826 0 2025-09-09 05:40:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:867ddcbfff projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-dlh9b.gb1.brightbox.com calico-apiserver-867ddcbfff-tcnvd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali91594eb397e [] [] }} ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-tcnvd" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.076 [INFO][4530] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-tcnvd" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.192 [INFO][4564] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" HandleID="k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.192 [INFO][4564] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" HandleID="k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" 
Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024fa90), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-dlh9b.gb1.brightbox.com", "pod":"calico-apiserver-867ddcbfff-tcnvd", "timestamp":"2025-09-09 05:40:33.192304354 +0000 UTC"}, Hostname:"srv-dlh9b.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.192 [INFO][4564] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.192 [INFO][4564] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.192 [INFO][4564] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-dlh9b.gb1.brightbox.com' Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.207 [INFO][4564] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.217 [INFO][4564] ipam/ipam.go 394: Looking up existing affinities for host host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.234 [INFO][4564] ipam/ipam.go 511: Trying affinity for 192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.238 [INFO][4564] ipam/ipam.go 158: Attempting to load block cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.242 [INFO][4564] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.244 [INFO][4564] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.104.192/26 handle="k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.247 [INFO][4564] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2 Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.255 [INFO][4564] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.104.192/26 handle="k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.267 [INFO][4564] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.104.196/26] block=192.168.104.192/26 handle="k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.268 [INFO][4564] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.104.196/26] handle="k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.268 [INFO][4564] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:40:33.317731 containerd[1587]: 2025-09-09 05:40:33.268 [INFO][4564] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.196/26] IPv6=[] ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" HandleID="k8s-pod-network.133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" Sep 9 05:40:33.321239 containerd[1587]: 2025-09-09 05:40:33.273 [INFO][4530] cni-plugin/k8s.go 418: Populated endpoint ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-tcnvd" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0", GenerateName:"calico-apiserver-867ddcbfff-", Namespace:"calico-apiserver", SelfLink:"", UID:"b61265df-84eb-42d3-bb4e-3993302b5b2c", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"867ddcbfff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-867ddcbfff-tcnvd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.104.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali91594eb397e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:33.321239 containerd[1587]: 2025-09-09 05:40:33.274 [INFO][4530] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.104.196/32] ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-tcnvd" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" Sep 9 05:40:33.321239 containerd[1587]: 2025-09-09 05:40:33.274 [INFO][4530] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali91594eb397e ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-tcnvd" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" Sep 9 05:40:33.321239 containerd[1587]: 2025-09-09 05:40:33.278 [INFO][4530] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-tcnvd" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" Sep 9 05:40:33.321239 containerd[1587]: 2025-09-09 05:40:33.286 
[INFO][4530] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-tcnvd" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0", GenerateName:"calico-apiserver-867ddcbfff-", Namespace:"calico-apiserver", SelfLink:"", UID:"b61265df-84eb-42d3-bb4e-3993302b5b2c", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"867ddcbfff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2", Pod:"calico-apiserver-867ddcbfff-tcnvd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.104.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali91594eb397e", MAC:"c2:58:eb:d4:ee:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:33.321239 containerd[1587]: 2025-09-09 05:40:33.311 [INFO][4530] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-tcnvd" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--tcnvd-eth0" Sep 9 05:40:33.357923 systemd-networkd[1514]: vxlan.calico: Gained IPv6LL Sep 9 05:40:33.392845 containerd[1587]: time="2025-09-09T05:40:33.392800868Z" level=info msg="connecting to shim 133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2" address="unix:///run/containerd/s/0e37f2de51892ecd22bfe9e85d1367985f4a312e16a740d4eb941e57614365ea" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:33.398366 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2454186110.mount: Deactivated successfully. 
Sep 9 05:40:33.417362 systemd-networkd[1514]: cali02beb0ef342: Link UP Sep 9 05:40:33.418463 systemd-networkd[1514]: cali02beb0ef342: Gained carrier Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.124 [INFO][4529] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0 coredns-674b8bbfcf- kube-system 9ed1e42f-a14e-4222-8edd-72d601806d5a 817 0 2025-09-09 05:39:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-dlh9b.gb1.brightbox.com coredns-674b8bbfcf-vtps6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali02beb0ef342 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Namespace="kube-system" Pod="coredns-674b8bbfcf-vtps6" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.124 [INFO][4529] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Namespace="kube-system" Pod="coredns-674b8bbfcf-vtps6" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.216 [INFO][4586] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" HandleID="k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Workload="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.217 [INFO][4586] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" HandleID="k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Workload="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000393980), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-dlh9b.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-vtps6", "timestamp":"2025-09-09 05:40:33.216682257 +0000 UTC"}, Hostname:"srv-dlh9b.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.218 [INFO][4586] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.268 [INFO][4586] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.268 [INFO][4586] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-dlh9b.gb1.brightbox.com' Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.314 [INFO][4586] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.331 [INFO][4586] ipam/ipam.go 394: Looking up existing affinities for host host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.343 [INFO][4586] ipam/ipam.go 511: Trying affinity for 192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.350 [INFO][4586] ipam/ipam.go 158: Attempting to load block cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.354 [INFO][4586] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.355 [INFO][4586] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.104.192/26 handle="k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.360 [INFO][4586] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5 Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.372 [INFO][4586] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.104.192/26 handle="k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.397 [INFO][4586] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.104.197/26] block=192.168.104.192/26 handle="k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.401 [INFO][4586] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.104.197/26] handle="k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.402 [INFO][4586] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:40:33.462469 containerd[1587]: 2025-09-09 05:40:33.402 [INFO][4586] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.197/26] IPv6=[] ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" HandleID="k8s-pod-network.7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Workload="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" Sep 9 05:40:33.465921 containerd[1587]: 2025-09-09 05:40:33.412 [INFO][4529] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Namespace="kube-system" Pod="coredns-674b8bbfcf-vtps6" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9ed1e42f-a14e-4222-8edd-72d601806d5a", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 39, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"", Pod:"coredns-674b8bbfcf-vtps6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.104.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali02beb0ef342", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:33.465921 containerd[1587]: 2025-09-09 05:40:33.412 [INFO][4529] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.104.197/32] ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Namespace="kube-system" Pod="coredns-674b8bbfcf-vtps6" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" Sep 9 05:40:33.465921 containerd[1587]: 2025-09-09 05:40:33.412 [INFO][4529] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali02beb0ef342 ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Namespace="kube-system" Pod="coredns-674b8bbfcf-vtps6" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" Sep 9 05:40:33.465921 containerd[1587]: 2025-09-09 05:40:33.419 [INFO][4529] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-vtps6" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" Sep 9 05:40:33.465921 containerd[1587]: 2025-09-09 05:40:33.421 [INFO][4529] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Namespace="kube-system" Pod="coredns-674b8bbfcf-vtps6" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9ed1e42f-a14e-4222-8edd-72d601806d5a", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 39, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5", Pod:"coredns-674b8bbfcf-vtps6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.104.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali02beb0ef342", MAC:"0e:a5:07:3c:e0:60", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:33.465921 containerd[1587]: 2025-09-09 05:40:33.452 [INFO][4529] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" Namespace="kube-system" Pod="coredns-674b8bbfcf-vtps6" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vtps6-eth0" Sep 9 05:40:33.483052 systemd[1]: Started cri-containerd-133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2.scope - libcontainer container 133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2. 
Sep 9 05:40:33.531822 systemd-networkd[1514]: calic66c1b5866b: Link UP Sep 9 05:40:33.534852 systemd-networkd[1514]: calic66c1b5866b: Gained carrier Sep 9 05:40:33.549105 systemd-networkd[1514]: cali3ad8b7ae6b8: Gained IPv6LL Sep 9 05:40:33.552192 systemd-networkd[1514]: cali7beee9b2d4e: Gained IPv6LL Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.155 [INFO][4548] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0 csi-node-driver- calico-system 27cc7ce2-3e74-49cc-83d6-1559ba07fa7b 706 0 2025-09-09 05:40:08 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-dlh9b.gb1.brightbox.com csi-node-driver-czgjp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic66c1b5866b [] [] }} ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Namespace="calico-system" Pod="csi-node-driver-czgjp" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.156 [INFO][4548] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Namespace="calico-system" Pod="csi-node-driver-czgjp" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.242 [INFO][4602] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" HandleID="k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Workload="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.242 [INFO][4602] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" HandleID="k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Workload="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fe30), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-dlh9b.gb1.brightbox.com", "pod":"csi-node-driver-czgjp", "timestamp":"2025-09-09 05:40:33.242548693 +0000 UTC"}, Hostname:"srv-dlh9b.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.242 [INFO][4602] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.402 [INFO][4602] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.403 [INFO][4602] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-dlh9b.gb1.brightbox.com' Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.429 [INFO][4602] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.454 [INFO][4602] ipam/ipam.go 394: Looking up existing affinities for host host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.466 [INFO][4602] ipam/ipam.go 511: Trying affinity for 192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.472 [INFO][4602] ipam/ipam.go 158: Attempting to load block cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.479 [INFO][4602] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.480 [INFO][4602] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.104.192/26 handle="k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.484 [INFO][4602] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966 Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.494 [INFO][4602] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.104.192/26 handle="k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.513 [INFO][4602] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.104.198/26] block=192.168.104.192/26 handle="k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.514 [INFO][4602] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.104.198/26] handle="k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.514 [INFO][4602] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:40:33.578022 containerd[1587]: 2025-09-09 05:40:33.515 [INFO][4602] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.198/26] IPv6=[] ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" HandleID="k8s-pod-network.da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Workload="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" Sep 9 05:40:33.581019 containerd[1587]: 2025-09-09 05:40:33.526 [INFO][4548] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Namespace="calico-system" Pod="csi-node-driver-czgjp" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"27cc7ce2-3e74-49cc-83d6-1559ba07fa7b", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-czgjp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.104.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic66c1b5866b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:33.581019 containerd[1587]: 2025-09-09 05:40:33.527 [INFO][4548] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.104.198/32] ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Namespace="calico-system" Pod="csi-node-driver-czgjp" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" Sep 9 05:40:33.581019 containerd[1587]: 2025-09-09 05:40:33.527 [INFO][4548] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic66c1b5866b ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Namespace="calico-system" Pod="csi-node-driver-czgjp" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" Sep 9 05:40:33.581019 containerd[1587]: 2025-09-09 05:40:33.535 [INFO][4548] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Namespace="calico-system" Pod="csi-node-driver-czgjp" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" Sep 9 05:40:33.581019 containerd[1587]: 2025-09-09 05:40:33.537 [INFO][4548] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Namespace="calico-system" Pod="csi-node-driver-czgjp" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"27cc7ce2-3e74-49cc-83d6-1559ba07fa7b", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966", Pod:"csi-node-driver-czgjp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.104.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic66c1b5866b", MAC:"8e:41:e3:34:47:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:33.581019 containerd[1587]: 2025-09-09 05:40:33.564 [INFO][4548] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" Namespace="calico-system" Pod="csi-node-driver-czgjp" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-csi--node--driver--czgjp-eth0" Sep 9 05:40:33.591793 containerd[1587]: time="2025-09-09T05:40:33.590955665Z" level=info msg="connecting to shim 7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5" address="unix:///run/containerd/s/39dac05391c562f24386d43da867573a2f0620253ee021eb6d5214be1c86b331" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:33.627679 kubelet[2908]: I0909 05:40:33.587953 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-d4mz8" podStartSLOduration=40.580140726 podStartE2EDuration="40.580140726s" podCreationTimestamp="2025-09-09 05:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:40:33.579894475 +0000 UTC m=+46.763998751" watchObservedRunningTime="2025-09-09 05:40:33.580140726 +0000 UTC m=+46.764245000" Sep 9 05:40:33.657741 containerd[1587]: time="2025-09-09T05:40:33.656973390Z" level=info msg="connecting to shim da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966" address="unix:///run/containerd/s/cc9283db35f16017fddb28fad62a55941f2b3ccbb168e4a63f2b51b28c4ece18" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:33.682015 systemd[1]: Started cri-containerd-7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5.scope - libcontainer container 
7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5. Sep 9 05:40:33.725865 systemd[1]: Started cri-containerd-da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966.scope - libcontainer container da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966. Sep 9 05:40:33.731757 containerd[1587]: time="2025-09-09T05:40:33.731092799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867ddcbfff-tcnvd,Uid:b61265df-84eb-42d3-bb4e-3993302b5b2c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2\"" Sep 9 05:40:33.782177 containerd[1587]: time="2025-09-09T05:40:33.782139965Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-czgjp,Uid:27cc7ce2-3e74-49cc-83d6-1559ba07fa7b,Namespace:calico-system,Attempt:0,} returns sandbox id \"da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966\"" Sep 9 05:40:33.788994 containerd[1587]: time="2025-09-09T05:40:33.788817705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vtps6,Uid:9ed1e42f-a14e-4222-8edd-72d601806d5a,Namespace:kube-system,Attempt:0,} returns sandbox id \"7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5\"" Sep 9 05:40:33.794427 containerd[1587]: time="2025-09-09T05:40:33.794228906Z" level=info msg="CreateContainer within sandbox \"7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:40:33.800419 containerd[1587]: time="2025-09-09T05:40:33.800382840Z" level=info msg="Container 6dcd0f5fb55316289eb9cad70bab2d711a74d72c512765e954b0cf4aebfc6239: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:33.804751 containerd[1587]: time="2025-09-09T05:40:33.804616483Z" level=info msg="CreateContainer within sandbox \"7069ab95f0bd7d6cf1c26600b7969d15a3fc5ab14ae29579ed7959d6294a01c5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6dcd0f5fb55316289eb9cad70bab2d711a74d72c512765e954b0cf4aebfc6239\"" Sep 9 05:40:33.807017 containerd[1587]: time="2025-09-09T05:40:33.806826147Z" level=info msg="StartContainer for \"6dcd0f5fb55316289eb9cad70bab2d711a74d72c512765e954b0cf4aebfc6239\"" Sep 9 05:40:33.808050 containerd[1587]: time="2025-09-09T05:40:33.808010520Z" level=info msg="connecting to shim 6dcd0f5fb55316289eb9cad70bab2d711a74d72c512765e954b0cf4aebfc6239" address="unix:///run/containerd/s/39dac05391c562f24386d43da867573a2f0620253ee021eb6d5214be1c86b331" protocol=ttrpc version=3 Sep 9 05:40:33.829913 systemd[1]: Started cri-containerd-6dcd0f5fb55316289eb9cad70bab2d711a74d72c512765e954b0cf4aebfc6239.scope - libcontainer container 6dcd0f5fb55316289eb9cad70bab2d711a74d72c512765e954b0cf4aebfc6239. 
Sep 9 05:40:33.881457 containerd[1587]: time="2025-09-09T05:40:33.878871520Z" level=info msg="StartContainer for \"6dcd0f5fb55316289eb9cad70bab2d711a74d72c512765e954b0cf4aebfc6239\" returns successfully" Sep 9 05:40:34.391047 kubelet[2908]: I0909 05:40:34.390984 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-vtps6" podStartSLOduration=41.390960895 podStartE2EDuration="41.390960895s" podCreationTimestamp="2025-09-09 05:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:40:34.367759626 +0000 UTC m=+47.551863897" watchObservedRunningTime="2025-09-09 05:40:34.390960895 +0000 UTC m=+47.575065169" Sep 9 05:40:35.000411 containerd[1587]: time="2025-09-09T05:40:34.999965811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-jg28r,Uid:51584f79-8718-43e1-9ab4-f95d38f910f1,Namespace:calico-system,Attempt:0,}" Sep 9 05:40:35.000411 containerd[1587]: time="2025-09-09T05:40:35.000176483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867ddcbfff-ngl89,Uid:5238b60f-81c4-4831-b073-cc699bd0b55a,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:40:35.085017 systemd-networkd[1514]: cali91594eb397e: Gained IPv6LL Sep 9 05:40:35.150931 systemd-networkd[1514]: cali02beb0ef342: Gained IPv6LL Sep 9 05:40:35.201773 systemd-networkd[1514]: cali1f569cc943b: Link UP Sep 9 05:40:35.209916 systemd-networkd[1514]: cali1f569cc943b: Gained carrier Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.077 [INFO][4823] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0 calico-apiserver-867ddcbfff- calico-apiserver 5238b60f-81c4-4831-b073-cc699bd0b55a 827 0 2025-09-09 05:40:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:867ddcbfff projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-dlh9b.gb1.brightbox.com calico-apiserver-867ddcbfff-ngl89 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1f569cc943b [] [] }} ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-ngl89" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.078 [INFO][4823] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-ngl89" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.127 [INFO][4844] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" HandleID="k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.127 [INFO][4844] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" HandleID="k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5110), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-dlh9b.gb1.brightbox.com", "pod":"calico-apiserver-867ddcbfff-ngl89", "timestamp":"2025-09-09 05:40:35.127030884 +0000 UTC"}, Hostname:"srv-dlh9b.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.127 [INFO][4844] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.127 [INFO][4844] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.127 [INFO][4844] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-dlh9b.gb1.brightbox.com' Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.137 [INFO][4844] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.146 [INFO][4844] ipam/ipam.go 394: Looking up existing affinities for host host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.155 [INFO][4844] ipam/ipam.go 511: Trying affinity for 192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.158 [INFO][4844] ipam/ipam.go 158: Attempting to load block cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.162 [INFO][4844] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.162 [INFO][4844] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.104.192/26 handle="k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.167 [INFO][4844] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.175 [INFO][4844] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.104.192/26 handle="k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.183 [INFO][4844] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.104.199/26] block=192.168.104.192/26 handle="k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.183 [INFO][4844] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.104.199/26] handle="k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.232303 
containerd[1587]: 2025-09-09 05:40:35.183 [INFO][4844] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:40:35.232303 containerd[1587]: 2025-09-09 05:40:35.184 [INFO][4844] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.199/26] IPv6=[] ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" HandleID="k8s-pod-network.f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Workload="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" Sep 9 05:40:35.234642 containerd[1587]: 2025-09-09 05:40:35.187 [INFO][4823] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-ngl89" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0", GenerateName:"calico-apiserver-867ddcbfff-", Namespace:"calico-apiserver", SelfLink:"", UID:"5238b60f-81c4-4831-b073-cc699bd0b55a", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"867ddcbfff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-867ddcbfff-ngl89", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.104.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1f569cc943b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:35.234642 containerd[1587]: 2025-09-09 05:40:35.188 [INFO][4823] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.104.199/32] ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-ngl89" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" Sep 9 05:40:35.234642 containerd[1587]: 2025-09-09 05:40:35.188 [INFO][4823] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1f569cc943b ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-ngl89" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" Sep 9 05:40:35.234642 containerd[1587]: 2025-09-09 05:40:35.212 [INFO][4823] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-ngl89" 
WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" Sep 9 05:40:35.234642 containerd[1587]: 2025-09-09 05:40:35.213 [INFO][4823] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-ngl89" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0", GenerateName:"calico-apiserver-867ddcbfff-", Namespace:"calico-apiserver", SelfLink:"", UID:"5238b60f-81c4-4831-b073-cc699bd0b55a", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"867ddcbfff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d", Pod:"calico-apiserver-867ddcbfff-ngl89", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.104.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1f569cc943b", MAC:"fa:d4:00:10:2c:c8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:35.234642 containerd[1587]: 2025-09-09 05:40:35.227 [INFO][4823] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" Namespace="calico-apiserver" Pod="calico-apiserver-867ddcbfff-ngl89" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-calico--apiserver--867ddcbfff--ngl89-eth0" Sep 9 05:40:35.280847 containerd[1587]: time="2025-09-09T05:40:35.280082860Z" level=info msg="connecting to shim f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d" address="unix:///run/containerd/s/b08f91c2ce314cc4ce23101414b81af99403e94a0a6efd0e99fda6b71aaaf97e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:35.328006 systemd-networkd[1514]: cali96d9824577d: Link UP Sep 9 05:40:35.332782 systemd-networkd[1514]: cali96d9824577d: Gained carrier Sep 9 05:40:35.341634 systemd-networkd[1514]: calic66c1b5866b: Gained IPv6LL Sep 9 05:40:35.361887 systemd[1]: Started cri-containerd-f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d.scope - libcontainer container f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d. 
Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.081 [INFO][4821] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0 goldmane-54d579b49d- calico-system 51584f79-8718-43e1-9ab4-f95d38f910f1 825 0 2025-09-09 05:40:07 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-dlh9b.gb1.brightbox.com goldmane-54d579b49d-jg28r eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali96d9824577d [] [] }} ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Namespace="calico-system" Pod="goldmane-54d579b49d-jg28r" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.082 [INFO][4821] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Namespace="calico-system" Pod="goldmane-54d579b49d-jg28r" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.128 [INFO][4849] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" HandleID="k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Workload="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.129 [INFO][4849] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" HandleID="k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Workload="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5710), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-dlh9b.gb1.brightbox.com", "pod":"goldmane-54d579b49d-jg28r", "timestamp":"2025-09-09 05:40:35.128861986 +0000 UTC"}, Hostname:"srv-dlh9b.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.129 [INFO][4849] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.184 [INFO][4849] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.184 [INFO][4849] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-dlh9b.gb1.brightbox.com' Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.238 [INFO][4849] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.248 [INFO][4849] ipam/ipam.go 394: Looking up existing affinities for host host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.267 [INFO][4849] ipam/ipam.go 511: Trying affinity for 192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.274 [INFO][4849] ipam/ipam.go 158: Attempting to load block cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.282 [INFO][4849] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.104.192/26 host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.282 [INFO][4849] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.104.192/26 handle="k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.285 [INFO][4849] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6 Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.297 [INFO][4849] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.104.192/26 handle="k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.310 [INFO][4849] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.104.200/26] block=192.168.104.192/26 handle="k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.310 [INFO][4849] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.104.200/26] handle="k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" host="srv-dlh9b.gb1.brightbox.com" Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.310 [INFO][4849] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
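The IPAM trace above shows the node holding an affinity for the block 192.168.104.192/26 and claiming 192.168.104.200 from it. A quick standard-library check (not Calico code, just the addresses taken from the log) that the claimed IP really sits inside that affine /26, and how many addresses such a block spans:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.104.192/26") // affine block from the log
	claimed := netip.MustParseAddr("192.168.104.200")    // address Calico assigned

	fmt.Println("claimed IP inside block:", block.Contains(claimed)) // true
	fmt.Println("block size:", 1<<(32-block.Bits()), "addresses")    // 64 for a /26
}
```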
Sep 9 05:40:35.376390 containerd[1587]: 2025-09-09 05:40:35.310 [INFO][4849] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.104.200/26] IPv6=[] ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" HandleID="k8s-pod-network.da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Workload="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" Sep 9 05:40:35.378874 containerd[1587]: 2025-09-09 05:40:35.320 [INFO][4821] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Namespace="calico-system" Pod="goldmane-54d579b49d-jg28r" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"51584f79-8718-43e1-9ab4-f95d38f910f1", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-54d579b49d-jg28r", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.104.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali96d9824577d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:35.378874 containerd[1587]: 2025-09-09 05:40:35.320 [INFO][4821] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.104.200/32] ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Namespace="calico-system" Pod="goldmane-54d579b49d-jg28r" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" Sep 9 05:40:35.378874 containerd[1587]: 2025-09-09 05:40:35.320 [INFO][4821] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali96d9824577d ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Namespace="calico-system" Pod="goldmane-54d579b49d-jg28r" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" Sep 9 05:40:35.378874 containerd[1587]: 2025-09-09 05:40:35.332 [INFO][4821] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Namespace="calico-system" Pod="goldmane-54d579b49d-jg28r" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" Sep 9 05:40:35.378874 containerd[1587]: 2025-09-09 05:40:35.338 [INFO][4821] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" 
Namespace="calico-system" Pod="goldmane-54d579b49d-jg28r" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"51584f79-8718-43e1-9ab4-f95d38f910f1", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 40, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-dlh9b.gb1.brightbox.com", ContainerID:"da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6", Pod:"goldmane-54d579b49d-jg28r", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.104.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali96d9824577d", MAC:"96:35:e6:91:6a:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:40:35.378874 containerd[1587]: 2025-09-09 05:40:35.366 [INFO][4821] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" Namespace="calico-system" Pod="goldmane-54d579b49d-jg28r" WorkloadEndpoint="srv--dlh9b.gb1.brightbox.com-k8s-goldmane--54d579b49d--jg28r-eth0" Sep 9 05:40:35.480746 containerd[1587]: time="2025-09-09T05:40:35.480508328Z" level=info msg="connecting to shim da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6" address="unix:///run/containerd/s/c5d392f53cc427be1347bb04f9518783fa019970efd7c224977e27f7d4063e5f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:40:35.622667 systemd[1]: Started cri-containerd-da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6.scope - libcontainer container da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6. 
Sep 9 05:40:35.711533 containerd[1587]: time="2025-09-09T05:40:35.711492545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-867ddcbfff-ngl89,Uid:5238b60f-81c4-4831-b073-cc699bd0b55a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d\"" Sep 9 05:40:35.769948 containerd[1587]: time="2025-09-09T05:40:35.769819355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-jg28r,Uid:51584f79-8718-43e1-9ab4-f95d38f910f1,Namespace:calico-system,Attempt:0,} returns sandbox id \"da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6\"" Sep 9 05:40:36.941293 systemd-networkd[1514]: cali96d9824577d: Gained IPv6LL Sep 9 05:40:37.004922 systemd-networkd[1514]: cali1f569cc943b: Gained IPv6LL Sep 9 05:40:37.612126 containerd[1587]: time="2025-09-09T05:40:37.611695652Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:37.630847 containerd[1587]: time="2025-09-09T05:40:37.630735894Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 05:40:37.632691 containerd[1587]: time="2025-09-09T05:40:37.632399557Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:37.635318 containerd[1587]: time="2025-09-09T05:40:37.635264374Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:37.636531 containerd[1587]: time="2025-09-09T05:40:37.636400462Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.598316759s" Sep 9 05:40:37.636531 containerd[1587]: time="2025-09-09T05:40:37.636436833Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 05:40:37.638391 containerd[1587]: time="2025-09-09T05:40:37.638196096Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 05:40:37.667158 containerd[1587]: time="2025-09-09T05:40:37.667115932Z" level=info msg="CreateContainer within sandbox \"c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 05:40:37.691799 containerd[1587]: time="2025-09-09T05:40:37.691137137Z" level=info msg="Container 80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:37.697336 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2291030929.mount: Deactivated successfully. 
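The kube-controllers pull above reports 51,277,746 bytes read over 4.598316759s ("bytes read" is what this particular pull transferred, which need not equal the unpacked image size also shown). A back-of-the-envelope throughput calculation with those two figures:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const bytesRead = 51277746                   // "bytes read" from the stop-pulling record above
	d, err := time.ParseDuration("4.598316759s") // duration from the "Pulled image" record above
	if err != nil {
		panic(err)
	}
	mib := float64(bytesRead) / (1024 * 1024)
	fmt.Printf("~%.1f MiB/s effective pull rate\n", mib/d.Seconds()) // roughly 10.6 MiB/s
}
```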
Sep 9 05:40:37.703170 containerd[1587]: time="2025-09-09T05:40:37.703128467Z" level=info msg="CreateContainer within sandbox \"c1f668163f2b1640a02d86db7e23b218ba744fc4666056793569d2bc5eaaa647\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\"" Sep 9 05:40:37.705585 containerd[1587]: time="2025-09-09T05:40:37.703826097Z" level=info msg="StartContainer for \"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\"" Sep 9 05:40:37.706093 containerd[1587]: time="2025-09-09T05:40:37.706067023Z" level=info msg="connecting to shim 80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436" address="unix:///run/containerd/s/33e74e9c88df9a147f91fbb0f3f888c2b1d51a16d3a78c80cdbbd5da21bc48e0" protocol=ttrpc version=3 Sep 9 05:40:37.796104 systemd[1]: Started cri-containerd-80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436.scope - libcontainer container 80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436. Sep 9 05:40:37.879243 containerd[1587]: time="2025-09-09T05:40:37.879104870Z" level=info msg="StartContainer for \"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\" returns successfully" Sep 9 05:40:38.460700 containerd[1587]: time="2025-09-09T05:40:38.460582998Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\" id:\"93f2acff63468438ee3691a079eb80a139fe46035426bf864f6f7b532e142a44\" pid:5039 exited_at:{seconds:1757396438 nanos:458060757}" Sep 9 05:40:38.696794 kubelet[2908]: I0909 05:40:38.696696 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-586648bd4c-qvfkx" podStartSLOduration=25.759951896 podStartE2EDuration="30.696648349s" podCreationTimestamp="2025-09-09 05:40:08 +0000 UTC" firstStartedPulling="2025-09-09 05:40:32.701150848 +0000 UTC m=+45.885255099" lastFinishedPulling="2025-09-09 05:40:37.637847301 +0000 UTC m=+50.821951552" observedRunningTime="2025-09-09 05:40:38.420935517 +0000 UTC m=+51.605039792" watchObservedRunningTime="2025-09-09 05:40:38.696648349 +0000 UTC m=+51.880752617" Sep 9 05:40:40.700349 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1955117358.mount: Deactivated successfully. 
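The kubelet pod_startup_latency_tracker line above for calico-kube-controllers contains enough timestamps to reconstruct both durations it reports: the E2E duration equals watchObservedRunningTime minus podCreationTimestamp, and the SLO duration equals the E2E duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A standard-library check with the wall-clock values copied from that line (monotonic `m=+…` suffixes dropped):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST" // Parse accepts fractional seconds implicitly

	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-09-09 05:40:08 +0000 UTC")              // podCreationTimestamp
	firstPull := parse("2025-09-09 05:40:32.701150848 +0000 UTC")  // firstStartedPulling
	lastPull := parse("2025-09-09 05:40:37.637847301 +0000 UTC")   // lastFinishedPulling
	watched := parse("2025-09-09 05:40:38.696648349 +0000 UTC")    // watchObservedRunningTime

	e2e := watched.Sub(created)          // 30.696648349s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 25.759951896s, matches podStartSLOduration
	fmt.Println("E2E:", e2e, "SLO:", slo)
}
```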
Sep 9 05:40:40.745818 containerd[1587]: time="2025-09-09T05:40:40.745593218Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:40.747188 containerd[1587]: time="2025-09-09T05:40:40.747146665Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 05:40:40.748296 containerd[1587]: time="2025-09-09T05:40:40.747524067Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:40.750122 containerd[1587]: time="2025-09-09T05:40:40.750074301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:40.751441 containerd[1587]: time="2025-09-09T05:40:40.751391442Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.112829818s" Sep 9 05:40:40.751441 containerd[1587]: time="2025-09-09T05:40:40.751439719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 05:40:40.753230 containerd[1587]: time="2025-09-09T05:40:40.753196451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:40:40.759819 containerd[1587]: time="2025-09-09T05:40:40.758795952Z" level=info msg="CreateContainer within sandbox \"15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 05:40:40.767177 containerd[1587]: time="2025-09-09T05:40:40.767138535Z" level=info msg="Container c2859eb3e9c845b5d3e45724b88b6ce6e9de3a8b67cf3afb28eae344734c23b4: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:40.775388 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2473131332.mount: Deactivated successfully. Sep 9 05:40:40.789972 containerd[1587]: time="2025-09-09T05:40:40.789901986Z" level=info msg="CreateContainer within sandbox \"15a033b0ce0dc621847aaed1fd4652f6beb806ab7130efe3b049aa1dfa481fe1\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"c2859eb3e9c845b5d3e45724b88b6ce6e9de3a8b67cf3afb28eae344734c23b4\"" Sep 9 05:40:40.791681 containerd[1587]: time="2025-09-09T05:40:40.791593981Z" level=info msg="StartContainer for \"c2859eb3e9c845b5d3e45724b88b6ce6e9de3a8b67cf3afb28eae344734c23b4\"" Sep 9 05:40:40.796705 containerd[1587]: time="2025-09-09T05:40:40.796555136Z" level=info msg="connecting to shim c2859eb3e9c845b5d3e45724b88b6ce6e9de3a8b67cf3afb28eae344734c23b4" address="unix:///run/containerd/s/e4aa748816c897971b56d08b0308006053c17374b206883653c49047e8cd10bd" protocol=ttrpc version=3 Sep 9 05:40:40.850938 systemd[1]: Started cri-containerd-c2859eb3e9c845b5d3e45724b88b6ce6e9de3a8b67cf3afb28eae344734c23b4.scope - libcontainer container c2859eb3e9c845b5d3e45724b88b6ce6e9de3a8b67cf3afb28eae344734c23b4. 
Sep 9 05:40:40.925507 containerd[1587]: time="2025-09-09T05:40:40.925423874Z" level=info msg="StartContainer for \"c2859eb3e9c845b5d3e45724b88b6ce6e9de3a8b67cf3afb28eae344734c23b4\" returns successfully" Sep 9 05:40:41.408729 kubelet[2908]: I0909 05:40:41.408587 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-64d755cd59-mwrgh" podStartSLOduration=2.09733594 podStartE2EDuration="11.408547751s" podCreationTimestamp="2025-09-09 05:40:30 +0000 UTC" firstStartedPulling="2025-09-09 05:40:31.441369787 +0000 UTC m=+44.625474038" lastFinishedPulling="2025-09-09 05:40:40.752581598 +0000 UTC m=+53.936685849" observedRunningTime="2025-09-09 05:40:41.406456413 +0000 UTC m=+54.590560762" watchObservedRunningTime="2025-09-09 05:40:41.408547751 +0000 UTC m=+54.592652116" Sep 9 05:40:45.004701 containerd[1587]: time="2025-09-09T05:40:45.003995558Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:45.006099 containerd[1587]: time="2025-09-09T05:40:45.006012356Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 05:40:45.006612 containerd[1587]: time="2025-09-09T05:40:45.006570923Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:45.009675 containerd[1587]: time="2025-09-09T05:40:45.008652856Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:45.009675 containerd[1587]: time="2025-09-09T05:40:45.009268903Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 4.256040977s" Sep 9 05:40:45.009675 containerd[1587]: time="2025-09-09T05:40:45.009614496Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:40:45.011020 containerd[1587]: time="2025-09-09T05:40:45.011001339Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 05:40:45.016222 containerd[1587]: time="2025-09-09T05:40:45.015472682Z" level=info msg="CreateContainer within sandbox \"133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:40:45.020576 containerd[1587]: time="2025-09-09T05:40:45.020541312Z" level=info msg="Container f046f8a38f77ea3162312a7c27eea2224d2ea81f9878606bd5aef5cb67e86eda: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:45.043555 containerd[1587]: time="2025-09-09T05:40:45.043514638Z" level=info msg="CreateContainer within sandbox \"133c33fbce367f0c7554ebf51641f9ef16c0de1e7258d12aaf3e47be27f8ccf2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f046f8a38f77ea3162312a7c27eea2224d2ea81f9878606bd5aef5cb67e86eda\"" Sep 9 05:40:45.045640 containerd[1587]: time="2025-09-09T05:40:45.045571760Z" level=info msg="StartContainer for 
\"f046f8a38f77ea3162312a7c27eea2224d2ea81f9878606bd5aef5cb67e86eda\"" Sep 9 05:40:45.046928 containerd[1587]: time="2025-09-09T05:40:45.046899944Z" level=info msg="connecting to shim f046f8a38f77ea3162312a7c27eea2224d2ea81f9878606bd5aef5cb67e86eda" address="unix:///run/containerd/s/0e37f2de51892ecd22bfe9e85d1367985f4a312e16a740d4eb941e57614365ea" protocol=ttrpc version=3 Sep 9 05:40:45.086884 systemd[1]: Started cri-containerd-f046f8a38f77ea3162312a7c27eea2224d2ea81f9878606bd5aef5cb67e86eda.scope - libcontainer container f046f8a38f77ea3162312a7c27eea2224d2ea81f9878606bd5aef5cb67e86eda. Sep 9 05:40:45.174088 containerd[1587]: time="2025-09-09T05:40:45.174054056Z" level=info msg="StartContainer for \"f046f8a38f77ea3162312a7c27eea2224d2ea81f9878606bd5aef5cb67e86eda\" returns successfully" Sep 9 05:40:45.458571 kubelet[2908]: I0909 05:40:45.458050 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-867ddcbfff-tcnvd" podStartSLOduration=30.181109349 podStartE2EDuration="41.457953454s" podCreationTimestamp="2025-09-09 05:40:04 +0000 UTC" firstStartedPulling="2025-09-09 05:40:33.733922712 +0000 UTC m=+46.918026969" lastFinishedPulling="2025-09-09 05:40:45.010766823 +0000 UTC m=+58.194871074" observedRunningTime="2025-09-09 05:40:45.456492651 +0000 UTC m=+58.640596903" watchObservedRunningTime="2025-09-09 05:40:45.457953454 +0000 UTC m=+58.642057707" Sep 9 05:40:46.902480 containerd[1587]: time="2025-09-09T05:40:46.902411883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:46.903465 containerd[1587]: time="2025-09-09T05:40:46.903412820Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 05:40:46.904105 containerd[1587]: time="2025-09-09T05:40:46.904048981Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:46.910370 containerd[1587]: time="2025-09-09T05:40:46.910262458Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:46.911215 containerd[1587]: time="2025-09-09T05:40:46.911186943Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.899859352s" Sep 9 05:40:46.911350 containerd[1587]: time="2025-09-09T05:40:46.911333989Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 05:40:46.914417 containerd[1587]: time="2025-09-09T05:40:46.914319753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:40:46.921337 containerd[1587]: time="2025-09-09T05:40:46.921045763Z" level=info msg="CreateContainer within sandbox \"da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 05:40:46.974027 containerd[1587]: time="2025-09-09T05:40:46.969220311Z" level=info msg="Container 
922b9d7b6cbc3be3f3f4c74cc1494d9880e1938bd3aa3e2e6b9fa9198439f95a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:47.036477 containerd[1587]: time="2025-09-09T05:40:47.036427795Z" level=info msg="CreateContainer within sandbox \"da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"922b9d7b6cbc3be3f3f4c74cc1494d9880e1938bd3aa3e2e6b9fa9198439f95a\"" Sep 9 05:40:47.094632 containerd[1587]: time="2025-09-09T05:40:47.093431604Z" level=info msg="StartContainer for \"922b9d7b6cbc3be3f3f4c74cc1494d9880e1938bd3aa3e2e6b9fa9198439f95a\"" Sep 9 05:40:47.097021 containerd[1587]: time="2025-09-09T05:40:47.096753875Z" level=info msg="connecting to shim 922b9d7b6cbc3be3f3f4c74cc1494d9880e1938bd3aa3e2e6b9fa9198439f95a" address="unix:///run/containerd/s/cc9283db35f16017fddb28fad62a55941f2b3ccbb168e4a63f2b51b28c4ece18" protocol=ttrpc version=3 Sep 9 05:40:47.185956 systemd[1]: Started cri-containerd-922b9d7b6cbc3be3f3f4c74cc1494d9880e1938bd3aa3e2e6b9fa9198439f95a.scope - libcontainer container 922b9d7b6cbc3be3f3f4c74cc1494d9880e1938bd3aa3e2e6b9fa9198439f95a. Sep 9 05:40:47.276946 containerd[1587]: time="2025-09-09T05:40:47.276880482Z" level=info msg="StartContainer for \"922b9d7b6cbc3be3f3f4c74cc1494d9880e1938bd3aa3e2e6b9fa9198439f95a\" returns successfully" Sep 9 05:40:47.291857 containerd[1587]: time="2025-09-09T05:40:47.291805180Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:47.293523 containerd[1587]: time="2025-09-09T05:40:47.293167513Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 05:40:47.299517 containerd[1587]: time="2025-09-09T05:40:47.298758728Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 384.396804ms" Sep 9 05:40:47.300333 containerd[1587]: time="2025-09-09T05:40:47.299674027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:40:47.310336 containerd[1587]: time="2025-09-09T05:40:47.310301689Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 05:40:47.314480 containerd[1587]: time="2025-09-09T05:40:47.312992232Z" level=info msg="CreateContainer within sandbox \"f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:40:47.327132 containerd[1587]: time="2025-09-09T05:40:47.327096125Z" level=info msg="Container 713e6f2cb3eaa530f32a1ae063c7031810ed3141fa6c33c6a98a76a0ce483ed0: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:47.337680 containerd[1587]: time="2025-09-09T05:40:47.337155532Z" level=info msg="CreateContainer within sandbox \"f98180c3954b7b2eb0fd16f0a57513d50d2f20de23fa2b0891deb5215d572b4d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"713e6f2cb3eaa530f32a1ae063c7031810ed3141fa6c33c6a98a76a0ce483ed0\"" Sep 9 05:40:47.340678 containerd[1587]: time="2025-09-09T05:40:47.339205407Z" level=info msg="StartContainer for 
\"713e6f2cb3eaa530f32a1ae063c7031810ed3141fa6c33c6a98a76a0ce483ed0\"" Sep 9 05:40:47.342328 containerd[1587]: time="2025-09-09T05:40:47.342296055Z" level=info msg="connecting to shim 713e6f2cb3eaa530f32a1ae063c7031810ed3141fa6c33c6a98a76a0ce483ed0" address="unix:///run/containerd/s/b08f91c2ce314cc4ce23101414b81af99403e94a0a6efd0e99fda6b71aaaf97e" protocol=ttrpc version=3 Sep 9 05:40:47.376858 systemd[1]: Started cri-containerd-713e6f2cb3eaa530f32a1ae063c7031810ed3141fa6c33c6a98a76a0ce483ed0.scope - libcontainer container 713e6f2cb3eaa530f32a1ae063c7031810ed3141fa6c33c6a98a76a0ce483ed0. Sep 9 05:40:47.515471 kubelet[2908]: I0909 05:40:47.515352 2908 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:40:47.542654 containerd[1587]: time="2025-09-09T05:40:47.542610335Z" level=info msg="StartContainer for \"713e6f2cb3eaa530f32a1ae063c7031810ed3141fa6c33c6a98a76a0ce483ed0\" returns successfully" Sep 9 05:40:48.530226 kubelet[2908]: I0909 05:40:48.528621 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-867ddcbfff-ngl89" podStartSLOduration=32.937810226 podStartE2EDuration="44.528598923s" podCreationTimestamp="2025-09-09 05:40:04 +0000 UTC" firstStartedPulling="2025-09-09 05:40:35.714267824 +0000 UTC m=+48.898372086" lastFinishedPulling="2025-09-09 05:40:47.305056532 +0000 UTC m=+60.489160783" observedRunningTime="2025-09-09 05:40:48.526493315 +0000 UTC m=+61.710597590" watchObservedRunningTime="2025-09-09 05:40:48.528598923 +0000 UTC m=+61.712703207" Sep 9 05:40:49.512113 kubelet[2908]: I0909 05:40:49.512050 2908 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:40:53.789527 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2941483428.mount: Deactivated successfully. 
Sep 9 05:40:54.410086 containerd[1587]: time="2025-09-09T05:40:54.410034677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:54.418186 containerd[1587]: time="2025-09-09T05:40:54.418098128Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 05:40:54.436130 containerd[1587]: time="2025-09-09T05:40:54.436040514Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:54.438312 containerd[1587]: time="2025-09-09T05:40:54.438281296Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:54.440689 containerd[1587]: time="2025-09-09T05:40:54.440396934Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 7.130050108s" Sep 9 05:40:54.440689 containerd[1587]: time="2025-09-09T05:40:54.440464520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 05:40:54.489937 containerd[1587]: time="2025-09-09T05:40:54.489554873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 05:40:54.648670 containerd[1587]: time="2025-09-09T05:40:54.648595546Z" level=info msg="CreateContainer within sandbox \"da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 05:40:54.663215 containerd[1587]: time="2025-09-09T05:40:54.662635827Z" level=info msg="Container b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:54.697958 containerd[1587]: time="2025-09-09T05:40:54.697810701Z" level=info msg="CreateContainer within sandbox \"da260e8ea2091ee85c436cdd2858b956eadd7492ceccc90d3cdeab75848abff6\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\"" Sep 9 05:40:54.703278 containerd[1587]: time="2025-09-09T05:40:54.702979645Z" level=info msg="StartContainer for \"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\"" Sep 9 05:40:54.707329 containerd[1587]: time="2025-09-09T05:40:54.707285184Z" level=info msg="connecting to shim b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65" address="unix:///run/containerd/s/c5d392f53cc427be1347bb04f9518783fa019970efd7c224977e27f7d4063e5f" protocol=ttrpc version=3 Sep 9 05:40:54.908848 systemd[1]: Started cri-containerd-b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65.scope - libcontainer container b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65. 
Sep 9 05:40:55.066384 containerd[1587]: time="2025-09-09T05:40:55.065869596Z" level=info msg="StartContainer for \"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\" returns successfully" Sep 9 05:40:55.813426 kubelet[2908]: I0909 05:40:55.802927 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-jg28r" podStartSLOduration=30.093705003 podStartE2EDuration="48.797279855s" podCreationTimestamp="2025-09-09 05:40:07 +0000 UTC" firstStartedPulling="2025-09-09 05:40:35.772812623 +0000 UTC m=+48.956916874" lastFinishedPulling="2025-09-09 05:40:54.476387473 +0000 UTC m=+67.660491726" observedRunningTime="2025-09-09 05:40:55.765034391 +0000 UTC m=+68.949138663" watchObservedRunningTime="2025-09-09 05:40:55.797279855 +0000 UTC m=+68.981384130" Sep 9 05:40:55.953580 containerd[1587]: time="2025-09-09T05:40:55.953509251Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\" id:\"21ec9369eb49029d0e4a0317baf762676d3e33c899125d6fbacd54c81fabf7df\" pid:5297 exit_status:1 exited_at:{seconds:1757396455 nanos:944022640}" Sep 9 05:40:56.833434 containerd[1587]: time="2025-09-09T05:40:56.833306761Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\" id:\"ba33197876fcf7bef142ad34d1ea84f3ed4e3d0663e5e454b5641db2d7f174f3\" pid:5319 exit_status:1 exited_at:{seconds:1757396456 nanos:832682988}" Sep 9 05:40:57.176025 containerd[1587]: time="2025-09-09T05:40:57.175776015Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:57.179057 containerd[1587]: time="2025-09-09T05:40:57.179017577Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 05:40:57.180133 containerd[1587]: time="2025-09-09T05:40:57.180096511Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:57.190820 containerd[1587]: time="2025-09-09T05:40:57.190783210Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:40:57.192646 containerd[1587]: time="2025-09-09T05:40:57.192591770Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.702976925s" Sep 9 05:40:57.192926 containerd[1587]: time="2025-09-09T05:40:57.192802718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 05:40:57.211735 containerd[1587]: time="2025-09-09T05:40:57.211691863Z" level=info msg="CreateContainer within sandbox \"da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 
05:40:57.220161 containerd[1587]: time="2025-09-09T05:40:57.220118947Z" level=info msg="Container 834a2afa0b84f960b117cbf41ecd0202403973f5860d2f2ad0a75e71f227948d: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:40:57.228341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3207530967.mount: Deactivated successfully. Sep 9 05:40:57.238833 containerd[1587]: time="2025-09-09T05:40:57.238744277Z" level=info msg="CreateContainer within sandbox \"da69e1e76d5d7305bc24c61a1dc15f0d211f35f332567921682eecda8a5af966\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"834a2afa0b84f960b117cbf41ecd0202403973f5860d2f2ad0a75e71f227948d\"" Sep 9 05:40:57.239577 containerd[1587]: time="2025-09-09T05:40:57.239549942Z" level=info msg="StartContainer for \"834a2afa0b84f960b117cbf41ecd0202403973f5860d2f2ad0a75e71f227948d\"" Sep 9 05:40:57.242375 containerd[1587]: time="2025-09-09T05:40:57.242347477Z" level=info msg="connecting to shim 834a2afa0b84f960b117cbf41ecd0202403973f5860d2f2ad0a75e71f227948d" address="unix:///run/containerd/s/cc9283db35f16017fddb28fad62a55941f2b3ccbb168e4a63f2b51b28c4ece18" protocol=ttrpc version=3 Sep 9 05:40:57.282925 systemd[1]: Started cri-containerd-834a2afa0b84f960b117cbf41ecd0202403973f5860d2f2ad0a75e71f227948d.scope - libcontainer container 834a2afa0b84f960b117cbf41ecd0202403973f5860d2f2ad0a75e71f227948d. Sep 9 05:40:57.361987 containerd[1587]: time="2025-09-09T05:40:57.361939131Z" level=info msg="StartContainer for \"834a2afa0b84f960b117cbf41ecd0202403973f5860d2f2ad0a75e71f227948d\" returns successfully" Sep 9 05:40:57.818677 kubelet[2908]: I0909 05:40:57.817911 2908 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-czgjp" podStartSLOduration=26.407282905 podStartE2EDuration="49.817883087s" podCreationTimestamp="2025-09-09 05:40:08 +0000 UTC" firstStartedPulling="2025-09-09 05:40:33.784330334 +0000 UTC m=+46.968434585" lastFinishedPulling="2025-09-09 05:40:57.194930468 +0000 UTC m=+70.379034767" observedRunningTime="2025-09-09 05:40:57.80759861 +0000 UTC m=+70.991702885" watchObservedRunningTime="2025-09-09 05:40:57.817883087 +0000 UTC m=+71.001987362" Sep 9 05:40:57.889271 containerd[1587]: time="2025-09-09T05:40:57.889096749Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\" id:\"9c071ef1038d623e4b440e4d803c23ecd48d08fafd65efb815384ca17e9e8427\" pid:5380 exit_status:1 exited_at:{seconds:1757396457 nanos:888612172}" Sep 9 05:40:58.322928 kubelet[2908]: I0909 05:40:58.322882 2908 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 05:40:58.323354 kubelet[2908]: I0909 05:40:58.323116 2908 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 05:41:01.562342 containerd[1587]: time="2025-09-09T05:41:01.561926425Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\" id:\"6f65ea8f686ade1ffad29da19ec12076a97e4122708bef1a4ff2679a84ed0c82\" pid:5426 exited_at:{seconds:1757396461 nanos:560824875}" Sep 9 05:41:08.663340 containerd[1587]: time="2025-09-09T05:41:08.663286106Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\" id:\"776a8ddb2798f507b4b44f2bb28630a70a4af9cf0fb5532446561e05f2d2dd3c\" pid:5453 exited_at:{seconds:1757396468 nanos:641246147}" Sep 9 05:41:10.112100 kubelet[2908]: I0909 05:41:10.111943 2908 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:41:21.727301 systemd[1]: Started sshd@9-10.244.98.182:22-139.178.89.65:40796.service - OpenSSH per-connection server daemon (139.178.89.65:40796). Sep 9 05:41:22.743160 sshd[5482]: Accepted publickey for core from 139.178.89.65 port 40796 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:41:22.747195 sshd-session[5482]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:41:22.759128 systemd-logind[1565]: New session 12 of user core. Sep 9 05:41:22.764839 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 05:41:24.009936 sshd[5485]: Connection closed by 139.178.89.65 port 40796 Sep 9 05:41:24.012597 sshd-session[5482]: pam_unix(sshd:session): session closed for user core Sep 9 05:41:24.027042 systemd[1]: sshd@9-10.244.98.182:22-139.178.89.65:40796.service: Deactivated successfully. Sep 9 05:41:24.032365 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 05:41:24.045747 systemd-logind[1565]: Session 12 logged out. Waiting for processes to exit. Sep 9 05:41:24.047700 systemd-logind[1565]: Removed session 12. Sep 9 05:41:25.151695 containerd[1587]: time="2025-09-09T05:41:25.151626633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\" id:\"44dc723fb17c2cdc78feed66c6352d6d3f63f8a4544ae8c978c16c83aeff87cb\" pid:5512 exited_at:{seconds:1757396485 nanos:74223060}" Sep 9 05:41:28.178443 containerd[1587]: time="2025-09-09T05:41:28.178191082Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\" id:\"244a3f9dd6f52140abfda1d6b2018c54edffc5dad04417677fbd50e3c8cd82a3\" pid:5533 exited_at:{seconds:1757396488 nanos:176833476}" Sep 9 05:41:29.172557 systemd[1]: Started sshd@10-10.244.98.182:22-139.178.89.65:40810.service - OpenSSH per-connection server daemon (139.178.89.65:40810). Sep 9 05:41:30.149794 sshd[5547]: Accepted publickey for core from 139.178.89.65 port 40810 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:41:30.151604 sshd-session[5547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:41:30.163025 systemd-logind[1565]: New session 13 of user core. Sep 9 05:41:30.168935 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 05:41:30.993044 sshd[5550]: Connection closed by 139.178.89.65 port 40810 Sep 9 05:41:30.994542 sshd-session[5547]: pam_unix(sshd:session): session closed for user core Sep 9 05:41:31.003396 systemd[1]: sshd@10-10.244.98.182:22-139.178.89.65:40810.service: Deactivated successfully. Sep 9 05:41:31.008225 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 05:41:31.011588 systemd-logind[1565]: Session 13 logged out. Waiting for processes to exit. Sep 9 05:41:31.014281 systemd-logind[1565]: Removed session 13. 
Sep 9 05:41:31.569769 containerd[1587]: time="2025-09-09T05:41:31.569600128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\" id:\"4a6ddb8dc6906222bea2e60af893dd28f5d00fd0b53e2c41b90e5b2d4c702152\" pid:5575 exited_at:{seconds:1757396491 nanos:569093051}" Sep 9 05:41:36.154531 systemd[1]: Started sshd@11-10.244.98.182:22-139.178.89.65:54326.service - OpenSSH per-connection server daemon (139.178.89.65:54326). Sep 9 05:41:37.120291 sshd[5588]: Accepted publickey for core from 139.178.89.65 port 54326 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:41:37.124183 sshd-session[5588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:41:37.134133 systemd-logind[1565]: New session 14 of user core. Sep 9 05:41:37.137868 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 05:41:37.931357 sshd[5591]: Connection closed by 139.178.89.65 port 54326 Sep 9 05:41:37.936241 sshd-session[5588]: pam_unix(sshd:session): session closed for user core Sep 9 05:41:37.943632 systemd[1]: sshd@11-10.244.98.182:22-139.178.89.65:54326.service: Deactivated successfully. Sep 9 05:41:37.946469 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 05:41:37.950801 systemd-logind[1565]: Session 14 logged out. Waiting for processes to exit. Sep 9 05:41:37.953267 systemd-logind[1565]: Removed session 14. Sep 9 05:41:38.100998 systemd[1]: Started sshd@12-10.244.98.182:22-139.178.89.65:54330.service - OpenSSH per-connection server daemon (139.178.89.65:54330). Sep 9 05:41:38.467279 containerd[1587]: time="2025-09-09T05:41:38.459423228Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\" id:\"7377d42c8a5728d06e9be0c2318f97dc4d4bc8ae9e17ee02b0d25a4c6d580beb\" pid:5620 exited_at:{seconds:1757396498 nanos:454976410}" Sep 9 05:41:39.083595 sshd[5604]: Accepted publickey for core from 139.178.89.65 port 54330 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:41:39.087825 sshd-session[5604]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:41:39.101748 systemd-logind[1565]: New session 15 of user core. Sep 9 05:41:39.112978 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 05:41:39.910207 sshd[5630]: Connection closed by 139.178.89.65 port 54330 Sep 9 05:41:39.915500 sshd-session[5604]: pam_unix(sshd:session): session closed for user core Sep 9 05:41:39.923171 systemd-logind[1565]: Session 15 logged out. Waiting for processes to exit. Sep 9 05:41:39.923961 systemd[1]: sshd@12-10.244.98.182:22-139.178.89.65:54330.service: Deactivated successfully. Sep 9 05:41:39.928397 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 05:41:39.931100 systemd-logind[1565]: Removed session 15. Sep 9 05:41:40.077911 systemd[1]: Started sshd@13-10.244.98.182:22-139.178.89.65:54340.service - OpenSSH per-connection server daemon (139.178.89.65:54340). Sep 9 05:41:41.105688 sshd[5640]: Accepted publickey for core from 139.178.89.65 port 54340 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:41:41.107262 sshd-session[5640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:41:41.122869 systemd-logind[1565]: New session 16 of user core. Sep 9 05:41:41.130866 systemd[1]: Started session-16.scope - Session 16 of User core. 
Sep 9 05:41:41.553831 containerd[1587]: time="2025-09-09T05:41:41.552901796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\" id:\"6c72e18e058fbb886600ef9e6d4b3ab95da2b30d87db7d9264afea187953d441\" pid:5658 exited_at:{seconds:1757396501 nanos:552566672}" Sep 9 05:41:41.928351 sshd[5649]: Connection closed by 139.178.89.65 port 54340 Sep 9 05:41:41.935011 sshd-session[5640]: pam_unix(sshd:session): session closed for user core Sep 9 05:41:41.960879 systemd-logind[1565]: Session 16 logged out. Waiting for processes to exit. Sep 9 05:41:41.961006 systemd[1]: sshd@13-10.244.98.182:22-139.178.89.65:54340.service: Deactivated successfully. Sep 9 05:41:41.965386 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 05:41:41.969554 systemd-logind[1565]: Removed session 16. Sep 9 05:41:47.097872 systemd[1]: Started sshd@14-10.244.98.182:22-139.178.89.65:42182.service - OpenSSH per-connection server daemon (139.178.89.65:42182). Sep 9 05:41:48.106732 sshd[5680]: Accepted publickey for core from 139.178.89.65 port 42182 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:41:48.110292 sshd-session[5680]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:41:48.121349 systemd-logind[1565]: New session 17 of user core. Sep 9 05:41:48.125858 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 9 05:41:49.198821 sshd[5684]: Connection closed by 139.178.89.65 port 42182 Sep 9 05:41:49.206582 sshd-session[5680]: pam_unix(sshd:session): session closed for user core Sep 9 05:41:49.221104 systemd[1]: sshd@14-10.244.98.182:22-139.178.89.65:42182.service: Deactivated successfully. Sep 9 05:41:49.226202 systemd[1]: session-17.scope: Deactivated successfully. Sep 9 05:41:49.233366 systemd-logind[1565]: Session 17 logged out. Waiting for processes to exit. Sep 9 05:41:49.235201 systemd-logind[1565]: Removed session 17. Sep 9 05:41:54.364244 systemd[1]: Started sshd@15-10.244.98.182:22-139.178.89.65:50584.service - OpenSSH per-connection server daemon (139.178.89.65:50584). Sep 9 05:41:55.308819 sshd[5709]: Accepted publickey for core from 139.178.89.65 port 50584 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:41:55.313113 sshd-session[5709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:41:55.329875 systemd-logind[1565]: New session 18 of user core. Sep 9 05:41:55.333067 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 9 05:41:56.199400 sshd[5712]: Connection closed by 139.178.89.65 port 50584 Sep 9 05:41:56.198621 sshd-session[5709]: pam_unix(sshd:session): session closed for user core Sep 9 05:41:56.207286 systemd[1]: sshd@15-10.244.98.182:22-139.178.89.65:50584.service: Deactivated successfully. Sep 9 05:41:56.210116 systemd[1]: session-18.scope: Deactivated successfully. Sep 9 05:41:56.213645 systemd-logind[1565]: Session 18 logged out. Waiting for processes to exit. Sep 9 05:41:56.214891 systemd-logind[1565]: Removed session 18. 
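The SSH acceptances above all follow the same `Accepted publickey for <user> from <addr> port <port> ssh2: RSA SHA256:<fingerprint>` shape. A small illustrative extractor (standard-library regexp, fed one literal record from above with the timestamp prefix trimmed):

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// One sshd acceptance record, copied from the log above.
	line := "sshd[5709]: Accepted publickey for core from 139.178.89.65 port 50584 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s"

	re := regexp.MustCompile(`Accepted publickey for (\S+) from (\S+) port (\d+) ssh2: RSA SHA256:(\S+)`)
	m := re.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Println("user:", m[1], "from:", m[2]+":"+m[3])
	fmt.Println("key fingerprint: SHA256:" + m[4])
}
```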
Sep 9 05:41:58.105684 containerd[1587]: time="2025-09-09T05:41:58.103627389Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\" id:\"e5a7252c0428514019b7eb12b3b670191f40191df4b116c778f7ff394238358d\" pid:5737 exited_at:{seconds:1757396517 nanos:973452817}" Sep 9 05:42:01.359655 systemd[1]: Started sshd@16-10.244.98.182:22-139.178.89.65:36484.service - OpenSSH per-connection server daemon (139.178.89.65:36484). Sep 9 05:42:01.701038 containerd[1587]: time="2025-09-09T05:42:01.700645967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\" id:\"72f982da8cfb075c24b5e2d08b11469799cb99f5b2d3b03a3ff57f3f47ce2b82\" pid:5760 exited_at:{seconds:1757396521 nanos:700135809}" Sep 9 05:42:02.353344 sshd[5765]: Accepted publickey for core from 139.178.89.65 port 36484 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:42:02.355371 sshd-session[5765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:02.365125 systemd-logind[1565]: New session 19 of user core. Sep 9 05:42:02.370988 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 9 05:42:03.451238 sshd[5776]: Connection closed by 139.178.89.65 port 36484 Sep 9 05:42:03.452322 sshd-session[5765]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:03.459197 systemd[1]: sshd@16-10.244.98.182:22-139.178.89.65:36484.service: Deactivated successfully. Sep 9 05:42:03.462482 systemd[1]: session-19.scope: Deactivated successfully. Sep 9 05:42:03.465826 systemd-logind[1565]: Session 19 logged out. Waiting for processes to exit. Sep 9 05:42:03.468847 systemd-logind[1565]: Removed session 19. Sep 9 05:42:03.612418 systemd[1]: Started sshd@17-10.244.98.182:22-139.178.89.65:36486.service - OpenSSH per-connection server daemon (139.178.89.65:36486). Sep 9 05:42:04.568894 sshd[5795]: Accepted publickey for core from 139.178.89.65 port 36486 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:42:04.571771 sshd-session[5795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:04.577709 systemd-logind[1565]: New session 20 of user core. Sep 9 05:42:04.584791 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 9 05:42:05.642556 sshd[5798]: Connection closed by 139.178.89.65 port 36486 Sep 9 05:42:05.644189 sshd-session[5795]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:05.656991 systemd[1]: sshd@17-10.244.98.182:22-139.178.89.65:36486.service: Deactivated successfully. Sep 9 05:42:05.658158 systemd-logind[1565]: Session 20 logged out. Waiting for processes to exit. Sep 9 05:42:05.661758 systemd[1]: session-20.scope: Deactivated successfully. Sep 9 05:42:05.668730 systemd-logind[1565]: Removed session 20. Sep 9 05:42:05.793737 systemd[1]: Started sshd@18-10.244.98.182:22-139.178.89.65:36492.service - OpenSSH per-connection server daemon (139.178.89.65:36492). Sep 9 05:42:06.739726 sshd[5808]: Accepted publickey for core from 139.178.89.65 port 36492 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:42:06.746655 sshd-session[5808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:06.761123 systemd-logind[1565]: New session 21 of user core. Sep 9 05:42:06.766914 systemd[1]: Started session-21.scope - Session 21 of User core. 
Sep 9 05:42:08.385484 sshd[5811]: Connection closed by 139.178.89.65 port 36492 Sep 9 05:42:08.404897 sshd-session[5808]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:08.418609 systemd-logind[1565]: Session 21 logged out. Waiting for processes to exit. Sep 9 05:42:08.420990 systemd[1]: sshd@18-10.244.98.182:22-139.178.89.65:36492.service: Deactivated successfully. Sep 9 05:42:08.424601 systemd[1]: session-21.scope: Deactivated successfully. Sep 9 05:42:08.430906 systemd-logind[1565]: Removed session 21. Sep 9 05:42:08.546551 systemd[1]: Started sshd@19-10.244.98.182:22-139.178.89.65:36498.service - OpenSSH per-connection server daemon (139.178.89.65:36498). Sep 9 05:42:09.019161 containerd[1587]: time="2025-09-09T05:42:09.016087844Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\" id:\"639c8125e3eb529d140c82359b3a0e727c932e31b62327987a052e819f0d67d0\" pid:5856 exited_at:{seconds:1757396529 nanos:15129007}" Sep 9 05:42:09.563331 sshd[5841]: Accepted publickey for core from 139.178.89.65 port 36498 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:42:09.565603 sshd-session[5841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:09.573365 systemd-logind[1565]: New session 22 of user core. Sep 9 05:42:09.580078 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 9 05:42:11.365928 sshd[5865]: Connection closed by 139.178.89.65 port 36498 Sep 9 05:42:11.374611 sshd-session[5841]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:11.391448 systemd[1]: sshd@19-10.244.98.182:22-139.178.89.65:36498.service: Deactivated successfully. Sep 9 05:42:11.398602 systemd[1]: session-22.scope: Deactivated successfully. Sep 9 05:42:11.399826 systemd-logind[1565]: Session 22 logged out. Waiting for processes to exit. Sep 9 05:42:11.403281 systemd-logind[1565]: Removed session 22. Sep 9 05:42:11.524829 systemd[1]: Started sshd@20-10.244.98.182:22-139.178.89.65:37828.service - OpenSSH per-connection server daemon (139.178.89.65:37828). Sep 9 05:42:12.520117 sshd[5876]: Accepted publickey for core from 139.178.89.65 port 37828 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:42:12.522215 sshd-session[5876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:12.531808 systemd-logind[1565]: New session 23 of user core. Sep 9 05:42:12.537852 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 9 05:42:13.572816 sshd[5879]: Connection closed by 139.178.89.65 port 37828 Sep 9 05:42:13.573230 sshd-session[5876]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:13.580067 systemd-logind[1565]: Session 23 logged out. Waiting for processes to exit. Sep 9 05:42:13.581054 systemd[1]: sshd@20-10.244.98.182:22-139.178.89.65:37828.service: Deactivated successfully. Sep 9 05:42:13.584582 systemd[1]: session-23.scope: Deactivated successfully. Sep 9 05:42:13.588399 systemd-logind[1565]: Removed session 23. Sep 9 05:42:18.729087 systemd[1]: Started sshd@21-10.244.98.182:22-139.178.89.65:37844.service - OpenSSH per-connection server daemon (139.178.89.65:37844). 
Sep 9 05:42:19.712335 sshd[5893]: Accepted publickey for core from 139.178.89.65 port 37844 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:42:19.714683 sshd-session[5893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:19.721855 systemd-logind[1565]: New session 24 of user core. Sep 9 05:42:19.728802 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 9 05:42:20.730606 sshd[5896]: Connection closed by 139.178.89.65 port 37844 Sep 9 05:42:20.731140 sshd-session[5893]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:20.742886 systemd[1]: sshd@21-10.244.98.182:22-139.178.89.65:37844.service: Deactivated successfully. Sep 9 05:42:20.745901 systemd[1]: session-24.scope: Deactivated successfully. Sep 9 05:42:20.747271 systemd-logind[1565]: Session 24 logged out. Waiting for processes to exit. Sep 9 05:42:20.750051 systemd-logind[1565]: Removed session 24. Sep 9 05:42:24.914034 containerd[1587]: time="2025-09-09T05:42:24.906038164Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80718273735062a4d1f3a39f74118ba4069baab79be139a01e0c414774ee4436\" id:\"26295771702d100c4c40351fc4edba5b6ce67e8a6a627a151467595b2cc19874\" pid:5923 exited_at:{seconds:1757396544 nanos:905590805}" Sep 9 05:42:25.887085 systemd[1]: Started sshd@22-10.244.98.182:22-139.178.89.65:48100.service - OpenSSH per-connection server daemon (139.178.89.65:48100). Sep 9 05:42:26.866501 sshd[5933]: Accepted publickey for core from 139.178.89.65 port 48100 ssh2: RSA SHA256:wfqhBLbQHemrYEOiWJay6G0utZpK1ZEAyCkpmDG3O1s Sep 9 05:42:26.869632 sshd-session[5933]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:26.883720 systemd-logind[1565]: New session 25 of user core. Sep 9 05:42:26.888858 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 9 05:42:27.992180 sshd[5936]: Connection closed by 139.178.89.65 port 48100 Sep 9 05:42:27.992361 sshd-session[5933]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:28.005362 systemd-logind[1565]: Session 25 logged out. Waiting for processes to exit. Sep 9 05:42:28.006000 systemd[1]: sshd@22-10.244.98.182:22-139.178.89.65:48100.service: Deactivated successfully. Sep 9 05:42:28.009855 systemd[1]: session-25.scope: Deactivated successfully. Sep 9 05:42:28.014229 systemd-logind[1565]: Removed session 25. Sep 9 05:42:28.182745 systemd[1]: Started sshd@23-10.244.98.182:22-198.235.24.19:53650.service - OpenSSH per-connection server daemon (198.235.24.19:53650). Sep 9 05:42:28.249077 containerd[1587]: time="2025-09-09T05:42:28.248852409Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8826bee7e802d8dcda2ee6e878b598b128c2cb38c863c399d5bd455e0016b65\" id:\"f93814e64a7d74744b375ff4151dc68b115338ad1b7f71cc140d3ededa8a7687\" pid:5957 exited_at:{seconds:1757396548 nanos:247374295}" Sep 9 05:42:28.631388 sshd[5971]: Connection closed by 198.235.24.19 port 53650 Sep 9 05:42:28.632336 systemd[1]: sshd@23-10.244.98.182:22-198.235.24.19:53650.service: Deactivated successfully. Sep 9 05:42:31.612979 containerd[1587]: time="2025-09-09T05:42:31.612861578Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39cf40cf95d611472d19cc88886b48614270a7b19726836b5b1eba5919d5a5a\" id:\"ad0b2f592dc014610fcbf6ed1ca6c9c0f6d3c43ba9a947001cf33a107a60a7a5\" pid:5988 exited_at:{seconds:1757396551 nanos:612259542}"
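Across this section, the TaskExit events for container b39cf40… land at exited_at seconds 1757396461, 1757396491, 1757396521 and 1757396551, exactly 30 seconds apart to the second, which is consistent with a periodic exec such as a health probe, though the log alone does not say which probe. A quick standard-library check of the spacing:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at seconds for container b39cf40…, collected from the TaskExit records above.
	exits := []int64{1757396461, 1757396491, 1757396521, 1757396551}
	for i := 1; i < len(exits); i++ {
		gap := time.Unix(exits[i], 0).Sub(time.Unix(exits[i-1], 0))
		fmt.Printf("exec %d to %d: %v apart\n", i-1, i, gap) // 30s each time
	}
}
```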