Sep 9 03:28:12.039292 kernel: Linux version 6.6.104-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Sep 8 22:41:17 -00 2025
Sep 9 03:28:12.039328 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=99a67175ee6aabbc03a22dabcade16d60ad192b31c4118a259bf1f24bbfa2d29
Sep 9 03:28:12.039342 kernel: BIOS-provided physical RAM map:
Sep 9 03:28:12.039358 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 9 03:28:12.039367 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 9 03:28:12.039377 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 9 03:28:12.039389 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Sep 9 03:28:12.039400 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Sep 9 03:28:12.039410 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 9 03:28:12.039420 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 9 03:28:12.039431 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 9 03:28:12.039441 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 9 03:28:12.039456 kernel: NX (Execute Disable) protection: active
Sep 9 03:28:12.039467 kernel: APIC: Static calls initialized
Sep 9 03:28:12.039479 kernel: SMBIOS 2.8 present.
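The BIOS-e820 entries above list the firmware-reported physical memory ranges; summing the `usable` ones gives the RAM the kernel can manage. A minimal sketch of such a tally (hypothetical helper, not part of the boot log; the embedded sample uses three of the ranges above, with timestamps omitted):

```python
import re

# Matches lines like: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
E820_RE = re.compile(r"BIOS-e820: \[mem (0x[0-9a-f]+)-(0x[0-9a-f]+)\] (\w+)")

def usable_bytes(log: str) -> int:
    """Total bytes of ranges the firmware marked 'usable'."""
    total = 0
    for start, end, kind in E820_RE.findall(log):
        if kind == "usable":
            # e820 ranges are inclusive on both ends, hence the +1.
            total += int(end, 16) - int(start, 16) + 1
    return total

sample = """\
BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
"""
print(usable_bytes(sample) // 1024)  # usable RAM in KiB
```

For this map the sum comes out close to (not exactly equal to) the `2096616K` total that the kernel's later `Memory:` line reports, since the kernel additionally reserves the first page and other small regions.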
Sep 9 03:28:12.039491 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Sep 9 03:28:12.039503 kernel: Hypervisor detected: KVM
Sep 9 03:28:12.039518 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 9 03:28:12.039530 kernel: kvm-clock: using sched offset of 4484908231 cycles
Sep 9 03:28:12.039542 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 9 03:28:12.039554 kernel: tsc: Detected 2499.998 MHz processor
Sep 9 03:28:12.039566 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 9 03:28:12.039578 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 9 03:28:12.039589 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Sep 9 03:28:12.039600 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 9 03:28:12.039612 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 9 03:28:12.039628 kernel: Using GB pages for direct mapping
Sep 9 03:28:12.039639 kernel: ACPI: Early table checksum verification disabled
Sep 9 03:28:12.039651 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Sep 9 03:28:12.039662 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 03:28:12.039674 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 03:28:12.039685 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 03:28:12.039697 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Sep 9 03:28:12.039708 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 03:28:12.039720 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 03:28:12.039735 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 03:28:12.039747 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001)
Sep 9 03:28:12.039759 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Sep 9 03:28:12.039770 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Sep 9 03:28:12.039782 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Sep 9 03:28:12.039800 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Sep 9 03:28:12.039812 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Sep 9 03:28:12.039828 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Sep 9 03:28:12.039840 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Sep 9 03:28:12.039852 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Sep 9 03:28:12.039864 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Sep 9 03:28:12.039876 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Sep 9 03:28:12.039888 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Sep 9 03:28:12.039900 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Sep 9 03:28:12.039916 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Sep 9 03:28:12.043402 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Sep 9 03:28:12.043432 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Sep 9 03:28:12.043454 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Sep 9 03:28:12.043467 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Sep 9 03:28:12.043479 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Sep 9 03:28:12.043497 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Sep 9 03:28:12.043509 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Sep 9 03:28:12.043521 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Sep 9 03:28:12.043533 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Sep 9 03:28:12.043558 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Sep 9 03:28:12.043572 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 9 03:28:12.043584 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Sep 9 03:28:12.043597 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Sep 9 03:28:12.043609 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Sep 9 03:28:12.043621 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Sep 9 03:28:12.043634 kernel: Zone ranges:
Sep 9 03:28:12.043646 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 9 03:28:12.043658 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Sep 9 03:28:12.043675 kernel: Normal empty
Sep 9 03:28:12.043688 kernel: Movable zone start for each node
Sep 9 03:28:12.043700 kernel: Early memory node ranges
Sep 9 03:28:12.043712 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 9 03:28:12.043724 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Sep 9 03:28:12.043736 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Sep 9 03:28:12.043748 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 9 03:28:12.043760 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 9 03:28:12.043772 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Sep 9 03:28:12.043784 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 9 03:28:12.043801 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 9 03:28:12.043813 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 9 03:28:12.043826 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 9 03:28:12.043838 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 9 03:28:12.043850 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 9 03:28:12.043862 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 9 03:28:12.043874 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 9 03:28:12.043886 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 9 03:28:12.043898 kernel: TSC deadline timer available
Sep 9 03:28:12.043915 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Sep 9 03:28:12.043962 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 9 03:28:12.043977 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 9 03:28:12.043989 kernel: Booting paravirtualized kernel on KVM
Sep 9 03:28:12.044001 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 9 03:28:12.044014 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Sep 9 03:28:12.044026 kernel: percpu: Embedded 58 pages/cpu s197160 r8192 d32216 u262144
Sep 9 03:28:12.044038 kernel: pcpu-alloc: s197160 r8192 d32216 u262144 alloc=1*2097152
Sep 9 03:28:12.044050 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Sep 9 03:28:12.044068 kernel: kvm-guest: PV spinlocks enabled
Sep 9 03:28:12.044081 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 9 03:28:12.044095 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=99a67175ee6aabbc03a22dabcade16d60ad192b31c4118a259bf1f24bbfa2d29
Sep 9 03:28:12.044108 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 03:28:12.044119 kernel: random: crng init done
Sep 9 03:28:12.044131 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 03:28:12.044144 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 9 03:28:12.044156 kernel: Fallback order for Node 0: 0
Sep 9 03:28:12.044173 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Sep 9 03:28:12.044185 kernel: Policy zone: DMA32
Sep 9 03:28:12.044197 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 03:28:12.044209 kernel: software IO TLB: area num 16.
Sep 9 03:28:12.044222 kernel: Memory: 1901532K/2096616K available (12288K kernel code, 2293K rwdata, 22744K rodata, 42880K init, 2316K bss, 194824K reserved, 0K cma-reserved)
Sep 9 03:28:12.044234 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Sep 9 03:28:12.044246 kernel: Kernel/User page tables isolation: enabled
Sep 9 03:28:12.044258 kernel: ftrace: allocating 37969 entries in 149 pages
Sep 9 03:28:12.044270 kernel: ftrace: allocated 149 pages with 4 groups
Sep 9 03:28:12.044287 kernel: Dynamic Preempt: voluntary
Sep 9 03:28:12.044299 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 03:28:12.044312 kernel: rcu: RCU event tracing is enabled.
Sep 9 03:28:12.044325 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Sep 9 03:28:12.044338 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 03:28:12.044362 kernel: Rude variant of Tasks RCU enabled.
Sep 9 03:28:12.044379 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 03:28:12.044392 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 03:28:12.044405 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Sep 9 03:28:12.044417 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Sep 9 03:28:12.044430 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 03:28:12.044443 kernel: Console: colour VGA+ 80x25
Sep 9 03:28:12.044460 kernel: printk: console [tty0] enabled
Sep 9 03:28:12.044473 kernel: printk: console [ttyS0] enabled
Sep 9 03:28:12.044486 kernel: ACPI: Core revision 20230628
Sep 9 03:28:12.044499 kernel: APIC: Switch to symmetric I/O mode setup
Sep 9 03:28:12.044511 kernel: x2apic enabled
Sep 9 03:28:12.044529 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 9 03:28:12.044542 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Sep 9 03:28:12.044569 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Sep 9 03:28:12.044583 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 9 03:28:12.044596 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Sep 9 03:28:12.044614 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Sep 9 03:28:12.044632 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 9 03:28:12.044645 kernel: Spectre V2 : Mitigation: Retpolines
Sep 9 03:28:12.044658 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 9 03:28:12.044683 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Sep 9 03:28:12.044696 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 9 03:28:12.044709 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 9 03:28:12.044722 kernel: MDS: Mitigation: Clear CPU buffers
Sep 9 03:28:12.044734 kernel: MMIO Stale Data: Unknown: No mitigations
Sep 9 03:28:12.044747 kernel: SRBDS: Unknown: Dependent on hypervisor status
Sep 9 03:28:12.044759 kernel: active return thunk: its_return_thunk
Sep 9 03:28:12.044772 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 9 03:28:12.044785 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 9 03:28:12.044797 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 9 03:28:12.044810 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 9 03:28:12.044827 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 9 03:28:12.044840 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Sep 9 03:28:12.044853 kernel: Freeing SMP alternatives memory: 32K
Sep 9 03:28:12.044865 kernel: pid_max: default: 32768 minimum: 301
Sep 9 03:28:12.044878 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 9 03:28:12.044891 kernel: landlock: Up and running.
Sep 9 03:28:12.044903 kernel: SELinux: Initializing.
Sep 9 03:28:12.044916 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 9 03:28:12.044956 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 9 03:28:12.044971 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Sep 9 03:28:12.044984 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 9 03:28:12.045003 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 9 03:28:12.045016 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 9 03:28:12.045029 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Sep 9 03:28:12.045042 kernel: signal: max sigframe size: 1776
Sep 9 03:28:12.045055 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 03:28:12.045068 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 03:28:12.045081 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 9 03:28:12.045094 kernel: smp: Bringing up secondary CPUs ...
Sep 9 03:28:12.045106 kernel: smpboot: x86: Booting SMP configuration:
Sep 9 03:28:12.045124 kernel: .... node #0, CPUs: #1
Sep 9 03:28:12.045137 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Sep 9 03:28:12.045150 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 03:28:12.045162 kernel: smpboot: Max logical packages: 16
Sep 9 03:28:12.045175 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Sep 9 03:28:12.045188 kernel: devtmpfs: initialized
Sep 9 03:28:12.045201 kernel: x86/mm: Memory block size: 128MB
Sep 9 03:28:12.045214 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 03:28:12.045227 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Sep 9 03:28:12.045244 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 03:28:12.045257 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 03:28:12.045270 kernel: audit: initializing netlink subsys (disabled)
Sep 9 03:28:12.045283 kernel: audit: type=2000 audit(1757388490.502:1): state=initialized audit_enabled=0 res=1
Sep 9 03:28:12.045295 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 03:28:12.045308 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 9 03:28:12.045321 kernel: cpuidle: using governor menu
Sep 9 03:28:12.045334 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 03:28:12.045346 kernel: dca service started, version 1.12.1
Sep 9 03:28:12.045364 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Sep 9 03:28:12.045377 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Sep 9 03:28:12.045390 kernel: PCI: Using configuration type 1 for base access
Sep 9 03:28:12.045403 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 9 03:28:12.045416 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 03:28:12.045429 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 03:28:12.045441 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 03:28:12.045454 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 03:28:12.045467 kernel: ACPI: Added _OSI(Module Device)
Sep 9 03:28:12.045484 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 03:28:12.045497 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 03:28:12.045510 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 03:28:12.045523 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 9 03:28:12.045536 kernel: ACPI: Interpreter enabled
Sep 9 03:28:12.045548 kernel: ACPI: PM: (supports S0 S5)
Sep 9 03:28:12.045561 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 9 03:28:12.045574 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 9 03:28:12.045587 kernel: PCI: Using E820 reservations for host bridge windows
Sep 9 03:28:12.045604 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 9 03:28:12.045617 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 03:28:12.045910 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 03:28:12.046163 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 9 03:28:12.046338 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 9 03:28:12.046358 kernel: PCI host bridge to bus 0000:00
Sep 9 03:28:12.046553 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 9 03:28:12.046723 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 9 03:28:12.046881 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 9 03:28:12.047088 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Sep 9 03:28:12.047248 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 9 03:28:12.047416 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Sep 9 03:28:12.047574 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 03:28:12.047821 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Sep 9 03:28:12.048099 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Sep 9 03:28:12.048288 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Sep 9 03:28:12.048473 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Sep 9 03:28:12.048647 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Sep 9 03:28:12.048853 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 9 03:28:12.052778 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 9 03:28:12.053011 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Sep 9 03:28:12.053229 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 9 03:28:12.053407 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Sep 9 03:28:12.053606 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 9 03:28:12.053778 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Sep 9 03:28:12.056010 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 9 03:28:12.056191 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Sep 9 03:28:12.056410 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 9 03:28:12.056582 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Sep 9 03:28:12.056779 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 9 03:28:12.060996 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Sep 9 03:28:12.061218 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 9 03:28:12.061408 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Sep 9 03:28:12.061626 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 9 03:28:12.061803 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Sep 9 03:28:12.062046 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Sep 9 03:28:12.062227 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Sep 9 03:28:12.062402 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Sep 9 03:28:12.062578 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 9 03:28:12.062759 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Sep 9 03:28:12.065430 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Sep 9 03:28:12.065621 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Sep 9 03:28:12.065798 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Sep 9 03:28:12.066011 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Sep 9 03:28:12.066218 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Sep 9 03:28:12.066391 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 9 03:28:12.066598 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Sep 9 03:28:12.066772 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Sep 9 03:28:12.067982 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Sep 9 03:28:12.068194 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Sep 9 03:28:12.068368 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Sep 9 03:28:12.068606 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Sep 9 03:28:12.068787 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Sep 9 03:28:12.071008 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Sep 9 03:28:12.071185 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Sep 9 03:28:12.071355 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 9 03:28:12.071557 kernel: pci_bus 0000:02: extended config space not accessible
Sep 9 03:28:12.071761 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Sep 9 03:28:12.071985 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Sep 9 03:28:12.072165 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Sep 9 03:28:12.072340 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 9 03:28:12.072560 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 9 03:28:12.072738 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Sep 9 03:28:12.072915 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Sep 9 03:28:12.073113 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 9 03:28:12.073283 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 9 03:28:12.073500 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 9 03:28:12.073683 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 9 03:28:12.073856 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Sep 9 03:28:12.077169 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 9 03:28:12.077353 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 9 03:28:12.077528 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Sep 9 03:28:12.077709 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 9 03:28:12.077910 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 9 03:28:12.078124 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Sep 9 03:28:12.078328 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 9 03:28:12.078522 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 9 03:28:12.078724 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Sep 9 03:28:12.078909 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 9 03:28:12.080192 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 9 03:28:12.080401 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Sep 9 03:28:12.080593 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Sep 9 03:28:12.080793 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 9 03:28:12.081039 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Sep 9 03:28:12.081218 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 9 03:28:12.081392 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 9 03:28:12.081413 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 9 03:28:12.081427 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 9 03:28:12.081440 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 9 03:28:12.081453 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 9 03:28:12.081474 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 9 03:28:12.081487 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 9 03:28:12.081500 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 9 03:28:12.081513 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 9 03:28:12.081526 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 9 03:28:12.081539 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 9 03:28:12.081552 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 9 03:28:12.081565 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 9 03:28:12.081578 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 9 03:28:12.081596 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 9 03:28:12.081609 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 9 03:28:12.081622 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 9 03:28:12.081635 kernel: iommu: Default domain type: Translated
Sep 9 03:28:12.081648 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 9 03:28:12.081661 kernel: PCI: Using ACPI for IRQ routing
Sep 9 03:28:12.081674 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 9 03:28:12.081699 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 9 03:28:12.081712 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Sep 9 03:28:12.081906 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 9 03:28:12.082131 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 9 03:28:12.082304 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 9 03:28:12.082324 kernel: vgaarb: loaded
Sep 9 03:28:12.082338 kernel: clocksource: Switched to clocksource kvm-clock
Sep 9 03:28:12.082351 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 03:28:12.082364 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 03:28:12.082377 kernel: pnp: PnP ACPI init
Sep 9 03:28:12.082581 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 9 03:28:12.082604 kernel: pnp: PnP ACPI: found 5 devices
Sep 9 03:28:12.082617 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 9 03:28:12.082630 kernel: NET: Registered PF_INET protocol family
Sep 9 03:28:12.082643 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 03:28:12.082656 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 9 03:28:12.082669 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 03:28:12.082682 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 9 03:28:12.082702 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 9 03:28:12.082715 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 9 03:28:12.082729 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 9 03:28:12.082742 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 9 03:28:12.082755 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 03:28:12.082768 kernel: NET: Registered PF_XDP protocol family
Sep 9 03:28:12.083017 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Sep 9 03:28:12.083196 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 9 03:28:12.083375 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 9 03:28:12.083546 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 9 03:28:12.083717 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 9 03:28:12.083887 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 9 03:28:12.084086 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 9 03:28:12.084273 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 9 03:28:12.084465 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Sep 9 03:28:12.084637 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Sep 9 03:28:12.084809 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Sep 9 03:28:12.085023 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Sep 9 03:28:12.085197 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Sep 9 03:28:12.085378 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Sep 9 03:28:12.085557 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Sep 9 03:28:12.085728 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Sep 9 03:28:12.085935 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Sep 9 03:28:12.086153 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 9 03:28:12.086323 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Sep 9 03:28:12.086492 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 9 03:28:12.086666 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Sep 9 03:28:12.086838 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 9 03:28:12.087059 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Sep 9 03:28:12.087235 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 9 03:28:12.087415 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 9 03:28:12.087594 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 9 03:28:12.087776 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Sep 9 03:28:12.088042 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 9 03:28:12.088214 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 9 03:28:12.088393 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 9 03:28:12.088570 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Sep 9 03:28:12.088739 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 9 03:28:12.088911 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 9 03:28:12.089119 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 9 03:28:12.089290 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Sep 9 03:28:12.089459 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 9 03:28:12.089629 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 9 03:28:12.089799 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 9 03:28:12.090021 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Sep 9 03:28:12.090200 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 9 03:28:12.090370 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 9 03:28:12.090539 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 9 03:28:12.090708 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Sep 9 03:28:12.090877 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 9 03:28:12.091081 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Sep 9 03:28:12.091252 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 9 03:28:12.091422 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Sep 9 03:28:12.091591 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 9 03:28:12.091765 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 9 03:28:12.091971 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 9 03:28:12.092142 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 9 03:28:12.092301 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 9 03:28:12.092457 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 9 03:28:12.092623 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Sep 9 03:28:12.092779 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 9 03:28:12.092984 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Sep 9 03:28:12.093173 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 9 03:28:12.093338 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Sep 9 03:28:12.093500 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 9 03:28:12.093672 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 9 03:28:12.093860 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Sep 9 03:28:12.094089 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 9 03:28:12.094252 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 9 03:28:12.094434 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Sep 9 03:28:12.094600 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 9 03:28:12.094763 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 9 03:28:12.095012 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Sep 9 03:28:12.095179 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 9 03:28:12.095339 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 9 03:28:12.095531 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Sep 9 03:28:12.095716 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 9 03:28:12.095877 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 9 03:28:12.096106 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Sep 9 03:28:12.096279 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 9 03:28:12.096444 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 9 03:28:12.096617 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Sep 9 03:28:12.096780 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 9 03:28:12.096996 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 9 03:28:12.097188 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Sep 9 03:28:12.097351 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 9 03:28:12.097519 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 9 03:28:12.097540 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 9 03:28:12.097555 kernel: PCI: CLS 0 bytes, default 64
Sep 9 03:28:12.097569 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 9 03:28:12.097582 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB)
Sep 9 03:28:12.097608 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Sep 9 03:28:12.097622 kernel: clocksource: tsc: mask: 0xffffffffffffffff
max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns Sep 9 03:28:12.097635 kernel: Initialise system trusted keyrings Sep 9 03:28:12.097654 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 9 03:28:12.097684 kernel: Key type asymmetric registered Sep 9 03:28:12.097696 kernel: Asymmetric key parser 'x509' registered Sep 9 03:28:12.097709 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Sep 9 03:28:12.097722 kernel: io scheduler mq-deadline registered Sep 9 03:28:12.097747 kernel: io scheduler kyber registered Sep 9 03:28:12.097760 kernel: io scheduler bfq registered Sep 9 03:28:12.097972 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 9 03:28:12.098152 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 9 03:28:12.098334 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 03:28:12.098507 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 9 03:28:12.098683 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 9 03:28:12.098887 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 03:28:12.099114 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 9 03:28:12.099287 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 9 03:28:12.099466 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 03:28:12.099648 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 9 03:28:12.099836 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Sep 9 03:28:12.100045 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 03:28:12.100222 kernel: pcieport 0000:00:02.4: PME: Signaling 
with IRQ 28 Sep 9 03:28:12.100394 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 9 03:28:12.100575 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 03:28:12.100749 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 9 03:28:12.100985 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 9 03:28:12.101165 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 03:28:12.101338 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 9 03:28:12.101508 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 9 03:28:12.101709 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 03:28:12.101888 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 9 03:28:12.102098 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Sep 9 03:28:12.102296 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 03:28:12.102318 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 9 03:28:12.102333 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 9 03:28:12.102354 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 9 03:28:12.102368 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 03:28:12.102382 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 9 03:28:12.102396 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 9 03:28:12.102409 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 9 03:28:12.102423 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 9 03:28:12.102606 kernel: rtc_cmos 00:03: RTC can wake from S4 Sep 9 03:28:12.102629 kernel: input: AT 
Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 9 03:28:12.102788 kernel: rtc_cmos 00:03: registered as rtc0 Sep 9 03:28:12.103013 kernel: rtc_cmos 00:03: setting system clock to 2025-09-09T03:28:11 UTC (1757388491) Sep 9 03:28:12.103178 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Sep 9 03:28:12.103198 kernel: intel_pstate: CPU model not supported Sep 9 03:28:12.103212 kernel: NET: Registered PF_INET6 protocol family Sep 9 03:28:12.103226 kernel: Segment Routing with IPv6 Sep 9 03:28:12.103251 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 03:28:12.103264 kernel: NET: Registered PF_PACKET protocol family Sep 9 03:28:12.103277 kernel: Key type dns_resolver registered Sep 9 03:28:12.103296 kernel: IPI shorthand broadcast: enabled Sep 9 03:28:12.103323 kernel: sched_clock: Marking stable (1275004604, 241375343)->(1642026946, -125646999) Sep 9 03:28:12.103336 kernel: registered taskstats version 1 Sep 9 03:28:12.103350 kernel: Loading compiled-in X.509 certificates Sep 9 03:28:12.103364 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.104-flatcar: cc5240ef94b546331b2896cdc739274c03278c51' Sep 9 03:28:12.103377 kernel: Key type .fscrypt registered Sep 9 03:28:12.103390 kernel: Key type fscrypt-provisioning registered Sep 9 03:28:12.103404 kernel: ima: No TPM chip found, activating TPM-bypass! 
Sep 9 03:28:12.103417 kernel: ima: Allocated hash algorithm: sha1 Sep 9 03:28:12.103436 kernel: ima: No architecture policies found Sep 9 03:28:12.103454 kernel: clk: Disabling unused clocks Sep 9 03:28:12.103467 kernel: Freeing unused kernel image (initmem) memory: 42880K Sep 9 03:28:12.103481 kernel: Write protecting the kernel read-only data: 36864k Sep 9 03:28:12.103494 kernel: Freeing unused kernel image (rodata/data gap) memory: 1832K Sep 9 03:28:12.103508 kernel: Run /init as init process Sep 9 03:28:12.103521 kernel: with arguments: Sep 9 03:28:12.103535 kernel: /init Sep 9 03:28:12.103548 kernel: with environment: Sep 9 03:28:12.103566 kernel: HOME=/ Sep 9 03:28:12.103579 kernel: TERM=linux Sep 9 03:28:12.103592 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 03:28:12.103609 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 9 03:28:12.103626 systemd[1]: Detected virtualization kvm. Sep 9 03:28:12.103640 systemd[1]: Detected architecture x86-64. Sep 9 03:28:12.103654 systemd[1]: Running in initrd. Sep 9 03:28:12.103668 systemd[1]: No hostname configured, using default hostname. Sep 9 03:28:12.103687 systemd[1]: Hostname set to . Sep 9 03:28:12.103702 systemd[1]: Initializing machine ID from VM UUID. Sep 9 03:28:12.103716 systemd[1]: Queued start job for default target initrd.target. Sep 9 03:28:12.103730 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 03:28:12.103744 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 03:28:12.103759 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Sep 9 03:28:12.103774 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 03:28:12.103788 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 9 03:28:12.103808 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 9 03:28:12.103824 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 9 03:28:12.103839 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 9 03:28:12.103853 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 03:28:12.103868 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 03:28:12.103882 systemd[1]: Reached target paths.target - Path Units. Sep 9 03:28:12.103896 systemd[1]: Reached target slices.target - Slice Units. Sep 9 03:28:12.103915 systemd[1]: Reached target swap.target - Swaps. Sep 9 03:28:12.103930 systemd[1]: Reached target timers.target - Timer Units. Sep 9 03:28:12.103979 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 03:28:12.103995 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 03:28:12.104010 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 9 03:28:12.104025 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Sep 9 03:28:12.104039 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 03:28:12.104054 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 03:28:12.104075 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 03:28:12.104089 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 03:28:12.104104 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... 
Sep 9 03:28:12.104118 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 03:28:12.104132 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 9 03:28:12.104147 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 03:28:12.104161 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 03:28:12.104176 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 03:28:12.104190 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 03:28:12.104209 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 03:28:12.104224 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 03:28:12.104239 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 03:28:12.104295 systemd-journald[202]: Collecting audit messages is disabled. Sep 9 03:28:12.104333 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 03:28:12.104349 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 03:28:12.104363 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 9 03:28:12.104377 kernel: Bridge firewalling registered Sep 9 03:28:12.104397 systemd-journald[202]: Journal started Sep 9 03:28:12.104423 systemd-journald[202]: Runtime Journal (/run/log/journal/af0dee13c5c641a59b550b8b79f263bb) is 4.7M, max 38.0M, 33.2M free. Sep 9 03:28:12.040002 systemd-modules-load[203]: Inserted module 'overlay' Sep 9 03:28:12.145169 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 03:28:12.078659 systemd-modules-load[203]: Inserted module 'br_netfilter' Sep 9 03:28:12.147351 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Sep 9 03:28:12.148457 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 03:28:12.162179 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 03:28:12.164126 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 03:28:12.168122 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 03:28:12.180495 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 03:28:12.194288 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 03:28:12.197294 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 03:28:12.199009 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 03:28:12.203501 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 03:28:12.210165 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 03:28:12.214104 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 03:28:12.229726 dracut-cmdline[236]: dracut-dracut-053 Sep 9 03:28:12.236289 dracut-cmdline[236]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=99a67175ee6aabbc03a22dabcade16d60ad192b31c4118a259bf1f24bbfa2d29 Sep 9 03:28:12.264768 systemd-resolved[238]: Positive Trust Anchors: Sep 9 03:28:12.264790 systemd-resolved[238]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 03:28:12.264834 systemd-resolved[238]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 03:28:12.270334 systemd-resolved[238]: Defaulting to hostname 'linux'. Sep 9 03:28:12.272136 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 03:28:12.273315 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 03:28:12.349078 kernel: SCSI subsystem initialized Sep 9 03:28:12.360978 kernel: Loading iSCSI transport class v2.0-870. Sep 9 03:28:12.374967 kernel: iscsi: registered transport (tcp) Sep 9 03:28:12.401386 kernel: iscsi: registered transport (qla4xxx) Sep 9 03:28:12.401478 kernel: QLogic iSCSI HBA Driver Sep 9 03:28:12.457760 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 03:28:12.470200 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 03:28:12.501021 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 9 03:28:12.501119 kernel: device-mapper: uevent: version 1.0.3 Sep 9 03:28:12.504990 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Sep 9 03:28:12.551999 kernel: raid6: sse2x4 gen() 13930 MB/s Sep 9 03:28:12.569979 kernel: raid6: sse2x2 gen() 9391 MB/s Sep 9 03:28:12.588685 kernel: raid6: sse2x1 gen() 10272 MB/s Sep 9 03:28:12.588769 kernel: raid6: using algorithm sse2x4 gen() 13930 MB/s Sep 9 03:28:12.607730 kernel: raid6: .... xor() 7720 MB/s, rmw enabled Sep 9 03:28:12.607816 kernel: raid6: using ssse3x2 recovery algorithm Sep 9 03:28:12.634000 kernel: xor: automatically using best checksumming function avx Sep 9 03:28:12.832983 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 03:28:12.847730 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 03:28:12.855159 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 03:28:12.884750 systemd-udevd[421]: Using default interface naming scheme 'v255'. Sep 9 03:28:12.892726 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 03:28:12.901159 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 03:28:12.923330 dracut-pre-trigger[426]: rd.md=0: removing MD RAID activation Sep 9 03:28:12.964476 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 03:28:12.971197 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 03:28:13.084325 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 03:28:13.096057 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 03:28:13.129399 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 03:28:13.137587 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Sep 9 03:28:13.138413 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 03:28:13.142415 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 03:28:13.152207 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 03:28:13.174708 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 03:28:13.223089 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Sep 9 03:28:13.228968 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 03:28:13.242052 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Sep 9 03:28:13.252007 kernel: AVX version of gcm_enc/dec engaged. Sep 9 03:28:13.252052 kernel: AES CTR mode by8 optimization enabled Sep 9 03:28:13.265545 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 9 03:28:13.283207 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 03:28:13.283259 kernel: GPT:17805311 != 125829119 Sep 9 03:28:13.283286 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 03:28:13.283305 kernel: GPT:17805311 != 125829119 Sep 9 03:28:13.283322 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 03:28:13.283340 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 03:28:13.265750 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 03:28:13.282441 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 03:28:13.284181 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 03:28:13.284445 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 03:28:13.286880 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 03:28:13.310683 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 03:28:13.313358 kernel: libata version 3.00 loaded. 
Sep 9 03:28:13.330152 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/vda6 scanned by (udev-worker) (479) Sep 9 03:28:13.343851 kernel: BTRFS: device fsid 7cd16ef1-c91b-4e35-a9b3-a431b3c1949a devid 1 transid 36 /dev/vda3 scanned by (udev-worker) (466) Sep 9 03:28:13.354967 kernel: ACPI: bus type USB registered Sep 9 03:28:13.355022 kernel: usbcore: registered new interface driver usbfs Sep 9 03:28:13.355044 kernel: usbcore: registered new interface driver hub Sep 9 03:28:13.357804 kernel: usbcore: registered new device driver usb Sep 9 03:28:13.360641 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 9 03:28:13.401114 kernel: ahci 0000:00:1f.2: version 3.0 Sep 9 03:28:13.406635 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 9 03:28:13.406677 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Sep 9 03:28:13.406921 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 9 03:28:13.407170 kernel: scsi host0: ahci Sep 9 03:28:13.407383 kernel: scsi host1: ahci Sep 9 03:28:13.411499 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 9 03:28:13.414066 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Sep 9 03:28:13.414294 kernel: scsi host2: ahci Sep 9 03:28:13.414527 kernel: scsi host3: ahci Sep 9 03:28:13.414790 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 9 03:28:13.415049 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 9 03:28:13.416259 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Sep 9 03:28:13.416494 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Sep 9 03:28:13.416717 kernel: hub 1-0:1.0: USB hub found Sep 9 03:28:13.416981 kernel: hub 1-0:1.0: 4 ports detected Sep 9 03:28:13.417211 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Sep 9 03:28:13.417443 kernel: hub 2-0:1.0: USB hub found Sep 9 03:28:13.417668 kernel: hub 2-0:1.0: 4 ports detected Sep 9 03:28:13.417882 kernel: scsi host4: ahci Sep 9 03:28:13.411459 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 03:28:13.504494 kernel: scsi host5: ahci Sep 9 03:28:13.504761 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 Sep 9 03:28:13.504784 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 Sep 9 03:28:13.504803 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 Sep 9 03:28:13.504821 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 Sep 9 03:28:13.504839 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 Sep 9 03:28:13.504857 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 Sep 9 03:28:13.502762 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 03:28:13.509685 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 9 03:28:13.510555 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 9 03:28:13.529681 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 9 03:28:13.540201 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 03:28:13.544462 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 03:28:13.548209 disk-uuid[563]: Primary Header is updated. Sep 9 03:28:13.548209 disk-uuid[563]: Secondary Entries is updated. Sep 9 03:28:13.548209 disk-uuid[563]: Secondary Header is updated. 
Sep 9 03:28:13.555991 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 03:28:13.561969 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 03:28:13.572336 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 03:28:13.574634 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 03:28:13.653961 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 9 03:28:13.731449 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 9 03:28:13.731531 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 9 03:28:13.731950 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 9 03:28:13.734712 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 9 03:28:13.734996 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 9 03:28:13.738228 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 9 03:28:13.805968 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 9 03:28:13.812259 kernel: usbcore: registered new interface driver usbhid Sep 9 03:28:13.812300 kernel: usbhid: USB HID core driver Sep 9 03:28:13.820491 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Sep 9 03:28:13.820548 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Sep 9 03:28:14.572200 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 03:28:14.572715 disk-uuid[564]: The operation has completed successfully. Sep 9 03:28:14.631127 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 03:28:14.631284 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 03:28:14.650159 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Sep 9 03:28:14.662315 sh[586]: Success Sep 9 03:28:14.679111 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Sep 9 03:28:14.742638 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 03:28:14.752431 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 03:28:14.754363 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 03:28:14.783060 kernel: BTRFS info (device dm-0): first mount of filesystem 7cd16ef1-c91b-4e35-a9b3-a431b3c1949a Sep 9 03:28:14.783132 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 03:28:14.785201 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Sep 9 03:28:14.787431 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 03:28:14.789191 kernel: BTRFS info (device dm-0): using free space tree Sep 9 03:28:14.799422 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 03:28:14.800965 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 03:28:14.814188 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 03:28:14.819105 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 03:28:14.831083 kernel: BTRFS info (device vda6): first mount of filesystem a5263def-4663-4ce6-b873-45a7d7f1ec33 Sep 9 03:28:14.831135 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 03:28:14.833001 kernel: BTRFS info (device vda6): using free space tree Sep 9 03:28:14.839957 kernel: BTRFS info (device vda6): auto enabling async discard Sep 9 03:28:14.857006 kernel: BTRFS info (device vda6): last unmount of filesystem a5263def-4663-4ce6-b873-45a7d7f1ec33 Sep 9 03:28:14.857123 systemd[1]: mnt-oem.mount: Deactivated successfully. 
Sep 9 03:28:14.865558 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 03:28:14.874110 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 03:28:15.008459 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 03:28:15.019348 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 03:28:15.029826 ignition[680]: Ignition 2.19.0 Sep 9 03:28:15.029871 ignition[680]: Stage: fetch-offline Sep 9 03:28:15.029988 ignition[680]: no configs at "/usr/lib/ignition/base.d" Sep 9 03:28:15.030015 ignition[680]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 9 03:28:15.030183 ignition[680]: parsed url from cmdline: "" Sep 9 03:28:15.030199 ignition[680]: no config URL provided Sep 9 03:28:15.030209 ignition[680]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 03:28:15.030226 ignition[680]: no config at "/usr/lib/ignition/user.ign" Sep 9 03:28:15.036404 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 03:28:15.030235 ignition[680]: failed to fetch config: resource requires networking Sep 9 03:28:15.031458 ignition[680]: Ignition finished successfully Sep 9 03:28:15.057717 systemd-networkd[771]: lo: Link UP Sep 9 03:28:15.057742 systemd-networkd[771]: lo: Gained carrier Sep 9 03:28:15.060260 systemd-networkd[771]: Enumeration completed Sep 9 03:28:15.060827 systemd-networkd[771]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 03:28:15.060833 systemd-networkd[771]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 03:28:15.061115 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 03:28:15.062497 systemd[1]: Reached target network.target - Network. 
Sep 9 03:28:15.063194 systemd-networkd[771]: eth0: Link UP
Sep 9 03:28:15.063201 systemd-networkd[771]: eth0: Gained carrier
Sep 9 03:28:15.063213 systemd-networkd[771]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 03:28:15.073230 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 9 03:28:15.094049 systemd-networkd[771]: eth0: DHCPv4 address 10.244.20.50/30, gateway 10.244.20.49 acquired from 10.244.20.49
Sep 9 03:28:15.098725 ignition[774]: Ignition 2.19.0
Sep 9 03:28:15.098744 ignition[774]: Stage: fetch
Sep 9 03:28:15.099072 ignition[774]: no configs at "/usr/lib/ignition/base.d"
Sep 9 03:28:15.099094 ignition[774]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 9 03:28:15.099237 ignition[774]: parsed url from cmdline: ""
Sep 9 03:28:15.099244 ignition[774]: no config URL provided
Sep 9 03:28:15.099254 ignition[774]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 03:28:15.099271 ignition[774]: no config at "/usr/lib/ignition/user.ign"
Sep 9 03:28:15.099503 ignition[774]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Sep 9 03:28:15.099526 ignition[774]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Sep 9 03:28:15.099587 ignition[774]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Sep 9 03:28:15.119853 ignition[774]: GET result: OK
Sep 9 03:28:15.120199 ignition[774]: parsing config with SHA512: eec0e4a3e5e84779b02f5961d8e6fb6555d0e3c5b980c579a81065f72dfe686218a03c7fa9fbf15f7dc8404076d429a94f8eadc6469155b363cff66d1c1107f3
Sep 9 03:28:15.127916 unknown[774]: fetched base config from "system"
Sep 9 03:28:15.127953 unknown[774]: fetched base config from "system"
Sep 9 03:28:15.129436 ignition[774]: fetch: fetch complete
Sep 9 03:28:15.127964 unknown[774]: fetched user config from "openstack"
Sep 9 03:28:15.129447 ignition[774]: fetch: fetch passed
Sep 9 03:28:15.131722 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 9 03:28:15.129553 ignition[774]: Ignition finished successfully
Sep 9 03:28:15.144120 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 03:28:15.180064 ignition[781]: Ignition 2.19.0
Sep 9 03:28:15.180086 ignition[781]: Stage: kargs
Sep 9 03:28:15.180405 ignition[781]: no configs at "/usr/lib/ignition/base.d"
Sep 9 03:28:15.180427 ignition[781]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 9 03:28:15.185837 ignition[781]: kargs: kargs passed
Sep 9 03:28:15.186612 ignition[781]: Ignition finished successfully
Sep 9 03:28:15.188171 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 03:28:15.195218 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 03:28:15.223423 ignition[787]: Ignition 2.19.0
Sep 9 03:28:15.223453 ignition[787]: Stage: disks
Sep 9 03:28:15.223731 ignition[787]: no configs at "/usr/lib/ignition/base.d"
Sep 9 03:28:15.223754 ignition[787]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 9 03:28:15.225258 ignition[787]: disks: disks passed
Sep 9 03:28:15.227595 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 03:28:15.225342 ignition[787]: Ignition finished successfully
Sep 9 03:28:15.229959 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 03:28:15.231339 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 03:28:15.232834 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 03:28:15.234437 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 03:28:15.238094 systemd[1]: Reached target basic.target - Basic System.
Sep 9 03:28:15.246151 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 03:28:15.265970 systemd-fsck[796]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Sep 9 03:28:15.270003 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 03:28:15.279084 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 03:28:15.400961 kernel: EXT4-fs (vda9): mounted filesystem ee55a213-d578-493d-a79b-e10c399cd35c r/w with ordered data mode. Quota mode: none.
Sep 9 03:28:15.401662 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 03:28:15.403407 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 03:28:15.410136 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 03:28:15.423169 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 03:28:15.426774 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 9 03:28:15.429177 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Sep 9 03:28:15.430371 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 03:28:15.430416 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 03:28:15.436231 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 03:28:15.442119 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/vda6 scanned by mount (804)
Sep 9 03:28:15.442172 kernel: BTRFS info (device vda6): first mount of filesystem a5263def-4663-4ce6-b873-45a7d7f1ec33
Sep 9 03:28:15.442194 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 9 03:28:15.442211 kernel: BTRFS info (device vda6): using free space tree
Sep 9 03:28:15.447987 kernel: BTRFS info (device vda6): auto enabling async discard
Sep 9 03:28:15.449853 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 03:28:15.455130 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 03:28:15.537858 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 03:28:15.547675 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory
Sep 9 03:28:15.556962 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 03:28:15.564683 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 03:28:15.672217 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 03:28:15.679064 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 03:28:15.683125 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 03:28:15.695973 kernel: BTRFS info (device vda6): last unmount of filesystem a5263def-4663-4ce6-b873-45a7d7f1ec33
Sep 9 03:28:15.726259 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 03:28:15.732976 ignition[922]: INFO : Ignition 2.19.0
Sep 9 03:28:15.732976 ignition[922]: INFO : Stage: mount
Sep 9 03:28:15.732976 ignition[922]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 03:28:15.732976 ignition[922]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 9 03:28:15.737535 ignition[922]: INFO : mount: mount passed
Sep 9 03:28:15.737535 ignition[922]: INFO : Ignition finished successfully
Sep 9 03:28:15.736780 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 03:28:15.781115 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 03:28:16.810268 systemd-networkd[771]: eth0: Gained IPv6LL
Sep 9 03:28:17.515133 systemd-networkd[771]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:50c:24:19ff:fef4:1432/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:50c:24:19ff:fef4:1432/64 assigned by NDisc.
Sep 9 03:28:17.515153 systemd-networkd[771]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Sep 9 03:28:22.606228 coreos-metadata[806]: Sep 09 03:28:22.605 WARN failed to locate config-drive, using the metadata service API instead
Sep 9 03:28:22.629787 coreos-metadata[806]: Sep 09 03:28:22.629 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Sep 9 03:28:22.644307 coreos-metadata[806]: Sep 09 03:28:22.644 INFO Fetch successful
Sep 9 03:28:22.645502 coreos-metadata[806]: Sep 09 03:28:22.645 INFO wrote hostname srv-sy1m0.gb1.brightbox.com to /sysroot/etc/hostname
Sep 9 03:28:22.647230 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Sep 9 03:28:22.647400 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Sep 9 03:28:22.655051 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 03:28:22.678212 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 03:28:22.689951 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 scanned by mount (939)
Sep 9 03:28:22.690000 kernel: BTRFS info (device vda6): first mount of filesystem a5263def-4663-4ce6-b873-45a7d7f1ec33
Sep 9 03:28:22.692983 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 9 03:28:22.695889 kernel: BTRFS info (device vda6): using free space tree
Sep 9 03:28:22.699946 kernel: BTRFS info (device vda6): auto enabling async discard
Sep 9 03:28:22.703210 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 03:28:22.741104 ignition[957]: INFO : Ignition 2.19.0
Sep 9 03:28:22.741104 ignition[957]: INFO : Stage: files
Sep 9 03:28:22.742918 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 03:28:22.742918 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 9 03:28:22.742918 ignition[957]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 03:28:22.745805 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 03:28:22.745805 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 03:28:22.747946 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 03:28:22.747946 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 03:28:22.749916 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 03:28:22.748299 unknown[957]: wrote ssh authorized keys file for user: core
Sep 9 03:28:22.752112 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1"
Sep 9 03:28:22.752112 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1"
Sep 9 03:28:22.752112 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 9 03:28:22.752112 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Sep 9 03:28:23.084778 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK
Sep 9 03:28:24.144968 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 9 03:28:24.144968 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 03:28:24.144968 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 03:28:24.144968 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 9 03:28:24.158178 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Sep 9 03:28:24.843697 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK
Sep 9 03:28:26.476125 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 9 03:28:26.476125 ignition[957]: INFO : files: op(c): [started] processing unit "containerd.service"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(c): op(d): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(c): op(d): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(c): [finished] processing unit "containerd.service"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(e): [started] processing unit "prepare-helm.service"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(e): op(f): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(e): op(f): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(e): [finished] processing unit "prepare-helm.service"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(10): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: op(10): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: createResultFile: createFiles: op(11): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: createResultFile: createFiles: op(11): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 03:28:26.481802 ignition[957]: INFO : files: files passed
Sep 9 03:28:26.481802 ignition[957]: INFO : Ignition finished successfully
Sep 9 03:28:26.483236 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 03:28:26.494191 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 03:28:26.503127 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 03:28:26.511430 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 03:28:26.511601 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 03:28:26.521784 initrd-setup-root-after-ignition[985]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 03:28:26.523569 initrd-setup-root-after-ignition[985]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 03:28:26.524680 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 03:28:26.526875 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 03:28:26.528455 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 03:28:26.534131 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 03:28:26.594081 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 03:28:26.594279 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 03:28:26.596147 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 03:28:26.597529 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 03:28:26.599294 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 03:28:26.609055 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 03:28:26.625910 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 03:28:26.630165 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 03:28:26.648703 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 03:28:26.649664 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 03:28:26.651429 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 03:28:26.652966 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 03:28:26.653132 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 03:28:26.655124 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 03:28:26.656141 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 03:28:26.657647 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 03:28:26.659074 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 03:28:26.660461 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 03:28:26.662146 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 03:28:26.663797 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 03:28:26.665428 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 03:28:26.667109 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 03:28:26.668656 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 03:28:26.670151 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 03:28:26.670342 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 03:28:26.672187 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 03:28:26.673201 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 03:28:26.674647 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 03:28:26.677080 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 03:28:26.678362 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 03:28:26.678627 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 03:28:26.680478 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 03:28:26.680650 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 03:28:26.681806 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 03:28:26.682062 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 03:28:26.690255 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 03:28:26.691082 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 03:28:26.691329 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 03:28:26.703235 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 03:28:26.705806 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 03:28:26.706052 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 03:28:26.707236 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 03:28:26.707547 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 03:28:26.719946 ignition[1009]: INFO : Ignition 2.19.0
Sep 9 03:28:26.719946 ignition[1009]: INFO : Stage: umount
Sep 9 03:28:26.719946 ignition[1009]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 03:28:26.719946 ignition[1009]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 9 03:28:26.719232 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 03:28:26.725865 ignition[1009]: INFO : umount: umount passed
Sep 9 03:28:26.725865 ignition[1009]: INFO : Ignition finished successfully
Sep 9 03:28:26.719387 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 03:28:26.724000 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 03:28:26.724169 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 03:28:26.727751 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 03:28:26.727895 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 03:28:26.728635 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 03:28:26.728704 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 03:28:26.731099 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 9 03:28:26.731163 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 9 03:28:26.733053 systemd[1]: Stopped target network.target - Network.
Sep 9 03:28:26.733655 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 03:28:26.733742 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 03:28:26.735113 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 03:28:26.737580 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 03:28:26.741204 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 03:28:26.742305 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 03:28:26.742943 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 03:28:26.743627 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 03:28:26.745060 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 03:28:26.746477 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 03:28:26.746558 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 03:28:26.747870 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 03:28:26.747965 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 03:28:26.749592 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 03:28:26.749721 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 03:28:26.751677 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 03:28:26.754175 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 03:28:26.757183 systemd-networkd[771]: eth0: DHCPv6 lease lost
Sep 9 03:28:26.757321 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 03:28:26.758237 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 03:28:26.758377 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 03:28:26.761214 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 03:28:26.761372 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 03:28:26.764424 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 03:28:26.764604 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 03:28:26.770607 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 03:28:26.770744 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 03:28:26.771908 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 03:28:26.772048 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 03:28:26.779096 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 03:28:26.779977 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 03:28:26.780054 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 03:28:26.783891 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 03:28:26.783988 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 03:28:26.784698 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 03:28:26.784780 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 03:28:26.785551 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 03:28:26.785618 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 03:28:26.787400 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 03:28:26.799518 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 03:28:26.799819 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 03:28:26.802236 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 03:28:26.802373 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 03:28:26.804391 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 03:28:26.804505 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 03:28:26.806161 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 03:28:26.806222 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 03:28:26.807634 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 03:28:26.807720 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 03:28:26.809800 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 03:28:26.809869 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 03:28:26.811305 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 03:28:26.811372 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 03:28:26.818170 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 03:28:26.819050 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 03:28:26.819134 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 03:28:26.820780 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 03:28:26.820864 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 03:28:26.839077 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 03:28:26.839282 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 03:28:26.841605 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 03:28:26.846263 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 03:28:26.868182 systemd[1]: Switching root.
Sep 9 03:28:26.903245 systemd-journald[202]: Journal stopped
Sep 9 03:28:28.444132 systemd-journald[202]: Received SIGTERM from PID 1 (systemd).
Sep 9 03:28:28.444296 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 03:28:28.444343 kernel: SELinux: policy capability open_perms=1
Sep 9 03:28:28.444370 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 03:28:28.444396 kernel: SELinux: policy capability always_check_network=0
Sep 9 03:28:28.444416 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 03:28:28.444451 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 03:28:28.444480 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 03:28:28.444508 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 03:28:28.444536 kernel: audit: type=1403 audit(1757388507.207:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 03:28:28.444581 systemd[1]: Successfully loaded SELinux policy in 53.232ms.
Sep 9 03:28:28.444639 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 20.831ms.
Sep 9 03:28:28.444663 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 9 03:28:28.444695 systemd[1]: Detected virtualization kvm.
Sep 9 03:28:28.444727 systemd[1]: Detected architecture x86-64.
Sep 9 03:28:28.444762 systemd[1]: Detected first boot.
Sep 9 03:28:28.444789 systemd[1]: Hostname set to .
Sep 9 03:28:28.444811 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 03:28:28.444831 zram_generator::config[1071]: No configuration found.
Sep 9 03:28:28.444865 systemd[1]: Populated /etc with preset unit settings.
Sep 9 03:28:28.444887 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 03:28:28.444916 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 9 03:28:28.444966 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 03:28:28.445010 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 03:28:28.445039 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 03:28:28.445061 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 03:28:28.445087 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 03:28:28.445120 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 03:28:28.445148 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 03:28:28.445189 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 03:28:28.445211 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 03:28:28.445231 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 03:28:28.445267 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 03:28:28.445289 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 03:28:28.445316 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 03:28:28.445338 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 03:28:28.445358 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 9 03:28:28.445378 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 03:28:28.445398 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 03:28:28.445418 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 03:28:28.445458 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 03:28:28.445487 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 03:28:28.445518 systemd[1]: Reached target swap.target - Swaps.
Sep 9 03:28:28.445556 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 03:28:28.445576 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 03:28:28.445600 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 03:28:28.445655 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 9 03:28:28.445700 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 03:28:28.445723 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 03:28:28.445750 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 03:28:28.445778 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 03:28:28.445800 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 03:28:28.445820 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 03:28:28.445852 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 03:28:28.445881 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 03:28:28.445908 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 03:28:28.445949 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 03:28:28.445992 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 03:28:28.446014 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 03:28:28.446033 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 03:28:28.446059 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 03:28:28.446080 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 03:28:28.446112 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 03:28:28.446142 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 03:28:28.446162 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 03:28:28.446195 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 03:28:28.446215 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 03:28:28.446236 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 03:28:28.446256 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling.
Sep 9 03:28:28.446286 systemd[1]: systemd-journald.service: (This warning is only shown for the first unit using IP firewalling.)
Sep 9 03:28:28.446319 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 03:28:28.446347 kernel: fuse: init (API version 7.39)
Sep 9 03:28:28.446368 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 03:28:28.446389 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 03:28:28.446408 kernel: loop: module loaded
Sep 9 03:28:28.446428 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 03:28:28.446448 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 03:28:28.446482 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 03:28:28.446504 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 03:28:28.446537 kernel: ACPI: bus type drm_connector registered
Sep 9 03:28:28.446558 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 03:28:28.446585 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 03:28:28.446606 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 03:28:28.446626 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 03:28:28.446714 systemd-journald[1180]: Collecting audit messages is disabled.
Sep 9 03:28:28.446790 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 03:28:28.446828 systemd-journald[1180]: Journal started
Sep 9 03:28:28.446876 systemd-journald[1180]: Runtime Journal (/run/log/journal/af0dee13c5c641a59b550b8b79f263bb) is 4.7M, max 38.0M, 33.2M free.
Sep 9 03:28:28.448999 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 03:28:28.453070 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 03:28:28.455377 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 03:28:28.456962 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 03:28:28.457240 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 03:28:28.458692 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 03:28:28.459200 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 03:28:28.460514 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 03:28:28.460907 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 03:28:28.462331 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 03:28:28.462599 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 03:28:28.464234 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 03:28:28.464608 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 03:28:28.465922 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 03:28:28.466437 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 03:28:28.467827 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 03:28:28.469556 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 03:28:28.471014 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 03:28:28.484305 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 03:28:28.493099 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 03:28:28.496031 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 03:28:28.497862 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 03:28:28.509160 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 03:28:28.524120 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 03:28:28.525030 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 03:28:28.535114 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 03:28:28.536193 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 03:28:28.542110 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 03:28:28.546211 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 03:28:28.553287 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 03:28:28.556193 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 03:28:28.566297 systemd-journald[1180]: Time spent on flushing to /var/log/journal/af0dee13c5c641a59b550b8b79f263bb is 63.459ms for 1127 entries.
Sep 9 03:28:28.566297 systemd-journald[1180]: System Journal (/var/log/journal/af0dee13c5c641a59b550b8b79f263bb) is 8.0M, max 584.8M, 576.8M free.
Sep 9 03:28:28.644444 systemd-journald[1180]: Received client request to flush runtime journal.
Sep 9 03:28:28.580459 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 03:28:28.582845 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 03:28:28.617453 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 03:28:28.639811 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Sep 9 03:28:28.639831 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Sep 9 03:28:28.649431 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 03:28:28.652563 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 03:28:28.669202 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 03:28:28.738545 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 03:28:28.751007 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Sep 9 03:28:28.752764 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 03:28:28.762269 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 03:28:28.773001 udevadm[1244]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Sep 9 03:28:28.788235 systemd-tmpfiles[1247]: ACLs are not supported, ignoring.
Sep 9 03:28:28.788263 systemd-tmpfiles[1247]: ACLs are not supported, ignoring.
Sep 9 03:28:28.797523 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 03:28:29.301720 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 03:28:29.310179 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 03:28:29.346402 systemd-udevd[1253]: Using default interface naming scheme 'v255'.
Sep 9 03:28:29.374758 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 03:28:29.384103 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 03:28:29.417364 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 03:28:29.464673 systemd[1]: Found device dev-ttyS0.device - /dev/ttyS0.
Sep 9 03:28:29.491964 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1265)
Sep 9 03:28:29.518136 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 03:28:29.657354 systemd-networkd[1258]: lo: Link UP
Sep 9 03:28:29.657880 systemd-networkd[1258]: lo: Gained carrier
Sep 9 03:28:29.660599 systemd-networkd[1258]: Enumeration completed
Sep 9 03:28:29.661163 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 03:28:29.662067 systemd-networkd[1258]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 03:28:29.662181 systemd-networkd[1258]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 03:28:29.663024 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 9 03:28:29.667521 systemd-networkd[1258]: eth0: Link UP
Sep 9 03:28:29.667532 systemd-networkd[1258]: eth0: Gained carrier
Sep 9 03:28:29.667551 systemd-networkd[1258]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 03:28:29.671088 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 03:28:29.679961 kernel: ACPI: button: Power Button [PWRF]
Sep 9 03:28:29.693434 systemd-networkd[1258]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 03:28:29.717504 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 03:28:29.731269 systemd-networkd[1258]: eth0: DHCPv4 address 10.244.20.50/30, gateway 10.244.20.49 acquired from 10.244.20.49
Sep 9 03:28:29.733954 kernel: mousedev: PS/2 mouse device common for all mice
Sep 9 03:28:29.757054 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 9 03:28:29.757481 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Sep 9 03:28:29.763779 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Sep 9 03:28:29.764139 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 9 03:28:29.822224 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 03:28:30.001500 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 03:28:30.040403 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Sep 9 03:28:30.056210 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Sep 9 03:28:30.070974 lvm[1293]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 9 03:28:30.104275 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Sep 9 03:28:30.107041 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 03:28:30.114270 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Sep 9 03:28:30.123017 lvm[1296]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 9 03:28:30.156270 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Sep 9 03:28:30.157993 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 03:28:30.158991 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 03:28:30.159158 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 03:28:30.159982 systemd[1]: Reached target machines.target - Containers.
Sep 9 03:28:30.162447 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Sep 9 03:28:30.168139 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 03:28:30.171319 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 03:28:30.172366 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 03:28:30.174114 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 03:28:30.186098 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Sep 9 03:28:30.197146 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 03:28:30.201531 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 03:28:30.227971 kernel: loop0: detected capacity change from 0 to 8
Sep 9 03:28:30.233779 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 03:28:30.250988 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 03:28:30.258973 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 03:28:30.259902 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Sep 9 03:28:30.269050 kernel: loop1: detected capacity change from 0 to 221472
Sep 9 03:28:30.307605 kernel: loop2: detected capacity change from 0 to 140768
Sep 9 03:28:30.349025 kernel: loop3: detected capacity change from 0 to 142488
Sep 9 03:28:30.388899 kernel: loop4: detected capacity change from 0 to 8
Sep 9 03:28:30.394041 kernel: loop5: detected capacity change from 0 to 221472
Sep 9 03:28:30.420952 kernel: loop6: detected capacity change from 0 to 140768
Sep 9 03:28:30.449980 kernel: loop7: detected capacity change from 0 to 142488
Sep 9 03:28:30.465246 (sd-merge)[1318]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Sep 9 03:28:30.466116 (sd-merge)[1318]: Merged extensions into '/usr'.
Sep 9 03:28:30.489032 systemd[1]: Reloading requested from client PID 1304 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 03:28:30.489069 systemd[1]: Reloading...
Sep 9 03:28:30.585979 zram_generator::config[1352]: No configuration found.
Sep 9 03:28:30.699090 systemd-networkd[1258]: eth0: Gained IPv6LL
Sep 9 03:28:30.816856 ldconfig[1300]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 03:28:30.859780 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 9 03:28:30.953471 systemd[1]: Reloading finished in 463 ms.
Sep 9 03:28:30.980561 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 9 03:28:30.982332 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 03:28:30.983698 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 03:28:30.995240 systemd[1]: Starting ensure-sysext.service...
Sep 9 03:28:30.998333 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 03:28:31.021076 systemd[1]: Reloading requested from client PID 1411 ('systemctl') (unit ensure-sysext.service)...
Sep 9 03:28:31.021101 systemd[1]: Reloading...
Sep 9 03:28:31.045869 systemd-tmpfiles[1412]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 03:28:31.046566 systemd-tmpfiles[1412]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 03:28:31.048181 systemd-tmpfiles[1412]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 03:28:31.048666 systemd-tmpfiles[1412]: ACLs are not supported, ignoring.
Sep 9 03:28:31.048790 systemd-tmpfiles[1412]: ACLs are not supported, ignoring.
Sep 9 03:28:31.057331 systemd-tmpfiles[1412]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 03:28:31.057346 systemd-tmpfiles[1412]: Skipping /boot
Sep 9 03:28:31.077876 systemd-tmpfiles[1412]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 03:28:31.078070 systemd-tmpfiles[1412]: Skipping /boot
Sep 9 03:28:31.115002 zram_generator::config[1439]: No configuration found.
Sep 9 03:28:31.307214 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 9 03:28:31.399782 systemd[1]: Reloading finished in 378 ms.
Sep 9 03:28:31.429655 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 03:28:31.438303 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 9 03:28:31.444152 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 03:28:31.454114 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 03:28:31.459831 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 03:28:31.466125 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 03:28:31.486290 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 03:28:31.487126 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 03:28:31.492616 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 03:28:31.498735 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 03:28:31.512082 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 03:28:31.513896 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 03:28:31.514340 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 03:28:31.520038 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 03:28:31.521304 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 03:28:31.521548 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 03:28:31.521694 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 03:28:31.523718 systemd-networkd[1258]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:50c:24:19ff:fef4:1432/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:50c:24:19ff:fef4:1432/64 assigned by NDisc.
Sep 9 03:28:31.523730 systemd-networkd[1258]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Sep 9 03:28:31.535262 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 03:28:31.535870 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 03:28:31.550341 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 03:28:31.552293 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 03:28:31.552550 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 03:28:31.554433 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 03:28:31.563033 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 03:28:31.563556 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 03:28:31.566489 systemd[1]: Finished ensure-sysext.service.
Sep 9 03:28:31.567694 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 03:28:31.569014 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 03:28:31.570290 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 03:28:31.570523 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 03:28:31.578429 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 03:28:31.578648 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 03:28:31.590184 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 03:28:31.595368 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 03:28:31.595705 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 03:28:31.610836 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 03:28:31.626456 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 03:28:31.631703 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 03:28:31.636970 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 03:28:31.637725 augenrules[1545]: No rules
Sep 9 03:28:31.643034 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 9 03:28:31.672654 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 03:28:31.713068 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 03:28:31.714363 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 03:28:31.718187 systemd-resolved[1508]: Positive Trust Anchors:
Sep 9 03:28:31.718650 systemd-resolved[1508]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 03:28:31.718797 systemd-resolved[1508]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 03:28:31.725864 systemd-resolved[1508]: Using system hostname 'srv-sy1m0.gb1.brightbox.com'.
Sep 9 03:28:31.729178 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 03:28:31.730242 systemd[1]: Reached target network.target - Network.
Sep 9 03:28:31.731067 systemd[1]: Reached target network-online.target - Network is Online.
Sep 9 03:28:31.731963 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 03:28:31.732845 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 03:28:31.733874 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 03:28:31.734848 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 03:28:31.736107 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 03:28:31.737174 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 03:28:31.738159 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 03:28:31.739118 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 03:28:31.739268 systemd[1]: Reached target paths.target - Path Units.
Sep 9 03:28:31.740046 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 03:28:31.742061 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 03:28:31.745161 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 03:28:31.748167 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 03:28:31.749436 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 03:28:31.750219 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 03:28:31.750897 systemd[1]: Reached target basic.target - Basic System.
Sep 9 03:28:31.751886 systemd[1]: System is tainted: cgroupsv1
Sep 9 03:28:31.751964 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 03:28:31.752022 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 03:28:31.755040 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 03:28:31.759886 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 9 03:28:31.770138 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 03:28:31.777875 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 03:28:31.792151 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 03:28:31.792990 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 03:28:31.803541 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 03:28:31.811110 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 03:28:31.816954 jq[1564]: false
Sep 9 03:28:31.823771 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 9 03:28:31.819760 dbus-daemon[1561]: [system] SELinux support is enabled
Sep 9 03:28:31.824260 dbus-daemon[1561]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1258 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Sep 9 03:28:31.838178 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 03:28:31.850121 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found loop4
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found loop5
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found loop6
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found loop7
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found vda
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found vda1
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found vda2
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found vda3
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found usr
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found vda4
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found vda6
Sep 9 03:28:31.854861 extend-filesystems[1566]: Found vda7
Sep 9 03:28:31.927455 extend-filesystems[1566]: Found vda9
Sep 9 03:28:31.927455 extend-filesystems[1566]: Checking size of /dev/vda9
Sep 9 03:28:31.861103 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 03:28:31.934316 extend-filesystems[1566]: Resized partition /dev/vda9
Sep 9 03:28:31.882150 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 03:28:31.884736 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 03:28:31.893158 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 03:28:31.936221 update_engine[1588]: I20250909 03:28:31.919710 1588 main.cc:92] Flatcar Update Engine starting
Sep 9 03:28:31.936221 update_engine[1588]: I20250909 03:28:31.922808 1588 update_check_scheduler.cc:74] Next update check in 6m43s
Sep 9 03:28:31.904077 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 03:28:31.928680 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 03:28:31.938866 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 03:28:31.939267 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 03:28:31.947192 extend-filesystems[1599]: resize2fs 1.47.1 (20-May-2024)
Sep 9 03:28:31.958090 jq[1589]: true
Sep 9 03:28:31.978089 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks
Sep 9 03:28:31.974460 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 03:28:31.974911 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 03:28:31.986149 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 9 03:28:31.987776 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 03:28:31.988779 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 03:28:32.506819 systemd-resolved[1508]: Clock change detected. Flushing caches.
Sep 9 03:28:32.511893 systemd-timesyncd[1536]: Contacted time server 162.159.200.123:123 (0.flatcar.pool.ntp.org).
Sep 9 03:28:32.512005 systemd-timesyncd[1536]: Initial clock synchronization to Tue 2025-09-09 03:28:32.505998 UTC.
Sep 9 03:28:32.520539 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1264)
Sep 9 03:28:32.544302 (ntainerd)[1609]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 03:28:32.549194 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 03:28:32.549880 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 03:28:32.553061 tar[1603]: linux-amd64/helm Sep 9 03:28:32.552214 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 03:28:32.551706 dbus-daemon[1561]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 9 03:28:32.552246 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 03:28:32.555305 systemd[1]: Started update-engine.service - Update Engine. Sep 9 03:28:32.571709 jq[1608]: true Sep 9 03:28:32.559036 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 03:28:32.575218 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 03:28:32.654260 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 9 03:28:32.872396 bash[1639]: Updated "/home/core/.ssh/authorized_keys" Sep 9 03:28:32.875503 systemd-logind[1583]: Watching system buttons on /dev/input/event2 (Power Button) Sep 9 03:28:32.875551 systemd-logind[1583]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 03:28:32.877377 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 03:28:32.878015 systemd-logind[1583]: New seat seat0. Sep 9 03:28:32.894174 systemd[1]: Starting sshkeys.service... Sep 9 03:28:32.901255 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 03:28:32.957123 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 9 03:28:32.967179 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Sep 9 03:28:32.981832 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Sep 9 03:28:33.022972 extend-filesystems[1599]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 9 03:28:33.022972 extend-filesystems[1599]: old_desc_blocks = 1, new_desc_blocks = 8 Sep 9 03:28:33.022972 extend-filesystems[1599]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Sep 9 03:28:33.042612 extend-filesystems[1566]: Resized filesystem in /dev/vda9 Sep 9 03:28:33.024889 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 03:28:33.025354 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 03:28:33.053855 locksmithd[1622]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 03:28:33.180558 dbus-daemon[1561]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 9 03:28:33.180877 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 9 03:28:33.184645 dbus-daemon[1561]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1625 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 9 03:28:33.197874 systemd[1]: Starting polkit.service - Authorization Manager... Sep 9 03:28:33.204932 sshd_keygen[1601]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 03:28:33.228945 containerd[1609]: time="2025-09-09T03:28:33.228796219Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Sep 9 03:28:33.243502 polkitd[1673]: Started polkitd version 121 Sep 9 03:28:33.266810 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
Sep 9 03:28:33.266091 polkitd[1673]: Loading rules from directory /etc/polkit-1/rules.d Sep 9 03:28:33.266196 polkitd[1673]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 9 03:28:33.271611 polkitd[1673]: Finished loading, compiling and executing 2 rules Sep 9 03:28:33.273465 dbus-daemon[1561]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 9 03:28:33.275238 polkitd[1673]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 9 03:28:33.278174 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 03:28:33.284138 systemd[1]: Started polkit.service - Authorization Manager. Sep 9 03:28:33.321167 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 03:28:33.321624 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 03:28:33.339344 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 03:28:33.348727 containerd[1609]: time="2025-09-09T03:28:33.348634735Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Sep 9 03:28:33.360997 containerd[1609]: time="2025-09-09T03:28:33.360161084Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.104-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Sep 9 03:28:33.360997 containerd[1609]: time="2025-09-09T03:28:33.360235952Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Sep 9 03:28:33.360997 containerd[1609]: time="2025-09-09T03:28:33.360284940Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." 
type=io.containerd.internal.v1 Sep 9 03:28:33.361080 systemd-hostnamed[1625]: Hostname set to (static) Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.362989273Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.363043607Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.363246715Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.363298564Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.365116839Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.365156290Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.365177572Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.365206510Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." 
type=io.containerd.snapshotter.v1 Sep 9 03:28:33.365786 containerd[1609]: time="2025-09-09T03:28:33.365342382Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Sep 9 03:28:33.366198 containerd[1609]: time="2025-09-09T03:28:33.366169806Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Sep 9 03:28:33.367063 containerd[1609]: time="2025-09-09T03:28:33.367031496Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 9 03:28:33.367183 containerd[1609]: time="2025-09-09T03:28:33.367159386Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Sep 9 03:28:33.369388 containerd[1609]: time="2025-09-09T03:28:33.368810311Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Sep 9 03:28:33.369388 containerd[1609]: time="2025-09-09T03:28:33.368987850Z" level=info msg="metadata content store policy set" policy=shared Sep 9 03:28:33.373710 containerd[1609]: time="2025-09-09T03:28:33.373676306Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Sep 9 03:28:33.374116 containerd[1609]: time="2025-09-09T03:28:33.374080570Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Sep 9 03:28:33.374257 containerd[1609]: time="2025-09-09T03:28:33.374233023Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Sep 9 03:28:33.374361 containerd[1609]: time="2025-09-09T03:28:33.374337278Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." 
type=io.containerd.streaming.v1 Sep 9 03:28:33.374581 containerd[1609]: time="2025-09-09T03:28:33.374555754Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Sep 9 03:28:33.376611 containerd[1609]: time="2025-09-09T03:28:33.375915121Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Sep 9 03:28:33.376611 containerd[1609]: time="2025-09-09T03:28:33.376416738Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Sep 9 03:28:33.376823 containerd[1609]: time="2025-09-09T03:28:33.376795584Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Sep 9 03:28:33.376932 containerd[1609]: time="2025-09-09T03:28:33.376899502Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Sep 9 03:28:33.378478 containerd[1609]: time="2025-09-09T03:28:33.378392421Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Sep 9 03:28:33.378539 containerd[1609]: time="2025-09-09T03:28:33.378480500Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Sep 9 03:28:33.378539 containerd[1609]: time="2025-09-09T03:28:33.378517519Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Sep 9 03:28:33.378605 containerd[1609]: time="2025-09-09T03:28:33.378544823Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Sep 9 03:28:33.378605 containerd[1609]: time="2025-09-09T03:28:33.378568335Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." 
type=io.containerd.service.v1 Sep 9 03:28:33.378605 containerd[1609]: time="2025-09-09T03:28:33.378591108Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Sep 9 03:28:33.378738 containerd[1609]: time="2025-09-09T03:28:33.378611477Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Sep 9 03:28:33.378738 containerd[1609]: time="2025-09-09T03:28:33.378631903Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Sep 9 03:28:33.378738 containerd[1609]: time="2025-09-09T03:28:33.378653525Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Sep 9 03:28:33.378738 containerd[1609]: time="2025-09-09T03:28:33.378713123Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378737611Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378780783Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378804283Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378838437Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378860644Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378889385Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." 
type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378908633Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378946073Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378969519Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.378989765Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.379010003Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.379067242Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.379095098Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.379147734Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379220 containerd[1609]: time="2025-09-09T03:28:33.379170784Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379196995Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379287705Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
type=io.containerd.tracing.processor.v1 Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379319317Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379338744Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379357825Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379373378Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379400614Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379418943Z" level=info msg="NRI interface is disabled by configuration." Sep 9 03:28:33.379789 containerd[1609]: time="2025-09-09T03:28:33.379446391Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Sep 9 03:28:33.380131 containerd[1609]: time="2025-09-09T03:28:33.379875134Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Sep 9 03:28:33.380131 containerd[1609]: time="2025-09-09T03:28:33.379985292Z" level=info msg="Connect containerd service" Sep 9 03:28:33.380131 containerd[1609]: time="2025-09-09T03:28:33.380065923Z" level=info msg="using legacy CRI server" Sep 9 03:28:33.380131 containerd[1609]: time="2025-09-09T03:28:33.380081915Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 03:28:33.380529 containerd[1609]: time="2025-09-09T03:28:33.380264306Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Sep 9 03:28:33.381164 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 03:28:33.392111 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 03:28:33.402268 containerd[1609]: time="2025-09-09T03:28:33.402107581Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 03:28:33.406161 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. 
Sep 9 03:28:33.408527 containerd[1609]: time="2025-09-09T03:28:33.404914056Z" level=info msg="Start subscribing containerd event" Sep 9 03:28:33.408813 containerd[1609]: time="2025-09-09T03:28:33.408541966Z" level=info msg="Start recovering state" Sep 9 03:28:33.408897 containerd[1609]: time="2025-09-09T03:28:33.408869635Z" level=info msg="Start event monitor" Sep 9 03:28:33.408947 containerd[1609]: time="2025-09-09T03:28:33.408920147Z" level=info msg="Start snapshots syncer" Sep 9 03:28:33.408947 containerd[1609]: time="2025-09-09T03:28:33.408944834Z" level=info msg="Start cni network conf syncer for default" Sep 9 03:28:33.409656 containerd[1609]: time="2025-09-09T03:28:33.408968211Z" level=info msg="Start streaming server" Sep 9 03:28:33.411447 containerd[1609]: time="2025-09-09T03:28:33.411359716Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 03:28:33.411556 containerd[1609]: time="2025-09-09T03:28:33.411529180Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 03:28:33.411667 containerd[1609]: time="2025-09-09T03:28:33.411644275Z" level=info msg="containerd successfully booted in 0.184818s" Sep 9 03:28:33.411883 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 03:28:33.415642 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 03:28:33.671680 tar[1603]: linux-amd64/LICENSE Sep 9 03:28:33.672405 tar[1603]: linux-amd64/README.md Sep 9 03:28:33.689892 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 03:28:34.134977 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 03:28:34.144677 (kubelet)[1721]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 03:28:34.348235 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Sep 9 03:28:34.357296 systemd[1]: Started sshd@0-10.244.20.50:22-147.75.109.163:59892.service - OpenSSH per-connection server daemon (147.75.109.163:59892). Sep 9 03:28:34.753964 kubelet[1721]: E0909 03:28:34.753899 1721 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 03:28:34.756652 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 03:28:34.756999 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 03:28:35.524670 sshd[1726]: Accepted publickey for core from 147.75.109.163 port 59892 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:35.526680 sshd[1726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:35.545336 systemd-logind[1583]: New session 1 of user core. Sep 9 03:28:35.547973 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 03:28:35.560235 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 03:28:35.580329 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 03:28:35.590213 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 03:28:35.603826 (systemd)[1736]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 03:28:35.762327 systemd[1736]: Queued start job for default target default.target. Sep 9 03:28:35.763378 systemd[1736]: Created slice app.slice - User Application Slice. Sep 9 03:28:35.763434 systemd[1736]: Reached target paths.target - Paths. Sep 9 03:28:35.763458 systemd[1736]: Reached target timers.target - Timers. 
Sep 9 03:28:35.768873 systemd[1736]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 03:28:35.785503 systemd[1736]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 03:28:35.785579 systemd[1736]: Reached target sockets.target - Sockets. Sep 9 03:28:35.785603 systemd[1736]: Reached target basic.target - Basic System. Sep 9 03:28:35.785667 systemd[1736]: Reached target default.target - Main User Target. Sep 9 03:28:35.785719 systemd[1736]: Startup finished in 172ms. Sep 9 03:28:35.785887 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 03:28:35.802663 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 03:28:36.579274 systemd[1]: Started sshd@1-10.244.20.50:22-147.75.109.163:59894.service - OpenSSH per-connection server daemon (147.75.109.163:59894). Sep 9 03:28:37.467686 sshd[1749]: Accepted publickey for core from 147.75.109.163 port 59894 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:37.469883 sshd[1749]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:37.478826 systemd-logind[1583]: New session 2 of user core. Sep 9 03:28:37.486572 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 03:28:38.089214 sshd[1749]: pam_unix(sshd:session): session closed for user core Sep 9 03:28:38.095090 systemd[1]: sshd@1-10.244.20.50:22-147.75.109.163:59894.service: Deactivated successfully. Sep 9 03:28:38.095708 systemd-logind[1583]: Session 2 logged out. Waiting for processes to exit. Sep 9 03:28:38.101326 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 03:28:38.102437 systemd-logind[1583]: Removed session 2. Sep 9 03:28:38.308270 systemd[1]: Started sshd@2-10.244.20.50:22-147.75.109.163:59906.service - OpenSSH per-connection server daemon (147.75.109.163:59906). 
Sep 9 03:28:38.468938 login[1706]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 03:28:38.472679 login[1704]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 03:28:38.477190 systemd-logind[1583]: New session 3 of user core. Sep 9 03:28:38.485264 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 03:28:38.488996 systemd-logind[1583]: New session 4 of user core. Sep 9 03:28:38.496270 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 03:28:39.258895 sshd[1757]: Accepted publickey for core from 147.75.109.163 port 59906 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:39.261173 sshd[1757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:39.268450 systemd-logind[1583]: New session 5 of user core. Sep 9 03:28:39.284538 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 03:28:39.413264 coreos-metadata[1560]: Sep 09 03:28:39.413 WARN failed to locate config-drive, using the metadata service API instead Sep 9 03:28:39.439888 coreos-metadata[1560]: Sep 09 03:28:39.439 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Sep 9 03:28:39.472164 coreos-metadata[1560]: Sep 09 03:28:39.472 INFO Fetch failed with 404: resource not found Sep 9 03:28:39.472164 coreos-metadata[1560]: Sep 09 03:28:39.472 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 9 03:28:39.482575 coreos-metadata[1560]: Sep 09 03:28:39.482 INFO Fetch successful Sep 9 03:28:39.482720 coreos-metadata[1560]: Sep 09 03:28:39.482 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Sep 9 03:28:39.501857 coreos-metadata[1560]: Sep 09 03:28:39.501 INFO Fetch successful Sep 9 03:28:39.501857 coreos-metadata[1560]: Sep 09 03:28:39.501 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Sep 9 03:28:39.623828 
coreos-metadata[1560]: Sep 09 03:28:39.623 INFO Fetch successful Sep 9 03:28:39.623828 coreos-metadata[1560]: Sep 09 03:28:39.623 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Sep 9 03:28:39.709060 coreos-metadata[1560]: Sep 09 03:28:39.708 INFO Fetch successful Sep 9 03:28:39.709060 coreos-metadata[1560]: Sep 09 03:28:39.708 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Sep 9 03:28:39.734407 coreos-metadata[1560]: Sep 09 03:28:39.734 INFO Fetch successful Sep 9 03:28:39.763537 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 9 03:28:39.764580 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 03:28:39.950125 sshd[1757]: pam_unix(sshd:session): session closed for user core Sep 9 03:28:39.954661 systemd[1]: sshd@2-10.244.20.50:22-147.75.109.163:59906.service: Deactivated successfully. Sep 9 03:28:39.958491 systemd-logind[1583]: Session 5 logged out. Waiting for processes to exit. Sep 9 03:28:39.959284 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 03:28:39.962141 systemd-logind[1583]: Removed session 5. 
Sep 9 03:28:40.154117 coreos-metadata[1648]: Sep 09 03:28:40.154 WARN failed to locate config-drive, using the metadata service API instead Sep 9 03:28:40.177406 coreos-metadata[1648]: Sep 09 03:28:40.177 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Sep 9 03:28:40.214543 coreos-metadata[1648]: Sep 09 03:28:40.214 INFO Fetch successful Sep 9 03:28:40.215270 coreos-metadata[1648]: Sep 09 03:28:40.215 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 9 03:28:41.008798 coreos-metadata[1648]: Sep 09 03:28:41.008 INFO Fetch successful Sep 9 03:28:41.011137 unknown[1648]: wrote ssh authorized keys file for user: core Sep 9 03:28:41.036593 update-ssh-keys[1805]: Updated "/home/core/.ssh/authorized_keys" Sep 9 03:28:41.041009 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 9 03:28:41.045171 systemd[1]: Finished sshkeys.service. Sep 9 03:28:41.049061 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 03:28:41.049244 systemd[1]: Startup finished in 16.904s (kernel) + 13.385s (userspace) = 30.290s. Sep 9 03:28:44.763434 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 03:28:44.773051 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 03:28:44.967030 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 03:28:44.973134 (kubelet)[1825]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 03:28:45.054839 kubelet[1825]: E0909 03:28:45.053007 1825 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 03:28:45.056869 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 03:28:45.057177 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 03:28:50.105170 systemd[1]: Started sshd@3-10.244.20.50:22-147.75.109.163:55328.service - OpenSSH per-connection server daemon (147.75.109.163:55328). Sep 9 03:28:50.997778 sshd[1833]: Accepted publickey for core from 147.75.109.163 port 55328 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:50.999880 sshd[1833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:51.007361 systemd-logind[1583]: New session 6 of user core. Sep 9 03:28:51.017170 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 03:28:51.625921 sshd[1833]: pam_unix(sshd:session): session closed for user core Sep 9 03:28:51.629823 systemd-logind[1583]: Session 6 logged out. Waiting for processes to exit. Sep 9 03:28:51.630910 systemd[1]: sshd@3-10.244.20.50:22-147.75.109.163:55328.service: Deactivated successfully. Sep 9 03:28:51.635235 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 03:28:51.637258 systemd-logind[1583]: Removed session 6. Sep 9 03:28:51.782144 systemd[1]: Started sshd@4-10.244.20.50:22-147.75.109.163:55334.service - OpenSSH per-connection server daemon (147.75.109.163:55334). 
Sep 9 03:28:52.667090 sshd[1841]: Accepted publickey for core from 147.75.109.163 port 55334 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:52.669155 sshd[1841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:52.676358 systemd-logind[1583]: New session 7 of user core. Sep 9 03:28:52.683237 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 03:28:53.281481 sshd[1841]: pam_unix(sshd:session): session closed for user core Sep 9 03:28:53.285385 systemd-logind[1583]: Session 7 logged out. Waiting for processes to exit. Sep 9 03:28:53.285865 systemd[1]: sshd@4-10.244.20.50:22-147.75.109.163:55334.service: Deactivated successfully. Sep 9 03:28:53.289717 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 03:28:53.290647 systemd-logind[1583]: Removed session 7. Sep 9 03:28:53.432149 systemd[1]: Started sshd@5-10.244.20.50:22-147.75.109.163:55342.service - OpenSSH per-connection server daemon (147.75.109.163:55342). Sep 9 03:28:54.317494 sshd[1849]: Accepted publickey for core from 147.75.109.163 port 55342 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:54.319834 sshd[1849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:54.327990 systemd-logind[1583]: New session 8 of user core. Sep 9 03:28:54.339279 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 03:28:54.934508 sshd[1849]: pam_unix(sshd:session): session closed for user core Sep 9 03:28:54.939116 systemd[1]: sshd@5-10.244.20.50:22-147.75.109.163:55342.service: Deactivated successfully. Sep 9 03:28:54.943345 systemd-logind[1583]: Session 8 logged out. Waiting for processes to exit. Sep 9 03:28:54.943482 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 03:28:54.946229 systemd-logind[1583]: Removed session 8. Sep 9 03:28:55.078903 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Sep 9 03:28:55.085041 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 03:28:55.092119 systemd[1]: Started sshd@6-10.244.20.50:22-147.75.109.163:55356.service - OpenSSH per-connection server daemon (147.75.109.163:55356). Sep 9 03:28:55.246970 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 03:28:55.252673 (kubelet)[1871]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 03:28:55.350589 kubelet[1871]: E0909 03:28:55.350436 1871 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 03:28:55.353519 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 03:28:55.353888 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 03:28:55.991545 sshd[1858]: Accepted publickey for core from 147.75.109.163 port 55356 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:55.993913 sshd[1858]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:56.002035 systemd-logind[1583]: New session 9 of user core. Sep 9 03:28:56.017208 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 9 03:28:56.502900 sudo[1881]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 03:28:56.503419 sudo[1881]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 03:28:56.518220 sudo[1881]: pam_unix(sudo:session): session closed for user root Sep 9 03:28:56.663237 sshd[1858]: pam_unix(sshd:session): session closed for user core Sep 9 03:28:56.668223 systemd[1]: sshd@6-10.244.20.50:22-147.75.109.163:55356.service: Deactivated successfully. Sep 9 03:28:56.668794 systemd-logind[1583]: Session 9 logged out. Waiting for processes to exit. Sep 9 03:28:56.673465 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 03:28:56.675474 systemd-logind[1583]: Removed session 9. Sep 9 03:28:56.822192 systemd[1]: Started sshd@7-10.244.20.50:22-147.75.109.163:55372.service - OpenSSH per-connection server daemon (147.75.109.163:55372). Sep 9 03:28:57.723729 sshd[1886]: Accepted publickey for core from 147.75.109.163 port 55372 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:57.725967 sshd[1886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:57.733956 systemd-logind[1583]: New session 10 of user core. Sep 9 03:28:57.742166 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 9 03:28:58.202921 sudo[1891]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 03:28:58.203407 sudo[1891]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 03:28:58.208624 sudo[1891]: pam_unix(sudo:session): session closed for user root Sep 9 03:28:58.216458 sudo[1890]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Sep 9 03:28:58.217006 sudo[1890]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 03:28:58.236120 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Sep 9 03:28:58.239997 auditctl[1894]: No rules Sep 9 03:28:58.240567 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 03:28:58.240988 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Sep 9 03:28:58.248320 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 9 03:28:58.287513 augenrules[1913]: No rules Sep 9 03:28:58.288367 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 9 03:28:58.290733 sudo[1890]: pam_unix(sudo:session): session closed for user root Sep 9 03:28:58.436091 sshd[1886]: pam_unix(sshd:session): session closed for user core Sep 9 03:28:58.441106 systemd[1]: sshd@7-10.244.20.50:22-147.75.109.163:55372.service: Deactivated successfully. Sep 9 03:28:58.445274 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 03:28:58.446449 systemd-logind[1583]: Session 10 logged out. Waiting for processes to exit. Sep 9 03:28:58.447939 systemd-logind[1583]: Removed session 10. Sep 9 03:28:58.594323 systemd[1]: Started sshd@8-10.244.20.50:22-147.75.109.163:55374.service - OpenSSH per-connection server daemon (147.75.109.163:55374). 
Sep 9 03:28:59.495052 sshd[1922]: Accepted publickey for core from 147.75.109.163 port 55374 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:28:59.497310 sshd[1922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:28:59.505171 systemd-logind[1583]: New session 11 of user core. Sep 9 03:28:59.512291 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 03:28:59.981641 sudo[1926]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 03:28:59.982349 sudo[1926]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 03:29:00.459295 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 03:29:00.468475 (dockerd)[1943]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 03:29:00.928668 dockerd[1943]: time="2025-09-09T03:29:00.927789328Z" level=info msg="Starting up" Sep 9 03:29:01.194244 dockerd[1943]: time="2025-09-09T03:29:01.194123102Z" level=info msg="Loading containers: start." Sep 9 03:29:01.337806 kernel: Initializing XFRM netlink socket Sep 9 03:29:01.459459 systemd-networkd[1258]: docker0: Link UP Sep 9 03:29:01.486913 dockerd[1943]: time="2025-09-09T03:29:01.486854236Z" level=info msg="Loading containers: done." 
Sep 9 03:29:01.507187 dockerd[1943]: time="2025-09-09T03:29:01.507116361Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 03:29:01.507382 dockerd[1943]: time="2025-09-09T03:29:01.507275224Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Sep 9 03:29:01.507507 dockerd[1943]: time="2025-09-09T03:29:01.507468981Z" level=info msg="Daemon has completed initialization" Sep 9 03:29:01.561132 dockerd[1943]: time="2025-09-09T03:29:01.560873713Z" level=info msg="API listen on /run/docker.sock" Sep 9 03:29:01.561644 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 03:29:02.667959 containerd[1609]: time="2025-09-09T03:29:02.667867809Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Sep 9 03:29:03.396499 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Sep 9 03:29:03.528368 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2081557589.mount: Deactivated successfully. 
Sep 9 03:29:05.492856 containerd[1609]: time="2025-09-09T03:29:05.492671284Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:05.494636 containerd[1609]: time="2025-09-09T03:29:05.494433359Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079639" Sep 9 03:29:05.496770 containerd[1609]: time="2025-09-09T03:29:05.495429027Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:05.499723 containerd[1609]: time="2025-09-09T03:29:05.499686527Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:05.501596 containerd[1609]: time="2025-09-09T03:29:05.501553755Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 2.833574846s" Sep 9 03:29:05.501792 containerd[1609]: time="2025-09-09T03:29:05.501763243Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\"" Sep 9 03:29:05.504045 containerd[1609]: time="2025-09-09T03:29:05.503899385Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 9 03:29:05.514493 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. 
Sep 9 03:29:05.530067 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 03:29:05.705974 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 03:29:05.717360 (kubelet)[2156]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 03:29:05.778353 kubelet[2156]: E0909 03:29:05.778224 2156 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 03:29:05.781341 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 03:29:05.781953 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 03:29:08.316831 containerd[1609]: time="2025-09-09T03:29:08.316418493Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:08.319012 containerd[1609]: time="2025-09-09T03:29:08.318723625Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714689" Sep 9 03:29:08.319875 containerd[1609]: time="2025-09-09T03:29:08.319836378Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:08.324698 containerd[1609]: time="2025-09-09T03:29:08.323965250Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:08.325887 containerd[1609]: time="2025-09-09T03:29:08.325840871Z" 
level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 2.821788796s" Sep 9 03:29:08.325983 containerd[1609]: time="2025-09-09T03:29:08.325895672Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\"" Sep 9 03:29:08.327076 containerd[1609]: time="2025-09-09T03:29:08.326931485Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 9 03:29:11.184810 containerd[1609]: time="2025-09-09T03:29:11.183458061Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:11.184810 containerd[1609]: time="2025-09-09T03:29:11.184809929Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782435" Sep 9 03:29:11.186083 containerd[1609]: time="2025-09-09T03:29:11.186049438Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:11.191347 containerd[1609]: time="2025-09-09T03:29:11.191280009Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:11.194383 containerd[1609]: time="2025-09-09T03:29:11.194154209Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id 
\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 2.867173521s" Sep 9 03:29:11.194383 containerd[1609]: time="2025-09-09T03:29:11.194207520Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\"" Sep 9 03:29:11.195609 containerd[1609]: time="2025-09-09T03:29:11.195577310Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 9 03:29:13.187915 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3133872015.mount: Deactivated successfully. Sep 9 03:29:14.085435 containerd[1609]: time="2025-09-09T03:29:14.085316882Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:14.087692 containerd[1609]: time="2025-09-09T03:29:14.087629402Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384263" Sep 9 03:29:14.087924 containerd[1609]: time="2025-09-09T03:29:14.087819568Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:14.090787 containerd[1609]: time="2025-09-09T03:29:14.090731542Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:14.092431 containerd[1609]: time="2025-09-09T03:29:14.092024223Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag 
\"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 2.89639515s" Sep 9 03:29:14.092431 containerd[1609]: time="2025-09-09T03:29:14.092111071Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\"" Sep 9 03:29:14.096523 containerd[1609]: time="2025-09-09T03:29:14.096449914Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 03:29:14.772858 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2008540305.mount: Deactivated successfully. Sep 9 03:29:16.014720 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 9 03:29:16.030057 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 03:29:16.108548 containerd[1609]: time="2025-09-09T03:29:16.108447888Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:16.112764 containerd[1609]: time="2025-09-09T03:29:16.111481690Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Sep 9 03:29:16.113206 containerd[1609]: time="2025-09-09T03:29:16.113132699Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:16.240993 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 03:29:16.256371 (kubelet)[2245]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 03:29:16.298001 containerd[1609]: time="2025-09-09T03:29:16.297605661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:16.303771 containerd[1609]: time="2025-09-09T03:29:16.302593013Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.206059991s" Sep 9 03:29:16.303771 containerd[1609]: time="2025-09-09T03:29:16.302696649Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 9 03:29:16.304695 containerd[1609]: time="2025-09-09T03:29:16.304596904Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 03:29:16.322760 kubelet[2245]: E0909 03:29:16.322678 2245 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 03:29:16.327979 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 03:29:16.328311 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 03:29:17.145574 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2648536486.mount: Deactivated successfully. 
Sep 9 03:29:17.153758 containerd[1609]: time="2025-09-09T03:29:17.153699442Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:17.155358 containerd[1609]: time="2025-09-09T03:29:17.154808839Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Sep 9 03:29:17.155647 containerd[1609]: time="2025-09-09T03:29:17.155614651Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:17.159561 containerd[1609]: time="2025-09-09T03:29:17.159516453Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:17.160699 containerd[1609]: time="2025-09-09T03:29:17.160661182Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 855.991912ms" Sep 9 03:29:17.160828 containerd[1609]: time="2025-09-09T03:29:17.160704029Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 9 03:29:17.161896 containerd[1609]: time="2025-09-09T03:29:17.161867745Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 9 03:29:17.774126 update_engine[1588]: I20250909 03:29:17.773957 1588 update_attempter.cc:509] Updating boot flags... 
Sep 9 03:29:17.822795 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (2264) Sep 9 03:29:17.922885 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3998135345.mount: Deactivated successfully. Sep 9 03:29:17.950781 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (2263) Sep 9 03:29:20.572717 containerd[1609]: time="2025-09-09T03:29:20.572452759Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:20.575527 containerd[1609]: time="2025-09-09T03:29:20.575448530Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910717" Sep 9 03:29:20.576541 containerd[1609]: time="2025-09-09T03:29:20.576458148Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:20.584301 containerd[1609]: time="2025-09-09T03:29:20.583481643Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:20.586140 containerd[1609]: time="2025-09-09T03:29:20.586092388Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 3.424173437s" Sep 9 03:29:20.586253 containerd[1609]: time="2025-09-09T03:29:20.586189651Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 9 03:29:24.832107 systemd[1]: 
Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 03:29:24.841174 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 03:29:24.888711 systemd[1]: Reloading requested from client PID 2354 ('systemctl') (unit session-11.scope)... Sep 9 03:29:24.888998 systemd[1]: Reloading... Sep 9 03:29:25.059474 zram_generator::config[2389]: No configuration found. Sep 9 03:29:25.264831 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 9 03:29:25.372290 systemd[1]: Reloading finished in 482 ms. Sep 9 03:29:25.436422 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 9 03:29:25.436926 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 9 03:29:25.437559 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 03:29:25.445169 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 03:29:25.698003 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 03:29:25.713496 (kubelet)[2469]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 03:29:25.904825 kubelet[2469]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 03:29:25.904825 kubelet[2469]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 03:29:25.904825 kubelet[2469]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 03:29:25.904825 kubelet[2469]: I0909 03:29:25.904446 2469 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 03:29:26.692688 kubelet[2469]: I0909 03:29:26.692580 2469 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 03:29:26.692688 kubelet[2469]: I0909 03:29:26.692640 2469 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 03:29:26.693131 kubelet[2469]: I0909 03:29:26.693095 2469 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 03:29:26.724029 kubelet[2469]: I0909 03:29:26.723883 2469 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 03:29:26.725802 kubelet[2469]: E0909 03:29:26.724865 2469 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.244.20.50:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError" Sep 9 03:29:26.736516 kubelet[2469]: E0909 03:29:26.736375 2469 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 9 03:29:26.736516 kubelet[2469]: I0909 03:29:26.736419 2469 server.go:1408] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 9 03:29:26.752807 kubelet[2469]: I0909 03:29:26.752726 2469 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /"
Sep 9 03:29:26.758780 kubelet[2469]: I0909 03:29:26.758195 2469 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 9 03:29:26.758780 kubelet[2469]: I0909 03:29:26.758416 2469 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 03:29:26.758780 kubelet[2469]: I0909 03:29:26.758485 2469 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-sy1m0.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":1}
Sep 9 03:29:26.759214 kubelet[2469]: I0909 03:29:26.759191 2469 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 03:29:26.759787 kubelet[2469]: I0909 03:29:26.759416 2469 container_manager_linux.go:300] "Creating device plugin manager"
Sep 9 03:29:26.759787 kubelet[2469]: I0909 03:29:26.759618 2469 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 03:29:26.762741 kubelet[2469]: I0909 03:29:26.762718 2469 kubelet.go:408] "Attempting to sync node with API server"
Sep 9 03:29:26.763985 kubelet[2469]: I0909 03:29:26.763963 2469 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 03:29:26.764159 kubelet[2469]: I0909 03:29:26.764139 2469 kubelet.go:314] "Adding apiserver pod source"
Sep 9 03:29:26.764313 kubelet[2469]: I0909 03:29:26.764294 2469 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 03:29:26.770784 kubelet[2469]: W0909 03:29:26.770399 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.244.20.50:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-sy1m0.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.244.20.50:6443: connect: connection refused
Sep 9 03:29:26.770784 kubelet[2469]: E0909 03:29:26.770485 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.244.20.50:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-sy1m0.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError"
Sep 9 03:29:26.773515 kubelet[2469]: W0909 03:29:26.773462 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.244.20.50:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.244.20.50:6443: connect: connection refused
Sep 9 03:29:26.773631 kubelet[2469]: E0909 03:29:26.773524 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.244.20.50:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError"
Sep 9 03:29:26.773697 kubelet[2469]: I0909 03:29:26.773650 2469 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 9 03:29:26.777047 kubelet[2469]: I0909 03:29:26.777011 2469 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 03:29:26.777722 kubelet[2469]: W0909 03:29:26.777681 2469 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 03:29:26.778715 kubelet[2469]: I0909 03:29:26.778672 2469 server.go:1274] "Started kubelet"
Sep 9 03:29:26.779394 kubelet[2469]: I0909 03:29:26.779312 2469 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 03:29:26.781740 kubelet[2469]: I0909 03:29:26.781702 2469 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 03:29:26.786534 kubelet[2469]: I0909 03:29:26.785912 2469 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 03:29:26.786534 kubelet[2469]: I0909 03:29:26.786348 2469 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 03:29:26.788517 kubelet[2469]: I0909 03:29:26.787656 2469 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 03:29:26.794148 kubelet[2469]: E0909 03:29:26.788718 2469 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.244.20.50:6443/api/v1/namespaces/default/events\": dial tcp 10.244.20.50:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-sy1m0.gb1.brightbox.com.18637f97272c7ace default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-sy1m0.gb1.brightbox.com,UID:srv-sy1m0.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-sy1m0.gb1.brightbox.com,},FirstTimestamp:2025-09-09 03:29:26.778641102 +0000 UTC m=+1.057908238,LastTimestamp:2025-09-09 03:29:26.778641102 +0000 UTC m=+1.057908238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-sy1m0.gb1.brightbox.com,}"
Sep 9 03:29:26.796043 kubelet[2469]: I0909 03:29:26.796016 2469 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 03:29:26.798762 kubelet[2469]: I0909 03:29:26.798720 2469 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 03:29:26.799079 kubelet[2469]: E0909 03:29:26.799047 2469 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-sy1m0.gb1.brightbox.com\" not found"
Sep 9 03:29:26.800455 kubelet[2469]: E0909 03:29:26.800399 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.20.50:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-sy1m0.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.20.50:6443: connect: connection refused" interval="200ms"
Sep 9 03:29:26.801022 kubelet[2469]: I0909 03:29:26.800998 2469 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 03:29:26.801246 kubelet[2469]: I0909 03:29:26.801224 2469 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 03:29:26.802821 kubelet[2469]: W0909 03:29:26.801709 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.244.20.50:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.244.20.50:6443: connect: connection refused
Sep 9 03:29:26.802821 kubelet[2469]: E0909 03:29:26.801833 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.244.20.50:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError"
Sep 9 03:29:26.802821 kubelet[2469]: I0909 03:29:26.802303 2469 factory.go:221] Registration of the systemd container factory successfully
Sep 9 03:29:26.802821 kubelet[2469]: I0909 03:29:26.802412 2469 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 03:29:26.804083 kubelet[2469]: E0909 03:29:26.804059 2469 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 03:29:26.805539 kubelet[2469]: I0909 03:29:26.805518 2469 factory.go:221] Registration of the containerd container factory successfully
Sep 9 03:29:26.817967 kubelet[2469]: I0909 03:29:26.817911 2469 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 03:29:26.841631 kubelet[2469]: I0909 03:29:26.841567 2469 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 03:29:26.841631 kubelet[2469]: I0909 03:29:26.841644 2469 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 03:29:26.841903 kubelet[2469]: I0909 03:29:26.841689 2469 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 03:29:26.841903 kubelet[2469]: E0909 03:29:26.841851 2469 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 03:29:26.850871 kubelet[2469]: W0909 03:29:26.850396 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.244.20.50:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.244.20.50:6443: connect: connection refused
Sep 9 03:29:26.851820 kubelet[2469]: E0909 03:29:26.851777 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.244.20.50:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError"
Sep 9 03:29:26.857645 kubelet[2469]: I0909 03:29:26.857618 2469 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 03:29:26.857808 kubelet[2469]: I0909 03:29:26.857640 2469 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 03:29:26.857808 kubelet[2469]: I0909 03:29:26.857700 2469 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 03:29:26.859906 kubelet[2469]: I0909 03:29:26.859856 2469 policy_none.go:49] "None policy: Start"
Sep 9 03:29:26.860670 kubelet[2469]: I0909 03:29:26.860646 2469 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 03:29:26.860770 kubelet[2469]: I0909 03:29:26.860679 2469 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 03:29:26.880653 kubelet[2469]: I0909 03:29:26.878916 2469 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 03:29:26.880653 kubelet[2469]: I0909 03:29:26.879229 2469 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 03:29:26.880653 kubelet[2469]: I0909 03:29:26.879258 2469 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 03:29:26.880653 kubelet[2469]: I0909 03:29:26.880493 2469 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 03:29:26.882923 kubelet[2469]: E0909 03:29:26.882888 2469 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-sy1m0.gb1.brightbox.com\" not found"
Sep 9 03:29:26.982244 kubelet[2469]: I0909 03:29:26.982112 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:26.984391 kubelet[2469]: E0909 03:29:26.984348 2469 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.244.20.50:6443/api/v1/nodes\": dial tcp 10.244.20.50:6443: connect: connection refused" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002357 kubelet[2469]: I0909 03:29:27.002325 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ff5c17ac66c1808becf92b6663a84ac7-usr-share-ca-certificates\") pod \"kube-apiserver-srv-sy1m0.gb1.brightbox.com\" (UID: \"ff5c17ac66c1808becf92b6663a84ac7\") " pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002485 kubelet[2469]: I0909 03:29:27.002370 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-ca-certs\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002485 kubelet[2469]: I0909 03:29:27.002405 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ff5c17ac66c1808becf92b6663a84ac7-ca-certs\") pod \"kube-apiserver-srv-sy1m0.gb1.brightbox.com\" (UID: \"ff5c17ac66c1808becf92b6663a84ac7\") " pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002485 kubelet[2469]: I0909 03:29:27.002433 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-flexvolume-dir\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002485 kubelet[2469]: I0909 03:29:27.002460 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-k8s-certs\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002704 kubelet[2469]: I0909 03:29:27.002504 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-kubeconfig\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002704 kubelet[2469]: I0909 03:29:27.002534 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002704 kubelet[2469]: I0909 03:29:27.002561 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/466ba24f54c8ca888d4b21ef1e2a3ce1-kubeconfig\") pod \"kube-scheduler-srv-sy1m0.gb1.brightbox.com\" (UID: \"466ba24f54c8ca888d4b21ef1e2a3ce1\") " pod="kube-system/kube-scheduler-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002704 kubelet[2469]: I0909 03:29:27.002589 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ff5c17ac66c1808becf92b6663a84ac7-k8s-certs\") pod \"kube-apiserver-srv-sy1m0.gb1.brightbox.com\" (UID: \"ff5c17ac66c1808becf92b6663a84ac7\") " pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.002950 kubelet[2469]: E0909 03:29:27.002781 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.20.50:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-sy1m0.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.20.50:6443: connect: connection refused" interval="400ms"
Sep 9 03:29:27.188519 kubelet[2469]: I0909 03:29:27.188033 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.188711 kubelet[2469]: E0909 03:29:27.188581 2469 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.244.20.50:6443/api/v1/nodes\": dial tcp 10.244.20.50:6443: connect: connection refused" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.259194 containerd[1609]: time="2025-09-09T03:29:27.259007004Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-sy1m0.gb1.brightbox.com,Uid:ff5c17ac66c1808becf92b6663a84ac7,Namespace:kube-system,Attempt:0,}"
Sep 9 03:29:27.265816 containerd[1609]: time="2025-09-09T03:29:27.264962190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-sy1m0.gb1.brightbox.com,Uid:03d5f3d2723f07597cff96f6689dc27d,Namespace:kube-system,Attempt:0,}"
Sep 9 03:29:27.265816 containerd[1609]: time="2025-09-09T03:29:27.265352033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-sy1m0.gb1.brightbox.com,Uid:466ba24f54c8ca888d4b21ef1e2a3ce1,Namespace:kube-system,Attempt:0,}"
Sep 9 03:29:27.403578 kubelet[2469]: E0909 03:29:27.403475 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.20.50:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-sy1m0.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.20.50:6443: connect: connection refused" interval="800ms"
Sep 9 03:29:27.592646 kubelet[2469]: I0909 03:29:27.592589 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.593970 kubelet[2469]: E0909 03:29:27.593861 2469 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.244.20.50:6443/api/v1/nodes\": dial tcp 10.244.20.50:6443: connect: connection refused" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:27.869832 kubelet[2469]: W0909 03:29:27.869438 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.244.20.50:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.244.20.50:6443: connect: connection refused
Sep 9 03:29:27.869832 kubelet[2469]: E0909 03:29:27.869548 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.244.20.50:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError"
Sep 9 03:29:27.872338 kubelet[2469]: W0909 03:29:27.872225 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.244.20.50:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.244.20.50:6443: connect: connection refused
Sep 9 03:29:27.872338 kubelet[2469]: E0909 03:29:27.872292 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.244.20.50:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError"
Sep 9 03:29:27.905562 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3936292779.mount: Deactivated successfully.
Sep 9 03:29:27.914774 containerd[1609]: time="2025-09-09T03:29:27.913125562Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 03:29:27.914774 containerd[1609]: time="2025-09-09T03:29:27.914518313Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 03:29:27.915866 containerd[1609]: time="2025-09-09T03:29:27.915811741Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Sep 9 03:29:27.916138 containerd[1609]: time="2025-09-09T03:29:27.916095754Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Sep 9 03:29:27.916265 containerd[1609]: time="2025-09-09T03:29:27.916190610Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 03:29:27.917262 containerd[1609]: time="2025-09-09T03:29:27.917220522Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064"
Sep 9 03:29:27.919893 containerd[1609]: time="2025-09-09T03:29:27.918550118Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 03:29:27.924769 containerd[1609]: time="2025-09-09T03:29:27.923802646Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 03:29:27.926383 containerd[1609]: time="2025-09-09T03:29:27.926336205Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 661.271726ms"
Sep 9 03:29:27.930661 containerd[1609]: time="2025-09-09T03:29:27.930624372Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 671.396699ms"
Sep 9 03:29:27.932420 containerd[1609]: time="2025-09-09T03:29:27.932375074Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 666.956634ms"
Sep 9 03:29:27.946703 kubelet[2469]: W0909 03:29:27.946449 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.244.20.50:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-sy1m0.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.244.20.50:6443: connect: connection refused
Sep 9 03:29:27.946703 kubelet[2469]: E0909 03:29:27.946619 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.244.20.50:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-sy1m0.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError"
Sep 9 03:29:28.115295 containerd[1609]: time="2025-09-09T03:29:28.114530537Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 9 03:29:28.115295 containerd[1609]: time="2025-09-09T03:29:28.114604916Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 9 03:29:28.115295 containerd[1609]: time="2025-09-09T03:29:28.114622296Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 9 03:29:28.115295 containerd[1609]: time="2025-09-09T03:29:28.112860363Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 9 03:29:28.115295 containerd[1609]: time="2025-09-09T03:29:28.114046110Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 9 03:29:28.115295 containerd[1609]: time="2025-09-09T03:29:28.114079237Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 9 03:29:28.115295 containerd[1609]: time="2025-09-09T03:29:28.114241416Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 9 03:29:28.117851 containerd[1609]: time="2025-09-09T03:29:28.116982815Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 9 03:29:28.117851 containerd[1609]: time="2025-09-09T03:29:28.117050827Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 9 03:29:28.117851 containerd[1609]: time="2025-09-09T03:29:28.117103889Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 9 03:29:28.117851 containerd[1609]: time="2025-09-09T03:29:28.116990637Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 9 03:29:28.120094 containerd[1609]: time="2025-09-09T03:29:28.117740469Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 9 03:29:28.209064 kubelet[2469]: E0909 03:29:28.208990 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.20.50:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-sy1m0.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.20.50:6443: connect: connection refused" interval="1.6s"
Sep 9 03:29:28.259642 containerd[1609]: time="2025-09-09T03:29:28.259248253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-sy1m0.gb1.brightbox.com,Uid:466ba24f54c8ca888d4b21ef1e2a3ce1,Namespace:kube-system,Attempt:0,} returns sandbox id \"87963cda057ff52c3d6a46d1942119f7a377aa58985093f5abaee39f4abcf803\""
Sep 9 03:29:28.268238 containerd[1609]: time="2025-09-09T03:29:28.268089284Z" level=info msg="CreateContainer within sandbox \"87963cda057ff52c3d6a46d1942119f7a377aa58985093f5abaee39f4abcf803\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 9 03:29:28.273271 kubelet[2469]: W0909 03:29:28.273100 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.244.20.50:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.244.20.50:6443: connect: connection refused
Sep 9 03:29:28.273271 kubelet[2469]: E0909 03:29:28.273190 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.244.20.50:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.20.50:6443: connect: connection refused" logger="UnhandledError"
Sep 9 03:29:28.277631 containerd[1609]: time="2025-09-09T03:29:28.277528239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-sy1m0.gb1.brightbox.com,Uid:ff5c17ac66c1808becf92b6663a84ac7,Namespace:kube-system,Attempt:0,} returns sandbox id \"c6488350ce4c2371c42f3e172b59ba41715f7ef6b597e7cea17ec764d281a73e\""
Sep 9 03:29:28.280737 containerd[1609]: time="2025-09-09T03:29:28.280702978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-sy1m0.gb1.brightbox.com,Uid:03d5f3d2723f07597cff96f6689dc27d,Namespace:kube-system,Attempt:0,} returns sandbox id \"7bebcc09a2092330f3f4788549042b678cb3832afafd538f3dcc0a598d06bd3b\""
Sep 9 03:29:28.281972 containerd[1609]: time="2025-09-09T03:29:28.281938658Z" level=info msg="CreateContainer within sandbox \"c6488350ce4c2371c42f3e172b59ba41715f7ef6b597e7cea17ec764d281a73e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 9 03:29:28.287442 containerd[1609]: time="2025-09-09T03:29:28.287401231Z" level=info msg="CreateContainer within sandbox \"7bebcc09a2092330f3f4788549042b678cb3832afafd538f3dcc0a598d06bd3b\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 9 03:29:28.299474 containerd[1609]: time="2025-09-09T03:29:28.299433638Z" level=info msg="CreateContainer within sandbox \"87963cda057ff52c3d6a46d1942119f7a377aa58985093f5abaee39f4abcf803\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9280ee608b7e1cc5c589bf430f550a032bc91a43d3e70a63bca7bf8c625b3710\""
Sep 9 03:29:28.301245 containerd[1609]: time="2025-09-09T03:29:28.301211074Z" level=info msg="StartContainer for \"9280ee608b7e1cc5c589bf430f550a032bc91a43d3e70a63bca7bf8c625b3710\""
Sep 9 03:29:28.304490 containerd[1609]: time="2025-09-09T03:29:28.304431504Z" level=info msg="CreateContainer within sandbox \"c6488350ce4c2371c42f3e172b59ba41715f7ef6b597e7cea17ec764d281a73e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"39c397a2b2e82243c26233f790a8cebc8358b5723d5d2d219a271dba8bc2e763\""
Sep 9 03:29:28.305259 containerd[1609]: time="2025-09-09T03:29:28.305222111Z" level=info msg="StartContainer for \"39c397a2b2e82243c26233f790a8cebc8358b5723d5d2d219a271dba8bc2e763\""
Sep 9 03:29:28.314639 containerd[1609]: time="2025-09-09T03:29:28.314488404Z" level=info msg="CreateContainer within sandbox \"7bebcc09a2092330f3f4788549042b678cb3832afafd538f3dcc0a598d06bd3b\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"71a2a034259161993dc086a8f518fec3164684466e0ef6bb8dda7f7bae0ad6f0\""
Sep 9 03:29:28.315696 containerd[1609]: time="2025-09-09T03:29:28.315663400Z" level=info msg="StartContainer for \"71a2a034259161993dc086a8f518fec3164684466e0ef6bb8dda7f7bae0ad6f0\""
Sep 9 03:29:28.404859 kubelet[2469]: I0909 03:29:28.403473 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:28.408766 kubelet[2469]: E0909 03:29:28.405884 2469 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.244.20.50:6443/api/v1/nodes\": dial tcp 10.244.20.50:6443: connect: connection refused" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:28.474188 containerd[1609]: time="2025-09-09T03:29:28.474135343Z" level=info msg="StartContainer for \"39c397a2b2e82243c26233f790a8cebc8358b5723d5d2d219a271dba8bc2e763\" returns successfully"
Sep 9 03:29:28.486578 containerd[1609]: time="2025-09-09T03:29:28.486504999Z" level=info msg="StartContainer for \"9280ee608b7e1cc5c589bf430f550a032bc91a43d3e70a63bca7bf8c625b3710\" returns successfully"
Sep 9 03:29:28.495700 containerd[1609]: time="2025-09-09T03:29:28.495472769Z" level=info msg="StartContainer for \"71a2a034259161993dc086a8f518fec3164684466e0ef6bb8dda7f7bae0ad6f0\" returns successfully"
Sep 9 03:29:30.011834 kubelet[2469]: I0909 03:29:30.010029 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:31.651805 kubelet[2469]: E0909 03:29:31.649675 2469 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-sy1m0.gb1.brightbox.com\" not found" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:31.773778 kubelet[2469]: I0909 03:29:31.771312 2469 kubelet_node_status.go:75] "Successfully registered node" node="srv-sy1m0.gb1.brightbox.com"
Sep 9 03:29:31.773778 kubelet[2469]: E0909 03:29:31.771383 2469 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"srv-sy1m0.gb1.brightbox.com\": node \"srv-sy1m0.gb1.brightbox.com\" not found"
Sep 9 03:29:31.776906 kubelet[2469]: I0909 03:29:31.776871 2469 apiserver.go:52] "Watching apiserver"
Sep 9 03:29:31.801792 kubelet[2469]: I0909 03:29:31.801706 2469 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 9 03:29:33.203645 kubelet[2469]: W0909 03:29:33.203444 2469 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 9 03:29:33.907016 kubelet[2469]: W0909 03:29:33.906966 2469 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 9 03:29:34.404284 systemd[1]: Reloading requested from client PID 2743 ('systemctl') (unit session-11.scope)...
Sep 9 03:29:34.404341 systemd[1]: Reloading...
Sep 9 03:29:34.519837 zram_generator::config[2783]: No configuration found.
Sep 9 03:29:34.741820 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 9 03:29:34.871196 systemd[1]: Reloading finished in 466 ms.
Sep 9 03:29:34.921299 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 03:29:34.941836 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 03:29:34.942641 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 03:29:34.953886 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 03:29:35.232983 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 03:29:35.252510 (kubelet)[2856]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 03:29:35.357775 kubelet[2856]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 03:29:35.357775 kubelet[2856]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 9 03:29:35.357775 kubelet[2856]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 03:29:35.358607 kubelet[2856]: I0909 03:29:35.357872 2856 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 03:29:35.372358 kubelet[2856]: I0909 03:29:35.372295 2856 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 9 03:29:35.372358 kubelet[2856]: I0909 03:29:35.372343 2856 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 03:29:35.374063 kubelet[2856]: I0909 03:29:35.372767 2856 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 9 03:29:35.376069 kubelet[2856]: I0909 03:29:35.376016 2856 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 9 03:29:35.385773 kubelet[2856]: I0909 03:29:35.382182 2856 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 03:29:35.412781 kubelet[2856]: E0909 03:29:35.411057 2856 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 9 03:29:35.412781 kubelet[2856]: I0909 03:29:35.411117 2856 server.go:1408] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 9 03:29:35.444792 kubelet[2856]: I0909 03:29:35.443705 2856 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 03:29:35.446571 kubelet[2856]: I0909 03:29:35.446528 2856 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 9 03:29:35.446850 kubelet[2856]: I0909 03:29:35.446782 2856 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 03:29:35.447163 kubelet[2856]: I0909 03:29:35.446853 2856 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-sy1m0.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolo
gyManagerPolicyOptions":null,"CgroupVersion":1} Sep 9 03:29:35.448188 kubelet[2856]: I0909 03:29:35.447182 2856 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 03:29:35.448188 kubelet[2856]: I0909 03:29:35.447202 2856 container_manager_linux.go:300] "Creating device plugin manager" Sep 9 03:29:35.448188 kubelet[2856]: I0909 03:29:35.447282 2856 state_mem.go:36] "Initialized new in-memory state store" Sep 9 03:29:35.448188 kubelet[2856]: I0909 03:29:35.447496 2856 kubelet.go:408] "Attempting to sync node with API server" Sep 9 03:29:35.448188 kubelet[2856]: I0909 03:29:35.447525 2856 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 03:29:35.448188 kubelet[2856]: I0909 03:29:35.447592 2856 kubelet.go:314] "Adding apiserver pod source" Sep 9 03:29:35.448188 kubelet[2856]: I0909 03:29:35.447635 2856 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 03:29:35.476923 kubelet[2856]: I0909 03:29:35.476870 2856 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 9 03:29:35.482808 kubelet[2856]: I0909 03:29:35.481117 2856 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 03:29:35.488924 kubelet[2856]: I0909 03:29:35.487199 2856 server.go:1274] "Started kubelet" Sep 9 03:29:35.522970 kubelet[2856]: I0909 03:29:35.518705 2856 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 03:29:35.522970 kubelet[2856]: I0909 03:29:35.519346 2856 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 03:29:35.522970 kubelet[2856]: I0909 03:29:35.519471 2856 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 03:29:35.522970 kubelet[2856]: I0909 03:29:35.520787 2856 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 
03:29:35.525199 kubelet[2856]: I0909 03:29:35.525159 2856 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 03:29:35.564423 kubelet[2856]: I0909 03:29:35.529491 2856 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 9 03:29:35.572634 kubelet[2856]: I0909 03:29:35.530786 2856 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 9 03:29:35.573422 kubelet[2856]: I0909 03:29:35.573221 2856 reconciler.go:26] "Reconciler: start to sync state" Sep 9 03:29:35.573422 kubelet[2856]: I0909 03:29:35.555524 2856 factory.go:221] Registration of the systemd container factory successfully Sep 9 03:29:35.574885 kubelet[2856]: I0909 03:29:35.574651 2856 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 03:29:35.575671 kubelet[2856]: E0909 03:29:35.533877 2856 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-sy1m0.gb1.brightbox.com\" not found" Sep 9 03:29:35.596781 kubelet[2856]: I0909 03:29:35.596025 2856 server.go:449] "Adding debug handlers to kubelet server" Sep 9 03:29:35.609883 kubelet[2856]: I0909 03:29:35.609216 2856 factory.go:221] Registration of the containerd container factory successfully Sep 9 03:29:35.621084 kubelet[2856]: E0909 03:29:35.621030 2856 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 03:29:35.639396 kubelet[2856]: I0909 03:29:35.639111 2856 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 03:29:35.654785 kubelet[2856]: I0909 03:29:35.654657 2856 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 03:29:35.654785 kubelet[2856]: I0909 03:29:35.654724 2856 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 9 03:29:35.661659 kubelet[2856]: I0909 03:29:35.660932 2856 kubelet.go:2321] "Starting kubelet main sync loop" Sep 9 03:29:35.661659 kubelet[2856]: E0909 03:29:35.661037 2856 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 03:29:35.762277 kubelet[2856]: E0909 03:29:35.761992 2856 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 9 03:29:35.787582 kubelet[2856]: I0909 03:29:35.787099 2856 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 9 03:29:35.787582 kubelet[2856]: I0909 03:29:35.787129 2856 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 9 03:29:35.787582 kubelet[2856]: I0909 03:29:35.787180 2856 state_mem.go:36] "Initialized new in-memory state store" Sep 9 03:29:35.787582 kubelet[2856]: I0909 03:29:35.787444 2856 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 03:29:35.787582 kubelet[2856]: I0909 03:29:35.787465 2856 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 03:29:35.787582 kubelet[2856]: I0909 03:29:35.787505 2856 policy_none.go:49] "None policy: Start" Sep 9 03:29:35.789133 kubelet[2856]: I0909 03:29:35.789100 2856 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 9 03:29:35.789301 kubelet[2856]: I0909 03:29:35.789283 2856 state_mem.go:35] "Initializing new in-memory state store" Sep 9 03:29:35.789790 kubelet[2856]: I0909 03:29:35.789598 2856 state_mem.go:75] "Updated machine memory state" Sep 9 03:29:35.795608 kubelet[2856]: I0909 03:29:35.795582 2856 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 03:29:35.796346 kubelet[2856]: I0909 03:29:35.796061 2856 
eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 03:29:35.796346 kubelet[2856]: I0909 03:29:35.796099 2856 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 03:29:35.798486 kubelet[2856]: I0909 03:29:35.798447 2856 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 03:29:35.931252 kubelet[2856]: I0909 03:29:35.930335 2856 kubelet_node_status.go:72] "Attempting to register node" node="srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:35.945870 kubelet[2856]: I0909 03:29:35.945820 2856 kubelet_node_status.go:111] "Node was previously registered" node="srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:35.947031 kubelet[2856]: I0909 03:29:35.946150 2856 kubelet_node_status.go:75] "Successfully registered node" node="srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:35.990972 kubelet[2856]: W0909 03:29:35.990920 2856 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 03:29:36.009504 kubelet[2856]: W0909 03:29:36.009474 2856 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 03:29:36.010011 kubelet[2856]: E0909 03:29:36.009981 2856 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-srv-sy1m0.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.010206 kubelet[2856]: W0909 03:29:36.009908 2856 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 03:29:36.010355 kubelet[2856]: E0909 03:29:36.010332 2856 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-sy1m0.gb1.brightbox.com\" already exists" 
pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075368 kubelet[2856]: I0909 03:29:36.074809 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ff5c17ac66c1808becf92b6663a84ac7-k8s-certs\") pod \"kube-apiserver-srv-sy1m0.gb1.brightbox.com\" (UID: \"ff5c17ac66c1808becf92b6663a84ac7\") " pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075368 kubelet[2856]: I0909 03:29:36.074905 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-ca-certs\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075368 kubelet[2856]: I0909 03:29:36.074947 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-kubeconfig\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075368 kubelet[2856]: I0909 03:29:36.074982 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075368 kubelet[2856]: I0909 03:29:36.075022 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/ff5c17ac66c1808becf92b6663a84ac7-ca-certs\") pod \"kube-apiserver-srv-sy1m0.gb1.brightbox.com\" (UID: \"ff5c17ac66c1808becf92b6663a84ac7\") " pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075943 kubelet[2856]: I0909 03:29:36.075065 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ff5c17ac66c1808becf92b6663a84ac7-usr-share-ca-certificates\") pod \"kube-apiserver-srv-sy1m0.gb1.brightbox.com\" (UID: \"ff5c17ac66c1808becf92b6663a84ac7\") " pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075943 kubelet[2856]: I0909 03:29:36.075094 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-flexvolume-dir\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075943 kubelet[2856]: I0909 03:29:36.075127 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/03d5f3d2723f07597cff96f6689dc27d-k8s-certs\") pod \"kube-controller-manager-srv-sy1m0.gb1.brightbox.com\" (UID: \"03d5f3d2723f07597cff96f6689dc27d\") " pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.075943 kubelet[2856]: I0909 03:29:36.075157 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/466ba24f54c8ca888d4b21ef1e2a3ce1-kubeconfig\") pod \"kube-scheduler-srv-sy1m0.gb1.brightbox.com\" (UID: \"466ba24f54c8ca888d4b21ef1e2a3ce1\") " pod="kube-system/kube-scheduler-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:36.473003 
kubelet[2856]: I0909 03:29:36.472662 2856 apiserver.go:52] "Watching apiserver" Sep 9 03:29:36.533589 kubelet[2856]: I0909 03:29:36.532074 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com" podStartSLOduration=3.532035437 podStartE2EDuration="3.532035437s" podCreationTimestamp="2025-09-09 03:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 03:29:36.514301811 +0000 UTC m=+1.248941864" watchObservedRunningTime="2025-09-09 03:29:36.532035437 +0000 UTC m=+1.266675501" Sep 9 03:29:36.545397 kubelet[2856]: I0909 03:29:36.545149 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-sy1m0.gb1.brightbox.com" podStartSLOduration=1.545093269 podStartE2EDuration="1.545093269s" podCreationTimestamp="2025-09-09 03:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 03:29:36.532457194 +0000 UTC m=+1.267097257" watchObservedRunningTime="2025-09-09 03:29:36.545093269 +0000 UTC m=+1.279733331" Sep 9 03:29:36.545975 kubelet[2856]: I0909 03:29:36.545666 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-sy1m0.gb1.brightbox.com" podStartSLOduration=3.545653533 podStartE2EDuration="3.545653533s" podCreationTimestamp="2025-09-09 03:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 03:29:36.544643345 +0000 UTC m=+1.279283419" watchObservedRunningTime="2025-09-09 03:29:36.545653533 +0000 UTC m=+1.280293598" Sep 9 03:29:36.573820 kubelet[2856]: I0909 03:29:36.573648 2856 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 03:29:36.709682 
kubelet[2856]: W0909 03:29:36.709598 2856 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 03:29:36.711768 kubelet[2856]: E0909 03:29:36.710247 2856 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-sy1m0.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-sy1m0.gb1.brightbox.com" Sep 9 03:29:40.017119 kubelet[2856]: I0909 03:29:40.017041 2856 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 03:29:40.019818 kubelet[2856]: I0909 03:29:40.018970 2856 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 03:29:40.019913 containerd[1609]: time="2025-09-09T03:29:40.018442903Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 03:29:41.010376 kubelet[2856]: I0909 03:29:41.010281 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/c98a1a51-a1b0-43d2-8939-74ba3a117143-kube-proxy\") pod \"kube-proxy-9qq7x\" (UID: \"c98a1a51-a1b0-43d2-8939-74ba3a117143\") " pod="kube-system/kube-proxy-9qq7x" Sep 9 03:29:41.010376 kubelet[2856]: I0909 03:29:41.010369 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c98a1a51-a1b0-43d2-8939-74ba3a117143-xtables-lock\") pod \"kube-proxy-9qq7x\" (UID: \"c98a1a51-a1b0-43d2-8939-74ba3a117143\") " pod="kube-system/kube-proxy-9qq7x" Sep 9 03:29:41.010703 kubelet[2856]: I0909 03:29:41.010402 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c98a1a51-a1b0-43d2-8939-74ba3a117143-lib-modules\") pod \"kube-proxy-9qq7x\" 
(UID: \"c98a1a51-a1b0-43d2-8939-74ba3a117143\") " pod="kube-system/kube-proxy-9qq7x" Sep 9 03:29:41.010703 kubelet[2856]: I0909 03:29:41.010435 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvp7r\" (UniqueName: \"kubernetes.io/projected/c98a1a51-a1b0-43d2-8939-74ba3a117143-kube-api-access-cvp7r\") pod \"kube-proxy-9qq7x\" (UID: \"c98a1a51-a1b0-43d2-8939-74ba3a117143\") " pod="kube-system/kube-proxy-9qq7x" Sep 9 03:29:41.111771 kubelet[2856]: I0909 03:29:41.111180 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2a87b721-e45d-47fc-996b-c6cf583e441e-var-lib-calico\") pod \"tigera-operator-58fc44c59b-2zdmz\" (UID: \"2a87b721-e45d-47fc-996b-c6cf583e441e\") " pod="tigera-operator/tigera-operator-58fc44c59b-2zdmz" Sep 9 03:29:41.111771 kubelet[2856]: I0909 03:29:41.111360 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6fnx\" (UniqueName: \"kubernetes.io/projected/2a87b721-e45d-47fc-996b-c6cf583e441e-kube-api-access-j6fnx\") pod \"tigera-operator-58fc44c59b-2zdmz\" (UID: \"2a87b721-e45d-47fc-996b-c6cf583e441e\") " pod="tigera-operator/tigera-operator-58fc44c59b-2zdmz" Sep 9 03:29:41.227195 containerd[1609]: time="2025-09-09T03:29:41.226859152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9qq7x,Uid:c98a1a51-a1b0-43d2-8939-74ba3a117143,Namespace:kube-system,Attempt:0,}" Sep 9 03:29:41.267699 containerd[1609]: time="2025-09-09T03:29:41.267208567Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:29:41.267699 containerd[1609]: time="2025-09-09T03:29:41.267334043Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:29:41.267699 containerd[1609]: time="2025-09-09T03:29:41.267371760Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:29:41.267699 containerd[1609]: time="2025-09-09T03:29:41.267565039Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:29:41.335124 containerd[1609]: time="2025-09-09T03:29:41.334959658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9qq7x,Uid:c98a1a51-a1b0-43d2-8939-74ba3a117143,Namespace:kube-system,Attempt:0,} returns sandbox id \"103113f964c6216714a8698bc5c8c2949745ec739a6c2589704ce8b9a4c8293b\"" Sep 9 03:29:41.340167 containerd[1609]: time="2025-09-09T03:29:41.340046926Z" level=info msg="CreateContainer within sandbox \"103113f964c6216714a8698bc5c8c2949745ec739a6c2589704ce8b9a4c8293b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 03:29:41.348270 containerd[1609]: time="2025-09-09T03:29:41.348223441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-2zdmz,Uid:2a87b721-e45d-47fc-996b-c6cf583e441e,Namespace:tigera-operator,Attempt:0,}" Sep 9 03:29:41.381529 containerd[1609]: time="2025-09-09T03:29:41.379021541Z" level=info msg="CreateContainer within sandbox \"103113f964c6216714a8698bc5c8c2949745ec739a6c2589704ce8b9a4c8293b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a4d873069efab649261ecb8e20ad3c775bd2ca2de133f9c74221498998c3e38e\"" Sep 9 03:29:41.381529 containerd[1609]: time="2025-09-09T03:29:41.380246951Z" level=info msg="StartContainer for \"a4d873069efab649261ecb8e20ad3c775bd2ca2de133f9c74221498998c3e38e\"" Sep 9 03:29:41.420677 containerd[1609]: time="2025-09-09T03:29:41.420541587Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:29:41.421735 containerd[1609]: time="2025-09-09T03:29:41.421679382Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:29:41.421990 containerd[1609]: time="2025-09-09T03:29:41.421885350Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:29:41.422919 containerd[1609]: time="2025-09-09T03:29:41.422782571Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:29:41.506841 containerd[1609]: time="2025-09-09T03:29:41.506698738Z" level=info msg="StartContainer for \"a4d873069efab649261ecb8e20ad3c775bd2ca2de133f9c74221498998c3e38e\" returns successfully" Sep 9 03:29:41.531126 containerd[1609]: time="2025-09-09T03:29:41.530968277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-2zdmz,Uid:2a87b721-e45d-47fc-996b-c6cf583e441e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"88ba986181534618ec8a4888650fb86f1863d212f75223d690528b3c7524f4c7\"" Sep 9 03:29:41.534695 containerd[1609]: time="2025-09-09T03:29:41.534564083Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 03:29:41.735827 kubelet[2856]: I0909 03:29:41.734934 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-9qq7x" podStartSLOduration=1.734895249 podStartE2EDuration="1.734895249s" podCreationTimestamp="2025-09-09 03:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 03:29:41.733972502 +0000 UTC m=+6.468612578" watchObservedRunningTime="2025-09-09 03:29:41.734895249 +0000 UTC m=+6.469535313" Sep 9 03:29:43.377808 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount2031846096.mount: Deactivated successfully. Sep 9 03:29:44.393701 containerd[1609]: time="2025-09-09T03:29:44.393571462Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:44.395357 containerd[1609]: time="2025-09-09T03:29:44.395043553Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 9 03:29:44.396915 containerd[1609]: time="2025-09-09T03:29:44.396384540Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:44.404895 containerd[1609]: time="2025-09-09T03:29:44.404844353Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:29:44.406756 containerd[1609]: time="2025-09-09T03:29:44.406589262Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.871968045s" Sep 9 03:29:44.406756 containerd[1609]: time="2025-09-09T03:29:44.406646483Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 9 03:29:44.410495 containerd[1609]: time="2025-09-09T03:29:44.410455755Z" level=info msg="CreateContainer within sandbox \"88ba986181534618ec8a4888650fb86f1863d212f75223d690528b3c7524f4c7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 03:29:44.427880 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount374865813.mount: Deactivated successfully. Sep 9 03:29:44.432614 containerd[1609]: time="2025-09-09T03:29:44.432560697Z" level=info msg="CreateContainer within sandbox \"88ba986181534618ec8a4888650fb86f1863d212f75223d690528b3c7524f4c7\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"562df5873d9ef8fc71460d36b8e399d703ea4ab32fa340a4c49c1e4dcdcdcc5b\"" Sep 9 03:29:44.436077 containerd[1609]: time="2025-09-09T03:29:44.434812942Z" level=info msg="StartContainer for \"562df5873d9ef8fc71460d36b8e399d703ea4ab32fa340a4c49c1e4dcdcdcc5b\"" Sep 9 03:29:44.524570 containerd[1609]: time="2025-09-09T03:29:44.524519106Z" level=info msg="StartContainer for \"562df5873d9ef8fc71460d36b8e399d703ea4ab32fa340a4c49c1e4dcdcdcc5b\" returns successfully" Sep 9 03:29:46.304784 kubelet[2856]: I0909 03:29:46.302584 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-2zdmz" podStartSLOduration=3.427321843 podStartE2EDuration="6.302522132s" podCreationTimestamp="2025-09-09 03:29:40 +0000 UTC" firstStartedPulling="2025-09-09 03:29:41.533166338 +0000 UTC m=+6.267806388" lastFinishedPulling="2025-09-09 03:29:44.408366626 +0000 UTC m=+9.143006677" observedRunningTime="2025-09-09 03:29:44.744504645 +0000 UTC m=+9.479144714" watchObservedRunningTime="2025-09-09 03:29:46.302522132 +0000 UTC m=+11.037162194" Sep 9 03:29:52.496787 sudo[1926]: pam_unix(sudo:session): session closed for user root Sep 9 03:29:52.652895 sshd[1922]: pam_unix(sshd:session): session closed for user core Sep 9 03:29:52.666009 systemd[1]: sshd@8-10.244.20.50:22-147.75.109.163:55374.service: Deactivated successfully. Sep 9 03:29:52.682043 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 03:29:52.690660 systemd-logind[1583]: Session 11 logged out. Waiting for processes to exit. Sep 9 03:29:52.699060 systemd-logind[1583]: Removed session 11. 
Sep 9 03:29:57.716714 kubelet[2856]: I0909 03:29:57.716580 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6d843fef-5325-41ef-9c9b-8f03b08928d8-typha-certs\") pod \"calico-typha-6b544c99b6-g46j9\" (UID: \"6d843fef-5325-41ef-9c9b-8f03b08928d8\") " pod="calico-system/calico-typha-6b544c99b6-g46j9" Sep 9 03:29:57.716714 kubelet[2856]: I0909 03:29:57.716677 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzkr\" (UniqueName: \"kubernetes.io/projected/6d843fef-5325-41ef-9c9b-8f03b08928d8-kube-api-access-5xzkr\") pod \"calico-typha-6b544c99b6-g46j9\" (UID: \"6d843fef-5325-41ef-9c9b-8f03b08928d8\") " pod="calico-system/calico-typha-6b544c99b6-g46j9" Sep 9 03:29:57.716714 kubelet[2856]: I0909 03:29:57.716730 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d843fef-5325-41ef-9c9b-8f03b08928d8-tigera-ca-bundle\") pod \"calico-typha-6b544c99b6-g46j9\" (UID: \"6d843fef-5325-41ef-9c9b-8f03b08928d8\") " pod="calico-system/calico-typha-6b544c99b6-g46j9" Sep 9 03:29:58.018688 kubelet[2856]: I0909 03:29:58.017847 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-cni-log-dir\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.018688 kubelet[2856]: I0909 03:29:58.017902 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-flexvol-driver-host\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " 
pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.018688 kubelet[2856]: I0909 03:29:58.017936 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-var-lib-calico\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.018688 kubelet[2856]: I0909 03:29:58.017964 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-var-run-calico\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.018688 kubelet[2856]: I0909 03:29:58.017994 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46ttt\" (UniqueName: \"kubernetes.io/projected/ad129935-2cf5-4dff-9d49-0cc13731bda3-kube-api-access-46ttt\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.019076 kubelet[2856]: I0909 03:29:58.018022 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad129935-2cf5-4dff-9d49-0cc13731bda3-tigera-ca-bundle\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.019076 kubelet[2856]: I0909 03:29:58.018050 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-cni-bin-dir\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 
03:29:58.019076 kubelet[2856]: I0909 03:29:58.018076 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-cni-net-dir\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.019076 kubelet[2856]: I0909 03:29:58.018102 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-policysync\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.019076 kubelet[2856]: I0909 03:29:58.018128 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-lib-modules\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.019407 kubelet[2856]: I0909 03:29:58.018155 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ad129935-2cf5-4dff-9d49-0cc13731bda3-node-certs\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.019407 kubelet[2856]: I0909 03:29:58.018197 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ad129935-2cf5-4dff-9d49-0cc13731bda3-xtables-lock\") pod \"calico-node-pcp84\" (UID: \"ad129935-2cf5-4dff-9d49-0cc13731bda3\") " pod="calico-system/calico-node-pcp84" Sep 9 03:29:58.135201 kubelet[2856]: E0909 03:29:58.132886 2856 driver-call.go:262] Failed to unmarshal output 
for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.135201 kubelet[2856]: W0909 03:29:58.133072 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.136326 kubelet[2856]: E0909 03:29:58.136160 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.136326 kubelet[2856]: W0909 03:29:58.136190 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.136326 kubelet[2856]: E0909 03:29:58.136226 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.137492 kubelet[2856]: E0909 03:29:58.137281 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.139871 kubelet[2856]: E0909 03:29:58.139840 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.139871 kubelet[2856]: W0909 03:29:58.139867 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.140214 kubelet[2856]: E0909 03:29:58.139893 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.142053 kubelet[2856]: E0909 03:29:58.141544 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.142053 kubelet[2856]: W0909 03:29:58.141926 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.142974 kubelet[2856]: E0909 03:29:58.142515 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.150204 kubelet[2856]: E0909 03:29:58.150163 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.150975 kubelet[2856]: W0909 03:29:58.150857 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.151635 kubelet[2856]: E0909 03:29:58.151581 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.189863 kubelet[2856]: E0909 03:29:58.189785 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.190488 kubelet[2856]: W0909 03:29:58.190141 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.190488 kubelet[2856]: E0909 03:29:58.190220 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.195914 kubelet[2856]: E0909 03:29:58.193119 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.195914 kubelet[2856]: W0909 03:29:58.193140 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.195914 kubelet[2856]: E0909 03:29:58.193158 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.200933 containerd[1609]: time="2025-09-09T03:29:58.196741038Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b544c99b6-g46j9,Uid:6d843fef-5325-41ef-9c9b-8f03b08928d8,Namespace:calico-system,Attempt:0,}" Sep 9 03:29:58.300897 containerd[1609]: time="2025-09-09T03:29:58.299507231Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:29:58.308399 containerd[1609]: time="2025-09-09T03:29:58.301541928Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:29:58.308399 containerd[1609]: time="2025-09-09T03:29:58.301578235Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:29:58.308399 containerd[1609]: time="2025-09-09T03:29:58.301792077Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:29:58.334882 kubelet[2856]: E0909 03:29:58.334256 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:29:58.353954 kubelet[2856]: E0909 03:29:58.353907 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.354523 kubelet[2856]: W0909 03:29:58.354134 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.355450 kubelet[2856]: E0909 03:29:58.354173 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.362592 kubelet[2856]: E0909 03:29:58.360922 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.362592 kubelet[2856]: W0909 03:29:58.360964 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.362592 kubelet[2856]: E0909 03:29:58.360995 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.362592 kubelet[2856]: E0909 03:29:58.361888 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.362592 kubelet[2856]: W0909 03:29:58.361905 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.362592 kubelet[2856]: E0909 03:29:58.361922 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.365066 kubelet[2856]: E0909 03:29:58.363119 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.365066 kubelet[2856]: W0909 03:29:58.363140 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.365066 kubelet[2856]: E0909 03:29:58.363161 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.365481 kubelet[2856]: E0909 03:29:58.365336 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.365481 kubelet[2856]: W0909 03:29:58.365356 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.365481 kubelet[2856]: E0909 03:29:58.365373 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.366056 kubelet[2856]: E0909 03:29:58.365908 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.366056 kubelet[2856]: W0909 03:29:58.365928 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.366056 kubelet[2856]: E0909 03:29:58.365946 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.367083 kubelet[2856]: E0909 03:29:58.366518 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.367083 kubelet[2856]: W0909 03:29:58.366538 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.367083 kubelet[2856]: E0909 03:29:58.366958 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.369141 kubelet[2856]: E0909 03:29:58.368952 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.369141 kubelet[2856]: W0909 03:29:58.368973 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.369141 kubelet[2856]: E0909 03:29:58.368990 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.369632 kubelet[2856]: E0909 03:29:58.369495 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.369632 kubelet[2856]: W0909 03:29:58.369514 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.369632 kubelet[2856]: E0909 03:29:58.369530 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.370857 kubelet[2856]: E0909 03:29:58.370516 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.370857 kubelet[2856]: W0909 03:29:58.370536 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.370857 kubelet[2856]: E0909 03:29:58.370553 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.372167 kubelet[2856]: E0909 03:29:58.371934 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.372167 kubelet[2856]: W0909 03:29:58.371955 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.372167 kubelet[2856]: E0909 03:29:58.371972 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.373876 kubelet[2856]: E0909 03:29:58.373263 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.373876 kubelet[2856]: W0909 03:29:58.373295 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.373876 kubelet[2856]: E0909 03:29:58.373313 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.376317 kubelet[2856]: E0909 03:29:58.375550 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.376317 kubelet[2856]: W0909 03:29:58.375909 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.376317 kubelet[2856]: E0909 03:29:58.375931 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.379341 kubelet[2856]: E0909 03:29:58.378540 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.379341 kubelet[2856]: W0909 03:29:58.378561 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.379341 kubelet[2856]: E0909 03:29:58.378580 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.380416 kubelet[2856]: E0909 03:29:58.380232 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.380416 kubelet[2856]: W0909 03:29:58.380252 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.380416 kubelet[2856]: E0909 03:29:58.380285 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.382046 kubelet[2856]: E0909 03:29:58.381875 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.382046 kubelet[2856]: W0909 03:29:58.381896 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.382046 kubelet[2856]: E0909 03:29:58.381915 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.383140 kubelet[2856]: E0909 03:29:58.382395 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.383140 kubelet[2856]: W0909 03:29:58.382415 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.383140 kubelet[2856]: E0909 03:29:58.382433 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.384240 kubelet[2856]: E0909 03:29:58.383673 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.384240 kubelet[2856]: W0909 03:29:58.383694 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.384240 kubelet[2856]: E0909 03:29:58.383712 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.386115 kubelet[2856]: E0909 03:29:58.385883 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.386115 kubelet[2856]: W0909 03:29:58.385905 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.386115 kubelet[2856]: E0909 03:29:58.385927 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.389633 kubelet[2856]: E0909 03:29:58.388599 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.389633 kubelet[2856]: W0909 03:29:58.388622 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.389633 kubelet[2856]: E0909 03:29:58.388642 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.426726 kubelet[2856]: E0909 03:29:58.424656 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.426726 kubelet[2856]: W0909 03:29:58.424724 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.426726 kubelet[2856]: E0909 03:29:58.424775 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.426726 kubelet[2856]: I0909 03:29:58.425804 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664-socket-dir\") pod \"csi-node-driver-hzbnh\" (UID: \"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664\") " pod="calico-system/csi-node-driver-hzbnh" Sep 9 03:29:58.432023 kubelet[2856]: E0909 03:29:58.427935 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.432023 kubelet[2856]: W0909 03:29:58.427956 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.432517 kubelet[2856]: E0909 03:29:58.432365 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.434391 kubelet[2856]: E0909 03:29:58.433634 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.434391 kubelet[2856]: W0909 03:29:58.433781 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.434391 kubelet[2856]: E0909 03:29:58.433806 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.435094 kubelet[2856]: E0909 03:29:58.434985 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.435414 kubelet[2856]: W0909 03:29:58.435293 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.435666 kubelet[2856]: E0909 03:29:58.435638 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.435987 kubelet[2856]: I0909 03:29:58.435942 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkcbp\" (UniqueName: \"kubernetes.io/projected/4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664-kube-api-access-jkcbp\") pod \"csi-node-driver-hzbnh\" (UID: \"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664\") " pod="calico-system/csi-node-driver-hzbnh" Sep 9 03:29:58.437538 kubelet[2856]: E0909 03:29:58.437500 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.437941 kubelet[2856]: W0909 03:29:58.437904 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.438298 kubelet[2856]: E0909 03:29:58.438114 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.439636 kubelet[2856]: I0909 03:29:58.438639 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664-varrun\") pod \"csi-node-driver-hzbnh\" (UID: \"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664\") " pod="calico-system/csi-node-driver-hzbnh" Sep 9 03:29:58.441043 kubelet[2856]: E0909 03:29:58.440213 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.441043 kubelet[2856]: W0909 03:29:58.440289 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.441043 kubelet[2856]: E0909 03:29:58.440312 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.441043 kubelet[2856]: I0909 03:29:58.440340 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664-kubelet-dir\") pod \"csi-node-driver-hzbnh\" (UID: \"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664\") " pod="calico-system/csi-node-driver-hzbnh" Sep 9 03:29:58.443774 kubelet[2856]: E0909 03:29:58.442993 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.443774 kubelet[2856]: W0909 03:29:58.443024 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.443774 kubelet[2856]: E0909 03:29:58.443056 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.443774 kubelet[2856]: I0909 03:29:58.443083 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664-registration-dir\") pod \"csi-node-driver-hzbnh\" (UID: \"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664\") " pod="calico-system/csi-node-driver-hzbnh" Sep 9 03:29:58.447642 kubelet[2856]: E0909 03:29:58.446453 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.447642 kubelet[2856]: W0909 03:29:58.446476 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.447642 kubelet[2856]: E0909 03:29:58.446814 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.451146 kubelet[2856]: E0909 03:29:58.450068 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.451146 kubelet[2856]: W0909 03:29:58.450089 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.453916 kubelet[2856]: E0909 03:29:58.453596 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.453916 kubelet[2856]: E0909 03:29:58.453707 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.454486 kubelet[2856]: W0909 03:29:58.454132 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.456454 kubelet[2856]: E0909 03:29:58.455690 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.459533 kubelet[2856]: E0909 03:29:58.459487 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.461048 kubelet[2856]: W0909 03:29:58.459716 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.464607 kubelet[2856]: E0909 03:29:58.463644 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.464607 kubelet[2856]: W0909 03:29:58.463664 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.464607 kubelet[2856]: E0909 03:29:58.463686 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.475122 kubelet[2856]: E0909 03:29:58.474123 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.476762 kubelet[2856]: E0909 03:29:58.474798 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.476762 kubelet[2856]: W0909 03:29:58.476685 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.476762 kubelet[2856]: E0909 03:29:58.476713 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.479469 kubelet[2856]: E0909 03:29:58.479241 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.479469 kubelet[2856]: W0909 03:29:58.479323 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.479469 kubelet[2856]: E0909 03:29:58.479345 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.480010 kubelet[2856]: E0909 03:29:58.479929 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.480010 kubelet[2856]: W0909 03:29:58.479948 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.480010 kubelet[2856]: E0909 03:29:58.479965 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.495670 containerd[1609]: time="2025-09-09T03:29:58.494687793Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pcp84,Uid:ad129935-2cf5-4dff-9d49-0cc13731bda3,Namespace:calico-system,Attempt:0,}" Sep 9 03:29:58.547481 kubelet[2856]: E0909 03:29:58.547071 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.547481 kubelet[2856]: W0909 03:29:58.547107 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.547481 kubelet[2856]: E0909 03:29:58.547179 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.550119 kubelet[2856]: E0909 03:29:58.550076 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.550683 kubelet[2856]: W0909 03:29:58.550237 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.550683 kubelet[2856]: E0909 03:29:58.550269 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.553043 kubelet[2856]: E0909 03:29:58.551432 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.553043 kubelet[2856]: W0909 03:29:58.552949 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.553043 kubelet[2856]: E0909 03:29:58.552977 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.558398 kubelet[2856]: E0909 03:29:58.555828 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.558398 kubelet[2856]: W0909 03:29:58.555849 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.558398 kubelet[2856]: E0909 03:29:58.556569 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.563565 kubelet[2856]: E0909 03:29:58.560313 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.563565 kubelet[2856]: W0909 03:29:58.560337 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.563565 kubelet[2856]: E0909 03:29:58.560454 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.563565 kubelet[2856]: E0909 03:29:58.561630 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.563565 kubelet[2856]: W0909 03:29:58.561648 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.565768 kubelet[2856]: E0909 03:29:58.565419 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.566628 kubelet[2856]: E0909 03:29:58.566480 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.566628 kubelet[2856]: W0909 03:29:58.566511 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.566872 kubelet[2856]: E0909 03:29:58.566670 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.567945 kubelet[2856]: E0909 03:29:58.567925 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.568244 kubelet[2856]: W0909 03:29:58.568095 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.569467 kubelet[2856]: E0909 03:29:58.569344 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.569467 kubelet[2856]: W0909 03:29:58.569387 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.570971 kubelet[2856]: E0909 03:29:58.570805 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.570971 kubelet[2856]: E0909 03:29:58.570934 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.571226 kubelet[2856]: E0909 03:29:58.571021 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.571226 kubelet[2856]: W0909 03:29:58.571040 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.571226 kubelet[2856]: E0909 03:29:58.571101 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.576926 kubelet[2856]: E0909 03:29:58.576890 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.577098 kubelet[2856]: W0909 03:29:58.576925 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.577098 kubelet[2856]: E0909 03:29:58.577047 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.581087 kubelet[2856]: E0909 03:29:58.580857 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.581087 kubelet[2856]: W0909 03:29:58.580887 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.581087 kubelet[2856]: E0909 03:29:58.580939 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.584697 kubelet[2856]: E0909 03:29:58.583615 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.584697 kubelet[2856]: W0909 03:29:58.583642 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.588811 kubelet[2856]: E0909 03:29:58.588736 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.588811 kubelet[2856]: W0909 03:29:58.588785 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.596308 kubelet[2856]: E0909 03:29:58.592729 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.596308 kubelet[2856]: W0909 03:29:58.593064 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], 
error: executable file not found in $PATH, output: "" Sep 9 03:29:58.596308 kubelet[2856]: E0909 03:29:58.593329 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.596308 kubelet[2856]: E0909 03:29:58.593383 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.596308 kubelet[2856]: E0909 03:29:58.593407 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.596308 kubelet[2856]: E0909 03:29:58.596107 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.596308 kubelet[2856]: W0909 03:29:58.596237 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.601148 kubelet[2856]: E0909 03:29:58.600192 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.601725 kubelet[2856]: E0909 03:29:58.601321 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.601725 kubelet[2856]: W0909 03:29:58.601341 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.601725 kubelet[2856]: E0909 03:29:58.601612 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.601725 kubelet[2856]: W0909 03:29:58.601627 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.601725 kubelet[2856]: E0909 03:29:58.601901 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.601725 kubelet[2856]: W0909 03:29:58.601916 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.603704 kubelet[2856]: E0909 03:29:58.602231 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.603704 kubelet[2856]: W0909 03:29:58.602247 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.603704 kubelet[2856]: E0909 03:29:58.602620 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 
9 03:29:58.603704 kubelet[2856]: W0909 03:29:58.602636 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.603704 kubelet[2856]: E0909 03:29:58.602655 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.603704 kubelet[2856]: E0909 03:29:58.602724 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.603704 kubelet[2856]: E0909 03:29:58.603293 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.604068 kubelet[2856]: W0909 03:29:58.603780 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.604068 kubelet[2856]: E0909 03:29:58.603805 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.604068 kubelet[2856]: E0909 03:29:58.603837 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.613615 kubelet[2856]: E0909 03:29:58.604146 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.613615 kubelet[2856]: W0909 03:29:58.604162 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.613615 kubelet[2856]: E0909 03:29:58.604177 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.613615 kubelet[2856]: E0909 03:29:58.605952 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.613615 kubelet[2856]: W0909 03:29:58.605968 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.613615 kubelet[2856]: E0909 03:29:58.605984 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.613615 kubelet[2856]: E0909 03:29:58.608578 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:29:58.613615 kubelet[2856]: E0909 03:29:58.612232 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.613615 kubelet[2856]: W0909 03:29:58.612256 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.613615 kubelet[2856]: E0909 03:29:58.612288 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.617176 kubelet[2856]: E0909 03:29:58.612322 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.629618 containerd[1609]: time="2025-09-09T03:29:58.625228180Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:29:58.629618 containerd[1609]: time="2025-09-09T03:29:58.625375404Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:29:58.629618 containerd[1609]: time="2025-09-09T03:29:58.625426822Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:29:58.629618 containerd[1609]: time="2025-09-09T03:29:58.625721677Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:29:58.647016 kubelet[2856]: E0909 03:29:58.646559 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:29:58.647016 kubelet[2856]: W0909 03:29:58.646590 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:29:58.647016 kubelet[2856]: E0909 03:29:58.646620 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:29:58.827735 containerd[1609]: time="2025-09-09T03:29:58.827382415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b544c99b6-g46j9,Uid:6d843fef-5325-41ef-9c9b-8f03b08928d8,Namespace:calico-system,Attempt:0,} returns sandbox id \"440e63cdfd3871d10cff20b3100301061597dc5cfd9f436c0e956a5f21a64c9c\"" Sep 9 03:29:58.853441 containerd[1609]: time="2025-09-09T03:29:58.853378390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 03:29:58.898686 containerd[1609]: time="2025-09-09T03:29:58.898519073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pcp84,Uid:ad129935-2cf5-4dff-9d49-0cc13731bda3,Namespace:calico-system,Attempt:0,} returns sandbox id \"eaf6d9e136ef4c576f8b50e2eb54acd783db4152ab3f1daed1c7473a1945713a\"" Sep 9 03:29:59.663815 kubelet[2856]: E0909 03:29:59.662434 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:30:01.499385 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3136394455.mount: Deactivated successfully. Sep 9 03:30:01.664883 kubelet[2856]: E0909 03:30:01.664240 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:30:03.664313 kubelet[2856]: E0909 03:30:03.663303 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:30:04.108480 containerd[1609]: time="2025-09-09T03:30:04.108328671Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:04.110433 containerd[1609]: time="2025-09-09T03:30:04.110165075Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 9 03:30:04.111690 containerd[1609]: time="2025-09-09T03:30:04.111328686Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:04.124055 containerd[1609]: time="2025-09-09T03:30:04.123992839Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:04.125656 containerd[1609]: time="2025-09-09T03:30:04.125569525Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id 
\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 5.272121557s" Sep 9 03:30:04.125656 containerd[1609]: time="2025-09-09T03:30:04.125627458Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 9 03:30:04.130044 containerd[1609]: time="2025-09-09T03:30:04.129906125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 03:30:04.152119 containerd[1609]: time="2025-09-09T03:30:04.150178214Z" level=info msg="CreateContainer within sandbox \"440e63cdfd3871d10cff20b3100301061597dc5cfd9f436c0e956a5f21a64c9c\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 03:30:04.174778 containerd[1609]: time="2025-09-09T03:30:04.174451352Z" level=info msg="CreateContainer within sandbox \"440e63cdfd3871d10cff20b3100301061597dc5cfd9f436c0e956a5f21a64c9c\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f5717fff07409479a983e5b7cacebb35d13438e0123b1bb0d6cb08adcb52482d\"" Sep 9 03:30:04.175967 containerd[1609]: time="2025-09-09T03:30:04.175836463Z" level=info msg="StartContainer for \"f5717fff07409479a983e5b7cacebb35d13438e0123b1bb0d6cb08adcb52482d\"" Sep 9 03:30:04.303879 containerd[1609]: time="2025-09-09T03:30:04.303706870Z" level=info msg="StartContainer for \"f5717fff07409479a983e5b7cacebb35d13438e0123b1bb0d6cb08adcb52482d\" returns successfully" Sep 9 03:30:04.853353 kubelet[2856]: E0909 03:30:04.853299 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.853353 kubelet[2856]: W0909 03:30:04.853352 2856 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.853353 kubelet[2856]: E0909 03:30:04.853418 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:04.854602 kubelet[2856]: E0909 03:30:04.853945 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.854602 kubelet[2856]: W0909 03:30:04.853962 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.854602 kubelet[2856]: E0909 03:30:04.853994 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:04.854602 kubelet[2856]: E0909 03:30:04.854346 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.854602 kubelet[2856]: W0909 03:30:04.854362 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.854602 kubelet[2856]: E0909 03:30:04.854444 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:04.854995 kubelet[2856]: E0909 03:30:04.854735 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.854995 kubelet[2856]: W0909 03:30:04.854770 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.854995 kubelet[2856]: E0909 03:30:04.854788 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:04.855603 kubelet[2856]: E0909 03:30:04.855080 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.855603 kubelet[2856]: W0909 03:30:04.855095 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.855603 kubelet[2856]: E0909 03:30:04.855113 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:04.855603 kubelet[2856]: E0909 03:30:04.855537 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.855603 kubelet[2856]: W0909 03:30:04.855553 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.855603 kubelet[2856]: E0909 03:30:04.855569 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:04.855994 kubelet[2856]: E0909 03:30:04.855869 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.855994 kubelet[2856]: W0909 03:30:04.855884 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.855994 kubelet[2856]: E0909 03:30:04.855899 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:04.856357 kubelet[2856]: E0909 03:30:04.856187 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.856357 kubelet[2856]: W0909 03:30:04.856201 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.856357 kubelet[2856]: E0909 03:30:04.856217 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:04.856634 kubelet[2856]: E0909 03:30:04.856535 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.856634 kubelet[2856]: W0909 03:30:04.856552 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.856634 kubelet[2856]: E0909 03:30:04.856569 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:04.856974 kubelet[2856]: E0909 03:30:04.856928 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.857040 kubelet[2856]: W0909 03:30:04.856976 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.857040 kubelet[2856]: E0909 03:30:04.856994 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:04.857400 kubelet[2856]: E0909 03:30:04.857365 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.857400 kubelet[2856]: W0909 03:30:04.857397 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.857544 kubelet[2856]: E0909 03:30:04.857414 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:04.857764 kubelet[2856]: E0909 03:30:04.857722 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.857842 kubelet[2856]: W0909 03:30:04.857783 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.857842 kubelet[2856]: E0909 03:30:04.857804 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:04.858089 kubelet[2856]: E0909 03:30:04.858069 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:04.858152 kubelet[2856]: W0909 03:30:04.858089 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:04.858152 kubelet[2856]: E0909 03:30:04.858107 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:04.858456 kubelet[2856]: [the preceding three-message FlexVolume probe failure (driver-call.go:262, driver-call.go:149, plugins.go:691) repeated verbatim; only the timestamps differ, 03:30:04.858456 through 03:30:04.928912] Sep 9 03:30:04.928912 kubelet[2856]: E0909 03:30:04.928432 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:05.664631 kubelet[2856]: E0909 03:30:05.663427 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:30:05.837174 kubelet[2856]: I0909 03:30:05.837115 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 03:30:05.872020 kubelet[2856]: E0909 03:30:05.871821 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:05.872020 kubelet[2856]: W0909 03:30:05.871857 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:05.872020 kubelet[2856]: E0909 03:30:05.871890 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:05.873802 kubelet[2856]: E0909 03:30:05.873217 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:05.873802 kubelet[2856]: W0909 03:30:05.873240 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:05.873802 kubelet[2856]: E0909 03:30:05.873262 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:05.874267 kubelet[2856]: [the same three-message FlexVolume probe failure (driver-call.go:262, driver-call.go:149, plugins.go:691) repeated verbatim; only the timestamps differ, 03:30:05.874049 through 03:30:05.954445] Sep 9 03:30:05.955001 kubelet[2856]: E0909 03:30:05.954445 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 03:30:05.956547 kubelet[2856]: E0909 03:30:05.956527 2856 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 03:30:05.956810 kubelet[2856]: W0909 03:30:05.956697 2856 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 03:30:05.956810 kubelet[2856]: E0909 03:30:05.956731 2856 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 03:30:06.130874 containerd[1609]: time="2025-09-09T03:30:06.130774588Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:06.132482 containerd[1609]: time="2025-09-09T03:30:06.132077923Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 03:30:06.133879 containerd[1609]: time="2025-09-09T03:30:06.133828355Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:06.140783 containerd[1609]: time="2025-09-09T03:30:06.140466522Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:06.142053 containerd[1609]: time="2025-09-09T03:30:06.141883084Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 2.011896503s" Sep 9 03:30:06.142053 containerd[1609]: time="2025-09-09T03:30:06.141939650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 03:30:06.147650 containerd[1609]: time="2025-09-09T03:30:06.147481726Z" level=info msg="CreateContainer within sandbox \"eaf6d9e136ef4c576f8b50e2eb54acd783db4152ab3f1daed1c7473a1945713a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 03:30:06.165795 containerd[1609]: time="2025-09-09T03:30:06.165628357Z" level=info msg="CreateContainer within sandbox \"eaf6d9e136ef4c576f8b50e2eb54acd783db4152ab3f1daed1c7473a1945713a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d2cb0eef6c4c42d7d16768907077d4adf7ebf1079caba76ccbea4410fc138db7\"" Sep 9 03:30:06.166700 containerd[1609]: time="2025-09-09T03:30:06.166639828Z" level=info msg="StartContainer for \"d2cb0eef6c4c42d7d16768907077d4adf7ebf1079caba76ccbea4410fc138db7\"" Sep 9 03:30:06.300730 containerd[1609]: time="2025-09-09T03:30:06.300677048Z" level=info msg="StartContainer for \"d2cb0eef6c4c42d7d16768907077d4adf7ebf1079caba76ccbea4410fc138db7\" returns successfully" Sep 9 03:30:06.387475 containerd[1609]: time="2025-09-09T03:30:06.375120187Z" level=info msg="shim disconnected" id=d2cb0eef6c4c42d7d16768907077d4adf7ebf1079caba76ccbea4410fc138db7 namespace=k8s.io Sep 9 03:30:06.387475 containerd[1609]: time="2025-09-09T03:30:06.387172947Z" level=warning msg="cleaning up after shim disconnected" id=d2cb0eef6c4c42d7d16768907077d4adf7ebf1079caba76ccbea4410fc138db7 namespace=k8s.io Sep 9 03:30:06.387475 containerd[1609]: time="2025-09-09T03:30:06.387202670Z" level=info msg="cleaning up dead shim" 
namespace=k8s.io Sep 9 03:30:06.844774 containerd[1609]: time="2025-09-09T03:30:06.844509064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 03:30:06.875485 kubelet[2856]: I0909 03:30:06.873727 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b544c99b6-g46j9" podStartSLOduration=4.586894579 podStartE2EDuration="9.873682352s" podCreationTimestamp="2025-09-09 03:29:57 +0000 UTC" firstStartedPulling="2025-09-09 03:29:58.840631023 +0000 UTC m=+23.575271069" lastFinishedPulling="2025-09-09 03:30:04.127418788 +0000 UTC m=+28.862058842" observedRunningTime="2025-09-09 03:30:04.849869872 +0000 UTC m=+29.584509940" watchObservedRunningTime="2025-09-09 03:30:06.873682352 +0000 UTC m=+31.608322410" Sep 9 03:30:07.162856 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d2cb0eef6c4c42d7d16768907077d4adf7ebf1079caba76ccbea4410fc138db7-rootfs.mount: Deactivated successfully. Sep 9 03:30:07.663498 kubelet[2856]: E0909 03:30:07.662850 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:30:09.665455 kubelet[2856]: E0909 03:30:09.665133 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:30:11.662474 kubelet[2856]: E0909 03:30:11.662392 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
cni plugin not initialized" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:30:12.126273 containerd[1609]: time="2025-09-09T03:30:12.126148916Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:12.128372 containerd[1609]: time="2025-09-09T03:30:12.128271616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 03:30:12.129532 containerd[1609]: time="2025-09-09T03:30:12.129464490Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:12.141994 containerd[1609]: time="2025-09-09T03:30:12.141932526Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:12.144375 containerd[1609]: time="2025-09-09T03:30:12.144313896Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 5.299742907s" Sep 9 03:30:12.144465 containerd[1609]: time="2025-09-09T03:30:12.144372479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 03:30:12.147951 containerd[1609]: time="2025-09-09T03:30:12.147809661Z" level=info msg="CreateContainer within sandbox \"eaf6d9e136ef4c576f8b50e2eb54acd783db4152ab3f1daed1c7473a1945713a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 
03:30:12.209770 containerd[1609]: time="2025-09-09T03:30:12.208782699Z" level=info msg="CreateContainer within sandbox \"eaf6d9e136ef4c576f8b50e2eb54acd783db4152ab3f1daed1c7473a1945713a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a73aa45026f1316dd31dc02075bd4c681c09db8e7510f3263600508929e4ba57\"" Sep 9 03:30:12.210904 containerd[1609]: time="2025-09-09T03:30:12.210868276Z" level=info msg="StartContainer for \"a73aa45026f1316dd31dc02075bd4c681c09db8e7510f3263600508929e4ba57\"" Sep 9 03:30:12.423717 containerd[1609]: time="2025-09-09T03:30:12.423435345Z" level=info msg="StartContainer for \"a73aa45026f1316dd31dc02075bd4c681c09db8e7510f3263600508929e4ba57\" returns successfully" Sep 9 03:30:13.610132 kubelet[2856]: I0909 03:30:13.610034 2856 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 9 03:30:13.658497 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a73aa45026f1316dd31dc02075bd4c681c09db8e7510f3263600508929e4ba57-rootfs.mount: Deactivated successfully. 
Sep 9 03:30:13.671634 containerd[1609]: time="2025-09-09T03:30:13.667552380Z" level=info msg="shim disconnected" id=a73aa45026f1316dd31dc02075bd4c681c09db8e7510f3263600508929e4ba57 namespace=k8s.io
Sep 9 03:30:13.671634 containerd[1609]: time="2025-09-09T03:30:13.669967881Z" level=warning msg="cleaning up after shim disconnected" id=a73aa45026f1316dd31dc02075bd4c681c09db8e7510f3263600508929e4ba57 namespace=k8s.io
Sep 9 03:30:13.671634 containerd[1609]: time="2025-09-09T03:30:13.669995802Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 9 03:30:13.689524 containerd[1609]: time="2025-09-09T03:30:13.688587957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hzbnh,Uid:4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664,Namespace:calico-system,Attempt:0,}"
Sep 9 03:30:13.791786 kubelet[2856]: I0909 03:30:13.791265 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h68zf\" (UniqueName: \"kubernetes.io/projected/016e9bee-9c8a-4c0a-b6a2-ef76e3666084-kube-api-access-h68zf\") pod \"coredns-7c65d6cfc9-mfkn7\" (UID: \"016e9bee-9c8a-4c0a-b6a2-ef76e3666084\") " pod="kube-system/coredns-7c65d6cfc9-mfkn7"
Sep 9 03:30:13.792053 kubelet[2856]: I0909 03:30:13.791948 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60a1b3a8-da11-4959-aadb-274c0e91ef5a-tigera-ca-bundle\") pod \"calico-kube-controllers-567cf5f5d8-smmz5\" (UID: \"60a1b3a8-da11-4959-aadb-274c0e91ef5a\") " pod="calico-system/calico-kube-controllers-567cf5f5d8-smmz5"
Sep 9 03:30:13.793083 kubelet[2856]: I0909 03:30:13.792223 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6e64133a-dcc9-4620-a059-3825b5dc1d54-calico-apiserver-certs\") pod \"calico-apiserver-8dcb5dd78-47vcp\" (UID: \"6e64133a-dcc9-4620-a059-3825b5dc1d54\") " pod="calico-apiserver/calico-apiserver-8dcb5dd78-47vcp"
Sep 9 03:30:13.793083 kubelet[2856]: I0909 03:30:13.792620 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d110bb2c-45e8-4b87-97ca-6d0d4a087fb1-config-volume\") pod \"coredns-7c65d6cfc9-grrnb\" (UID: \"d110bb2c-45e8-4b87-97ca-6d0d4a087fb1\") " pod="kube-system/coredns-7c65d6cfc9-grrnb"
Sep 9 03:30:13.794207 kubelet[2856]: I0909 03:30:13.793388 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5lb\" (UniqueName: \"kubernetes.io/projected/60a1b3a8-da11-4959-aadb-274c0e91ef5a-kube-api-access-lj5lb\") pod \"calico-kube-controllers-567cf5f5d8-smmz5\" (UID: \"60a1b3a8-da11-4959-aadb-274c0e91ef5a\") " pod="calico-system/calico-kube-controllers-567cf5f5d8-smmz5"
Sep 9 03:30:13.794207 kubelet[2856]: I0909 03:30:13.793434 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdfjd\" (UniqueName: \"kubernetes.io/projected/d110bb2c-45e8-4b87-97ca-6d0d4a087fb1-kube-api-access-qdfjd\") pod \"coredns-7c65d6cfc9-grrnb\" (UID: \"d110bb2c-45e8-4b87-97ca-6d0d4a087fb1\") " pod="kube-system/coredns-7c65d6cfc9-grrnb"
Sep 9 03:30:13.794207 kubelet[2856]: I0909 03:30:13.793469 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqt7q\" (UniqueName: \"kubernetes.io/projected/6e64133a-dcc9-4620-a059-3825b5dc1d54-kube-api-access-vqt7q\") pod \"calico-apiserver-8dcb5dd78-47vcp\" (UID: \"6e64133a-dcc9-4620-a059-3825b5dc1d54\") " pod="calico-apiserver/calico-apiserver-8dcb5dd78-47vcp"
Sep 9 03:30:13.794207 kubelet[2856]: I0909 03:30:13.793501 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/016e9bee-9c8a-4c0a-b6a2-ef76e3666084-config-volume\") pod \"coredns-7c65d6cfc9-mfkn7\" (UID: \"016e9bee-9c8a-4c0a-b6a2-ef76e3666084\") " pod="kube-system/coredns-7c65d6cfc9-mfkn7"
Sep 9 03:30:13.892807 containerd[1609]: time="2025-09-09T03:30:13.891946202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 9 03:30:13.894482 kubelet[2856]: I0909 03:30:13.894436 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e83bd5-7012-441b-9934-0559a0c6192c-config\") pod \"goldmane-7988f88666-vgmql\" (UID: \"c3e83bd5-7012-441b-9934-0559a0c6192c\") " pod="calico-system/goldmane-7988f88666-vgmql"
Sep 9 03:30:13.894606 kubelet[2856]: I0909 03:30:13.894497 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ee2b345f-7033-4c91-91dd-2cd3dbef2a5b-calico-apiserver-certs\") pod \"calico-apiserver-8dcb5dd78-sjn5k\" (UID: \"ee2b345f-7033-4c91-91dd-2cd3dbef2a5b\") " pod="calico-apiserver/calico-apiserver-8dcb5dd78-sjn5k"
Sep 9 03:30:13.894606 kubelet[2856]: I0909 03:30:13.894587 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbb3505-bd00-459e-814c-f3717bb189c3-whisker-ca-bundle\") pod \"whisker-d89f85994-nw628\" (UID: \"3fbb3505-bd00-459e-814c-f3717bb189c3\") " pod="calico-system/whisker-d89f85994-nw628"
Sep 9 03:30:13.896364 kubelet[2856]: I0909 03:30:13.894636 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj5m5\" (UniqueName: \"kubernetes.io/projected/3fbb3505-bd00-459e-814c-f3717bb189c3-kube-api-access-xj5m5\") pod \"whisker-d89f85994-nw628\" (UID: \"3fbb3505-bd00-459e-814c-f3717bb189c3\") " pod="calico-system/whisker-d89f85994-nw628"
Sep 9 03:30:13.896364 kubelet[2856]: I0909 03:30:13.894694 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e83bd5-7012-441b-9934-0559a0c6192c-goldmane-ca-bundle\") pod \"goldmane-7988f88666-vgmql\" (UID: \"c3e83bd5-7012-441b-9934-0559a0c6192c\") " pod="calico-system/goldmane-7988f88666-vgmql"
Sep 9 03:30:13.896364 kubelet[2856]: I0909 03:30:13.894723 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c3e83bd5-7012-441b-9934-0559a0c6192c-goldmane-key-pair\") pod \"goldmane-7988f88666-vgmql\" (UID: \"c3e83bd5-7012-441b-9934-0559a0c6192c\") " pod="calico-system/goldmane-7988f88666-vgmql"
Sep 9 03:30:13.896364 kubelet[2856]: I0909 03:30:13.894766 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwjxp\" (UniqueName: \"kubernetes.io/projected/c3e83bd5-7012-441b-9934-0559a0c6192c-kube-api-access-zwjxp\") pod \"goldmane-7988f88666-vgmql\" (UID: \"c3e83bd5-7012-441b-9934-0559a0c6192c\") " pod="calico-system/goldmane-7988f88666-vgmql"
Sep 9 03:30:13.896364 kubelet[2856]: I0909 03:30:13.894815 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hc7d\" (UniqueName: \"kubernetes.io/projected/ee2b345f-7033-4c91-91dd-2cd3dbef2a5b-kube-api-access-8hc7d\") pod \"calico-apiserver-8dcb5dd78-sjn5k\" (UID: \"ee2b345f-7033-4c91-91dd-2cd3dbef2a5b\") " pod="calico-apiserver/calico-apiserver-8dcb5dd78-sjn5k"
Sep 9 03:30:13.896796 kubelet[2856]: I0909 03:30:13.894900 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3fbb3505-bd00-459e-814c-f3717bb189c3-whisker-backend-key-pair\") pod \"whisker-d89f85994-nw628\" (UID: \"3fbb3505-bd00-459e-814c-f3717bb189c3\") " pod="calico-system/whisker-d89f85994-nw628"
Sep 9 03:30:14.023782 containerd[1609]: time="2025-09-09T03:30:14.022122917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-567cf5f5d8-smmz5,Uid:60a1b3a8-da11-4959-aadb-274c0e91ef5a,Namespace:calico-system,Attempt:0,}"
Sep 9 03:30:14.030186 containerd[1609]: time="2025-09-09T03:30:14.030147178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dcb5dd78-47vcp,Uid:6e64133a-dcc9-4620-a059-3825b5dc1d54,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 03:30:14.035159 containerd[1609]: time="2025-09-09T03:30:14.035117642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mfkn7,Uid:016e9bee-9c8a-4c0a-b6a2-ef76e3666084,Namespace:kube-system,Attempt:0,}"
Sep 9 03:30:14.064407 containerd[1609]: time="2025-09-09T03:30:14.064306562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-grrnb,Uid:d110bb2c-45e8-4b87-97ca-6d0d4a087fb1,Namespace:kube-system,Attempt:0,}"
Sep 9 03:30:14.088688 containerd[1609]: time="2025-09-09T03:30:14.088406227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d89f85994-nw628,Uid:3fbb3505-bd00-459e-814c-f3717bb189c3,Namespace:calico-system,Attempt:0,}"
Sep 9 03:30:14.115164 containerd[1609]: time="2025-09-09T03:30:14.115110782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dcb5dd78-sjn5k,Uid:ee2b345f-7033-4c91-91dd-2cd3dbef2a5b,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 03:30:14.118628 containerd[1609]: time="2025-09-09T03:30:14.118466749Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-vgmql,Uid:c3e83bd5-7012-441b-9934-0559a0c6192c,Namespace:calico-system,Attempt:0,}"
Sep 9 03:30:14.351134 containerd[1609]: time="2025-09-09T03:30:14.351072846Z" level=error msg="Failed to destroy network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.355118 containerd[1609]: time="2025-09-09T03:30:14.355074039Z" level=error msg="Failed to destroy network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.380010 containerd[1609]: time="2025-09-09T03:30:14.379941558Z" level=error msg="encountered an error cleaning up failed sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.386938 containerd[1609]: time="2025-09-09T03:30:14.386806773Z" level=error msg="encountered an error cleaning up failed sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.397531 containerd[1609]: time="2025-09-09T03:30:14.397227151Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-567cf5f5d8-smmz5,Uid:60a1b3a8-da11-4959-aadb-274c0e91ef5a,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.410423 containerd[1609]: time="2025-09-09T03:30:14.409468844Z" level=error msg="Failed to destroy network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.410423 containerd[1609]: time="2025-09-09T03:30:14.409943186Z" level=error msg="encountered an error cleaning up failed sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.410423 containerd[1609]: time="2025-09-09T03:30:14.410012993Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mfkn7,Uid:016e9bee-9c8a-4c0a-b6a2-ef76e3666084,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.422250 containerd[1609]: time="2025-09-09T03:30:14.422189638Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hzbnh,Uid:4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.422722 kubelet[2856]: E0909 03:30:14.422657 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.423052 kubelet[2856]: E0909 03:30:14.422893 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.423293 kubelet[2856]: E0909 03:30:14.423250 2856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-567cf5f5d8-smmz5"
Sep 9 03:30:14.423514 kubelet[2856]: E0909 03:30:14.423478 2856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-567cf5f5d8-smmz5"
Sep 9 03:30:14.423781 kubelet[2856]: E0909 03:30:14.423396 2856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-mfkn7"
Sep 9 03:30:14.423937 kubelet[2856]: E0909 03:30:14.423908 2856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-mfkn7"
Sep 9 03:30:14.424328 kubelet[2856]: E0909 03:30:14.424172 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-mfkn7_kube-system(016e9bee-9c8a-4c0a-b6a2-ef76e3666084)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-mfkn7_kube-system(016e9bee-9c8a-4c0a-b6a2-ef76e3666084)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-mfkn7" podUID="016e9bee-9c8a-4c0a-b6a2-ef76e3666084"
Sep 9 03:30:14.424923 kubelet[2856]: E0909 03:30:14.424099 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-567cf5f5d8-smmz5_calico-system(60a1b3a8-da11-4959-aadb-274c0e91ef5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-567cf5f5d8-smmz5_calico-system(60a1b3a8-da11-4959-aadb-274c0e91ef5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-567cf5f5d8-smmz5" podUID="60a1b3a8-da11-4959-aadb-274c0e91ef5a"
Sep 9 03:30:14.428834 containerd[1609]: time="2025-09-09T03:30:14.428090396Z" level=error msg="Failed to destroy network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.428834 containerd[1609]: time="2025-09-09T03:30:14.428591891Z" level=error msg="encountered an error cleaning up failed sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.428834 containerd[1609]: time="2025-09-09T03:30:14.428658644Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-grrnb,Uid:d110bb2c-45e8-4b87-97ca-6d0d4a087fb1,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.429573 kubelet[2856]: E0909 03:30:14.429534 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.429978 kubelet[2856]: E0909 03:30:14.429809 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.430259 kubelet[2856]: E0909 03:30:14.430100 2856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hzbnh"
Sep 9 03:30:14.430431 kubelet[2856]: E0909 03:30:14.430400 2856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hzbnh"
Sep 9 03:30:14.430893 kubelet[2856]: E0909 03:30:14.430199 2856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-grrnb"
Sep 9 03:30:14.431711 kubelet[2856]: E0909 03:30:14.431035 2856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-grrnb"
Sep 9 03:30:14.432613 kubelet[2856]: E0909 03:30:14.432246 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-grrnb_kube-system(d110bb2c-45e8-4b87-97ca-6d0d4a087fb1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-grrnb_kube-system(d110bb2c-45e8-4b87-97ca-6d0d4a087fb1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-grrnb" podUID="d110bb2c-45e8-4b87-97ca-6d0d4a087fb1"
Sep 9 03:30:14.432613 kubelet[2856]: E0909 03:30:14.431947 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hzbnh_calico-system(4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hzbnh_calico-system(4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664"
Sep 9 03:30:14.519774 containerd[1609]: time="2025-09-09T03:30:14.518419011Z" level=error msg="Failed to destroy network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.520221 containerd[1609]: time="2025-09-09T03:30:14.520073189Z" level=error msg="encountered an error cleaning up failed sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.520391 containerd[1609]: time="2025-09-09T03:30:14.520353097Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d89f85994-nw628,Uid:3fbb3505-bd00-459e-814c-f3717bb189c3,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 03:30:14.520989 kubelet[2856]: E0909 03:30:14.520934 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.521240 kubelet[2856]: E0909 03:30:14.521208 2856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d89f85994-nw628" Sep 9 03:30:14.521799 kubelet[2856]: E0909 03:30:14.521741 2856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d89f85994-nw628" Sep 9 03:30:14.522055 kubelet[2856]: E0909 03:30:14.521996 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-d89f85994-nw628_calico-system(3fbb3505-bd00-459e-814c-f3717bb189c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-d89f85994-nw628_calico-system(3fbb3505-bd00-459e-814c-f3717bb189c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-d89f85994-nw628" 
podUID="3fbb3505-bd00-459e-814c-f3717bb189c3" Sep 9 03:30:14.525946 containerd[1609]: time="2025-09-09T03:30:14.525895926Z" level=error msg="Failed to destroy network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.526444 containerd[1609]: time="2025-09-09T03:30:14.526403791Z" level=error msg="encountered an error cleaning up failed sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.526562 containerd[1609]: time="2025-09-09T03:30:14.526474081Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dcb5dd78-47vcp,Uid:6e64133a-dcc9-4620-a059-3825b5dc1d54,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.527010 kubelet[2856]: E0909 03:30:14.526804 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.527010 kubelet[2856]: E0909 03:30:14.526858 2856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dcb5dd78-47vcp" Sep 9 03:30:14.527010 kubelet[2856]: E0909 03:30:14.526886 2856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dcb5dd78-47vcp" Sep 9 03:30:14.527276 kubelet[2856]: E0909 03:30:14.526932 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dcb5dd78-47vcp_calico-apiserver(6e64133a-dcc9-4620-a059-3825b5dc1d54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dcb5dd78-47vcp_calico-apiserver(6e64133a-dcc9-4620-a059-3825b5dc1d54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dcb5dd78-47vcp" podUID="6e64133a-dcc9-4620-a059-3825b5dc1d54" Sep 9 03:30:14.536393 containerd[1609]: time="2025-09-09T03:30:14.536287770Z" level=error msg="Failed to destroy network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.537576 containerd[1609]: time="2025-09-09T03:30:14.537371910Z" level=error msg="encountered an error cleaning up failed sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.537576 containerd[1609]: time="2025-09-09T03:30:14.537468485Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-vgmql,Uid:c3e83bd5-7012-441b-9934-0559a0c6192c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.539840 kubelet[2856]: E0909 03:30:14.537798 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.539840 kubelet[2856]: E0909 03:30:14.537867 2856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-7988f88666-vgmql" Sep 9 03:30:14.539840 kubelet[2856]: E0909 03:30:14.537909 2856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-vgmql" Sep 9 03:30:14.540043 kubelet[2856]: E0909 03:30:14.537961 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-vgmql_calico-system(c3e83bd5-7012-441b-9934-0559a0c6192c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-vgmql_calico-system(c3e83bd5-7012-441b-9934-0559a0c6192c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-vgmql" podUID="c3e83bd5-7012-441b-9934-0559a0c6192c" Sep 9 03:30:14.548692 containerd[1609]: time="2025-09-09T03:30:14.548630973Z" level=error msg="Failed to destroy network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.549244 containerd[1609]: time="2025-09-09T03:30:14.549198112Z" level=error msg="encountered an error cleaning up failed sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\", marking sandbox state as SANDBOX_UNKNOWN" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.549424 containerd[1609]: time="2025-09-09T03:30:14.549290128Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dcb5dd78-sjn5k,Uid:ee2b345f-7033-4c91-91dd-2cd3dbef2a5b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.549777 kubelet[2856]: E0909 03:30:14.549580 2856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:14.549777 kubelet[2856]: E0909 03:30:14.549674 2856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dcb5dd78-sjn5k" Sep 9 03:30:14.549777 kubelet[2856]: E0909 03:30:14.549709 2856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dcb5dd78-sjn5k" Sep 9 03:30:14.550911 kubelet[2856]: E0909 03:30:14.549779 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dcb5dd78-sjn5k_calico-apiserver(ee2b345f-7033-4c91-91dd-2cd3dbef2a5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dcb5dd78-sjn5k_calico-apiserver(ee2b345f-7033-4c91-91dd-2cd3dbef2a5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dcb5dd78-sjn5k" podUID="ee2b345f-7033-4c91-91dd-2cd3dbef2a5b" Sep 9 03:30:14.671686 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2-shm.mount: Deactivated successfully. 
Sep 9 03:30:14.889182 kubelet[2856]: I0909 03:30:14.889140 2856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:14.894655 kubelet[2856]: I0909 03:30:14.894075 2856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:30:14.912387 kubelet[2856]: I0909 03:30:14.911816 2856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:14.915709 kubelet[2856]: I0909 03:30:14.915669 2856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:14.925301 containerd[1609]: time="2025-09-09T03:30:14.925054637Z" level=info msg="StopPodSandbox for \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\"" Sep 9 03:30:14.925870 containerd[1609]: time="2025-09-09T03:30:14.925843903Z" level=info msg="StopPodSandbox for \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\"" Sep 9 03:30:14.927337 containerd[1609]: time="2025-09-09T03:30:14.927287291Z" level=info msg="Ensure that sandbox feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e in task-service has been cleanup successfully" Sep 9 03:30:14.927772 containerd[1609]: time="2025-09-09T03:30:14.927681518Z" level=info msg="Ensure that sandbox c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5 in task-service has been cleanup successfully" Sep 9 03:30:14.931662 containerd[1609]: time="2025-09-09T03:30:14.931096082Z" level=info msg="StopPodSandbox for \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\"" Sep 9 03:30:14.931662 containerd[1609]: time="2025-09-09T03:30:14.931351400Z" level=info msg="Ensure that sandbox 
3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2 in task-service has been cleanup successfully" Sep 9 03:30:14.933164 containerd[1609]: time="2025-09-09T03:30:14.933133337Z" level=info msg="StopPodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\"" Sep 9 03:30:14.933817 containerd[1609]: time="2025-09-09T03:30:14.933786426Z" level=info msg="Ensure that sandbox 1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656 in task-service has been cleanup successfully" Sep 9 03:30:14.939010 kubelet[2856]: I0909 03:30:14.938972 2856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:14.941383 containerd[1609]: time="2025-09-09T03:30:14.941129533Z" level=info msg="StopPodSandbox for \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\"" Sep 9 03:30:14.942773 containerd[1609]: time="2025-09-09T03:30:14.942688033Z" level=info msg="Ensure that sandbox 422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6 in task-service has been cleanup successfully" Sep 9 03:30:14.962094 kubelet[2856]: I0909 03:30:14.960225 2856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:14.962275 containerd[1609]: time="2025-09-09T03:30:14.961845259Z" level=info msg="StopPodSandbox for \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\"" Sep 9 03:30:14.962275 containerd[1609]: time="2025-09-09T03:30:14.962233131Z" level=info msg="Ensure that sandbox dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be in task-service has been cleanup successfully" Sep 9 03:30:14.965103 kubelet[2856]: I0909 03:30:14.965074 2856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 
03:30:14.965727 containerd[1609]: time="2025-09-09T03:30:14.965682555Z" level=info msg="StopPodSandbox for \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\"" Sep 9 03:30:14.966032 containerd[1609]: time="2025-09-09T03:30:14.965911880Z" level=info msg="Ensure that sandbox 7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b in task-service has been cleanup successfully" Sep 9 03:30:14.971931 kubelet[2856]: I0909 03:30:14.971558 2856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:14.975556 containerd[1609]: time="2025-09-09T03:30:14.973553347Z" level=info msg="StopPodSandbox for \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\"" Sep 9 03:30:14.975556 containerd[1609]: time="2025-09-09T03:30:14.974602648Z" level=info msg="Ensure that sandbox b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f in task-service has been cleanup successfully" Sep 9 03:30:15.047287 containerd[1609]: time="2025-09-09T03:30:15.047184041Z" level=error msg="StopPodSandbox for \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\" failed" error="failed to destroy network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:15.047957 kubelet[2856]: E0909 03:30:15.047719 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:15.054988 kubelet[2856]: E0909 03:30:15.047830 2856 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e"} Sep 9 03:30:15.055210 kubelet[2856]: E0909 03:30:15.055181 2856 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"6e64133a-dcc9-4620-a059-3825b5dc1d54\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 9 03:30:15.055411 kubelet[2856]: E0909 03:30:15.055375 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"6e64133a-dcc9-4620-a059-3825b5dc1d54\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dcb5dd78-47vcp" podUID="6e64133a-dcc9-4620-a059-3825b5dc1d54" Sep 9 03:30:15.102370 containerd[1609]: time="2025-09-09T03:30:15.102312461Z" level=error msg="StopPodSandbox for \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\" failed" error="failed to destroy network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 9 03:30:15.102959 kubelet[2856]: E0909 03:30:15.102903 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:15.103061 kubelet[2856]: E0909 03:30:15.102978 2856 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f"} Sep 9 03:30:15.103126 kubelet[2856]: E0909 03:30:15.103084 2856 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"016e9bee-9c8a-4c0a-b6a2-ef76e3666084\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 9 03:30:15.103238 kubelet[2856]: E0909 03:30:15.103131 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"016e9bee-9c8a-4c0a-b6a2-ef76e3666084\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-mfkn7" podUID="016e9bee-9c8a-4c0a-b6a2-ef76e3666084" Sep 9 03:30:15.105971 
containerd[1609]: time="2025-09-09T03:30:15.105928931Z" level=error msg="StopPodSandbox for \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\" failed" error="failed to destroy network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:15.106885 kubelet[2856]: E0909 03:30:15.106830 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:15.106987 kubelet[2856]: E0909 03:30:15.106891 2856 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be"} Sep 9 03:30:15.106987 kubelet[2856]: E0909 03:30:15.106930 2856 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ee2b345f-7033-4c91-91dd-2cd3dbef2a5b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 9 03:30:15.106987 kubelet[2856]: E0909 03:30:15.106959 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ee2b345f-7033-4c91-91dd-2cd3dbef2a5b\" with KillPodSandboxError: 
\"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dcb5dd78-sjn5k" podUID="ee2b345f-7033-4c91-91dd-2cd3dbef2a5b" Sep 9 03:30:15.131478 containerd[1609]: time="2025-09-09T03:30:15.131314975Z" level=error msg="StopPodSandbox for \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\" failed" error="failed to destroy network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:15.132779 kubelet[2856]: E0909 03:30:15.131822 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:15.132779 kubelet[2856]: E0909 03:30:15.132156 2856 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b"} Sep 9 03:30:15.132779 kubelet[2856]: E0909 03:30:15.132227 2856 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c3e83bd5-7012-441b-9934-0559a0c6192c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 9 03:30:15.132779 kubelet[2856]: E0909 03:30:15.132303 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c3e83bd5-7012-441b-9934-0559a0c6192c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-vgmql" podUID="c3e83bd5-7012-441b-9934-0559a0c6192c" Sep 9 03:30:15.140501 containerd[1609]: time="2025-09-09T03:30:15.140418115Z" level=error msg="StopPodSandbox for \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\" failed" error="failed to destroy network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:15.140972 kubelet[2856]: E0909 03:30:15.140921 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:15.141076 kubelet[2856]: E0909 03:30:15.140989 2856 
kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5"} Sep 9 03:30:15.141076 kubelet[2856]: E0909 03:30:15.141034 2856 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d110bb2c-45e8-4b87-97ca-6d0d4a087fb1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 9 03:30:15.141265 kubelet[2856]: E0909 03:30:15.141069 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d110bb2c-45e8-4b87-97ca-6d0d4a087fb1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-grrnb" podUID="d110bb2c-45e8-4b87-97ca-6d0d4a087fb1" Sep 9 03:30:15.141646 containerd[1609]: time="2025-09-09T03:30:15.141604405Z" level=error msg="StopPodSandbox for \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\" failed" error="failed to destroy network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:15.142198 kubelet[2856]: E0909 03:30:15.142158 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = 
Unknown desc = failed to destroy network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:15.142303 kubelet[2856]: E0909 03:30:15.142204 2856 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2"} Sep 9 03:30:15.142399 kubelet[2856]: E0909 03:30:15.142261 2856 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 9 03:30:15.142399 kubelet[2856]: E0909 03:30:15.142371 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hzbnh" podUID="4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664" Sep 9 03:30:15.147170 containerd[1609]: time="2025-09-09T03:30:15.146984493Z" level=error msg="StopPodSandbox for \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\" 
failed" error="failed to destroy network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:15.147457 kubelet[2856]: E0909 03:30:15.147405 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:15.147518 kubelet[2856]: E0909 03:30:15.147465 2856 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6"} Sep 9 03:30:15.147518 kubelet[2856]: E0909 03:30:15.147505 2856 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"60a1b3a8-da11-4959-aadb-274c0e91ef5a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 9 03:30:15.147689 kubelet[2856]: E0909 03:30:15.147553 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"60a1b3a8-da11-4959-aadb-274c0e91ef5a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-567cf5f5d8-smmz5" podUID="60a1b3a8-da11-4959-aadb-274c0e91ef5a" Sep 9 03:30:15.148195 containerd[1609]: time="2025-09-09T03:30:15.148090520Z" level=error msg="StopPodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\" failed" error="failed to destroy network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:15.148356 kubelet[2856]: E0909 03:30:15.148313 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:30:15.148437 kubelet[2856]: E0909 03:30:15.148365 2856 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656"} Sep 9 03:30:15.148437 kubelet[2856]: E0909 03:30:15.148403 2856 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3fbb3505-bd00-459e-814c-f3717bb189c3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 9 03:30:15.148589 kubelet[2856]: E0909 03:30:15.148431 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3fbb3505-bd00-459e-814c-f3717bb189c3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-d89f85994-nw628" podUID="3fbb3505-bd00-459e-814c-f3717bb189c3" Sep 9 03:30:24.341547 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount126844163.mount: Deactivated successfully. Sep 9 03:30:24.451259 containerd[1609]: time="2025-09-09T03:30:24.429465690Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 03:30:24.453352 containerd[1609]: time="2025-09-09T03:30:24.452276283Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 10.549244335s" Sep 9 03:30:24.453352 containerd[1609]: time="2025-09-09T03:30:24.452361162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 03:30:24.453352 containerd[1609]: time="2025-09-09T03:30:24.452732910Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:24.486943 containerd[1609]: 
time="2025-09-09T03:30:24.486180848Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:24.488694 containerd[1609]: time="2025-09-09T03:30:24.487166984Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:24.562197 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:24.550270 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:24.550463 systemd-resolved[1508]: Flushed all caches. Sep 9 03:30:24.610238 containerd[1609]: time="2025-09-09T03:30:24.610049208Z" level=info msg="CreateContainer within sandbox \"eaf6d9e136ef4c576f8b50e2eb54acd783db4152ab3f1daed1c7473a1945713a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 03:30:24.681074 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1566826021.mount: Deactivated successfully. 
Sep 9 03:30:24.703210 containerd[1609]: time="2025-09-09T03:30:24.703144497Z" level=info msg="CreateContainer within sandbox \"eaf6d9e136ef4c576f8b50e2eb54acd783db4152ab3f1daed1c7473a1945713a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"13296b995012e403c9deaea2923b28742a4dca3ce7d0f89d2400ef5396f4dfed\"" Sep 9 03:30:24.705575 containerd[1609]: time="2025-09-09T03:30:24.704995597Z" level=info msg="StartContainer for \"13296b995012e403c9deaea2923b28742a4dca3ce7d0f89d2400ef5396f4dfed\"" Sep 9 03:30:24.899560 kubelet[2856]: I0909 03:30:24.899389 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 03:30:25.197563 containerd[1609]: time="2025-09-09T03:30:25.197076619Z" level=info msg="StartContainer for \"13296b995012e403c9deaea2923b28742a4dca3ce7d0f89d2400ef5396f4dfed\" returns successfully" Sep 9 03:30:25.496215 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 03:30:25.496790 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 03:30:25.775444 containerd[1609]: time="2025-09-09T03:30:25.775030386Z" level=info msg="StopPodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\"" Sep 9 03:30:25.923553 containerd[1609]: time="2025-09-09T03:30:25.923243412Z" level=error msg="StopPodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\" failed" error="failed to destroy network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 03:30:25.923840 kubelet[2856]: E0909 03:30:25.923678 2856 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:30:25.925251 kubelet[2856]: E0909 03:30:25.923894 2856 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656"} Sep 9 03:30:25.945547 kubelet[2856]: E0909 03:30:25.945477 2856 kubelet.go:2027] "Unhandled Error" err="failed to \"KillPodSandbox\" for \"3fbb3505-bd00-459e-814c-f3717bb189c3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" logger="UnhandledError" Sep 9 
03:30:25.947793 kubelet[2856]: E0909 03:30:25.946709 2856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3fbb3505-bd00-459e-814c-f3717bb189c3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-d89f85994-nw628" podUID="3fbb3505-bd00-459e-814c-f3717bb189c3" Sep 9 03:30:26.189789 kubelet[2856]: I0909 03:30:26.168878 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-pcp84" podStartSLOduration=3.5720216110000003 podStartE2EDuration="29.154951305s" podCreationTimestamp="2025-09-09 03:29:57 +0000 UTC" firstStartedPulling="2025-09-09 03:29:58.905931665 +0000 UTC m=+23.640571726" lastFinishedPulling="2025-09-09 03:30:24.488861375 +0000 UTC m=+49.223501420" observedRunningTime="2025-09-09 03:30:26.154062062 +0000 UTC m=+50.888702119" watchObservedRunningTime="2025-09-09 03:30:26.154951305 +0000 UTC m=+50.889591368" Sep 9 03:30:26.600417 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:26.601223 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:26.600428 systemd-resolved[1508]: Flushed all caches. 
Sep 9 03:30:26.663669 containerd[1609]: time="2025-09-09T03:30:26.663608633Z" level=info msg="StopPodSandbox for \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\"" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.777 [INFO][4130] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.778 [INFO][4130] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" iface="eth0" netns="/var/run/netns/cni-700942bc-76f5-40cd-65e2-ad64a85c38fc" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.779 [INFO][4130] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" iface="eth0" netns="/var/run/netns/cni-700942bc-76f5-40cd-65e2-ad64a85c38fc" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.780 [INFO][4130] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" iface="eth0" netns="/var/run/netns/cni-700942bc-76f5-40cd-65e2-ad64a85c38fc" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.780 [INFO][4130] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.780 [INFO][4130] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.991 [INFO][4137] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.996 [INFO][4137] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:26.997 [INFO][4137] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:27.016 [WARNING][4137] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:27.016 [INFO][4137] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:27.023 [INFO][4137] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:27.028169 containerd[1609]: 2025-09-09 03:30:27.025 [INFO][4130] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:27.032800 containerd[1609]: time="2025-09-09T03:30:27.030904944Z" level=info msg="TearDown network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\" successfully" Sep 9 03:30:27.032800 containerd[1609]: time="2025-09-09T03:30:27.030955894Z" level=info msg="StopPodSandbox for \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\" returns successfully" Sep 9 03:30:27.032800 containerd[1609]: time="2025-09-09T03:30:27.032430709Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-vgmql,Uid:c3e83bd5-7012-441b-9934-0559a0c6192c,Namespace:calico-system,Attempt:1,}" Sep 9 03:30:27.033633 systemd[1]: run-netns-cni\x2d700942bc\x2d76f5\x2d40cd\x2d65e2\x2dad64a85c38fc.mount: Deactivated successfully. 
Sep 9 03:30:27.507170 systemd-networkd[1258]: cali7a9354c34b4: Link UP Sep 9 03:30:27.512149 systemd-networkd[1258]: cali7a9354c34b4: Gained carrier Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.116 [INFO][4143] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.147 [INFO][4143] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0 goldmane-7988f88666- calico-system c3e83bd5-7012-441b-9934-0559a0c6192c 895 0 2025-09-09 03:29:57 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-sy1m0.gb1.brightbox.com goldmane-7988f88666-vgmql eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7a9354c34b4 [] [] }} ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Namespace="calico-system" Pod="goldmane-7988f88666-vgmql" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.148 [INFO][4143] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Namespace="calico-system" Pod="goldmane-7988f88666-vgmql" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.212 [INFO][4166] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" HandleID="k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 
03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.212 [INFO][4166] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" HandleID="k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56a0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-sy1m0.gb1.brightbox.com", "pod":"goldmane-7988f88666-vgmql", "timestamp":"2025-09-09 03:30:27.212023181 +0000 UTC"}, Hostname:"srv-sy1m0.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.212 [INFO][4166] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.212 [INFO][4166] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.212 [INFO][4166] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-sy1m0.gb1.brightbox.com' Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.281 [INFO][4166] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.384 [INFO][4166] ipam/ipam.go 394: Looking up existing affinities for host host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.398 [INFO][4166] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.404 [INFO][4166] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.408 [INFO][4166] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.408 [INFO][4166] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.413 [INFO][4166] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65 Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.424 [INFO][4166] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.438 [INFO][4166] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.1/26] block=192.168.61.0/26 handle="k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.438 [INFO][4166] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.1/26] handle="k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.439 [INFO][4166] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:27.591735 containerd[1609]: 2025-09-09 03:30:27.439 [INFO][4166] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.1/26] IPv6=[] ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" HandleID="k8s-pod-network.3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.602666 containerd[1609]: 2025-09-09 03:30:27.451 [INFO][4143] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Namespace="calico-system" Pod="goldmane-7988f88666-vgmql" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"c3e83bd5-7012-441b-9934-0559a0c6192c", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-7988f88666-vgmql", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a9354c34b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:27.602666 containerd[1609]: 2025-09-09 03:30:27.451 [INFO][4143] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.1/32] ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Namespace="calico-system" Pod="goldmane-7988f88666-vgmql" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.602666 containerd[1609]: 2025-09-09 03:30:27.451 [INFO][4143] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7a9354c34b4 ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Namespace="calico-system" Pod="goldmane-7988f88666-vgmql" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.602666 containerd[1609]: 2025-09-09 03:30:27.521 [INFO][4143] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Namespace="calico-system" Pod="goldmane-7988f88666-vgmql" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.602666 containerd[1609]: 2025-09-09 03:30:27.546 [INFO][4143] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Namespace="calico-system" Pod="goldmane-7988f88666-vgmql" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"c3e83bd5-7012-441b-9934-0559a0c6192c", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65", Pod:"goldmane-7988f88666-vgmql", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a9354c34b4", MAC:"c2:e1:57:77:af:b1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:27.602666 containerd[1609]: 2025-09-09 03:30:27.582 [INFO][4143] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65" Namespace="calico-system" Pod="goldmane-7988f88666-vgmql" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:27.694052 containerd[1609]: time="2025-09-09T03:30:27.694003783Z" level=info msg="StopPodSandbox for \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\"" Sep 9 03:30:27.715879 containerd[1609]: time="2025-09-09T03:30:27.714452855Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:30:27.715879 containerd[1609]: time="2025-09-09T03:30:27.714583498Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:30:27.715879 containerd[1609]: time="2025-09-09T03:30:27.714614883Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:27.715879 containerd[1609]: time="2025-09-09T03:30:27.714810774Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:27.978 [INFO][4280] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:27.979 [INFO][4280] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" iface="eth0" netns="/var/run/netns/cni-b7bbd467-a6a9-7550-a8b0-82dd65def594" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:27.980 [INFO][4280] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" iface="eth0" netns="/var/run/netns/cni-b7bbd467-a6a9-7550-a8b0-82dd65def594" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:27.980 [INFO][4280] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" iface="eth0" netns="/var/run/netns/cni-b7bbd467-a6a9-7550-a8b0-82dd65def594" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:27.980 [INFO][4280] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:27.980 [INFO][4280] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:28.047 [INFO][4326] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:28.050 [INFO][4326] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:28.050 [INFO][4326] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:28.070 [WARNING][4326] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:28.070 [INFO][4326] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:28.074 [INFO][4326] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:28.085780 containerd[1609]: 2025-09-09 03:30:28.080 [INFO][4280] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:28.091230 containerd[1609]: time="2025-09-09T03:30:28.090299501Z" level=info msg="TearDown network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\" successfully" Sep 9 03:30:28.091230 containerd[1609]: time="2025-09-09T03:30:28.090366871Z" level=info msg="StopPodSandbox for \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\" returns successfully" Sep 9 03:30:28.098867 containerd[1609]: time="2025-09-09T03:30:28.096186619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-567cf5f5d8-smmz5,Uid:60a1b3a8-da11-4959-aadb-274c0e91ef5a,Namespace:calico-system,Attempt:1,}" Sep 9 03:30:28.104184 systemd[1]: run-netns-cni\x2db7bbd467\x2da6a9\x2d7550\x2da8b0\x2d82dd65def594.mount: Deactivated successfully. 
Sep 9 03:30:28.297490 containerd[1609]: time="2025-09-09T03:30:28.297438978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-vgmql,Uid:c3e83bd5-7012-441b-9934-0559a0c6192c,Namespace:calico-system,Attempt:1,} returns sandbox id \"3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65\"" Sep 9 03:30:28.329478 containerd[1609]: time="2025-09-09T03:30:28.328914952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 03:30:28.668554 containerd[1609]: time="2025-09-09T03:30:28.667332850Z" level=info msg="StopPodSandbox for \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\"" Sep 9 03:30:28.669871 containerd[1609]: time="2025-09-09T03:30:28.669608111Z" level=info msg="StopPodSandbox for \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\"" Sep 9 03:30:28.675614 containerd[1609]: time="2025-09-09T03:30:28.671491346Z" level=info msg="StopPodSandbox for \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\"" Sep 9 03:30:28.898832 kernel: bpftool[4458]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 9 03:30:28.913953 systemd-networkd[1258]: cali10723933637: Link UP Sep 9 03:30:28.954529 systemd-networkd[1258]: cali10723933637: Gained carrier Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.471 [INFO][4334] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.523 [INFO][4334] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0 calico-kube-controllers-567cf5f5d8- calico-system 60a1b3a8-da11-4959-aadb-274c0e91ef5a 904 0 2025-09-09 03:29:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:567cf5f5d8 projectcalico.org/namespace:calico-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-sy1m0.gb1.brightbox.com calico-kube-controllers-567cf5f5d8-smmz5 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali10723933637 [] [] }} ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Namespace="calico-system" Pod="calico-kube-controllers-567cf5f5d8-smmz5" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.524 [INFO][4334] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Namespace="calico-system" Pod="calico-kube-controllers-567cf5f5d8-smmz5" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.651 [INFO][4382] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" HandleID="k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.654 [INFO][4382] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" HandleID="k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000382600), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-sy1m0.gb1.brightbox.com", "pod":"calico-kube-controllers-567cf5f5d8-smmz5", "timestamp":"2025-09-09 
03:30:28.65135162 +0000 UTC"}, Hostname:"srv-sy1m0.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.654 [INFO][4382] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.655 [INFO][4382] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.655 [INFO][4382] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-sy1m0.gb1.brightbox.com' Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.712 [INFO][4382] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.739 [INFO][4382] ipam/ipam.go 394: Looking up existing affinities for host host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.769 [INFO][4382] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.777 [INFO][4382] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.790 [INFO][4382] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.790 [INFO][4382] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 
containerd[1609]: 2025-09-09 03:30:28.798 [INFO][4382] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.820 [INFO][4382] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.851 [INFO][4382] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.2/26] block=192.168.61.0/26 handle="k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.854 [INFO][4382] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.2/26] handle="k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.854 [INFO][4382] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 03:30:29.037636 containerd[1609]: 2025-09-09 03:30:28.854 [INFO][4382] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.2/26] IPv6=[] ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" HandleID="k8s-pod-network.16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:29.042674 containerd[1609]: 2025-09-09 03:30:28.877 [INFO][4334] cni-plugin/k8s.go 418: Populated endpoint ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Namespace="calico-system" Pod="calico-kube-controllers-567cf5f5d8-smmz5" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0", GenerateName:"calico-kube-controllers-567cf5f5d8-", Namespace:"calico-system", SelfLink:"", UID:"60a1b3a8-da11-4959-aadb-274c0e91ef5a", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"567cf5f5d8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-567cf5f5d8-smmz5", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali10723933637", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:29.042674 containerd[1609]: 2025-09-09 03:30:28.877 [INFO][4334] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.2/32] ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Namespace="calico-system" Pod="calico-kube-controllers-567cf5f5d8-smmz5" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:29.042674 containerd[1609]: 2025-09-09 03:30:28.877 [INFO][4334] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali10723933637 ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Namespace="calico-system" Pod="calico-kube-controllers-567cf5f5d8-smmz5" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:29.042674 containerd[1609]: 2025-09-09 03:30:28.952 [INFO][4334] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Namespace="calico-system" Pod="calico-kube-controllers-567cf5f5d8-smmz5" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:29.042674 containerd[1609]: 2025-09-09 03:30:28.973 [INFO][4334] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Namespace="calico-system" Pod="calico-kube-controllers-567cf5f5d8-smmz5" 
WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0", GenerateName:"calico-kube-controllers-567cf5f5d8-", Namespace:"calico-system", SelfLink:"", UID:"60a1b3a8-da11-4959-aadb-274c0e91ef5a", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"567cf5f5d8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b", Pod:"calico-kube-controllers-567cf5f5d8-smmz5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali10723933637", MAC:"12:36:40:88:eb:f3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:29.042674 containerd[1609]: 2025-09-09 03:30:29.006 [INFO][4334] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b" Namespace="calico-system" 
Pod="calico-kube-controllers-567cf5f5d8-smmz5" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:29.161385 systemd-networkd[1258]: cali7a9354c34b4: Gained IPv6LL Sep 9 03:30:29.258796 containerd[1609]: time="2025-09-09T03:30:29.253702569Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:30:29.258796 containerd[1609]: time="2025-09-09T03:30:29.253854604Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:30:29.258796 containerd[1609]: time="2025-09-09T03:30:29.253876205Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:29.258796 containerd[1609]: time="2025-09-09T03:30:29.254051936Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.142 [INFO][4435] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.142 [INFO][4435] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" iface="eth0" netns="/var/run/netns/cni-ddad900a-d0b3-5f0d-92c9-16624035de70" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.145 [INFO][4435] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" iface="eth0" netns="/var/run/netns/cni-ddad900a-d0b3-5f0d-92c9-16624035de70" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.146 [INFO][4435] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. 
Nothing to do. ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" iface="eth0" netns="/var/run/netns/cni-ddad900a-d0b3-5f0d-92c9-16624035de70" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.146 [INFO][4435] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.146 [INFO][4435] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.304 [INFO][4487] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.309 [INFO][4487] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.309 [INFO][4487] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.335 [WARNING][4487] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.335 [INFO][4487] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.345 [INFO][4487] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:29.367128 containerd[1609]: 2025-09-09 03:30:29.361 [INFO][4435] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:29.370471 containerd[1609]: time="2025-09-09T03:30:29.368296286Z" level=info msg="TearDown network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\" successfully" Sep 9 03:30:29.370471 containerd[1609]: time="2025-09-09T03:30:29.368433663Z" level=info msg="StopPodSandbox for \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\" returns successfully" Sep 9 03:30:29.373143 containerd[1609]: time="2025-09-09T03:30:29.370570306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-grrnb,Uid:d110bb2c-45e8-4b87-97ca-6d0d4a087fb1,Namespace:kube-system,Attempt:1,}" Sep 9 03:30:29.377909 systemd[1]: run-netns-cni\x2dddad900a\x2dd0b3\x2d5f0d\x2d92c9\x2d16624035de70.mount: Deactivated successfully. 
Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.092 [INFO][4434] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.093 [INFO][4434] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" iface="eth0" netns="/var/run/netns/cni-83f340d0-fd62-d19f-a3fb-8ded99c9591e" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.094 [INFO][4434] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" iface="eth0" netns="/var/run/netns/cni-83f340d0-fd62-d19f-a3fb-8ded99c9591e" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.095 [INFO][4434] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" iface="eth0" netns="/var/run/netns/cni-83f340d0-fd62-d19f-a3fb-8ded99c9591e" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.095 [INFO][4434] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.095 [INFO][4434] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.328 [INFO][4474] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.332 [INFO][4474] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.345 [INFO][4474] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.373 [WARNING][4474] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.373 [INFO][4474] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.387 [INFO][4474] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:29.432818 containerd[1609]: 2025-09-09 03:30:29.408 [INFO][4434] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:29.432818 containerd[1609]: time="2025-09-09T03:30:29.428284109Z" level=info msg="TearDown network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\" successfully" Sep 9 03:30:29.432818 containerd[1609]: time="2025-09-09T03:30:29.428560645Z" level=info msg="StopPodSandbox for \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\" returns successfully" Sep 9 03:30:29.441798 containerd[1609]: time="2025-09-09T03:30:29.437771032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hzbnh,Uid:4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664,Namespace:calico-system,Attempt:1,}" Sep 9 03:30:29.440170 systemd[1]: run-netns-cni\x2d83f340d0\x2dfd62\x2dd19f\x2da3fb\x2d8ded99c9591e.mount: Deactivated successfully. Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.158 [INFO][4436] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.158 [INFO][4436] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" iface="eth0" netns="/var/run/netns/cni-0518de4e-46dd-2e4a-6c17-f6c181278182" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.162 [INFO][4436] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" iface="eth0" netns="/var/run/netns/cni-0518de4e-46dd-2e4a-6c17-f6c181278182" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.174 [INFO][4436] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" iface="eth0" netns="/var/run/netns/cni-0518de4e-46dd-2e4a-6c17-f6c181278182" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.177 [INFO][4436] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.178 [INFO][4436] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.333 [INFO][4495] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.333 [INFO][4495] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.389 [INFO][4495] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.409 [WARNING][4495] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.409 [INFO][4495] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.437 [INFO][4495] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:29.471172 containerd[1609]: 2025-09-09 03:30:29.451 [INFO][4436] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:29.476129 containerd[1609]: time="2025-09-09T03:30:29.475883908Z" level=info msg="TearDown network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\" successfully" Sep 9 03:30:29.477683 containerd[1609]: time="2025-09-09T03:30:29.476182038Z" level=info msg="StopPodSandbox for \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\" returns successfully" Sep 9 03:30:29.484729 containerd[1609]: time="2025-09-09T03:30:29.484643258Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dcb5dd78-sjn5k,Uid:ee2b345f-7033-4c91-91dd-2cd3dbef2a5b,Namespace:calico-apiserver,Attempt:1,}" Sep 9 03:30:29.633249 containerd[1609]: time="2025-09-09T03:30:29.630828766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-567cf5f5d8-smmz5,Uid:60a1b3a8-da11-4959-aadb-274c0e91ef5a,Namespace:calico-system,Attempt:1,} returns sandbox id 
\"16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b\"" Sep 9 03:30:29.671868 containerd[1609]: time="2025-09-09T03:30:29.671535034Z" level=info msg="StopPodSandbox for \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\"" Sep 9 03:30:29.748689 systemd-networkd[1258]: vxlan.calico: Link UP Sep 9 03:30:29.748712 systemd-networkd[1258]: vxlan.calico: Gained carrier Sep 9 03:30:30.117996 systemd-networkd[1258]: cali10723933637: Gained IPv6LL Sep 9 03:30:30.292431 systemd[1]: run-netns-cni\x2d0518de4e\x2d46dd\x2d2e4a\x2d6c17\x2df6c181278182.mount: Deactivated successfully. Sep 9 03:30:30.374539 systemd-networkd[1258]: calie723ed321d6: Link UP Sep 9 03:30:30.378690 systemd-networkd[1258]: calie723ed321d6: Gained carrier Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:29.836 [INFO][4540] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0 csi-node-driver- calico-system 4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664 915 0 2025-09-09 03:29:58 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-sy1m0.gb1.brightbox.com csi-node-driver-hzbnh eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie723ed321d6 [] [] }} ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Namespace="calico-system" Pod="csi-node-driver-hzbnh" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:29.837 [INFO][4540] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" 
Namespace="calico-system" Pod="csi-node-driver-hzbnh" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.186 [INFO][4623] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" HandleID="k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.188 [INFO][4623] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" HandleID="k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000fec20), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-sy1m0.gb1.brightbox.com", "pod":"csi-node-driver-hzbnh", "timestamp":"2025-09-09 03:30:30.186820498 +0000 UTC"}, Hostname:"srv-sy1m0.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.188 [INFO][4623] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.188 [INFO][4623] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.189 [INFO][4623] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-sy1m0.gb1.brightbox.com' Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.227 [INFO][4623] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.244 [INFO][4623] ipam/ipam.go 394: Looking up existing affinities for host host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.266 [INFO][4623] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.288 [INFO][4623] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.293 [INFO][4623] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.298 [INFO][4623] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.302 [INFO][4623] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26 Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.321 [INFO][4623] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.331 [INFO][4623] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.3/26] block=192.168.61.0/26 handle="k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.332 [INFO][4623] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.3/26] handle="k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.332 [INFO][4623] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:30.437183 containerd[1609]: 2025-09-09 03:30:30.332 [INFO][4623] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.3/26] IPv6=[] ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" HandleID="k8s-pod-network.da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:30.441207 containerd[1609]: 2025-09-09 03:30:30.340 [INFO][4540] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Namespace="calico-system" Pod="csi-node-driver-hzbnh" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-hzbnh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie723ed321d6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:30.441207 containerd[1609]: 2025-09-09 03:30:30.340 [INFO][4540] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.3/32] ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Namespace="calico-system" Pod="csi-node-driver-hzbnh" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:30.441207 containerd[1609]: 2025-09-09 03:30:30.340 [INFO][4540] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie723ed321d6 ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Namespace="calico-system" Pod="csi-node-driver-hzbnh" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:30.441207 containerd[1609]: 2025-09-09 03:30:30.372 [INFO][4540] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Namespace="calico-system" Pod="csi-node-driver-hzbnh" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:30.441207 
containerd[1609]: 2025-09-09 03:30:30.373 [INFO][4540] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Namespace="calico-system" Pod="csi-node-driver-hzbnh" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26", Pod:"csi-node-driver-hzbnh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie723ed321d6", MAC:"0a:5b:af:9d:fe:b8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:30.441207 containerd[1609]: 
2025-09-09 03:30:30.411 [INFO][4540] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26" Namespace="calico-system" Pod="csi-node-driver-hzbnh" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:30.612362 systemd-networkd[1258]: cali93a6f820512: Link UP Sep 9 03:30:30.615386 systemd-networkd[1258]: cali93a6f820512: Gained carrier Sep 9 03:30:30.668954 containerd[1609]: time="2025-09-09T03:30:30.634344704Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:30:30.668954 containerd[1609]: time="2025-09-09T03:30:30.634469921Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:30:30.668954 containerd[1609]: time="2025-09-09T03:30:30.634501715Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:30.668954 containerd[1609]: time="2025-09-09T03:30:30.637930750Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:30.688691 containerd[1609]: time="2025-09-09T03:30:30.684066339Z" level=info msg="StopPodSandbox for \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\"" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:29.684 [INFO][4531] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0 coredns-7c65d6cfc9- kube-system d110bb2c-45e8-4b87-97ca-6d0d4a087fb1 916 0 2025-09-09 03:29:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-sy1m0.gb1.brightbox.com coredns-7c65d6cfc9-grrnb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali93a6f820512 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-grrnb" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:29.685 [INFO][4531] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-grrnb" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.192 [INFO][4588] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" HandleID="k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 
03:30:30.192 [INFO][4588] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" HandleID="k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123650), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-sy1m0.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-grrnb", "timestamp":"2025-09-09 03:30:30.192643289 +0000 UTC"}, Hostname:"srv-sy1m0.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.192 [INFO][4588] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.335 [INFO][4588] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.336 [INFO][4588] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-sy1m0.gb1.brightbox.com' Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.384 [INFO][4588] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.401 [INFO][4588] ipam/ipam.go 394: Looking up existing affinities for host host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.454 [INFO][4588] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.459 [INFO][4588] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.464 [INFO][4588] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.464 [INFO][4588] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.471 [INFO][4588] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.485 [INFO][4588] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.500 [INFO][4588] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.4/26] block=192.168.61.0/26 handle="k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.501 [INFO][4588] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.4/26] handle="k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.502 [INFO][4588] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:30.747841 containerd[1609]: 2025-09-09 03:30:30.503 [INFO][4588] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.4/26] IPv6=[] ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" HandleID="k8s-pod-network.9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:30.748976 containerd[1609]: 2025-09-09 03:30:30.517 [INFO][4531] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-grrnb" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d110bb2c-45e8-4b87-97ca-6d0d4a087fb1", ResourceVersion:"916", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-grrnb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93a6f820512", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:30.748976 containerd[1609]: 2025-09-09 03:30:30.518 [INFO][4531] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.4/32] ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-grrnb" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:30.748976 containerd[1609]: 2025-09-09 03:30:30.519 [INFO][4531] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali93a6f820512 ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-grrnb" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:30.748976 containerd[1609]: 2025-09-09 03:30:30.617 [INFO][4531] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-grrnb" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:30.748976 containerd[1609]: 2025-09-09 03:30:30.627 [INFO][4531] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-grrnb" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d110bb2c-45e8-4b87-97ca-6d0d4a087fb1", ResourceVersion:"916", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca", Pod:"coredns-7c65d6cfc9-grrnb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93a6f820512", 
MAC:"66:7a:2c:f6:4a:13", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:30.748976 containerd[1609]: 2025-09-09 03:30:30.707 [INFO][4531] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca" Namespace="kube-system" Pod="coredns-7c65d6cfc9-grrnb" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:30.878816 systemd-networkd[1258]: cali4588905fc42: Link UP Sep 9 03:30:30.889271 systemd-networkd[1258]: cali4588905fc42: Gained carrier Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.204 [INFO][4604] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.204 [INFO][4604] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" iface="eth0" netns="/var/run/netns/cni-20d7eb5a-08c5-e4d2-7cd3-7167c8dfeb1c" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.212 [INFO][4604] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" iface="eth0" netns="/var/run/netns/cni-20d7eb5a-08c5-e4d2-7cd3-7167c8dfeb1c" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.216 [INFO][4604] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" iface="eth0" netns="/var/run/netns/cni-20d7eb5a-08c5-e4d2-7cd3-7167c8dfeb1c" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.217 [INFO][4604] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.217 [INFO][4604] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.476 [INFO][4651] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.480 [INFO][4651] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.815 [INFO][4651] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.871 [WARNING][4651] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.871 [INFO][4651] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.876 [INFO][4651] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:30.950823 containerd[1609]: 2025-09-09 03:30:30.911 [INFO][4604] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:30.957782 containerd[1609]: time="2025-09-09T03:30:30.956252452Z" level=info msg="TearDown network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\" successfully" Sep 9 03:30:30.957782 containerd[1609]: time="2025-09-09T03:30:30.956304622Z" level=info msg="StopPodSandbox for \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\" returns successfully" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.033 [INFO][4553] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0 calico-apiserver-8dcb5dd78- calico-apiserver ee2b345f-7033-4c91-91dd-2cd3dbef2a5b 917 0 2025-09-09 03:29:52 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8dcb5dd78 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-sy1m0.gb1.brightbox.com calico-apiserver-8dcb5dd78-sjn5k eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4588905fc42 [] [] }} ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-sjn5k" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.038 [INFO][4553] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-sjn5k" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.325 [INFO][4639] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" HandleID="k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.326 [INFO][4639] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" HandleID="k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000125980), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-sy1m0.gb1.brightbox.com", "pod":"calico-apiserver-8dcb5dd78-sjn5k", "timestamp":"2025-09-09 03:30:30.325814344 +0000 UTC"}, Hostname:"srv-sy1m0.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.326 [INFO][4639] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.503 [INFO][4639] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.506 [INFO][4639] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-sy1m0.gb1.brightbox.com' Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.610 [INFO][4639] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.640 [INFO][4639] ipam/ipam.go 394: Looking up existing affinities for host host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.693 [INFO][4639] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.733 [INFO][4639] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.749 [INFO][4639] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.750 [INFO][4639] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.763 [INFO][4639] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45 Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.787 [INFO][4639] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.812 [INFO][4639] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.5/26] block=192.168.61.0/26 handle="k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.812 [INFO][4639] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.5/26] handle="k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.813 [INFO][4639] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 03:30:30.957782 containerd[1609]: 2025-09-09 03:30:30.813 [INFO][4639] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.5/26] IPv6=[] ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" HandleID="k8s-pod-network.089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:30.961709 containerd[1609]: 2025-09-09 03:30:30.846 [INFO][4553] cni-plugin/k8s.go 418: Populated endpoint ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-sjn5k" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0", GenerateName:"calico-apiserver-8dcb5dd78-", Namespace:"calico-apiserver", SelfLink:"", UID:"ee2b345f-7033-4c91-91dd-2cd3dbef2a5b", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dcb5dd78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-8dcb5dd78-sjn5k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.61.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4588905fc42", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:30.961709 containerd[1609]: 2025-09-09 03:30:30.856 [INFO][4553] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.5/32] ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-sjn5k" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:30.961709 containerd[1609]: 2025-09-09 03:30:30.861 [INFO][4553] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4588905fc42 ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-sjn5k" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:30.961709 containerd[1609]: 2025-09-09 03:30:30.881 [INFO][4553] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-sjn5k" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:30.961709 containerd[1609]: 2025-09-09 03:30:30.895 [INFO][4553] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-sjn5k" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0", GenerateName:"calico-apiserver-8dcb5dd78-", Namespace:"calico-apiserver", SelfLink:"", UID:"ee2b345f-7033-4c91-91dd-2cd3dbef2a5b", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dcb5dd78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45", Pod:"calico-apiserver-8dcb5dd78-sjn5k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4588905fc42", MAC:"ba:e7:f0:83:ec:f8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:30.961709 containerd[1609]: 2025-09-09 03:30:30.937 [INFO][4553] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-sjn5k" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:30.959608 systemd[1]: 
run-netns-cni\x2d20d7eb5a\x2d08c5\x2de4d2\x2d7cd3\x2d7167c8dfeb1c.mount: Deactivated successfully. Sep 9 03:30:30.977528 containerd[1609]: time="2025-09-09T03:30:30.976596554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mfkn7,Uid:016e9bee-9c8a-4c0a-b6a2-ef76e3666084,Namespace:kube-system,Attempt:1,}" Sep 9 03:30:31.038177 containerd[1609]: time="2025-09-09T03:30:31.013353648Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:30:31.038177 containerd[1609]: time="2025-09-09T03:30:31.013437426Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:30:31.038177 containerd[1609]: time="2025-09-09T03:30:31.013485848Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:31.038177 containerd[1609]: time="2025-09-09T03:30:31.015126313Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:31.097392 containerd[1609]: time="2025-09-09T03:30:31.097321971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hzbnh,Uid:4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664,Namespace:calico-system,Attempt:1,} returns sandbox id \"da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26\"" Sep 9 03:30:31.140011 containerd[1609]: time="2025-09-09T03:30:31.133880770Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:30:31.140011 containerd[1609]: time="2025-09-09T03:30:31.135228683Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:30:31.140011 containerd[1609]: time="2025-09-09T03:30:31.135273772Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:31.142715 containerd[1609]: time="2025-09-09T03:30:31.142579550Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:31.143171 systemd-networkd[1258]: vxlan.calico: Gained IPv6LL Sep 9 03:30:31.322862 containerd[1609]: time="2025-09-09T03:30:31.322654625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-grrnb,Uid:d110bb2c-45e8-4b87-97ca-6d0d4a087fb1,Namespace:kube-system,Attempt:1,} returns sandbox id \"9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca\"" Sep 9 03:30:31.344307 containerd[1609]: time="2025-09-09T03:30:31.344156556Z" level=info msg="CreateContainer within sandbox \"9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 03:30:31.399277 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2274454068.mount: Deactivated successfully. Sep 9 03:30:31.445620 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4020441182.mount: Deactivated successfully. 
Sep 9 03:30:31.477510 containerd[1609]: time="2025-09-09T03:30:31.474724158Z" level=info msg="CreateContainer within sandbox \"9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1330fca7b3e74b27e7ad08fed36edb2d63082e1ddc0b18c55e3ec9e9708857f4\"" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.155 [INFO][4738] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.155 [INFO][4738] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" iface="eth0" netns="/var/run/netns/cni-0be28729-de2b-9701-0540-7176b1b45196" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.155 [INFO][4738] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" iface="eth0" netns="/var/run/netns/cni-0be28729-de2b-9701-0540-7176b1b45196" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.156 [INFO][4738] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" iface="eth0" netns="/var/run/netns/cni-0be28729-de2b-9701-0540-7176b1b45196" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.156 [INFO][4738] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.156 [INFO][4738] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.419 [INFO][4830] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.419 [INFO][4830] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.419 [INFO][4830] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.443 [WARNING][4830] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.443 [INFO][4830] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.447 [INFO][4830] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:31.479518 containerd[1609]: 2025-09-09 03:30:31.460 [INFO][4738] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:31.479518 containerd[1609]: time="2025-09-09T03:30:31.478953212Z" level=info msg="TearDown network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\" successfully" Sep 9 03:30:31.479518 containerd[1609]: time="2025-09-09T03:30:31.478980990Z" level=info msg="StopPodSandbox for \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\" returns successfully" Sep 9 03:30:31.482501 containerd[1609]: time="2025-09-09T03:30:31.482015638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dcb5dd78-47vcp,Uid:6e64133a-dcc9-4620-a059-3825b5dc1d54,Namespace:calico-apiserver,Attempt:1,}" Sep 9 03:30:31.488205 containerd[1609]: time="2025-09-09T03:30:31.488128818Z" level=info msg="StartContainer for \"1330fca7b3e74b27e7ad08fed36edb2d63082e1ddc0b18c55e3ec9e9708857f4\"" Sep 9 03:30:31.548606 containerd[1609]: time="2025-09-09T03:30:31.548524840Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-8dcb5dd78-sjn5k,Uid:ee2b345f-7033-4c91-91dd-2cd3dbef2a5b,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45\"" Sep 9 03:30:31.588912 systemd-networkd[1258]: calidd9e6b23bb7: Link UP Sep 9 03:30:31.589258 systemd-networkd[1258]: calidd9e6b23bb7: Gained carrier Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.214 [INFO][4792] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0 coredns-7c65d6cfc9- kube-system 016e9bee-9c8a-4c0a-b6a2-ef76e3666084 923 0 2025-09-09 03:29:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-sy1m0.gb1.brightbox.com coredns-7c65d6cfc9-mfkn7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calidd9e6b23bb7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mfkn7" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.215 [INFO][4792] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mfkn7" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.429 [INFO][4861] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" HandleID="k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" 
Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.434 [INFO][4861] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" HandleID="k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033e080), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-sy1m0.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-mfkn7", "timestamp":"2025-09-09 03:30:31.428988637 +0000 UTC"}, Hostname:"srv-sy1m0.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.434 [INFO][4861] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.448 [INFO][4861] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.448 [INFO][4861] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-sy1m0.gb1.brightbox.com' Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.471 [INFO][4861] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.495 [INFO][4861] ipam/ipam.go 394: Looking up existing affinities for host host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.514 [INFO][4861] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.520 [INFO][4861] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.527 [INFO][4861] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.528 [INFO][4861] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.540 [INFO][4861] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567 Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.551 [INFO][4861] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.570 [INFO][4861] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.6/26] block=192.168.61.0/26 handle="k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.571 [INFO][4861] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.6/26] handle="k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.571 [INFO][4861] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:31.686817 containerd[1609]: 2025-09-09 03:30:31.571 [INFO][4861] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.6/26] IPv6=[] ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" HandleID="k8s-pod-network.ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:31.688432 containerd[1609]: 2025-09-09 03:30:31.579 [INFO][4792] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mfkn7" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"016e9bee-9c8a-4c0a-b6a2-ef76e3666084", ResourceVersion:"923", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-mfkn7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidd9e6b23bb7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:31.688432 containerd[1609]: 2025-09-09 03:30:31.580 [INFO][4792] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.6/32] ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mfkn7" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:31.688432 containerd[1609]: 2025-09-09 03:30:31.580 [INFO][4792] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidd9e6b23bb7 ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mfkn7" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:31.688432 containerd[1609]: 2025-09-09 03:30:31.588 [INFO][4792] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mfkn7" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:31.688432 containerd[1609]: 2025-09-09 03:30:31.593 [INFO][4792] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mfkn7" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"016e9bee-9c8a-4c0a-b6a2-ef76e3666084", ResourceVersion:"923", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567", Pod:"coredns-7c65d6cfc9-mfkn7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidd9e6b23bb7", 
MAC:"22:00:47:39:cd:c2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:31.688432 containerd[1609]: 2025-09-09 03:30:31.652 [INFO][4792] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mfkn7" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:31.783945 systemd-networkd[1258]: calie723ed321d6: Gained IPv6LL Sep 9 03:30:31.802319 containerd[1609]: time="2025-09-09T03:30:31.802270592Z" level=info msg="StartContainer for \"1330fca7b3e74b27e7ad08fed36edb2d63082e1ddc0b18c55e3ec9e9708857f4\" returns successfully" Sep 9 03:30:31.844390 containerd[1609]: time="2025-09-09T03:30:31.840363612Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:30:31.844390 containerd[1609]: time="2025-09-09T03:30:31.840453085Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:30:31.844390 containerd[1609]: time="2025-09-09T03:30:31.840481602Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:31.844390 containerd[1609]: time="2025-09-09T03:30:31.840615206Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:32.039120 systemd-networkd[1258]: cali93a6f820512: Gained IPv6LL Sep 9 03:30:32.094034 containerd[1609]: time="2025-09-09T03:30:32.093364646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mfkn7,Uid:016e9bee-9c8a-4c0a-b6a2-ef76e3666084,Namespace:kube-system,Attempt:1,} returns sandbox id \"ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567\"" Sep 9 03:30:32.113301 containerd[1609]: time="2025-09-09T03:30:32.111559931Z" level=info msg="CreateContainer within sandbox \"ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 03:30:32.147292 systemd-networkd[1258]: cali4a398001ad0: Link UP Sep 9 03:30:32.153156 systemd-networkd[1258]: cali4a398001ad0: Gained carrier Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:31.853 [INFO][4900] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0 calico-apiserver-8dcb5dd78- calico-apiserver 6e64133a-dcc9-4620-a059-3825b5dc1d54 935 0 2025-09-09 03:29:52 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8dcb5dd78 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-sy1m0.gb1.brightbox.com calico-apiserver-8dcb5dd78-47vcp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4a398001ad0 [] [] }} ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-47vcp" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:31.855 [INFO][4900] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-47vcp" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:31.995 [INFO][4959] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" HandleID="k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:31.995 [INFO][4959] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" HandleID="k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f6d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-sy1m0.gb1.brightbox.com", "pod":"calico-apiserver-8dcb5dd78-47vcp", "timestamp":"2025-09-09 03:30:31.995367094 +0000 UTC"}, Hostname:"srv-sy1m0.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:31.995 [INFO][4959] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:31.995 [INFO][4959] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:31.996 [INFO][4959] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-sy1m0.gb1.brightbox.com' Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.015 [INFO][4959] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.036 [INFO][4959] ipam/ipam.go 394: Looking up existing affinities for host host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.048 [INFO][4959] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.053 [INFO][4959] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.058 [INFO][4959] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.058 [INFO][4959] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.069 [INFO][4959] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9 Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.089 [INFO][4959] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.116 [INFO][4959] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.7/26] block=192.168.61.0/26 handle="k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.117 [INFO][4959] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.7/26] handle="k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.117 [INFO][4959] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:32.202433 containerd[1609]: 2025-09-09 03:30:32.117 [INFO][4959] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.7/26] IPv6=[] ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" HandleID="k8s-pod-network.ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:32.206422 containerd[1609]: 2025-09-09 03:30:32.125 [INFO][4900] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-47vcp" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0", GenerateName:"calico-apiserver-8dcb5dd78-", Namespace:"calico-apiserver", SelfLink:"", UID:"6e64133a-dcc9-4620-a059-3825b5dc1d54", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"8dcb5dd78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-8dcb5dd78-47vcp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4a398001ad0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:32.206422 containerd[1609]: 2025-09-09 03:30:32.126 [INFO][4900] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.7/32] ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-47vcp" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:32.206422 containerd[1609]: 2025-09-09 03:30:32.127 [INFO][4900] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4a398001ad0 ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-47vcp" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:32.206422 containerd[1609]: 2025-09-09 03:30:32.154 [INFO][4900] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-47vcp" 
WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:32.206422 containerd[1609]: 2025-09-09 03:30:32.155 [INFO][4900] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-47vcp" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0", GenerateName:"calico-apiserver-8dcb5dd78-", Namespace:"calico-apiserver", SelfLink:"", UID:"6e64133a-dcc9-4620-a059-3825b5dc1d54", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dcb5dd78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9", Pod:"calico-apiserver-8dcb5dd78-47vcp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4a398001ad0", MAC:"7a:d0:24:07:f2:96", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:32.206422 containerd[1609]: 2025-09-09 03:30:32.183 [INFO][4900] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9" Namespace="calico-apiserver" Pod="calico-apiserver-8dcb5dd78-47vcp" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:32.206422 containerd[1609]: time="2025-09-09T03:30:32.206361127Z" level=info msg="CreateContainer within sandbox \"ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3e1e7574d5b7c9c44c4ea612df5359e43a00fd1421902abf1bcd87ff83a87468\"" Sep 9 03:30:32.210711 containerd[1609]: time="2025-09-09T03:30:32.207980963Z" level=info msg="StartContainer for \"3e1e7574d5b7c9c44c4ea612df5359e43a00fd1421902abf1bcd87ff83a87468\"" Sep 9 03:30:32.289591 systemd[1]: run-netns-cni\x2d0be28729\x2dde2b\x2d9701\x2d0540\x2d7176b1b45196.mount: Deactivated successfully. Sep 9 03:30:32.351439 containerd[1609]: time="2025-09-09T03:30:32.346383533Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:30:32.351439 containerd[1609]: time="2025-09-09T03:30:32.346467751Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:30:32.351439 containerd[1609]: time="2025-09-09T03:30:32.346503056Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:32.351439 containerd[1609]: time="2025-09-09T03:30:32.346689009Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:32.528932 kubelet[2856]: I0909 03:30:32.527853 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-grrnb" podStartSLOduration=52.527690104 podStartE2EDuration="52.527690104s" podCreationTimestamp="2025-09-09 03:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 03:30:32.499216128 +0000 UTC m=+57.233856196" watchObservedRunningTime="2025-09-09 03:30:32.527690104 +0000 UTC m=+57.262330168" Sep 9 03:30:32.613405 containerd[1609]: time="2025-09-09T03:30:32.613142067Z" level=info msg="StartContainer for \"3e1e7574d5b7c9c44c4ea612df5359e43a00fd1421902abf1bcd87ff83a87468\" returns successfully" Sep 9 03:30:32.781001 containerd[1609]: time="2025-09-09T03:30:32.780046983Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dcb5dd78-47vcp,Uid:6e64133a-dcc9-4620-a059-3825b5dc1d54,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9\"" Sep 9 03:30:32.807419 systemd-networkd[1258]: cali4588905fc42: Gained IPv6LL Sep 9 03:30:32.869944 systemd-networkd[1258]: calidd9e6b23bb7: Gained IPv6LL Sep 9 03:30:33.277073 systemd[1]: run-containerd-runc-k8s.io-ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9-runc.N9sOIB.mount: Deactivated successfully. 
Sep 9 03:30:33.478898 kubelet[2856]: I0909 03:30:33.476365 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-mfkn7" podStartSLOduration=53.476334004 podStartE2EDuration="53.476334004s" podCreationTimestamp="2025-09-09 03:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 03:30:33.473403287 +0000 UTC m=+58.208043353" watchObservedRunningTime="2025-09-09 03:30:33.476334004 +0000 UTC m=+58.210974063" Sep 9 03:30:34.022906 systemd-networkd[1258]: cali4a398001ad0: Gained IPv6LL Sep 9 03:30:34.688126 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1683543120.mount: Deactivated successfully. Sep 9 03:30:35.630343 containerd[1609]: time="2025-09-09T03:30:35.630282724Z" level=info msg="StopPodSandbox for \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\"" Sep 9 03:30:35.921697 containerd[1609]: time="2025-09-09T03:30:35.921183659Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 03:30:35.935439 containerd[1609]: time="2025-09-09T03:30:35.935374131Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 7.6063974s" Sep 9 03:30:35.937104 containerd[1609]: time="2025-09-09T03:30:35.936977625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 03:30:35.950667 containerd[1609]: time="2025-09-09T03:30:35.949383892Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:35.952900 containerd[1609]: time="2025-09-09T03:30:35.951794322Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:35.953254 containerd[1609]: time="2025-09-09T03:30:35.952926930Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:35.954560 containerd[1609]: time="2025-09-09T03:30:35.954523996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 03:30:35.972820 containerd[1609]: time="2025-09-09T03:30:35.972705855Z" level=info msg="CreateContainer within sandbox \"3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 03:30:36.001140 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2680248890.mount: Deactivated successfully. Sep 9 03:30:36.007500 containerd[1609]: time="2025-09-09T03:30:36.007449944Z" level=info msg="CreateContainer within sandbox \"3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"f21845a758fca601f1f42de2818f66daeee41b3be7de750d8f969ba0eeefb384\"" Sep 9 03:30:36.010467 containerd[1609]: time="2025-09-09T03:30:36.010434655Z" level=info msg="StartContainer for \"f21845a758fca601f1f42de2818f66daeee41b3be7de750d8f969ba0eeefb384\"" Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:35.801 [WARNING][5093] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0", GenerateName:"calico-kube-controllers-567cf5f5d8-", Namespace:"calico-system", SelfLink:"", UID:"60a1b3a8-da11-4959-aadb-274c0e91ef5a", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"567cf5f5d8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b", Pod:"calico-kube-controllers-567cf5f5d8-smmz5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali10723933637", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:35.804 [INFO][5093] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:35.804 [INFO][5093] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" iface="eth0" netns="" Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:35.804 [INFO][5093] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:35.806 [INFO][5093] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:36.019 [INFO][5108] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:36.021 [INFO][5108] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:36.021 [INFO][5108] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:36.043 [WARNING][5108] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:36.043 [INFO][5108] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:36.053 [INFO][5108] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:36.069917 containerd[1609]: 2025-09-09 03:30:36.057 [INFO][5093] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:36.072183 containerd[1609]: time="2025-09-09T03:30:36.069979309Z" level=info msg="TearDown network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\" successfully" Sep 9 03:30:36.072183 containerd[1609]: time="2025-09-09T03:30:36.070015232Z" level=info msg="StopPodSandbox for \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\" returns successfully" Sep 9 03:30:36.112313 containerd[1609]: time="2025-09-09T03:30:36.112212321Z" level=info msg="RemovePodSandbox for \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\"" Sep 9 03:30:36.117881 containerd[1609]: time="2025-09-09T03:30:36.117610519Z" level=info msg="Forcibly stopping sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\"" Sep 9 03:30:36.251913 containerd[1609]: time="2025-09-09T03:30:36.251505640Z" level=info msg="StartContainer for \"f21845a758fca601f1f42de2818f66daeee41b3be7de750d8f969ba0eeefb384\" returns 
successfully" Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.210 [WARNING][5149] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0", GenerateName:"calico-kube-controllers-567cf5f5d8-", Namespace:"calico-system", SelfLink:"", UID:"60a1b3a8-da11-4959-aadb-274c0e91ef5a", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"567cf5f5d8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b", Pod:"calico-kube-controllers-567cf5f5d8-smmz5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali10723933637", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.211 [INFO][5149] cni-plugin/k8s.go 
640: Cleaning up netns ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.211 [INFO][5149] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" iface="eth0" netns="" Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.211 [INFO][5149] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.211 [INFO][5149] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.300 [INFO][5157] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.300 [INFO][5157] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.300 [INFO][5157] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.321 [WARNING][5157] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.321 [INFO][5157] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" HandleID="k8s-pod-network.422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--kube--controllers--567cf5f5d8--smmz5-eth0" Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.330 [INFO][5157] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:36.351067 containerd[1609]: 2025-09-09 03:30:36.344 [INFO][5149] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6" Sep 9 03:30:36.354327 containerd[1609]: time="2025-09-09T03:30:36.351515347Z" level=info msg="TearDown network for sandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\" successfully" Sep 9 03:30:36.395779 containerd[1609]: time="2025-09-09T03:30:36.395354896Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 9 03:30:36.404897 containerd[1609]: time="2025-09-09T03:30:36.404742345Z" level=info msg="RemovePodSandbox \"422f91bfa8d8e1757c3da95011e16991db65df47239ef368e78d9093ef3626d6\" returns successfully" Sep 9 03:30:36.405833 containerd[1609]: time="2025-09-09T03:30:36.405480832Z" level=info msg="StopPodSandbox for \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\"" Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.464 [WARNING][5184] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d110bb2c-45e8-4b87-97ca-6d0d4a087fb1", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca", Pod:"coredns-7c65d6cfc9-grrnb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93a6f820512", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.465 [INFO][5184] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.465 [INFO][5184] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" iface="eth0" netns="" Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.465 [INFO][5184] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.465 [INFO][5184] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.531 [INFO][5191] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.532 [INFO][5191] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.532 [INFO][5191] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.549 [WARNING][5191] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.550 [INFO][5191] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.560 [INFO][5191] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:36.566113 containerd[1609]: 2025-09-09 03:30:36.563 [INFO][5184] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:36.567707 containerd[1609]: time="2025-09-09T03:30:36.566191559Z" level=info msg="TearDown network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\" successfully" Sep 9 03:30:36.567707 containerd[1609]: time="2025-09-09T03:30:36.566249929Z" level=info msg="StopPodSandbox for \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\" returns successfully" Sep 9 03:30:36.567707 containerd[1609]: time="2025-09-09T03:30:36.567006788Z" level=info msg="RemovePodSandbox for \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\"" Sep 9 03:30:36.567707 containerd[1609]: time="2025-09-09T03:30:36.567110789Z" level=info msg="Forcibly stopping sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\"" Sep 9 03:30:36.584975 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:36.582694 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:36.582792 systemd-resolved[1508]: Flushed all caches. Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.629 [WARNING][5207] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d110bb2c-45e8-4b87-97ca-6d0d4a087fb1", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"9853af21499970769348a77e0a8a3c9f4db61f40a04f53bfca5b4d0a284ac5ca", Pod:"coredns-7c65d6cfc9-grrnb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93a6f820512", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:36.691444 containerd[1609]: 
2025-09-09 03:30:36.629 [INFO][5207] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.629 [INFO][5207] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" iface="eth0" netns="" Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.629 [INFO][5207] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.629 [INFO][5207] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.664 [INFO][5214] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.665 [INFO][5214] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.665 [INFO][5214] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.680 [WARNING][5214] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.680 [INFO][5214] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" HandleID="k8s-pod-network.c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--grrnb-eth0" Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.687 [INFO][5214] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:36.691444 containerd[1609]: 2025-09-09 03:30:36.689 [INFO][5207] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5" Sep 9 03:30:36.693972 containerd[1609]: time="2025-09-09T03:30:36.691492361Z" level=info msg="TearDown network for sandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\" successfully" Sep 9 03:30:36.695995 containerd[1609]: time="2025-09-09T03:30:36.695957936Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 9 03:30:36.696086 containerd[1609]: time="2025-09-09T03:30:36.696038049Z" level=info msg="RemovePodSandbox \"c79bf566266e663329435c892643d4a8a46df0bdaf1391cb84a30f3e6a0952d5\" returns successfully" Sep 9 03:30:36.696788 containerd[1609]: time="2025-09-09T03:30:36.696726378Z" level=info msg="StopPodSandbox for \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\"" Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.755 [WARNING][5229] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0", GenerateName:"calico-apiserver-8dcb5dd78-", Namespace:"calico-apiserver", SelfLink:"", UID:"6e64133a-dcc9-4620-a059-3825b5dc1d54", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dcb5dd78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9", Pod:"calico-apiserver-8dcb5dd78-47vcp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4a398001ad0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.756 [INFO][5229] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.756 [INFO][5229] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" iface="eth0" netns="" Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.756 [INFO][5229] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.756 [INFO][5229] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.793 [INFO][5236] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.794 [INFO][5236] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.794 [INFO][5236] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.810 [WARNING][5236] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.810 [INFO][5236] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.818 [INFO][5236] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:36.822848 containerd[1609]: 2025-09-09 03:30:36.820 [INFO][5229] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:36.822848 containerd[1609]: time="2025-09-09T03:30:36.822815582Z" level=info msg="TearDown network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\" successfully" Sep 9 03:30:36.823665 containerd[1609]: time="2025-09-09T03:30:36.822857047Z" level=info msg="StopPodSandbox for \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\" returns successfully" Sep 9 03:30:36.825322 containerd[1609]: time="2025-09-09T03:30:36.825279606Z" level=info msg="RemovePodSandbox for \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\"" Sep 9 03:30:36.825408 containerd[1609]: time="2025-09-09T03:30:36.825351210Z" level=info msg="Forcibly stopping sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\"" Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.897 [WARNING][5250] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0", GenerateName:"calico-apiserver-8dcb5dd78-", Namespace:"calico-apiserver", SelfLink:"", UID:"6e64133a-dcc9-4620-a059-3825b5dc1d54", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dcb5dd78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9", Pod:"calico-apiserver-8dcb5dd78-47vcp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4a398001ad0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.898 [INFO][5250] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.898 [INFO][5250] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" iface="eth0" netns="" Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.898 [INFO][5250] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.898 [INFO][5250] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.932 [INFO][5257] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.933 [INFO][5257] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.933 [INFO][5257] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.950 [WARNING][5257] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.950 [INFO][5257] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" HandleID="k8s-pod-network.feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--47vcp-eth0" Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.958 [INFO][5257] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:36.962806 containerd[1609]: 2025-09-09 03:30:36.960 [INFO][5250] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e" Sep 9 03:30:36.965456 containerd[1609]: time="2025-09-09T03:30:36.963681563Z" level=info msg="TearDown network for sandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\" successfully" Sep 9 03:30:36.968084 containerd[1609]: time="2025-09-09T03:30:36.967874310Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 9 03:30:36.968084 containerd[1609]: time="2025-09-09T03:30:36.967950839Z" level=info msg="RemovePodSandbox \"feaf008adf2a8ed338a6d3f2faaaf634ba5101ad4eaea1a88f09784872df755e\" returns successfully" Sep 9 03:30:36.969269 containerd[1609]: time="2025-09-09T03:30:36.969188104Z" level=info msg="StopPodSandbox for \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\"" Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.027 [WARNING][5272] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0", GenerateName:"calico-apiserver-8dcb5dd78-", Namespace:"calico-apiserver", SelfLink:"", UID:"ee2b345f-7033-4c91-91dd-2cd3dbef2a5b", ResourceVersion:"933", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dcb5dd78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45", Pod:"calico-apiserver-8dcb5dd78-sjn5k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4588905fc42", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.028 [INFO][5272] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.028 [INFO][5272] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" iface="eth0" netns="" Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.028 [INFO][5272] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.028 [INFO][5272] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.087 [INFO][5279] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.087 [INFO][5279] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.087 [INFO][5279] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.101 [WARNING][5279] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.101 [INFO][5279] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.113 [INFO][5279] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:37.118402 containerd[1609]: 2025-09-09 03:30:37.115 [INFO][5272] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:37.121713 containerd[1609]: time="2025-09-09T03:30:37.118988189Z" level=info msg="TearDown network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\" successfully" Sep 9 03:30:37.121713 containerd[1609]: time="2025-09-09T03:30:37.119053851Z" level=info msg="StopPodSandbox for \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\" returns successfully" Sep 9 03:30:37.122943 containerd[1609]: time="2025-09-09T03:30:37.122442751Z" level=info msg="RemovePodSandbox for \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\"" Sep 9 03:30:37.122943 containerd[1609]: time="2025-09-09T03:30:37.122478047Z" level=info msg="Forcibly stopping sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\"" Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.192 [WARNING][5294] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0", GenerateName:"calico-apiserver-8dcb5dd78-", Namespace:"calico-apiserver", SelfLink:"", UID:"ee2b345f-7033-4c91-91dd-2cd3dbef2a5b", ResourceVersion:"933", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dcb5dd78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45", Pod:"calico-apiserver-8dcb5dd78-sjn5k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4588905fc42", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.192 [INFO][5294] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.192 [INFO][5294] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" iface="eth0" netns="" Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.192 [INFO][5294] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.192 [INFO][5294] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.225 [INFO][5301] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.225 [INFO][5301] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.225 [INFO][5301] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.237 [WARNING][5301] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.237 [INFO][5301] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" HandleID="k8s-pod-network.dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Workload="srv--sy1m0.gb1.brightbox.com-k8s-calico--apiserver--8dcb5dd78--sjn5k-eth0" Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.464 [INFO][5301] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:37.475168 containerd[1609]: 2025-09-09 03:30:37.466 [INFO][5294] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be" Sep 9 03:30:37.478938 containerd[1609]: time="2025-09-09T03:30:37.477221764Z" level=info msg="TearDown network for sandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\" successfully" Sep 9 03:30:37.530866 containerd[1609]: time="2025-09-09T03:30:37.530342726Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 9 03:30:37.530866 containerd[1609]: time="2025-09-09T03:30:37.530463484Z" level=info msg="RemovePodSandbox \"dd29cde6c133ea9f6c9a0bc633c3d3fff9f979104a8569665fad155f7cd347be\" returns successfully" Sep 9 03:30:37.533200 containerd[1609]: time="2025-09-09T03:30:37.532696617Z" level=info msg="StopPodSandbox for \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\"" Sep 9 03:30:37.598178 systemd[1]: run-containerd-runc-k8s.io-f21845a758fca601f1f42de2818f66daeee41b3be7de750d8f969ba0eeefb384-runc.DfEkBH.mount: Deactivated successfully. Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.762 [WARNING][5318] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"c3e83bd5-7012-441b-9934-0559a0c6192c", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65", Pod:"goldmane-7988f88666-vgmql", Endpoint:"eth0", ServiceAccountName:"goldmane", 
IPNetworks:[]string{"192.168.61.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a9354c34b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.763 [INFO][5318] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.763 [INFO][5318] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" iface="eth0" netns="" Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.763 [INFO][5318] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.763 [INFO][5318] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.828 [INFO][5345] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.829 [INFO][5345] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.829 [INFO][5345] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.847 [WARNING][5345] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.848 [INFO][5345] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.858 [INFO][5345] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:37.865695 containerd[1609]: 2025-09-09 03:30:37.861 [INFO][5318] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:37.884606 containerd[1609]: time="2025-09-09T03:30:37.884257163Z" level=info msg="TearDown network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\" successfully" Sep 9 03:30:37.884606 containerd[1609]: time="2025-09-09T03:30:37.884317133Z" level=info msg="StopPodSandbox for \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\" returns successfully" Sep 9 03:30:37.886896 containerd[1609]: time="2025-09-09T03:30:37.885456833Z" level=info msg="RemovePodSandbox for \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\"" Sep 9 03:30:37.886896 containerd[1609]: time="2025-09-09T03:30:37.885501146Z" level=info msg="Forcibly stopping sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\"" Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:37.969 [WARNING][5360] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"c3e83bd5-7012-441b-9934-0559a0c6192c", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"3684ef3163a528c510cd60be9dcf724593e13511682d4a5d4b7116c5ee211f65", Pod:"goldmane-7988f88666-vgmql", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a9354c34b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:37.970 [INFO][5360] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:37.970 [INFO][5360] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" iface="eth0" netns="" Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:37.971 [INFO][5360] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:37.971 [INFO][5360] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:38.031 [INFO][5368] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:38.032 [INFO][5368] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:38.033 [INFO][5368] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:38.054 [WARNING][5368] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:38.054 [INFO][5368] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" HandleID="k8s-pod-network.7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Workload="srv--sy1m0.gb1.brightbox.com-k8s-goldmane--7988f88666--vgmql-eth0" Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:38.065 [INFO][5368] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:38.071476 containerd[1609]: 2025-09-09 03:30:38.068 [INFO][5360] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b" Sep 9 03:30:38.074036 containerd[1609]: time="2025-09-09T03:30:38.072578023Z" level=info msg="TearDown network for sandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\" successfully" Sep 9 03:30:38.079240 containerd[1609]: time="2025-09-09T03:30:38.078800376Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 9 03:30:38.079240 containerd[1609]: time="2025-09-09T03:30:38.078910206Z" level=info msg="RemovePodSandbox \"7d865b3178ba08c1acc16a9410b0a4991d4d128006b631bb51b8b33c377fef1b\" returns successfully" Sep 9 03:30:38.080493 containerd[1609]: time="2025-09-09T03:30:38.080442381Z" level=info msg="StopPodSandbox for \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\"" Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.168 [WARNING][5382] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26", Pod:"csi-node-driver-hzbnh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie723ed321d6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.169 [INFO][5382] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.169 [INFO][5382] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" iface="eth0" netns="" Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.169 [INFO][5382] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.169 [INFO][5382] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.218 [INFO][5389] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.218 [INFO][5389] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.218 [INFO][5389] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.239 [WARNING][5389] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.240 [INFO][5389] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.247 [INFO][5389] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:38.255529 containerd[1609]: 2025-09-09 03:30:38.251 [INFO][5382] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:38.258240 containerd[1609]: time="2025-09-09T03:30:38.255536691Z" level=info msg="TearDown network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\" successfully" Sep 9 03:30:38.258240 containerd[1609]: time="2025-09-09T03:30:38.256800071Z" level=info msg="StopPodSandbox for \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\" returns successfully" Sep 9 03:30:38.258792 containerd[1609]: time="2025-09-09T03:30:38.258388022Z" level=info msg="RemovePodSandbox for \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\"" Sep 9 03:30:38.258792 containerd[1609]: time="2025-09-09T03:30:38.258439623Z" level=info msg="Forcibly stopping sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\"" Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.384 [WARNING][5403] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4d5e060a-0d3e-4ba7-8cd9-233cfcbfe664", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26", Pod:"csi-node-driver-hzbnh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie723ed321d6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.385 [INFO][5403] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.385 [INFO][5403] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" iface="eth0" netns="" Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.385 [INFO][5403] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.385 [INFO][5403] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.446 [INFO][5410] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.447 [INFO][5410] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.447 [INFO][5410] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.462 [WARNING][5410] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.462 [INFO][5410] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" HandleID="k8s-pod-network.3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Workload="srv--sy1m0.gb1.brightbox.com-k8s-csi--node--driver--hzbnh-eth0" Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.472 [INFO][5410] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:38.483802 containerd[1609]: 2025-09-09 03:30:38.475 [INFO][5403] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2" Sep 9 03:30:38.483802 containerd[1609]: time="2025-09-09T03:30:38.483246707Z" level=info msg="TearDown network for sandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\" successfully" Sep 9 03:30:38.487872 containerd[1609]: time="2025-09-09T03:30:38.487822241Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 9 03:30:38.488072 containerd[1609]: time="2025-09-09T03:30:38.488037934Z" level=info msg="RemovePodSandbox \"3b452541da54573cc9189ab6e28fad583f352f95df62b4ad2fffd71811e99df2\" returns successfully" Sep 9 03:30:38.490151 containerd[1609]: time="2025-09-09T03:30:38.490095927Z" level=info msg="StopPodSandbox for \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\"" Sep 9 03:30:38.666077 containerd[1609]: time="2025-09-09T03:30:38.665981634Z" level=info msg="StopPodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\"" Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.620 [WARNING][5424] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"016e9bee-9c8a-4c0a-b6a2-ef76e3666084", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567", Pod:"coredns-7c65d6cfc9-mfkn7", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.61.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidd9e6b23bb7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.622 [INFO][5424] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.622 [INFO][5424] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" iface="eth0" netns="" Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.622 [INFO][5424] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.622 [INFO][5424] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.779 [INFO][5441] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.780 [INFO][5441] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.780 [INFO][5441] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.811 [WARNING][5441] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.812 [INFO][5441] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.823 [INFO][5441] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:38.834296 containerd[1609]: 2025-09-09 03:30:38.830 [INFO][5424] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:38.834296 containerd[1609]: time="2025-09-09T03:30:38.834262644Z" level=info msg="TearDown network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\" successfully" Sep 9 03:30:38.834296 containerd[1609]: time="2025-09-09T03:30:38.834305804Z" level=info msg="StopPodSandbox for \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\" returns successfully" Sep 9 03:30:38.839100 containerd[1609]: time="2025-09-09T03:30:38.836303819Z" level=info msg="RemovePodSandbox for \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\"" Sep 9 03:30:38.839100 containerd[1609]: time="2025-09-09T03:30:38.836357835Z" level=info msg="Forcibly stopping sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\"" Sep 9 03:30:38.909005 kubelet[2856]: I0909 03:30:38.908490 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-vgmql" podStartSLOduration=34.293652106 
podStartE2EDuration="41.908413799s" podCreationTimestamp="2025-09-09 03:29:57 +0000 UTC" firstStartedPulling="2025-09-09 03:30:28.32543708 +0000 UTC m=+53.060077131" lastFinishedPulling="2025-09-09 03:30:35.940198774 +0000 UTC m=+60.674838824" observedRunningTime="2025-09-09 03:30:36.515247958 +0000 UTC m=+61.249888019" watchObservedRunningTime="2025-09-09 03:30:38.908413799 +0000 UTC m=+63.643053857" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.901 [INFO][5465] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.903 [INFO][5465] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" iface="eth0" netns="/var/run/netns/cni-d327727e-851a-aa19-30d4-85e3942c6652" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.905 [INFO][5465] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" iface="eth0" netns="/var/run/netns/cni-d327727e-851a-aa19-30d4-85e3942c6652" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.906 [INFO][5465] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" iface="eth0" netns="/var/run/netns/cni-d327727e-851a-aa19-30d4-85e3942c6652" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.906 [INFO][5465] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.906 [INFO][5465] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.982 [INFO][5488] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.982 [INFO][5488] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:38.982 [INFO][5488] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:39.016 [WARNING][5488] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:39.016 [INFO][5488] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:39.024 [INFO][5488] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:39.033064 containerd[1609]: 2025-09-09 03:30:39.028 [INFO][5465] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:30:39.041734 containerd[1609]: time="2025-09-09T03:30:39.040307941Z" level=info msg="TearDown network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\" successfully" Sep 9 03:30:39.041734 containerd[1609]: time="2025-09-09T03:30:39.040368323Z" level=info msg="StopPodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\" returns successfully" Sep 9 03:30:39.039782 systemd[1]: run-netns-cni\x2dd327727e\x2d851a\x2daa19\x2d30d4\x2d85e3942c6652.mount: Deactivated successfully. 
Sep 9 03:30:39.081360 kubelet[2856]: I0909 03:30:39.081289 2856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbb3505-bd00-459e-814c-f3717bb189c3-whisker-ca-bundle\") pod \"3fbb3505-bd00-459e-814c-f3717bb189c3\" (UID: \"3fbb3505-bd00-459e-814c-f3717bb189c3\") " Sep 9 03:30:39.082084 kubelet[2856]: I0909 03:30:39.081700 2856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj5m5\" (UniqueName: \"kubernetes.io/projected/3fbb3505-bd00-459e-814c-f3717bb189c3-kube-api-access-xj5m5\") pod \"3fbb3505-bd00-459e-814c-f3717bb189c3\" (UID: \"3fbb3505-bd00-459e-814c-f3717bb189c3\") " Sep 9 03:30:39.082084 kubelet[2856]: I0909 03:30:39.081818 2856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3fbb3505-bd00-459e-814c-f3717bb189c3-whisker-backend-key-pair\") pod \"3fbb3505-bd00-459e-814c-f3717bb189c3\" (UID: \"3fbb3505-bd00-459e-814c-f3717bb189c3\") " Sep 9 03:30:39.082084 kubelet[2856]: I0909 03:30:39.080939 2856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbb3505-bd00-459e-814c-f3717bb189c3-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "3fbb3505-bd00-459e-814c-f3717bb189c3" (UID: "3fbb3505-bd00-459e-814c-f3717bb189c3"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 9 03:30:39.114794 kubelet[2856]: I0909 03:30:39.113543 2856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbb3505-bd00-459e-814c-f3717bb189c3-kube-api-access-xj5m5" (OuterVolumeSpecName: "kube-api-access-xj5m5") pod "3fbb3505-bd00-459e-814c-f3717bb189c3" (UID: "3fbb3505-bd00-459e-814c-f3717bb189c3"). InnerVolumeSpecName "kube-api-access-xj5m5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 9 03:30:39.115347 kubelet[2856]: I0909 03:30:39.115287 2856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fbb3505-bd00-459e-814c-f3717bb189c3-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "3fbb3505-bd00-459e-814c-f3717bb189c3" (UID: "3fbb3505-bd00-459e-814c-f3717bb189c3"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 9 03:30:39.115735 systemd[1]: var-lib-kubelet-pods-3fbb3505\x2dbd00\x2d459e\x2d814c\x2df3717bb189c3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dxj5m5.mount: Deactivated successfully. Sep 9 03:30:39.116169 systemd[1]: var-lib-kubelet-pods-3fbb3505\x2dbd00\x2d459e\x2d814c\x2df3717bb189c3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:38.985 [WARNING][5482] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"016e9bee-9c8a-4c0a-b6a2-ef76e3666084", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 29, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"ec02e88330934aa5dd94dce8053c2530d7afa9989ba53cb87314d6eec572a567", Pod:"coredns-7c65d6cfc9-mfkn7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidd9e6b23bb7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:39.158565 containerd[1609]: 
2025-09-09 03:30:38.986 [INFO][5482] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:38.986 [INFO][5482] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" iface="eth0" netns="" Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:38.986 [INFO][5482] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:38.986 [INFO][5482] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:39.063 [INFO][5497] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:39.063 [INFO][5497] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:39.064 [INFO][5497] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:39.139 [WARNING][5497] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:39.139 [INFO][5497] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" HandleID="k8s-pod-network.b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Workload="srv--sy1m0.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--mfkn7-eth0" Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:39.151 [INFO][5497] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:39.158565 containerd[1609]: 2025-09-09 03:30:39.155 [INFO][5482] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f" Sep 9 03:30:39.159699 containerd[1609]: time="2025-09-09T03:30:39.158656825Z" level=info msg="TearDown network for sandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\" successfully" Sep 9 03:30:39.165422 containerd[1609]: time="2025-09-09T03:30:39.165190941Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 9 03:30:39.165422 containerd[1609]: time="2025-09-09T03:30:39.165260926Z" level=info msg="RemovePodSandbox \"b124ede51684f6215b32d7cb9be99e53b28a7e49028b038d5f6c0a3498164d7f\" returns successfully" Sep 9 03:30:39.183240 kubelet[2856]: I0909 03:30:39.182906 2856 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbb3505-bd00-459e-814c-f3717bb189c3-whisker-ca-bundle\") on node \"srv-sy1m0.gb1.brightbox.com\" DevicePath \"\"" Sep 9 03:30:39.183240 kubelet[2856]: I0909 03:30:39.183053 2856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj5m5\" (UniqueName: \"kubernetes.io/projected/3fbb3505-bd00-459e-814c-f3717bb189c3-kube-api-access-xj5m5\") on node \"srv-sy1m0.gb1.brightbox.com\" DevicePath \"\"" Sep 9 03:30:39.183240 kubelet[2856]: I0909 03:30:39.183112 2856 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3fbb3505-bd00-459e-814c-f3717bb189c3-whisker-backend-key-pair\") on node \"srv-sy1m0.gb1.brightbox.com\" DevicePath \"\"" Sep 9 03:30:39.688994 kubelet[2856]: I0909 03:30:39.688925 2856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbb3505-bd00-459e-814c-f3717bb189c3" path="/var/lib/kubelet/pods/3fbb3505-bd00-459e-814c-f3717bb189c3/volumes" Sep 9 03:30:39.891100 kubelet[2856]: I0909 03:30:39.890687 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsq8z\" (UniqueName: \"kubernetes.io/projected/0998487c-dcc6-48fd-8c02-4b56e02dd79f-kube-api-access-xsq8z\") pod \"whisker-7b6b59449-spbzv\" (UID: \"0998487c-dcc6-48fd-8c02-4b56e02dd79f\") " pod="calico-system/whisker-7b6b59449-spbzv" Sep 9 03:30:39.891100 kubelet[2856]: I0909 03:30:39.890923 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0998487c-dcc6-48fd-8c02-4b56e02dd79f-whisker-ca-bundle\") pod \"whisker-7b6b59449-spbzv\" (UID: \"0998487c-dcc6-48fd-8c02-4b56e02dd79f\") " pod="calico-system/whisker-7b6b59449-spbzv" Sep 9 03:30:39.891100 kubelet[2856]: I0909 03:30:39.890967 2856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0998487c-dcc6-48fd-8c02-4b56e02dd79f-whisker-backend-key-pair\") pod \"whisker-7b6b59449-spbzv\" (UID: \"0998487c-dcc6-48fd-8c02-4b56e02dd79f\") " pod="calico-system/whisker-7b6b59449-spbzv" Sep 9 03:30:40.128716 containerd[1609]: time="2025-09-09T03:30:40.128277325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b6b59449-spbzv,Uid:0998487c-dcc6-48fd-8c02-4b56e02dd79f,Namespace:calico-system,Attempt:0,}" Sep 9 03:30:40.553698 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:40.550158 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:40.550248 systemd-resolved[1508]: Flushed all caches. 
Sep 9 03:30:40.671285 systemd-networkd[1258]: cali71f8407679d: Link UP Sep 9 03:30:40.684024 systemd-networkd[1258]: cali71f8407679d: Gained carrier Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.411 [INFO][5513] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0 whisker-7b6b59449- calico-system 0998487c-dcc6-48fd-8c02-4b56e02dd79f 1007 0 2025-09-09 03:30:39 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7b6b59449 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-sy1m0.gb1.brightbox.com whisker-7b6b59449-spbzv eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali71f8407679d [] [] }} ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Namespace="calico-system" Pod="whisker-7b6b59449-spbzv" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.411 [INFO][5513] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Namespace="calico-system" Pod="whisker-7b6b59449-spbzv" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.494 [INFO][5525] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" HandleID="k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.494 [INFO][5525] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" HandleID="k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f770), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-sy1m0.gb1.brightbox.com", "pod":"whisker-7b6b59449-spbzv", "timestamp":"2025-09-09 03:30:40.494241712 +0000 UTC"}, Hostname:"srv-sy1m0.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.494 [INFO][5525] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.494 [INFO][5525] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.494 [INFO][5525] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-sy1m0.gb1.brightbox.com' Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.513 [INFO][5525] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.528 [INFO][5525] ipam/ipam.go 394: Looking up existing affinities for host host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.546 [INFO][5525] ipam/ipam.go 511: Trying affinity for 192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.551 [INFO][5525] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.560 [INFO][5525] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.0/26 host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.561 [INFO][5525] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.0/26 handle="k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.568 [INFO][5525] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.583 [INFO][5525] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.0/26 handle="k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.638 [INFO][5525] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.8/26] block=192.168.61.0/26 handle="k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.642 [INFO][5525] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.8/26] handle="k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" host="srv-sy1m0.gb1.brightbox.com" Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.643 [INFO][5525] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:30:40.741378 containerd[1609]: 2025-09-09 03:30:40.643 [INFO][5525] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.8/26] IPv6=[] ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" HandleID="k8s-pod-network.fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" Sep 9 03:30:40.746901 containerd[1609]: 2025-09-09 03:30:40.654 [INFO][5513] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Namespace="calico-system" Pod="whisker-7b6b59449-spbzv" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0", GenerateName:"whisker-7b6b59449-", Namespace:"calico-system", SelfLink:"", UID:"0998487c-dcc6-48fd-8c02-4b56e02dd79f", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 30, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b6b59449", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"", Pod:"whisker-7b6b59449-spbzv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali71f8407679d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:40.746901 containerd[1609]: 2025-09-09 03:30:40.655 [INFO][5513] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.8/32] ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Namespace="calico-system" Pod="whisker-7b6b59449-spbzv" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" Sep 9 03:30:40.746901 containerd[1609]: 2025-09-09 03:30:40.655 [INFO][5513] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali71f8407679d ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Namespace="calico-system" Pod="whisker-7b6b59449-spbzv" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" Sep 9 03:30:40.746901 containerd[1609]: 2025-09-09 03:30:40.687 [INFO][5513] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Namespace="calico-system" Pod="whisker-7b6b59449-spbzv" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" Sep 9 03:30:40.746901 containerd[1609]: 2025-09-09 03:30:40.689 [INFO][5513] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" Namespace="calico-system" Pod="whisker-7b6b59449-spbzv" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0", GenerateName:"whisker-7b6b59449-", Namespace:"calico-system", SelfLink:"", UID:"0998487c-dcc6-48fd-8c02-4b56e02dd79f", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 3, 30, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b6b59449", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-sy1m0.gb1.brightbox.com", ContainerID:"fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c", Pod:"whisker-7b6b59449-spbzv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali71f8407679d", MAC:"36:26:d7:50:8a:8c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 03:30:40.746901 containerd[1609]: 2025-09-09 03:30:40.716 [INFO][5513] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c" 
Namespace="calico-system" Pod="whisker-7b6b59449-spbzv" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--7b6b59449--spbzv-eth0" Sep 9 03:30:40.954558 containerd[1609]: time="2025-09-09T03:30:40.954221251Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 9 03:30:40.958372 containerd[1609]: time="2025-09-09T03:30:40.956866436Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 9 03:30:40.958372 containerd[1609]: time="2025-09-09T03:30:40.956901708Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:40.958372 containerd[1609]: time="2025-09-09T03:30:40.957061633Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 9 03:30:41.113005 containerd[1609]: time="2025-09-09T03:30:41.112946974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b6b59449-spbzv,Uid:0998487c-dcc6-48fd-8c02-4b56e02dd79f,Namespace:calico-system,Attempt:0,} returns sandbox id \"fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c\"" Sep 9 03:30:41.533107 containerd[1609]: time="2025-09-09T03:30:41.533034993Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:41.535587 containerd[1609]: time="2025-09-09T03:30:41.535503290Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 03:30:41.536578 containerd[1609]: time="2025-09-09T03:30:41.536500586Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:41.541437 
containerd[1609]: time="2025-09-09T03:30:41.540846810Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:41.542517 containerd[1609]: time="2025-09-09T03:30:41.542401566Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 5.587826537s" Sep 9 03:30:41.542676 containerd[1609]: time="2025-09-09T03:30:41.542645362Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 03:30:41.546220 containerd[1609]: time="2025-09-09T03:30:41.546188204Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 03:30:41.579854 containerd[1609]: time="2025-09-09T03:30:41.579534481Z" level=info msg="CreateContainer within sandbox \"16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 03:30:41.601095 containerd[1609]: time="2025-09-09T03:30:41.601039193Z" level=info msg="CreateContainer within sandbox \"16e25a5f758e2feca8d03dcf3a1147013d4e24aaeea21e6df6adb15f3cfc644b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"350bb43386dc965c3434b577da742e1a9fb05318b237d5b392ab76618f2bbc40\"" Sep 9 03:30:41.603997 containerd[1609]: time="2025-09-09T03:30:41.603962365Z" level=info msg="StartContainer for \"350bb43386dc965c3434b577da742e1a9fb05318b237d5b392ab76618f2bbc40\"" Sep 9 03:30:41.735096 containerd[1609]: 
time="2025-09-09T03:30:41.735015752Z" level=info msg="StartContainer for \"350bb43386dc965c3434b577da742e1a9fb05318b237d5b392ab76618f2bbc40\" returns successfully" Sep 9 03:30:41.830245 systemd-networkd[1258]: cali71f8407679d: Gained IPv6LL Sep 9 03:30:42.607887 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:42.606135 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:42.606161 systemd-resolved[1508]: Flushed all caches. Sep 9 03:30:42.779538 kubelet[2856]: I0909 03:30:42.776981 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-567cf5f5d8-smmz5" podStartSLOduration=32.874770555 podStartE2EDuration="44.776936338s" podCreationTimestamp="2025-09-09 03:29:58 +0000 UTC" firstStartedPulling="2025-09-09 03:30:29.64197531 +0000 UTC m=+54.376615355" lastFinishedPulling="2025-09-09 03:30:41.544141086 +0000 UTC m=+66.278781138" observedRunningTime="2025-09-09 03:30:42.620950505 +0000 UTC m=+67.355590568" watchObservedRunningTime="2025-09-09 03:30:42.776936338 +0000 UTC m=+67.511576412" Sep 9 03:30:43.611074 containerd[1609]: time="2025-09-09T03:30:43.610736061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:43.615139 containerd[1609]: time="2025-09-09T03:30:43.614809419Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 03:30:43.617985 containerd[1609]: time="2025-09-09T03:30:43.617406682Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:43.624109 containerd[1609]: time="2025-09-09T03:30:43.623990968Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:43.626037 containerd[1609]: time="2025-09-09T03:30:43.625819709Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.079579315s" Sep 9 03:30:43.626037 containerd[1609]: time="2025-09-09T03:30:43.625870458Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 03:30:43.630335 containerd[1609]: time="2025-09-09T03:30:43.630295685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 03:30:43.646120 containerd[1609]: time="2025-09-09T03:30:43.645948886Z" level=info msg="CreateContainer within sandbox \"da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 03:30:43.695855 containerd[1609]: time="2025-09-09T03:30:43.694114144Z" level=info msg="CreateContainer within sandbox \"da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"c1a42e9ecc4247de11a6b0e065cfac2d696cafd8daf1b87dfb9b8613b5bc293c\"" Sep 9 03:30:43.710128 containerd[1609]: time="2025-09-09T03:30:43.709818082Z" level=info msg="StartContainer for \"c1a42e9ecc4247de11a6b0e065cfac2d696cafd8daf1b87dfb9b8613b5bc293c\"" Sep 9 03:30:43.899676 containerd[1609]: time="2025-09-09T03:30:43.899544949Z" level=info msg="StartContainer for \"c1a42e9ecc4247de11a6b0e065cfac2d696cafd8daf1b87dfb9b8613b5bc293c\" returns successfully" Sep 9 03:30:46.699143 systemd-journald[1180]: Under memory pressure, flushing caches. 
Sep 9 03:30:46.698898 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:46.698965 systemd-resolved[1508]: Flushed all caches. Sep 9 03:30:48.747882 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:48.743888 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:48.743900 systemd-resolved[1508]: Flushed all caches. Sep 9 03:30:49.774720 containerd[1609]: time="2025-09-09T03:30:49.774232356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:49.782088 containerd[1609]: time="2025-09-09T03:30:49.777468516Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 03:30:49.782348 containerd[1609]: time="2025-09-09T03:30:49.782309887Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:49.894160 containerd[1609]: time="2025-09-09T03:30:49.894081699Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:49.896141 containerd[1609]: time="2025-09-09T03:30:49.896103234Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 6.265067939s" Sep 9 03:30:49.899064 containerd[1609]: time="2025-09-09T03:30:49.896264037Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference 
\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 03:30:49.996152 containerd[1609]: time="2025-09-09T03:30:49.996099203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 03:30:50.106363 containerd[1609]: time="2025-09-09T03:30:50.106170188Z" level=info msg="CreateContainer within sandbox \"089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 03:30:50.132802 containerd[1609]: time="2025-09-09T03:30:50.129848295Z" level=info msg="CreateContainer within sandbox \"089890473e12ce3f6825d444330e7b332105d731112f4e443dbb921957301b45\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f3bb664b5c731750a4a4566cd4beb92150d7090b8a6339fc3022b56128a12822\"" Sep 9 03:30:50.158822 containerd[1609]: time="2025-09-09T03:30:50.158713547Z" level=info msg="StartContainer for \"f3bb664b5c731750a4a4566cd4beb92150d7090b8a6339fc3022b56128a12822\"" Sep 9 03:30:50.461680 containerd[1609]: time="2025-09-09T03:30:50.461455229Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:50.464762 containerd[1609]: time="2025-09-09T03:30:50.464600280Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 03:30:50.473024 containerd[1609]: time="2025-09-09T03:30:50.471387854Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 473.464484ms" Sep 9 03:30:50.473024 containerd[1609]: time="2025-09-09T03:30:50.471483256Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 03:30:50.480219 containerd[1609]: time="2025-09-09T03:30:50.476507874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 03:30:50.572737 containerd[1609]: time="2025-09-09T03:30:50.572674662Z" level=info msg="CreateContainer within sandbox \"ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 03:30:50.723301 containerd[1609]: time="2025-09-09T03:30:50.723133309Z" level=info msg="CreateContainer within sandbox \"ab9eb039f343a09a8a2f0397caa302a0592800a6c4d2cb468e8943a9c30a21f9\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a325afaeb69404a58b96ac572c4da95355730fee811290c917ba73cb16f6971f\"" Sep 9 03:30:50.729845 containerd[1609]: time="2025-09-09T03:30:50.729018441Z" level=info msg="StartContainer for \"a325afaeb69404a58b96ac572c4da95355730fee811290c917ba73cb16f6971f\"" Sep 9 03:30:50.792312 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:50.794318 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:50.794358 systemd-resolved[1508]: Flushed all caches. Sep 9 03:30:50.914538 containerd[1609]: time="2025-09-09T03:30:50.913228676Z" level=info msg="StartContainer for \"f3bb664b5c731750a4a4566cd4beb92150d7090b8a6339fc3022b56128a12822\" returns successfully" Sep 9 03:30:51.101130 containerd[1609]: time="2025-09-09T03:30:51.101071338Z" level=info msg="StartContainer for \"a325afaeb69404a58b96ac572c4da95355730fee811290c917ba73cb16f6971f\" returns successfully" Sep 9 03:30:51.132844 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2686851963.mount: Deactivated successfully. 
Sep 9 03:30:52.107783 kubelet[2856]: I0909 03:30:52.091827 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8dcb5dd78-sjn5k" podStartSLOduration=41.664663256 podStartE2EDuration="1m0.086910819s" podCreationTimestamp="2025-09-09 03:29:52 +0000 UTC" firstStartedPulling="2025-09-09 03:30:31.561263938 +0000 UTC m=+56.295903988" lastFinishedPulling="2025-09-09 03:30:49.983511498 +0000 UTC m=+74.718151551" observedRunningTime="2025-09-09 03:30:52.041925857 +0000 UTC m=+76.776565928" watchObservedRunningTime="2025-09-09 03:30:52.086910819 +0000 UTC m=+76.821550878" Sep 9 03:30:52.107783 kubelet[2856]: I0909 03:30:52.106314 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8dcb5dd78-47vcp" podStartSLOduration=42.414293071 podStartE2EDuration="1m0.106292545s" podCreationTimestamp="2025-09-09 03:29:52 +0000 UTC" firstStartedPulling="2025-09-09 03:30:32.782927756 +0000 UTC m=+57.517567801" lastFinishedPulling="2025-09-09 03:30:50.474927218 +0000 UTC m=+75.209567275" observedRunningTime="2025-09-09 03:30:52.072052923 +0000 UTC m=+76.806692997" watchObservedRunningTime="2025-09-09 03:30:52.106292545 +0000 UTC m=+76.840932609" Sep 9 03:30:52.669081 containerd[1609]: time="2025-09-09T03:30:52.669007010Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:52.676817 containerd[1609]: time="2025-09-09T03:30:52.674799019Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 03:30:52.677863 containerd[1609]: time="2025-09-09T03:30:52.677803510Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:52.696779 containerd[1609]: time="2025-09-09T03:30:52.694867338Z" 
level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:52.700047 containerd[1609]: time="2025-09-09T03:30:52.700009430Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.223463055s" Sep 9 03:30:52.702853 containerd[1609]: time="2025-09-09T03:30:52.702820243Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 03:30:52.715268 containerd[1609]: time="2025-09-09T03:30:52.715228040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 03:30:52.722198 containerd[1609]: time="2025-09-09T03:30:52.721897197Z" level=info msg="CreateContainer within sandbox \"fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 03:30:52.762668 containerd[1609]: time="2025-09-09T03:30:52.762604579Z" level=info msg="CreateContainer within sandbox \"fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"fc049e02668125364a13cc431466e6931a166418606ea698a4dfb02b4e3d430c\"" Sep 9 03:30:52.767912 containerd[1609]: time="2025-09-09T03:30:52.765137641Z" level=info msg="StartContainer for \"fc049e02668125364a13cc431466e6931a166418606ea698a4dfb02b4e3d430c\"" Sep 9 03:30:52.778514 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount314863189.mount: Deactivated successfully. 
Sep 9 03:30:52.842370 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:52.840918 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:52.840960 systemd-resolved[1508]: Flushed all caches. Sep 9 03:30:53.078960 containerd[1609]: time="2025-09-09T03:30:53.078865493Z" level=info msg="StartContainer for \"fc049e02668125364a13cc431466e6931a166418606ea698a4dfb02b4e3d430c\" returns successfully" Sep 9 03:30:53.986087 kubelet[2856]: I0909 03:30:53.984242 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 03:30:53.991564 kubelet[2856]: I0909 03:30:53.989380 2856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 03:30:55.584834 containerd[1609]: time="2025-09-09T03:30:55.584625916Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:55.591734 containerd[1609]: time="2025-09-09T03:30:55.590448693Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 03:30:55.594437 containerd[1609]: time="2025-09-09T03:30:55.593562538Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:55.630455 containerd[1609]: time="2025-09-09T03:30:55.628505623Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:30:55.632102 containerd[1609]: time="2025-09-09T03:30:55.632014495Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.916538137s" Sep 9 03:30:55.633284 containerd[1609]: time="2025-09-09T03:30:55.633252516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 03:30:55.653272 containerd[1609]: time="2025-09-09T03:30:55.652982841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 03:30:55.712786 containerd[1609]: time="2025-09-09T03:30:55.712574963Z" level=info msg="CreateContainer within sandbox \"da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 03:30:55.786090 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1274272906.mount: Deactivated successfully. Sep 9 03:30:55.825902 containerd[1609]: time="2025-09-09T03:30:55.825843787Z" level=info msg="CreateContainer within sandbox \"da8f482ab0a932f9862e4fd90eb9bf696f1fffb003b09ab37b86b764df5d6c26\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7b1e648fe1000e3be80b15d41c1726070a8cf774ba23f70571959f771612a3b6\"" Sep 9 03:30:55.828352 containerd[1609]: time="2025-09-09T03:30:55.828315195Z" level=info msg="StartContainer for \"7b1e648fe1000e3be80b15d41c1726070a8cf774ba23f70571959f771612a3b6\"" Sep 9 03:30:56.108008 containerd[1609]: time="2025-09-09T03:30:56.107514786Z" level=info msg="StartContainer for \"7b1e648fe1000e3be80b15d41c1726070a8cf774ba23f70571959f771612a3b6\" returns successfully" Sep 9 03:30:56.622991 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:56.621113 systemd-resolved[1508]: Under memory pressure, flushing caches. 
Sep 9 03:30:56.621210 systemd-resolved[1508]: Flushed all caches. Sep 9 03:30:57.036366 kubelet[2856]: I0909 03:30:57.036286 2856 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 03:30:57.038774 kubelet[2856]: I0909 03:30:57.038358 2856 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 03:30:58.668908 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:30:58.662272 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:30:58.662308 systemd-resolved[1508]: Flushed all caches. Sep 9 03:31:00.029689 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1109664593.mount: Deactivated successfully. Sep 9 03:31:00.135277 containerd[1609]: time="2025-09-09T03:31:00.135186631Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:31:00.146774 containerd[1609]: time="2025-09-09T03:31:00.146076375Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 03:31:00.166239 containerd[1609]: time="2025-09-09T03:31:00.161108676Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:31:00.191693 containerd[1609]: time="2025-09-09T03:31:00.187068260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 03:31:00.195475 containerd[1609]: time="2025-09-09T03:31:00.194853986Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 4.541791207s" Sep 9 03:31:00.195475 containerd[1609]: time="2025-09-09T03:31:00.194927213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 03:31:00.267292 containerd[1609]: time="2025-09-09T03:31:00.266768767Z" level=info msg="CreateContainer within sandbox \"fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 03:31:00.334613 containerd[1609]: time="2025-09-09T03:31:00.333706413Z" level=info msg="CreateContainer within sandbox \"fc49cae671cf023d5203c7bea08706c3e8b14f56b1caa4fb1db9de3b9ab0ed3c\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"6e07c72f28b1f1b96375f0b5e0aa3f04096d26eae039b351789ccd086dd86ca1\"" Sep 9 03:31:00.336857 containerd[1609]: time="2025-09-09T03:31:00.336805844Z" level=info msg="StartContainer for \"6e07c72f28b1f1b96375f0b5e0aa3f04096d26eae039b351789ccd086dd86ca1\"" Sep 9 03:31:00.723045 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:31:00.710958 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:31:00.710984 systemd-resolved[1508]: Flushed all caches. 
Sep 9 03:31:00.894202 containerd[1609]: time="2025-09-09T03:31:00.893124401Z" level=info msg="StartContainer for \"6e07c72f28b1f1b96375f0b5e0aa3f04096d26eae039b351789ccd086dd86ca1\" returns successfully" Sep 9 03:31:01.485320 kubelet[2856]: I0909 03:31:01.449570 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-hzbnh" podStartSLOduration=38.901389881 podStartE2EDuration="1m3.449504644s" podCreationTimestamp="2025-09-09 03:29:58 +0000 UTC" firstStartedPulling="2025-09-09 03:30:31.103608538 +0000 UTC m=+55.838248589" lastFinishedPulling="2025-09-09 03:30:55.651723305 +0000 UTC m=+80.386363352" observedRunningTime="2025-09-09 03:30:57.18896254 +0000 UTC m=+81.923602606" watchObservedRunningTime="2025-09-09 03:31:01.449504644 +0000 UTC m=+86.184144707" Sep 9 03:31:01.490674 kubelet[2856]: I0909 03:31:01.485532 2856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7b6b59449-spbzv" podStartSLOduration=3.4025767780000002 podStartE2EDuration="22.485509787s" podCreationTimestamp="2025-09-09 03:30:39 +0000 UTC" firstStartedPulling="2025-09-09 03:30:41.11603287 +0000 UTC m=+65.850672922" lastFinishedPulling="2025-09-09 03:31:00.198965872 +0000 UTC m=+84.933605931" observedRunningTime="2025-09-09 03:31:01.443990248 +0000 UTC m=+86.178630323" watchObservedRunningTime="2025-09-09 03:31:01.485509787 +0000 UTC m=+86.220149845" Sep 9 03:31:02.758862 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:31:02.762215 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:31:02.758876 systemd-resolved[1508]: Flushed all caches. Sep 9 03:31:13.235922 systemd[1]: run-containerd-runc-k8s.io-13296b995012e403c9deaea2923b28742a4dca3ce7d0f89d2400ef5396f4dfed-runc.K5bRkV.mount: Deactivated successfully. Sep 9 03:31:16.585045 systemd-journald[1180]: Under memory pressure, flushing caches. 
Sep 9 03:31:16.582072 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:31:16.582104 systemd-resolved[1508]: Flushed all caches. Sep 9 03:31:39.210032 containerd[1609]: time="2025-09-09T03:31:39.199788603Z" level=info msg="StopPodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\"" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.567 [WARNING][6107] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.569 [INFO][6107] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.569 [INFO][6107] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" iface="eth0" netns="" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.570 [INFO][6107] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.570 [INFO][6107] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.781 [INFO][6115] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.784 [INFO][6115] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.785 [INFO][6115] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.800 [WARNING][6115] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.800 [INFO][6115] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.802 [INFO][6115] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:31:39.807851 containerd[1609]: 2025-09-09 03:31:39.805 [INFO][6107] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:31:39.815653 containerd[1609]: time="2025-09-09T03:31:39.815560171Z" level=info msg="TearDown network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\" successfully" Sep 9 03:31:39.815767 containerd[1609]: time="2025-09-09T03:31:39.815656875Z" level=info msg="StopPodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\" returns successfully" Sep 9 03:31:39.840430 containerd[1609]: time="2025-09-09T03:31:39.840328356Z" level=info msg="RemovePodSandbox for \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\"" Sep 9 03:31:39.880160 containerd[1609]: time="2025-09-09T03:31:39.880075205Z" level=info msg="Forcibly stopping sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\"" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.944 [WARNING][6130] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" WorkloadEndpoint="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.944 [INFO][6130] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.944 [INFO][6130] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" iface="eth0" netns="" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.944 [INFO][6130] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.944 [INFO][6130] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.989 [INFO][6137] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.989 [INFO][6137] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.989 [INFO][6137] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.999 [WARNING][6137] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:39.999 [INFO][6137] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" HandleID="k8s-pod-network.1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Workload="srv--sy1m0.gb1.brightbox.com-k8s-whisker--d89f85994--nw628-eth0" Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:40.003 [INFO][6137] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 03:31:40.013666 containerd[1609]: 2025-09-09 03:31:40.010 [INFO][6130] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656" Sep 9 03:31:40.016428 containerd[1609]: time="2025-09-09T03:31:40.013764646Z" level=info msg="TearDown network for sandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\" successfully" Sep 9 03:31:40.105008 containerd[1609]: time="2025-09-09T03:31:40.104739059Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 9 03:31:40.105008 containerd[1609]: time="2025-09-09T03:31:40.104925607Z" level=info msg="RemovePodSandbox \"1734ca6b6761f482d89e6c6e1893c13e68b027556bedef8d52726f47e0dea656\" returns successfully" Sep 9 03:31:44.225292 systemd[1]: run-containerd-runc-k8s.io-f21845a758fca601f1f42de2818f66daeee41b3be7de750d8f969ba0eeefb384-runc.4PmsPB.mount: Deactivated successfully. 
Sep 9 03:32:44.055518 systemd[1]: run-containerd-runc-k8s.io-350bb43386dc965c3434b577da742e1a9fb05318b237d5b392ab76618f2bbc40-runc.2lTYJc.mount: Deactivated successfully. Sep 9 03:33:29.883380 systemd[1]: Started sshd@9-10.244.20.50:22-147.75.109.163:53750.service - OpenSSH per-connection server daemon (147.75.109.163:53750). Sep 9 03:33:30.919558 sshd[6529]: Accepted publickey for core from 147.75.109.163 port 53750 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:33:30.922863 sshd[6529]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:33:30.961019 systemd-logind[1583]: New session 12 of user core. Sep 9 03:33:30.966384 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 03:33:32.144196 sshd[6529]: pam_unix(sshd:session): session closed for user core Sep 9 03:33:32.156737 systemd[1]: sshd@9-10.244.20.50:22-147.75.109.163:53750.service: Deactivated successfully. Sep 9 03:33:32.162555 systemd-logind[1583]: Session 12 logged out. Waiting for processes to exit. Sep 9 03:33:32.163373 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 03:33:32.169475 systemd-logind[1583]: Removed session 12. Sep 9 03:33:37.302287 systemd[1]: Started sshd@10-10.244.20.50:22-147.75.109.163:57162.service - OpenSSH per-connection server daemon (147.75.109.163:57162). Sep 9 03:33:38.329157 sshd[6550]: Accepted publickey for core from 147.75.109.163 port 57162 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:33:38.335685 sshd[6550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:33:38.353469 systemd-logind[1583]: New session 13 of user core. Sep 9 03:33:38.360695 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 03:33:39.447988 sshd[6550]: pam_unix(sshd:session): session closed for user core Sep 9 03:33:39.456621 systemd[1]: sshd@10-10.244.20.50:22-147.75.109.163:57162.service: Deactivated successfully. 
Sep 9 03:33:39.464265 systemd-logind[1583]: Session 13 logged out. Waiting for processes to exit. Sep 9 03:33:39.466384 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 03:33:39.470477 systemd-logind[1583]: Removed session 13. Sep 9 03:33:44.607794 systemd[1]: Started sshd@11-10.244.20.50:22-147.75.109.163:42440.service - OpenSSH per-connection server daemon (147.75.109.163:42440). Sep 9 03:33:45.691770 sshd[6651]: Accepted publickey for core from 147.75.109.163 port 42440 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:33:45.702082 sshd[6651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:33:45.714640 systemd-logind[1583]: New session 14 of user core. Sep 9 03:33:45.722310 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 03:33:46.664796 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:33:46.668163 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:33:46.668185 systemd-resolved[1508]: Flushed all caches. Sep 9 03:33:46.707216 sshd[6651]: pam_unix(sshd:session): session closed for user core Sep 9 03:33:46.712647 systemd[1]: sshd@11-10.244.20.50:22-147.75.109.163:42440.service: Deactivated successfully. Sep 9 03:33:46.719436 systemd-logind[1583]: Session 14 logged out. Waiting for processes to exit. Sep 9 03:33:46.719909 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 03:33:46.725021 systemd-logind[1583]: Removed session 14. Sep 9 03:33:46.858190 systemd[1]: Started sshd@12-10.244.20.50:22-147.75.109.163:42454.service - OpenSSH per-connection server daemon (147.75.109.163:42454). Sep 9 03:33:47.746771 sshd[6686]: Accepted publickey for core from 147.75.109.163 port 42454 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:33:47.746675 sshd[6686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:33:47.755873 systemd-logind[1583]: New session 15 of user core. 
Sep 9 03:33:47.763489 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 03:33:48.535222 sshd[6686]: pam_unix(sshd:session): session closed for user core Sep 9 03:33:48.541031 systemd[1]: sshd@12-10.244.20.50:22-147.75.109.163:42454.service: Deactivated successfully. Sep 9 03:33:48.547165 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 03:33:48.547381 systemd-logind[1583]: Session 15 logged out. Waiting for processes to exit. Sep 9 03:33:48.551319 systemd-logind[1583]: Removed session 15. Sep 9 03:33:48.689186 systemd[1]: Started sshd@13-10.244.20.50:22-147.75.109.163:42468.service - OpenSSH per-connection server daemon (147.75.109.163:42468). Sep 9 03:33:49.588850 sshd[6698]: Accepted publickey for core from 147.75.109.163 port 42468 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:33:49.591801 sshd[6698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:33:49.601215 systemd-logind[1583]: New session 16 of user core. Sep 9 03:33:49.608284 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 9 03:33:50.514915 sshd[6698]: pam_unix(sshd:session): session closed for user core Sep 9 03:33:50.524331 systemd[1]: sshd@13-10.244.20.50:22-147.75.109.163:42468.service: Deactivated successfully. Sep 9 03:33:50.530267 systemd-logind[1583]: Session 16 logged out. Waiting for processes to exit. Sep 9 03:33:50.531524 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 03:33:50.535292 systemd-logind[1583]: Removed session 16. Sep 9 03:33:55.668325 systemd[1]: Started sshd@14-10.244.20.50:22-147.75.109.163:49262.service - OpenSSH per-connection server daemon (147.75.109.163:49262). 
Sep 9 03:33:56.624618 sshd[6715]: Accepted publickey for core from 147.75.109.163 port 49262 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:33:56.627888 sshd[6715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:33:56.635879 systemd-logind[1583]: New session 17 of user core. Sep 9 03:33:56.642237 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 9 03:33:57.827475 sshd[6715]: pam_unix(sshd:session): session closed for user core Sep 9 03:33:57.838580 systemd[1]: sshd@14-10.244.20.50:22-147.75.109.163:49262.service: Deactivated successfully. Sep 9 03:33:57.848551 systemd[1]: session-17.scope: Deactivated successfully. Sep 9 03:33:57.851363 systemd-logind[1583]: Session 17 logged out. Waiting for processes to exit. Sep 9 03:33:57.854773 systemd-logind[1583]: Removed session 17. Sep 9 03:33:58.694000 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:33:58.694018 systemd-resolved[1508]: Flushed all caches. Sep 9 03:33:58.696794 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:00.742286 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:34:00.745012 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:00.742302 systemd-resolved[1508]: Flushed all caches. Sep 9 03:34:02.993373 systemd[1]: Started sshd@15-10.244.20.50:22-147.75.109.163:58644.service - OpenSSH per-connection server daemon (147.75.109.163:58644). Sep 9 03:34:03.971349 sshd[6729]: Accepted publickey for core from 147.75.109.163 port 58644 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:34:03.978371 sshd[6729]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:34:03.989703 systemd-logind[1583]: New session 18 of user core. Sep 9 03:34:03.997316 systemd[1]: Started session-18.scope - Session 18 of User core. 
Sep 9 03:34:05.258542 sshd[6729]: pam_unix(sshd:session): session closed for user core Sep 9 03:34:05.268965 systemd[1]: sshd@15-10.244.20.50:22-147.75.109.163:58644.service: Deactivated successfully. Sep 9 03:34:05.287551 systemd[1]: session-18.scope: Deactivated successfully. Sep 9 03:34:05.288462 systemd-logind[1583]: Session 18 logged out. Waiting for processes to exit. Sep 9 03:34:05.292495 systemd-logind[1583]: Removed session 18. Sep 9 03:34:06.694140 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:34:06.700078 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:06.694182 systemd-resolved[1508]: Flushed all caches. Sep 9 03:34:10.411588 systemd[1]: Started sshd@16-10.244.20.50:22-147.75.109.163:34156.service - OpenSSH per-connection server daemon (147.75.109.163:34156). Sep 9 03:34:11.423366 sshd[6743]: Accepted publickey for core from 147.75.109.163 port 34156 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:34:11.428609 sshd[6743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:34:11.445320 systemd-logind[1583]: New session 19 of user core. Sep 9 03:34:11.452801 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 9 03:34:12.413022 sshd[6743]: pam_unix(sshd:session): session closed for user core Sep 9 03:34:12.421881 systemd-logind[1583]: Session 19 logged out. Waiting for processes to exit. Sep 9 03:34:12.428002 systemd[1]: sshd@16-10.244.20.50:22-147.75.109.163:34156.service: Deactivated successfully. Sep 9 03:34:12.440099 systemd[1]: session-19.scope: Deactivated successfully. Sep 9 03:34:12.441478 systemd-logind[1583]: Removed session 19. Sep 9 03:34:12.569421 systemd[1]: Started sshd@17-10.244.20.50:22-147.75.109.163:34168.service - OpenSSH per-connection server daemon (147.75.109.163:34168). 
Sep 9 03:34:13.489731 sshd[6759]: Accepted publickey for core from 147.75.109.163 port 34168 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:34:13.499817 sshd[6759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:34:13.508558 systemd-logind[1583]: New session 20 of user core. Sep 9 03:34:13.516376 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 9 03:34:14.069046 systemd[1]: run-containerd-runc-k8s.io-350bb43386dc965c3434b577da742e1a9fb05318b237d5b392ab76618f2bbc40-runc.XnRpov.mount: Deactivated successfully. Sep 9 03:34:14.697601 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:14.694940 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:34:14.694960 systemd-resolved[1508]: Flushed all caches. Sep 9 03:34:14.699214 sshd[6759]: pam_unix(sshd:session): session closed for user core Sep 9 03:34:14.709193 systemd[1]: sshd@17-10.244.20.50:22-147.75.109.163:34168.service: Deactivated successfully. Sep 9 03:34:14.714705 systemd-logind[1583]: Session 20 logged out. Waiting for processes to exit. Sep 9 03:34:14.715050 systemd[1]: session-20.scope: Deactivated successfully. Sep 9 03:34:14.718336 systemd-logind[1583]: Removed session 20. Sep 9 03:34:14.850082 systemd[1]: Started sshd@18-10.244.20.50:22-147.75.109.163:34180.service - OpenSSH per-connection server daemon (147.75.109.163:34180). Sep 9 03:34:15.759363 sshd[6836]: Accepted publickey for core from 147.75.109.163 port 34180 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:34:15.762074 sshd[6836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:34:15.769782 systemd-logind[1583]: New session 21 of user core. Sep 9 03:34:15.776403 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 9 03:34:18.662319 systemd-resolved[1508]: Under memory pressure, flushing caches. 
Sep 9 03:34:18.666014 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:18.662352 systemd-resolved[1508]: Flushed all caches. Sep 9 03:34:19.587515 sshd[6836]: pam_unix(sshd:session): session closed for user core Sep 9 03:34:19.657492 systemd[1]: sshd@18-10.244.20.50:22-147.75.109.163:34180.service: Deactivated successfully. Sep 9 03:34:19.669942 systemd[1]: session-21.scope: Deactivated successfully. Sep 9 03:34:19.673813 systemd-logind[1583]: Session 21 logged out. Waiting for processes to exit. Sep 9 03:34:19.685859 systemd-logind[1583]: Removed session 21. Sep 9 03:34:19.730148 systemd[1]: Started sshd@19-10.244.20.50:22-147.75.109.163:34184.service - OpenSSH per-connection server daemon (147.75.109.163:34184). Sep 9 03:34:20.640884 sshd[6866]: Accepted publickey for core from 147.75.109.163 port 34184 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:34:20.644102 sshd[6866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:34:20.659132 systemd-logind[1583]: New session 22 of user core. Sep 9 03:34:20.665813 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 9 03:34:20.710539 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:34:20.713649 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:20.710556 systemd-resolved[1508]: Flushed all caches. Sep 9 03:34:22.088012 sshd[6866]: pam_unix(sshd:session): session closed for user core Sep 9 03:34:22.094040 systemd[1]: sshd@19-10.244.20.50:22-147.75.109.163:34184.service: Deactivated successfully. Sep 9 03:34:22.104735 systemd[1]: session-22.scope: Deactivated successfully. Sep 9 03:34:22.108264 systemd-logind[1583]: Session 22 logged out. Waiting for processes to exit. Sep 9 03:34:22.120462 systemd-logind[1583]: Removed session 22. 
Sep 9 03:34:22.243138 systemd[1]: Started sshd@20-10.244.20.50:22-147.75.109.163:55138.service - OpenSSH per-connection server daemon (147.75.109.163:55138). Sep 9 03:34:23.171451 sshd[6878]: Accepted publickey for core from 147.75.109.163 port 55138 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:34:23.174128 sshd[6878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:34:23.182145 systemd-logind[1583]: New session 23 of user core. Sep 9 03:34:23.187334 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 9 03:34:24.303664 sshd[6878]: pam_unix(sshd:session): session closed for user core Sep 9 03:34:24.312497 systemd[1]: sshd@20-10.244.20.50:22-147.75.109.163:55138.service: Deactivated successfully. Sep 9 03:34:24.319347 systemd-logind[1583]: Session 23 logged out. Waiting for processes to exit. Sep 9 03:34:24.320235 systemd[1]: session-23.scope: Deactivated successfully. Sep 9 03:34:24.324235 systemd-logind[1583]: Removed session 23. Sep 9 03:34:26.665794 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:26.673485 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:34:26.673525 systemd-resolved[1508]: Flushed all caches. Sep 9 03:34:29.489486 systemd[1]: Started sshd@21-10.244.20.50:22-147.75.109.163:55150.service - OpenSSH per-connection server daemon (147.75.109.163:55150). Sep 9 03:34:30.698432 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:30.695336 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:34:30.695353 systemd-resolved[1508]: Flushed all caches. Sep 9 03:34:30.911621 sshd[6896]: Accepted publickey for core from 147.75.109.163 port 55150 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:34:30.915998 sshd[6896]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:34:30.939876 systemd-logind[1583]: New session 24 of user core. 
Sep 9 03:34:30.961732 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 9 03:34:32.688262 sshd[6896]: pam_unix(sshd:session): session closed for user core Sep 9 03:34:32.713484 systemd[1]: sshd@21-10.244.20.50:22-147.75.109.163:55150.service: Deactivated successfully. Sep 9 03:34:32.726806 systemd-logind[1583]: Session 24 logged out. Waiting for processes to exit. Sep 9 03:34:32.726918 systemd[1]: session-24.scope: Deactivated successfully. Sep 9 03:34:32.730502 systemd-logind[1583]: Removed session 24. Sep 9 03:34:32.742728 systemd-resolved[1508]: Under memory pressure, flushing caches. Sep 9 03:34:32.745358 systemd-journald[1180]: Under memory pressure, flushing caches. Sep 9 03:34:32.744182 systemd-resolved[1508]: Flushed all caches. Sep 9 03:34:37.871255 systemd[1]: Started sshd@22-10.244.20.50:22-147.75.109.163:43830.service - OpenSSH per-connection server daemon (147.75.109.163:43830). Sep 9 03:34:38.857856 sshd[6932]: Accepted publickey for core from 147.75.109.163 port 43830 ssh2: RSA SHA256:3hTcz/48zUeeQn500raM6v2vtJJQJrvIu4rGfKfvnS4 Sep 9 03:34:38.861853 sshd[6932]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 03:34:38.874345 systemd-logind[1583]: New session 25 of user core. Sep 9 03:34:38.880259 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 9 03:34:40.250874 sshd[6932]: pam_unix(sshd:session): session closed for user core Sep 9 03:34:40.261030 systemd[1]: sshd@22-10.244.20.50:22-147.75.109.163:43830.service: Deactivated successfully. Sep 9 03:34:40.268514 systemd[1]: session-25.scope: Deactivated successfully. Sep 9 03:34:40.270646 systemd-logind[1583]: Session 25 logged out. Waiting for processes to exit. Sep 9 03:34:40.278758 systemd-logind[1583]: Removed session 25.