Sep 13 01:09:51.032708 kernel: Linux version 6.6.106-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 22:30:50 -00 2025
Sep 13 01:09:51.032742 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534
Sep 13 01:09:51.032757 kernel: BIOS-provided physical RAM map:
Sep 13 01:09:51.032773 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 13 01:09:51.032783 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 13 01:09:51.032793 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 13 01:09:51.032804 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Sep 13 01:09:51.032815 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Sep 13 01:09:51.032825 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 13 01:09:51.032835 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 13 01:09:51.032846 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 13 01:09:51.032856 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 13 01:09:51.032872 kernel: NX (Execute Disable) protection: active
Sep 13 01:09:51.032883 kernel: APIC: Static calls initialized
Sep 13 01:09:51.032895 kernel: SMBIOS 2.8 present.
Sep 13 01:09:51.032907 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Sep 13 01:09:51.032918 kernel: Hypervisor detected: KVM
Sep 13 01:09:51.032934 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 13 01:09:51.032946 kernel: kvm-clock: using sched offset of 5111486377 cycles
Sep 13 01:09:51.032959 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 13 01:09:51.032970 kernel: tsc: Detected 2499.998 MHz processor
Sep 13 01:09:51.032982 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 13 01:09:51.032994 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 13 01:09:51.033005 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Sep 13 01:09:51.033017 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 13 01:09:51.033029 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 13 01:09:51.033045 kernel: Using GB pages for direct mapping
Sep 13 01:09:51.033057 kernel: ACPI: Early table checksum verification disabled
Sep 13 01:09:51.033068 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Sep 13 01:09:51.033125 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:09:51.033142 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:09:51.033154 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:09:51.033166 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Sep 13 01:09:51.033177 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:09:51.033189 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:09:51.033208 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:09:51.033220 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:09:51.033231 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Sep 13 01:09:51.033243 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Sep 13 01:09:51.033255 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Sep 13 01:09:51.033273 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Sep 13 01:09:51.033285 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Sep 13 01:09:51.033302 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Sep 13 01:09:51.033315 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Sep 13 01:09:51.033327 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Sep 13 01:09:51.033339 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Sep 13 01:09:51.033351 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Sep 13 01:09:51.033363 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Sep 13 01:09:51.033375 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Sep 13 01:09:51.033391 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Sep 13 01:09:51.033404 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Sep 13 01:09:51.033416 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Sep 13 01:09:51.033428 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Sep 13 01:09:51.033440 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Sep 13 01:09:51.033452 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Sep 13 01:09:51.033464 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Sep 13 01:09:51.033476 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Sep 13 01:09:51.033488 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Sep 13 01:09:51.033500 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Sep 13 01:09:51.033516 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Sep 13 01:09:51.033529 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 13 01:09:51.033541 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Sep 13 01:09:51.033553 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Sep 13 01:09:51.033565 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Sep 13 01:09:51.033578 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Sep 13 01:09:51.033590 kernel: Zone ranges:
Sep 13 01:09:51.033602 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 13 01:09:51.033614 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Sep 13 01:09:51.033631 kernel: Normal empty
Sep 13 01:09:51.033643 kernel: Movable zone start for each node
Sep 13 01:09:51.033655 kernel: Early memory node ranges
Sep 13 01:09:51.033667 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 13 01:09:51.033679 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Sep 13 01:09:51.033691 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Sep 13 01:09:51.033703 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 13 01:09:51.033715 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 13 01:09:51.033728 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Sep 13 01:09:51.033740 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 13 01:09:51.033756 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 13 01:09:51.033769 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 13 01:09:51.033781 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 13 01:09:51.033793 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 13 01:09:51.033805 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 13 01:09:51.033817 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 13 01:09:51.033829 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 13 01:09:51.033841 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 13 01:09:51.033853 kernel: TSC deadline timer available
Sep 13 01:09:51.033870 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Sep 13 01:09:51.033882 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 13 01:09:51.033894 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 13 01:09:51.033906 kernel: Booting paravirtualized kernel on KVM
Sep 13 01:09:51.033919 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 13 01:09:51.033931 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Sep 13 01:09:51.033944 kernel: percpu: Embedded 58 pages/cpu s197160 r8192 d32216 u262144
Sep 13 01:09:51.033956 kernel: pcpu-alloc: s197160 r8192 d32216 u262144 alloc=1*2097152
Sep 13 01:09:51.033968 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Sep 13 01:09:51.033985 kernel: kvm-guest: PV spinlocks enabled
Sep 13 01:09:51.033997 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 13 01:09:51.034010 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534
Sep 13 01:09:51.034023 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 13 01:09:51.034035 kernel: random: crng init done
Sep 13 01:09:51.034047 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 13 01:09:51.034060 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 13 01:09:51.034072 kernel: Fallback order for Node 0: 0
Sep 13 01:09:51.034725 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Sep 13 01:09:51.034739 kernel: Policy zone: DMA32
Sep 13 01:09:51.034751 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 13 01:09:51.034763 kernel: software IO TLB: area num 16.
Sep 13 01:09:51.034776 kernel: Memory: 1901536K/2096616K available (12288K kernel code, 2293K rwdata, 22744K rodata, 42884K init, 2312K bss, 194820K reserved, 0K cma-reserved)
Sep 13 01:09:51.034789 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Sep 13 01:09:51.034801 kernel: Kernel/User page tables isolation: enabled
Sep 13 01:09:51.034813 kernel: ftrace: allocating 37974 entries in 149 pages
Sep 13 01:09:51.034825 kernel: ftrace: allocated 149 pages with 4 groups
Sep 13 01:09:51.034845 kernel: Dynamic Preempt: voluntary
Sep 13 01:09:51.034857 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 13 01:09:51.034870 kernel: rcu: RCU event tracing is enabled.
Sep 13 01:09:51.034882 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Sep 13 01:09:51.034895 kernel: Trampoline variant of Tasks RCU enabled.
Sep 13 01:09:51.034920 kernel: Rude variant of Tasks RCU enabled.
Sep 13 01:09:51.034938 kernel: Tracing variant of Tasks RCU enabled.
Sep 13 01:09:51.034951 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 13 01:09:51.034963 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Sep 13 01:09:51.034976 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Sep 13 01:09:51.034989 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 13 01:09:51.035006 kernel: Console: colour VGA+ 80x25
Sep 13 01:09:51.035019 kernel: printk: console [tty0] enabled
Sep 13 01:09:51.035032 kernel: printk: console [ttyS0] enabled
Sep 13 01:09:51.035045 kernel: ACPI: Core revision 20230628
Sep 13 01:09:51.035058 kernel: APIC: Switch to symmetric I/O mode setup
Sep 13 01:09:51.035070 kernel: x2apic enabled
Sep 13 01:09:51.035101 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 13 01:09:51.035128 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Sep 13 01:09:51.035141 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Sep 13 01:09:51.035154 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 13 01:09:51.035167 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Sep 13 01:09:51.035179 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Sep 13 01:09:51.035192 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 13 01:09:51.035205 kernel: Spectre V2 : Mitigation: Retpolines
Sep 13 01:09:51.035217 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 13 01:09:51.035236 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Sep 13 01:09:51.035249 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 13 01:09:51.035262 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 13 01:09:51.035275 kernel: MDS: Mitigation: Clear CPU buffers
Sep 13 01:09:51.035287 kernel: MMIO Stale Data: Unknown: No mitigations
Sep 13 01:09:51.035299 kernel: SRBDS: Unknown: Dependent on hypervisor status
Sep 13 01:09:51.035312 kernel: active return thunk: its_return_thunk
Sep 13 01:09:51.035324 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 13 01:09:51.035337 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 13 01:09:51.035350 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 13 01:09:51.035362 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 13 01:09:51.035380 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 13 01:09:51.035393 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Sep 13 01:09:51.035405 kernel: Freeing SMP alternatives memory: 32K
Sep 13 01:09:51.035418 kernel: pid_max: default: 32768 minimum: 301
Sep 13 01:09:51.035431 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 13 01:09:51.035444 kernel: landlock: Up and running.
Sep 13 01:09:51.035456 kernel: SELinux: Initializing.
Sep 13 01:09:51.035469 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 13 01:09:51.035482 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 13 01:09:51.035494 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Sep 13 01:09:51.035507 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 13 01:09:51.035525 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 13 01:09:51.035538 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 13 01:09:51.035551 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Sep 13 01:09:51.035564 kernel: signal: max sigframe size: 1776
Sep 13 01:09:51.035577 kernel: rcu: Hierarchical SRCU implementation.
Sep 13 01:09:51.035590 kernel: rcu: Max phase no-delay instances is 400.
Sep 13 01:09:51.035603 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 13 01:09:51.035616 kernel: smp: Bringing up secondary CPUs ...
Sep 13 01:09:51.035628 kernel: smpboot: x86: Booting SMP configuration:
Sep 13 01:09:51.035646 kernel: .... node #0, CPUs: #1
Sep 13 01:09:51.035659 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Sep 13 01:09:51.035672 kernel: smp: Brought up 1 node, 2 CPUs
Sep 13 01:09:51.035684 kernel: smpboot: Max logical packages: 16
Sep 13 01:09:51.035697 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Sep 13 01:09:51.035710 kernel: devtmpfs: initialized
Sep 13 01:09:51.035723 kernel: x86/mm: Memory block size: 128MB
Sep 13 01:09:51.035735 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 13 01:09:51.035748 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Sep 13 01:09:51.035766 kernel: pinctrl core: initialized pinctrl subsystem
Sep 13 01:09:51.035779 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 13 01:09:51.035792 kernel: audit: initializing netlink subsys (disabled)
Sep 13 01:09:51.035805 kernel: audit: type=2000 audit(1757725788.870:1): state=initialized audit_enabled=0 res=1
Sep 13 01:09:51.035817 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 13 01:09:51.035830 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 13 01:09:51.035843 kernel: cpuidle: using governor menu
Sep 13 01:09:51.035856 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 13 01:09:51.035869 kernel: dca service started, version 1.12.1
Sep 13 01:09:51.035886 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Sep 13 01:09:51.035900 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Sep 13 01:09:51.035913 kernel: PCI: Using configuration type 1 for base access
Sep 13 01:09:51.035926 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 13 01:09:51.035938 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 13 01:09:51.035951 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 13 01:09:51.035964 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 13 01:09:51.035977 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 13 01:09:51.035989 kernel: ACPI: Added _OSI(Module Device)
Sep 13 01:09:51.036006 kernel: ACPI: Added _OSI(Processor Device)
Sep 13 01:09:51.036020 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 13 01:09:51.036032 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 13 01:09:51.036045 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 13 01:09:51.036058 kernel: ACPI: Interpreter enabled
Sep 13 01:09:51.036071 kernel: ACPI: PM: (supports S0 S5)
Sep 13 01:09:51.036095 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 13 01:09:51.036226 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 13 01:09:51.036242 kernel: PCI: Using E820 reservations for host bridge windows
Sep 13 01:09:51.036261 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 13 01:09:51.036274 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 13 01:09:51.036566 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 13 01:09:51.036758 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 13 01:09:51.036936 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 13 01:09:51.036956 kernel: PCI host bridge to bus 0000:00
Sep 13 01:09:51.037176 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 13 01:09:51.037349 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 13 01:09:51.037509 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 13 01:09:51.037666 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Sep 13 01:09:51.037828 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 13 01:09:51.037989 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Sep 13 01:09:51.038186 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 13 01:09:51.038405 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Sep 13 01:09:51.038624 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Sep 13 01:09:51.038808 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Sep 13 01:09:51.038985 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Sep 13 01:09:51.039198 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Sep 13 01:09:51.039378 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 13 01:09:51.039575 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 13 01:09:51.039764 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Sep 13 01:09:51.039961 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 13 01:09:51.040173 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Sep 13 01:09:51.040371 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 13 01:09:51.040550 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Sep 13 01:09:51.040749 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 13 01:09:51.040936 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Sep 13 01:09:51.041154 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 13 01:09:51.041337 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Sep 13 01:09:51.041523 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 13 01:09:51.041700 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Sep 13 01:09:51.041895 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 13 01:09:51.042091 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Sep 13 01:09:51.042907 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 13 01:09:51.043100 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Sep 13 01:09:51.043344 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Sep 13 01:09:51.043520 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Sep 13 01:09:51.043692 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Sep 13 01:09:51.043865 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 13 01:09:51.044073 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Sep 13 01:09:51.044325 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Sep 13 01:09:51.044505 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Sep 13 01:09:51.044677 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Sep 13 01:09:51.044849 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Sep 13 01:09:51.045030 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Sep 13 01:09:51.045242 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 13 01:09:51.045435 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Sep 13 01:09:51.045609 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Sep 13 01:09:51.045782 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Sep 13 01:09:51.045972 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Sep 13 01:09:51.046190 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Sep 13 01:09:51.046390 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Sep 13 01:09:51.046581 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Sep 13 01:09:51.046758 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Sep 13 01:09:51.046929 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Sep 13 01:09:51.048195 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 01:09:51.048409 kernel: pci_bus 0000:02: extended config space not accessible
Sep 13 01:09:51.048617 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Sep 13 01:09:51.048818 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Sep 13 01:09:51.049050 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Sep 13 01:09:51.049263 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 13 01:09:51.049455 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 13 01:09:51.049635 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Sep 13 01:09:51.049811 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Sep 13 01:09:51.049983 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 13 01:09:51.053349 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 01:09:51.053579 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 13 01:09:51.053770 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 13 01:09:51.053956 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Sep 13 01:09:51.054165 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 13 01:09:51.054354 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 01:09:51.054535 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Sep 13 01:09:51.054708 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 13 01:09:51.054893 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 01:09:51.055071 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Sep 13 01:09:51.056612 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 13 01:09:51.056798 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 01:09:51.056981 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Sep 13 01:09:51.057194 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 13 01:09:51.057383 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 01:09:51.057565 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Sep 13 01:09:51.057770 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Sep 13 01:09:51.057992 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 01:09:51.058225 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Sep 13 01:09:51.058405 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 13 01:09:51.058583 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 01:09:51.058603 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 13 01:09:51.058617 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 13 01:09:51.058630 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 13 01:09:51.058651 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 13 01:09:51.058665 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 13 01:09:51.058678 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 13 01:09:51.058691 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 13 01:09:51.058705 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 13 01:09:51.058718 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 13 01:09:51.058731 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 13 01:09:51.058744 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 13 01:09:51.058756 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 13 01:09:51.058774 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 13 01:09:51.058787 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 13 01:09:51.058800 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 13 01:09:51.058814 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 13 01:09:51.058827 kernel: iommu: Default domain type: Translated
Sep 13 01:09:51.058840 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 13 01:09:51.058853 kernel: PCI: Using ACPI for IRQ routing
Sep 13 01:09:51.058866 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 13 01:09:51.058879 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 13 01:09:51.058896 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Sep 13 01:09:51.059071 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 13 01:09:51.059308 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 13 01:09:51.059485 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 13 01:09:51.059505 kernel: vgaarb: loaded
Sep 13 01:09:51.059518 kernel: clocksource: Switched to clocksource kvm-clock
Sep 13 01:09:51.059532 kernel: VFS: Disk quotas dquot_6.6.0
Sep 13 01:09:51.059546 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 13 01:09:51.059559 kernel: pnp: PnP ACPI init
Sep 13 01:09:51.059759 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 13 01:09:51.059781 kernel: pnp: PnP ACPI: found 5 devices
Sep 13 01:09:51.059794 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 13 01:09:51.059808 kernel: NET: Registered PF_INET protocol family
Sep 13 01:09:51.059821 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 13 01:09:51.059834 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 13 01:09:51.059847 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 13 01:09:51.059861 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 13 01:09:51.059881 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 13 01:09:51.059894 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 13 01:09:51.059907 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 13 01:09:51.059920 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 13 01:09:51.059933 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 13 01:09:51.059946 kernel: NET: Registered PF_XDP protocol family
Sep 13 01:09:51.060960 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Sep 13 01:09:51.061191 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 13 01:09:51.061382 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 13 01:09:51.061561 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 13 01:09:51.061739 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 13 01:09:51.061914 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 13 01:09:51.062100 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 13 01:09:51.062299 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 13 01:09:51.062481 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Sep 13 01:09:51.062653 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Sep 13 01:09:51.062825 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Sep 13 01:09:51.062998 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Sep 13 01:09:51.063207 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Sep 13 01:09:51.063387 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Sep 13 01:09:51.065209 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Sep 13 01:09:51.065414 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Sep 13 01:09:51.065633 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Sep 13 01:09:51.065828 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 13 01:09:51.066012 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Sep 13 01:09:51.068251 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 13 01:09:51.068439 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Sep 13 01:09:51.068618 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 01:09:51.068797 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Sep 13 01:09:51.068973 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 13 01:09:51.069217 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 13 01:09:51.069392 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 01:09:51.069569 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Sep 13 01:09:51.069744 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 13 01:09:51.069919 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 13 01:09:51.070127 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 01:09:51.070316 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Sep 13 01:09:51.070491 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 13 01:09:51.070670 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 13 01:09:51.070848 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 01:09:51.071027 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Sep 13 01:09:51.071238 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 13 01:09:51.071416 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 13 01:09:51.071590 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 01:09:51.071769 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Sep 13 01:09:51.071957 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 13 01:09:51.072191 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 13 01:09:51.072368 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 01:09:51.072545 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Sep 13 01:09:51.072719 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 13 01:09:51.072902 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Sep 13 01:09:51.073076 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 01:09:51.073322 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Sep 13 01:09:51.073497 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 13 01:09:51.073669 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 13 01:09:51.073862 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 01:09:51.074030 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 13 01:09:51.074218 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 13 01:09:51.074378 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 13 01:09:51.074545 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Sep 13 01:09:51.074703 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 13 01:09:51.074859 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Sep 13 01:09:51.075039 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 13 01:09:51.075258 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Sep 13 01:09:51.075424 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 01:09:51.075601 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 13 01:09:51.075787 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Sep 13 01:09:51.075953 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 13 01:09:51.076151 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 01:09:51.076340 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Sep 13 01:09:51.076510 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 13 01:09:51.076676 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 01:09:51.076861 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Sep 13 01:09:51.077030 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 13 01:09:51.079261 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 01:09:51.079461 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Sep 13 01:09:51.079632 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 13 01:09:51.079801 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 01:09:51.079980 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Sep 13 01:09:51.080275 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 13 01:09:51.080447 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 01:09:51.080626 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Sep 13 01:09:51.080791 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 13 01:09:51.080955 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 01:09:51.083183 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Sep 13 01:09:51.083368 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 13 01:09:51.083542 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 01:09:51.083563 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 13 01:09:51.083578 kernel: PCI: CLS 0 bytes, default 64
Sep 13 01:09:51.083592 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 13 01:09:51.083606 kernel: software IO TLB: mapped [mem 
0x0000000079800000-0x000000007d800000] (64MB) Sep 13 01:09:51.083620 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 13 01:09:51.083634 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns Sep 13 01:09:51.083648 kernel: Initialise system trusted keyrings Sep 13 01:09:51.083669 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 13 01:09:51.083683 kernel: Key type asymmetric registered Sep 13 01:09:51.083697 kernel: Asymmetric key parser 'x509' registered Sep 13 01:09:51.083710 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Sep 13 01:09:51.083723 kernel: io scheduler mq-deadline registered Sep 13 01:09:51.083737 kernel: io scheduler kyber registered Sep 13 01:09:51.083751 kernel: io scheduler bfq registered Sep 13 01:09:51.083934 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 13 01:09:51.084142 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 13 01:09:51.084330 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:09:51.084510 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 13 01:09:51.084686 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 13 01:09:51.084862 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:09:51.085040 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 13 01:09:51.087281 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 13 01:09:51.087473 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:09:51.087651 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 13 01:09:51.087832 kernel: pcieport 0000:00:02.3: AER: enabled 
with IRQ 27 Sep 13 01:09:51.088008 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:09:51.088227 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Sep 13 01:09:51.088406 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 13 01:09:51.088594 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:09:51.088774 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 13 01:09:51.088951 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 13 01:09:51.091195 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:09:51.091381 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 13 01:09:51.091555 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 13 01:09:51.091740 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:09:51.091931 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 13 01:09:51.092133 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Sep 13 01:09:51.092311 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:09:51.092333 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 13 01:09:51.092348 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 13 01:09:51.092370 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 13 01:09:51.092384 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 13 01:09:51.092398 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 13 01:09:51.092412 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 
0x60,0x64 irq 1,12 Sep 13 01:09:51.092426 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 13 01:09:51.092440 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 13 01:09:51.092454 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 13 01:09:51.092647 kernel: rtc_cmos 00:03: RTC can wake from S4 Sep 13 01:09:51.092820 kernel: rtc_cmos 00:03: registered as rtc0 Sep 13 01:09:51.092994 kernel: rtc_cmos 00:03: setting system clock to 2025-09-13T01:09:50 UTC (1757725790) Sep 13 01:09:51.095225 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Sep 13 01:09:51.095250 kernel: intel_pstate: CPU model not supported Sep 13 01:09:51.095264 kernel: NET: Registered PF_INET6 protocol family Sep 13 01:09:51.095277 kernel: Segment Routing with IPv6 Sep 13 01:09:51.095291 kernel: In-situ OAM (IOAM) with IPv6 Sep 13 01:09:51.095305 kernel: NET: Registered PF_PACKET protocol family Sep 13 01:09:51.095319 kernel: Key type dns_resolver registered Sep 13 01:09:51.095344 kernel: IPI shorthand broadcast: enabled Sep 13 01:09:51.095359 kernel: sched_clock: Marking stable (1305003794, 234524555)->(1673327668, -133799319) Sep 13 01:09:51.095372 kernel: registered taskstats version 1 Sep 13 01:09:51.095386 kernel: Loading compiled-in X.509 certificates Sep 13 01:09:51.095400 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.106-flatcar: 1274e0c573ac8d09163d6bc6d1ee1445fb2f8cc6' Sep 13 01:09:51.095414 kernel: Key type .fscrypt registered Sep 13 01:09:51.095427 kernel: Key type fscrypt-provisioning registered Sep 13 01:09:51.095440 kernel: ima: No TPM chip found, activating TPM-bypass! 
Sep 13 01:09:51.095454 kernel: ima: Allocated hash algorithm: sha1 Sep 13 01:09:51.095473 kernel: ima: No architecture policies found Sep 13 01:09:51.095487 kernel: clk: Disabling unused clocks Sep 13 01:09:51.095501 kernel: Freeing unused kernel image (initmem) memory: 42884K Sep 13 01:09:51.095514 kernel: Write protecting the kernel read-only data: 36864k Sep 13 01:09:51.095528 kernel: Freeing unused kernel image (rodata/data gap) memory: 1832K Sep 13 01:09:51.095542 kernel: Run /init as init process Sep 13 01:09:51.095556 kernel: with arguments: Sep 13 01:09:51.095570 kernel: /init Sep 13 01:09:51.095583 kernel: with environment: Sep 13 01:09:51.095601 kernel: HOME=/ Sep 13 01:09:51.095614 kernel: TERM=linux Sep 13 01:09:51.095628 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 13 01:09:51.095645 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 13 01:09:51.095662 systemd[1]: Detected virtualization kvm. Sep 13 01:09:51.095676 systemd[1]: Detected architecture x86-64. Sep 13 01:09:51.095689 systemd[1]: Running in initrd. Sep 13 01:09:51.095703 systemd[1]: No hostname configured, using default hostname. Sep 13 01:09:51.095722 systemd[1]: Hostname set to . Sep 13 01:09:51.095737 systemd[1]: Initializing machine ID from VM UUID. Sep 13 01:09:51.095751 systemd[1]: Queued start job for default target initrd.target. Sep 13 01:09:51.095765 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 13 01:09:51.095780 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Sep 13 01:09:51.095795 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 13 01:09:51.095809 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 13 01:09:51.095824 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 13 01:09:51.095844 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 13 01:09:51.095861 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 13 01:09:51.095875 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 13 01:09:51.095890 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 13 01:09:51.095905 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 13 01:09:51.095919 systemd[1]: Reached target paths.target - Path Units. Sep 13 01:09:51.095939 systemd[1]: Reached target slices.target - Slice Units. Sep 13 01:09:51.095953 systemd[1]: Reached target swap.target - Swaps. Sep 13 01:09:51.095967 systemd[1]: Reached target timers.target - Timer Units. Sep 13 01:09:51.095982 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 13 01:09:51.095996 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 13 01:09:51.096010 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 13 01:09:51.096025 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Sep 13 01:09:51.096039 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 13 01:09:51.096054 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 13 01:09:51.096073 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Sep 13 01:09:51.096099 systemd[1]: Reached target sockets.target - Socket Units. Sep 13 01:09:51.096129 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 13 01:09:51.096143 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 13 01:09:51.096157 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 13 01:09:51.096172 systemd[1]: Starting systemd-fsck-usr.service... Sep 13 01:09:51.096186 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 13 01:09:51.096201 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 13 01:09:51.096215 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 01:09:51.096237 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 13 01:09:51.096300 systemd-journald[202]: Collecting audit messages is disabled. Sep 13 01:09:51.096334 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 13 01:09:51.096355 systemd[1]: Finished systemd-fsck-usr.service. Sep 13 01:09:51.096371 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 13 01:09:51.096386 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 13 01:09:51.096401 systemd-journald[202]: Journal started Sep 13 01:09:51.096433 systemd-journald[202]: Runtime Journal (/run/log/journal/16f7c1ce39054b419f01333ec1b274eb) is 4.7M, max 38.0M, 33.2M free. Sep 13 01:09:51.053262 systemd-modules-load[203]: Inserted module 'overlay' Sep 13 01:09:51.161130 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 13 01:09:51.161175 kernel: Bridge firewalling registered Sep 13 01:09:51.161195 systemd[1]: Started systemd-journald.service - Journal Service. 
Sep 13 01:09:51.106719 systemd-modules-load[203]: Inserted module 'br_netfilter' Sep 13 01:09:51.162285 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 13 01:09:51.163694 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 01:09:51.173419 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 13 01:09:51.176404 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 13 01:09:51.188640 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 13 01:09:51.196362 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 13 01:09:51.202411 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 13 01:09:51.217668 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 13 01:09:51.220071 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 13 01:09:51.228409 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 13 01:09:51.229571 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 13 01:09:51.241383 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 13 01:09:51.252618 dracut-cmdline[235]: dracut-dracut-053 Sep 13 01:09:51.257515 dracut-cmdline[235]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534 Sep 13 01:09:51.287719 systemd-resolved[237]: Positive Trust Anchors: Sep 13 01:09:51.287743 systemd-resolved[237]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 13 01:09:51.287787 systemd-resolved[237]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 13 01:09:51.291957 systemd-resolved[237]: Defaulting to hostname 'linux'. Sep 13 01:09:51.294569 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 13 01:09:51.296837 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 13 01:09:51.380184 kernel: SCSI subsystem initialized Sep 13 01:09:51.392161 kernel: Loading iSCSI transport class v2.0-870. 
Sep 13 01:09:51.406159 kernel: iscsi: registered transport (tcp) Sep 13 01:09:51.433327 kernel: iscsi: registered transport (qla4xxx) Sep 13 01:09:51.433421 kernel: QLogic iSCSI HBA Driver Sep 13 01:09:51.491506 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 13 01:09:51.507495 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 13 01:09:51.538173 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 13 01:09:51.538264 kernel: device-mapper: uevent: version 1.0.3 Sep 13 01:09:51.540267 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Sep 13 01:09:51.595234 kernel: raid6: sse2x4 gen() 7256 MB/s Sep 13 01:09:51.613167 kernel: raid6: sse2x2 gen() 5493 MB/s Sep 13 01:09:51.632132 kernel: raid6: sse2x1 gen() 5469 MB/s Sep 13 01:09:51.632225 kernel: raid6: using algorithm sse2x4 gen() 7256 MB/s Sep 13 01:09:51.650944 kernel: raid6: .... xor() 4932 MB/s, rmw enabled Sep 13 01:09:51.651036 kernel: raid6: using ssse3x2 recovery algorithm Sep 13 01:09:51.679597 kernel: xor: automatically using best checksumming function avx Sep 13 01:09:51.876237 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 13 01:09:51.891760 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 13 01:09:51.899324 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 13 01:09:51.930550 systemd-udevd[421]: Using default interface naming scheme 'v255'. Sep 13 01:09:51.937694 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 13 01:09:51.946514 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 13 01:09:51.968951 dracut-pre-trigger[427]: rd.md=0: removing MD RAID activation Sep 13 01:09:52.010709 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Sep 13 01:09:52.018470 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 13 01:09:52.139602 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 13 01:09:52.153362 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 13 01:09:52.174420 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 13 01:09:52.178508 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 13 01:09:52.181360 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 13 01:09:52.182079 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 13 01:09:52.192115 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 13 01:09:52.239085 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 13 01:09:52.287128 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Sep 13 01:09:52.296197 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Sep 13 01:09:52.321844 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 13 01:09:52.321922 kernel: GPT:17805311 != 125829119 Sep 13 01:09:52.321942 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 13 01:09:52.321960 kernel: GPT:17805311 != 125829119 Sep 13 01:09:52.321977 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 13 01:09:52.321994 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:09:52.326134 kernel: cryptd: max_cpu_qlen set to 1000 Sep 13 01:09:52.336713 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 13 01:09:52.337874 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 13 01:09:52.340708 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Sep 13 01:09:52.341541 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 13 01:09:52.343239 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 01:09:52.343989 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 01:09:52.360152 kernel: AVX version of gcm_enc/dec engaged. Sep 13 01:09:52.360430 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 01:09:52.368205 kernel: AES CTR mode by8 optimization enabled Sep 13 01:09:52.378353 kernel: libata version 3.00 loaded. Sep 13 01:09:52.383136 kernel: ACPI: bus type USB registered Sep 13 01:09:52.391136 kernel: usbcore: registered new interface driver usbfs Sep 13 01:09:52.396168 kernel: ahci 0000:00:1f.2: version 3.0 Sep 13 01:09:52.396457 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 13 01:09:52.403585 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Sep 13 01:09:52.403887 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 13 01:09:52.408178 kernel: usbcore: registered new interface driver hub Sep 13 01:09:52.408236 kernel: usbcore: registered new device driver usb Sep 13 01:09:52.418742 kernel: scsi host0: ahci Sep 13 01:09:52.419327 kernel: scsi host1: ahci Sep 13 01:09:52.419572 kernel: scsi host2: ahci Sep 13 01:09:52.423406 kernel: scsi host3: ahci Sep 13 01:09:52.423686 kernel: scsi host4: ahci Sep 13 01:09:52.425404 kernel: scsi host5: ahci Sep 13 01:09:52.425638 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 Sep 13 01:09:52.425681 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 Sep 13 01:09:52.425700 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 Sep 13 01:09:52.425718 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 Sep 13 01:09:52.425736 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 Sep 13 
01:09:52.425753 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 Sep 13 01:09:52.460378 kernel: BTRFS: device fsid fa70a3b0-3d47-4508-bba0-9fa4607626aa devid 1 transid 36 /dev/vda3 scanned by (udev-worker) (485) Sep 13 01:09:52.471504 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 13 01:09:52.537965 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/vda6 scanned by (udev-worker) (473) Sep 13 01:09:52.543202 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 13 01:09:52.544472 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 01:09:52.561953 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 13 01:09:52.562940 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 13 01:09:52.571358 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 13 01:09:52.578384 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 13 01:09:52.583302 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 13 01:09:52.590363 disk-uuid[558]: Primary Header is updated. Sep 13 01:09:52.590363 disk-uuid[558]: Secondary Entries is updated. Sep 13 01:09:52.590363 disk-uuid[558]: Secondary Header is updated. Sep 13 01:09:52.597162 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:09:52.607159 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:09:52.616775 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:09:52.617844 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Sep 13 01:09:52.736129 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 13 01:09:52.736222 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 13 01:09:52.738879 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 13 01:09:52.740379 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 13 01:09:52.742592 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 13 01:09:52.743153 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 13 01:09:52.753754 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 13 01:09:52.754165 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Sep 13 01:09:52.767151 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 13 01:09:52.774319 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 13 01:09:52.774668 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Sep 13 01:09:52.774910 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Sep 13 01:09:52.784150 kernel: hub 1-0:1.0: USB hub found Sep 13 01:09:52.788996 kernel: hub 1-0:1.0: 4 ports detected Sep 13 01:09:52.798155 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Sep 13 01:09:52.811138 kernel: hub 2-0:1.0: USB hub found Sep 13 01:09:52.815145 kernel: hub 2-0:1.0: 4 ports detected Sep 13 01:09:53.033176 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 13 01:09:53.176158 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 13 01:09:53.183652 kernel: usbcore: registered new interface driver usbhid Sep 13 01:09:53.183714 kernel: usbhid: USB HID core driver Sep 13 01:09:53.191978 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Sep 13 01:09:53.192067 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Sep 13 01:09:53.617493 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:09:53.617583 disk-uuid[559]: The operation has completed successfully. Sep 13 01:09:53.671191 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 13 01:09:53.671367 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 13 01:09:53.695399 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 13 01:09:53.708480 sh[587]: Success Sep 13 01:09:53.726157 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Sep 13 01:09:53.794829 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 13 01:09:53.805252 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 13 01:09:53.808216 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 13 01:09:53.842198 kernel: BTRFS info (device dm-0): first mount of filesystem fa70a3b0-3d47-4508-bba0-9fa4607626aa Sep 13 01:09:53.842378 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 13 01:09:53.842406 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Sep 13 01:09:53.842782 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 13 01:09:53.845994 kernel: BTRFS info (device dm-0): using free space tree Sep 13 01:09:53.864210 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 13 01:09:53.865985 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 13 01:09:53.874505 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 13 01:09:53.877573 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 13 01:09:53.899262 kernel: BTRFS info (device vda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 01:09:53.899332 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 01:09:53.899352 kernel: BTRFS info (device vda6): using free space tree Sep 13 01:09:53.909135 kernel: BTRFS info (device vda6): auto enabling async discard Sep 13 01:09:53.928137 kernel: BTRFS info (device vda6): last unmount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 01:09:53.928458 systemd[1]: mnt-oem.mount: Deactivated successfully. Sep 13 01:09:53.939378 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 13 01:09:53.951504 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 13 01:09:54.075585 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 13 01:09:54.087325 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Sep 13 01:09:54.098143 ignition[689]: Ignition 2.19.0 Sep 13 01:09:54.099270 ignition[689]: Stage: fetch-offline Sep 13 01:09:54.099375 ignition[689]: no configs at "/usr/lib/ignition/base.d" Sep 13 01:09:54.099403 ignition[689]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 01:09:54.099652 ignition[689]: parsed url from cmdline: "" Sep 13 01:09:54.099659 ignition[689]: no config URL provided Sep 13 01:09:54.099670 ignition[689]: reading system config file "/usr/lib/ignition/user.ign" Sep 13 01:09:54.099687 ignition[689]: no config at "/usr/lib/ignition/user.ign" Sep 13 01:09:54.099697 ignition[689]: failed to fetch config: resource requires networking Sep 13 01:09:54.102436 ignition[689]: Ignition finished successfully Sep 13 01:09:54.107428 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 13 01:09:54.134440 systemd-networkd[772]: lo: Link UP Sep 13 01:09:54.134459 systemd-networkd[772]: lo: Gained carrier Sep 13 01:09:54.136985 systemd-networkd[772]: Enumeration completed Sep 13 01:09:54.137554 systemd-networkd[772]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 01:09:54.137560 systemd-networkd[772]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 01:09:54.138600 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 13 01:09:54.138676 systemd-networkd[772]: eth0: Link UP Sep 13 01:09:54.138682 systemd-networkd[772]: eth0: Gained carrier Sep 13 01:09:54.138693 systemd-networkd[772]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 01:09:54.139994 systemd[1]: Reached target network.target - Network. Sep 13 01:09:54.150358 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Sep 13 01:09:54.174295 ignition[776]: Ignition 2.19.0
Sep 13 01:09:54.174320 ignition[776]: Stage: fetch
Sep 13 01:09:54.174613 ignition[776]: no configs at "/usr/lib/ignition/base.d"
Sep 13 01:09:54.174634 ignition[776]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:09:54.174784 ignition[776]: parsed url from cmdline: ""
Sep 13 01:09:54.174791 ignition[776]: no config URL provided
Sep 13 01:09:54.174801 ignition[776]: reading system config file "/usr/lib/ignition/user.ign"
Sep 13 01:09:54.174819 ignition[776]: no config at "/usr/lib/ignition/user.ign"
Sep 13 01:09:54.175053 ignition[776]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Sep 13 01:09:54.175312 ignition[776]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Sep 13 01:09:54.175353 ignition[776]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Sep 13 01:09:54.175481 ignition[776]: GET error: Get "http://169.254.169.254/openstack/latest/user_data": dial tcp 169.254.169.254:80: connect: network is unreachable
Sep 13 01:09:54.207258 systemd-networkd[772]: eth0: DHCPv4 address 10.244.29.26/30, gateway 10.244.29.25 acquired from 10.244.29.25
Sep 13 01:09:54.375613 ignition[776]: GET http://169.254.169.254/openstack/latest/user_data: attempt #2
Sep 13 01:09:54.407298 ignition[776]: GET result: OK
Sep 13 01:09:54.408310 ignition[776]: parsing config with SHA512: baaeed4f4658d45248c04da8b51e3bb95d88a02c173d3d518d702a3c649140e00d4a5dd25dd230aeb4515b94730754921a5f8ca5f7cdf5e21591236111c10d26
Sep 13 01:09:54.414583 unknown[776]: fetched base config from "system"
Sep 13 01:09:54.414604 unknown[776]: fetched base config from "system"
Sep 13 01:09:54.415258 ignition[776]: fetch: fetch complete
Sep 13 01:09:54.414614 unknown[776]: fetched user config from "openstack"
Sep 13 01:09:54.415267 ignition[776]: fetch: fetch passed
Sep 13 01:09:54.417792 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 13 01:09:54.415349 ignition[776]: Ignition finished successfully
Sep 13 01:09:54.441351 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 13 01:09:54.462485 ignition[783]: Ignition 2.19.0
Sep 13 01:09:54.462509 ignition[783]: Stage: kargs
Sep 13 01:09:54.462819 ignition[783]: no configs at "/usr/lib/ignition/base.d"
Sep 13 01:09:54.462841 ignition[783]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:09:54.466502 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 13 01:09:54.464401 ignition[783]: kargs: kargs passed
Sep 13 01:09:54.464482 ignition[783]: Ignition finished successfully
Sep 13 01:09:54.481883 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 13 01:09:54.503759 ignition[789]: Ignition 2.19.0
Sep 13 01:09:54.503783 ignition[789]: Stage: disks
Sep 13 01:09:54.504076 ignition[789]: no configs at "/usr/lib/ignition/base.d"
Sep 13 01:09:54.504097 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:09:54.505821 ignition[789]: disks: disks passed
Sep 13 01:09:54.507202 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 13 01:09:54.505901 ignition[789]: Ignition finished successfully
Sep 13 01:09:54.508922 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 13 01:09:54.510526 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 13 01:09:54.511996 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 13 01:09:54.513534 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 13 01:09:54.515168 systemd[1]: Reached target basic.target - Basic System.
Sep 13 01:09:54.528388 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 13 01:09:54.548348 systemd-fsck[797]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Sep 13 01:09:54.553488 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 13 01:09:54.562308 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 13 01:09:54.684150 kernel: EXT4-fs (vda9): mounted filesystem 3a3ecd49-b269-4fcb-bb61-e2994e1868ee r/w with ordered data mode. Quota mode: none.
Sep 13 01:09:54.685382 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 13 01:09:54.686850 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 13 01:09:54.696428 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 13 01:09:54.700259 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 13 01:09:54.703615 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 13 01:09:54.710635 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/vda6 scanned by mount (805)
Sep 13 01:09:54.710689 kernel: BTRFS info (device vda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417
Sep 13 01:09:54.712281 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 13 01:09:54.714204 kernel: BTRFS info (device vda6): using free space tree
Sep 13 01:09:54.717387 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Sep 13 01:09:54.719748 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 13 01:09:54.725367 kernel: BTRFS info (device vda6): auto enabling async discard
Sep 13 01:09:54.719804 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 13 01:09:54.727255 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 13 01:09:54.730634 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 13 01:09:54.738385 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 13 01:09:54.827042 initrd-setup-root[833]: cut: /sysroot/etc/passwd: No such file or directory
Sep 13 01:09:54.840783 initrd-setup-root[840]: cut: /sysroot/etc/group: No such file or directory
Sep 13 01:09:54.849621 initrd-setup-root[847]: cut: /sysroot/etc/shadow: No such file or directory
Sep 13 01:09:54.858822 initrd-setup-root[854]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 13 01:09:54.995192 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 13 01:09:55.004346 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 13 01:09:55.008342 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 13 01:09:55.019461 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 13 01:09:55.022251 kernel: BTRFS info (device vda6): last unmount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417
Sep 13 01:09:55.056914 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 13 01:09:55.065197 ignition[921]: INFO : Ignition 2.19.0
Sep 13 01:09:55.065197 ignition[921]: INFO : Stage: mount
Sep 13 01:09:55.065197 ignition[921]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 01:09:55.065197 ignition[921]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:09:55.070244 ignition[921]: INFO : mount: mount passed
Sep 13 01:09:55.070244 ignition[921]: INFO : Ignition finished successfully
Sep 13 01:09:55.067571 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 13 01:09:55.534025 systemd-networkd[772]: eth0: Gained IPv6LL
Sep 13 01:09:57.044012 systemd-networkd[772]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:746:24:19ff:fef4:1d1a/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:746:24:19ff:fef4:1d1a/64 assigned by NDisc.
Sep 13 01:09:57.044029 systemd-networkd[772]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Sep 13 01:10:01.930572 coreos-metadata[807]: Sep 13 01:10:01.930 WARN failed to locate config-drive, using the metadata service API instead
Sep 13 01:10:01.956789 coreos-metadata[807]: Sep 13 01:10:01.956 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Sep 13 01:10:01.990609 coreos-metadata[807]: Sep 13 01:10:01.990 INFO Fetch successful
Sep 13 01:10:01.991515 coreos-metadata[807]: Sep 13 01:10:01.990 INFO wrote hostname srv-5asmg.gb1.brightbox.com to /sysroot/etc/hostname
Sep 13 01:10:01.993581 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Sep 13 01:10:01.993772 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Sep 13 01:10:02.012579 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 13 01:10:02.030551 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 13 01:10:02.047152 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 scanned by mount (939)
Sep 13 01:10:02.055674 kernel: BTRFS info (device vda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417
Sep 13 01:10:02.055763 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 13 01:10:02.056267 kernel: BTRFS info (device vda6): using free space tree
Sep 13 01:10:02.063152 kernel: BTRFS info (device vda6): auto enabling async discard
Sep 13 01:10:02.067348 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 13 01:10:02.104148 ignition[957]: INFO : Ignition 2.19.0
Sep 13 01:10:02.104148 ignition[957]: INFO : Stage: files
Sep 13 01:10:02.106047 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 01:10:02.106047 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:10:02.106047 ignition[957]: DEBUG : files: compiled without relabeling support, skipping
Sep 13 01:10:02.108919 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 13 01:10:02.108919 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 13 01:10:02.111780 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 13 01:10:02.113090 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 13 01:10:02.114250 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 13 01:10:02.113653 unknown[957]: wrote ssh authorized keys file for user: core
Sep 13 01:10:02.116675 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 13 01:10:02.116675 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Sep 13 01:10:02.361483 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 13 01:10:02.821277 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 13 01:10:02.822771 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 13 01:10:02.822771 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 13 01:10:02.822771 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 13 01:10:02.834559 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Sep 13 01:10:03.334271 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 13 01:10:08.466352 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 13 01:10:08.466352 ignition[957]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 13 01:10:08.470041 ignition[957]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 13 01:10:08.470041 ignition[957]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 13 01:10:08.470041 ignition[957]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 13 01:10:08.470041 ignition[957]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 13 01:10:08.470041 ignition[957]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 13 01:10:08.470041 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 13 01:10:08.480029 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 13 01:10:08.480029 ignition[957]: INFO : files: files passed
Sep 13 01:10:08.480029 ignition[957]: INFO : Ignition finished successfully
Sep 13 01:10:08.474586 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 13 01:10:08.485551 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 13 01:10:08.490338 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 13 01:10:08.494897 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 13 01:10:08.495094 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 13 01:10:08.521720 initrd-setup-root-after-ignition[986]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 01:10:08.521720 initrd-setup-root-after-ignition[986]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 01:10:08.531368 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 01:10:08.528512 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 13 01:10:08.531323 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 13 01:10:08.544590 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 13 01:10:08.597555 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 13 01:10:08.597774 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 13 01:10:08.599449 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 13 01:10:08.600458 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 13 01:10:08.601256 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 13 01:10:08.607568 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 13 01:10:08.632184 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 13 01:10:08.638472 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 13 01:10:08.655070 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 13 01:10:08.657160 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 13 01:10:08.658218 systemd[1]: Stopped target timers.target - Timer Units.
Sep 13 01:10:08.659837 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 13 01:10:08.660063 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 13 01:10:08.661786 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 13 01:10:08.662761 systemd[1]: Stopped target basic.target - Basic System.
Sep 13 01:10:08.664329 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 13 01:10:08.665737 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 13 01:10:08.667381 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 13 01:10:08.668909 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 13 01:10:08.670444 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 13 01:10:08.672035 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 13 01:10:08.673549 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 13 01:10:08.675210 systemd[1]: Stopped target swap.target - Swaps.
Sep 13 01:10:08.676590 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 13 01:10:08.676856 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 13 01:10:08.678691 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 13 01:10:08.679677 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 13 01:10:08.681098 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 13 01:10:08.681328 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 13 01:10:08.682157 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 13 01:10:08.682356 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 13 01:10:08.683828 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 13 01:10:08.684072 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 13 01:10:08.685955 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 13 01:10:08.686217 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 13 01:10:08.694555 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 13 01:10:08.698644 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 13 01:10:08.703635 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 13 01:10:08.703912 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 13 01:10:08.707066 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 13 01:10:08.707289 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 13 01:10:08.725251 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 13 01:10:08.725414 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 13 01:10:08.741333 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 13 01:10:08.744977 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 13 01:10:08.745495 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 13 01:10:08.749922 ignition[1010]: INFO : Ignition 2.19.0
Sep 13 01:10:08.752951 ignition[1010]: INFO : Stage: umount
Sep 13 01:10:08.752951 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 01:10:08.752951 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:10:08.752951 ignition[1010]: INFO : umount: umount passed
Sep 13 01:10:08.752951 ignition[1010]: INFO : Ignition finished successfully
Sep 13 01:10:08.756292 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 13 01:10:08.756467 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 13 01:10:08.758426 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 13 01:10:08.758593 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 13 01:10:08.760841 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 13 01:10:08.760924 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 13 01:10:08.762318 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 13 01:10:08.762415 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 13 01:10:08.763698 systemd[1]: Stopped target network.target - Network.
Sep 13 01:10:08.764984 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 13 01:10:08.765088 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 13 01:10:08.766652 systemd[1]: Stopped target paths.target - Path Units.
Sep 13 01:10:08.767976 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 13 01:10:08.771192 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 13 01:10:08.772154 systemd[1]: Stopped target slices.target - Slice Units.
Sep 13 01:10:08.772793 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 13 01:10:08.774598 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 13 01:10:08.774685 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 13 01:10:08.776283 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 13 01:10:08.776353 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 13 01:10:08.777593 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 13 01:10:08.777680 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 13 01:10:08.779064 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 13 01:10:08.779163 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 13 01:10:08.780423 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 13 01:10:08.780494 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 13 01:10:08.782269 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 13 01:10:08.784188 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 13 01:10:08.786423 systemd-networkd[772]: eth0: DHCPv6 lease lost
Sep 13 01:10:08.790079 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 13 01:10:08.790326 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 13 01:10:08.792512 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 13 01:10:08.792623 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 13 01:10:08.801305 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 13 01:10:08.802263 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 13 01:10:08.802366 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 13 01:10:08.806412 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 13 01:10:08.808720 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 13 01:10:08.808917 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 13 01:10:08.821794 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 13 01:10:08.822062 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 13 01:10:08.825975 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 13 01:10:08.826237 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 13 01:10:08.831281 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 13 01:10:08.831387 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 13 01:10:08.833062 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 13 01:10:08.833235 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 13 01:10:08.836853 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 13 01:10:08.836978 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 13 01:10:08.839261 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 13 01:10:08.839345 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 13 01:10:08.840674 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 13 01:10:08.840765 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 13 01:10:08.854583 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 13 01:10:08.856548 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 13 01:10:08.856649 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 13 01:10:08.860674 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 13 01:10:08.860795 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 13 01:10:08.861581 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 13 01:10:08.861654 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 13 01:10:08.862515 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 13 01:10:08.862584 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 13 01:10:08.864048 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 13 01:10:08.864216 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 13 01:10:08.867791 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 13 01:10:08.867896 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 13 01:10:08.869317 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 13 01:10:08.869392 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 01:10:08.871728 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 13 01:10:08.871893 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 13 01:10:08.873376 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 13 01:10:08.882473 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 13 01:10:08.894821 systemd[1]: Switching root.
Sep 13 01:10:08.947866 systemd-journald[202]: Journal stopped
Sep 13 01:10:10.614149 systemd-journald[202]: Received SIGTERM from PID 1 (systemd).
Sep 13 01:10:10.614298 kernel: SELinux: policy capability network_peer_controls=1
Sep 13 01:10:10.614334 kernel: SELinux: policy capability open_perms=1
Sep 13 01:10:10.614356 kernel: SELinux: policy capability extended_socket_class=1
Sep 13 01:10:10.614374 kernel: SELinux: policy capability always_check_network=0
Sep 13 01:10:10.614406 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 13 01:10:10.614428 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 13 01:10:10.614454 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 13 01:10:10.614488 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 13 01:10:10.614515 kernel: audit: type=1403 audit(1757725809.242:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 13 01:10:10.614537 systemd[1]: Successfully loaded SELinux policy in 53.872ms.
Sep 13 01:10:10.614580 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 24.057ms.
Sep 13 01:10:10.614622 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 13 01:10:10.614652 systemd[1]: Detected virtualization kvm.
Sep 13 01:10:10.614687 systemd[1]: Detected architecture x86-64.
Sep 13 01:10:10.614710 systemd[1]: Detected first boot.
Sep 13 01:10:10.614737 systemd[1]: Hostname set to .
Sep 13 01:10:10.614777 systemd[1]: Initializing machine ID from VM UUID.
Sep 13 01:10:10.614800 zram_generator::config[1054]: No configuration found.
Sep 13 01:10:10.614822 systemd[1]: Populated /etc with preset unit settings.
Sep 13 01:10:10.614849 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 13 01:10:10.614882 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 13 01:10:10.614906 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 13 01:10:10.614934 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 13 01:10:10.614956 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 13 01:10:10.614978 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 13 01:10:10.614999 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 13 01:10:10.615020 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 13 01:10:10.615048 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 13 01:10:10.615076 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 13 01:10:10.615124 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 13 01:10:10.615194 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 13 01:10:10.615223 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 13 01:10:10.615244 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 13 01:10:10.615278 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 13 01:10:10.615299 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 13 01:10:10.615344 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 13 01:10:10.615366 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 13 01:10:10.615387 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 13 01:10:10.615426 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 13 01:10:10.615450 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 13 01:10:10.615478 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 13 01:10:10.615499 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 13 01:10:10.615520 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 13 01:10:10.615541 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 13 01:10:10.615574 systemd[1]: Reached target slices.target - Slice Units.
Sep 13 01:10:10.615603 systemd[1]: Reached target swap.target - Swaps.
Sep 13 01:10:10.615638 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 13 01:10:10.615673 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 13 01:10:10.615711 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 13 01:10:10.615752 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 13 01:10:10.615793 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 13 01:10:10.615817 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 13 01:10:10.615838 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 13 01:10:10.615859 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 13 01:10:10.615880 systemd[1]: Mounting media.mount - External Media Directory...
Sep 13 01:10:10.615902 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:10:10.615923 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 13 01:10:10.615944 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 13 01:10:10.615965 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 13 01:10:10.615999 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 13 01:10:10.616023 systemd[1]: Reached target machines.target - Containers.
Sep 13 01:10:10.616044 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 13 01:10:10.616072 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 01:10:10.616094 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 13 01:10:10.616148 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 13 01:10:10.616172 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 01:10:10.616201 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 13 01:10:10.616236 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 13 01:10:10.616267 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 13 01:10:10.616288 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 13 01:10:10.616309 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 13 01:10:10.616330 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 13 01:10:10.616358 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 13 01:10:10.616385 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 13 01:10:10.616423 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 13 01:10:10.616451 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 13 01:10:10.616485 kernel: loop: module loaded
Sep 13 01:10:10.616515 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 13 01:10:10.616538 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 13 01:10:10.616559 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 13 01:10:10.616580 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 13 01:10:10.616609 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 13 01:10:10.616631 systemd[1]: Stopped verity-setup.service.
Sep 13 01:10:10.616651 kernel: fuse: init (API version 7.39)
Sep 13 01:10:10.616671 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:10:10.616704 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 13 01:10:10.616727 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 13 01:10:10.616758 systemd[1]: Mounted media.mount - External Media Directory.
Sep 13 01:10:10.616781 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 13 01:10:10.616815 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 13 01:10:10.616837 kernel: ACPI: bus type drm_connector registered
Sep 13 01:10:10.616865 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 13 01:10:10.616888 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 13 01:10:10.616915 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 13 01:10:10.616937 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 13 01:10:10.616958 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 01:10:10.616980 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 01:10:10.617013 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 13 01:10:10.617036 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 13 01:10:10.617057 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 13 01:10:10.617078 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 13 01:10:10.617130 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 13 01:10:10.617168 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 13 01:10:10.617191 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 13 01:10:10.617212 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 13 01:10:10.617234 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 13 01:10:10.617255 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 13 01:10:10.617277 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 13 01:10:10.617341 systemd-journald[1150]: Collecting audit messages is disabled.
Sep 13 01:10:10.617389 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 13 01:10:10.617426 systemd-journald[1150]: Journal started
Sep 13 01:10:10.617461 systemd-journald[1150]: Runtime Journal (/run/log/journal/16f7c1ce39054b419f01333ec1b274eb) is 4.7M, max 38.0M, 33.2M free.
Sep 13 01:10:10.147090 systemd[1]: Queued start job for default target multi-user.target.
Sep 13 01:10:10.173881 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 13 01:10:10.174678 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 13 01:10:10.620189 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 13 01:10:10.640469 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 13 01:10:10.652133 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 13 01:10:10.667235 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 13 01:10:10.668516 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 13 01:10:10.668585 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 13 01:10:10.671503 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Sep 13 01:10:10.685342 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 13 01:10:10.693040 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 13 01:10:10.695385 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 01:10:10.702346 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 13 01:10:10.715075 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 13 01:10:10.716180 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 13 01:10:10.724463 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 13 01:10:10.726035 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 13 01:10:10.729894 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 13 01:10:10.733554 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 13 01:10:10.742371 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 13 01:10:10.747874 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 13 01:10:10.748951 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 13 01:10:10.750159 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 13 01:10:10.779919 systemd-journald[1150]: Time spent on flushing to /var/log/journal/16f7c1ce39054b419f01333ec1b274eb is 107.390ms for 1143 entries.
Sep 13 01:10:10.779919 systemd-journald[1150]: System Journal (/var/log/journal/16f7c1ce39054b419f01333ec1b274eb) is 8.0M, max 584.8M, 576.8M free.
Sep 13 01:10:10.942027 systemd-journald[1150]: Received client request to flush runtime journal.
Sep 13 01:10:10.942206 kernel: loop0: detected capacity change from 0 to 142488
Sep 13 01:10:10.942305 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 13 01:10:10.784943 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 13 01:10:10.787407 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 13 01:10:10.794397 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Sep 13 01:10:10.848574 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 13 01:10:10.947147 kernel: loop1: detected capacity change from 0 to 224512
Sep 13 01:10:10.865921 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 13 01:10:10.867721 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Sep 13 01:10:10.891607 systemd-tmpfiles[1188]: ACLs are not supported, ignoring.
Sep 13 01:10:10.891631 systemd-tmpfiles[1188]: ACLs are not supported, ignoring.
Sep 13 01:10:10.939198 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 13 01:10:10.949562 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 13 01:10:10.951231 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 13 01:10:11.006052 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 13 01:10:11.020056 kernel: loop2: detected capacity change from 0 to 140768
Sep 13 01:10:11.018984 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Sep 13 01:10:11.064978 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 13 01:10:11.078550 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 13 01:10:11.087303 udevadm[1209]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Sep 13 01:10:11.123075 kernel: loop3: detected capacity change from 0 to 8
Sep 13 01:10:11.149815 systemd-tmpfiles[1211]: ACLs are not supported, ignoring.
Sep 13 01:10:11.149846 systemd-tmpfiles[1211]: ACLs are not supported, ignoring.
Sep 13 01:10:11.162864 kernel: loop4: detected capacity change from 0 to 142488
Sep 13 01:10:11.174802 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 13 01:10:11.198133 kernel: loop5: detected capacity change from 0 to 224512
Sep 13 01:10:11.226141 kernel: loop6: detected capacity change from 0 to 140768
Sep 13 01:10:11.278140 kernel: loop7: detected capacity change from 0 to 8
Sep 13 01:10:11.275093 (sd-merge)[1215]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Sep 13 01:10:11.275971 (sd-merge)[1215]: Merged extensions into '/usr'.
Sep 13 01:10:11.281672 systemd[1]: Reloading requested from client PID 1187 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 13 01:10:11.281697 systemd[1]: Reloading...
Sep 13 01:10:11.431177 zram_generator::config[1240]: No configuration found.
Sep 13 01:10:11.541048 ldconfig[1182]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 13 01:10:11.724471 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 01:10:11.793705 systemd[1]: Reloading finished in 511 ms.
Sep 13 01:10:11.839400 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 13 01:10:11.844635 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 13 01:10:11.858518 systemd[1]: Starting ensure-sysext.service...
Sep 13 01:10:11.876978 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 13 01:10:11.879073 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 13 01:10:11.887426 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 13 01:10:11.896771 systemd[1]: Reloading requested from client PID 1298 ('systemctl') (unit ensure-sysext.service)...
Sep 13 01:10:11.896803 systemd[1]: Reloading...
Sep 13 01:10:11.914054 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 13 01:10:11.915003 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 13 01:10:11.917180 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 13 01:10:11.917939 systemd-tmpfiles[1299]: ACLs are not supported, ignoring.
Sep 13 01:10:11.918307 systemd-tmpfiles[1299]: ACLs are not supported, ignoring.
Sep 13 01:10:11.925859 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot.
Sep 13 01:10:11.925881 systemd-tmpfiles[1299]: Skipping /boot
Sep 13 01:10:11.948288 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot.
Sep 13 01:10:11.948311 systemd-tmpfiles[1299]: Skipping /boot
Sep 13 01:10:11.986048 systemd-udevd[1301]: Using default interface naming scheme 'v255'.
Sep 13 01:10:12.025204 zram_generator::config[1327]: No configuration found.
Sep 13 01:10:12.242172 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1353)
Sep 13 01:10:12.301300 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 01:10:12.386132 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 13 01:10:12.398097 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 13 01:10:12.399606 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 13 01:10:12.405253 systemd[1]: Reloading finished in 507 ms.
Sep 13 01:10:12.416143 kernel: ACPI: button: Power Button [PWRF]
Sep 13 01:10:12.430155 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 13 01:10:12.432031 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 13 01:10:12.452141 kernel: mousedev: PS/2 mouse device common for all mice
Sep 13 01:10:12.487660 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 13 01:10:12.494028 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 13 01:10:12.500548 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 13 01:10:12.510601 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 13 01:10:12.517529 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 13 01:10:12.529541 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 13 01:10:12.534516 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 13 01:10:12.541576 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:10:12.541910 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 01:10:12.550692 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 01:10:12.559587 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 13 01:10:12.581142 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 13 01:10:12.581636 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Sep 13 01:10:12.581942 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 13 01:10:12.588308 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Sep 13 01:10:12.576571 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 13 01:10:12.577951 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 01:10:12.579175 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:10:12.580911 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 01:10:12.582281 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 01:10:12.587392 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:10:12.587686 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 01:10:12.597593 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 01:10:12.600384 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 01:10:12.600587 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:10:12.619666 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:10:12.620040 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 01:10:12.633441 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 13 01:10:12.634436 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 01:10:12.650406 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 13 01:10:12.651285 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:10:12.652079 systemd[1]: Finished ensure-sysext.service.
Sep 13 01:10:12.653811 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 01:10:12.655666 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 01:10:12.695503 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 13 01:10:12.698227 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 13 01:10:12.723718 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 13 01:10:12.740230 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 13 01:10:12.755697 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 13 01:10:12.769184 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 13 01:10:12.781591 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 13 01:10:12.790337 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 13 01:10:12.805870 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 13 01:10:12.814384 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 13 01:10:12.817727 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 13 01:10:12.820745 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 13 01:10:12.822638 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 13 01:10:12.838899 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 13 01:10:12.841024 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 13 01:10:12.844589 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 13 01:10:12.867250 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 13 01:10:12.938585 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 13 01:10:12.953978 augenrules[1456]: No rules
Sep 13 01:10:12.955594 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 13 01:10:13.109046 systemd-networkd[1411]: lo: Link UP
Sep 13 01:10:13.109063 systemd-networkd[1411]: lo: Gained carrier
Sep 13 01:10:13.111741 systemd-networkd[1411]: Enumeration completed
Sep 13 01:10:13.111941 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 13 01:10:13.114794 systemd-networkd[1411]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 13 01:10:13.114810 systemd-networkd[1411]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 13 01:10:13.117223 systemd-networkd[1411]: eth0: Link UP
Sep 13 01:10:13.117231 systemd-networkd[1411]: eth0: Gained carrier
Sep 13 01:10:13.117252 systemd-networkd[1411]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 13 01:10:13.124368 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 13 01:10:13.166221 systemd-networkd[1411]: eth0: DHCPv4 address 10.244.29.26/30, gateway 10.244.29.25 acquired from 10.244.29.25
Sep 13 01:10:13.170999 systemd-resolved[1413]: Positive Trust Anchors:
Sep 13 01:10:13.171646 systemd-timesyncd[1432]: Network configuration changed, trying to establish connection.
Sep 13 01:10:13.174185 systemd-resolved[1413]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 13 01:10:13.174248 systemd-resolved[1413]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 13 01:10:13.194180 systemd-resolved[1413]: Using system hostname 'srv-5asmg.gb1.brightbox.com'.
Sep 13 01:10:13.215079 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 13 01:10:13.217995 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 13 01:10:13.225902 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Sep 13 01:10:13.228183 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 01:10:13.231935 systemd[1]: Reached target network.target - Network.
Sep 13 01:10:13.232789 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 13 01:10:13.233606 systemd[1]: Reached target time-set.target - System Time Set.
Sep 13 01:10:13.243686 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Sep 13 01:10:13.262829 lvm[1473]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 13 01:10:13.297322 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Sep 13 01:10:13.299371 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 13 01:10:13.300189 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 13 01:10:13.301280 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 13 01:10:13.302211 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 13 01:10:13.303419 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 13 01:10:13.304293 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 13 01:10:13.305144 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 13 01:10:13.305931 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 13 01:10:13.306004 systemd[1]: Reached target paths.target - Path Units.
Sep 13 01:10:13.312647 systemd[1]: Reached target timers.target - Timer Units.
Sep 13 01:10:13.315552 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 13 01:10:13.318776 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 13 01:10:13.333182 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 13 01:10:13.336555 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Sep 13 01:10:13.338531 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 13 01:10:13.339534 systemd[1]: Reached target sockets.target - Socket Units.
Sep 13 01:10:13.340381 systemd[1]: Reached target basic.target - Basic System.
Sep 13 01:10:13.341372 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 13 01:10:13.341571 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 13 01:10:13.343940 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 13 01:10:13.349057 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 13 01:10:13.350068 lvm[1477]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 13 01:10:13.354409 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 13 01:10:13.364377 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 13 01:10:13.369892 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 13 01:10:13.371230 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 13 01:10:13.376384 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 13 01:10:13.378409 jq[1481]: false
Sep 13 01:10:13.387366 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 13 01:10:13.392895 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 13 01:10:13.402431 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 13 01:10:13.412759 systemd-timesyncd[1432]: Contacted time server 91.109.118.94:123 (0.flatcar.pool.ntp.org).
Sep 13 01:10:13.412910 systemd-timesyncd[1432]: Initial clock synchronization to Sat 2025-09-13 01:10:13.328068 UTC.
Sep 13 01:10:13.416414 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 13 01:10:13.418298 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 13 01:10:13.420610 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 13 01:10:13.430412 systemd[1]: Starting update-engine.service - Update Engine...
Sep 13 01:10:13.437703 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 13 01:10:13.450889 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 13 01:10:13.452275 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 13 01:10:13.476942 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 13 01:10:13.478240 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 13 01:10:13.478383 dbus-daemon[1480]: [system] SELinux support is enabled
Sep 13 01:10:13.479646 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found loop4
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found loop5
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found loop6
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found loop7
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found vda
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found vda1
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found vda2
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found vda3
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found usr
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found vda4
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found vda6
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found vda7
Sep 13 01:10:13.488133 extend-filesystems[1482]: Found vda9
Sep 13 01:10:13.488133 extend-filesystems[1482]: Checking size of /dev/vda9
Sep 13 01:10:13.511747 dbus-daemon[1480]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1411 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Sep 13 01:10:13.490842 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 13 01:10:13.520235 jq[1491]: true
Sep 13 01:10:13.490954 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 13 01:10:13.538247 extend-filesystems[1482]: Resized partition /dev/vda9
Sep 13 01:10:13.495349 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 13 01:10:13.547529 extend-filesystems[1512]: resize2fs 1.47.1 (20-May-2024)
Sep 13 01:10:13.495393 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 13 01:10:13.532466 systemd[1]: Starting systemd-hostnamed.service - Hostname Service...
Sep 13 01:10:13.535477 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Sep 13 01:10:13.566139 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks
Sep 13 01:10:13.566889 (ntainerd)[1507]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 13 01:10:13.574242 tar[1495]: linux-amd64/LICENSE
Sep 13 01:10:13.574242 tar[1495]: linux-amd64/helm
Sep 13 01:10:13.590507 jq[1506]: true
Sep 13 01:10:13.640292 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1358)
Sep 13 01:10:13.640539 update_engine[1490]: I20250913 01:10:13.637338 1490 main.cc:92] Flatcar Update Engine starting
Sep 13 01:10:13.648721 systemd[1]: motdgen.service: Deactivated successfully.
Sep 13 01:10:13.649134 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 13 01:10:13.653272 systemd[1]: Started update-engine.service - Update Engine.
Sep 13 01:10:13.655372 update_engine[1490]: I20250913 01:10:13.653373 1490 update_check_scheduler.cc:74] Next update check in 11m50s
Sep 13 01:10:13.663432 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 13 01:10:13.888714 systemd-logind[1489]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 13 01:10:13.895827 systemd-logind[1489]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 13 01:10:13.898973 bash[1538]: Updated "/home/core/.ssh/authorized_keys"
Sep 13 01:10:13.899543 systemd-logind[1489]: New seat seat0.
Sep 13 01:10:13.904624 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 13 01:10:13.938518 systemd[1]: Starting sshkeys.service...
Sep 13 01:10:13.939405 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 13 01:10:14.025760 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Sep 13 01:10:14.044161 kernel: EXT4-fs (vda9): resized filesystem to 15121403
Sep 13 01:10:14.041677 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Sep 13 01:10:14.051806 systemd[1]: Started systemd-hostnamed.service - Hostname Service.
Sep 13 01:10:14.069763 extend-filesystems[1512]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 13 01:10:14.069763 extend-filesystems[1512]: old_desc_blocks = 1, new_desc_blocks = 8
Sep 13 01:10:14.069763 extend-filesystems[1512]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long.
Sep 13 01:10:14.051587 dbus-daemon[1480]: [system] Successfully activated service 'org.freedesktop.hostname1'
Sep 13 01:10:14.081458 extend-filesystems[1482]: Resized filesystem in /dev/vda9
Sep 13 01:10:14.069923 systemd[1]: Starting polkit.service - Authorization Manager...
Sep 13 01:10:14.056377 dbus-daemon[1480]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1509 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0")
Sep 13 01:10:14.074251 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 13 01:10:14.074587 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 13 01:10:14.110911 polkitd[1549]: Started polkitd version 121
Sep 13 01:10:14.128428 polkitd[1549]: Loading rules from directory /etc/polkit-1/rules.d
Sep 13 01:10:14.128547 polkitd[1549]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 13 01:10:14.132298 containerd[1507]: time="2025-09-13T01:10:14.132082976Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Sep 13 01:10:14.137581 polkitd[1549]: Finished loading, compiling and executing 2 rules
Sep 13 01:10:14.143338 dbus-daemon[1480]: [system] Successfully activated service 'org.freedesktop.PolicyKit1'
Sep 13 01:10:14.143886 polkitd[1549]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Sep 13 01:10:14.143618 systemd[1]: Started polkit.service - Authorization Manager.
Sep 13 01:10:14.183834 systemd-hostnamed[1509]: Hostname set to (static)
Sep 13 01:10:14.196503 locksmithd[1523]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 13 01:10:14.233450 containerd[1507]: time="2025-09-13T01:10:14.233304650Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.241669033Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.106-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.241742596Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.241774023Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.242176274Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.242220419Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.242335091Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.242358735Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.243659451Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.243688548Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.243712837Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244135 containerd[1507]: time="2025-09-13T01:10:14.243732746Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Sep 13 01:10:14.244655 containerd[1507]: time="2025-09-13T01:10:14.243880072Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Sep 13 01:10:14.245265 containerd[1507]: time="2025-09-13T01:10:14.245227704Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Sep 13 01:10:14.245590 containerd[1507]: time="2025-09-13T01:10:14.245556242Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Sep 13 01:10:14.245637 containerd[1507]: time="2025-09-13T01:10:14.245594905Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Sep 13 01:10:14.245793 containerd[1507]: time="2025-09-13T01:10:14.245764456Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Sep 13 01:10:14.245895 containerd[1507]: time="2025-09-13T01:10:14.245867494Z" level=info msg="metadata content store policy set" policy=shared
Sep 13 01:10:14.259364 containerd[1507]: time="2025-09-13T01:10:14.259183915Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Sep 13 01:10:14.259540 containerd[1507]: time="2025-09-13T01:10:14.259435724Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Sep 13 01:10:14.259632 containerd[1507]: time="2025-09-13T01:10:14.259601299Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Sep 13 01:10:14.259966 containerd[1507]: time="2025-09-13T01:10:14.259905510Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Sep 13 01:10:14.260203 containerd[1507]: time="2025-09-13T01:10:14.259987475Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Sep 13 01:10:14.260890 containerd[1507]: time="2025-09-13T01:10:14.260860728Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Sep 13 01:10:14.263199 containerd[1507]: time="2025-09-13T01:10:14.263164706Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Sep 13 01:10:14.263511 containerd[1507]: time="2025-09-13T01:10:14.263482429Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Sep 13 01:10:14.263571 containerd[1507]: time="2025-09-13T01:10:14.263521662Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Sep 13 01:10:14.263626 containerd[1507]: time="2025-09-13T01:10:14.263568930Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Sep 13 01:10:14.263760 containerd[1507]: time="2025-09-13T01:10:14.263594043Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Sep 13 01:10:14.263827 containerd[1507]: time="2025-09-13T01:10:14.263781529Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Sep 13 01:10:14.264177 containerd[1507]: time="2025-09-13T01:10:14.264148366Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Sep 13 01:10:14.264228 containerd[1507]: time="2025-09-13T01:10:14.264188624Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Sep 13 01:10:14.264437 containerd[1507]: time="2025-09-13T01:10:14.264274537Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Sep 13 01:10:14.264484 containerd[1507]: time="2025-09-13T01:10:14.264455919Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Sep 13 01:10:14.264601 containerd[1507]: time="2025-09-13T01:10:14.264573046Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Sep 13 01:10:14.264924 containerd[1507]: time="2025-09-13T01:10:14.264609963Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Sep 13 01:10:14.264974 containerd[1507]: time="2025-09-13T01:10:14.264948106Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265010027Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265045345Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265262376Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265597228Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265625673Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265663954Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265694444Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265717547Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266121 containerd[1507]: time="2025-09-13T01:10:14.265856461Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266427 containerd[1507]: time="2025-09-13T01:10:14.266145340Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266427 containerd[1507]: time="2025-09-13T01:10:14.266180134Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266427 containerd[1507]: time="2025-09-13T01:10:14.266224254Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266427 containerd[1507]: time="2025-09-13T01:10:14.266265902Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Sep 13 01:10:14.266586 containerd[1507]: time="2025-09-13T01:10:14.266462421Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266829 containerd[1507]: time="2025-09-13T01:10:14.266799483Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.266889 containerd[1507]: time="2025-09-13T01:10:14.266865923Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Sep 13 01:10:14.267081 containerd[1507]: time="2025-09-13T01:10:14.267051131Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Sep 13 01:10:14.267444 containerd[1507]: time="2025-09-13T01:10:14.267409289Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Sep 13 01:10:14.267502 containerd[1507]: time="2025-09-13T01:10:14.267460420Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Sep 13 01:10:14.267502 containerd[1507]: time="2025-09-13T01:10:14.267486794Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Sep 13 01:10:14.267603 containerd[1507]: time="2025-09-13T01:10:14.267504121Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.269117 containerd[1507]: time="2025-09-13T01:10:14.267694070Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Sep 13 01:10:14.269117 containerd[1507]: time="2025-09-13T01:10:14.267732172Z" level=info msg="NRI interface is disabled by configuration."
Sep 13 01:10:14.269117 containerd[1507]: time="2025-09-13T01:10:14.267990300Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Sep 13 01:10:14.269612 containerd[1507]: time="2025-09-13T01:10:14.269330859Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Sep 13 01:10:14.269908 containerd[1507]: time="2025-09-13T01:10:14.269616094Z" level=info msg="Connect containerd service"
Sep 13 01:10:14.270044 containerd[1507]: time="2025-09-13T01:10:14.270014054Z" level=info msg="using legacy CRI server"
Sep 13 01:10:14.270115 containerd[1507]: time="2025-09-13T01:10:14.270043815Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 13 01:10:14.270690 containerd[1507]: time="2025-09-13T01:10:14.270644161Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Sep 13 01:10:14.273671 containerd[1507]: time="2025-09-13T01:10:14.273631864Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 13 01:10:14.276122 containerd[1507]: time="2025-09-13T01:10:14.273860880Z" level=info msg="Start subscribing containerd event"
Sep 13 01:10:14.276122 containerd[1507]: time="2025-09-13T01:10:14.273986432Z" level=info msg="Start recovering state"
Sep 13 01:10:14.276122 containerd[1507]: time="2025-09-13T01:10:14.274138094Z" level=info msg="Start event monitor"
Sep 13 01:10:14.276122 containerd[1507]: time="2025-09-13T01:10:14.274187191Z" level=info msg="Start snapshots syncer"
Sep 13 01:10:14.276122 containerd[1507]: time="2025-09-13T01:10:14.274211571Z" level=info msg="Start cni network conf syncer for default"
Sep 13 01:10:14.276122 containerd[1507]: time="2025-09-13T01:10:14.274239156Z" level=info msg="Start streaming server"
Sep 13 01:10:14.276122 containerd[1507]: time="2025-09-13T01:10:14.274380785Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 13 01:10:14.276122 containerd[1507]: time="2025-09-13T01:10:14.274483104Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 13 01:10:14.274746 systemd[1]: Started containerd.service - containerd container runtime.
Sep 13 01:10:14.286243 containerd[1507]: time="2025-09-13T01:10:14.286157534Z" level=info msg="containerd successfully booted in 0.161234s"
Sep 13 01:10:14.560657 sshd_keygen[1508]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 13 01:10:14.606709 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 13 01:10:14.621747 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 13 01:10:14.640461 systemd[1]: issuegen.service: Deactivated successfully.
Sep 13 01:10:14.640752 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 13 01:10:14.649624 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 13 01:10:14.669206 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 13 01:10:14.680204 tar[1495]: linux-amd64/README.md
Sep 13 01:10:14.680688 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 13 01:10:14.703040 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Sep 13 01:10:14.704346 systemd[1]: Reached target getty.target - Login Prompts.
Sep 13 01:10:14.719927 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 13 01:10:14.992793 systemd-networkd[1411]: eth0: Gained IPv6LL
Sep 13 01:10:15.000250 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 13 01:10:15.002445 systemd[1]: Reached target network-online.target - Network is Online.
Sep 13 01:10:15.012805 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:10:15.024516 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 13 01:10:15.052820 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 13 01:10:16.073676 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:10:16.090776 (kubelet)[1605]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:10:16.502897 systemd-networkd[1411]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:746:24:19ff:fef4:1d1a/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:746:24:19ff:fef4:1d1a/64 assigned by NDisc.
Sep 13 01:10:16.503479 systemd-networkd[1411]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Sep 13 01:10:16.716034 kubelet[1605]: E0913 01:10:16.715911 1605 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:10:16.718481 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:10:16.718740 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:10:16.719892 systemd[1]: kubelet.service: Consumed 1.188s CPU time.
Sep 13 01:10:17.593704 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 13 01:10:17.617224 systemd[1]: Started sshd@0-10.244.29.26:22-139.178.68.195:57282.service - OpenSSH per-connection server daemon (139.178.68.195:57282).
Sep 13 01:10:18.545314 sshd[1616]: Accepted publickey for core from 139.178.68.195 port 57282 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:18.548015 sshd[1616]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:18.568229 systemd-logind[1489]: New session 1 of user core.
Sep 13 01:10:18.573199 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 13 01:10:18.583798 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 13 01:10:18.620951 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 13 01:10:18.631851 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 13 01:10:18.644835 (systemd)[1620]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 13 01:10:18.792597 systemd[1620]: Queued start job for default target default.target.
Sep 13 01:10:18.802494 systemd[1620]: Created slice app.slice - User Application Slice.
Sep 13 01:10:18.802539 systemd[1620]: Reached target paths.target - Paths.
Sep 13 01:10:18.802564 systemd[1620]: Reached target timers.target - Timers.
Sep 13 01:10:18.804926 systemd[1620]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 13 01:10:18.831335 systemd[1620]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 13 01:10:18.831584 systemd[1620]: Reached target sockets.target - Sockets.
Sep 13 01:10:18.831611 systemd[1620]: Reached target basic.target - Basic System.
Sep 13 01:10:18.831688 systemd[1620]: Reached target default.target - Main User Target.
Sep 13 01:10:18.831789 systemd[1620]: Startup finished in 176ms.
Sep 13 01:10:18.832092 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 13 01:10:18.841501 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 13 01:10:19.482748 systemd[1]: Started sshd@1-10.244.29.26:22-139.178.68.195:57292.service - OpenSSH per-connection server daemon (139.178.68.195:57292).
Sep 13 01:10:19.753388 login[1583]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Sep 13 01:10:19.765328 login[1584]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Sep 13 01:10:19.767493 systemd-logind[1489]: New session 2 of user core.
Sep 13 01:10:19.774817 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 13 01:10:19.781796 systemd-logind[1489]: New session 3 of user core.
Sep 13 01:10:19.789425 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 13 01:10:20.379909 sshd[1631]: Accepted publickey for core from 139.178.68.195 port 57292 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:20.382379 sshd[1631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:20.392339 systemd-logind[1489]: New session 4 of user core.
Sep 13 01:10:20.397416 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 13 01:10:20.502198 coreos-metadata[1479]: Sep 13 01:10:20.501 WARN failed to locate config-drive, using the metadata service API instead
Sep 13 01:10:20.719399 coreos-metadata[1479]: Sep 13 01:10:20.719 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Sep 13 01:10:20.726349 coreos-metadata[1479]: Sep 13 01:10:20.726 INFO Fetch failed with 404: resource not found
Sep 13 01:10:20.726349 coreos-metadata[1479]: Sep 13 01:10:20.726 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Sep 13 01:10:20.727004 coreos-metadata[1479]: Sep 13 01:10:20.726 INFO Fetch successful
Sep 13 01:10:20.727136 coreos-metadata[1479]: Sep 13 01:10:20.727 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Sep 13 01:10:20.745738 coreos-metadata[1479]: Sep 13 01:10:20.745 INFO Fetch successful
Sep 13 01:10:20.745738 coreos-metadata[1479]: Sep 13 01:10:20.745 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Sep 13 01:10:20.762450 coreos-metadata[1479]: Sep 13 01:10:20.762 INFO Fetch successful
Sep 13 01:10:20.762450 coreos-metadata[1479]: Sep 13 01:10:20.762 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Sep 13 01:10:20.785595 coreos-metadata[1479]: Sep 13 01:10:20.785 INFO Fetch successful
Sep 13 01:10:20.785595 coreos-metadata[1479]: Sep 13 01:10:20.785 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Sep 13 01:10:20.811597 coreos-metadata[1479]: Sep 13 01:10:20.811 INFO Fetch successful
Sep 13 01:10:20.849856 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 13 01:10:20.852328 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 13 01:10:21.002714 sshd[1631]: pam_unix(sshd:session): session closed for user core
Sep 13 01:10:21.008096 systemd[1]: sshd@1-10.244.29.26:22-139.178.68.195:57292.service: Deactivated successfully.
Sep 13 01:10:21.011146 systemd[1]: session-4.scope: Deactivated successfully.
Sep 13 01:10:21.012687 systemd-logind[1489]: Session 4 logged out. Waiting for processes to exit.
Sep 13 01:10:21.014618 systemd-logind[1489]: Removed session 4.
Sep 13 01:10:21.171078 systemd[1]: Started sshd@2-10.244.29.26:22-139.178.68.195:55632.service - OpenSSH per-connection server daemon (139.178.68.195:55632).
Sep 13 01:10:21.200612 coreos-metadata[1548]: Sep 13 01:10:21.200 WARN failed to locate config-drive, using the metadata service API instead
Sep 13 01:10:21.223331 coreos-metadata[1548]: Sep 13 01:10:21.223 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Sep 13 01:10:21.251283 coreos-metadata[1548]: Sep 13 01:10:21.251 INFO Fetch successful
Sep 13 01:10:21.251681 coreos-metadata[1548]: Sep 13 01:10:21.251 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Sep 13 01:10:21.279594 coreos-metadata[1548]: Sep 13 01:10:21.278 INFO Fetch successful
Sep 13 01:10:21.286614 unknown[1548]: wrote ssh authorized keys file for user: core
Sep 13 01:10:21.308821 update-ssh-keys[1675]: Updated "/home/core/.ssh/authorized_keys"
Sep 13 01:10:21.309396 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Sep 13 01:10:21.314746 systemd[1]: Finished sshkeys.service.
Sep 13 01:10:21.316575 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 13 01:10:21.316993 systemd[1]: Startup finished in 1.487s (kernel) + 18.480s (initrd) + 12.127s (userspace) = 32.095s.
Sep 13 01:10:22.063987 sshd[1671]: Accepted publickey for core from 139.178.68.195 port 55632 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:22.066319 sshd[1671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:22.073769 systemd-logind[1489]: New session 5 of user core.
Sep 13 01:10:22.083426 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 13 01:10:22.698839 sshd[1671]: pam_unix(sshd:session): session closed for user core
Sep 13 01:10:22.703030 systemd[1]: sshd@2-10.244.29.26:22-139.178.68.195:55632.service: Deactivated successfully.
Sep 13 01:10:22.705590 systemd[1]: session-5.scope: Deactivated successfully.
Sep 13 01:10:22.707718 systemd-logind[1489]: Session 5 logged out. Waiting for processes to exit.
Sep 13 01:10:22.709462 systemd-logind[1489]: Removed session 5.
Sep 13 01:10:26.969462 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 13 01:10:26.977474 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:10:27.178410 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:10:27.178720 (kubelet)[1691]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:10:27.270094 kubelet[1691]: E0913 01:10:27.269874 1691 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:10:27.274193 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:10:27.274704 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:10:32.844746 systemd[1]: Started sshd@3-10.244.29.26:22-139.178.68.195:34434.service - OpenSSH per-connection server daemon (139.178.68.195:34434).
Sep 13 01:10:33.744049 sshd[1699]: Accepted publickey for core from 139.178.68.195 port 34434 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:33.746507 sshd[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:33.753994 systemd-logind[1489]: New session 6 of user core.
Sep 13 01:10:33.763402 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 13 01:10:34.361702 sshd[1699]: pam_unix(sshd:session): session closed for user core
Sep 13 01:10:34.367382 systemd[1]: sshd@3-10.244.29.26:22-139.178.68.195:34434.service: Deactivated successfully.
Sep 13 01:10:34.369755 systemd[1]: session-6.scope: Deactivated successfully.
Sep 13 01:10:34.370685 systemd-logind[1489]: Session 6 logged out. Waiting for processes to exit.
Sep 13 01:10:34.372480 systemd-logind[1489]: Removed session 6.
Sep 13 01:10:34.524673 systemd[1]: Started sshd@4-10.244.29.26:22-139.178.68.195:34446.service - OpenSSH per-connection server daemon (139.178.68.195:34446).
Sep 13 01:10:35.405789 sshd[1706]: Accepted publickey for core from 139.178.68.195 port 34446 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:35.408026 sshd[1706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:35.414568 systemd-logind[1489]: New session 7 of user core.
Sep 13 01:10:35.421353 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 13 01:10:36.021536 sshd[1706]: pam_unix(sshd:session): session closed for user core
Sep 13 01:10:36.027260 systemd[1]: sshd@4-10.244.29.26:22-139.178.68.195:34446.service: Deactivated successfully.
Sep 13 01:10:36.029948 systemd[1]: session-7.scope: Deactivated successfully.
Sep 13 01:10:36.031043 systemd-logind[1489]: Session 7 logged out. Waiting for processes to exit.
Sep 13 01:10:36.032887 systemd-logind[1489]: Removed session 7.
Sep 13 01:10:36.176719 systemd[1]: Started sshd@5-10.244.29.26:22-139.178.68.195:34456.service - OpenSSH per-connection server daemon (139.178.68.195:34456).
Sep 13 01:10:37.082351 sshd[1713]: Accepted publickey for core from 139.178.68.195 port 34456 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:37.084654 sshd[1713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:37.093221 systemd-logind[1489]: New session 8 of user core.
Sep 13 01:10:37.100508 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 13 01:10:37.525032 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 13 01:10:37.532478 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:10:37.706779 sshd[1713]: pam_unix(sshd:session): session closed for user core
Sep 13 01:10:37.714599 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:10:37.716019 systemd[1]: sshd@5-10.244.29.26:22-139.178.68.195:34456.service: Deactivated successfully.
Sep 13 01:10:37.716706 (kubelet)[1724]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:10:37.720926 systemd[1]: session-8.scope: Deactivated successfully.
Sep 13 01:10:37.723736 systemd-logind[1489]: Session 8 logged out. Waiting for processes to exit.
Sep 13 01:10:37.726429 systemd-logind[1489]: Removed session 8.
Sep 13 01:10:37.843071 kubelet[1724]: E0913 01:10:37.842836 1724 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:10:37.846533 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:10:37.847026 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:10:37.866767 systemd[1]: Started sshd@6-10.244.29.26:22-139.178.68.195:34470.service - OpenSSH per-connection server daemon (139.178.68.195:34470).
Sep 13 01:10:38.761672 sshd[1735]: Accepted publickey for core from 139.178.68.195 port 34470 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:38.763916 sshd[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:38.770517 systemd-logind[1489]: New session 9 of user core.
Sep 13 01:10:38.786541 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 13 01:10:39.256271 sudo[1738]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 13 01:10:39.256786 sudo[1738]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 13 01:10:39.276281 sudo[1738]: pam_unix(sudo:session): session closed for user root
Sep 13 01:10:39.421371 sshd[1735]: pam_unix(sshd:session): session closed for user core
Sep 13 01:10:39.427260 systemd[1]: sshd@6-10.244.29.26:22-139.178.68.195:34470.service: Deactivated successfully.
Sep 13 01:10:39.429715 systemd[1]: session-9.scope: Deactivated successfully.
Sep 13 01:10:39.430678 systemd-logind[1489]: Session 9 logged out. Waiting for processes to exit.
Sep 13 01:10:39.432814 systemd-logind[1489]: Removed session 9.
Sep 13 01:10:39.581720 systemd[1]: Started sshd@7-10.244.29.26:22-139.178.68.195:34486.service - OpenSSH per-connection server daemon (139.178.68.195:34486).
Sep 13 01:10:40.469469 sshd[1743]: Accepted publickey for core from 139.178.68.195 port 34486 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:40.471841 sshd[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:40.480209 systemd-logind[1489]: New session 10 of user core.
Sep 13 01:10:40.486494 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 13 01:10:40.950866 sudo[1747]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 13 01:10:40.951688 sudo[1747]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 13 01:10:40.957525 sudo[1747]: pam_unix(sudo:session): session closed for user root
Sep 13 01:10:40.966080 sudo[1746]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Sep 13 01:10:40.966591 sudo[1746]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 13 01:10:40.992653 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Sep 13 01:10:40.995240 auditctl[1750]: No rules
Sep 13 01:10:40.995825 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 13 01:10:40.996185 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Sep 13 01:10:41.000406 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 13 01:10:41.052560 augenrules[1768]: No rules
Sep 13 01:10:41.054395 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 13 01:10:41.057450 sudo[1746]: pam_unix(sudo:session): session closed for user root
Sep 13 01:10:41.201635 sshd[1743]: pam_unix(sshd:session): session closed for user core
Sep 13 01:10:41.207246 systemd[1]: sshd@7-10.244.29.26:22-139.178.68.195:34486.service: Deactivated successfully.
Sep 13 01:10:41.209565 systemd[1]: session-10.scope: Deactivated successfully.
Sep 13 01:10:41.211243 systemd-logind[1489]: Session 10 logged out. Waiting for processes to exit.
Sep 13 01:10:41.212944 systemd-logind[1489]: Removed session 10.
Sep 13 01:10:41.359610 systemd[1]: Started sshd@8-10.244.29.26:22-139.178.68.195:37142.service - OpenSSH per-connection server daemon (139.178.68.195:37142).
Sep 13 01:10:42.243784 sshd[1776]: Accepted publickey for core from 139.178.68.195 port 37142 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:10:42.246158 sshd[1776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:10:42.252934 systemd-logind[1489]: New session 11 of user core.
Sep 13 01:10:42.267519 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 13 01:10:42.721333 sudo[1779]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 13 01:10:42.721824 sudo[1779]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 13 01:10:43.208511 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 13 01:10:43.217666 (dockerd)[1795]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 13 01:10:43.914200 dockerd[1795]: time="2025-09-13T01:10:43.913839431Z" level=info msg="Starting up"
Sep 13 01:10:44.132273 dockerd[1795]: time="2025-09-13T01:10:44.131791997Z" level=info msg="Loading containers: start."
Sep 13 01:10:44.301247 kernel: Initializing XFRM netlink socket
Sep 13 01:10:44.418615 systemd-networkd[1411]: docker0: Link UP
Sep 13 01:10:44.440655 dockerd[1795]: time="2025-09-13T01:10:44.440419685Z" level=info msg="Loading containers: done."
Sep 13 01:10:44.467911 dockerd[1795]: time="2025-09-13T01:10:44.467841373Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 13 01:10:44.468208 dockerd[1795]: time="2025-09-13T01:10:44.467998541Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Sep 13 01:10:44.468289 dockerd[1795]: time="2025-09-13T01:10:44.468234889Z" level=info msg="Daemon has completed initialization"
Sep 13 01:10:44.469178 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4014569523-merged.mount: Deactivated successfully.
Sep 13 01:10:44.520770 dockerd[1795]: time="2025-09-13T01:10:44.520676597Z" level=info msg="API listen on /run/docker.sock"
Sep 13 01:10:44.522571 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 13 01:10:45.917500 containerd[1507]: time="2025-09-13T01:10:45.916526957Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\""
Sep 13 01:10:46.526892 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 13 01:10:46.981428 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2768994222.mount: Deactivated successfully.
Sep 13 01:10:48.097424 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 13 01:10:48.105356 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:10:48.312332 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:10:48.326603 (kubelet)[2007]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:10:48.433222 kubelet[2007]: E0913 01:10:48.432280 2007 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:10:48.436338 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:10:48.436670 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:10:49.734200 containerd[1507]: time="2025-09-13T01:10:49.733062020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:49.735680 containerd[1507]: time="2025-09-13T01:10:49.735633272Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28837924"
Sep 13 01:10:49.736976 containerd[1507]: time="2025-09-13T01:10:49.736936133Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:49.751283 containerd[1507]: time="2025-09-13T01:10:49.751211945Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:49.752833 containerd[1507]: time="2025-09-13T01:10:49.752788287Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 3.836136154s"
Sep 13 01:10:49.752931 containerd[1507]: time="2025-09-13T01:10:49.752854208Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\""
Sep 13 01:10:49.754949 containerd[1507]: time="2025-09-13T01:10:49.754898731Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\""
Sep 13 01:10:52.288972 containerd[1507]: time="2025-09-13T01:10:52.288757002Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:52.291781 containerd[1507]: time="2025-09-13T01:10:52.291434799Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787035"
Sep 13 01:10:52.292742 containerd[1507]: time="2025-09-13T01:10:52.292694599Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:52.307694 containerd[1507]: time="2025-09-13T01:10:52.307602363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:52.311853 containerd[1507]: time="2025-09-13T01:10:52.311806058Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 2.55685802s"
Sep 13 01:10:52.313913 containerd[1507]: time="2025-09-13T01:10:52.312017029Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\""
Sep 13 01:10:52.314447 containerd[1507]: time="2025-09-13T01:10:52.314307533Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\""
Sep 13 01:10:54.542143 containerd[1507]: time="2025-09-13T01:10:54.540288144Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:54.543252 containerd[1507]: time="2025-09-13T01:10:54.543173452Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176297"
Sep 13 01:10:54.545166 containerd[1507]: time="2025-09-13T01:10:54.543525036Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:54.549395 containerd[1507]: time="2025-09-13T01:10:54.549303071Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:54.552560 containerd[1507]: time="2025-09-13T01:10:54.551046528Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 2.236427632s"
Sep 13 01:10:54.552560 containerd[1507]: time="2025-09-13T01:10:54.551097464Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\""
Sep 13 01:10:54.552987 containerd[1507]: time="2025-09-13T01:10:54.552920918Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\""
Sep 13 01:10:56.753613 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1031535153.mount: Deactivated successfully.
Sep 13 01:10:57.806858 containerd[1507]: time="2025-09-13T01:10:57.805671287Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:57.806858 containerd[1507]: time="2025-09-13T01:10:57.806792311Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924214"
Sep 13 01:10:57.807674 containerd[1507]: time="2025-09-13T01:10:57.807632422Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:57.811451 containerd[1507]: time="2025-09-13T01:10:57.811405720Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:10:57.812434 containerd[1507]: time="2025-09-13T01:10:57.812389114Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 3.259417879s"
Sep 13 01:10:57.812548 containerd[1507]: time="2025-09-13T01:10:57.812442519Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\""
Sep 13 01:10:57.813769 containerd[1507]: time="2025-09-13T01:10:57.813638070Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 13 01:10:58.564066 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 13 01:10:58.579477 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:10:58.817367 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:10:58.827810 (kubelet)[2039]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:10:58.931568 kubelet[2039]: E0913 01:10:58.931308 2039 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:10:58.933726 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:10:58.934012 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:10:59.006995 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2300566721.mount: Deactivated successfully.
Sep 13 01:10:59.223688 update_engine[1490]: I20250913 01:10:59.222338 1490 update_attempter.cc:509] Updating boot flags...
Sep 13 01:10:59.323191 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (2065)
Sep 13 01:11:00.827984 containerd[1507]: time="2025-09-13T01:11:00.827868216Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:00.831473 containerd[1507]: time="2025-09-13T01:11:00.831420120Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249"
Sep 13 01:11:00.831644 containerd[1507]: time="2025-09-13T01:11:00.831606430Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:00.837129 containerd[1507]: time="2025-09-13T01:11:00.837063560Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:00.838940 containerd[1507]: time="2025-09-13T01:11:00.838888352Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 3.025197078s"
Sep 13 01:11:00.839035 containerd[1507]: time="2025-09-13T01:11:00.838946664Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 13 01:11:00.872162 containerd[1507]: time="2025-09-13T01:11:00.871354720Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 13 01:11:01.556518 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2971539728.mount: Deactivated successfully.
Sep 13 01:11:01.564462 containerd[1507]: time="2025-09-13T01:11:01.564370535Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:01.566975 containerd[1507]: time="2025-09-13T01:11:01.566706317Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
Sep 13 01:11:01.569134 containerd[1507]: time="2025-09-13T01:11:01.567830376Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:01.571574 containerd[1507]: time="2025-09-13T01:11:01.571536397Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:01.573038 containerd[1507]: time="2025-09-13T01:11:01.572995146Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 701.509332ms"
Sep 13 01:11:01.573188 containerd[1507]: time="2025-09-13T01:11:01.573051724Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 13 01:11:01.573933 containerd[1507]: time="2025-09-13T01:11:01.573887619Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 13 01:11:02.320336 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1735956457.mount: Deactivated successfully.
Sep 13 01:11:08.470071 containerd[1507]: time="2025-09-13T01:11:08.469961123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:08.472988 containerd[1507]: time="2025-09-13T01:11:08.471661229Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682064"
Sep 13 01:11:08.472988 containerd[1507]: time="2025-09-13T01:11:08.472915798Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:08.477897 containerd[1507]: time="2025-09-13T01:11:08.477836860Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:08.480198 containerd[1507]: time="2025-09-13T01:11:08.479943562Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 6.906013844s"
Sep 13 01:11:08.480198 containerd[1507]: time="2025-09-13T01:11:08.479997404Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Sep 13 01:11:09.080141 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 13 01:11:09.095515 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:11:09.415496 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:11:09.442580 (kubelet)[2187]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:11:09.560940 kubelet[2187]: E0913 01:11:09.560011 2187 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:11:09.564580 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:11:09.564879 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:11:13.496195 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:11:13.513197 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:11:13.556199 systemd[1]: Reloading requested from client PID 2213 ('systemctl') (unit session-11.scope)...
Sep 13 01:11:13.556525 systemd[1]: Reloading...
Sep 13 01:11:13.787150 zram_generator::config[2248]: No configuration found.
Sep 13 01:11:13.976811 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 01:11:14.092468 systemd[1]: Reloading finished in 535 ms.
Sep 13 01:11:14.182259 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:11:14.188302 systemd[1]: kubelet.service: Deactivated successfully.
Sep 13 01:11:14.188730 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:11:14.199559 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:11:14.457285 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:11:14.470662 (kubelet)[2321]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 13 01:11:14.538715 kubelet[2321]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 13 01:11:14.538715 kubelet[2321]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 13 01:11:14.538715 kubelet[2321]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 13 01:11:14.539495 kubelet[2321]: I0913 01:11:14.538863 2321 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 13 01:11:15.602276 kubelet[2321]: I0913 01:11:15.602215 2321 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 13 01:11:15.602276 kubelet[2321]: I0913 01:11:15.602265 2321 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 13 01:11:15.602917 kubelet[2321]: I0913 01:11:15.602656 2321 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 13 01:11:15.642132 kubelet[2321]: E0913 01:11:15.640704 2321 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.244.29.26:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError"
Sep 13 01:11:15.645522 kubelet[2321]: I0913 01:11:15.645490 2321 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 13 01:11:15.661068 kubelet[2321]: E0913 01:11:15.661016 2321 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 13 01:11:15.661068 kubelet[2321]: I0913 01:11:15.661069 2321 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 13 01:11:15.668382 kubelet[2321]: I0913 01:11:15.668343 2321 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 13 01:11:15.673291 kubelet[2321]: I0913 01:11:15.673182 2321 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 13 01:11:15.673551 kubelet[2321]: I0913 01:11:15.673250 2321 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-5asmg.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 13 01:11:15.675315 kubelet[2321]: I0913 01:11:15.675278 2321 topology_manager.go:138] "Creating topology manager with none policy"
Sep 13 01:11:15.675315 kubelet[2321]: I0913 01:11:15.675312 2321 container_manager_linux.go:304] "Creating device plugin manager"
Sep 13 01:11:15.676727 kubelet[2321]: I0913 01:11:15.676663 2321 state_mem.go:36] "Initialized new in-memory state store"
Sep 13 01:11:15.681205 kubelet[2321]: I0913 01:11:15.681146 2321 kubelet.go:446] "Attempting to sync node with API server"
Sep 13 01:11:15.681326 kubelet[2321]: I0913 01:11:15.681209 2321 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 13 01:11:15.681326 kubelet[2321]: I0913 01:11:15.681258 2321 kubelet.go:352] "Adding apiserver pod source"
Sep 13 01:11:15.681326 kubelet[2321]: I0913 01:11:15.681292 2321 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 13 01:11:15.683660 kubelet[2321]: W0913 01:11:15.683584 2321 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.244.29.26:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-5asmg.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.244.29.26:6443: connect: connection refused
Sep 13 01:11:15.683813 kubelet[2321]: E0913 01:11:15.683780 2321 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.244.29.26:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-5asmg.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError"
Sep 13 01:11:15.684329 kubelet[2321]: W0913 01:11:15.684285 2321 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.244.29.26:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.244.29.26:6443: connect: connection refused
Sep 13 01:11:15.684509 kubelet[2321]: E0913 01:11:15.684478 2321 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.244.29.26:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError"
Sep 13 01:11:15.686622 kubelet[2321]: I0913 01:11:15.686579 2321 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 13 01:11:15.690065 kubelet[2321]: I0913 01:11:15.690037 2321 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 13 01:11:15.691060 kubelet[2321]: W0913 01:11:15.691032 2321 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 13 01:11:15.695733 kubelet[2321]: I0913 01:11:15.695427 2321 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 13 01:11:15.695733 kubelet[2321]: I0913 01:11:15.695493 2321 server.go:1287] "Started kubelet"
Sep 13 01:11:15.698897 kubelet[2321]: I0913 01:11:15.697990 2321 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 13 01:11:15.700046 kubelet[2321]: I0913 01:11:15.699549 2321 server.go:479] "Adding debug handlers to kubelet server"
Sep 13 01:11:15.702477 kubelet[2321]: I0913 01:11:15.702401 2321 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 13 01:11:15.703029 kubelet[2321]: I0913 01:11:15.703003 2321 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 13 01:11:15.708139 kubelet[2321]: I0913 01:11:15.707945 2321 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 13 01:11:15.714678 kubelet[2321]: I0913 01:11:15.713899 2321 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 13 01:11:15.721026 kubelet[2321]: I0913 01:11:15.720974 2321 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 13 01:11:15.721278 kubelet[2321]: E0913 01:11:15.721239 2321 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-5asmg.gb1.brightbox.com\" not found"
Sep 13 01:11:15.722439 kubelet[2321]: I0913 01:11:15.722410 2321 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 13 01:11:15.722552 kubelet[2321]: I0913 01:11:15.722513 2321 reconciler.go:26] "Reconciler: start to sync state"
Sep 13 01:11:15.725829 kubelet[2321]: E0913 01:11:15.704363 2321 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.244.29.26:6443/api/v1/namespaces/default/events\": dial tcp 10.244.29.26:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-5asmg.gb1.brightbox.com.1864b25f01571d76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-5asmg.gb1.brightbox.com,UID:srv-5asmg.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-5asmg.gb1.brightbox.com,},FirstTimestamp:2025-09-13 01:11:15.695459702 +0000 UTC m=+1.218870846,LastTimestamp:2025-09-13 01:11:15.695459702 +0000 UTC m=+1.218870846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-5asmg.gb1.brightbox.com,}"
Sep 13 01:11:15.726707 kubelet[2321]: W0913 01:11:15.726642 2321 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.244.29.26:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.244.29.26:6443: connect: connection refused
Sep 13 01:11:15.726788 kubelet[2321]: E0913 01:11:15.726725 2321 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.244.29.26:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError"
Sep 13 01:11:15.726973 kubelet[2321]: E0913 01:11:15.726901 2321 controller.go:145] "Failed to ensure lease
exists, will retry" err="Get \"https://10.244.29.26:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-5asmg.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.29.26:6443: connect: connection refused" interval="200ms" Sep 13 01:11:15.731146 kubelet[2321]: I0913 01:11:15.731087 2321 factory.go:221] Registration of the systemd container factory successfully Sep 13 01:11:15.731435 kubelet[2321]: I0913 01:11:15.731370 2321 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 13 01:11:15.739965 kubelet[2321]: I0913 01:11:15.739923 2321 factory.go:221] Registration of the containerd container factory successfully Sep 13 01:11:15.760566 kubelet[2321]: I0913 01:11:15.760474 2321 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 13 01:11:15.773859 kubelet[2321]: I0913 01:11:15.773532 2321 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 13 01:11:15.774116 kubelet[2321]: I0913 01:11:15.774074 2321 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 13 01:11:15.775328 kubelet[2321]: I0913 01:11:15.775303 2321 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 13 01:11:15.775328 kubelet[2321]: I0913 01:11:15.775328 2321 kubelet.go:2382] "Starting kubelet main sync loop" Sep 13 01:11:15.775484 kubelet[2321]: E0913 01:11:15.775415 2321 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 13 01:11:15.776678 kubelet[2321]: W0913 01:11:15.776634 2321 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.244.29.26:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.244.29.26:6443: connect: connection refused Sep 13 01:11:15.777327 kubelet[2321]: E0913 01:11:15.777187 2321 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.244.29.26:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:11:15.788550 kubelet[2321]: I0913 01:11:15.788520 2321 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 13 01:11:15.788748 kubelet[2321]: I0913 01:11:15.788728 2321 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 13 01:11:15.789166 kubelet[2321]: I0913 01:11:15.788923 2321 state_mem.go:36] "Initialized new in-memory state store" Sep 13 01:11:15.791349 kubelet[2321]: I0913 01:11:15.791317 2321 policy_none.go:49] "None policy: Start" Sep 13 01:11:15.791511 kubelet[2321]: I0913 01:11:15.791492 2321 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 13 01:11:15.791650 kubelet[2321]: I0913 01:11:15.791632 2321 state_mem.go:35] "Initializing new in-memory state store" Sep 13 01:11:15.801661 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Sep 13 01:11:15.813863 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 13 01:11:15.820081 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 13 01:11:15.821708 kubelet[2321]: E0913 01:11:15.821676 2321 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-5asmg.gb1.brightbox.com\" not found" Sep 13 01:11:15.829901 kubelet[2321]: I0913 01:11:15.829857 2321 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 13 01:11:15.833307 kubelet[2321]: I0913 01:11:15.832437 2321 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 13 01:11:15.833307 kubelet[2321]: I0913 01:11:15.832478 2321 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 13 01:11:15.833307 kubelet[2321]: I0913 01:11:15.833008 2321 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 13 01:11:15.835989 kubelet[2321]: E0913 01:11:15.835962 2321 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 13 01:11:15.851307 kubelet[2321]: E0913 01:11:15.836194 2321 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-5asmg.gb1.brightbox.com\" not found" Sep 13 01:11:15.856304 kubelet[2321]: E0913 01:11:15.855448 2321 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.244.29.26:6443/api/v1/namespaces/default/events\": dial tcp 10.244.29.26:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-5asmg.gb1.brightbox.com.1864b25f01571d76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-5asmg.gb1.brightbox.com,UID:srv-5asmg.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-5asmg.gb1.brightbox.com,},FirstTimestamp:2025-09-13 01:11:15.695459702 +0000 UTC m=+1.218870846,LastTimestamp:2025-09-13 01:11:15.695459702 +0000 UTC m=+1.218870846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-5asmg.gb1.brightbox.com,}" Sep 13 01:11:15.893579 systemd[1]: Created slice kubepods-burstable-pod3549699fc9435a80436f9ad35ce434c5.slice - libcontainer container kubepods-burstable-pod3549699fc9435a80436f9ad35ce434c5.slice. Sep 13 01:11:15.914743 kubelet[2321]: E0913 01:11:15.914490 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:15.920702 systemd[1]: Created slice kubepods-burstable-podedab042cb2fe2c97a2d64ae55efa07b8.slice - libcontainer container kubepods-burstable-podedab042cb2fe2c97a2d64ae55efa07b8.slice. 
Sep 13 01:11:15.923410 kubelet[2321]: E0913 01:11:15.923261 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:15.928344 kubelet[2321]: E0913 01:11:15.928291 2321 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.29.26:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-5asmg.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.29.26:6443: connect: connection refused" interval="400ms" Sep 13 01:11:15.934054 systemd[1]: Created slice kubepods-burstable-pod9633c103eafc94ce88cfef31f23b3422.slice - libcontainer container kubepods-burstable-pod9633c103eafc94ce88cfef31f23b3422.slice. Sep 13 01:11:15.937053 kubelet[2321]: E0913 01:11:15.936698 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:15.939676 kubelet[2321]: I0913 01:11:15.939654 2321 kubelet_node_status.go:75] "Attempting to register node" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:15.940246 kubelet[2321]: E0913 01:11:15.940213 2321 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.29.26:6443/api/v1/nodes\": dial tcp 10.244.29.26:6443: connect: connection refused" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.023868 kubelet[2321]: I0913 01:11:16.023711 2321 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-k8s-certs\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.023868 kubelet[2321]: I0913 01:11:16.023784 2321 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-kubeconfig\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.023868 kubelet[2321]: I0913 01:11:16.023876 2321 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3549699fc9435a80436f9ad35ce434c5-ca-certs\") pod \"kube-apiserver-srv-5asmg.gb1.brightbox.com\" (UID: \"3549699fc9435a80436f9ad35ce434c5\") " pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.024263 kubelet[2321]: I0913 01:11:16.023962 2321 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3549699fc9435a80436f9ad35ce434c5-k8s-certs\") pod \"kube-apiserver-srv-5asmg.gb1.brightbox.com\" (UID: \"3549699fc9435a80436f9ad35ce434c5\") " pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.024263 kubelet[2321]: I0913 01:11:16.024040 2321 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3549699fc9435a80436f9ad35ce434c5-usr-share-ca-certificates\") pod \"kube-apiserver-srv-5asmg.gb1.brightbox.com\" (UID: \"3549699fc9435a80436f9ad35ce434c5\") " pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.024263 kubelet[2321]: I0913 01:11:16.024129 2321 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-ca-certs\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: 
\"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.024263 kubelet[2321]: I0913 01:11:16.024168 2321 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-flexvolume-dir\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.024263 kubelet[2321]: I0913 01:11:16.024254 2321 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.024497 kubelet[2321]: I0913 01:11:16.024327 2321 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9633c103eafc94ce88cfef31f23b3422-kubeconfig\") pod \"kube-scheduler-srv-5asmg.gb1.brightbox.com\" (UID: \"9633c103eafc94ce88cfef31f23b3422\") " pod="kube-system/kube-scheduler-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.143638 kubelet[2321]: I0913 01:11:16.143417 2321 kubelet_node_status.go:75] "Attempting to register node" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.143917 kubelet[2321]: E0913 01:11:16.143858 2321 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.29.26:6443/api/v1/nodes\": dial tcp 10.244.29.26:6443: connect: connection refused" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.217419 containerd[1507]: time="2025-09-13T01:11:16.216901890Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-srv-5asmg.gb1.brightbox.com,Uid:3549699fc9435a80436f9ad35ce434c5,Namespace:kube-system,Attempt:0,}" Sep 13 01:11:16.225067 containerd[1507]: time="2025-09-13T01:11:16.225002319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-5asmg.gb1.brightbox.com,Uid:edab042cb2fe2c97a2d64ae55efa07b8,Namespace:kube-system,Attempt:0,}" Sep 13 01:11:16.239215 containerd[1507]: time="2025-09-13T01:11:16.239085792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-5asmg.gb1.brightbox.com,Uid:9633c103eafc94ce88cfef31f23b3422,Namespace:kube-system,Attempt:0,}" Sep 13 01:11:16.329469 kubelet[2321]: E0913 01:11:16.329336 2321 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.29.26:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-5asmg.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.29.26:6443: connect: connection refused" interval="800ms" Sep 13 01:11:16.548255 kubelet[2321]: I0913 01:11:16.547761 2321 kubelet_node_status.go:75] "Attempting to register node" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.548512 kubelet[2321]: E0913 01:11:16.548464 2321 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.29.26:6443/api/v1/nodes\": dial tcp 10.244.29.26:6443: connect: connection refused" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:16.915606 kubelet[2321]: W0913 01:11:16.915439 2321 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.244.29.26:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-5asmg.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.244.29.26:6443: connect: connection refused Sep 13 01:11:16.915606 kubelet[2321]: E0913 01:11:16.915549 2321 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://10.244.29.26:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-5asmg.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:11:17.000250 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1084635805.mount: Deactivated successfully. Sep 13 01:11:17.006526 kubelet[2321]: W0913 01:11:17.006318 2321 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.244.29.26:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.244.29.26:6443: connect: connection refused Sep 13 01:11:17.006526 kubelet[2321]: E0913 01:11:17.006414 2321 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.244.29.26:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:11:17.010965 containerd[1507]: time="2025-09-13T01:11:17.010863063Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:11:17.012467 containerd[1507]: time="2025-09-13T01:11:17.012401038Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:11:17.013076 containerd[1507]: time="2025-09-13T01:11:17.012922986Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 13 01:11:17.014055 containerd[1507]: time="2025-09-13T01:11:17.014003648Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Sep 13 01:11:17.015235 containerd[1507]: 
time="2025-09-13T01:11:17.015191528Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:11:17.016806 containerd[1507]: time="2025-09-13T01:11:17.016748513Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 13 01:11:17.020597 containerd[1507]: time="2025-09-13T01:11:17.020559411Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:11:17.023041 containerd[1507]: time="2025-09-13T01:11:17.022378619Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 805.341171ms" Sep 13 01:11:17.026272 containerd[1507]: time="2025-09-13T01:11:17.026221320Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 801.112887ms" Sep 13 01:11:17.027148 containerd[1507]: time="2025-09-13T01:11:17.026482564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:11:17.030453 containerd[1507]: time="2025-09-13T01:11:17.030389018Z" level=info msg="Pulled image 
\"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 791.06905ms" Sep 13 01:11:17.064486 kubelet[2321]: W0913 01:11:17.064319 2321 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.244.29.26:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.244.29.26:6443: connect: connection refused Sep 13 01:11:17.064486 kubelet[2321]: E0913 01:11:17.064429 2321 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.244.29.26:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:11:17.131917 kubelet[2321]: E0913 01:11:17.131819 2321 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.29.26:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-5asmg.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.29.26:6443: connect: connection refused" interval="1.6s" Sep 13 01:11:17.224665 kubelet[2321]: W0913 01:11:17.223648 2321 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.244.29.26:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.244.29.26:6443: connect: connection refused Sep 13 01:11:17.224665 kubelet[2321]: E0913 01:11:17.223758 2321 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.244.29.26:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial 
tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:11:17.261054 containerd[1507]: time="2025-09-13T01:11:17.260252548Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:11:17.261054 containerd[1507]: time="2025-09-13T01:11:17.260336673Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:11:17.261054 containerd[1507]: time="2025-09-13T01:11:17.260362342Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:17.261054 containerd[1507]: time="2025-09-13T01:11:17.260475333Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:17.264505 containerd[1507]: time="2025-09-13T01:11:17.264365618Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:11:17.264613 containerd[1507]: time="2025-09-13T01:11:17.264493770Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:11:17.264613 containerd[1507]: time="2025-09-13T01:11:17.264520577Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:17.264762 containerd[1507]: time="2025-09-13T01:11:17.264628693Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:17.274915 containerd[1507]: time="2025-09-13T01:11:17.274776806Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:11:17.279767 containerd[1507]: time="2025-09-13T01:11:17.279324014Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:11:17.279767 containerd[1507]: time="2025-09-13T01:11:17.279403645Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:17.279767 containerd[1507]: time="2025-09-13T01:11:17.279546230Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:17.354523 kubelet[2321]: I0913 01:11:17.354475 2321 kubelet_node_status.go:75] "Attempting to register node" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:17.357326 kubelet[2321]: E0913 01:11:17.357246 2321 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.29.26:6443/api/v1/nodes\": dial tcp 10.244.29.26:6443: connect: connection refused" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:17.366180 systemd[1]: Started cri-containerd-b0f4ee0d82b967fd2eecf5fbd2b204196e7e9b82cd967185247c590c763fc500.scope - libcontainer container b0f4ee0d82b967fd2eecf5fbd2b204196e7e9b82cd967185247c590c763fc500. Sep 13 01:11:17.373061 systemd[1]: Started cri-containerd-c14898964d20c121ac5c37e8efcf4520d4548d83296288237a1a3e9a4e6b9e48.scope - libcontainer container c14898964d20c121ac5c37e8efcf4520d4548d83296288237a1a3e9a4e6b9e48. Sep 13 01:11:17.417412 systemd[1]: Started cri-containerd-ec1007870eed88b2dcd4c30550a8dfa124aaf3e5a3c0e8bebe6dc3596e8b2ca3.scope - libcontainer container ec1007870eed88b2dcd4c30550a8dfa124aaf3e5a3c0e8bebe6dc3596e8b2ca3. 
Sep 13 01:11:17.521607 containerd[1507]: time="2025-09-13T01:11:17.521435093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-5asmg.gb1.brightbox.com,Uid:edab042cb2fe2c97a2d64ae55efa07b8,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0f4ee0d82b967fd2eecf5fbd2b204196e7e9b82cd967185247c590c763fc500\"" Sep 13 01:11:17.526198 containerd[1507]: time="2025-09-13T01:11:17.525522089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-5asmg.gb1.brightbox.com,Uid:3549699fc9435a80436f9ad35ce434c5,Namespace:kube-system,Attempt:0,} returns sandbox id \"c14898964d20c121ac5c37e8efcf4520d4548d83296288237a1a3e9a4e6b9e48\"" Sep 13 01:11:17.540139 containerd[1507]: time="2025-09-13T01:11:17.538236681Z" level=info msg="CreateContainer within sandbox \"b0f4ee0d82b967fd2eecf5fbd2b204196e7e9b82cd967185247c590c763fc500\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 13 01:11:17.540139 containerd[1507]: time="2025-09-13T01:11:17.538510943Z" level=info msg="CreateContainer within sandbox \"c14898964d20c121ac5c37e8efcf4520d4548d83296288237a1a3e9a4e6b9e48\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 13 01:11:17.570849 containerd[1507]: time="2025-09-13T01:11:17.570497012Z" level=info msg="CreateContainer within sandbox \"b0f4ee0d82b967fd2eecf5fbd2b204196e7e9b82cd967185247c590c763fc500\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"9fd689204d6ad8a5379da03e5164dcf0c0364ec826f33ae24c641429dbefbd9b\"" Sep 13 01:11:17.574176 containerd[1507]: time="2025-09-13T01:11:17.573624839Z" level=info msg="StartContainer for \"9fd689204d6ad8a5379da03e5164dcf0c0364ec826f33ae24c641429dbefbd9b\"" Sep 13 01:11:17.575880 containerd[1507]: time="2025-09-13T01:11:17.575844514Z" level=info msg="CreateContainer within sandbox \"c14898964d20c121ac5c37e8efcf4520d4548d83296288237a1a3e9a4e6b9e48\" for 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"8fa9d02c3707e277d8856ec00bec47fab3cd3868e9fbfea82464811cd1dea2f3\"" Sep 13 01:11:17.576218 containerd[1507]: time="2025-09-13T01:11:17.576185519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-5asmg.gb1.brightbox.com,Uid:9633c103eafc94ce88cfef31f23b3422,Namespace:kube-system,Attempt:0,} returns sandbox id \"ec1007870eed88b2dcd4c30550a8dfa124aaf3e5a3c0e8bebe6dc3596e8b2ca3\"" Sep 13 01:11:17.577659 containerd[1507]: time="2025-09-13T01:11:17.577599742Z" level=info msg="StartContainer for \"8fa9d02c3707e277d8856ec00bec47fab3cd3868e9fbfea82464811cd1dea2f3\"" Sep 13 01:11:17.581410 containerd[1507]: time="2025-09-13T01:11:17.581371948Z" level=info msg="CreateContainer within sandbox \"ec1007870eed88b2dcd4c30550a8dfa124aaf3e5a3c0e8bebe6dc3596e8b2ca3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 13 01:11:17.611605 containerd[1507]: time="2025-09-13T01:11:17.611527694Z" level=info msg="CreateContainer within sandbox \"ec1007870eed88b2dcd4c30550a8dfa124aaf3e5a3c0e8bebe6dc3596e8b2ca3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"166dd9e28e21be2815a5a9d9f170f04cc18cc5b3e6b1c28a3114aa70b5db20a7\"" Sep 13 01:11:17.612865 containerd[1507]: time="2025-09-13T01:11:17.612761655Z" level=info msg="StartContainer for \"166dd9e28e21be2815a5a9d9f170f04cc18cc5b3e6b1c28a3114aa70b5db20a7\"" Sep 13 01:11:17.634484 systemd[1]: Started cri-containerd-8fa9d02c3707e277d8856ec00bec47fab3cd3868e9fbfea82464811cd1dea2f3.scope - libcontainer container 8fa9d02c3707e277d8856ec00bec47fab3cd3868e9fbfea82464811cd1dea2f3. Sep 13 01:11:17.647376 systemd[1]: Started cri-containerd-9fd689204d6ad8a5379da03e5164dcf0c0364ec826f33ae24c641429dbefbd9b.scope - libcontainer container 9fd689204d6ad8a5379da03e5164dcf0c0364ec826f33ae24c641429dbefbd9b. 
Sep 13 01:11:17.673685 kubelet[2321]: E0913 01:11:17.673400 2321 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.244.29.26:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.29.26:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:11:17.700370 systemd[1]: Started cri-containerd-166dd9e28e21be2815a5a9d9f170f04cc18cc5b3e6b1c28a3114aa70b5db20a7.scope - libcontainer container 166dd9e28e21be2815a5a9d9f170f04cc18cc5b3e6b1c28a3114aa70b5db20a7. Sep 13 01:11:17.783463 containerd[1507]: time="2025-09-13T01:11:17.783276266Z" level=info msg="StartContainer for \"8fa9d02c3707e277d8856ec00bec47fab3cd3868e9fbfea82464811cd1dea2f3\" returns successfully" Sep 13 01:11:17.795198 kubelet[2321]: E0913 01:11:17.795151 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:17.817248 containerd[1507]: time="2025-09-13T01:11:17.817182914Z" level=info msg="StartContainer for \"9fd689204d6ad8a5379da03e5164dcf0c0364ec826f33ae24c641429dbefbd9b\" returns successfully" Sep 13 01:11:17.826372 containerd[1507]: time="2025-09-13T01:11:17.826299088Z" level=info msg="StartContainer for \"166dd9e28e21be2815a5a9d9f170f04cc18cc5b3e6b1c28a3114aa70b5db20a7\" returns successfully" Sep 13 01:11:18.815322 kubelet[2321]: E0913 01:11:18.815269 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:18.818867 kubelet[2321]: E0913 01:11:18.818824 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" 
node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:18.819420 kubelet[2321]: E0913 01:11:18.819390 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:18.960854 kubelet[2321]: I0913 01:11:18.960802 2321 kubelet_node_status.go:75] "Attempting to register node" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:19.823704 kubelet[2321]: E0913 01:11:19.823655 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:19.825056 kubelet[2321]: E0913 01:11:19.824320 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:20.832173 kubelet[2321]: E0913 01:11:20.829614 2321 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:20.932869 kubelet[2321]: E0913 01:11:20.932810 2321 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-5asmg.gb1.brightbox.com\" not found" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:20.979860 kubelet[2321]: I0913 01:11:20.979806 2321 kubelet_node_status.go:78] "Successfully registered node" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:21.023608 kubelet[2321]: I0913 01:11:21.023543 2321 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:21.084170 kubelet[2321]: E0913 01:11:21.083896 2321 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-5asmg.gb1.brightbox.com\" is forbidden: no PriorityClass with name 
system-node-critical was found" pod="kube-system/kube-scheduler-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:21.086255 kubelet[2321]: I0913 01:11:21.086209 2321 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:21.091716 kubelet[2321]: E0913 01:11:21.091659 2321 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-5asmg.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:21.091716 kubelet[2321]: I0913 01:11:21.091715 2321 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:21.094496 kubelet[2321]: E0913 01:11:21.094466 2321 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:21.688396 kubelet[2321]: I0913 01:11:21.688233 2321 apiserver.go:52] "Watching apiserver" Sep 13 01:11:21.723213 kubelet[2321]: I0913 01:11:21.723081 2321 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 13 01:11:23.464300 systemd[1]: Reloading requested from client PID 2602 ('systemctl') (unit session-11.scope)... Sep 13 01:11:23.464332 systemd[1]: Reloading... Sep 13 01:11:23.591156 zram_generator::config[2642]: No configuration found. Sep 13 01:11:23.794951 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 01:11:23.926703 systemd[1]: Reloading finished in 461 ms. Sep 13 01:11:24.003429 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 13 01:11:24.020035 systemd[1]: kubelet.service: Deactivated successfully. Sep 13 01:11:24.020491 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 01:11:24.020589 systemd[1]: kubelet.service: Consumed 1.853s CPU time, 132.4M memory peak, 0B memory swap peak. Sep 13 01:11:24.028547 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 01:11:24.255399 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 01:11:24.266618 (kubelet)[2705]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 13 01:11:24.384119 kubelet[2705]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 01:11:24.384119 kubelet[2705]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 13 01:11:24.384119 kubelet[2705]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 13 01:11:24.384748 kubelet[2705]: I0913 01:11:24.384164 2705 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 13 01:11:24.405190 kubelet[2705]: I0913 01:11:24.402852 2705 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 13 01:11:24.405190 kubelet[2705]: I0913 01:11:24.402923 2705 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 13 01:11:24.405190 kubelet[2705]: I0913 01:11:24.403625 2705 server.go:954] "Client rotation is on, will bootstrap in background" Sep 13 01:11:24.420704 kubelet[2705]: I0913 01:11:24.420656 2705 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 13 01:11:24.435155 kubelet[2705]: I0913 01:11:24.432827 2705 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 13 01:11:24.469676 kubelet[2705]: E0913 01:11:24.469384 2705 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 13 01:11:24.470435 kubelet[2705]: I0913 01:11:24.470409 2705 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 13 01:11:24.478705 kubelet[2705]: I0913 01:11:24.478666 2705 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 13 01:11:24.480441 kubelet[2705]: I0913 01:11:24.479066 2705 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 13 01:11:24.480729 kubelet[2705]: I0913 01:11:24.480432 2705 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-5asmg.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 13 01:11:24.480937 kubelet[2705]: I0913 01:11:24.480747 2705 topology_manager.go:138] "Creating topology manager 
with none policy" Sep 13 01:11:24.480937 kubelet[2705]: I0913 01:11:24.480766 2705 container_manager_linux.go:304] "Creating device plugin manager" Sep 13 01:11:24.480937 kubelet[2705]: I0913 01:11:24.480843 2705 state_mem.go:36] "Initialized new in-memory state store" Sep 13 01:11:24.485130 kubelet[2705]: I0913 01:11:24.485087 2705 kubelet.go:446] "Attempting to sync node with API server" Sep 13 01:11:24.488157 kubelet[2705]: I0913 01:11:24.486740 2705 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 13 01:11:24.488157 kubelet[2705]: I0913 01:11:24.486795 2705 kubelet.go:352] "Adding apiserver pod source" Sep 13 01:11:24.488157 kubelet[2705]: I0913 01:11:24.486816 2705 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 13 01:11:24.498373 kubelet[2705]: I0913 01:11:24.497855 2705 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 13 01:11:24.499650 kubelet[2705]: I0913 01:11:24.499509 2705 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 13 01:11:24.508352 kubelet[2705]: I0913 01:11:24.508216 2705 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 13 01:11:24.508352 kubelet[2705]: I0913 01:11:24.508273 2705 server.go:1287] "Started kubelet" Sep 13 01:11:24.519735 kubelet[2705]: I0913 01:11:24.519687 2705 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 13 01:11:24.531796 kubelet[2705]: I0913 01:11:24.530441 2705 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 13 01:11:24.541505 kubelet[2705]: I0913 01:11:24.541410 2705 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 13 01:11:24.555584 kubelet[2705]: I0913 01:11:24.542549 2705 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 13 
01:11:24.555584 kubelet[2705]: I0913 01:11:24.543899 2705 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 13 01:11:24.562998 kubelet[2705]: I0913 01:11:24.543914 2705 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 13 01:11:24.562998 kubelet[2705]: E0913 01:11:24.544070 2705 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-5asmg.gb1.brightbox.com\" not found" Sep 13 01:11:24.562998 kubelet[2705]: I0913 01:11:24.545078 2705 server.go:479] "Adding debug handlers to kubelet server" Sep 13 01:11:24.567742 kubelet[2705]: I0913 01:11:24.567702 2705 reconciler.go:26] "Reconciler: start to sync state" Sep 13 01:11:24.573193 kubelet[2705]: I0913 01:11:24.572767 2705 factory.go:221] Registration of the systemd container factory successfully Sep 13 01:11:24.573193 kubelet[2705]: I0913 01:11:24.572942 2705 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 13 01:11:24.592093 kubelet[2705]: I0913 01:11:24.589661 2705 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 13 01:11:24.597613 kubelet[2705]: E0913 01:11:24.597457 2705 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 13 01:11:24.599305 kubelet[2705]: I0913 01:11:24.598779 2705 factory.go:221] Registration of the containerd container factory successfully Sep 13 01:11:24.626715 kubelet[2705]: I0913 01:11:24.626480 2705 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 13 01:11:24.636059 kubelet[2705]: I0913 01:11:24.636009 2705 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 13 01:11:24.636626 kubelet[2705]: I0913 01:11:24.636464 2705 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 13 01:11:24.636626 kubelet[2705]: I0913 01:11:24.636508 2705 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 13 01:11:24.636626 kubelet[2705]: I0913 01:11:24.636523 2705 kubelet.go:2382] "Starting kubelet main sync loop" Sep 13 01:11:24.640615 kubelet[2705]: E0913 01:11:24.639022 2705 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 13 01:11:24.739613 kubelet[2705]: E0913 01:11:24.739546 2705 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 13 01:11:24.770286 kubelet[2705]: I0913 01:11:24.770083 2705 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 13 01:11:24.770689 kubelet[2705]: I0913 01:11:24.770665 2705 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 13 01:11:24.770805 kubelet[2705]: I0913 01:11:24.770789 2705 state_mem.go:36] "Initialized new in-memory state store" Sep 13 01:11:24.771165 kubelet[2705]: I0913 01:11:24.771135 2705 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 13 01:11:24.771311 kubelet[2705]: I0913 01:11:24.771259 2705 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 13 01:11:24.771438 kubelet[2705]: I0913 01:11:24.771416 2705 policy_none.go:49] "None policy: Start" Sep 13 01:11:24.771562 kubelet[2705]: I0913 01:11:24.771538 2705 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 13 01:11:24.771772 kubelet[2705]: I0913 01:11:24.771748 2705 state_mem.go:35] "Initializing new in-memory state store" Sep 13 01:11:24.772097 kubelet[2705]: I0913 01:11:24.772072 2705 state_mem.go:75] "Updated machine memory state" Sep 13 01:11:24.784627 kubelet[2705]: I0913 01:11:24.784545 
2705 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 13 01:11:24.786129 kubelet[2705]: I0913 01:11:24.785311 2705 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 13 01:11:24.786129 kubelet[2705]: I0913 01:11:24.785336 2705 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 13 01:11:24.786129 kubelet[2705]: I0913 01:11:24.786049 2705 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 13 01:11:24.805303 kubelet[2705]: E0913 01:11:24.805257 2705 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 13 01:11:24.924816 kubelet[2705]: I0913 01:11:24.924768 2705 kubelet_node_status.go:75] "Attempting to register node" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.942993 kubelet[2705]: I0913 01:11:24.942198 2705 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.942993 kubelet[2705]: I0913 01:11:24.942335 2705 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.945232 kubelet[2705]: I0913 01:11:24.945211 2705 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.953548 kubelet[2705]: W0913 01:11:24.953225 2705 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 01:11:24.953949 kubelet[2705]: I0913 01:11:24.953922 2705 kubelet_node_status.go:124] "Node was previously registered" node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.954252 kubelet[2705]: I0913 01:11:24.954231 2705 kubelet_node_status.go:78] "Successfully registered node" 
node="srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.959651 kubelet[2705]: W0913 01:11:24.959250 2705 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 01:11:24.966051 kubelet[2705]: W0913 01:11:24.965784 2705 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 01:11:24.970394 kubelet[2705]: I0913 01:11:24.970354 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3549699fc9435a80436f9ad35ce434c5-ca-certs\") pod \"kube-apiserver-srv-5asmg.gb1.brightbox.com\" (UID: \"3549699fc9435a80436f9ad35ce434c5\") " pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.970495 kubelet[2705]: I0913 01:11:24.970403 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3549699fc9435a80436f9ad35ce434c5-usr-share-ca-certificates\") pod \"kube-apiserver-srv-5asmg.gb1.brightbox.com\" (UID: \"3549699fc9435a80436f9ad35ce434c5\") " pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.970495 kubelet[2705]: I0913 01:11:24.970439 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-ca-certs\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.970495 kubelet[2705]: I0913 01:11:24.970471 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-flexvolume-dir\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.970642 kubelet[2705]: I0913 01:11:24.970501 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.970642 kubelet[2705]: I0913 01:11:24.970534 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9633c103eafc94ce88cfef31f23b3422-kubeconfig\") pod \"kube-scheduler-srv-5asmg.gb1.brightbox.com\" (UID: \"9633c103eafc94ce88cfef31f23b3422\") " pod="kube-system/kube-scheduler-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.970642 kubelet[2705]: I0913 01:11:24.970564 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3549699fc9435a80436f9ad35ce434c5-k8s-certs\") pod \"kube-apiserver-srv-5asmg.gb1.brightbox.com\" (UID: \"3549699fc9435a80436f9ad35ce434c5\") " pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:24.970642 kubelet[2705]: I0913 01:11:24.970591 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-k8s-certs\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 
01:11:24.970642 kubelet[2705]: I0913 01:11:24.970622 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/edab042cb2fe2c97a2d64ae55efa07b8-kubeconfig\") pod \"kube-controller-manager-srv-5asmg.gb1.brightbox.com\" (UID: \"edab042cb2fe2c97a2d64ae55efa07b8\") " pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:25.489775 kubelet[2705]: I0913 01:11:25.489700 2705 apiserver.go:52] "Watching apiserver" Sep 13 01:11:25.563242 kubelet[2705]: I0913 01:11:25.563149 2705 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 13 01:11:25.695373 kubelet[2705]: I0913 01:11:25.694034 2705 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:25.695880 kubelet[2705]: I0913 01:11:25.695256 2705 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:25.706391 kubelet[2705]: W0913 01:11:25.706210 2705 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 01:11:25.707189 kubelet[2705]: E0913 01:11:25.706349 2705 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-5asmg.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-5asmg.gb1.brightbox.com" Sep 13 01:11:25.708009 kubelet[2705]: W0913 01:11:25.707429 2705 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 13 01:11:25.708009 kubelet[2705]: E0913 01:11:25.707470 2705 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-5asmg.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" 
Sep 13 01:11:25.771642 kubelet[2705]: I0913 01:11:25.771408 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-5asmg.gb1.brightbox.com" podStartSLOduration=1.771368383 podStartE2EDuration="1.771368383s" podCreationTimestamp="2025-09-13 01:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:11:25.769551596 +0000 UTC m=+1.467399838" watchObservedRunningTime="2025-09-13 01:11:25.771368383 +0000 UTC m=+1.469216616" Sep 13 01:11:25.772184 kubelet[2705]: I0913 01:11:25.771599 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-5asmg.gb1.brightbox.com" podStartSLOduration=1.771589316 podStartE2EDuration="1.771589316s" podCreationTimestamp="2025-09-13 01:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:11:25.755424283 +0000 UTC m=+1.453272517" watchObservedRunningTime="2025-09-13 01:11:25.771589316 +0000 UTC m=+1.469437549" Sep 13 01:11:25.787445 kubelet[2705]: I0913 01:11:25.787344 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-5asmg.gb1.brightbox.com" podStartSLOduration=1.787315322 podStartE2EDuration="1.787315322s" podCreationTimestamp="2025-09-13 01:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:11:25.784249239 +0000 UTC m=+1.482097481" watchObservedRunningTime="2025-09-13 01:11:25.787315322 +0000 UTC m=+1.485163565" Sep 13 01:11:28.989714 kubelet[2705]: I0913 01:11:28.989623 2705 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 13 01:11:28.991764 kubelet[2705]: I0913 01:11:28.991153 2705 kubelet_network.go:61] 
"Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 13 01:11:28.991844 containerd[1507]: time="2025-09-13T01:11:28.990631565Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 13 01:11:29.695485 systemd[1]: Created slice kubepods-besteffort-pod18caa0d6_b056_4041_a128_512b76b67a8b.slice - libcontainer container kubepods-besteffort-pod18caa0d6_b056_4041_a128_512b76b67a8b.slice. Sep 13 01:11:29.700284 kubelet[2705]: I0913 01:11:29.700244 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ff7l\" (UniqueName: \"kubernetes.io/projected/18caa0d6-b056-4041-a128-512b76b67a8b-kube-api-access-7ff7l\") pod \"kube-proxy-hgrbd\" (UID: \"18caa0d6-b056-4041-a128-512b76b67a8b\") " pod="kube-system/kube-proxy-hgrbd" Sep 13 01:11:29.700421 kubelet[2705]: I0913 01:11:29.700301 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/18caa0d6-b056-4041-a128-512b76b67a8b-xtables-lock\") pod \"kube-proxy-hgrbd\" (UID: \"18caa0d6-b056-4041-a128-512b76b67a8b\") " pod="kube-system/kube-proxy-hgrbd" Sep 13 01:11:29.700421 kubelet[2705]: I0913 01:11:29.700341 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18caa0d6-b056-4041-a128-512b76b67a8b-lib-modules\") pod \"kube-proxy-hgrbd\" (UID: \"18caa0d6-b056-4041-a128-512b76b67a8b\") " pod="kube-system/kube-proxy-hgrbd" Sep 13 01:11:29.700421 kubelet[2705]: I0913 01:11:29.700372 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/18caa0d6-b056-4041-a128-512b76b67a8b-kube-proxy\") pod \"kube-proxy-hgrbd\" (UID: \"18caa0d6-b056-4041-a128-512b76b67a8b\") " pod="kube-system/kube-proxy-hgrbd" Sep 13 
01:11:29.813637 kubelet[2705]: E0913 01:11:29.813534 2705 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Sep 13 01:11:29.813637 kubelet[2705]: E0913 01:11:29.813640 2705 projected.go:194] Error preparing data for projected volume kube-api-access-7ff7l for pod kube-system/kube-proxy-hgrbd: configmap "kube-root-ca.crt" not found Sep 13 01:11:29.813937 kubelet[2705]: E0913 01:11:29.813763 2705 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18caa0d6-b056-4041-a128-512b76b67a8b-kube-api-access-7ff7l podName:18caa0d6-b056-4041-a128-512b76b67a8b nodeName:}" failed. No retries permitted until 2025-09-13 01:11:30.313723382 +0000 UTC m=+6.011571615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7ff7l" (UniqueName: "kubernetes.io/projected/18caa0d6-b056-4041-a128-512b76b67a8b-kube-api-access-7ff7l") pod "kube-proxy-hgrbd" (UID: "18caa0d6-b056-4041-a128-512b76b67a8b") : configmap "kube-root-ca.crt" not found Sep 13 01:11:30.189199 systemd[1]: Created slice kubepods-besteffort-pod2ac25a20_1215_4972_ae43_5e6e4385d586.slice - libcontainer container kubepods-besteffort-pod2ac25a20_1215_4972_ae43_5e6e4385d586.slice. 
Sep 13 01:11:30.204300 kubelet[2705]: I0913 01:11:30.204236 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2ac25a20-1215-4972-ae43-5e6e4385d586-var-lib-calico\") pod \"tigera-operator-755d956888-kj4xw\" (UID: \"2ac25a20-1215-4972-ae43-5e6e4385d586\") " pod="tigera-operator/tigera-operator-755d956888-kj4xw" Sep 13 01:11:30.204300 kubelet[2705]: I0913 01:11:30.204303 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nf2x\" (UniqueName: \"kubernetes.io/projected/2ac25a20-1215-4972-ae43-5e6e4385d586-kube-api-access-4nf2x\") pod \"tigera-operator-755d956888-kj4xw\" (UID: \"2ac25a20-1215-4972-ae43-5e6e4385d586\") " pod="tigera-operator/tigera-operator-755d956888-kj4xw" Sep 13 01:11:30.495710 containerd[1507]: time="2025-09-13T01:11:30.494967830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-kj4xw,Uid:2ac25a20-1215-4972-ae43-5e6e4385d586,Namespace:tigera-operator,Attempt:0,}" Sep 13 01:11:30.540737 containerd[1507]: time="2025-09-13T01:11:30.536673131Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:11:30.540737 containerd[1507]: time="2025-09-13T01:11:30.536796390Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:11:30.540737 containerd[1507]: time="2025-09-13T01:11:30.536822583Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:30.540737 containerd[1507]: time="2025-09-13T01:11:30.537009975Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:30.589452 systemd[1]: Started cri-containerd-df791dd6c68426aac7249d2648daadd943f67d738e212263d22de540f411201c.scope - libcontainer container df791dd6c68426aac7249d2648daadd943f67d738e212263d22de540f411201c. Sep 13 01:11:30.607908 containerd[1507]: time="2025-09-13T01:11:30.607829165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hgrbd,Uid:18caa0d6-b056-4041-a128-512b76b67a8b,Namespace:kube-system,Attempt:0,}" Sep 13 01:11:30.676309 containerd[1507]: time="2025-09-13T01:11:30.676033382Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:11:30.676309 containerd[1507]: time="2025-09-13T01:11:30.676148712Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:11:30.676309 containerd[1507]: time="2025-09-13T01:11:30.676176561Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:11:30.680137 containerd[1507]: time="2025-09-13T01:11:30.678561381Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:11:30.687347 containerd[1507]: time="2025-09-13T01:11:30.687299939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-kj4xw,Uid:2ac25a20-1215-4972-ae43-5e6e4385d586,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"df791dd6c68426aac7249d2648daadd943f67d738e212263d22de540f411201c\""
Sep 13 01:11:30.692240 containerd[1507]: time="2025-09-13T01:11:30.692184004Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 13 01:11:30.721405 systemd[1]: Started cri-containerd-0e80e9799dfc24fc208fb78695aa8a3de783e250515a0b1e23b214185b19ab1e.scope - libcontainer container 0e80e9799dfc24fc208fb78695aa8a3de783e250515a0b1e23b214185b19ab1e.
Sep 13 01:11:30.759876 containerd[1507]: time="2025-09-13T01:11:30.759406537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hgrbd,Uid:18caa0d6-b056-4041-a128-512b76b67a8b,Namespace:kube-system,Attempt:0,} returns sandbox id \"0e80e9799dfc24fc208fb78695aa8a3de783e250515a0b1e23b214185b19ab1e\""
Sep 13 01:11:30.765764 containerd[1507]: time="2025-09-13T01:11:30.765383826Z" level=info msg="CreateContainer within sandbox \"0e80e9799dfc24fc208fb78695aa8a3de783e250515a0b1e23b214185b19ab1e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 13 01:11:30.802291 containerd[1507]: time="2025-09-13T01:11:30.802089699Z" level=info msg="CreateContainer within sandbox \"0e80e9799dfc24fc208fb78695aa8a3de783e250515a0b1e23b214185b19ab1e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"6ac1e4697bd6bbd5b0f83cf7c0af83d06bc0255844758e544816e9819fe09331\""
Sep 13 01:11:30.804614 containerd[1507]: time="2025-09-13T01:11:30.804577202Z" level=info msg="StartContainer for \"6ac1e4697bd6bbd5b0f83cf7c0af83d06bc0255844758e544816e9819fe09331\""
Sep 13 01:11:30.851391 systemd[1]: Started cri-containerd-6ac1e4697bd6bbd5b0f83cf7c0af83d06bc0255844758e544816e9819fe09331.scope - libcontainer container 6ac1e4697bd6bbd5b0f83cf7c0af83d06bc0255844758e544816e9819fe09331.
Sep 13 01:11:30.896082 containerd[1507]: time="2025-09-13T01:11:30.896033102Z" level=info msg="StartContainer for \"6ac1e4697bd6bbd5b0f83cf7c0af83d06bc0255844758e544816e9819fe09331\" returns successfully"
Sep 13 01:11:32.830921 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount461391747.mount: Deactivated successfully.
Sep 13 01:11:33.884785 containerd[1507]: time="2025-09-13T01:11:33.884713040Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:33.886138 containerd[1507]: time="2025-09-13T01:11:33.886004870Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 13 01:11:33.887967 containerd[1507]: time="2025-09-13T01:11:33.887902810Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:33.891169 containerd[1507]: time="2025-09-13T01:11:33.891090525Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:33.892208 containerd[1507]: time="2025-09-13T01:11:33.892172679Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.199934692s"
Sep 13 01:11:33.892451 containerd[1507]: time="2025-09-13T01:11:33.892314652Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 13 01:11:33.895407 containerd[1507]: time="2025-09-13T01:11:33.895362559Z" level=info msg="CreateContainer within sandbox \"df791dd6c68426aac7249d2648daadd943f67d738e212263d22de540f411201c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 13 01:11:33.912789 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2278823375.mount: Deactivated successfully.
Sep 13 01:11:33.922531 containerd[1507]: time="2025-09-13T01:11:33.922281863Z" level=info msg="CreateContainer within sandbox \"df791dd6c68426aac7249d2648daadd943f67d738e212263d22de540f411201c\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc\""
Sep 13 01:11:33.924072 containerd[1507]: time="2025-09-13T01:11:33.923977754Z" level=info msg="StartContainer for \"ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc\""
Sep 13 01:11:33.990347 systemd[1]: Started cri-containerd-ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc.scope - libcontainer container ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc.
Sep 13 01:11:34.029294 containerd[1507]: time="2025-09-13T01:11:34.029221436Z" level=info msg="StartContainer for \"ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc\" returns successfully"
Sep 13 01:11:34.741978 kubelet[2705]: I0913 01:11:34.741679 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hgrbd" podStartSLOduration=5.741646667 podStartE2EDuration="5.741646667s" podCreationTimestamp="2025-09-13 01:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:11:31.747397124 +0000 UTC m=+7.445245364" watchObservedRunningTime="2025-09-13 01:11:34.741646667 +0000 UTC m=+10.439494899"
Sep 13 01:11:34.744335 kubelet[2705]: I0913 01:11:34.743949 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-kj4xw" podStartSLOduration=1.541768692 podStartE2EDuration="4.74393015s" podCreationTimestamp="2025-09-13 01:11:30 +0000 UTC" firstStartedPulling="2025-09-13 01:11:30.69157385 +0000 UTC m=+6.389422075" lastFinishedPulling="2025-09-13 01:11:33.89373531 +0000 UTC m=+9.591583533" observedRunningTime="2025-09-13 01:11:34.743452223 +0000 UTC m=+10.441300456" watchObservedRunningTime="2025-09-13 01:11:34.74393015 +0000 UTC m=+10.441778390"
Sep 13 01:11:37.903298 systemd[1]: cri-containerd-ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc.scope: Deactivated successfully.
Sep 13 01:11:37.981711 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc-rootfs.mount: Deactivated successfully.
Sep 13 01:11:38.064858 containerd[1507]: time="2025-09-13T01:11:37.988830490Z" level=info msg="shim disconnected" id=ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc namespace=k8s.io
Sep 13 01:11:38.065623 containerd[1507]: time="2025-09-13T01:11:38.064869423Z" level=warning msg="cleaning up after shim disconnected" id=ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc namespace=k8s.io
Sep 13 01:11:38.065623 containerd[1507]: time="2025-09-13T01:11:38.064907061Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 13 01:11:38.782282 kubelet[2705]: I0913 01:11:38.779732 2705 scope.go:117] "RemoveContainer" containerID="ed79df3bbf2fe2895d25f516c46c9ef1829d532005cd09e8a3e12616afd494cc"
Sep 13 01:11:38.797649 containerd[1507]: time="2025-09-13T01:11:38.797072256Z" level=info msg="CreateContainer within sandbox \"df791dd6c68426aac7249d2648daadd943f67d738e212263d22de540f411201c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 13 01:11:38.825665 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2249278011.mount: Deactivated successfully.
Sep 13 01:11:38.830531 containerd[1507]: time="2025-09-13T01:11:38.830435120Z" level=info msg="CreateContainer within sandbox \"df791dd6c68426aac7249d2648daadd943f67d738e212263d22de540f411201c\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"b89fb3a2ee572e263baa3b28c1658aa2b68806179599b7ef74290b5b8b6d187a\""
Sep 13 01:11:38.839162 containerd[1507]: time="2025-09-13T01:11:38.835563701Z" level=info msg="StartContainer for \"b89fb3a2ee572e263baa3b28c1658aa2b68806179599b7ef74290b5b8b6d187a\""
Sep 13 01:11:38.905324 systemd[1]: Started cri-containerd-b89fb3a2ee572e263baa3b28c1658aa2b68806179599b7ef74290b5b8b6d187a.scope - libcontainer container b89fb3a2ee572e263baa3b28c1658aa2b68806179599b7ef74290b5b8b6d187a.
Sep 13 01:11:38.965934 containerd[1507]: time="2025-09-13T01:11:38.965867182Z" level=info msg="StartContainer for \"b89fb3a2ee572e263baa3b28c1658aa2b68806179599b7ef74290b5b8b6d187a\" returns successfully"
Sep 13 01:11:41.779304 sudo[1779]: pam_unix(sudo:session): session closed for user root
Sep 13 01:11:41.926751 sshd[1776]: pam_unix(sshd:session): session closed for user core
Sep 13 01:11:41.932375 systemd-logind[1489]: Session 11 logged out. Waiting for processes to exit.
Sep 13 01:11:41.932918 systemd[1]: sshd@8-10.244.29.26:22-139.178.68.195:37142.service: Deactivated successfully.
Sep 13 01:11:41.939275 systemd[1]: session-11.scope: Deactivated successfully.
Sep 13 01:11:41.939868 systemd[1]: session-11.scope: Consumed 7.509s CPU time, 142.5M memory peak, 0B memory swap peak.
Sep 13 01:11:41.943638 systemd-logind[1489]: Removed session 11.
Sep 13 01:11:47.917611 systemd[1]: Created slice kubepods-besteffort-pod5996d7c1_0020_4b4f_bb42_b4796ff967b7.slice - libcontainer container kubepods-besteffort-pod5996d7c1_0020_4b4f_bb42_b4796ff967b7.slice.
Sep 13 01:11:47.957557 kubelet[2705]: I0913 01:11:47.957320 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5996d7c1-0020-4b4f-bb42-b4796ff967b7-typha-certs\") pod \"calico-typha-5474bf7bf9-8wvrs\" (UID: \"5996d7c1-0020-4b4f-bb42-b4796ff967b7\") " pod="calico-system/calico-typha-5474bf7bf9-8wvrs"
Sep 13 01:11:47.957557 kubelet[2705]: I0913 01:11:47.957375 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2bth\" (UniqueName: \"kubernetes.io/projected/5996d7c1-0020-4b4f-bb42-b4796ff967b7-kube-api-access-l2bth\") pod \"calico-typha-5474bf7bf9-8wvrs\" (UID: \"5996d7c1-0020-4b4f-bb42-b4796ff967b7\") " pod="calico-system/calico-typha-5474bf7bf9-8wvrs"
Sep 13 01:11:47.957557 kubelet[2705]: I0913 01:11:47.957411 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5996d7c1-0020-4b4f-bb42-b4796ff967b7-tigera-ca-bundle\") pod \"calico-typha-5474bf7bf9-8wvrs\" (UID: \"5996d7c1-0020-4b4f-bb42-b4796ff967b7\") " pod="calico-system/calico-typha-5474bf7bf9-8wvrs"
Sep 13 01:11:48.229075 containerd[1507]: time="2025-09-13T01:11:48.228894269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5474bf7bf9-8wvrs,Uid:5996d7c1-0020-4b4f-bb42-b4796ff967b7,Namespace:calico-system,Attempt:0,}"
Sep 13 01:11:48.315083 containerd[1507]: time="2025-09-13T01:11:48.311355633Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 01:11:48.315083 containerd[1507]: time="2025-09-13T01:11:48.311465109Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 01:11:48.315083 containerd[1507]: time="2025-09-13T01:11:48.311507927Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:11:48.316967 systemd[1]: Created slice kubepods-besteffort-pod7aa155c8_e3d1_4320_a2f8_432a81875638.slice - libcontainer container kubepods-besteffort-pod7aa155c8_e3d1_4320_a2f8_432a81875638.slice.
Sep 13 01:11:48.317313 containerd[1507]: time="2025-09-13T01:11:48.314908473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:11:48.363582 kubelet[2705]: I0913 01:11:48.362210 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-var-lib-calico\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363582 kubelet[2705]: I0913 01:11:48.362295 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-policysync\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363582 kubelet[2705]: I0913 01:11:48.362486 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-xtables-lock\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363582 kubelet[2705]: I0913 01:11:48.362548 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-flexvol-driver-host\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363582 kubelet[2705]: I0913 01:11:48.362590 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa155c8-e3d1-4320-a2f8-432a81875638-tigera-ca-bundle\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363966 kubelet[2705]: I0913 01:11:48.362642 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-cni-log-dir\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363966 kubelet[2705]: I0913 01:11:48.362676 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-var-run-calico\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363966 kubelet[2705]: I0913 01:11:48.362916 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-cni-bin-dir\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363966 kubelet[2705]: I0913 01:11:48.362970 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r296d\" (UniqueName: \"kubernetes.io/projected/7aa155c8-e3d1-4320-a2f8-432a81875638-kube-api-access-r296d\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.363966 kubelet[2705]: I0913 01:11:48.363007 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-lib-modules\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.365087 kubelet[2705]: I0913 01:11:48.363063 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/7aa155c8-e3d1-4320-a2f8-432a81875638-node-certs\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.365087 kubelet[2705]: I0913 01:11:48.363097 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/7aa155c8-e3d1-4320-a2f8-432a81875638-cni-net-dir\") pod \"calico-node-292qn\" (UID: \"7aa155c8-e3d1-4320-a2f8-432a81875638\") " pod="calico-system/calico-node-292qn"
Sep 13 01:11:48.366933 systemd[1]: Started cri-containerd-4d30bab5b150b99df06b207dd6b3f2804b38f9f124c46e83dad9b829021dc9af.scope - libcontainer container 4d30bab5b150b99df06b207dd6b3f2804b38f9f124c46e83dad9b829021dc9af.
Sep 13 01:11:48.479485 kubelet[2705]: E0913 01:11:48.479040 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.480756 kubelet[2705]: W0913 01:11:48.480536 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.483141 containerd[1507]: time="2025-09-13T01:11:48.483030591Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5474bf7bf9-8wvrs,Uid:5996d7c1-0020-4b4f-bb42-b4796ff967b7,Namespace:calico-system,Attempt:0,} returns sandbox id \"4d30bab5b150b99df06b207dd6b3f2804b38f9f124c46e83dad9b829021dc9af\""
Sep 13 01:11:48.483931 kubelet[2705]: E0913 01:11:48.483749 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.484352 kubelet[2705]: E0913 01:11:48.484329 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.485061 kubelet[2705]: W0913 01:11:48.484557 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.485536 kubelet[2705]: E0913 01:11:48.485143 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.491533 kubelet[2705]: E0913 01:11:48.490678 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.491533 kubelet[2705]: W0913 01:11:48.490713 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.491533 kubelet[2705]: E0913 01:11:48.490818 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.494935 containerd[1507]: time="2025-09-13T01:11:48.493616596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 13 01:11:48.596335 kubelet[2705]: E0913 01:11:48.596266 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3"
Sep 13 01:11:48.625450 containerd[1507]: time="2025-09-13T01:11:48.624688173Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-292qn,Uid:7aa155c8-e3d1-4320-a2f8-432a81875638,Namespace:calico-system,Attempt:0,}"
Sep 13 01:11:48.636418 kubelet[2705]: E0913 01:11:48.636317 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.636418 kubelet[2705]: W0913 01:11:48.636352 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.636418 kubelet[2705]: E0913 01:11:48.636381 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.638576 kubelet[2705]: E0913 01:11:48.637402 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.638576 kubelet[2705]: W0913 01:11:48.637424 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.638576 kubelet[2705]: E0913 01:11:48.637442 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.640532 kubelet[2705]: E0913 01:11:48.640372 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.640668 kubelet[2705]: W0913 01:11:48.640643 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.641168 kubelet[2705]: E0913 01:11:48.641142 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.641658 kubelet[2705]: E0913 01:11:48.641637 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.642569 kubelet[2705]: W0913 01:11:48.642545 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.642686 kubelet[2705]: E0913 01:11:48.642664 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.644362 kubelet[2705]: E0913 01:11:48.644337 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.644625 kubelet[2705]: W0913 01:11:48.644469 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.644625 kubelet[2705]: E0913 01:11:48.644497 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.644843 kubelet[2705]: E0913 01:11:48.644823 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.644999 kubelet[2705]: W0913 01:11:48.644932 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.645385 kubelet[2705]: E0913 01:11:48.645361 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.647674 kubelet[2705]: E0913 01:11:48.647328 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.647674 kubelet[2705]: W0913 01:11:48.647350 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.647674 kubelet[2705]: E0913 01:11:48.647369 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.649798 kubelet[2705]: E0913 01:11:48.649469 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.649798 kubelet[2705]: W0913 01:11:48.649493 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.649798 kubelet[2705]: E0913 01:11:48.649513 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.651804 kubelet[2705]: E0913 01:11:48.651640 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.651804 kubelet[2705]: W0913 01:11:48.651661 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.651804 kubelet[2705]: E0913 01:11:48.651681 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.652265 kubelet[2705]: E0913 01:11:48.652026 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.652265 kubelet[2705]: W0913 01:11:48.652072 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.652265 kubelet[2705]: E0913 01:11:48.652094 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.652863 kubelet[2705]: E0913 01:11:48.652777 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.653578 kubelet[2705]: W0913 01:11:48.653514 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.655131 kubelet[2705]: E0913 01:11:48.654022 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.655432 kubelet[2705]: E0913 01:11:48.655410 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.655750 kubelet[2705]: W0913 01:11:48.655520 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.655750 kubelet[2705]: E0913 01:11:48.655548 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.657232 kubelet[2705]: E0913 01:11:48.656809 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.657232 kubelet[2705]: W0913 01:11:48.656830 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.657232 kubelet[2705]: E0913 01:11:48.656849 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.658515 kubelet[2705]: E0913 01:11:48.658189 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.658515 kubelet[2705]: W0913 01:11:48.658210 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.658515 kubelet[2705]: E0913 01:11:48.658228 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.660129 kubelet[2705]: E0913 01:11:48.659557 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.660129 kubelet[2705]: W0913 01:11:48.659589 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.660129 kubelet[2705]: E0913 01:11:48.659609 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.661171 kubelet[2705]: E0913 01:11:48.660784 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.661171 kubelet[2705]: W0913 01:11:48.660804 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.661171 kubelet[2705]: E0913 01:11:48.660822 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.662573 kubelet[2705]: E0913 01:11:48.662403 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.662573 kubelet[2705]: W0913 01:11:48.662423 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.662573 kubelet[2705]: E0913 01:11:48.662445 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.663538 kubelet[2705]: E0913 01:11:48.663190 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.665127 kubelet[2705]: W0913 01:11:48.663652 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.665127 kubelet[2705]: E0913 01:11:48.663679 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.665473 kubelet[2705]: E0913 01:11:48.665450 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.665743 kubelet[2705]: W0913 01:11:48.665576 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.665743 kubelet[2705]: E0913 01:11:48.665603 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.666371 kubelet[2705]: E0913 01:11:48.666045 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.666371 kubelet[2705]: W0913 01:11:48.666066 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.666371 kubelet[2705]: E0913 01:11:48.666084 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.667910 kubelet[2705]: E0913 01:11:48.667888 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.668031 kubelet[2705]: W0913 01:11:48.668009 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.668154 kubelet[2705]: E0913 01:11:48.668133 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.668424 kubelet[2705]: I0913 01:11:48.668287 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2c92e0f8-426f-428f-9601-3c255a79b3c3-varrun\") pod \"csi-node-driver-8vq7t\" (UID: \"2c92e0f8-426f-428f-9601-3c255a79b3c3\") " pod="calico-system/csi-node-driver-8vq7t"
Sep 13 01:11:48.669203 kubelet[2705]: E0913 01:11:48.669180 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.669501 kubelet[2705]: W0913 01:11:48.669313 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.669501 kubelet[2705]: E0913 01:11:48.669352 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.669501 kubelet[2705]: I0913 01:11:48.669378 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c92e0f8-426f-428f-9601-3c255a79b3c3-socket-dir\") pod \"csi-node-driver-8vq7t\" (UID: \"2c92e0f8-426f-428f-9601-3c255a79b3c3\") " pod="calico-system/csi-node-driver-8vq7t"
Sep 13 01:11:48.671431 kubelet[2705]: E0913 01:11:48.671286 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.671431 kubelet[2705]: W0913 01:11:48.671307 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.671431 kubelet[2705]: E0913 01:11:48.671349 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 13 01:11:48.671640 kubelet[2705]: I0913 01:11:48.671432 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c92e0f8-426f-428f-9601-3c255a79b3c3-registration-dir\") pod \"csi-node-driver-8vq7t\" (UID: \"2c92e0f8-426f-428f-9601-3c255a79b3c3\") " pod="calico-system/csi-node-driver-8vq7t" Sep 13 01:11:48.672041 kubelet[2705]: E0913 01:11:48.671901 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.672041 kubelet[2705]: W0913 01:11:48.671922 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.672041 kubelet[2705]: E0913 01:11:48.671985 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:11:48.673710 kubelet[2705]: E0913 01:11:48.672431 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.673710 kubelet[2705]: W0913 01:11:48.672450 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.674328 kubelet[2705]: E0913 01:11:48.674019 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:11:48.674328 kubelet[2705]: E0913 01:11:48.674036 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.674328 kubelet[2705]: W0913 01:11:48.674060 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.674328 kubelet[2705]: E0913 01:11:48.674097 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:11:48.674328 kubelet[2705]: I0913 01:11:48.674186 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qd5x\" (UniqueName: \"kubernetes.io/projected/2c92e0f8-426f-428f-9601-3c255a79b3c3-kube-api-access-6qd5x\") pod \"csi-node-driver-8vq7t\" (UID: \"2c92e0f8-426f-428f-9601-3c255a79b3c3\") " pod="calico-system/csi-node-driver-8vq7t" Sep 13 01:11:48.675397 kubelet[2705]: E0913 01:11:48.674789 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.675397 kubelet[2705]: W0913 01:11:48.674810 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.675397 kubelet[2705]: E0913 01:11:48.674853 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:11:48.676501 kubelet[2705]: E0913 01:11:48.676221 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.676501 kubelet[2705]: W0913 01:11:48.676241 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.676501 kubelet[2705]: E0913 01:11:48.676265 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:11:48.678464 kubelet[2705]: E0913 01:11:48.678440 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.678764 kubelet[2705]: W0913 01:11:48.678565 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.678764 kubelet[2705]: E0913 01:11:48.678603 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:11:48.678764 kubelet[2705]: I0913 01:11:48.678637 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2c92e0f8-426f-428f-9601-3c255a79b3c3-kubelet-dir\") pod \"csi-node-driver-8vq7t\" (UID: \"2c92e0f8-426f-428f-9601-3c255a79b3c3\") " pod="calico-system/csi-node-driver-8vq7t" Sep 13 01:11:48.679139 kubelet[2705]: E0913 01:11:48.679116 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.679361 kubelet[2705]: W0913 01:11:48.679231 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.679361 kubelet[2705]: E0913 01:11:48.679274 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:11:48.681268 kubelet[2705]: E0913 01:11:48.681240 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.681268 kubelet[2705]: W0913 01:11:48.681265 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.681407 kubelet[2705]: E0913 01:11:48.681285 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:11:48.682180 kubelet[2705]: E0913 01:11:48.682158 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.682180 kubelet[2705]: W0913 01:11:48.682179 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.683265 kubelet[2705]: E0913 01:11:48.682196 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:11:48.683265 kubelet[2705]: E0913 01:11:48.683219 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.683265 kubelet[2705]: W0913 01:11:48.683236 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.683265 kubelet[2705]: E0913 01:11:48.683252 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:11:48.685590 kubelet[2705]: E0913 01:11:48.685535 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.685590 kubelet[2705]: W0913 01:11:48.685558 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.685590 kubelet[2705]: E0913 01:11:48.685577 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:11:48.688460 kubelet[2705]: E0913 01:11:48.688318 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:11:48.688460 kubelet[2705]: W0913 01:11:48.688340 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:11:48.688460 kubelet[2705]: E0913 01:11:48.688369 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:11:48.709988 containerd[1507]: time="2025-09-13T01:11:48.707398906Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:11:48.709988 containerd[1507]: time="2025-09-13T01:11:48.707512535Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:11:48.709988 containerd[1507]: time="2025-09-13T01:11:48.707534142Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:11:48.714826 containerd[1507]: time="2025-09-13T01:11:48.713058890Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:11:48.771474 systemd[1]: Started cri-containerd-9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3.scope - libcontainer container 9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3.
Sep 13 01:11:48.786818 kubelet[2705]: E0913 01:11:48.786779 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.788387 kubelet[2705]: W0913 01:11:48.787150 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.788387 kubelet[2705]: E0913 01:11:48.787192 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.789254 kubelet[2705]: E0913 01:11:48.789221 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.790136 kubelet[2705]: W0913 01:11:48.789371 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.790136 kubelet[2705]: E0913 01:11:48.789412 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.793408 kubelet[2705]: E0913 01:11:48.793216 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.793408 kubelet[2705]: W0913 01:11:48.793241 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.793408 kubelet[2705]: E0913 01:11:48.793316 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.793990 kubelet[2705]: E0913 01:11:48.793789 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.793990 kubelet[2705]: W0913 01:11:48.793806 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.793990 kubelet[2705]: E0913 01:11:48.793845 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.794391 kubelet[2705]: E0913 01:11:48.794250 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.794391 kubelet[2705]: W0913 01:11:48.794279 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.796211 kubelet[2705]: E0913 01:11:48.794956 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.796416 kubelet[2705]: E0913 01:11:48.796376 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.796757 kubelet[2705]: W0913 01:11:48.796604 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.796757 kubelet[2705]: E0913 01:11:48.796641 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.798298 kubelet[2705]: E0913 01:11:48.796981 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.798298 kubelet[2705]: W0913 01:11:48.797006 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.798298 kubelet[2705]: E0913 01:11:48.797033 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.798298 kubelet[2705]: E0913 01:11:48.797546 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.798298 kubelet[2705]: W0913 01:11:48.797562 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.798298 kubelet[2705]: E0913 01:11:48.797579 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.798298 kubelet[2705]: E0913 01:11:48.797924 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.798298 kubelet[2705]: W0913 01:11:48.797941 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.798298 kubelet[2705]: E0913 01:11:48.797958 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.800147 kubelet[2705]: E0913 01:11:48.798492 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.800147 kubelet[2705]: W0913 01:11:48.798969 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.800147 kubelet[2705]: E0913 01:11:48.799031 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.800147 kubelet[2705]: E0913 01:11:48.799413 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.800147 kubelet[2705]: W0913 01:11:48.799441 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.800147 kubelet[2705]: E0913 01:11:48.799506 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.803574 kubelet[2705]: E0913 01:11:48.803214 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.803574 kubelet[2705]: W0913 01:11:48.803240 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.803574 kubelet[2705]: E0913 01:11:48.803400 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.805077 kubelet[2705]: E0913 01:11:48.803929 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.805077 kubelet[2705]: W0913 01:11:48.803945 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.805077 kubelet[2705]: E0913 01:11:48.804014 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.806877 kubelet[2705]: E0913 01:11:48.805564 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.808151 kubelet[2705]: W0913 01:11:48.807158 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.808151 kubelet[2705]: E0913 01:11:48.807394 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.808151 kubelet[2705]: E0913 01:11:48.807862 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.808151 kubelet[2705]: W0913 01:11:48.807940 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.808151 kubelet[2705]: E0913 01:11:48.808053 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.809588 kubelet[2705]: E0913 01:11:48.809491 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.809588 kubelet[2705]: W0913 01:11:48.809515 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.809763 kubelet[2705]: E0913 01:11:48.809699 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.811529 kubelet[2705]: E0913 01:11:48.810210 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.811529 kubelet[2705]: W0913 01:11:48.810255 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.811529 kubelet[2705]: E0913 01:11:48.810954 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.811529 kubelet[2705]: E0913 01:11:48.811205 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.811778 kubelet[2705]: W0913 01:11:48.811569 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.812872 kubelet[2705]: E0913 01:11:48.812149 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.812872 kubelet[2705]: E0913 01:11:48.812716 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.812872 kubelet[2705]: W0913 01:11:48.812761 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.814127 kubelet[2705]: E0913 01:11:48.813757 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.814127 kubelet[2705]: E0913 01:11:48.814009 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.814127 kubelet[2705]: W0913 01:11:48.814025 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.814312 kubelet[2705]: E0913 01:11:48.814239 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.815376 kubelet[2705]: E0913 01:11:48.815202 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.815376 kubelet[2705]: W0913 01:11:48.815223 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.817418 kubelet[2705]: E0913 01:11:48.816901 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.819345 kubelet[2705]: E0913 01:11:48.818185 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.819345 kubelet[2705]: W0913 01:11:48.818208 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.819345 kubelet[2705]: E0913 01:11:48.818228 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.820918 kubelet[2705]: E0913 01:11:48.820181 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.820918 kubelet[2705]: W0913 01:11:48.820205 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.821067 kubelet[2705]: E0913 01:11:48.821033 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.823649 kubelet[2705]: E0913 01:11:48.823372 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.823649 kubelet[2705]: W0913 01:11:48.823398 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.823649 kubelet[2705]: E0913 01:11:48.823420 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.825646 kubelet[2705]: E0913 01:11:48.825611 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.825646 kubelet[2705]: W0913 01:11:48.825638 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.825805 kubelet[2705]: E0913 01:11:48.825658 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:48.886482 containerd[1507]: time="2025-09-13T01:11:48.886252340Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-292qn,Uid:7aa155c8-e3d1-4320-a2f8-432a81875638,Namespace:calico-system,Attempt:0,} returns sandbox id \"9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3\""
Sep 13 01:11:48.925157 kubelet[2705]: E0913 01:11:48.925069 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:48.925531 kubelet[2705]: W0913 01:11:48.925404 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:48.925531 kubelet[2705]: E0913 01:11:48.925450 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:50.156707 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4241343615.mount: Deactivated successfully.
Sep 13 01:11:50.637570 kubelet[2705]: E0913 01:11:50.637493 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3"
Sep 13 01:11:51.809342 containerd[1507]: time="2025-09-13T01:11:51.809239337Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:51.814173 containerd[1507]: time="2025-09-13T01:11:51.814061865Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 13 01:11:51.815433 containerd[1507]: time="2025-09-13T01:11:51.815370534Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:51.819670 containerd[1507]: time="2025-09-13T01:11:51.819619915Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:51.821747 containerd[1507]: time="2025-09-13T01:11:51.820969644Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.325142625s"
Sep 13 01:11:51.821747 containerd[1507]: time="2025-09-13T01:11:51.821018104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 13 01:11:51.831581 containerd[1507]: time="2025-09-13T01:11:51.831522959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 13 01:11:51.867812 containerd[1507]: time="2025-09-13T01:11:51.867706036Z" level=info msg="CreateContainer within sandbox \"4d30bab5b150b99df06b207dd6b3f2804b38f9f124c46e83dad9b829021dc9af\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 13 01:11:51.896382 containerd[1507]: time="2025-09-13T01:11:51.896298944Z" level=info msg="CreateContainer within sandbox \"4d30bab5b150b99df06b207dd6b3f2804b38f9f124c46e83dad9b829021dc9af\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e0903ab44f572e79772b33c65c1d0d296c03618dd3130c45a00667d0ece4ed9f\""
Sep 13 01:11:51.898403 containerd[1507]: time="2025-09-13T01:11:51.898367720Z" level=info msg="StartContainer for \"e0903ab44f572e79772b33c65c1d0d296c03618dd3130c45a00667d0ece4ed9f\""
Sep 13 01:11:51.984536 systemd[1]: Started cri-containerd-e0903ab44f572e79772b33c65c1d0d296c03618dd3130c45a00667d0ece4ed9f.scope - libcontainer container e0903ab44f572e79772b33c65c1d0d296c03618dd3130c45a00667d0ece4ed9f.
Sep 13 01:11:52.079699 containerd[1507]: time="2025-09-13T01:11:52.078887632Z" level=info msg="StartContainer for \"e0903ab44f572e79772b33c65c1d0d296c03618dd3130c45a00667d0ece4ed9f\" returns successfully"
Sep 13 01:11:52.638651 kubelet[2705]: E0913 01:11:52.638564 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3"
Sep 13 01:11:52.830902 kubelet[2705]: E0913 01:11:52.830827 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.830902 kubelet[2705]: W0913 01:11:52.830886 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.831245 kubelet[2705]: E0913 01:11:52.830941 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.831555 kubelet[2705]: E0913 01:11:52.831529 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.831555 kubelet[2705]: W0913 01:11:52.831551 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.831711 kubelet[2705]: E0913 01:11:52.831570 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.831867 kubelet[2705]: E0913 01:11:52.831842 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.831867 kubelet[2705]: W0913 01:11:52.831864 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.831968 kubelet[2705]: E0913 01:11:52.831881 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.832476 kubelet[2705]: E0913 01:11:52.832229 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.832476 kubelet[2705]: W0913 01:11:52.832251 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.832476 kubelet[2705]: E0913 01:11:52.832269 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.832644 kubelet[2705]: E0913 01:11:52.832535 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.832644 kubelet[2705]: W0913 01:11:52.832549 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.832644 kubelet[2705]: E0913 01:11:52.832566 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.833526 kubelet[2705]: E0913 01:11:52.833156 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.833526 kubelet[2705]: W0913 01:11:52.833172 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.833526 kubelet[2705]: E0913 01:11:52.833189 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.834560 kubelet[2705]: E0913 01:11:52.834325 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.834560 kubelet[2705]: W0913 01:11:52.834350 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.834560 kubelet[2705]: E0913 01:11:52.834370 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.835508 kubelet[2705]: E0913 01:11:52.835462 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.835508 kubelet[2705]: W0913 01:11:52.835490 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.835662 kubelet[2705]: E0913 01:11:52.835514 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.837165 kubelet[2705]: E0913 01:11:52.836737 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.837165 kubelet[2705]: W0913 01:11:52.836765 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.837165 kubelet[2705]: E0913 01:11:52.836794 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.838136 kubelet[2705]: E0913 01:11:52.838005 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.838136 kubelet[2705]: W0913 01:11:52.838035 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.838136 kubelet[2705]: E0913 01:11:52.838090 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.839969 kubelet[2705]: E0913 01:11:52.839452 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.839969 kubelet[2705]: W0913 01:11:52.839510 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.839969 kubelet[2705]: E0913 01:11:52.839545 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.841822 kubelet[2705]: E0913 01:11:52.841513 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.841822 kubelet[2705]: W0913 01:11:52.841552 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.841822 kubelet[2705]: E0913 01:11:52.841587 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.843176 kubelet[2705]: E0913 01:11:52.843144 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.843176 kubelet[2705]: W0913 01:11:52.843174 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.843890 kubelet[2705]: E0913 01:11:52.843202 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.843945 kubelet[2705]: E0913 01:11:52.843891 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.843945 kubelet[2705]: W0913 01:11:52.843908 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.843945 kubelet[2705]: E0913 01:11:52.843926 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.846749 kubelet[2705]: E0913 01:11:52.846649 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.846749 kubelet[2705]: W0913 01:11:52.846706 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.846749 kubelet[2705]: E0913 01:11:52.846746 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.848872 kubelet[2705]: E0913 01:11:52.847922 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.848872 kubelet[2705]: W0913 01:11:52.847940 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.848872 kubelet[2705]: E0913 01:11:52.847961 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.848872 kubelet[2705]: E0913 01:11:52.848344 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.848872 kubelet[2705]: W0913 01:11:52.848360 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.848872 kubelet[2705]: E0913 01:11:52.848385 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.849935 kubelet[2705]: E0913 01:11:52.849859 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.849935 kubelet[2705]: W0913 01:11:52.849881 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.849935 kubelet[2705]: E0913 01:11:52.849911 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.851384 kubelet[2705]: E0913 01:11:52.851199 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.851384 kubelet[2705]: W0913 01:11:52.851229 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.851384 kubelet[2705]: E0913 01:11:52.851333 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.852277 kubelet[2705]: E0913 01:11:52.852246 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.852277 kubelet[2705]: W0913 01:11:52.852272 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.852411 kubelet[2705]: E0913 01:11:52.852377 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.853005 kubelet[2705]: E0913 01:11:52.852767 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.853005 kubelet[2705]: W0913 01:11:52.852790 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.853005 kubelet[2705]: E0913 01:11:52.852893 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.854638 kubelet[2705]: E0913 01:11:52.854601 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.854638 kubelet[2705]: W0913 01:11:52.854634 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.854901 kubelet[2705]: E0913 01:11:52.854746 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.855445 kubelet[2705]: E0913 01:11:52.855344 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.855445 kubelet[2705]: W0913 01:11:52.855379 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.855564 kubelet[2705]: E0913 01:11:52.855478 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.856303 kubelet[2705]: E0913 01:11:52.856143 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.856303 kubelet[2705]: W0913 01:11:52.856165 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.856895 kubelet[2705]: E0913 01:11:52.856714 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.857563 kubelet[2705]: E0913 01:11:52.857335 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.857563 kubelet[2705]: W0913 01:11:52.857360 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.857563 kubelet[2705]: E0913 01:11:52.857445 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.858296 kubelet[2705]: E0913 01:11:52.858266 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.858296 kubelet[2705]: W0913 01:11:52.858291 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.858936 kubelet[2705]: E0913 01:11:52.858431 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.858936 kubelet[2705]: E0913 01:11:52.858734 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.858936 kubelet[2705]: W0913 01:11:52.858749 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.859830 kubelet[2705]: E0913 01:11:52.859769 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.859937 kubelet[2705]: E0913 01:11:52.859915 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.860019 kubelet[2705]: W0913 01:11:52.859937 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.860075 kubelet[2705]: E0913 01:11:52.860033 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.861495 kubelet[2705]: E0913 01:11:52.861313 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.861495 kubelet[2705]: W0913 01:11:52.861337 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.861632 kubelet[2705]: E0913 01:11:52.861505 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.862799 kubelet[2705]: E0913 01:11:52.862556 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.862799 kubelet[2705]: W0913 01:11:52.862579 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.862922 kubelet[2705]: E0913 01:11:52.862850 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.864569 kubelet[2705]: E0913 01:11:52.864317 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.864569 kubelet[2705]: W0913 01:11:52.864340 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.864569 kubelet[2705]: E0913 01:11:52.864401 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.865392 kubelet[2705]: E0913 01:11:52.865355 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.865392 kubelet[2705]: W0913 01:11:52.865378 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.865533 kubelet[2705]: E0913 01:11:52.865420 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:52.866508 kubelet[2705]: E0913 01:11:52.866350 2705 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:11:52.866508 kubelet[2705]: W0913 01:11:52.866375 2705 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:11:52.866508 kubelet[2705]: E0913 01:11:52.866394 2705 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:11:53.508138 containerd[1507]: time="2025-09-13T01:11:53.507545080Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:53.509419 containerd[1507]: time="2025-09-13T01:11:53.509374986Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660"
Sep 13 01:11:53.509661 containerd[1507]: time="2025-09-13T01:11:53.509627679Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:53.513031 containerd[1507]: time="2025-09-13T01:11:53.512981584Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:11:53.515049 containerd[1507]: time="2025-09-13T01:11:53.514407212Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.682396743s"
Sep 13 01:11:53.515049 containerd[1507]: time="2025-09-13T01:11:53.514453285Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\""
Sep 13 01:11:53.520516 containerd[1507]: time="2025-09-13T01:11:53.520367942Z" level=info msg="CreateContainer within sandbox \"9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 13 01:11:53.547934 containerd[1507]: time="2025-09-13T01:11:53.547852675Z" level=info msg="CreateContainer within sandbox \"9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a\""
Sep 13 01:11:53.550124 containerd[1507]: time="2025-09-13T01:11:53.548601517Z" level=info msg="StartContainer for \"8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a\""
Sep 13 01:11:53.604444 systemd[1]: Started cri-containerd-8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a.scope - libcontainer container 8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a.
Sep 13 01:11:53.659795 containerd[1507]: time="2025-09-13T01:11:53.659586979Z" level=info msg="StartContainer for \"8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a\" returns successfully"
Sep 13 01:11:53.684643 systemd[1]: cri-containerd-8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a.scope: Deactivated successfully.
Sep 13 01:11:53.728281 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a-rootfs.mount: Deactivated successfully.
Sep 13 01:11:53.912139 kubelet[2705]: I0913 01:11:53.911333 2705 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 13 01:11:53.930411 containerd[1507]: time="2025-09-13T01:11:53.928710460Z" level=info msg="shim disconnected" id=8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a namespace=k8s.io
Sep 13 01:11:53.930411 containerd[1507]: time="2025-09-13T01:11:53.928797570Z" level=warning msg="cleaning up after shim disconnected" id=8fd13633c80f5671a319b25645f3a0d685116b166802170c55b4f108f297862a namespace=k8s.io
Sep 13 01:11:53.930411 containerd[1507]: time="2025-09-13T01:11:53.928814604Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 13 01:11:53.945448 kubelet[2705]: I0913 01:11:53.944858 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5474bf7bf9-8wvrs" podStartSLOduration=3.606545715 podStartE2EDuration="6.944824631s" podCreationTimestamp="2025-09-13 01:11:47 +0000 UTC" firstStartedPulling="2025-09-13 01:11:48.492696655 +0000 UTC m=+24.190544886" lastFinishedPulling="2025-09-13 01:11:51.830975548 +0000 UTC m=+27.528823802" observedRunningTime="2025-09-13 01:11:52.870179345 +0000 UTC m=+28.568027591" watchObservedRunningTime="2025-09-13 01:11:53.944824631 +0000 UTC m=+29.642672868"
Sep 13 01:11:54.642274 kubelet[2705]: E0913 01:11:54.640759 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3"
Sep 13 01:11:54.922634 containerd[1507]: time="2025-09-13T01:11:54.921745620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 13 01:11:56.638020 kubelet[2705]: E0913 01:11:56.637500 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3"
Sep 13 01:11:56.892953 kubelet[2705]: I0913 01:11:56.892302 2705 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 13 01:11:58.640094 kubelet[2705]: E0913 01:11:58.640022 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3"
Sep 13 01:12:00.643625 kubelet[2705]: E0913 01:12:00.642647 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3"
Sep 13 01:12:00.888406 containerd[1507]: time="2025-09-13T01:12:00.888150409Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:12:00.892141 containerd[1507]: time="2025-09-13T01:12:00.891900736Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613"
Sep 13 01:12:00.902559 containerd[1507]: time="2025-09-13T01:12:00.902002165Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:12:00.906217 containerd[1507]: time="2025-09-13T01:12:00.906153376Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:12:00.908171 containerd[1507]: time="2025-09-13T01:12:00.907290093Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 5.985467291s"
Sep 13 01:12:00.908171 containerd[1507]: time="2025-09-13T01:12:00.907340553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\""
Sep 13 01:12:00.912366 containerd[1507]: time="2025-09-13T01:12:00.912297702Z" level=info msg="CreateContainer within sandbox \"9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 13 01:12:00.970411 containerd[1507]: time="2025-09-13T01:12:00.970328821Z" level=info msg="CreateContainer within sandbox \"9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce\""
Sep 13 01:12:00.971860 containerd[1507]: time="2025-09-13T01:12:00.971824874Z" level=info msg="StartContainer for \"e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce\""
Sep 13 01:12:01.074509 systemd[1]: run-containerd-runc-k8s.io-e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce-runc.KnbpCm.mount: Deactivated successfully.
Sep 13 01:12:01.087371 systemd[1]: Started cri-containerd-e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce.scope - libcontainer container e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce. Sep 13 01:12:01.166160 containerd[1507]: time="2025-09-13T01:12:01.165526254Z" level=info msg="StartContainer for \"e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce\" returns successfully" Sep 13 01:12:02.317043 systemd[1]: cri-containerd-e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce.scope: Deactivated successfully. Sep 13 01:12:02.383256 containerd[1507]: time="2025-09-13T01:12:02.383092159Z" level=info msg="shim disconnected" id=e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce namespace=k8s.io Sep 13 01:12:02.383256 containerd[1507]: time="2025-09-13T01:12:02.383227958Z" level=warning msg="cleaning up after shim disconnected" id=e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce namespace=k8s.io Sep 13 01:12:02.383256 containerd[1507]: time="2025-09-13T01:12:02.383247175Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 13 01:12:02.384553 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e849b890b28bad225c21b56d4c1346f428d407175bacf795fedaad01788906ce-rootfs.mount: Deactivated successfully. Sep 13 01:12:02.400862 kubelet[2705]: I0913 01:12:02.400667 2705 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 13 01:12:02.429141 containerd[1507]: time="2025-09-13T01:12:02.428595550Z" level=warning msg="cleanup warnings time=\"2025-09-13T01:12:02Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Sep 13 01:12:02.476027 systemd[1]: Created slice kubepods-burstable-pod6300a4f8_ae4b_49f1_a9a0_a5bb3a5cd0d9.slice - libcontainer container kubepods-burstable-pod6300a4f8_ae4b_49f1_a9a0_a5bb3a5cd0d9.slice. 
Sep 13 01:12:02.503004 systemd[1]: Created slice kubepods-burstable-poda73f9354_94fe_434f_94c6_f203f326e804.slice - libcontainer container kubepods-burstable-poda73f9354_94fe_434f_94c6_f203f326e804.slice. Sep 13 01:12:02.519749 systemd[1]: Created slice kubepods-besteffort-pod1e3277dd_9a07_4afe_ac81_d3afe9a4aa46.slice - libcontainer container kubepods-besteffort-pod1e3277dd_9a07_4afe_ac81_d3afe9a4aa46.slice. Sep 13 01:12:02.524766 kubelet[2705]: I0913 01:12:02.524605 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmm7p\" (UniqueName: \"kubernetes.io/projected/ae902541-3045-4fda-8e34-8995114228b4-kube-api-access-mmm7p\") pod \"goldmane-54d579b49d-9s577\" (UID: \"ae902541-3045-4fda-8e34-8995114228b4\") " pod="calico-system/goldmane-54d579b49d-9s577" Sep 13 01:12:02.525675 kubelet[2705]: I0913 01:12:02.525274 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f1634209-c6c5-41be-8e7f-82f83544c3ed-calico-apiserver-certs\") pod \"calico-apiserver-658bd7dd7b-wdc44\" (UID: \"f1634209-c6c5-41be-8e7f-82f83544c3ed\") " pod="calico-apiserver/calico-apiserver-658bd7dd7b-wdc44" Sep 13 01:12:02.528220 kubelet[2705]: I0913 01:12:02.526274 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8c44\" (UniqueName: \"kubernetes.io/projected/a73f9354-94fe-434f-94c6-f203f326e804-kube-api-access-b8c44\") pod \"coredns-668d6bf9bc-kdwhw\" (UID: \"a73f9354-94fe-434f-94c6-f203f326e804\") " pod="kube-system/coredns-668d6bf9bc-kdwhw" Sep 13 01:12:02.528220 kubelet[2705]: I0913 01:12:02.527037 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae902541-3045-4fda-8e34-8995114228b4-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-9s577\" (UID: 
\"ae902541-3045-4fda-8e34-8995114228b4\") " pod="calico-system/goldmane-54d579b49d-9s577" Sep 13 01:12:02.528220 kubelet[2705]: I0913 01:12:02.527077 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/ae902541-3045-4fda-8e34-8995114228b4-goldmane-key-pair\") pod \"goldmane-54d579b49d-9s577\" (UID: \"ae902541-3045-4fda-8e34-8995114228b4\") " pod="calico-system/goldmane-54d579b49d-9s577" Sep 13 01:12:02.528220 kubelet[2705]: I0913 01:12:02.527147 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg6c9\" (UniqueName: \"kubernetes.io/projected/f1634209-c6c5-41be-8e7f-82f83544c3ed-kube-api-access-kg6c9\") pod \"calico-apiserver-658bd7dd7b-wdc44\" (UID: \"f1634209-c6c5-41be-8e7f-82f83544c3ed\") " pod="calico-apiserver/calico-apiserver-658bd7dd7b-wdc44" Sep 13 01:12:02.528220 kubelet[2705]: I0913 01:12:02.527206 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9-config-volume\") pod \"coredns-668d6bf9bc-sn2fg\" (UID: \"6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9\") " pod="kube-system/coredns-668d6bf9bc-sn2fg" Sep 13 01:12:02.529151 kubelet[2705]: I0913 01:12:02.527250 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzlsp\" (UniqueName: \"kubernetes.io/projected/6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9-kube-api-access-lzlsp\") pod \"coredns-668d6bf9bc-sn2fg\" (UID: \"6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9\") " pod="kube-system/coredns-668d6bf9bc-sn2fg" Sep 13 01:12:02.529151 kubelet[2705]: I0913 01:12:02.527281 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkwvd\" (UniqueName: 
\"kubernetes.io/projected/9962c42a-298d-44e3-93d7-b667b8f90fa1-kube-api-access-gkwvd\") pod \"whisker-84fd46496-c4kbx\" (UID: \"9962c42a-298d-44e3-93d7-b667b8f90fa1\") " pod="calico-system/whisker-84fd46496-c4kbx" Sep 13 01:12:02.529151 kubelet[2705]: I0913 01:12:02.527309 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e62adc23-d970-45f8-9359-c39be02aa620-calico-apiserver-certs\") pod \"calico-apiserver-658bd7dd7b-zf7gk\" (UID: \"e62adc23-d970-45f8-9359-c39be02aa620\") " pod="calico-apiserver/calico-apiserver-658bd7dd7b-zf7gk" Sep 13 01:12:02.529151 kubelet[2705]: I0913 01:12:02.527342 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae902541-3045-4fda-8e34-8995114228b4-config\") pod \"goldmane-54d579b49d-9s577\" (UID: \"ae902541-3045-4fda-8e34-8995114228b4\") " pod="calico-system/goldmane-54d579b49d-9s577" Sep 13 01:12:02.529151 kubelet[2705]: I0913 01:12:02.527372 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9962c42a-298d-44e3-93d7-b667b8f90fa1-whisker-ca-bundle\") pod \"whisker-84fd46496-c4kbx\" (UID: \"9962c42a-298d-44e3-93d7-b667b8f90fa1\") " pod="calico-system/whisker-84fd46496-c4kbx" Sep 13 01:12:02.531830 kubelet[2705]: I0913 01:12:02.527398 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a73f9354-94fe-434f-94c6-f203f326e804-config-volume\") pod \"coredns-668d6bf9bc-kdwhw\" (UID: \"a73f9354-94fe-434f-94c6-f203f326e804\") " pod="kube-system/coredns-668d6bf9bc-kdwhw" Sep 13 01:12:02.531830 kubelet[2705]: I0913 01:12:02.527434 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f7bpv\" (UniqueName: \"kubernetes.io/projected/1e3277dd-9a07-4afe-ac81-d3afe9a4aa46-kube-api-access-f7bpv\") pod \"calico-kube-controllers-5b7745757f-zhjt6\" (UID: \"1e3277dd-9a07-4afe-ac81-d3afe9a4aa46\") " pod="calico-system/calico-kube-controllers-5b7745757f-zhjt6" Sep 13 01:12:02.531830 kubelet[2705]: I0913 01:12:02.527470 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9962c42a-298d-44e3-93d7-b667b8f90fa1-whisker-backend-key-pair\") pod \"whisker-84fd46496-c4kbx\" (UID: \"9962c42a-298d-44e3-93d7-b667b8f90fa1\") " pod="calico-system/whisker-84fd46496-c4kbx" Sep 13 01:12:02.531830 kubelet[2705]: I0913 01:12:02.527501 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgj4z\" (UniqueName: \"kubernetes.io/projected/e62adc23-d970-45f8-9359-c39be02aa620-kube-api-access-cgj4z\") pod \"calico-apiserver-658bd7dd7b-zf7gk\" (UID: \"e62adc23-d970-45f8-9359-c39be02aa620\") " pod="calico-apiserver/calico-apiserver-658bd7dd7b-zf7gk" Sep 13 01:12:02.531830 kubelet[2705]: I0913 01:12:02.527535 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e3277dd-9a07-4afe-ac81-d3afe9a4aa46-tigera-ca-bundle\") pod \"calico-kube-controllers-5b7745757f-zhjt6\" (UID: \"1e3277dd-9a07-4afe-ac81-d3afe9a4aa46\") " pod="calico-system/calico-kube-controllers-5b7745757f-zhjt6" Sep 13 01:12:02.544837 systemd[1]: Created slice kubepods-besteffort-pod9962c42a_298d_44e3_93d7_b667b8f90fa1.slice - libcontainer container kubepods-besteffort-pod9962c42a_298d_44e3_93d7_b667b8f90fa1.slice. Sep 13 01:12:02.559720 systemd[1]: Created slice kubepods-besteffort-pode62adc23_d970_45f8_9359_c39be02aa620.slice - libcontainer container kubepods-besteffort-pode62adc23_d970_45f8_9359_c39be02aa620.slice. 
Sep 13 01:12:02.579578 systemd[1]: Created slice kubepods-besteffort-podf1634209_c6c5_41be_8e7f_82f83544c3ed.slice - libcontainer container kubepods-besteffort-podf1634209_c6c5_41be_8e7f_82f83544c3ed.slice. Sep 13 01:12:02.592630 systemd[1]: Created slice kubepods-besteffort-podae902541_3045_4fda_8e34_8995114228b4.slice - libcontainer container kubepods-besteffort-podae902541_3045_4fda_8e34_8995114228b4.slice. Sep 13 01:12:02.738775 systemd[1]: Created slice kubepods-besteffort-pod2c92e0f8_426f_428f_9601_3c255a79b3c3.slice - libcontainer container kubepods-besteffort-pod2c92e0f8_426f_428f_9601_3c255a79b3c3.slice. Sep 13 01:12:02.744897 containerd[1507]: time="2025-09-13T01:12:02.744842924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8vq7t,Uid:2c92e0f8-426f-428f-9601-3c255a79b3c3,Namespace:calico-system,Attempt:0,}" Sep 13 01:12:02.802211 containerd[1507]: time="2025-09-13T01:12:02.802144531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sn2fg,Uid:6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9,Namespace:kube-system,Attempt:0,}" Sep 13 01:12:02.814905 containerd[1507]: time="2025-09-13T01:12:02.814827082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kdwhw,Uid:a73f9354-94fe-434f-94c6-f203f326e804,Namespace:kube-system,Attempt:0,}" Sep 13 01:12:02.833069 containerd[1507]: time="2025-09-13T01:12:02.832864330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b7745757f-zhjt6,Uid:1e3277dd-9a07-4afe-ac81-d3afe9a4aa46,Namespace:calico-system,Attempt:0,}" Sep 13 01:12:02.854246 containerd[1507]: time="2025-09-13T01:12:02.854163263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84fd46496-c4kbx,Uid:9962c42a-298d-44e3-93d7-b667b8f90fa1,Namespace:calico-system,Attempt:0,}" Sep 13 01:12:02.874511 containerd[1507]: time="2025-09-13T01:12:02.874302410Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-658bd7dd7b-zf7gk,Uid:e62adc23-d970-45f8-9359-c39be02aa620,Namespace:calico-apiserver,Attempt:0,}" Sep 13 01:12:02.888320 containerd[1507]: time="2025-09-13T01:12:02.887740116Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658bd7dd7b-wdc44,Uid:f1634209-c6c5-41be-8e7f-82f83544c3ed,Namespace:calico-apiserver,Attempt:0,}" Sep 13 01:12:02.900917 containerd[1507]: time="2025-09-13T01:12:02.900849972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-9s577,Uid:ae902541-3045-4fda-8e34-8995114228b4,Namespace:calico-system,Attempt:0,}" Sep 13 01:12:03.018009 containerd[1507]: time="2025-09-13T01:12:03.017764752Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 13 01:12:03.312895 containerd[1507]: time="2025-09-13T01:12:03.312687998Z" level=error msg="Failed to destroy network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.325263 containerd[1507]: time="2025-09-13T01:12:03.314387707Z" level=error msg="Failed to destroy network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.336200 containerd[1507]: time="2025-09-13T01:12:03.327167469Z" level=error msg="encountered an error cleaning up failed sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 13 01:12:03.336200 containerd[1507]: time="2025-09-13T01:12:03.327782787Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8vq7t,Uid:2c92e0f8-426f-428f-9601-3c255a79b3c3,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.336200 containerd[1507]: time="2025-09-13T01:12:03.329175623Z" level=error msg="encountered an error cleaning up failed sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.336200 containerd[1507]: time="2025-09-13T01:12:03.329268862Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sn2fg,Uid:6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.351163 kubelet[2705]: E0913 01:12:03.350952 2705 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 
01:12:03.351163 kubelet[2705]: E0913 01:12:03.350963 2705 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.351163 kubelet[2705]: E0913 01:12:03.351168 2705 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8vq7t" Sep 13 01:12:03.351544 kubelet[2705]: E0913 01:12:03.351403 2705 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8vq7t" Sep 13 01:12:03.351544 kubelet[2705]: E0913 01:12:03.351489 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8vq7t_calico-system(2c92e0f8-426f-428f-9601-3c255a79b3c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8vq7t_calico-system(2c92e0f8-426f-428f-9601-3c255a79b3c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3" Sep 13 01:12:03.354087 kubelet[2705]: E0913 01:12:03.351092 2705 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-sn2fg" Sep 13 01:12:03.354087 kubelet[2705]: E0913 01:12:03.351706 2705 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-sn2fg" Sep 13 01:12:03.354087 kubelet[2705]: E0913 01:12:03.351785 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-sn2fg_kube-system(6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-sn2fg_kube-system(6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-sn2fg" 
podUID="6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9" Sep 13 01:12:03.362626 containerd[1507]: time="2025-09-13T01:12:03.362325545Z" level=error msg="Failed to destroy network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.366518 containerd[1507]: time="2025-09-13T01:12:03.366231007Z" level=error msg="encountered an error cleaning up failed sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.366518 containerd[1507]: time="2025-09-13T01:12:03.366319048Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658bd7dd7b-wdc44,Uid:f1634209-c6c5-41be-8e7f-82f83544c3ed,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.367733 kubelet[2705]: E0913 01:12:03.367675 2705 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.367944 kubelet[2705]: E0913 01:12:03.367779 2705 kuberuntime_sandbox.go:72] "Failed to create sandbox 
for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-658bd7dd7b-wdc44" Sep 13 01:12:03.367944 kubelet[2705]: E0913 01:12:03.367816 2705 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-658bd7dd7b-wdc44" Sep 13 01:12:03.370394 kubelet[2705]: E0913 01:12:03.368296 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-658bd7dd7b-wdc44_calico-apiserver(f1634209-c6c5-41be-8e7f-82f83544c3ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-658bd7dd7b-wdc44_calico-apiserver(f1634209-c6c5-41be-8e7f-82f83544c3ed)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-658bd7dd7b-wdc44" podUID="f1634209-c6c5-41be-8e7f-82f83544c3ed" Sep 13 01:12:03.407891 containerd[1507]: time="2025-09-13T01:12:03.407676785Z" level=error msg="Failed to destroy network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.409702 containerd[1507]: time="2025-09-13T01:12:03.409513796Z" level=error msg="encountered an error cleaning up failed sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.409702 containerd[1507]: time="2025-09-13T01:12:03.409628859Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84fd46496-c4kbx,Uid:9962c42a-298d-44e3-93d7-b667b8f90fa1,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.411556 kubelet[2705]: E0913 01:12:03.411473 2705 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.413787 kubelet[2705]: E0913 01:12:03.411593 2705 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/whisker-84fd46496-c4kbx" Sep 13 01:12:03.413787 kubelet[2705]: E0913 01:12:03.411634 2705 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-84fd46496-c4kbx" Sep 13 01:12:03.413787 kubelet[2705]: E0913 01:12:03.411721 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-84fd46496-c4kbx_calico-system(9962c42a-298d-44e3-93d7-b667b8f90fa1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-84fd46496-c4kbx_calico-system(9962c42a-298d-44e3-93d7-b667b8f90fa1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-84fd46496-c4kbx" podUID="9962c42a-298d-44e3-93d7-b667b8f90fa1" Sep 13 01:12:03.416877 containerd[1507]: time="2025-09-13T01:12:03.416677570Z" level=error msg="Failed to destroy network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.420509 containerd[1507]: time="2025-09-13T01:12:03.420210169Z" level=error msg="encountered an error cleaning up failed sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.420509 containerd[1507]: time="2025-09-13T01:12:03.420323993Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b7745757f-zhjt6,Uid:1e3277dd-9a07-4afe-ac81-d3afe9a4aa46,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.423676 kubelet[2705]: E0913 01:12:03.423595 2705 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.423930 kubelet[2705]: E0913 01:12:03.423683 2705 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5b7745757f-zhjt6" Sep 13 01:12:03.423930 kubelet[2705]: E0913 01:12:03.423729 2705 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5b7745757f-zhjt6" Sep 13 01:12:03.423930 kubelet[2705]: E0913 01:12:03.423792 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5b7745757f-zhjt6_calico-system(1e3277dd-9a07-4afe-ac81-d3afe9a4aa46)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5b7745757f-zhjt6_calico-system(1e3277dd-9a07-4afe-ac81-d3afe9a4aa46)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b7745757f-zhjt6" podUID="1e3277dd-9a07-4afe-ac81-d3afe9a4aa46" Sep 13 01:12:03.442169 containerd[1507]: time="2025-09-13T01:12:03.441815987Z" level=error msg="Failed to destroy network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.442979 containerd[1507]: time="2025-09-13T01:12:03.442698714Z" level=error msg="encountered an error cleaning up failed sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.442979 containerd[1507]: time="2025-09-13T01:12:03.442793455Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kdwhw,Uid:a73f9354-94fe-434f-94c6-f203f326e804,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.443685 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f-shm.mount: Deactivated successfully. Sep 13 01:12:03.443868 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc-shm.mount: Deactivated successfully. Sep 13 01:12:03.448931 kubelet[2705]: E0913 01:12:03.448806 2705 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.448931 kubelet[2705]: E0913 01:12:03.448909 2705 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-kdwhw" Sep 13 01:12:03.449565 kubelet[2705]: E0913 01:12:03.448947 2705 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-kdwhw" Sep 13 01:12:03.449565 kubelet[2705]: E0913 01:12:03.449035 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-kdwhw_kube-system(a73f9354-94fe-434f-94c6-f203f326e804)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-kdwhw_kube-system(a73f9354-94fe-434f-94c6-f203f326e804)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-kdwhw" podUID="a73f9354-94fe-434f-94c6-f203f326e804" Sep 13 01:12:03.452840 containerd[1507]: time="2025-09-13T01:12:03.452543180Z" level=error msg="Failed to destroy network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.454265 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f-shm.mount: Deactivated successfully. 
Sep 13 01:12:03.460411 containerd[1507]: time="2025-09-13T01:12:03.454560538Z" level=error msg="encountered an error cleaning up failed sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.461049 containerd[1507]: time="2025-09-13T01:12:03.460922362Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658bd7dd7b-zf7gk,Uid:e62adc23-d970-45f8-9359-c39be02aa620,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.462818 kubelet[2705]: E0913 01:12:03.462493 2705 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.462818 kubelet[2705]: E0913 01:12:03.462606 2705 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-658bd7dd7b-zf7gk" Sep 13 01:12:03.462818 kubelet[2705]: E0913 
01:12:03.462655 2705 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-658bd7dd7b-zf7gk" Sep 13 01:12:03.463043 kubelet[2705]: E0913 01:12:03.462734 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-658bd7dd7b-zf7gk_calico-apiserver(e62adc23-d970-45f8-9359-c39be02aa620)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-658bd7dd7b-zf7gk_calico-apiserver(e62adc23-d970-45f8-9359-c39be02aa620)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-658bd7dd7b-zf7gk" podUID="e62adc23-d970-45f8-9359-c39be02aa620" Sep 13 01:12:03.463498 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e-shm.mount: Deactivated successfully. 
Sep 13 01:12:03.494018 containerd[1507]: time="2025-09-13T01:12:03.493759726Z" level=error msg="Failed to destroy network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.498132 containerd[1507]: time="2025-09-13T01:12:03.494932918Z" level=error msg="encountered an error cleaning up failed sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.498132 containerd[1507]: time="2025-09-13T01:12:03.495017271Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-9s577,Uid:ae902541-3045-4fda-8e34-8995114228b4,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.499364 kubelet[2705]: E0913 01:12:03.498674 2705 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:03.499364 kubelet[2705]: E0913 01:12:03.498778 2705 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-9s577" Sep 13 01:12:03.499364 kubelet[2705]: E0913 01:12:03.498816 2705 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-9s577" Sep 13 01:12:03.499561 kubelet[2705]: E0913 01:12:03.498892 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-9s577_calico-system(ae902541-3045-4fda-8e34-8995114228b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-9s577_calico-system(ae902541-3045-4fda-8e34-8995114228b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-9s577" podUID="ae902541-3045-4fda-8e34-8995114228b4" Sep 13 01:12:03.500836 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218-shm.mount: Deactivated successfully. 
Sep 13 01:12:04.018645 kubelet[2705]: I0913 01:12:04.018591 2705 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:12:04.023139 kubelet[2705]: I0913 01:12:04.022759 2705 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:04.044359 kubelet[2705]: I0913 01:12:04.043805 2705 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:12:04.056405 kubelet[2705]: I0913 01:12:04.055425 2705 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:04.058584 kubelet[2705]: I0913 01:12:04.058556 2705 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:04.069420 containerd[1507]: time="2025-09-13T01:12:04.069350675Z" level=info msg="StopPodSandbox for \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\"" Sep 13 01:12:04.071147 containerd[1507]: time="2025-09-13T01:12:04.070329713Z" level=info msg="StopPodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\"" Sep 13 01:12:04.071710 containerd[1507]: time="2025-09-13T01:12:04.071415999Z" level=info msg="Ensure that sandbox 69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8 in task-service has been cleanup successfully" Sep 13 01:12:04.071778 containerd[1507]: time="2025-09-13T01:12:04.071423121Z" level=info msg="Ensure that sandbox ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884 in task-service has been cleanup successfully" Sep 13 01:12:04.072216 containerd[1507]: time="2025-09-13T01:12:04.072182931Z" level=info 
msg="StopPodSandbox for \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\"" Sep 13 01:12:04.075343 containerd[1507]: time="2025-09-13T01:12:04.075310719Z" level=info msg="Ensure that sandbox 5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc in task-service has been cleanup successfully" Sep 13 01:12:04.092857 containerd[1507]: time="2025-09-13T01:12:04.092162741Z" level=info msg="StopPodSandbox for \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\"" Sep 13 01:12:04.092857 containerd[1507]: time="2025-09-13T01:12:04.092470812Z" level=info msg="Ensure that sandbox c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f in task-service has been cleanup successfully" Sep 13 01:12:04.093878 containerd[1507]: time="2025-09-13T01:12:04.093384368Z" level=info msg="StopPodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\"" Sep 13 01:12:04.093878 containerd[1507]: time="2025-09-13T01:12:04.093606227Z" level=info msg="Ensure that sandbox ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f in task-service has been cleanup successfully" Sep 13 01:12:04.101680 kubelet[2705]: I0913 01:12:04.101628 2705 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:04.105826 containerd[1507]: time="2025-09-13T01:12:04.105774062Z" level=info msg="StopPodSandbox for \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\"" Sep 13 01:12:04.106200 containerd[1507]: time="2025-09-13T01:12:04.106088554Z" level=info msg="Ensure that sandbox 3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218 in task-service has been cleanup successfully" Sep 13 01:12:04.117924 kubelet[2705]: I0913 01:12:04.117867 2705 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 
01:12:04.124147 containerd[1507]: time="2025-09-13T01:12:04.122346521Z" level=info msg="StopPodSandbox for \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\"" Sep 13 01:12:04.124774 containerd[1507]: time="2025-09-13T01:12:04.124565911Z" level=info msg="Ensure that sandbox 898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e in task-service has been cleanup successfully" Sep 13 01:12:04.142015 kubelet[2705]: I0913 01:12:04.141667 2705 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:04.155524 containerd[1507]: time="2025-09-13T01:12:04.155460070Z" level=info msg="StopPodSandbox for \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\"" Sep 13 01:12:04.156061 containerd[1507]: time="2025-09-13T01:12:04.156029410Z" level=info msg="Ensure that sandbox 3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08 in task-service has been cleanup successfully" Sep 13 01:12:04.312388 containerd[1507]: time="2025-09-13T01:12:04.312078399Z" level=error msg="StopPodSandbox for \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\" failed" error="failed to destroy network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:04.315552 containerd[1507]: time="2025-09-13T01:12:04.313229353Z" level=error msg="StopPodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\" failed" error="failed to destroy network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 13 01:12:04.315651 kubelet[2705]: E0913 01:12:04.314168 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:04.315651 kubelet[2705]: E0913 01:12:04.314532 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:12:04.317962 containerd[1507]: time="2025-09-13T01:12:04.317046984Z" level=error msg="StopPodSandbox for \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\" failed" error="failed to destroy network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:04.322653 containerd[1507]: time="2025-09-13T01:12:04.322576368Z" level=error msg="StopPodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\" failed" error="failed to destroy network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Sep 13 01:12:04.327490 kubelet[2705]: E0913 01:12:04.326656 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:12:04.327490 kubelet[2705]: E0913 01:12:04.326777 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:04.327490 kubelet[2705]: E0913 01:12:04.314293 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f"} Sep 13 01:12:04.327490 kubelet[2705]: E0913 01:12:04.326744 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884"} Sep 13 01:12:04.327490 kubelet[2705]: E0913 01:12:04.326944 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9962c42a-298d-44e3-93d7-b667b8f90fa1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:04.327951 kubelet[2705]: E0913 01:12:04.327011 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9962c42a-298d-44e3-93d7-b667b8f90fa1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-84fd46496-c4kbx" podUID="9962c42a-298d-44e3-93d7-b667b8f90fa1" Sep 13 01:12:04.327951 kubelet[2705]: E0913 01:12:04.326948 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"f1634209-c6c5-41be-8e7f-82f83544c3ed\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:04.327951 kubelet[2705]: E0913 01:12:04.327163 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"f1634209-c6c5-41be-8e7f-82f83544c3ed\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-658bd7dd7b-wdc44" 
podUID="f1634209-c6c5-41be-8e7f-82f83544c3ed" Sep 13 01:12:04.327951 kubelet[2705]: E0913 01:12:04.326808 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e"} Sep 13 01:12:04.328367 kubelet[2705]: E0913 01:12:04.327245 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e62adc23-d970-45f8-9359-c39be02aa620\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:04.328367 kubelet[2705]: E0913 01:12:04.327305 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e62adc23-d970-45f8-9359-c39be02aa620\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-658bd7dd7b-zf7gk" podUID="e62adc23-d970-45f8-9359-c39be02aa620" Sep 13 01:12:04.328367 kubelet[2705]: E0913 01:12:04.314596 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f"} Sep 13 01:12:04.328367 kubelet[2705]: E0913 01:12:04.327389 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"a73f9354-94fe-434f-94c6-f203f326e804\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to 
destroy network for sandbox \\\"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:04.328716 kubelet[2705]: E0913 01:12:04.327421 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"a73f9354-94fe-434f-94c6-f203f326e804\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-kdwhw" podUID="a73f9354-94fe-434f-94c6-f203f326e804" Sep 13 01:12:04.333852 containerd[1507]: time="2025-09-13T01:12:04.332533764Z" level=error msg="StopPodSandbox for \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\" failed" error="failed to destroy network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:04.333852 containerd[1507]: time="2025-09-13T01:12:04.333587580Z" level=error msg="StopPodSandbox for \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\" failed" error="failed to destroy network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:04.333852 containerd[1507]: time="2025-09-13T01:12:04.333716604Z" level=error 
msg="StopPodSandbox for \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\" failed" error="failed to destroy network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:04.334784 kubelet[2705]: E0913 01:12:04.332997 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:04.334784 kubelet[2705]: E0913 01:12:04.333191 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8"} Sep 13 01:12:04.334784 kubelet[2705]: E0913 01:12:04.333280 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:04.334784 kubelet[2705]: E0913 01:12:04.333349 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-sn2fg" podUID="6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9" Sep 13 01:12:04.335120 kubelet[2705]: E0913 01:12:04.334014 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:04.335120 kubelet[2705]: E0913 01:12:04.334157 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218"} Sep 13 01:12:04.335120 kubelet[2705]: E0913 01:12:04.334206 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ae902541-3045-4fda-8e34-8995114228b4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:04.335120 kubelet[2705]: E0913 01:12:04.334242 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ae902541-3045-4fda-8e34-8995114228b4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-9s577" podUID="ae902541-3045-4fda-8e34-8995114228b4" Sep 13 01:12:04.335392 kubelet[2705]: E0913 01:12:04.333976 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:04.335392 kubelet[2705]: E0913 01:12:04.334423 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc"} Sep 13 01:12:04.335392 kubelet[2705]: E0913 01:12:04.334466 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1e3277dd-9a07-4afe-ac81-d3afe9a4aa46\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:04.335392 kubelet[2705]: E0913 01:12:04.334500 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1e3277dd-9a07-4afe-ac81-d3afe9a4aa46\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b7745757f-zhjt6" podUID="1e3277dd-9a07-4afe-ac81-d3afe9a4aa46" Sep 13 01:12:04.344231 containerd[1507]: time="2025-09-13T01:12:04.344135263Z" level=error msg="StopPodSandbox for \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\" failed" error="failed to destroy network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:04.344767 kubelet[2705]: E0913 01:12:04.344487 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:04.344767 kubelet[2705]: E0913 01:12:04.344593 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08"} Sep 13 01:12:04.344767 kubelet[2705]: E0913 01:12:04.344651 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"2c92e0f8-426f-428f-9601-3c255a79b3c3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:04.344767 kubelet[2705]: E0913 01:12:04.344687 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"2c92e0f8-426f-428f-9601-3c255a79b3c3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8vq7t" podUID="2c92e0f8-426f-428f-9601-3c255a79b3c3" Sep 13 01:12:14.655702 containerd[1507]: time="2025-09-13T01:12:14.653945785Z" level=info msg="StopPodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\"" Sep 13 01:12:14.785876 containerd[1507]: time="2025-09-13T01:12:14.785740646Z" level=error msg="StopPodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\" failed" error="failed to destroy network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:14.787911 kubelet[2705]: E0913 01:12:14.787692 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:12:14.787911 kubelet[2705]: E0913 01:12:14.787797 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f"} Sep 13 01:12:14.787911 kubelet[2705]: E0913 01:12:14.787859 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"a73f9354-94fe-434f-94c6-f203f326e804\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:14.787911 kubelet[2705]: E0913 01:12:14.787898 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"a73f9354-94fe-434f-94c6-f203f326e804\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-kdwhw" podUID="a73f9354-94fe-434f-94c6-f203f326e804" Sep 13 01:12:15.151863 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3603573209.mount: Deactivated successfully. 
Sep 13 01:12:15.271998 containerd[1507]: time="2025-09-13T01:12:15.271002600Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 13 01:12:15.323691 containerd[1507]: time="2025-09-13T01:12:15.323607716Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:15.366520 containerd[1507]: time="2025-09-13T01:12:15.366441958Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:15.369826 containerd[1507]: time="2025-09-13T01:12:15.368604759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:15.373931 containerd[1507]: time="2025-09-13T01:12:15.373160683Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 12.350332473s" Sep 13 01:12:15.374092 containerd[1507]: time="2025-09-13T01:12:15.374062040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 13 01:12:15.468970 containerd[1507]: time="2025-09-13T01:12:15.467917214Z" level=info msg="CreateContainer within sandbox \"9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 13 01:12:15.550499 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2776222619.mount: 
Deactivated successfully. Sep 13 01:12:15.560823 containerd[1507]: time="2025-09-13T01:12:15.560724739Z" level=info msg="CreateContainer within sandbox \"9ffba7aea979eedfb664a5728c098be7624f1fb2daaf5466d79fcf08a84b6ab3\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c1b265e016f0174c04072e8b952c791971937208c0e592b1bd3efd76588d44cb\"" Sep 13 01:12:15.565973 containerd[1507]: time="2025-09-13T01:12:15.565643907Z" level=info msg="StartContainer for \"c1b265e016f0174c04072e8b952c791971937208c0e592b1bd3efd76588d44cb\"" Sep 13 01:12:15.645142 containerd[1507]: time="2025-09-13T01:12:15.644279636Z" level=info msg="StopPodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\"" Sep 13 01:12:15.729258 containerd[1507]: time="2025-09-13T01:12:15.727282791Z" level=error msg="StopPodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\" failed" error="failed to destroy network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:12:15.728846 systemd[1]: Started cri-containerd-c1b265e016f0174c04072e8b952c791971937208c0e592b1bd3efd76588d44cb.scope - libcontainer container c1b265e016f0174c04072e8b952c791971937208c0e592b1bd3efd76588d44cb. 
Sep 13 01:12:15.730015 kubelet[2705]: E0913 01:12:15.728271 2705 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:12:15.730015 kubelet[2705]: E0913 01:12:15.728362 2705 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884"} Sep 13 01:12:15.730015 kubelet[2705]: E0913 01:12:15.728438 2705 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"f1634209-c6c5-41be-8e7f-82f83544c3ed\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:12:15.730015 kubelet[2705]: E0913 01:12:15.728500 2705 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"f1634209-c6c5-41be-8e7f-82f83544c3ed\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-658bd7dd7b-wdc44" podUID="f1634209-c6c5-41be-8e7f-82f83544c3ed" Sep 13 01:12:15.798974 
containerd[1507]: time="2025-09-13T01:12:15.798791144Z" level=info msg="StartContainer for \"c1b265e016f0174c04072e8b952c791971937208c0e592b1bd3efd76588d44cb\" returns successfully" Sep 13 01:12:16.157259 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 13 01:12:16.159245 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 13 01:12:16.334855 systemd[1]: run-containerd-runc-k8s.io-c1b265e016f0174c04072e8b952c791971937208c0e592b1bd3efd76588d44cb-runc.AnyGBu.mount: Deactivated successfully. Sep 13 01:12:16.641217 containerd[1507]: time="2025-09-13T01:12:16.641084627Z" level=info msg="StopPodSandbox for \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\"" Sep 13 01:12:16.649756 containerd[1507]: time="2025-09-13T01:12:16.649160915Z" level=info msg="StopPodSandbox for \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\"" Sep 13 01:12:16.649756 containerd[1507]: time="2025-09-13T01:12:16.649539618Z" level=info msg="StopPodSandbox for \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\"" Sep 13 01:12:16.879880 kubelet[2705]: I0913 01:12:16.866038 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-292qn" podStartSLOduration=2.3479974820000002 podStartE2EDuration="28.830879338s" podCreationTimestamp="2025-09-13 01:11:48 +0000 UTC" firstStartedPulling="2025-09-13 01:11:48.893266428 +0000 UTC m=+24.591114649" lastFinishedPulling="2025-09-13 01:12:15.376148279 +0000 UTC m=+51.073996505" observedRunningTime="2025-09-13 01:12:16.288377185 +0000 UTC m=+51.986225419" watchObservedRunningTime="2025-09-13 01:12:16.830879338 +0000 UTC m=+52.528727576" Sep 13 01:12:16.891050 containerd[1507]: time="2025-09-13T01:12:16.890487046Z" level=info msg="StopPodSandbox for \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\"" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:16.955 [INFO][4020] 
cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:16.956 [INFO][4020] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" iface="eth0" netns="/var/run/netns/cni-24723efb-7ca7-69a7-89b5-90ac206064ca" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:16.958 [INFO][4020] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" iface="eth0" netns="/var/run/netns/cni-24723efb-7ca7-69a7-89b5-90ac206064ca" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:16.959 [INFO][4020] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" iface="eth0" netns="/var/run/netns/cni-24723efb-7ca7-69a7-89b5-90ac206064ca" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:16.959 [INFO][4020] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:16.959 [INFO][4020] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:17.261 [INFO][4058] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:17.268 [INFO][4058] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:17.272 [INFO][4058] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:17.310 [WARNING][4058] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:17.310 [INFO][4058] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:17.313 [INFO][4058] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:17.334147 containerd[1507]: 2025-09-13 01:12:17.318 [INFO][4020] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:17.334147 containerd[1507]: time="2025-09-13T01:12:17.332577333Z" level=info msg="TearDown network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\" successfully" Sep 13 01:12:17.334147 containerd[1507]: time="2025-09-13T01:12:17.332630491Z" level=info msg="StopPodSandbox for \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\" returns successfully" Sep 13 01:12:17.337780 systemd[1]: run-containerd-runc-k8s.io-c1b265e016f0174c04072e8b952c791971937208c0e592b1bd3efd76588d44cb-runc.ObhSQD.mount: Deactivated successfully. Sep 13 01:12:17.337919 systemd[1]: run-netns-cni\x2d24723efb\x2d7ca7\x2d69a7\x2d89b5\x2d90ac206064ca.mount: Deactivated successfully. 
Sep 13 01:12:17.348947 containerd[1507]: time="2025-09-13T01:12:17.347578312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8vq7t,Uid:2c92e0f8-426f-428f-9601-3c255a79b3c3,Namespace:calico-system,Attempt:1,}" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:16.926 [INFO][4024] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:16.927 [INFO][4024] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" iface="eth0" netns="/var/run/netns/cni-b5036c23-54ba-8844-2053-ae383c4de681" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:16.931 [INFO][4024] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" iface="eth0" netns="/var/run/netns/cni-b5036c23-54ba-8844-2053-ae383c4de681" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:16.933 [INFO][4024] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" iface="eth0" netns="/var/run/netns/cni-b5036c23-54ba-8844-2053-ae383c4de681" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:16.933 [INFO][4024] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:16.934 [INFO][4024] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:17.261 [INFO][4042] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:17.269 [INFO][4042] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:17.313 [INFO][4042] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:17.356 [WARNING][4042] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:17.356 [INFO][4042] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:17.364 [INFO][4042] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:17.403267 containerd[1507]: 2025-09-13 01:12:17.393 [INFO][4024] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:17.408757 containerd[1507]: time="2025-09-13T01:12:17.404716194Z" level=info msg="TearDown network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\" successfully" Sep 13 01:12:17.408757 containerd[1507]: time="2025-09-13T01:12:17.404763674Z" level=info msg="StopPodSandbox for \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\" returns successfully" Sep 13 01:12:17.420594 containerd[1507]: time="2025-09-13T01:12:17.420355813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-9s577,Uid:ae902541-3045-4fda-8e34-8995114228b4,Namespace:calico-system,Attempt:1,}" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:16.932 [INFO][4013] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:16.935 [INFO][4013] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" iface="eth0" netns="/var/run/netns/cni-457c936d-f49a-4bfc-e6d0-d2d14dc4a382" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:16.935 [INFO][4013] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" iface="eth0" netns="/var/run/netns/cni-457c936d-f49a-4bfc-e6d0-d2d14dc4a382" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:16.936 [INFO][4013] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" iface="eth0" netns="/var/run/netns/cni-457c936d-f49a-4bfc-e6d0-d2d14dc4a382" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:16.936 [INFO][4013] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:16.938 [INFO][4013] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:17.295 [INFO][4047] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:17.296 [INFO][4047] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:17.365 [INFO][4047] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:17.394 [WARNING][4047] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:17.395 [INFO][4047] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:17.412 [INFO][4047] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:17.441323 containerd[1507]: 2025-09-13 01:12:17.417 [INFO][4013] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:17.444073 containerd[1507]: time="2025-09-13T01:12:17.443186284Z" level=info msg="TearDown network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\" successfully" Sep 13 01:12:17.444073 containerd[1507]: time="2025-09-13T01:12:17.443226105Z" level=info msg="StopPodSandbox for \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\" returns successfully" Sep 13 01:12:17.452152 containerd[1507]: time="2025-09-13T01:12:17.451634766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b7745757f-zhjt6,Uid:1e3277dd-9a07-4afe-ac81-d3afe9a4aa46,Namespace:calico-system,Attempt:1,}" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.109 [INFO][4053] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.110 [INFO][4053] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" iface="eth0" netns="/var/run/netns/cni-48867381-ec9d-2fff-cca3-a75874b1a61c" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.112 [INFO][4053] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" iface="eth0" netns="/var/run/netns/cni-48867381-ec9d-2fff-cca3-a75874b1a61c" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.114 [INFO][4053] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" iface="eth0" netns="/var/run/netns/cni-48867381-ec9d-2fff-cca3-a75874b1a61c" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.114 [INFO][4053] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.114 [INFO][4053] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.306 [INFO][4075] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.307 [INFO][4075] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.414 [INFO][4075] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.446 [WARNING][4075] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.446 [INFO][4075] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.457 [INFO][4075] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:17.489955 containerd[1507]: 2025-09-13 01:12:17.475 [INFO][4053] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:17.493191 containerd[1507]: time="2025-09-13T01:12:17.490362543Z" level=info msg="TearDown network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\" successfully" Sep 13 01:12:17.493191 containerd[1507]: time="2025-09-13T01:12:17.490414523Z" level=info msg="StopPodSandbox for \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\" returns successfully" Sep 13 01:12:17.678146 kubelet[2705]: I0913 01:12:17.672875 2705 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9962c42a-298d-44e3-93d7-b667b8f90fa1-whisker-backend-key-pair\") pod \"9962c42a-298d-44e3-93d7-b667b8f90fa1\" (UID: \"9962c42a-298d-44e3-93d7-b667b8f90fa1\") " Sep 13 01:12:17.678146 kubelet[2705]: I0913 01:12:17.676310 2705 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9962c42a-298d-44e3-93d7-b667b8f90fa1-whisker-ca-bundle\") pod \"9962c42a-298d-44e3-93d7-b667b8f90fa1\" (UID: \"9962c42a-298d-44e3-93d7-b667b8f90fa1\") " Sep 13 01:12:17.678981 kubelet[2705]: I0913 01:12:17.678484 2705 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkwvd\" (UniqueName: \"kubernetes.io/projected/9962c42a-298d-44e3-93d7-b667b8f90fa1-kube-api-access-gkwvd\") pod \"9962c42a-298d-44e3-93d7-b667b8f90fa1\" (UID: \"9962c42a-298d-44e3-93d7-b667b8f90fa1\") " Sep 13 01:12:17.691138 kubelet[2705]: I0913 01:12:17.689780 2705 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9962c42a-298d-44e3-93d7-b667b8f90fa1-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "9962c42a-298d-44e3-93d7-b667b8f90fa1" (UID: "9962c42a-298d-44e3-93d7-b667b8f90fa1"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 13 01:12:17.711058 kubelet[2705]: I0913 01:12:17.710746 2705 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9962c42a-298d-44e3-93d7-b667b8f90fa1-kube-api-access-gkwvd" (OuterVolumeSpecName: "kube-api-access-gkwvd") pod "9962c42a-298d-44e3-93d7-b667b8f90fa1" (UID: "9962c42a-298d-44e3-93d7-b667b8f90fa1"). InnerVolumeSpecName "kube-api-access-gkwvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 13 01:12:17.711058 kubelet[2705]: I0913 01:12:17.710951 2705 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9962c42a-298d-44e3-93d7-b667b8f90fa1-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "9962c42a-298d-44e3-93d7-b667b8f90fa1" (UID: "9962c42a-298d-44e3-93d7-b667b8f90fa1"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 13 01:12:17.779663 kubelet[2705]: I0913 01:12:17.779375 2705 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gkwvd\" (UniqueName: \"kubernetes.io/projected/9962c42a-298d-44e3-93d7-b667b8f90fa1-kube-api-access-gkwvd\") on node \"srv-5asmg.gb1.brightbox.com\" DevicePath \"\"" Sep 13 01:12:17.780579 kubelet[2705]: I0913 01:12:17.780217 2705 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9962c42a-298d-44e3-93d7-b667b8f90fa1-whisker-backend-key-pair\") on node \"srv-5asmg.gb1.brightbox.com\" DevicePath \"\"" Sep 13 01:12:17.784134 kubelet[2705]: I0913 01:12:17.780245 2705 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9962c42a-298d-44e3-93d7-b667b8f90fa1-whisker-ca-bundle\") on node \"srv-5asmg.gb1.brightbox.com\" DevicePath \"\"" Sep 13 01:12:17.974519 systemd-networkd[1411]: calib1001d8d787: Link UP Sep 13 01:12:17.976589 systemd-networkd[1411]: calib1001d8d787: Gained carrier Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.530 [INFO][4111] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.574 [INFO][4111] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0 csi-node-driver- calico-system 2c92e0f8-426f-428f-9601-3c255a79b3c3 897 0 2025-09-13 01:11:48 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-5asmg.gb1.brightbox.com csi-node-driver-8vq7t eth0 csi-node-driver [] [] [kns.calico-system 
ksa.calico-system.csi-node-driver] calib1001d8d787 [] [] }} ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Namespace="calico-system" Pod="csi-node-driver-8vq7t" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.574 [INFO][4111] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Namespace="calico-system" Pod="csi-node-driver-8vq7t" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.767 [INFO][4141] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" HandleID="k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.767 [INFO][4141] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" HandleID="k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000327350), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-5asmg.gb1.brightbox.com", "pod":"csi-node-driver-8vq7t", "timestamp":"2025-09-13 01:12:17.767044148 +0000 UTC"}, Hostname:"srv-5asmg.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.767 [INFO][4141] ipam/ipam_plugin.go 353: About to acquire 
host-wide IPAM lock. Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.767 [INFO][4141] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.767 [INFO][4141] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5asmg.gb1.brightbox.com' Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.806 [INFO][4141] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.849 [INFO][4141] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.869 [INFO][4141] ipam/ipam.go 511: Trying affinity for 192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.880 [INFO][4141] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.886 [INFO][4141] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.886 [INFO][4141] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.192/26 handle="k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.892 [INFO][4141] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9 Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.907 [INFO][4141] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.192/26 
handle="k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.919 [INFO][4141] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.193/26] block=192.168.21.192/26 handle="k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.919 [INFO][4141] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.193/26] handle="k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.919 [INFO][4141] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:18.028362 containerd[1507]: 2025-09-13 01:12:17.919 [INFO][4141] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.193/26] IPv6=[] ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" HandleID="k8s-pod-network.5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:18.033695 containerd[1507]: 2025-09-13 01:12:17.928 [INFO][4111] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Namespace="calico-system" Pod="csi-node-driver-8vq7t" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2c92e0f8-426f-428f-9601-3c255a79b3c3", ResourceVersion:"897", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 
48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-8vq7t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib1001d8d787", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:18.033695 containerd[1507]: 2025-09-13 01:12:17.929 [INFO][4111] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.193/32] ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Namespace="calico-system" Pod="csi-node-driver-8vq7t" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:18.033695 containerd[1507]: 2025-09-13 01:12:17.930 [INFO][4111] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib1001d8d787 ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Namespace="calico-system" Pod="csi-node-driver-8vq7t" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:18.033695 containerd[1507]: 2025-09-13 01:12:17.976 [INFO][4111] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Namespace="calico-system" Pod="csi-node-driver-8vq7t" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:18.033695 containerd[1507]: 2025-09-13 01:12:17.977 [INFO][4111] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Namespace="calico-system" Pod="csi-node-driver-8vq7t" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2c92e0f8-426f-428f-9601-3c255a79b3c3", ResourceVersion:"897", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9", Pod:"csi-node-driver-8vq7t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"calib1001d8d787", MAC:"42:86:5a:0f:90:f0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:18.033695 containerd[1507]: 2025-09-13 01:12:18.020 [INFO][4111] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9" Namespace="calico-system" Pod="csi-node-driver-8vq7t" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:18.091999 systemd-networkd[1411]: cali1624d0c7b70: Link UP Sep 13 01:12:18.092447 systemd-networkd[1411]: cali1624d0c7b70: Gained carrier Sep 13 01:12:18.099675 containerd[1507]: time="2025-09-13T01:12:18.092424314Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:12:18.099675 containerd[1507]: time="2025-09-13T01:12:18.092554051Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:12:18.099675 containerd[1507]: time="2025-09-13T01:12:18.092580322Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:18.107320 containerd[1507]: time="2025-09-13T01:12:18.104338158Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.705 [INFO][4121] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.774 [INFO][4121] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0 calico-kube-controllers-5b7745757f- calico-system 1e3277dd-9a07-4afe-ac81-d3afe9a4aa46 895 0 2025-09-13 01:11:48 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5b7745757f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-5asmg.gb1.brightbox.com calico-kube-controllers-5b7745757f-zhjt6 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1624d0c7b70 [] [] }} ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Namespace="calico-system" Pod="calico-kube-controllers-5b7745757f-zhjt6" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.775 [INFO][4121] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Namespace="calico-system" Pod="calico-kube-controllers-5b7745757f-zhjt6" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.931 [INFO][4164] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" 
HandleID="k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.932 [INFO][4164] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" HandleID="k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00035bf00), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-5asmg.gb1.brightbox.com", "pod":"calico-kube-controllers-5b7745757f-zhjt6", "timestamp":"2025-09-13 01:12:17.931087609 +0000 UTC"}, Hostname:"srv-5asmg.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.932 [INFO][4164] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.932 [INFO][4164] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.932 [INFO][4164] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5asmg.gb1.brightbox.com' Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:17.963 [INFO][4164] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.009 [INFO][4164] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.037 [INFO][4164] ipam/ipam.go 511: Trying affinity for 192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.041 [INFO][4164] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.046 [INFO][4164] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.046 [INFO][4164] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.192/26 handle="k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.049 [INFO][4164] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1 Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.057 [INFO][4164] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.192/26 handle="k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.070 [INFO][4164] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.194/26] block=192.168.21.192/26 handle="k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.070 [INFO][4164] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.194/26] handle="k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.071 [INFO][4164] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:18.140687 containerd[1507]: 2025-09-13 01:12:18.071 [INFO][4164] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.194/26] IPv6=[] ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" HandleID="k8s-pod-network.cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:18.142134 containerd[1507]: 2025-09-13 01:12:18.078 [INFO][4121] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Namespace="calico-system" Pod="calico-kube-controllers-5b7745757f-zhjt6" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0", GenerateName:"calico-kube-controllers-5b7745757f-", Namespace:"calico-system", SelfLink:"", UID:"1e3277dd-9a07-4afe-ac81-d3afe9a4aa46", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5b7745757f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-5b7745757f-zhjt6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1624d0c7b70", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:18.142134 containerd[1507]: 2025-09-13 01:12:18.079 [INFO][4121] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.194/32] ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Namespace="calico-system" Pod="calico-kube-controllers-5b7745757f-zhjt6" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:18.142134 containerd[1507]: 2025-09-13 01:12:18.079 [INFO][4121] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1624d0c7b70 ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Namespace="calico-system" Pod="calico-kube-controllers-5b7745757f-zhjt6" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:18.142134 containerd[1507]: 2025-09-13 01:12:18.093 [INFO][4121] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Namespace="calico-system" Pod="calico-kube-controllers-5b7745757f-zhjt6" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:18.142134 containerd[1507]: 2025-09-13 01:12:18.093 [INFO][4121] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Namespace="calico-system" Pod="calico-kube-controllers-5b7745757f-zhjt6" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0", GenerateName:"calico-kube-controllers-5b7745757f-", Namespace:"calico-system", SelfLink:"", UID:"1e3277dd-9a07-4afe-ac81-d3afe9a4aa46", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5b7745757f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1", Pod:"calico-kube-controllers-5b7745757f-zhjt6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.194/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1624d0c7b70", MAC:"ea:1a:ef:06:6a:64", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:18.142134 containerd[1507]: 2025-09-13 01:12:18.133 [INFO][4121] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1" Namespace="calico-system" Pod="calico-kube-controllers-5b7745757f-zhjt6" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:18.175059 systemd[1]: Started cri-containerd-5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9.scope - libcontainer container 5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9. Sep 13 01:12:18.207830 systemd-networkd[1411]: cali2ec2207f398: Link UP Sep 13 01:12:18.209579 systemd-networkd[1411]: cali2ec2207f398: Gained carrier Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:17.720 [INFO][4129] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:17.791 [INFO][4129] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0 goldmane-54d579b49d- calico-system ae902541-3045-4fda-8e34-8995114228b4 894 0 2025-09-13 01:11:47 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-5asmg.gb1.brightbox.com goldmane-54d579b49d-9s577 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali2ec2207f398 [] [] }} 
ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Namespace="calico-system" Pod="goldmane-54d579b49d-9s577" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:17.791 [INFO][4129] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Namespace="calico-system" Pod="goldmane-54d579b49d-9s577" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:17.932 [INFO][4162] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" HandleID="k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:17.932 [INFO][4162] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" HandleID="k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5d80), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-5asmg.gb1.brightbox.com", "pod":"goldmane-54d579b49d-9s577", "timestamp":"2025-09-13 01:12:17.929944336 +0000 UTC"}, Hostname:"srv-5asmg.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:17.933 [INFO][4162] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.071 [INFO][4162] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.072 [INFO][4162] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5asmg.gb1.brightbox.com' Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.100 [INFO][4162] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.122 [INFO][4162] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.142 [INFO][4162] ipam/ipam.go 511: Trying affinity for 192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.149 [INFO][4162] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.156 [INFO][4162] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.156 [INFO][4162] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.192/26 handle="k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.162 [INFO][4162] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.173 [INFO][4162] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.192/26 
handle="k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.187 [INFO][4162] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.195/26] block=192.168.21.192/26 handle="k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.188 [INFO][4162] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.195/26] handle="k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.189 [INFO][4162] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:18.243463 containerd[1507]: 2025-09-13 01:12:18.190 [INFO][4162] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.195/26] IPv6=[] ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" HandleID="k8s-pod-network.41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:18.244698 containerd[1507]: 2025-09-13 01:12:18.197 [INFO][4129] cni-plugin/k8s.go 418: Populated endpoint ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Namespace="calico-system" Pod="goldmane-54d579b49d-9s577" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"ae902541-3045-4fda-8e34-8995114228b4", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2025, 
time.September, 13, 1, 11, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-54d579b49d-9s577", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.21.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2ec2207f398", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:18.244698 containerd[1507]: 2025-09-13 01:12:18.199 [INFO][4129] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.195/32] ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Namespace="calico-system" Pod="goldmane-54d579b49d-9s577" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:18.244698 containerd[1507]: 2025-09-13 01:12:18.199 [INFO][4129] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2ec2207f398 ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Namespace="calico-system" Pod="goldmane-54d579b49d-9s577" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:18.244698 containerd[1507]: 2025-09-13 01:12:18.209 [INFO][4129] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" 
Namespace="calico-system" Pod="goldmane-54d579b49d-9s577" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:18.244698 containerd[1507]: 2025-09-13 01:12:18.212 [INFO][4129] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Namespace="calico-system" Pod="goldmane-54d579b49d-9s577" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"ae902541-3045-4fda-8e34-8995114228b4", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf", Pod:"goldmane-54d579b49d-9s577", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.21.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2ec2207f398", MAC:"f2:35:98:72:b3:db", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:18.244698 containerd[1507]: 2025-09-13 01:12:18.237 [INFO][4129] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf" Namespace="calico-system" Pod="goldmane-54d579b49d-9s577" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:18.253043 containerd[1507]: time="2025-09-13T01:12:18.248944387Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:12:18.253043 containerd[1507]: time="2025-09-13T01:12:18.249079727Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:12:18.253043 containerd[1507]: time="2025-09-13T01:12:18.249600392Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:18.253043 containerd[1507]: time="2025-09-13T01:12:18.250709328Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:18.276634 systemd[1]: Removed slice kubepods-besteffort-pod9962c42a_298d_44e3_93d7_b667b8f90fa1.slice - libcontainer container kubepods-besteffort-pod9962c42a_298d_44e3_93d7_b667b8f90fa1.slice. Sep 13 01:12:18.316672 systemd[1]: Started cri-containerd-cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1.scope - libcontainer container cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1. Sep 13 01:12:18.337022 systemd[1]: run-netns-cni\x2db5036c23\x2d54ba\x2d8844\x2d2053\x2dae383c4de681.mount: Deactivated successfully. Sep 13 01:12:18.338528 systemd[1]: run-netns-cni\x2d48867381\x2dec9d\x2d2fff\x2dcca3\x2da75874b1a61c.mount: Deactivated successfully. 
Sep 13 01:12:18.338646 systemd[1]: run-netns-cni\x2d457c936d\x2df49a\x2d4bfc\x2de6d0\x2dd2d14dc4a382.mount: Deactivated successfully. Sep 13 01:12:18.338777 systemd[1]: var-lib-kubelet-pods-9962c42a\x2d298d\x2d44e3\x2d93d7\x2db667b8f90fa1-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dgkwvd.mount: Deactivated successfully. Sep 13 01:12:18.338909 systemd[1]: var-lib-kubelet-pods-9962c42a\x2d298d\x2d44e3\x2d93d7\x2db667b8f90fa1-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 13 01:12:18.374913 containerd[1507]: time="2025-09-13T01:12:18.373062245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8vq7t,Uid:2c92e0f8-426f-428f-9601-3c255a79b3c3,Namespace:calico-system,Attempt:1,} returns sandbox id \"5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9\"" Sep 13 01:12:18.385351 containerd[1507]: time="2025-09-13T01:12:18.385174822Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 13 01:12:18.407141 containerd[1507]: time="2025-09-13T01:12:18.405943098Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:12:18.407141 containerd[1507]: time="2025-09-13T01:12:18.406076088Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:12:18.407141 containerd[1507]: time="2025-09-13T01:12:18.406124551Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:18.407617 containerd[1507]: time="2025-09-13T01:12:18.406309985Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:18.468605 systemd[1]: Started cri-containerd-41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf.scope - libcontainer container 41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf. Sep 13 01:12:18.501535 systemd[1]: Created slice kubepods-besteffort-podad7401ad_0d3b_4d69_bb9a_b9f33aa4b43a.slice - libcontainer container kubepods-besteffort-podad7401ad_0d3b_4d69_bb9a_b9f33aa4b43a.slice. Sep 13 01:12:18.608754 kubelet[2705]: I0913 01:12:18.608681 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xssrc\" (UniqueName: \"kubernetes.io/projected/ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a-kube-api-access-xssrc\") pod \"whisker-7b856d4fbc-9lhhj\" (UID: \"ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a\") " pod="calico-system/whisker-7b856d4fbc-9lhhj" Sep 13 01:12:18.610918 kubelet[2705]: I0913 01:12:18.609718 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a-whisker-backend-key-pair\") pod \"whisker-7b856d4fbc-9lhhj\" (UID: \"ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a\") " pod="calico-system/whisker-7b856d4fbc-9lhhj" Sep 13 01:12:18.610918 kubelet[2705]: I0913 01:12:18.609800 2705 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a-whisker-ca-bundle\") pod \"whisker-7b856d4fbc-9lhhj\" (UID: \"ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a\") " pod="calico-system/whisker-7b856d4fbc-9lhhj" Sep 13 01:12:18.643589 containerd[1507]: time="2025-09-13T01:12:18.643342792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b7745757f-zhjt6,Uid:1e3277dd-9a07-4afe-ac81-d3afe9a4aa46,Namespace:calico-system,Attempt:1,} returns sandbox id 
\"cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1\"" Sep 13 01:12:18.649501 containerd[1507]: time="2025-09-13T01:12:18.648966577Z" level=info msg="StopPodSandbox for \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\"" Sep 13 01:12:18.655983 containerd[1507]: time="2025-09-13T01:12:18.655929542Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-9s577,Uid:ae902541-3045-4fda-8e34-8995114228b4,Namespace:calico-system,Attempt:1,} returns sandbox id \"41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf\"" Sep 13 01:12:18.667487 kubelet[2705]: I0913 01:12:18.667411 2705 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9962c42a-298d-44e3-93d7-b667b8f90fa1" path="/var/lib/kubelet/pods/9962c42a-298d-44e3-93d7-b667b8f90fa1/volumes" Sep 13 01:12:18.810927 containerd[1507]: time="2025-09-13T01:12:18.809272039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b856d4fbc-9lhhj,Uid:ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a,Namespace:calico-system,Attempt:0,}" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.754 [INFO][4335] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.754 [INFO][4335] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" iface="eth0" netns="/var/run/netns/cni-9f2474b0-9823-9fb4-1558-3ca7292e5068" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.756 [INFO][4335] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" iface="eth0" netns="/var/run/netns/cni-9f2474b0-9823-9fb4-1558-3ca7292e5068" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.762 [INFO][4335] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" iface="eth0" netns="/var/run/netns/cni-9f2474b0-9823-9fb4-1558-3ca7292e5068" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.762 [INFO][4335] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.762 [INFO][4335] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.799 [INFO][4344] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.799 [INFO][4344] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.799 [INFO][4344] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.809 [WARNING][4344] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.809 [INFO][4344] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.812 [INFO][4344] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:18.818239 containerd[1507]: 2025-09-13 01:12:18.815 [INFO][4335] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:18.818239 containerd[1507]: time="2025-09-13T01:12:18.818197732Z" level=info msg="TearDown network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\" successfully" Sep 13 01:12:18.818239 containerd[1507]: time="2025-09-13T01:12:18.818237339Z" level=info msg="StopPodSandbox for \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\" returns successfully" Sep 13 01:12:18.820382 containerd[1507]: time="2025-09-13T01:12:18.819810725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658bd7dd7b-zf7gk,Uid:e62adc23-d970-45f8-9359-c39be02aa620,Namespace:calico-apiserver,Attempt:1,}" Sep 13 01:12:19.124673 systemd-networkd[1411]: cali3ce79871c59: Link UP Sep 13 01:12:19.126805 systemd-networkd[1411]: cali3ce79871c59: Gained carrier Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:18.877 [INFO][4350] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 01:12:19.149388 
containerd[1507]: 2025-09-13 01:12:18.904 [INFO][4350] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0 whisker-7b856d4fbc- calico-system ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a 934 0 2025-09-13 01:12:18 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7b856d4fbc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-5asmg.gb1.brightbox.com whisker-7b856d4fbc-9lhhj eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3ce79871c59 [] [] }} ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Namespace="calico-system" Pod="whisker-7b856d4fbc-9lhhj" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:18.904 [INFO][4350] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Namespace="calico-system" Pod="whisker-7b856d4fbc-9lhhj" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.028 [INFO][4374] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" HandleID="k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.028 [INFO][4374] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" HandleID="k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" 
Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00063c670), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-5asmg.gb1.brightbox.com", "pod":"whisker-7b856d4fbc-9lhhj", "timestamp":"2025-09-13 01:12:19.027164064 +0000 UTC"}, Hostname:"srv-5asmg.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.028 [INFO][4374] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.028 [INFO][4374] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.028 [INFO][4374] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5asmg.gb1.brightbox.com' Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.054 [INFO][4374] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.066 [INFO][4374] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.076 [INFO][4374] ipam/ipam.go 511: Trying affinity for 192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.081 [INFO][4374] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.086 [INFO][4374] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.192/26 
host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.086 [INFO][4374] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.192/26 handle="k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.090 [INFO][4374] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1 Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.097 [INFO][4374] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.192/26 handle="k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.106 [INFO][4374] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.196/26] block=192.168.21.192/26 handle="k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.107 [INFO][4374] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.196/26] handle="k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.107 [INFO][4374] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 13 01:12:19.149388 containerd[1507]: 2025-09-13 01:12:19.107 [INFO][4374] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.196/26] IPv6=[] ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" HandleID="k8s-pod-network.7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" Sep 13 01:12:19.156652 containerd[1507]: 2025-09-13 01:12:19.114 [INFO][4350] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Namespace="calico-system" Pod="whisker-7b856d4fbc-9lhhj" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0", GenerateName:"whisker-7b856d4fbc-", Namespace:"calico-system", SelfLink:"", UID:"ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 12, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b856d4fbc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"", Pod:"whisker-7b856d4fbc-9lhhj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.21.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali3ce79871c59", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:19.156652 containerd[1507]: 2025-09-13 01:12:19.114 [INFO][4350] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.196/32] ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Namespace="calico-system" Pod="whisker-7b856d4fbc-9lhhj" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" Sep 13 01:12:19.156652 containerd[1507]: 2025-09-13 01:12:19.115 [INFO][4350] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3ce79871c59 ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Namespace="calico-system" Pod="whisker-7b856d4fbc-9lhhj" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" Sep 13 01:12:19.156652 containerd[1507]: 2025-09-13 01:12:19.128 [INFO][4350] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Namespace="calico-system" Pod="whisker-7b856d4fbc-9lhhj" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" Sep 13 01:12:19.156652 containerd[1507]: 2025-09-13 01:12:19.129 [INFO][4350] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Namespace="calico-system" Pod="whisker-7b856d4fbc-9lhhj" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0", GenerateName:"whisker-7b856d4fbc-", Namespace:"calico-system", SelfLink:"", 
UID:"ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 12, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b856d4fbc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1", Pod:"whisker-7b856d4fbc-9lhhj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.21.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3ce79871c59", MAC:"22:02:f8:42:73:1d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:19.156652 containerd[1507]: 2025-09-13 01:12:19.143 [INFO][4350] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1" Namespace="calico-system" Pod="whisker-7b856d4fbc-9lhhj" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--7b856d4fbc--9lhhj-eth0" Sep 13 01:12:19.303287 containerd[1507]: time="2025-09-13T01:12:19.300477263Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:12:19.308844 containerd[1507]: time="2025-09-13T01:12:19.305884020Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:12:19.308844 containerd[1507]: time="2025-09-13T01:12:19.305925051Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:19.308844 containerd[1507]: time="2025-09-13T01:12:19.306062842Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:19.309704 systemd-networkd[1411]: caliaebdfb3c6c9: Link UP Sep 13 01:12:19.313991 systemd-networkd[1411]: caliaebdfb3c6c9: Gained carrier Sep 13 01:12:19.338800 systemd[1]: run-netns-cni\x2d9f2474b0\x2d9823\x2d9fb4\x2d1558\x2d3ca7292e5068.mount: Deactivated successfully. Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:18.906 [INFO][4360] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:18.946 [INFO][4360] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0 calico-apiserver-658bd7dd7b- calico-apiserver e62adc23-d970-45f8-9359-c39be02aa620 940 0 2025-09-13 01:11:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:658bd7dd7b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-5asmg.gb1.brightbox.com calico-apiserver-658bd7dd7b-zf7gk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliaebdfb3c6c9 [] [] }} ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-zf7gk" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-" Sep 13 01:12:19.374568 containerd[1507]: 
2025-09-13 01:12:18.946 [INFO][4360] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-zf7gk" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.081 [INFO][4394] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" HandleID="k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.082 [INFO][4394] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" HandleID="k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00038c070), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-5asmg.gb1.brightbox.com", "pod":"calico-apiserver-658bd7dd7b-zf7gk", "timestamp":"2025-09-13 01:12:19.081867766 +0000 UTC"}, Hostname:"srv-5asmg.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.082 [INFO][4394] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.107 [INFO][4394] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.107 [INFO][4394] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5asmg.gb1.brightbox.com' Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.152 [INFO][4394] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.192 [INFO][4394] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.210 [INFO][4394] ipam/ipam.go 511: Trying affinity for 192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.217 [INFO][4394] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.228 [INFO][4394] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.230 [INFO][4394] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.192/26 handle="k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.240 [INFO][4394] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.265 [INFO][4394] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.192/26 handle="k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.287 [INFO][4394] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.197/26] block=192.168.21.192/26 handle="k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.287 [INFO][4394] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.197/26] handle="k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.287 [INFO][4394] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:19.374568 containerd[1507]: 2025-09-13 01:12:19.288 [INFO][4394] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.197/26] IPv6=[] ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" HandleID="k8s-pod-network.ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:19.377511 containerd[1507]: 2025-09-13 01:12:19.297 [INFO][4360] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-zf7gk" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0", GenerateName:"calico-apiserver-658bd7dd7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"e62adc23-d970-45f8-9359-c39be02aa620", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658bd7dd7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-658bd7dd7b-zf7gk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaebdfb3c6c9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:19.377511 containerd[1507]: 2025-09-13 01:12:19.297 [INFO][4360] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.197/32] ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-zf7gk" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:19.377511 containerd[1507]: 2025-09-13 01:12:19.297 [INFO][4360] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaebdfb3c6c9 ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-zf7gk" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:19.377511 containerd[1507]: 2025-09-13 01:12:19.318 [INFO][4360] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Namespace="calico-apiserver" 
Pod="calico-apiserver-658bd7dd7b-zf7gk" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:19.377511 containerd[1507]: 2025-09-13 01:12:19.320 [INFO][4360] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-zf7gk" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0", GenerateName:"calico-apiserver-658bd7dd7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"e62adc23-d970-45f8-9359-c39be02aa620", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658bd7dd7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc", Pod:"calico-apiserver-658bd7dd7b-zf7gk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"caliaebdfb3c6c9", MAC:"ce:87:6e:41:93:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:19.377511 containerd[1507]: 2025-09-13 01:12:19.367 [INFO][4360] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-zf7gk" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:19.406671 systemd-networkd[1411]: calib1001d8d787: Gained IPv6LL Sep 13 01:12:19.408067 systemd[1]: Started cri-containerd-7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1.scope - libcontainer container 7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1. Sep 13 01:12:19.469504 systemd-networkd[1411]: cali1624d0c7b70: Gained IPv6LL Sep 13 01:12:19.475386 containerd[1507]: time="2025-09-13T01:12:19.475075279Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:12:19.475386 containerd[1507]: time="2025-09-13T01:12:19.475320006Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:12:19.475846 containerd[1507]: time="2025-09-13T01:12:19.475404106Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:19.477428 containerd[1507]: time="2025-09-13T01:12:19.476340942Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:19.546484 systemd[1]: Started cri-containerd-ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc.scope - libcontainer container ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc. 
Sep 13 01:12:19.641894 containerd[1507]: time="2025-09-13T01:12:19.640343390Z" level=info msg="StopPodSandbox for \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\"" Sep 13 01:12:19.666752 containerd[1507]: time="2025-09-13T01:12:19.665961342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b856d4fbc-9lhhj,Uid:ad7401ad-0d3b-4d69-bb9a-b9f33aa4b43a,Namespace:calico-system,Attempt:0,} returns sandbox id \"7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1\"" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.793 [INFO][4572] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.794 [INFO][4572] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" iface="eth0" netns="/var/run/netns/cni-ca111000-8191-cc59-012d-a66b75e4362d" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.795 [INFO][4572] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" iface="eth0" netns="/var/run/netns/cni-ca111000-8191-cc59-012d-a66b75e4362d" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.795 [INFO][4572] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" iface="eth0" netns="/var/run/netns/cni-ca111000-8191-cc59-012d-a66b75e4362d" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.795 [INFO][4572] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.795 [INFO][4572] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.871 [INFO][4579] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.871 [INFO][4579] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.871 [INFO][4579] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.884 [WARNING][4579] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.885 [INFO][4579] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.888 [INFO][4579] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:19.896778 containerd[1507]: 2025-09-13 01:12:19.893 [INFO][4572] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:19.900538 containerd[1507]: time="2025-09-13T01:12:19.897933261Z" level=info msg="TearDown network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\" successfully" Sep 13 01:12:19.900538 containerd[1507]: time="2025-09-13T01:12:19.897978021Z" level=info msg="StopPodSandbox for \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\" returns successfully" Sep 13 01:12:19.901792 containerd[1507]: time="2025-09-13T01:12:19.901271932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sn2fg,Uid:6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9,Namespace:kube-system,Attempt:1,}" Sep 13 01:12:19.904816 systemd[1]: run-netns-cni\x2dca111000\x2d8191\x2dcc59\x2d012d\x2da66b75e4362d.mount: Deactivated successfully. 
Sep 13 01:12:19.975301 containerd[1507]: time="2025-09-13T01:12:19.975191803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658bd7dd7b-zf7gk,Uid:e62adc23-d970-45f8-9359-c39be02aa620,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc\"" Sep 13 01:12:19.981652 systemd-networkd[1411]: cali2ec2207f398: Gained IPv6LL Sep 13 01:12:20.290824 systemd-networkd[1411]: calid97eac67a13: Link UP Sep 13 01:12:20.295280 systemd-networkd[1411]: calid97eac67a13: Gained carrier Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.042 [INFO][4592] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.077 [INFO][4592] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0 coredns-668d6bf9bc- kube-system 6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9 951 0 2025-09-13 01:11:30 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-5asmg.gb1.brightbox.com coredns-668d6bf9bc-sn2fg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid97eac67a13 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Namespace="kube-system" Pod="coredns-668d6bf9bc-sn2fg" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.077 [INFO][4592] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Namespace="kube-system" Pod="coredns-668d6bf9bc-sn2fg" 
WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.158 [INFO][4605] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" HandleID="k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.158 [INFO][4605] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" HandleID="k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ac230), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-5asmg.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-sn2fg", "timestamp":"2025-09-13 01:12:20.158476597 +0000 UTC"}, Hostname:"srv-5asmg.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.159 [INFO][4605] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.159 [INFO][4605] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.159 [INFO][4605] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5asmg.gb1.brightbox.com' Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.178 [INFO][4605] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.195 [INFO][4605] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.207 [INFO][4605] ipam/ipam.go 511: Trying affinity for 192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.211 [INFO][4605] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.215 [INFO][4605] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.216 [INFO][4605] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.192/26 handle="k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.219 [INFO][4605] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.229 [INFO][4605] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.192/26 handle="k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.263 [INFO][4605] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.198/26] block=192.168.21.192/26 handle="k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.263 [INFO][4605] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.198/26] handle="k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.263 [INFO][4605] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:20.353685 containerd[1507]: 2025-09-13 01:12:20.263 [INFO][4605] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.198/26] IPv6=[] ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" HandleID="k8s-pod-network.928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:20.363000 containerd[1507]: 2025-09-13 01:12:20.278 [INFO][4592] cni-plugin/k8s.go 418: Populated endpoint ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Namespace="kube-system" Pod="coredns-668d6bf9bc-sn2fg" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9", ResourceVersion:"951", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-sn2fg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid97eac67a13", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:20.363000 containerd[1507]: 2025-09-13 01:12:20.279 [INFO][4592] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.198/32] ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Namespace="kube-system" Pod="coredns-668d6bf9bc-sn2fg" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:20.363000 containerd[1507]: 2025-09-13 01:12:20.279 [INFO][4592] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid97eac67a13 ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Namespace="kube-system" Pod="coredns-668d6bf9bc-sn2fg" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:20.363000 containerd[1507]: 
2025-09-13 01:12:20.297 [INFO][4592] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Namespace="kube-system" Pod="coredns-668d6bf9bc-sn2fg" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:20.363000 containerd[1507]: 2025-09-13 01:12:20.300 [INFO][4592] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Namespace="kube-system" Pod="coredns-668d6bf9bc-sn2fg" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9", ResourceVersion:"951", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a", Pod:"coredns-668d6bf9bc-sn2fg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calid97eac67a13", MAC:"b2:2f:94:83:7f:51", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:20.363000 containerd[1507]: 2025-09-13 01:12:20.349 [INFO][4592] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a" Namespace="kube-system" Pod="coredns-668d6bf9bc-sn2fg" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:20.457773 containerd[1507]: time="2025-09-13T01:12:20.454502191Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:12:20.457773 containerd[1507]: time="2025-09-13T01:12:20.454602423Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:12:20.457773 containerd[1507]: time="2025-09-13T01:12:20.454621677Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:20.457773 containerd[1507]: time="2025-09-13T01:12:20.454763044Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:20.522387 systemd[1]: Started cri-containerd-928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a.scope - libcontainer container 928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a. Sep 13 01:12:20.624920 systemd-networkd[1411]: caliaebdfb3c6c9: Gained IPv6LL Sep 13 01:12:20.654794 containerd[1507]: time="2025-09-13T01:12:20.654729545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-sn2fg,Uid:6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9,Namespace:kube-system,Attempt:1,} returns sandbox id \"928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a\"" Sep 13 01:12:20.674142 containerd[1507]: time="2025-09-13T01:12:20.672391991Z" level=info msg="CreateContainer within sandbox \"928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 01:12:20.744481 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4146628764.mount: Deactivated successfully. Sep 13 01:12:20.783069 containerd[1507]: time="2025-09-13T01:12:20.782845801Z" level=info msg="CreateContainer within sandbox \"928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e0163bbe025601bf188c065177cbaad1f7a8ed6aa072240fd23125baa252c1f1\"" Sep 13 01:12:20.793240 containerd[1507]: time="2025-09-13T01:12:20.788573219Z" level=info msg="StartContainer for \"e0163bbe025601bf188c065177cbaad1f7a8ed6aa072240fd23125baa252c1f1\"" Sep 13 01:12:20.863405 systemd[1]: Started cri-containerd-e0163bbe025601bf188c065177cbaad1f7a8ed6aa072240fd23125baa252c1f1.scope - libcontainer container e0163bbe025601bf188c065177cbaad1f7a8ed6aa072240fd23125baa252c1f1. 
Sep 13 01:12:20.878214 systemd-networkd[1411]: cali3ce79871c59: Gained IPv6LL Sep 13 01:12:21.005532 containerd[1507]: time="2025-09-13T01:12:21.005449341Z" level=info msg="StartContainer for \"e0163bbe025601bf188c065177cbaad1f7a8ed6aa072240fd23125baa252c1f1\" returns successfully" Sep 13 01:12:21.154469 containerd[1507]: time="2025-09-13T01:12:21.154245277Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:21.161385 containerd[1507]: time="2025-09-13T01:12:21.160845610Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 13 01:12:21.166449 containerd[1507]: time="2025-09-13T01:12:21.166393285Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:21.174917 containerd[1507]: time="2025-09-13T01:12:21.174868928Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:21.178135 containerd[1507]: time="2025-09-13T01:12:21.177678686Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.791713708s" Sep 13 01:12:21.178733 containerd[1507]: time="2025-09-13T01:12:21.178701826Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 13 01:12:21.182238 containerd[1507]: time="2025-09-13T01:12:21.182166337Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 13 01:12:21.187197 containerd[1507]: time="2025-09-13T01:12:21.186954908Z" level=info msg="CreateContainer within sandbox \"5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 13 01:12:21.211783 containerd[1507]: time="2025-09-13T01:12:21.211638590Z" level=info msg="CreateContainer within sandbox \"5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a97a74dba9fde0c278143646c864ea4b6b26faf2900dfb63b4e0d2f7a6165464\"" Sep 13 01:12:21.213005 containerd[1507]: time="2025-09-13T01:12:21.212963549Z" level=info msg="StartContainer for \"a97a74dba9fde0c278143646c864ea4b6b26faf2900dfb63b4e0d2f7a6165464\"" Sep 13 01:12:21.286731 systemd[1]: Started cri-containerd-a97a74dba9fde0c278143646c864ea4b6b26faf2900dfb63b4e0d2f7a6165464.scope - libcontainer container a97a74dba9fde0c278143646c864ea4b6b26faf2900dfb63b4e0d2f7a6165464. 
Sep 13 01:12:21.425902 containerd[1507]: time="2025-09-13T01:12:21.425731772Z" level=info msg="StartContainer for \"a97a74dba9fde0c278143646c864ea4b6b26faf2900dfb63b4e0d2f7a6165464\" returns successfully" Sep 13 01:12:21.530190 kernel: bpftool[4779]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 13 01:12:21.908196 systemd-networkd[1411]: vxlan.calico: Link UP Sep 13 01:12:21.908226 systemd-networkd[1411]: vxlan.calico: Gained carrier Sep 13 01:12:22.221914 systemd-networkd[1411]: calid97eac67a13: Gained IPv6LL Sep 13 01:12:22.450741 kubelet[2705]: I0913 01:12:22.450590 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-sn2fg" podStartSLOduration=52.45051835 podStartE2EDuration="52.45051835s" podCreationTimestamp="2025-09-13 01:11:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:12:21.44974171 +0000 UTC m=+57.147589938" watchObservedRunningTime="2025-09-13 01:12:22.45051835 +0000 UTC m=+58.148366581" Sep 13 01:12:23.118529 systemd-networkd[1411]: vxlan.calico: Gained IPv6LL Sep 13 01:12:24.619164 containerd[1507]: time="2025-09-13T01:12:24.619030308Z" level=info msg="StopPodSandbox for \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\"" Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.796 [WARNING][4881] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0", GenerateName:"calico-apiserver-658bd7dd7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"e62adc23-d970-45f8-9359-c39be02aa620", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658bd7dd7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc", Pod:"calico-apiserver-658bd7dd7b-zf7gk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaebdfb3c6c9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.796 [INFO][4881] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.796 [INFO][4881] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" iface="eth0" netns="" Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.796 [INFO][4881] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.796 [INFO][4881] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.940 [INFO][4890] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.941 [INFO][4890] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.943 [INFO][4890] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.964 [WARNING][4890] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.964 [INFO][4890] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.968 [INFO][4890] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:24.974525 containerd[1507]: 2025-09-13 01:12:24.971 [INFO][4881] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:24.977191 containerd[1507]: time="2025-09-13T01:12:24.976500432Z" level=info msg="TearDown network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\" successfully" Sep 13 01:12:24.977191 containerd[1507]: time="2025-09-13T01:12:24.976574460Z" level=info msg="StopPodSandbox for \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\" returns successfully" Sep 13 01:12:25.010777 containerd[1507]: time="2025-09-13T01:12:25.010708303Z" level=info msg="RemovePodSandbox for \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\"" Sep 13 01:12:25.016549 containerd[1507]: time="2025-09-13T01:12:25.014896062Z" level=info msg="Forcibly stopping sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\"" Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.121 [WARNING][4904] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0", GenerateName:"calico-apiserver-658bd7dd7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"e62adc23-d970-45f8-9359-c39be02aa620", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658bd7dd7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc", Pod:"calico-apiserver-658bd7dd7b-zf7gk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaebdfb3c6c9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.125 [INFO][4904] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.125 [INFO][4904] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" iface="eth0" netns="" Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.125 [INFO][4904] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.125 [INFO][4904] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.233 [INFO][4911] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.233 [INFO][4911] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.233 [INFO][4911] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.251 [WARNING][4911] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.251 [INFO][4911] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" HandleID="k8s-pod-network.898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--zf7gk-eth0" Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.254 [INFO][4911] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:25.273043 containerd[1507]: 2025-09-13 01:12:25.264 [INFO][4904] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e" Sep 13 01:12:25.284551 containerd[1507]: time="2025-09-13T01:12:25.273710329Z" level=info msg="TearDown network for sandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\" successfully" Sep 13 01:12:25.303286 containerd[1507]: time="2025-09-13T01:12:25.303039001Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:12:25.303286 containerd[1507]: time="2025-09-13T01:12:25.303207828Z" level=info msg="RemovePodSandbox \"898eaca76b419556953c6266e1b0ccecbf0dfaa00a4358abf4f1ab8e1319cf6e\" returns successfully" Sep 13 01:12:25.306679 containerd[1507]: time="2025-09-13T01:12:25.306397824Z" level=info msg="StopPodSandbox for \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\"" Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.403 [WARNING][4926] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"ae902541-3045-4fda-8e34-8995114228b4", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf", Pod:"goldmane-54d579b49d-9s577", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.21.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali2ec2207f398", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.406 [INFO][4926] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.406 [INFO][4926] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" iface="eth0" netns="" Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.406 [INFO][4926] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.406 [INFO][4926] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.497 [INFO][4934] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.498 [INFO][4934] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.498 [INFO][4934] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.514 [WARNING][4934] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.514 [INFO][4934] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.522 [INFO][4934] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:25.531393 containerd[1507]: 2025-09-13 01:12:25.526 [INFO][4926] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:25.531393 containerd[1507]: time="2025-09-13T01:12:25.530892833Z" level=info msg="TearDown network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\" successfully" Sep 13 01:12:25.531393 containerd[1507]: time="2025-09-13T01:12:25.530938325Z" level=info msg="StopPodSandbox for \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\" returns successfully" Sep 13 01:12:25.534002 containerd[1507]: time="2025-09-13T01:12:25.533386120Z" level=info msg="RemovePodSandbox for \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\"" Sep 13 01:12:25.534177 containerd[1507]: time="2025-09-13T01:12:25.534153006Z" level=info msg="Forcibly stopping sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\"" Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.648 [WARNING][4948] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"ae902541-3045-4fda-8e34-8995114228b4", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf", Pod:"goldmane-54d579b49d-9s577", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.21.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2ec2207f398", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.649 [INFO][4948] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.649 [INFO][4948] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" iface="eth0" netns="" Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.649 [INFO][4948] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.649 [INFO][4948] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.697 [INFO][4955] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.698 [INFO][4955] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.698 [INFO][4955] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.717 [WARNING][4955] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.717 [INFO][4955] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" HandleID="k8s-pod-network.3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Workload="srv--5asmg.gb1.brightbox.com-k8s-goldmane--54d579b49d--9s577-eth0" Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.720 [INFO][4955] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:25.725943 containerd[1507]: 2025-09-13 01:12:25.723 [INFO][4948] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218" Sep 13 01:12:25.728938 containerd[1507]: time="2025-09-13T01:12:25.726016269Z" level=info msg="TearDown network for sandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\" successfully" Sep 13 01:12:25.735087 containerd[1507]: time="2025-09-13T01:12:25.735016074Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:12:25.735241 containerd[1507]: time="2025-09-13T01:12:25.735147798Z" level=info msg="RemovePodSandbox \"3b81243de78661a5570a359e68bb27ceea7793d25c51e5230cd91d87566dc218\" returns successfully" Sep 13 01:12:25.736788 containerd[1507]: time="2025-09-13T01:12:25.736755753Z" level=info msg="StopPodSandbox for \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\"" Sep 13 01:12:25.882997 containerd[1507]: time="2025-09-13T01:12:25.882911671Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:25.885446 containerd[1507]: time="2025-09-13T01:12:25.885275447Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 13 01:12:25.892428 containerd[1507]: time="2025-09-13T01:12:25.892240829Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.811 [WARNING][4969] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0", GenerateName:"calico-kube-controllers-5b7745757f-", Namespace:"calico-system", SelfLink:"", UID:"1e3277dd-9a07-4afe-ac81-d3afe9a4aa46", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5b7745757f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1", Pod:"calico-kube-controllers-5b7745757f-zhjt6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1624d0c7b70", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.811 [INFO][4969] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.811 [INFO][4969] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" iface="eth0" netns="" Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.811 [INFO][4969] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.811 [INFO][4969] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.871 [INFO][4977] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.871 [INFO][4977] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.871 [INFO][4977] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.891 [WARNING][4977] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.891 [INFO][4977] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.894 [INFO][4977] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:25.903600 containerd[1507]: 2025-09-13 01:12:25.899 [INFO][4969] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:25.903600 containerd[1507]: time="2025-09-13T01:12:25.903455064Z" level=info msg="TearDown network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\" successfully" Sep 13 01:12:25.903600 containerd[1507]: time="2025-09-13T01:12:25.903491285Z" level=info msg="StopPodSandbox for \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\" returns successfully" Sep 13 01:12:25.905664 containerd[1507]: time="2025-09-13T01:12:25.904512194Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:25.905664 containerd[1507]: time="2025-09-13T01:12:25.904872432Z" level=info msg="RemovePodSandbox for \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\"" Sep 13 01:12:25.905664 containerd[1507]: time="2025-09-13T01:12:25.904912019Z" 
level=info msg="Forcibly stopping sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\"" Sep 13 01:12:25.907256 containerd[1507]: time="2025-09-13T01:12:25.905913904Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.723700244s" Sep 13 01:12:25.907256 containerd[1507]: time="2025-09-13T01:12:25.905970319Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 13 01:12:25.951949 containerd[1507]: time="2025-09-13T01:12:25.951855779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 13 01:12:25.983079 containerd[1507]: time="2025-09-13T01:12:25.982923766Z" level=info msg="CreateContainer within sandbox \"cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 13 01:12:26.044060 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2097888335.mount: Deactivated successfully. 
Sep 13 01:12:26.049735 containerd[1507]: time="2025-09-13T01:12:26.049086113Z" level=info msg="CreateContainer within sandbox \"cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"81c391da96e902cfbc06a6ac82d1f2d7d6228f209a53f939832c00042128aec9\"" Sep 13 01:12:26.051540 containerd[1507]: time="2025-09-13T01:12:26.051505416Z" level=info msg="StartContainer for \"81c391da96e902cfbc06a6ac82d1f2d7d6228f209a53f939832c00042128aec9\"" Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.043 [WARNING][4992] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0", GenerateName:"calico-kube-controllers-5b7745757f-", Namespace:"calico-system", SelfLink:"", UID:"1e3277dd-9a07-4afe-ac81-d3afe9a4aa46", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5b7745757f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"cb78d936722434018da9e55e1a4c0c97e484a6cf357cb7d3e5444ae0d62a17c1", 
Pod:"calico-kube-controllers-5b7745757f-zhjt6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1624d0c7b70", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.044 [INFO][4992] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.044 [INFO][4992] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" iface="eth0" netns="" Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.044 [INFO][4992] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.044 [INFO][4992] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.097 [INFO][5000] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.097 [INFO][5000] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.097 [INFO][5000] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.111 [WARNING][5000] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.112 [INFO][5000] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" HandleID="k8s-pod-network.5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--kube--controllers--5b7745757f--zhjt6-eth0" Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.118 [INFO][5000] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:26.127497 containerd[1507]: 2025-09-13 01:12:26.124 [INFO][4992] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc" Sep 13 01:12:26.128941 containerd[1507]: time="2025-09-13T01:12:26.128283013Z" level=info msg="TearDown network for sandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\" successfully" Sep 13 01:12:26.133726 containerd[1507]: time="2025-09-13T01:12:26.133322711Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:12:26.133726 containerd[1507]: time="2025-09-13T01:12:26.133433068Z" level=info msg="RemovePodSandbox \"5c2da179b8294c92227ff8b4a351a348ac2350c9dd009636214c21488fd4e7bc\" returns successfully" Sep 13 01:12:26.136267 containerd[1507]: time="2025-09-13T01:12:26.134994618Z" level=info msg="StopPodSandbox for \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\"" Sep 13 01:12:26.188747 systemd[1]: Started cri-containerd-81c391da96e902cfbc06a6ac82d1f2d7d6228f209a53f939832c00042128aec9.scope - libcontainer container 81c391da96e902cfbc06a6ac82d1f2d7d6228f209a53f939832c00042128aec9. Sep 13 01:12:26.316905 containerd[1507]: time="2025-09-13T01:12:26.316722145Z" level=info msg="StartContainer for \"81c391da96e902cfbc06a6ac82d1f2d7d6228f209a53f939832c00042128aec9\" returns successfully" Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.222 [WARNING][5023] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9", ResourceVersion:"973", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, 
Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a", Pod:"coredns-668d6bf9bc-sn2fg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid97eac67a13", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.222 [INFO][5023] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.222 [INFO][5023] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" iface="eth0" netns="" Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.222 [INFO][5023] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.222 [INFO][5023] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.266 [INFO][5046] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.267 [INFO][5046] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.267 [INFO][5046] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.284 [WARNING][5046] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.284 [INFO][5046] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.294 [INFO][5046] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:26.321304 containerd[1507]: 2025-09-13 01:12:26.314 [INFO][5023] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:26.323417 containerd[1507]: time="2025-09-13T01:12:26.322454868Z" level=info msg="TearDown network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\" successfully" Sep 13 01:12:26.323417 containerd[1507]: time="2025-09-13T01:12:26.322497977Z" level=info msg="StopPodSandbox for \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\" returns successfully" Sep 13 01:12:26.324139 containerd[1507]: time="2025-09-13T01:12:26.324034479Z" level=info msg="RemovePodSandbox for \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\"" Sep 13 01:12:26.324139 containerd[1507]: time="2025-09-13T01:12:26.324074437Z" level=info msg="Forcibly stopping sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\"" Sep 13 01:12:26.492671 kubelet[2705]: I0913 01:12:26.491031 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5b7745757f-zhjt6" podStartSLOduration=31.237069805 
podStartE2EDuration="38.491005787s" podCreationTimestamp="2025-09-13 01:11:48 +0000 UTC" firstStartedPulling="2025-09-13 01:12:18.655213973 +0000 UTC m=+54.353062193" lastFinishedPulling="2025-09-13 01:12:25.909149942 +0000 UTC m=+61.606998175" observedRunningTime="2025-09-13 01:12:26.484620249 +0000 UTC m=+62.182468488" watchObservedRunningTime="2025-09-13 01:12:26.491005787 +0000 UTC m=+62.188854021" Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.415 [WARNING][5075] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"6300a4f8-ae4b-49f1-a9a0-a5bb3a5cd0d9", ResourceVersion:"973", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"928d5d548cd33d66f67d17020e9bf67c10aba9ace3b79030b8d8b6fd8350771a", Pod:"coredns-668d6bf9bc-sn2fg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid97eac67a13", 
MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.416 [INFO][5075] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.416 [INFO][5075] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" iface="eth0" netns="" Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.416 [INFO][5075] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.416 [INFO][5075] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.536 [INFO][5086] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.539 [INFO][5086] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.539 [INFO][5086] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.572 [WARNING][5086] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.572 [INFO][5086] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" HandleID="k8s-pod-network.69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--sn2fg-eth0" Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.577 [INFO][5086] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:26.584152 containerd[1507]: 2025-09-13 01:12:26.579 [INFO][5075] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8" Sep 13 01:12:26.584152 containerd[1507]: time="2025-09-13T01:12:26.583820547Z" level=info msg="TearDown network for sandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\" successfully" Sep 13 01:12:26.590134 containerd[1507]: time="2025-09-13T01:12:26.589175544Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:12:26.590134 containerd[1507]: time="2025-09-13T01:12:26.589338502Z" level=info msg="RemovePodSandbox \"69da2d110182d5db69b54125ca9d06b92ac24549a7b0a612dd83d4c0721001f8\" returns successfully" Sep 13 01:12:26.592225 containerd[1507]: time="2025-09-13T01:12:26.591016990Z" level=info msg="StopPodSandbox for \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\"" Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.697 [WARNING][5119] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2c92e0f8-426f-428f-9601-3c255a79b3c3", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9", Pod:"csi-node-driver-8vq7t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib1001d8d787", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.697 [INFO][5119] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.697 [INFO][5119] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" iface="eth0" netns="" Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.697 [INFO][5119] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.698 [INFO][5119] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.773 [INFO][5130] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.774 [INFO][5130] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.774 [INFO][5130] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.795 [WARNING][5130] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.795 [INFO][5130] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.801 [INFO][5130] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:26.808973 containerd[1507]: 2025-09-13 01:12:26.805 [INFO][5119] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:26.821238 containerd[1507]: time="2025-09-13T01:12:26.809225466Z" level=info msg="TearDown network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\" successfully" Sep 13 01:12:26.821238 containerd[1507]: time="2025-09-13T01:12:26.809308236Z" level=info msg="StopPodSandbox for \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\" returns successfully" Sep 13 01:12:26.821238 containerd[1507]: time="2025-09-13T01:12:26.812759928Z" level=info msg="RemovePodSandbox for \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\"" Sep 13 01:12:26.821238 containerd[1507]: time="2025-09-13T01:12:26.812980630Z" level=info msg="Forcibly stopping sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\"" Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.917 [WARNING][5144] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2c92e0f8-426f-428f-9601-3c255a79b3c3", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9", Pod:"csi-node-driver-8vq7t", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib1001d8d787", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.920 [INFO][5144] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.920 [INFO][5144] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" iface="eth0" netns="" Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.920 [INFO][5144] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.920 [INFO][5144] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.956 [INFO][5151] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.957 [INFO][5151] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.957 [INFO][5151] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.967 [WARNING][5151] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.967 [INFO][5151] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" HandleID="k8s-pod-network.3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Workload="srv--5asmg.gb1.brightbox.com-k8s-csi--node--driver--8vq7t-eth0" Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.969 [INFO][5151] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:26.973714 containerd[1507]: 2025-09-13 01:12:26.971 [INFO][5144] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08" Sep 13 01:12:26.975931 containerd[1507]: time="2025-09-13T01:12:26.973816462Z" level=info msg="TearDown network for sandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\" successfully" Sep 13 01:12:26.978749 containerd[1507]: time="2025-09-13T01:12:26.978505282Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:12:26.978749 containerd[1507]: time="2025-09-13T01:12:26.978692271Z" level=info msg="RemovePodSandbox \"3bc210c2af83b9414e03c180b7cae81458efbe073b4497410fcbf37c912bac08\" returns successfully" Sep 13 01:12:26.980337 containerd[1507]: time="2025-09-13T01:12:26.980098839Z" level=info msg="StopPodSandbox for \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\"" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.042 [WARNING][5165] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.042 [INFO][5165] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.042 [INFO][5165] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" iface="eth0" netns="" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.042 [INFO][5165] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.042 [INFO][5165] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.076 [INFO][5172] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.077 [INFO][5172] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.077 [INFO][5172] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.090 [WARNING][5172] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.091 [INFO][5172] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.094 [INFO][5172] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:27.100726 containerd[1507]: 2025-09-13 01:12:27.097 [INFO][5165] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:27.100726 containerd[1507]: time="2025-09-13T01:12:27.100859720Z" level=info msg="TearDown network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\" successfully" Sep 13 01:12:27.100726 containerd[1507]: time="2025-09-13T01:12:27.100899537Z" level=info msg="StopPodSandbox for \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\" returns successfully" Sep 13 01:12:27.103007 containerd[1507]: time="2025-09-13T01:12:27.101899814Z" level=info msg="RemovePodSandbox for \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\"" Sep 13 01:12:27.103007 containerd[1507]: time="2025-09-13T01:12:27.101963526Z" level=info msg="Forcibly stopping sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\"" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.182 [WARNING][5187] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.183 [INFO][5187] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.183 [INFO][5187] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" iface="eth0" netns="" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.183 [INFO][5187] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.183 [INFO][5187] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.215 [INFO][5195] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.215 [INFO][5195] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.216 [INFO][5195] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.225 [WARNING][5195] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.225 [INFO][5195] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" HandleID="k8s-pod-network.c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Workload="srv--5asmg.gb1.brightbox.com-k8s-whisker--84fd46496--c4kbx-eth0" Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.227 [INFO][5195] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:27.232184 containerd[1507]: 2025-09-13 01:12:27.230 [INFO][5187] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f" Sep 13 01:12:27.232928 containerd[1507]: time="2025-09-13T01:12:27.232259740Z" level=info msg="TearDown network for sandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\" successfully" Sep 13 01:12:27.236615 containerd[1507]: time="2025-09-13T01:12:27.236558388Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:12:27.236857 containerd[1507]: time="2025-09-13T01:12:27.236649567Z" level=info msg="RemovePodSandbox \"c38959737abc190e7b942ddd46c33bc4f6e28fcd172e0b762e8cfc70f4a1942f\" returns successfully" Sep 13 01:12:27.639777 containerd[1507]: time="2025-09-13T01:12:27.639475586Z" level=info msg="StopPodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\"" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.755 [INFO][5219] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.756 [INFO][5219] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" iface="eth0" netns="/var/run/netns/cni-0053fc60-4d8c-4176-3735-141d1820e243" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.757 [INFO][5219] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" iface="eth0" netns="/var/run/netns/cni-0053fc60-4d8c-4176-3735-141d1820e243" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.758 [INFO][5219] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" iface="eth0" netns="/var/run/netns/cni-0053fc60-4d8c-4176-3735-141d1820e243" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.758 [INFO][5219] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.758 [INFO][5219] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.824 [INFO][5226] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.825 [INFO][5226] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.825 [INFO][5226] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.845 [WARNING][5226] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.845 [INFO][5226] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.850 [INFO][5226] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:27.858432 containerd[1507]: 2025-09-13 01:12:27.853 [INFO][5219] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:12:27.864133 containerd[1507]: time="2025-09-13T01:12:27.862296481Z" level=info msg="TearDown network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\" successfully" Sep 13 01:12:27.864133 containerd[1507]: time="2025-09-13T01:12:27.862350410Z" level=info msg="StopPodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\" returns successfully" Sep 13 01:12:27.873786 systemd[1]: run-netns-cni\x2d0053fc60\x2d4d8c\x2d4176\x2d3735\x2d141d1820e243.mount: Deactivated successfully. 
Sep 13 01:12:27.879920 containerd[1507]: time="2025-09-13T01:12:27.879846380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658bd7dd7b-wdc44,Uid:f1634209-c6c5-41be-8e7f-82f83544c3ed,Namespace:calico-apiserver,Attempt:1,}" Sep 13 01:12:28.256740 systemd-networkd[1411]: calie647a09106c: Link UP Sep 13 01:12:28.260870 systemd-networkd[1411]: calie647a09106c: Gained carrier Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.089 [INFO][5232] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0 calico-apiserver-658bd7dd7b- calico-apiserver f1634209-c6c5-41be-8e7f-82f83544c3ed 999 0 2025-09-13 01:11:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:658bd7dd7b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-5asmg.gb1.brightbox.com calico-apiserver-658bd7dd7b-wdc44 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie647a09106c [] [] }} ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-wdc44" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.089 [INFO][5232] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-wdc44" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.161 [INFO][5245] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" HandleID="k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.161 [INFO][5245] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" HandleID="k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f980), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-5asmg.gb1.brightbox.com", "pod":"calico-apiserver-658bd7dd7b-wdc44", "timestamp":"2025-09-13 01:12:28.161380855 +0000 UTC"}, Hostname:"srv-5asmg.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.161 [INFO][5245] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.161 [INFO][5245] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.161 [INFO][5245] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5asmg.gb1.brightbox.com' Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.177 [INFO][5245] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.186 [INFO][5245] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.199 [INFO][5245] ipam/ipam.go 511: Trying affinity for 192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.202 [INFO][5245] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.208 [INFO][5245] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.209 [INFO][5245] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.192/26 handle="k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.212 [INFO][5245] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.222 [INFO][5245] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.192/26 handle="k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.235 [INFO][5245] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.199/26] block=192.168.21.192/26 handle="k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.235 [INFO][5245] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.199/26] handle="k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.235 [INFO][5245] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:28.351368 containerd[1507]: 2025-09-13 01:12:28.235 [INFO][5245] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.199/26] IPv6=[] ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" HandleID="k8s-pod-network.3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:28.364893 containerd[1507]: 2025-09-13 01:12:28.242 [INFO][5232] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-wdc44" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0", GenerateName:"calico-apiserver-658bd7dd7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"f1634209-c6c5-41be-8e7f-82f83544c3ed", ResourceVersion:"999", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658bd7dd7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-658bd7dd7b-wdc44", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie647a09106c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:28.364893 containerd[1507]: 2025-09-13 01:12:28.242 [INFO][5232] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.199/32] ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-wdc44" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:28.364893 containerd[1507]: 2025-09-13 01:12:28.243 [INFO][5232] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie647a09106c ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-wdc44" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:28.364893 containerd[1507]: 2025-09-13 01:12:28.261 [INFO][5232] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Namespace="calico-apiserver" 
Pod="calico-apiserver-658bd7dd7b-wdc44" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:28.364893 containerd[1507]: 2025-09-13 01:12:28.262 [INFO][5232] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-wdc44" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0", GenerateName:"calico-apiserver-658bd7dd7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"f1634209-c6c5-41be-8e7f-82f83544c3ed", ResourceVersion:"999", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658bd7dd7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f", Pod:"calico-apiserver-658bd7dd7b-wdc44", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calie647a09106c", MAC:"1e:4d:b2:df:4e:eb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:28.364893 containerd[1507]: 2025-09-13 01:12:28.345 [INFO][5232] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f" Namespace="calico-apiserver" Pod="calico-apiserver-658bd7dd7b-wdc44" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:12:28.414218 containerd[1507]: time="2025-09-13T01:12:28.413411503Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:12:28.414218 containerd[1507]: time="2025-09-13T01:12:28.413504019Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:12:28.414218 containerd[1507]: time="2025-09-13T01:12:28.413528610Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:28.414218 containerd[1507]: time="2025-09-13T01:12:28.413673985Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:28.479370 systemd[1]: Started cri-containerd-3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f.scope - libcontainer container 3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f. 
Sep 13 01:12:28.584940 containerd[1507]: time="2025-09-13T01:12:28.584360667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658bd7dd7b-wdc44,Uid:f1634209-c6c5-41be-8e7f-82f83544c3ed,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f\"" Sep 13 01:12:29.641232 containerd[1507]: time="2025-09-13T01:12:29.640479794Z" level=info msg="StopPodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\"" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.852 [INFO][5318] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.854 [INFO][5318] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" iface="eth0" netns="/var/run/netns/cni-baa81b09-9eab-7ae8-4ac1-6566efb6e166" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.854 [INFO][5318] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" iface="eth0" netns="/var/run/netns/cni-baa81b09-9eab-7ae8-4ac1-6566efb6e166" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.855 [INFO][5318] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" iface="eth0" netns="/var/run/netns/cni-baa81b09-9eab-7ae8-4ac1-6566efb6e166" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.856 [INFO][5318] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.856 [INFO][5318] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.908 [INFO][5325] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.910 [INFO][5325] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.910 [INFO][5325] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.927 [WARNING][5325] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.927 [INFO][5325] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.932 [INFO][5325] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:29.949268 containerd[1507]: 2025-09-13 01:12:29.942 [INFO][5318] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:12:29.949268 containerd[1507]: time="2025-09-13T01:12:29.947805402Z" level=info msg="TearDown network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\" successfully" Sep 13 01:12:29.949268 containerd[1507]: time="2025-09-13T01:12:29.947847669Z" level=info msg="StopPodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\" returns successfully" Sep 13 01:12:29.954932 containerd[1507]: time="2025-09-13T01:12:29.950227685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kdwhw,Uid:a73f9354-94fe-434f-94c6-f203f326e804,Namespace:kube-system,Attempt:1,}" Sep 13 01:12:29.954972 systemd[1]: run-netns-cni\x2dbaa81b09\x2d9eab\x2d7ae8\x2d4ac1\x2d6566efb6e166.mount: Deactivated successfully. 
Sep 13 01:12:30.030547 systemd-networkd[1411]: calie647a09106c: Gained IPv6LL Sep 13 01:12:30.404282 systemd-networkd[1411]: calib2cfcfb65cc: Link UP Sep 13 01:12:30.406826 systemd-networkd[1411]: calib2cfcfb65cc: Gained carrier Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.116 [INFO][5335] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0 coredns-668d6bf9bc- kube-system a73f9354-94fe-434f-94c6-f203f326e804 1008 0 2025-09-13 01:11:30 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-5asmg.gb1.brightbox.com coredns-668d6bf9bc-kdwhw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib2cfcfb65cc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Namespace="kube-system" Pod="coredns-668d6bf9bc-kdwhw" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.117 [INFO][5335] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Namespace="kube-system" Pod="coredns-668d6bf9bc-kdwhw" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.207 [INFO][5346] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" HandleID="k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.208 
[INFO][5346] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" HandleID="k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cb7e0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-5asmg.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-kdwhw", "timestamp":"2025-09-13 01:12:30.207958826 +0000 UTC"}, Hostname:"srv-5asmg.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.211 [INFO][5346] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.211 [INFO][5346] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.211 [INFO][5346] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-5asmg.gb1.brightbox.com' Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.262 [INFO][5346] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.286 [INFO][5346] ipam/ipam.go 394: Looking up existing affinities for host host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.334 [INFO][5346] ipam/ipam.go 511: Trying affinity for 192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.340 [INFO][5346] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.345 [INFO][5346] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.192/26 host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.345 [INFO][5346] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.192/26 handle="k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.350 [INFO][5346] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747 Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.365 [INFO][5346] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.192/26 handle="k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.382 [INFO][5346] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.200/26] block=192.168.21.192/26 handle="k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.382 [INFO][5346] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.200/26] handle="k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" host="srv-5asmg.gb1.brightbox.com" Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.382 [INFO][5346] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:12:30.446824 containerd[1507]: 2025-09-13 01:12:30.382 [INFO][5346] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.200/26] IPv6=[] ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" HandleID="k8s-pod-network.d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:30.449002 containerd[1507]: 2025-09-13 01:12:30.390 [INFO][5335] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Namespace="kube-system" Pod="coredns-668d6bf9bc-kdwhw" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a73f9354-94fe-434f-94c6-f203f326e804", ResourceVersion:"1008", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-kdwhw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2cfcfb65cc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:30.449002 containerd[1507]: 2025-09-13 01:12:30.394 [INFO][5335] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.200/32] ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Namespace="kube-system" Pod="coredns-668d6bf9bc-kdwhw" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:30.449002 containerd[1507]: 2025-09-13 01:12:30.394 [INFO][5335] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib2cfcfb65cc ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Namespace="kube-system" Pod="coredns-668d6bf9bc-kdwhw" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:30.449002 containerd[1507]: 
2025-09-13 01:12:30.410 [INFO][5335] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Namespace="kube-system" Pod="coredns-668d6bf9bc-kdwhw" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:30.449002 containerd[1507]: 2025-09-13 01:12:30.411 [INFO][5335] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Namespace="kube-system" Pod="coredns-668d6bf9bc-kdwhw" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a73f9354-94fe-434f-94c6-f203f326e804", ResourceVersion:"1008", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747", Pod:"coredns-668d6bf9bc-kdwhw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calib2cfcfb65cc", MAC:"16:af:d4:b8:a5:4e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:12:30.449002 containerd[1507]: 2025-09-13 01:12:30.438 [INFO][5335] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747" Namespace="kube-system" Pod="coredns-668d6bf9bc-kdwhw" WorkloadEndpoint="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:12:30.520424 containerd[1507]: time="2025-09-13T01:12:30.519830040Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:12:30.520424 containerd[1507]: time="2025-09-13T01:12:30.519949495Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:12:30.520424 containerd[1507]: time="2025-09-13T01:12:30.519973586Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:30.520424 containerd[1507]: time="2025-09-13T01:12:30.520149025Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:12:30.606564 systemd[1]: Started cri-containerd-d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747.scope - libcontainer container d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747. Sep 13 01:12:30.708037 containerd[1507]: time="2025-09-13T01:12:30.707098902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kdwhw,Uid:a73f9354-94fe-434f-94c6-f203f326e804,Namespace:kube-system,Attempt:1,} returns sandbox id \"d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747\"" Sep 13 01:12:30.726313 containerd[1507]: time="2025-09-13T01:12:30.725868020Z" level=info msg="CreateContainer within sandbox \"d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 01:12:30.776691 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3185606922.mount: Deactivated successfully. Sep 13 01:12:30.842373 containerd[1507]: time="2025-09-13T01:12:30.842089359Z" level=info msg="CreateContainer within sandbox \"d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"83c4c25a9abf3e81c02116b806345029be4e2a1236111453ff2c41f3f95dbe26\"" Sep 13 01:12:30.845570 containerd[1507]: time="2025-09-13T01:12:30.844180274Z" level=info msg="StartContainer for \"83c4c25a9abf3e81c02116b806345029be4e2a1236111453ff2c41f3f95dbe26\"" Sep 13 01:12:30.915393 systemd[1]: Started cri-containerd-83c4c25a9abf3e81c02116b806345029be4e2a1236111453ff2c41f3f95dbe26.scope - libcontainer container 83c4c25a9abf3e81c02116b806345029be4e2a1236111453ff2c41f3f95dbe26. 
Sep 13 01:12:30.991649 containerd[1507]: time="2025-09-13T01:12:30.991481236Z" level=info msg="StartContainer for \"83c4c25a9abf3e81c02116b806345029be4e2a1236111453ff2c41f3f95dbe26\" returns successfully" Sep 13 01:12:31.534091 kubelet[2705]: I0913 01:12:31.534000 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-kdwhw" podStartSLOduration=61.53397086 podStartE2EDuration="1m1.53397086s" podCreationTimestamp="2025-09-13 01:11:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:12:31.532388246 +0000 UTC m=+67.230236489" watchObservedRunningTime="2025-09-13 01:12:31.53397086 +0000 UTC m=+67.231819094" Sep 13 01:12:31.716266 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1204648316.mount: Deactivated successfully. Sep 13 01:12:32.079180 systemd-networkd[1411]: calib2cfcfb65cc: Gained IPv6LL Sep 13 01:12:32.789995 containerd[1507]: time="2025-09-13T01:12:32.789192759Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 13 01:12:32.833827 containerd[1507]: time="2025-09-13T01:12:32.832885305Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:32.835577 containerd[1507]: time="2025-09-13T01:12:32.835020268Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:32.843911 containerd[1507]: time="2025-09-13T01:12:32.843840459Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:32.845367 containerd[1507]: 
time="2025-09-13T01:12:32.845327306Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 6.893290507s" Sep 13 01:12:32.845537 containerd[1507]: time="2025-09-13T01:12:32.845496921Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 13 01:12:32.855409 containerd[1507]: time="2025-09-13T01:12:32.854194884Z" level=info msg="CreateContainer within sandbox \"41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 13 01:12:32.874796 containerd[1507]: time="2025-09-13T01:12:32.874714340Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 13 01:12:32.889176 containerd[1507]: time="2025-09-13T01:12:32.889086655Z" level=info msg="CreateContainer within sandbox \"41668a4679cdf15237f2d3ec08f21de2059e07ffcdf3fef4fa1efa2faf0f06cf\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"b8a79e41830e3e6fbb29717e5b98e72c6cb4581f81331d4714e794be44e6aafd\"" Sep 13 01:12:32.891125 containerd[1507]: time="2025-09-13T01:12:32.891066062Z" level=info msg="StartContainer for \"b8a79e41830e3e6fbb29717e5b98e72c6cb4581f81331d4714e794be44e6aafd\"" Sep 13 01:12:32.982580 systemd[1]: Started cri-containerd-b8a79e41830e3e6fbb29717e5b98e72c6cb4581f81331d4714e794be44e6aafd.scope - libcontainer container b8a79e41830e3e6fbb29717e5b98e72c6cb4581f81331d4714e794be44e6aafd. 
Sep 13 01:12:33.076479 containerd[1507]: time="2025-09-13T01:12:33.076055075Z" level=info msg="StartContainer for \"b8a79e41830e3e6fbb29717e5b98e72c6cb4581f81331d4714e794be44e6aafd\" returns successfully" Sep 13 01:12:34.497317 containerd[1507]: time="2025-09-13T01:12:34.496501324Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:34.498760 containerd[1507]: time="2025-09-13T01:12:34.498623643Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 13 01:12:34.500690 containerd[1507]: time="2025-09-13T01:12:34.500024033Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:34.505858 containerd[1507]: time="2025-09-13T01:12:34.505819879Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:34.507300 containerd[1507]: time="2025-09-13T01:12:34.507179344Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.632395686s" Sep 13 01:12:34.507300 containerd[1507]: time="2025-09-13T01:12:34.507242343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 13 01:12:34.510856 containerd[1507]: time="2025-09-13T01:12:34.510787076Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 01:12:34.522342 containerd[1507]: time="2025-09-13T01:12:34.522280029Z" level=info msg="CreateContainer within sandbox \"7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 13 01:12:34.566388 containerd[1507]: time="2025-09-13T01:12:34.566312155Z" level=info msg="CreateContainer within sandbox \"7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"3e852ea660f1e9114c8f18983a6ce69156c3498cd54191d3656b98306e0f9787\"" Sep 13 01:12:34.570487 containerd[1507]: time="2025-09-13T01:12:34.570448724Z" level=info msg="StartContainer for \"3e852ea660f1e9114c8f18983a6ce69156c3498cd54191d3656b98306e0f9787\"" Sep 13 01:12:34.591719 systemd[1]: run-containerd-runc-k8s.io-b8a79e41830e3e6fbb29717e5b98e72c6cb4581f81331d4714e794be44e6aafd-runc.05ymNq.mount: Deactivated successfully. Sep 13 01:12:34.666729 systemd[1]: Started cri-containerd-3e852ea660f1e9114c8f18983a6ce69156c3498cd54191d3656b98306e0f9787.scope - libcontainer container 3e852ea660f1e9114c8f18983a6ce69156c3498cd54191d3656b98306e0f9787. 
Sep 13 01:12:34.768688 containerd[1507]: time="2025-09-13T01:12:34.767428249Z" level=info msg="StartContainer for \"3e852ea660f1e9114c8f18983a6ce69156c3498cd54191d3656b98306e0f9787\" returns successfully" Sep 13 01:12:40.689598 containerd[1507]: time="2025-09-13T01:12:40.688975140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:40.690507 containerd[1507]: time="2025-09-13T01:12:40.690313815Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 13 01:12:40.692135 containerd[1507]: time="2025-09-13T01:12:40.692033790Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:40.705015 containerd[1507]: time="2025-09-13T01:12:40.704943917Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:40.706451 containerd[1507]: time="2025-09-13T01:12:40.706385184Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 6.195515s" Sep 13 01:12:40.706546 containerd[1507]: time="2025-09-13T01:12:40.706461324Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 13 01:12:40.739842 containerd[1507]: time="2025-09-13T01:12:40.739518344Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 13 01:12:40.755332 containerd[1507]: time="2025-09-13T01:12:40.755096903Z" level=info msg="CreateContainer within sandbox \"ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 13 01:12:40.785808 containerd[1507]: time="2025-09-13T01:12:40.785674543Z" level=info msg="CreateContainer within sandbox \"ada088ab6c265005090d51a3eb8b548f700cd9436b019195688e0e9531cf8bfc\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8e47ca1e7dddae18201ea4002e86b33ec21bcea3aaa900f8f274dfbce5a8bbcb\"" Sep 13 01:12:40.787719 containerd[1507]: time="2025-09-13T01:12:40.786809915Z" level=info msg="StartContainer for \"8e47ca1e7dddae18201ea4002e86b33ec21bcea3aaa900f8f274dfbce5a8bbcb\"" Sep 13 01:12:40.852624 systemd[1]: Started cri-containerd-8e47ca1e7dddae18201ea4002e86b33ec21bcea3aaa900f8f274dfbce5a8bbcb.scope - libcontainer container 8e47ca1e7dddae18201ea4002e86b33ec21bcea3aaa900f8f274dfbce5a8bbcb. 
Sep 13 01:12:40.932930 containerd[1507]: time="2025-09-13T01:12:40.932874968Z" level=info msg="StartContainer for \"8e47ca1e7dddae18201ea4002e86b33ec21bcea3aaa900f8f274dfbce5a8bbcb\" returns successfully" Sep 13 01:12:41.815887 kubelet[2705]: I0913 01:12:41.815637 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-658bd7dd7b-zf7gk" podStartSLOduration=38.057746176 podStartE2EDuration="58.806186113s" podCreationTimestamp="2025-09-13 01:11:43 +0000 UTC" firstStartedPulling="2025-09-13 01:12:19.980304885 +0000 UTC m=+55.678153112" lastFinishedPulling="2025-09-13 01:12:40.728744775 +0000 UTC m=+76.426593049" observedRunningTime="2025-09-13 01:12:41.802084995 +0000 UTC m=+77.499933241" watchObservedRunningTime="2025-09-13 01:12:41.806186113 +0000 UTC m=+77.504034347" Sep 13 01:12:41.833869 kubelet[2705]: I0913 01:12:41.833356 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-9s577" podStartSLOduration=40.653308779 podStartE2EDuration="54.833324663s" podCreationTimestamp="2025-09-13 01:11:47 +0000 UTC" firstStartedPulling="2025-09-13 01:12:18.66679792 +0000 UTC m=+54.364646145" lastFinishedPulling="2025-09-13 01:12:32.846813798 +0000 UTC m=+68.544662029" observedRunningTime="2025-09-13 01:12:33.534014065 +0000 UTC m=+69.231862305" watchObservedRunningTime="2025-09-13 01:12:41.833324663 +0000 UTC m=+77.531172890" Sep 13 01:12:42.683088 kubelet[2705]: I0913 01:12:42.683024 2705 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 13 01:12:43.940362 containerd[1507]: time="2025-09-13T01:12:43.940256243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:43.943392 containerd[1507]: time="2025-09-13T01:12:43.943334207Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active 
requests=0, bytes read=14698542" Sep 13 01:12:43.944946 containerd[1507]: time="2025-09-13T01:12:43.944898292Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:43.948047 containerd[1507]: time="2025-09-13T01:12:43.947704809Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:43.950040 containerd[1507]: time="2025-09-13T01:12:43.949262120Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 3.209675002s" Sep 13 01:12:43.950040 containerd[1507]: time="2025-09-13T01:12:43.949315931Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 13 01:12:43.969938 containerd[1507]: time="2025-09-13T01:12:43.969461577Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 01:12:43.979034 containerd[1507]: time="2025-09-13T01:12:43.978517101Z" level=info msg="CreateContainer within sandbox \"5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 13 01:12:44.028418 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3043622387.mount: Deactivated successfully. 
Sep 13 01:12:44.033257 containerd[1507]: time="2025-09-13T01:12:44.033207978Z" level=info msg="CreateContainer within sandbox \"5e5228d75a6cb4c9b323cf7acec33873eab461b33ee278461c5da458ef5feab9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7fd415d861d412a88be0285193c8adf5c214b4aea6614c376601862a1ed52b01\"" Sep 13 01:12:44.034653 containerd[1507]: time="2025-09-13T01:12:44.034621138Z" level=info msg="StartContainer for \"7fd415d861d412a88be0285193c8adf5c214b4aea6614c376601862a1ed52b01\"" Sep 13 01:12:44.153832 systemd[1]: Started cri-containerd-7fd415d861d412a88be0285193c8adf5c214b4aea6614c376601862a1ed52b01.scope - libcontainer container 7fd415d861d412a88be0285193c8adf5c214b4aea6614c376601862a1ed52b01. Sep 13 01:12:44.308063 containerd[1507]: time="2025-09-13T01:12:44.305403924Z" level=info msg="StartContainer for \"7fd415d861d412a88be0285193c8adf5c214b4aea6614c376601862a1ed52b01\" returns successfully" Sep 13 01:12:44.441053 containerd[1507]: time="2025-09-13T01:12:44.440337494Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:44.444978 containerd[1507]: time="2025-09-13T01:12:44.444906571Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 13 01:12:44.450273 containerd[1507]: time="2025-09-13T01:12:44.449613841Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 480.087049ms" Sep 13 01:12:44.450273 containerd[1507]: time="2025-09-13T01:12:44.449666240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image 
reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 13 01:12:44.571276 containerd[1507]: time="2025-09-13T01:12:44.570580997Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 13 01:12:44.585376 containerd[1507]: time="2025-09-13T01:12:44.581545622Z" level=info msg="CreateContainer within sandbox \"3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 13 01:12:44.634674 containerd[1507]: time="2025-09-13T01:12:44.634605460Z" level=info msg="CreateContainer within sandbox \"3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fed2ff0518fa512946628ac7d7b6bf7fbe0e8d8651dc67fd941e59e10d332390\"" Sep 13 01:12:44.682634 containerd[1507]: time="2025-09-13T01:12:44.682142158Z" level=info msg="StartContainer for \"fed2ff0518fa512946628ac7d7b6bf7fbe0e8d8651dc67fd941e59e10d332390\"" Sep 13 01:12:44.806328 systemd[1]: Started cri-containerd-fed2ff0518fa512946628ac7d7b6bf7fbe0e8d8651dc67fd941e59e10d332390.scope - libcontainer container fed2ff0518fa512946628ac7d7b6bf7fbe0e8d8651dc67fd941e59e10d332390. 
Sep 13 01:12:45.049238 containerd[1507]: time="2025-09-13T01:12:45.045888407Z" level=info msg="StartContainer for \"fed2ff0518fa512946628ac7d7b6bf7fbe0e8d8651dc67fd941e59e10d332390\" returns successfully" Sep 13 01:12:45.082610 kubelet[2705]: I0913 01:12:45.071305 2705 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 13 01:12:45.099149 kubelet[2705]: I0913 01:12:45.082656 2705 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 13 01:12:45.968638 kubelet[2705]: I0913 01:12:45.921077 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-658bd7dd7b-wdc44" podStartSLOduration=46.949074938 podStartE2EDuration="1m2.91199962s" podCreationTimestamp="2025-09-13 01:11:43 +0000 UTC" firstStartedPulling="2025-09-13 01:12:28.588426498 +0000 UTC m=+64.286274723" lastFinishedPulling="2025-09-13 01:12:44.55135118 +0000 UTC m=+80.249199405" observedRunningTime="2025-09-13 01:12:45.910679318 +0000 UTC m=+81.608527557" watchObservedRunningTime="2025-09-13 01:12:45.91199962 +0000 UTC m=+81.609847851" Sep 13 01:12:45.970286 kubelet[2705]: I0913 01:12:45.969471 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-8vq7t" podStartSLOduration=32.383596429 podStartE2EDuration="57.969444421s" podCreationTimestamp="2025-09-13 01:11:48 +0000 UTC" firstStartedPulling="2025-09-13 01:12:18.382420073 +0000 UTC m=+54.080268299" lastFinishedPulling="2025-09-13 01:12:43.968268058 +0000 UTC m=+79.666116291" observedRunningTime="2025-09-13 01:12:44.891442085 +0000 UTC m=+80.589290369" watchObservedRunningTime="2025-09-13 01:12:45.969444421 +0000 UTC m=+81.667292653" Sep 13 01:12:46.898550 kubelet[2705]: I0913 01:12:46.898496 2705 prober_manager.go:312] "Failed to trigger 
a manual run" probe="Readiness" Sep 13 01:12:49.668078 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2845453481.mount: Deactivated successfully. Sep 13 01:12:49.716460 containerd[1507]: time="2025-09-13T01:12:49.715113308Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:49.718300 containerd[1507]: time="2025-09-13T01:12:49.718095160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 13 01:12:49.719049 containerd[1507]: time="2025-09-13T01:12:49.718981118Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:49.737249 containerd[1507]: time="2025-09-13T01:12:49.736870959Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:12:49.739206 containerd[1507]: time="2025-09-13T01:12:49.738853062Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 5.168213148s" Sep 13 01:12:49.739298 containerd[1507]: time="2025-09-13T01:12:49.739200239Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 13 01:12:49.900705 containerd[1507]: time="2025-09-13T01:12:49.900634967Z" level=info msg="CreateContainer within sandbox 
\"7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 13 01:12:49.994930 containerd[1507]: time="2025-09-13T01:12:49.992208181Z" level=info msg="CreateContainer within sandbox \"7c164182d8cf04bbda1157aafab8b0f557a18c77923cc9f77cfd39bf8a5e2da1\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"a3ed68309f2e292b7bc4b8930590e599b19e6cd5bf037313942e93341ec978e4\"" Sep 13 01:12:49.997163 containerd[1507]: time="2025-09-13T01:12:49.996695468Z" level=info msg="StartContainer for \"a3ed68309f2e292b7bc4b8930590e599b19e6cd5bf037313942e93341ec978e4\"" Sep 13 01:12:50.168349 systemd[1]: Started cri-containerd-a3ed68309f2e292b7bc4b8930590e599b19e6cd5bf037313942e93341ec978e4.scope - libcontainer container a3ed68309f2e292b7bc4b8930590e599b19e6cd5bf037313942e93341ec978e4. Sep 13 01:12:50.480464 containerd[1507]: time="2025-09-13T01:12:50.480304586Z" level=info msg="StartContainer for \"a3ed68309f2e292b7bc4b8930590e599b19e6cd5bf037313942e93341ec978e4\" returns successfully" Sep 13 01:12:51.204549 kubelet[2705]: I0913 01:12:51.202301 2705 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7b856d4fbc-9lhhj" podStartSLOduration=3.106690179 podStartE2EDuration="33.192643038s" podCreationTimestamp="2025-09-13 01:12:18 +0000 UTC" firstStartedPulling="2025-09-13 01:12:19.671020241 +0000 UTC m=+55.368868466" lastFinishedPulling="2025-09-13 01:12:49.756973074 +0000 UTC m=+85.454821325" observedRunningTime="2025-09-13 01:12:51.18018841 +0000 UTC m=+86.878036652" watchObservedRunningTime="2025-09-13 01:12:51.192643038 +0000 UTC m=+86.890491267" Sep 13 01:12:56.263765 systemd[1]: Started sshd@9-10.244.29.26:22-139.178.68.195:47056.service - OpenSSH per-connection server daemon (139.178.68.195:47056). 
Sep 13 01:12:56.872233 kubelet[2705]: I0913 01:12:56.871583 2705 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 13 01:12:57.352701 sshd[5854]: Accepted publickey for core from 139.178.68.195 port 47056 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:12:57.362348 sshd[5854]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:12:57.399863 systemd-logind[1489]: New session 12 of user core. Sep 13 01:12:57.409408 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 13 01:12:58.364871 kubelet[2705]: I0913 01:12:58.364579 2705 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 13 01:12:58.920931 sshd[5854]: pam_unix(sshd:session): session closed for user core Sep 13 01:12:58.930230 systemd-logind[1489]: Session 12 logged out. Waiting for processes to exit. Sep 13 01:12:58.932194 systemd[1]: sshd@9-10.244.29.26:22-139.178.68.195:47056.service: Deactivated successfully. Sep 13 01:12:58.937774 systemd[1]: session-12.scope: Deactivated successfully. Sep 13 01:12:58.940888 systemd-logind[1489]: Removed session 12. Sep 13 01:13:04.100299 systemd[1]: Started sshd@10-10.244.29.26:22-139.178.68.195:45970.service - OpenSSH per-connection server daemon (139.178.68.195:45970). Sep 13 01:13:05.196844 sshd[5899]: Accepted publickey for core from 139.178.68.195 port 45970 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:05.202670 sshd[5899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:05.214626 systemd-logind[1489]: New session 13 of user core. Sep 13 01:13:05.222395 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 13 01:13:06.387863 sshd[5899]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:06.402799 systemd[1]: sshd@10-10.244.29.26:22-139.178.68.195:45970.service: Deactivated successfully. 
Sep 13 01:13:06.406788 systemd[1]: session-13.scope: Deactivated successfully. Sep 13 01:13:06.416246 systemd-logind[1489]: Session 13 logged out. Waiting for processes to exit. Sep 13 01:13:06.420682 systemd-logind[1489]: Removed session 13. Sep 13 01:13:11.544499 systemd[1]: Started sshd@11-10.244.29.26:22-139.178.68.195:54696.service - OpenSSH per-connection server daemon (139.178.68.195:54696). Sep 13 01:13:12.524234 sshd[5940]: Accepted publickey for core from 139.178.68.195 port 54696 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:12.527380 sshd[5940]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:12.537002 systemd-logind[1489]: New session 14 of user core. Sep 13 01:13:12.544360 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 13 01:13:13.346944 sshd[5940]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:13.353794 systemd[1]: sshd@11-10.244.29.26:22-139.178.68.195:54696.service: Deactivated successfully. Sep 13 01:13:13.359437 systemd[1]: session-14.scope: Deactivated successfully. Sep 13 01:13:13.362899 systemd-logind[1489]: Session 14 logged out. Waiting for processes to exit. Sep 13 01:13:13.365167 systemd-logind[1489]: Removed session 14. Sep 13 01:13:13.508583 systemd[1]: Started sshd@12-10.244.29.26:22-139.178.68.195:54704.service - OpenSSH per-connection server daemon (139.178.68.195:54704). Sep 13 01:13:14.466232 sshd[5954]: Accepted publickey for core from 139.178.68.195 port 54704 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:14.469347 sshd[5954]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:14.479803 systemd-logind[1489]: New session 15 of user core. Sep 13 01:13:14.486082 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 13 01:13:15.356424 sshd[5954]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:15.360782 systemd-logind[1489]: Session 15 logged out. Waiting for processes to exit. Sep 13 01:13:15.364481 systemd[1]: sshd@12-10.244.29.26:22-139.178.68.195:54704.service: Deactivated successfully. Sep 13 01:13:15.369742 systemd[1]: session-15.scope: Deactivated successfully. Sep 13 01:13:15.372443 systemd-logind[1489]: Removed session 15. Sep 13 01:13:15.529074 systemd[1]: Started sshd@13-10.244.29.26:22-139.178.68.195:54706.service - OpenSSH per-connection server daemon (139.178.68.195:54706). Sep 13 01:13:16.458600 sshd[5965]: Accepted publickey for core from 139.178.68.195 port 54706 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:16.461561 sshd[5965]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:16.471219 systemd-logind[1489]: New session 16 of user core. Sep 13 01:13:16.480447 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 13 01:13:17.273908 sshd[5965]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:17.284721 systemd[1]: sshd@13-10.244.29.26:22-139.178.68.195:54706.service: Deactivated successfully. Sep 13 01:13:17.293741 systemd[1]: session-16.scope: Deactivated successfully. Sep 13 01:13:17.302200 systemd-logind[1489]: Session 16 logged out. Waiting for processes to exit. Sep 13 01:13:17.309182 systemd-logind[1489]: Removed session 16. Sep 13 01:13:22.441552 systemd[1]: Started sshd@14-10.244.29.26:22-139.178.68.195:52304.service - OpenSSH per-connection server daemon (139.178.68.195:52304). Sep 13 01:13:23.480782 sshd[6006]: Accepted publickey for core from 139.178.68.195 port 52304 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:23.487927 sshd[6006]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:23.500557 systemd-logind[1489]: New session 17 of user core. 
Sep 13 01:13:23.509768 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 13 01:13:24.994358 sshd[6006]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:25.004011 systemd[1]: sshd@14-10.244.29.26:22-139.178.68.195:52304.service: Deactivated successfully. Sep 13 01:13:25.010687 systemd[1]: session-17.scope: Deactivated successfully. Sep 13 01:13:25.018262 systemd-logind[1489]: Session 17 logged out. Waiting for processes to exit. Sep 13 01:13:25.022982 systemd-logind[1489]: Removed session 17. Sep 13 01:13:27.320316 containerd[1507]: time="2025-09-13T01:13:27.275191566Z" level=info msg="StopPodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\"" Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:27.866 [WARNING][6048] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a73f9354-94fe-434f-94c6-f203f326e804", ResourceVersion:"1021", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", 
ContainerID:"d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747", Pod:"coredns-668d6bf9bc-kdwhw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2cfcfb65cc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:27.877 [INFO][6048] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:27.877 [INFO][6048] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" iface="eth0" netns="" Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:27.877 [INFO][6048] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:27.877 [INFO][6048] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:28.197 [INFO][6055] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:28.201 [INFO][6055] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:28.202 [INFO][6055] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:28.238 [WARNING][6055] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:28.239 [INFO][6055] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:28.247 [INFO][6055] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:13:28.267433 containerd[1507]: 2025-09-13 01:13:28.261 [INFO][6048] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:13:28.347738 containerd[1507]: time="2025-09-13T01:13:28.347617385Z" level=info msg="TearDown network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\" successfully" Sep 13 01:13:28.349562 containerd[1507]: time="2025-09-13T01:13:28.349523451Z" level=info msg="StopPodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\" returns successfully" Sep 13 01:13:28.471653 containerd[1507]: time="2025-09-13T01:13:28.471581937Z" level=info msg="RemovePodSandbox for \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\"" Sep 13 01:13:28.491247 containerd[1507]: time="2025-09-13T01:13:28.491162075Z" level=info msg="Forcibly stopping sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\"" Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.633 [WARNING][6072] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a73f9354-94fe-434f-94c6-f203f326e804", ResourceVersion:"1021", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"d72be20edc61ed1e0c78744e79e6a244a4fe01d34cf988e05d4fa6e613ad0747", Pod:"coredns-668d6bf9bc-kdwhw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2cfcfb65cc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:13:28.726433 containerd[1507]: 
2025-09-13 01:13:28.634 [INFO][6072] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.634 [INFO][6072] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" iface="eth0" netns="" Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.635 [INFO][6072] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.635 [INFO][6072] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.699 [INFO][6079] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.700 [INFO][6079] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.700 [INFO][6079] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.714 [WARNING][6079] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.715 [INFO][6079] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" HandleID="k8s-pod-network.ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Workload="srv--5asmg.gb1.brightbox.com-k8s-coredns--668d6bf9bc--kdwhw-eth0" Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.717 [INFO][6079] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:13:28.726433 containerd[1507]: 2025-09-13 01:13:28.721 [INFO][6072] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f" Sep 13 01:13:28.726433 containerd[1507]: time="2025-09-13T01:13:28.725831183Z" level=info msg="TearDown network for sandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\" successfully" Sep 13 01:13:28.851535 containerd[1507]: time="2025-09-13T01:13:28.851444242Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:13:28.851769 containerd[1507]: time="2025-09-13T01:13:28.851620161Z" level=info msg="RemovePodSandbox \"ef314f33511ffa413fc05cbdbb356090dd16e1cbb28fa570c1886f5ba662e90f\" returns successfully" Sep 13 01:13:28.854815 containerd[1507]: time="2025-09-13T01:13:28.854780782Z" level=info msg="StopPodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\"" Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:28.954 [WARNING][6093] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0", GenerateName:"calico-apiserver-658bd7dd7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"f1634209-c6c5-41be-8e7f-82f83544c3ed", ResourceVersion:"1176", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658bd7dd7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f", Pod:"calico-apiserver-658bd7dd7b-wdc44", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie647a09106c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:28.955 [INFO][6093] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:28.955 [INFO][6093] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" iface="eth0" netns="" Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:28.955 [INFO][6093] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:28.955 [INFO][6093] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:29.023 [INFO][6100] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:29.027 [INFO][6100] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:29.027 [INFO][6100] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:29.057 [WARNING][6100] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:29.058 [INFO][6100] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:29.063 [INFO][6100] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:13:29.069861 containerd[1507]: 2025-09-13 01:13:29.067 [INFO][6093] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:13:29.074634 containerd[1507]: time="2025-09-13T01:13:29.070706889Z" level=info msg="TearDown network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\" successfully" Sep 13 01:13:29.074634 containerd[1507]: time="2025-09-13T01:13:29.070749877Z" level=info msg="StopPodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\" returns successfully" Sep 13 01:13:29.074634 containerd[1507]: time="2025-09-13T01:13:29.071406036Z" level=info msg="RemovePodSandbox for \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\"" Sep 13 01:13:29.074634 containerd[1507]: time="2025-09-13T01:13:29.071452378Z" level=info msg="Forcibly stopping sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\"" Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.156 [WARNING][6114] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0", GenerateName:"calico-apiserver-658bd7dd7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"f1634209-c6c5-41be-8e7f-82f83544c3ed", ResourceVersion:"1176", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 11, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658bd7dd7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-5asmg.gb1.brightbox.com", ContainerID:"3a0979c7494e4c5921de445da94ec2a508ba1e567b3dfcbc10e3b590297cf02f", Pod:"calico-apiserver-658bd7dd7b-wdc44", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie647a09106c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.157 [INFO][6114] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.157 [INFO][6114] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" iface="eth0" netns="" Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.157 [INFO][6114] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.157 [INFO][6114] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.210 [INFO][6122] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.211 [INFO][6122] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.211 [INFO][6122] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.222 [WARNING][6122] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.222 [INFO][6122] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" HandleID="k8s-pod-network.ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Workload="srv--5asmg.gb1.brightbox.com-k8s-calico--apiserver--658bd7dd7b--wdc44-eth0" Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.224 [INFO][6122] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:13:29.233287 containerd[1507]: 2025-09-13 01:13:29.227 [INFO][6114] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884" Sep 13 01:13:29.233287 containerd[1507]: time="2025-09-13T01:13:29.231656293Z" level=info msg="TearDown network for sandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\" successfully" Sep 13 01:13:29.257855 containerd[1507]: time="2025-09-13T01:13:29.257548757Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 13 01:13:29.257855 containerd[1507]: time="2025-09-13T01:13:29.257702774Z" level=info msg="RemovePodSandbox \"ba0bbfe712eff2e9a7ba02031e9f50da44134c759d7d6a06b8ed0d91936cf884\" returns successfully" Sep 13 01:13:30.230646 systemd[1]: Started sshd@15-10.244.29.26:22-139.178.68.195:48512.service - OpenSSH per-connection server daemon (139.178.68.195:48512). 
Sep 13 01:13:31.227198 sshd[6129]: Accepted publickey for core from 139.178.68.195 port 48512 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:31.230348 sshd[6129]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:31.246386 systemd-logind[1489]: New session 18 of user core. Sep 13 01:13:31.253400 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 13 01:13:32.617212 sshd[6129]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:32.625393 systemd[1]: sshd@15-10.244.29.26:22-139.178.68.195:48512.service: Deactivated successfully. Sep 13 01:13:32.629340 systemd[1]: session-18.scope: Deactivated successfully. Sep 13 01:13:32.632629 systemd-logind[1489]: Session 18 logged out. Waiting for processes to exit. Sep 13 01:13:32.635429 systemd-logind[1489]: Removed session 18. Sep 13 01:13:35.716276 systemd[1]: run-containerd-runc-k8s.io-b8a79e41830e3e6fbb29717e5b98e72c6cb4581f81331d4714e794be44e6aafd-runc.04hfZI.mount: Deactivated successfully. Sep 13 01:13:37.777555 systemd[1]: Started sshd@16-10.244.29.26:22-139.178.68.195:48520.service - OpenSSH per-connection server daemon (139.178.68.195:48520). Sep 13 01:13:38.778164 sshd[6164]: Accepted publickey for core from 139.178.68.195 port 48520 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:38.779524 sshd[6164]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:38.794910 systemd-logind[1489]: New session 19 of user core. Sep 13 01:13:38.801497 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 13 01:13:39.908966 sshd[6164]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:39.919057 systemd-logind[1489]: Session 19 logged out. Waiting for processes to exit. Sep 13 01:13:39.919857 systemd[1]: sshd@16-10.244.29.26:22-139.178.68.195:48520.service: Deactivated successfully. 
Sep 13 01:13:39.926343 systemd[1]: session-19.scope: Deactivated successfully. Sep 13 01:13:39.931000 systemd-logind[1489]: Removed session 19. Sep 13 01:13:40.067576 systemd[1]: Started sshd@17-10.244.29.26:22-139.178.68.195:48532.service - OpenSSH per-connection server daemon (139.178.68.195:48532). Sep 13 01:13:41.066205 sshd[6178]: Accepted publickey for core from 139.178.68.195 port 48532 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:41.077412 sshd[6178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:41.111259 systemd-logind[1489]: New session 20 of user core. Sep 13 01:13:41.116339 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 13 01:13:42.151833 sshd[6178]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:42.162045 systemd[1]: sshd@17-10.244.29.26:22-139.178.68.195:48532.service: Deactivated successfully. Sep 13 01:13:42.172034 systemd[1]: session-20.scope: Deactivated successfully. Sep 13 01:13:42.177022 systemd-logind[1489]: Session 20 logged out. Waiting for processes to exit. Sep 13 01:13:42.179324 systemd-logind[1489]: Removed session 20. Sep 13 01:13:42.309344 systemd[1]: Started sshd@18-10.244.29.26:22-139.178.68.195:44016.service - OpenSSH per-connection server daemon (139.178.68.195:44016). Sep 13 01:13:43.244210 sshd[6207]: Accepted publickey for core from 139.178.68.195 port 44016 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:43.247071 sshd[6207]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:43.256988 systemd-logind[1489]: New session 21 of user core. Sep 13 01:13:43.263516 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 13 01:13:45.583933 sshd[6207]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:45.620664 systemd[1]: sshd@18-10.244.29.26:22-139.178.68.195:44016.service: Deactivated successfully. 
Sep 13 01:13:45.634951 systemd[1]: session-21.scope: Deactivated successfully. Sep 13 01:13:45.644214 systemd-logind[1489]: Session 21 logged out. Waiting for processes to exit. Sep 13 01:13:45.647379 systemd-logind[1489]: Removed session 21. Sep 13 01:13:45.740596 systemd[1]: Started sshd@19-10.244.29.26:22-139.178.68.195:44028.service - OpenSSH per-connection server daemon (139.178.68.195:44028). Sep 13 01:13:46.718131 sshd[6232]: Accepted publickey for core from 139.178.68.195 port 44028 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:46.723823 sshd[6232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:46.733612 systemd-logind[1489]: New session 22 of user core. Sep 13 01:13:46.742033 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 13 01:13:49.540915 sshd[6232]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:49.587690 systemd[1]: sshd@19-10.244.29.26:22-139.178.68.195:44028.service: Deactivated successfully. Sep 13 01:13:49.591629 systemd[1]: session-22.scope: Deactivated successfully. Sep 13 01:13:49.595147 systemd-logind[1489]: Session 22 logged out. Waiting for processes to exit. Sep 13 01:13:49.597835 systemd-logind[1489]: Removed session 22. Sep 13 01:13:49.684540 systemd[1]: Started sshd@20-10.244.29.26:22-139.178.68.195:44042.service - OpenSSH per-connection server daemon (139.178.68.195:44042). Sep 13 01:13:50.684094 sshd[6265]: Accepted publickey for core from 139.178.68.195 port 44042 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:50.687690 sshd[6265]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:50.698788 systemd-logind[1489]: New session 23 of user core. Sep 13 01:13:50.707654 systemd[1]: Started session-23.scope - Session 23 of User core. 
Sep 13 01:13:51.986062 sshd[6265]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:51.996299 systemd[1]: sshd@20-10.244.29.26:22-139.178.68.195:44042.service: Deactivated successfully. Sep 13 01:13:51.996896 systemd-logind[1489]: Session 23 logged out. Waiting for processes to exit. Sep 13 01:13:52.001380 systemd[1]: session-23.scope: Deactivated successfully. Sep 13 01:13:52.003385 systemd-logind[1489]: Removed session 23. Sep 13 01:13:53.108993 systemd[1]: run-containerd-runc-k8s.io-b8a79e41830e3e6fbb29717e5b98e72c6cb4581f81331d4714e794be44e6aafd-runc.eUhEdt.mount: Deactivated successfully. Sep 13 01:13:57.157775 systemd[1]: Started sshd@21-10.244.29.26:22-139.178.68.195:55382.service - OpenSSH per-connection server daemon (139.178.68.195:55382). Sep 13 01:13:58.210028 sshd[6345]: Accepted publickey for core from 139.178.68.195 port 55382 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:13:58.213312 sshd[6345]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:13:58.234977 systemd-logind[1489]: New session 24 of user core. Sep 13 01:13:58.239827 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 13 01:13:59.942365 sshd[6345]: pam_unix(sshd:session): session closed for user core Sep 13 01:13:59.957920 systemd[1]: sshd@21-10.244.29.26:22-139.178.68.195:55382.service: Deactivated successfully. Sep 13 01:13:59.965678 systemd[1]: session-24.scope: Deactivated successfully. Sep 13 01:13:59.969659 systemd-logind[1489]: Session 24 logged out. Waiting for processes to exit. Sep 13 01:13:59.974964 systemd-logind[1489]: Removed session 24. Sep 13 01:14:05.114522 systemd[1]: Started sshd@22-10.244.29.26:22-139.178.68.195:47096.service - OpenSSH per-connection server daemon (139.178.68.195:47096). 
Sep 13 01:14:06.085795 sshd[6360]: Accepted publickey for core from 139.178.68.195 port 47096 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:14:06.088566 sshd[6360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:14:06.101890 systemd-logind[1489]: New session 25 of user core. Sep 13 01:14:06.107352 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 13 01:14:07.567395 sshd[6360]: pam_unix(sshd:session): session closed for user core Sep 13 01:14:07.574894 systemd[1]: sshd@22-10.244.29.26:22-139.178.68.195:47096.service: Deactivated successfully. Sep 13 01:14:07.579960 systemd[1]: session-25.scope: Deactivated successfully. Sep 13 01:14:07.581828 systemd-logind[1489]: Session 25 logged out. Waiting for processes to exit. Sep 13 01:14:07.583717 systemd-logind[1489]: Removed session 25.