Sep 13 01:16:44.997516 kernel: Linux version 6.6.106-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 22:30:50 -00 2025
Sep 13 01:16:44.997583 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534
Sep 13 01:16:44.997597 kernel: BIOS-provided physical RAM map:
Sep 13 01:16:44.997613 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 13 01:16:44.997623 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 13 01:16:44.997633 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 13 01:16:44.997644 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Sep 13 01:16:44.997655 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Sep 13 01:16:44.997665 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 13 01:16:44.997675 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 13 01:16:44.997685 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 13 01:16:44.997695 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 13 01:16:44.997710 kernel: NX (Execute Disable) protection: active
Sep 13 01:16:44.997721 kernel: APIC: Static calls initialized
Sep 13 01:16:44.997733 kernel: SMBIOS 2.8 present.
Sep 13 01:16:44.997745 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Sep 13 01:16:44.997756 kernel: Hypervisor detected: KVM
Sep 13 01:16:44.997771 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 13 01:16:44.997783 kernel: kvm-clock: using sched offset of 4381950834 cycles
Sep 13 01:16:44.997795 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 13 01:16:44.997806 kernel: tsc: Detected 2799.998 MHz processor
Sep 13 01:16:44.997818 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 13 01:16:44.997829 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 13 01:16:44.997840 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Sep 13 01:16:44.997852 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 13 01:16:44.997863 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 13 01:16:44.997879 kernel: Using GB pages for direct mapping
Sep 13 01:16:44.997890 kernel: ACPI: Early table checksum verification disabled
Sep 13 01:16:44.997901 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Sep 13 01:16:44.997913 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:16:44.997924 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:16:44.997935 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:16:44.997946 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Sep 13 01:16:44.997957 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:16:44.997969 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:16:44.997984 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:16:44.997996 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 01:16:44.998007 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Sep 13 01:16:44.998018 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Sep 13 01:16:44.998029 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Sep 13 01:16:44.998047 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Sep 13 01:16:44.998058 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Sep 13 01:16:44.998074 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Sep 13 01:16:44.998086 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Sep 13 01:16:44.998098 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Sep 13 01:16:44.998110 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Sep 13 01:16:44.998121 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Sep 13 01:16:44.998133 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Sep 13 01:16:44.998144 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Sep 13 01:16:44.998160 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Sep 13 01:16:44.998172 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Sep 13 01:16:44.998183 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Sep 13 01:16:44.998195 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Sep 13 01:16:44.998207 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Sep 13 01:16:44.998218 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Sep 13 01:16:44.998230 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Sep 13 01:16:44.998241 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Sep 13 01:16:44.998253 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Sep 13 01:16:44.998264 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Sep 13 01:16:44.998280 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Sep 13 01:16:44.998292 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 13 01:16:44.998304 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Sep 13 01:16:44.998316 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Sep 13 01:16:44.998328 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Sep 13 01:16:44.998339 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Sep 13 01:16:44.998351 kernel: Zone ranges:
Sep 13 01:16:44.998363 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 13 01:16:44.998375 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Sep 13 01:16:44.998391 kernel: Normal empty
Sep 13 01:16:44.998403 kernel: Movable zone start for each node
Sep 13 01:16:44.998415 kernel: Early memory node ranges
Sep 13 01:16:44.998426 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 13 01:16:44.998461 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Sep 13 01:16:44.998476 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Sep 13 01:16:44.998488 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 13 01:16:44.998499 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 13 01:16:44.998511 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Sep 13 01:16:44.998522 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 13 01:16:44.998563 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 13 01:16:44.998575 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 13 01:16:44.998587 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 13 01:16:44.998599 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 13 01:16:44.998610 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 13 01:16:44.998622 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 13 01:16:44.998634 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 13 01:16:44.998645 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 13 01:16:44.998657 kernel: TSC deadline timer available
Sep 13 01:16:44.998674 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Sep 13 01:16:44.998686 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 13 01:16:44.998697 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 13 01:16:44.998709 kernel: Booting paravirtualized kernel on KVM
Sep 13 01:16:44.998721 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 13 01:16:44.998733 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Sep 13 01:16:44.998745 kernel: percpu: Embedded 58 pages/cpu s197160 r8192 d32216 u262144
Sep 13 01:16:44.998757 kernel: pcpu-alloc: s197160 r8192 d32216 u262144 alloc=1*2097152
Sep 13 01:16:44.998769 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Sep 13 01:16:44.998785 kernel: kvm-guest: PV spinlocks enabled
Sep 13 01:16:44.998797 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 13 01:16:44.998810 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534
Sep 13 01:16:44.998822 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 13 01:16:44.998834 kernel: random: crng init done
Sep 13 01:16:44.998872 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 13 01:16:44.998886 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 13 01:16:44.998897 kernel: Fallback order for Node 0: 0
Sep 13 01:16:44.998915 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Sep 13 01:16:44.998927 kernel: Policy zone: DMA32
Sep 13 01:16:44.998939 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 13 01:16:44.998951 kernel: software IO TLB: area num 16.
Sep 13 01:16:44.998963 kernel: Memory: 1901524K/2096616K available (12288K kernel code, 2293K rwdata, 22744K rodata, 42884K init, 2312K bss, 194832K reserved, 0K cma-reserved)
Sep 13 01:16:44.998975 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Sep 13 01:16:44.998987 kernel: Kernel/User page tables isolation: enabled
Sep 13 01:16:44.998999 kernel: ftrace: allocating 37974 entries in 149 pages
Sep 13 01:16:44.999011 kernel: ftrace: allocated 149 pages with 4 groups
Sep 13 01:16:44.999028 kernel: Dynamic Preempt: voluntary
Sep 13 01:16:44.999040 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 13 01:16:44.999052 kernel: rcu: RCU event tracing is enabled.
Sep 13 01:16:44.999064 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Sep 13 01:16:44.999076 kernel: Trampoline variant of Tasks RCU enabled.
Sep 13 01:16:44.999100 kernel: Rude variant of Tasks RCU enabled.
Sep 13 01:16:44.999117 kernel: Tracing variant of Tasks RCU enabled.
Sep 13 01:16:44.999129 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 13 01:16:44.999141 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Sep 13 01:16:44.999154 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Sep 13 01:16:44.999166 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 13 01:16:44.999178 kernel: Console: colour VGA+ 80x25
Sep 13 01:16:44.999195 kernel: printk: console [tty0] enabled
Sep 13 01:16:44.999207 kernel: printk: console [ttyS0] enabled
Sep 13 01:16:44.999220 kernel: ACPI: Core revision 20230628
Sep 13 01:16:44.999232 kernel: APIC: Switch to symmetric I/O mode setup
Sep 13 01:16:44.999244 kernel: x2apic enabled
Sep 13 01:16:44.999261 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 13 01:16:44.999274 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
Sep 13 01:16:44.999286 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Sep 13 01:16:44.999299 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 13 01:16:44.999311 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Sep 13 01:16:44.999323 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Sep 13 01:16:44.999335 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 13 01:16:44.999347 kernel: Spectre V2 : Mitigation: Retpolines
Sep 13 01:16:44.999359 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 13 01:16:44.999376 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Sep 13 01:16:44.999389 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 13 01:16:44.999401 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 13 01:16:44.999413 kernel: MDS: Mitigation: Clear CPU buffers
Sep 13 01:16:44.999426 kernel: MMIO Stale Data: Unknown: No mitigations
Sep 13 01:16:44.999455 kernel: SRBDS: Unknown: Dependent on hypervisor status
Sep 13 01:16:44.999468 kernel: active return thunk: its_return_thunk
Sep 13 01:16:44.999480 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 13 01:16:44.999493 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 13 01:16:44.999505 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 13 01:16:44.999517 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 13 01:16:44.999544 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 13 01:16:44.999559 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Sep 13 01:16:44.999571 kernel: Freeing SMP alternatives memory: 32K
Sep 13 01:16:44.999583 kernel: pid_max: default: 32768 minimum: 301
Sep 13 01:16:44.999595 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 13 01:16:44.999607 kernel: landlock: Up and running.
Sep 13 01:16:44.999619 kernel: SELinux: Initializing.
Sep 13 01:16:44.999631 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 13 01:16:44.999643 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 13 01:16:44.999656 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Sep 13 01:16:44.999668 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 13 01:16:44.999686 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 13 01:16:44.999699 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 13 01:16:44.999711 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Sep 13 01:16:44.999724 kernel: signal: max sigframe size: 1776
Sep 13 01:16:44.999736 kernel: rcu: Hierarchical SRCU implementation.
Sep 13 01:16:44.999749 kernel: rcu: Max phase no-delay instances is 400.
Sep 13 01:16:44.999761 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 13 01:16:44.999773 kernel: smp: Bringing up secondary CPUs ...
Sep 13 01:16:44.999786 kernel: smpboot: x86: Booting SMP configuration:
Sep 13 01:16:44.999803 kernel: .... node #0, CPUs: #1
Sep 13 01:16:44.999815 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Sep 13 01:16:44.999828 kernel: smp: Brought up 1 node, 2 CPUs
Sep 13 01:16:44.999840 kernel: smpboot: Max logical packages: 16
Sep 13 01:16:44.999852 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS)
Sep 13 01:16:44.999865 kernel: devtmpfs: initialized
Sep 13 01:16:44.999877 kernel: x86/mm: Memory block size: 128MB
Sep 13 01:16:44.999890 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 13 01:16:44.999902 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Sep 13 01:16:44.999919 kernel: pinctrl core: initialized pinctrl subsystem
Sep 13 01:16:44.999931 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 13 01:16:44.999944 kernel: audit: initializing netlink subsys (disabled)
Sep 13 01:16:44.999956 kernel: audit: type=2000 audit(1757726203.527:1): state=initialized audit_enabled=0 res=1
Sep 13 01:16:44.999968 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 13 01:16:44.999980 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 13 01:16:44.999993 kernel: cpuidle: using governor menu
Sep 13 01:16:45.000005 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 13 01:16:45.000018 kernel: dca service started, version 1.12.1
Sep 13 01:16:45.000035 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Sep 13 01:16:45.000048 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Sep 13 01:16:45.000060 kernel: PCI: Using configuration type 1 for base access
Sep 13 01:16:45.000072 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 13 01:16:45.000085 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 13 01:16:45.000097 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 13 01:16:45.000110 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 13 01:16:45.000122 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 13 01:16:45.000134 kernel: ACPI: Added _OSI(Module Device)
Sep 13 01:16:45.000151 kernel: ACPI: Added _OSI(Processor Device)
Sep 13 01:16:45.000164 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 13 01:16:45.000176 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 13 01:16:45.000189 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 13 01:16:45.000201 kernel: ACPI: Interpreter enabled
Sep 13 01:16:45.000213 kernel: ACPI: PM: (supports S0 S5)
Sep 13 01:16:45.000226 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 13 01:16:45.000238 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 13 01:16:45.000250 kernel: PCI: Using E820 reservations for host bridge windows
Sep 13 01:16:45.000267 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 13 01:16:45.000280 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 13 01:16:45.001009 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 13 01:16:45.001193 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 13 01:16:45.001361 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 13 01:16:45.001380 kernel: PCI host bridge to bus 0000:00
Sep 13 01:16:45.001619 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 13 01:16:45.001783 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 13 01:16:45.001938 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 13 01:16:45.002090 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Sep 13 01:16:45.002240 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 13 01:16:45.002401 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Sep 13 01:16:45.002585 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 13 01:16:45.002779 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Sep 13 01:16:45.003002 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Sep 13 01:16:45.003181 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Sep 13 01:16:45.003362 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Sep 13 01:16:45.003559 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Sep 13 01:16:45.003729 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 13 01:16:45.003908 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 13 01:16:45.004123 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Sep 13 01:16:45.004306 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 13 01:16:45.004505 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Sep 13 01:16:45.004695 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 13 01:16:45.004864 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Sep 13 01:16:45.005051 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 13 01:16:45.005219 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Sep 13 01:16:45.005410 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 13 01:16:45.005616 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Sep 13 01:16:45.005789 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 13 01:16:45.005978 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Sep 13 01:16:45.006153 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 13 01:16:45.006328 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Sep 13 01:16:45.006519 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 13 01:16:45.006735 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Sep 13 01:16:45.006913 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Sep 13 01:16:45.007085 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Sep 13 01:16:45.007255 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Sep 13 01:16:45.008629 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 13 01:16:45.008821 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Sep 13 01:16:45.008999 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Sep 13 01:16:45.009168 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Sep 13 01:16:45.009333 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Sep 13 01:16:45.009618 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Sep 13 01:16:45.009793 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Sep 13 01:16:45.009957 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 13 01:16:45.010138 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Sep 13 01:16:45.011739 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Sep 13 01:16:45.011949 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Sep 13 01:16:45.012143 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Sep 13 01:16:45.012310 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Sep 13 01:16:45.012519 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Sep 13 01:16:45.012709 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Sep 13 01:16:45.012917 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Sep 13 01:16:45.013084 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Sep 13 01:16:45.013250 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 01:16:45.013424 kernel: pci_bus 0000:02: extended config space not accessible
Sep 13 01:16:45.015720 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Sep 13 01:16:45.015918 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Sep 13 01:16:45.016096 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Sep 13 01:16:45.016267 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 13 01:16:45.017599 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 13 01:16:45.017785 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Sep 13 01:16:45.017992 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Sep 13 01:16:45.018174 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 13 01:16:45.018346 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 01:16:45.019044 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 13 01:16:45.019225 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 13 01:16:45.019397 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Sep 13 01:16:45.021615 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 13 01:16:45.021791 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 01:16:45.021964 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Sep 13 01:16:45.022156 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 13 01:16:45.022339 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 01:16:45.022564 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Sep 13 01:16:45.022732 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 13 01:16:45.022895 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 01:16:45.023119 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Sep 13 01:16:45.023304 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 13 01:16:45.023487 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 01:16:45.023673 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Sep 13 01:16:45.023847 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Sep 13 01:16:45.024013 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 01:16:45.024195 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Sep 13 01:16:45.024360 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 13 01:16:45.026591 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 01:16:45.026615 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 13 01:16:45.026629 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 13 01:16:45.026642 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 13 01:16:45.026655 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 13 01:16:45.026676 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 13 01:16:45.026688 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 13 01:16:45.026701 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 13 01:16:45.026714 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 13 01:16:45.026726 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 13 01:16:45.026739 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 13 01:16:45.026751 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 13 01:16:45.026764 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 13 01:16:45.026777 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 13 01:16:45.026795 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 13 01:16:45.026807 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 13 01:16:45.026820 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 13 01:16:45.026833 kernel: iommu: Default domain type: Translated
Sep 13 01:16:45.026845 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 13 01:16:45.026864 kernel: PCI: Using ACPI for IRQ routing
Sep 13 01:16:45.026877 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 13 01:16:45.026889 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 13 01:16:45.026902 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Sep 13 01:16:45.027110 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 13 01:16:45.027282 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 13 01:16:45.029080 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 13 01:16:45.029104 kernel: vgaarb: loaded
Sep 13 01:16:45.029147 kernel: clocksource: Switched to clocksource kvm-clock
Sep 13 01:16:45.029160 kernel: VFS: Disk quotas dquot_6.6.0
Sep 13 01:16:45.029173 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 13 01:16:45.029185 kernel: pnp: PnP ACPI init
Sep 13 01:16:45.029388 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 13 01:16:45.029410 kernel: pnp: PnP ACPI: found 5 devices
Sep 13 01:16:45.029423 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 13 01:16:45.029465 kernel: NET: Registered PF_INET protocol family
Sep 13 01:16:45.029479 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 13 01:16:45.029492 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 13 01:16:45.029505 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 13 01:16:45.029518 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 13 01:16:45.029550 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 13 01:16:45.029563 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 13 01:16:45.029576 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 13 01:16:45.029589 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 13 01:16:45.029601 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 13 01:16:45.029613 kernel: NET: Registered PF_XDP protocol family
Sep 13 01:16:45.029787 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Sep 13 01:16:45.029956 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 13 01:16:45.030131 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 13 01:16:45.030299 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 13 01:16:45.031556 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 13 01:16:45.031737 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 13 01:16:45.031905 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 13 01:16:45.032084 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 13 01:16:45.032272 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Sep 13 01:16:45.032457 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Sep 13 01:16:45.033636 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Sep 13 01:16:45.033805 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Sep 13 01:16:45.033992 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Sep 13 01:16:45.034192 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Sep 13 01:16:45.034361 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Sep 13 01:16:45.035587 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Sep 13 01:16:45.035793 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Sep 13 01:16:45.035970 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 13 01:16:45.036138 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Sep 13 01:16:45.036302 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 13 01:16:45.037501 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Sep 13 01:16:45.037705 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 01:16:45.037879 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Sep 13 01:16:45.038044 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 13 01:16:45.038215 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 13 01:16:45.038386 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 01:16:45.039613 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Sep 13 01:16:45.039781 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 13 01:16:45.039946 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 13 01:16:45.040118 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 01:16:45.040302 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Sep 13 01:16:45.040501 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 13 01:16:45.040682 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 13 01:16:45.040850 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 01:16:45.041022 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Sep 13 01:16:45.041279 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 13 01:16:45.044483 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 13 01:16:45.044679 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 01:16:45.044853 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Sep 13 01:16:45.045034 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 13 01:16:45.045202 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 13 01:16:45.045394 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 01:16:45.045599 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Sep 13 01:16:45.045798 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 13 01:16:45.045990 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Sep 13 01:16:45.046225 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 01:16:45.047594 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Sep 13 01:16:45.047789 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 13 01:16:45.047961 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 13 01:16:45.048131 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 01:16:45.048313 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 13 01:16:45.048550 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 13 01:16:45.048763 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 13 01:16:45.048925 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Sep 13 01:16:45.049075 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 13 01:16:45.049223 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Sep 13 01:16:45.050561 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 13 01:16:45.050754 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Sep 13 01:16:45.050928 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 01:16:45.051106 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 13 01:16:45.051304 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Sep 13 01:16:45.051496 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 13 01:16:45.051676 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 01:16:45.051866 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Sep 13 01:16:45.052043 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 13 01:16:45.052218 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 01:16:45.052411 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Sep 13 01:16:45.053338 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 13 01:16:45.053525 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 01:16:45.053717 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Sep 13 01:16:45.053876 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 13 01:16:45.054031 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 01:16:45.054210 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Sep 13 01:16:45.054378 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 13 01:16:45.054586 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 01:16:45.054754 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Sep 13 01:16:45.054911 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 13 01:16:45.055066 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 01:16:45.055241 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Sep 13 01:16:45.055413 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 13 01:16:45.055617 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 01:16:45.055639 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 13 01:16:45.055654 kernel: PCI: CLS 0 bytes, default 64
Sep 13 01:16:45.055667 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 13 01:16:45.055681 kernel: software IO TLB: mapped [mem 
0x0000000079800000-0x000000007d800000] (64MB) Sep 13 01:16:45.055695 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Sep 13 01:16:45.055709 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Sep 13 01:16:45.055722 kernel: Initialise system trusted keyrings Sep 13 01:16:45.055742 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 13 01:16:45.055761 kernel: Key type asymmetric registered Sep 13 01:16:45.055774 kernel: Asymmetric key parser 'x509' registered Sep 13 01:16:45.055788 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Sep 13 01:16:45.055801 kernel: io scheduler mq-deadline registered Sep 13 01:16:45.055815 kernel: io scheduler kyber registered Sep 13 01:16:45.055836 kernel: io scheduler bfq registered Sep 13 01:16:45.056008 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 13 01:16:45.056180 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 13 01:16:45.056360 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:16:45.056582 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 13 01:16:45.056755 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 13 01:16:45.056926 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:16:45.057098 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 13 01:16:45.057268 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 13 01:16:45.057571 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:16:45.057746 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 13 01:16:45.057912 kernel: pcieport 0000:00:02.3: AER: enabled 
with IRQ 27 Sep 13 01:16:45.058083 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:16:45.058252 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Sep 13 01:16:45.058416 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 13 01:16:45.058618 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:16:45.058787 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 13 01:16:45.058952 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 13 01:16:45.059121 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:16:45.059289 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 13 01:16:45.059501 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 13 01:16:45.059700 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:16:45.059868 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 13 01:16:45.060034 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Sep 13 01:16:45.060199 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 13 01:16:45.060220 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 13 01:16:45.060235 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 13 01:16:45.060255 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 13 01:16:45.060269 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 13 01:16:45.060283 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 13 01:16:45.060296 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 
0x60,0x64 irq 1,12 Sep 13 01:16:45.060310 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 13 01:16:45.060323 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 13 01:16:45.060344 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 13 01:16:45.060555 kernel: rtc_cmos 00:03: RTC can wake from S4 Sep 13 01:16:45.060719 kernel: rtc_cmos 00:03: registered as rtc0 Sep 13 01:16:45.060886 kernel: rtc_cmos 00:03: setting system clock to 2025-09-13T01:16:44 UTC (1757726204) Sep 13 01:16:45.061043 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Sep 13 01:16:45.061063 kernel: intel_pstate: CPU model not supported Sep 13 01:16:45.061077 kernel: NET: Registered PF_INET6 protocol family Sep 13 01:16:45.061091 kernel: Segment Routing with IPv6 Sep 13 01:16:45.061104 kernel: In-situ OAM (IOAM) with IPv6 Sep 13 01:16:45.061117 kernel: NET: Registered PF_PACKET protocol family Sep 13 01:16:45.061131 kernel: Key type dns_resolver registered Sep 13 01:16:45.061151 kernel: IPI shorthand broadcast: enabled Sep 13 01:16:45.061165 kernel: sched_clock: Marking stable (1122004511, 224336586)->(1568362650, -222021553) Sep 13 01:16:45.061179 kernel: registered taskstats version 1 Sep 13 01:16:45.061192 kernel: Loading compiled-in X.509 certificates Sep 13 01:16:45.061206 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.106-flatcar: 1274e0c573ac8d09163d6bc6d1ee1445fb2f8cc6' Sep 13 01:16:45.061219 kernel: Key type .fscrypt registered Sep 13 01:16:45.061232 kernel: Key type fscrypt-provisioning registered Sep 13 01:16:45.061246 kernel: ima: No TPM chip found, activating TPM-bypass! 
Sep 13 01:16:45.061259 kernel: ima: Allocated hash algorithm: sha1 Sep 13 01:16:45.061277 kernel: ima: No architecture policies found Sep 13 01:16:45.061295 kernel: clk: Disabling unused clocks Sep 13 01:16:45.061309 kernel: Freeing unused kernel image (initmem) memory: 42884K Sep 13 01:16:45.061322 kernel: Write protecting the kernel read-only data: 36864k Sep 13 01:16:45.061336 kernel: Freeing unused kernel image (rodata/data gap) memory: 1832K Sep 13 01:16:45.061349 kernel: Run /init as init process Sep 13 01:16:45.061362 kernel: with arguments: Sep 13 01:16:45.061375 kernel: /init Sep 13 01:16:45.061389 kernel: with environment: Sep 13 01:16:45.061406 kernel: HOME=/ Sep 13 01:16:45.061420 kernel: TERM=linux Sep 13 01:16:45.061462 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 13 01:16:45.061480 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 13 01:16:45.061497 systemd[1]: Detected virtualization kvm. Sep 13 01:16:45.061511 systemd[1]: Detected architecture x86-64. Sep 13 01:16:45.061531 systemd[1]: Running in initrd. Sep 13 01:16:45.061556 systemd[1]: No hostname configured, using default hostname. Sep 13 01:16:45.061577 systemd[1]: Hostname set to . Sep 13 01:16:45.061592 systemd[1]: Initializing machine ID from VM UUID. Sep 13 01:16:45.061606 systemd[1]: Queued start job for default target initrd.target. Sep 13 01:16:45.061621 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 13 01:16:45.061635 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Sep 13 01:16:45.061650 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 13 01:16:45.061664 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 13 01:16:45.061678 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 13 01:16:45.061697 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 13 01:16:45.061713 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 13 01:16:45.061728 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 13 01:16:45.061742 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 13 01:16:45.061757 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 13 01:16:45.061771 systemd[1]: Reached target paths.target - Path Units. Sep 13 01:16:45.061785 systemd[1]: Reached target slices.target - Slice Units. Sep 13 01:16:45.061804 systemd[1]: Reached target swap.target - Swaps. Sep 13 01:16:45.061818 systemd[1]: Reached target timers.target - Timer Units. Sep 13 01:16:45.061833 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 13 01:16:45.061847 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 13 01:16:45.061861 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 13 01:16:45.061876 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Sep 13 01:16:45.061890 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 13 01:16:45.061904 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 13 01:16:45.061923 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Sep 13 01:16:45.061938 systemd[1]: Reached target sockets.target - Socket Units. Sep 13 01:16:45.061952 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 13 01:16:45.061967 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 13 01:16:45.061981 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 13 01:16:45.061995 systemd[1]: Starting systemd-fsck-usr.service... Sep 13 01:16:45.062009 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 13 01:16:45.062023 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 13 01:16:45.062038 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 01:16:45.062112 systemd-journald[202]: Collecting audit messages is disabled. Sep 13 01:16:45.062144 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 13 01:16:45.062159 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 13 01:16:45.062174 systemd[1]: Finished systemd-fsck-usr.service. Sep 13 01:16:45.062195 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 13 01:16:45.062209 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 13 01:16:45.062223 kernel: Bridge firewalling registered Sep 13 01:16:45.062237 systemd-journald[202]: Journal started Sep 13 01:16:45.062267 systemd-journald[202]: Runtime Journal (/run/log/journal/dc4728508d8a4c41a3c2d3845d5d9520) is 4.7M, max 38.0M, 33.2M free. Sep 13 01:16:45.007898 systemd-modules-load[203]: Inserted module 'overlay' Sep 13 01:16:45.052546 systemd-modules-load[203]: Inserted module 'br_netfilter' Sep 13 01:16:45.127464 systemd[1]: Started systemd-journald.service - Journal Service. Sep 13 01:16:45.126902 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Sep 13 01:16:45.128882 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 01:16:45.131148 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 13 01:16:45.141963 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 13 01:16:45.145622 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 13 01:16:45.152149 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 13 01:16:45.160631 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 13 01:16:45.173866 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 13 01:16:45.176371 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 13 01:16:45.177323 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 13 01:16:45.184684 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 13 01:16:45.185771 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 13 01:16:45.195660 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 13 01:16:45.201836 dracut-cmdline[236]: dracut-dracut-053 Sep 13 01:16:45.206261 dracut-cmdline[236]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534 Sep 13 01:16:45.240403 systemd-resolved[238]: Positive Trust Anchors: Sep 13 01:16:45.240426 systemd-resolved[238]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 13 01:16:45.241504 systemd-resolved[238]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 13 01:16:45.249249 systemd-resolved[238]: Defaulting to hostname 'linux'. Sep 13 01:16:45.251838 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 13 01:16:45.252999 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 13 01:16:45.305480 kernel: SCSI subsystem initialized Sep 13 01:16:45.317470 kernel: Loading iSCSI transport class v2.0-870. Sep 13 01:16:45.329462 kernel: iscsi: registered transport (tcp) Sep 13 01:16:45.354790 kernel: iscsi: registered transport (qla4xxx) Sep 13 01:16:45.354872 kernel: QLogic iSCSI HBA Driver Sep 13 01:16:45.407700 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 13 01:16:45.414650 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 13 01:16:45.456239 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Sep 13 01:16:45.456320 kernel: device-mapper: uevent: version 1.0.3 Sep 13 01:16:45.457090 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Sep 13 01:16:45.504607 kernel: raid6: sse2x4 gen() 14280 MB/s Sep 13 01:16:45.522499 kernel: raid6: sse2x2 gen() 9594 MB/s Sep 13 01:16:45.543971 kernel: raid6: sse2x1 gen() 10412 MB/s Sep 13 01:16:45.544298 kernel: raid6: using algorithm sse2x4 gen() 14280 MB/s Sep 13 01:16:45.562038 kernel: raid6: .... xor() 7902 MB/s, rmw enabled Sep 13 01:16:45.562128 kernel: raid6: using ssse3x2 recovery algorithm Sep 13 01:16:45.587486 kernel: xor: automatically using best checksumming function avx Sep 13 01:16:45.772472 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 13 01:16:45.788286 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 13 01:16:45.795687 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 13 01:16:45.828356 systemd-udevd[421]: Using default interface naming scheme 'v255'. Sep 13 01:16:45.835574 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 13 01:16:45.846107 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 13 01:16:45.866626 dracut-pre-trigger[429]: rd.md=0: removing MD RAID activation Sep 13 01:16:45.907531 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 13 01:16:45.914767 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 13 01:16:46.027772 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 13 01:16:46.035640 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 13 01:16:46.062743 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 13 01:16:46.065060 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Sep 13 01:16:46.066847 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 13 01:16:46.068755 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 13 01:16:46.077664 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 13 01:16:46.098579 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 13 01:16:46.158469 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Sep 13 01:16:46.169497 kernel: cryptd: max_cpu_qlen set to 1000 Sep 13 01:16:46.174462 kernel: ACPI: bus type USB registered Sep 13 01:16:46.179447 kernel: usbcore: registered new interface driver usbfs Sep 13 01:16:46.183603 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Sep 13 01:16:46.183844 kernel: usbcore: registered new interface driver hub Sep 13 01:16:46.196458 kernel: usbcore: registered new device driver usb Sep 13 01:16:46.199456 kernel: AVX version of gcm_enc/dec engaged. Sep 13 01:16:46.207554 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 13 01:16:46.207600 kernel: GPT:17805311 != 125829119 Sep 13 01:16:46.207619 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 13 01:16:46.207636 kernel: GPT:17805311 != 125829119 Sep 13 01:16:46.207653 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 13 01:16:46.207670 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:16:46.199924 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 13 01:16:46.211632 kernel: AES CTR mode by8 optimization enabled Sep 13 01:16:46.200089 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 13 01:16:46.211388 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 13 01:16:46.213559 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Sep 13 01:16:46.213764 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 01:16:46.216021 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 01:16:46.229764 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 01:16:46.240289 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 13 01:16:46.240683 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Sep 13 01:16:46.246171 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 13 01:16:46.246420 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Sep 13 01:16:46.248617 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Sep 13 01:16:46.248843 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Sep 13 01:16:46.249053 kernel: hub 1-0:1.0: USB hub found Sep 13 01:16:46.249272 kernel: hub 1-0:1.0: 4 ports detected Sep 13 01:16:46.249534 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Sep 13 01:16:46.249749 kernel: hub 2-0:1.0: USB hub found Sep 13 01:16:46.249975 kernel: hub 2-0:1.0: 4 ports detected Sep 13 01:16:46.288495 kernel: libata version 3.00 loaded. 
Sep 13 01:16:46.297489 kernel: ahci 0000:00:1f.2: version 3.0 Sep 13 01:16:46.300830 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 13 01:16:46.304293 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Sep 13 01:16:46.304710 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 13 01:16:46.304918 kernel: scsi host0: ahci Sep 13 01:16:46.307423 kernel: scsi host1: ahci Sep 13 01:16:46.307722 kernel: scsi host2: ahci Sep 13 01:16:46.309599 kernel: scsi host3: ahci Sep 13 01:16:46.309800 kernel: scsi host4: ahci Sep 13 01:16:46.310020 kernel: scsi host5: ahci Sep 13 01:16:46.310213 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41 Sep 13 01:16:46.310234 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41 Sep 13 01:16:46.310251 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41 Sep 13 01:16:46.310268 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41 Sep 13 01:16:46.310291 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41 Sep 13 01:16:46.310308 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41 Sep 13 01:16:46.334466 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/vda6 scanned by (udev-worker) (470) Sep 13 01:16:46.355465 kernel: BTRFS: device fsid fa70a3b0-3d47-4508-bba0-9fa4607626aa devid 1 transid 36 /dev/vda3 scanned by (udev-worker) (475) Sep 13 01:16:46.364624 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 13 01:16:46.402910 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 13 01:16:46.404106 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 01:16:46.416700 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. 
Sep 13 01:16:46.417623 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 13 01:16:46.425657 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 13 01:16:46.438774 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 13 01:16:46.443612 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 13 01:16:46.448577 disk-uuid[564]: Primary Header is updated. Sep 13 01:16:46.448577 disk-uuid[564]: Secondary Entries is updated. Sep 13 01:16:46.448577 disk-uuid[564]: Secondary Header is updated. Sep 13 01:16:46.453492 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:16:46.461537 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:16:46.471138 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 13 01:16:46.493472 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 13 01:16:46.616594 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 13 01:16:46.616666 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 13 01:16:46.622243 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 13 01:16:46.622309 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 13 01:16:46.622329 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 13 01:16:46.626466 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 13 01:16:46.655399 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 13 01:16:46.667034 kernel: usbcore: registered new interface driver usbhid Sep 13 01:16:46.667099 kernel: usbhid: USB HID core driver Sep 13 01:16:46.675473 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Sep 13 01:16:46.680582 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Sep 
13 01:16:47.469694 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 01:16:47.471387 disk-uuid[565]: The operation has completed successfully. Sep 13 01:16:47.519250 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 13 01:16:47.519515 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 13 01:16:47.544666 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 13 01:16:47.548925 sh[587]: Success Sep 13 01:16:47.565495 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Sep 13 01:16:47.644825 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 13 01:16:47.647634 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 13 01:16:47.649206 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 13 01:16:47.670522 kernel: BTRFS info (device dm-0): first mount of filesystem fa70a3b0-3d47-4508-bba0-9fa4607626aa Sep 13 01:16:47.670616 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 13 01:16:47.674797 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Sep 13 01:16:47.674836 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 13 01:16:47.676423 kernel: BTRFS info (device dm-0): using free space tree Sep 13 01:16:47.688342 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 13 01:16:47.689876 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 13 01:16:47.695723 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 13 01:16:47.698623 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Sep 13 01:16:47.715631 kernel: BTRFS info (device vda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 01:16:47.715693 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 01:16:47.715713 kernel: BTRFS info (device vda6): using free space tree Sep 13 01:16:47.723486 kernel: BTRFS info (device vda6): auto enabling async discard Sep 13 01:16:47.739261 systemd[1]: mnt-oem.mount: Deactivated successfully. Sep 13 01:16:47.745535 kernel: BTRFS info (device vda6): last unmount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 01:16:47.754599 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 13 01:16:47.761704 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 13 01:16:47.865084 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 13 01:16:47.873964 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 13 01:16:47.914054 ignition[684]: Ignition 2.19.0 Sep 13 01:16:47.914079 ignition[684]: Stage: fetch-offline Sep 13 01:16:47.914180 ignition[684]: no configs at "/usr/lib/ignition/base.d" Sep 13 01:16:47.914207 ignition[684]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 13 01:16:47.915427 ignition[684]: parsed url from cmdline: "" Sep 13 01:16:47.919253 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Sep 13 01:16:47.915455 ignition[684]: no config URL provided Sep 13 01:16:47.919644 systemd-networkd[771]: lo: Link UP Sep 13 01:16:47.915477 ignition[684]: reading system config file "/usr/lib/ignition/user.ign" Sep 13 01:16:47.919651 systemd-networkd[771]: lo: Gained carrier Sep 13 01:16:47.915495 ignition[684]: no config at "/usr/lib/ignition/user.ign" Sep 13 01:16:47.922917 systemd-networkd[771]: Enumeration completed Sep 13 01:16:47.915503 ignition[684]: failed to fetch config: resource requires networking Sep 13 01:16:47.923897 systemd-networkd[771]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 01:16:47.915769 ignition[684]: Ignition finished successfully Sep 13 01:16:47.923903 systemd-networkd[771]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 01:16:47.924555 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 13 01:16:47.926142 systemd-networkd[771]: eth0: Link UP Sep 13 01:16:47.926148 systemd-networkd[771]: eth0: Gained carrier Sep 13 01:16:47.926161 systemd-networkd[771]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 01:16:47.926727 systemd[1]: Reached target network.target - Network. Sep 13 01:16:47.936730 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Sep 13 01:16:47.955618 systemd-networkd[771]: eth0: DHCPv4 address 10.230.52.250/30, gateway 10.230.52.249 acquired from 10.230.52.249
Sep 13 01:16:47.963314 ignition[778]: Ignition 2.19.0
Sep 13 01:16:47.963336 ignition[778]: Stage: fetch
Sep 13 01:16:47.963872 ignition[778]: no configs at "/usr/lib/ignition/base.d"
Sep 13 01:16:47.963894 ignition[778]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:16:47.964076 ignition[778]: parsed url from cmdline: ""
Sep 13 01:16:47.964083 ignition[778]: no config URL provided
Sep 13 01:16:47.964092 ignition[778]: reading system config file "/usr/lib/ignition/user.ign"
Sep 13 01:16:47.964108 ignition[778]: no config at "/usr/lib/ignition/user.ign"
Sep 13 01:16:47.964331 ignition[778]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Sep 13 01:16:47.964379 ignition[778]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Sep 13 01:16:47.964651 ignition[778]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Sep 13 01:16:47.983492 ignition[778]: GET result: OK
Sep 13 01:16:47.984502 ignition[778]: parsing config with SHA512: 38490211b222409a49b3a5cbeea243ce83a5ac83eab6ed390ff8ee35f705b25fbec9d5eea857dcbf49c654a22423d69fa8eff15d3b2524fca9532a6b73aa36b9
Sep 13 01:16:47.993053 unknown[778]: fetched base config from "system"
Sep 13 01:16:47.993075 unknown[778]: fetched base config from "system"
Sep 13 01:16:47.993603 ignition[778]: fetch: fetch complete
Sep 13 01:16:47.993085 unknown[778]: fetched user config from "openstack"
Sep 13 01:16:47.993612 ignition[778]: fetch: fetch passed
Sep 13 01:16:47.996207 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 13 01:16:47.993675 ignition[778]: Ignition finished successfully
Sep 13 01:16:48.008651 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 13 01:16:48.036549 ignition[785]: Ignition 2.19.0
Sep 13 01:16:48.036569 ignition[785]: Stage: kargs
Sep 13 01:16:48.036833 ignition[785]: no configs at "/usr/lib/ignition/base.d"
Sep 13 01:16:48.036853 ignition[785]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:16:48.041819 ignition[785]: kargs: kargs passed
Sep 13 01:16:48.042579 ignition[785]: Ignition finished successfully
Sep 13 01:16:48.044598 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 13 01:16:48.051674 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 13 01:16:48.079720 ignition[792]: Ignition 2.19.0
Sep 13 01:16:48.080866 ignition[792]: Stage: disks
Sep 13 01:16:48.081113 ignition[792]: no configs at "/usr/lib/ignition/base.d"
Sep 13 01:16:48.081134 ignition[792]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:16:48.083683 ignition[792]: disks: disks passed
Sep 13 01:16:48.084857 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 13 01:16:48.083754 ignition[792]: Ignition finished successfully
Sep 13 01:16:48.086775 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 13 01:16:48.087773 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 13 01:16:48.089100 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 13 01:16:48.090723 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 13 01:16:48.092318 systemd[1]: Reached target basic.target - Basic System.
Sep 13 01:16:48.106782 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 13 01:16:48.126157 systemd-fsck[800]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Sep 13 01:16:48.129580 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 13 01:16:48.136650 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 13 01:16:48.254748 kernel: EXT4-fs (vda9): mounted filesystem 3a3ecd49-b269-4fcb-bb61-e2994e1868ee r/w with ordered data mode. Quota mode: none.
Sep 13 01:16:48.255751 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 13 01:16:48.257018 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 13 01:16:48.272661 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 13 01:16:48.275570 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 13 01:16:48.278581 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 13 01:16:48.286493 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/vda6 scanned by mount (808)
Sep 13 01:16:48.290956 kernel: BTRFS info (device vda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417
Sep 13 01:16:48.289852 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Sep 13 01:16:48.292407 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 13 01:16:48.292488 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 13 01:16:48.302555 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 13 01:16:48.302588 kernel: BTRFS info (device vda6): using free space tree
Sep 13 01:16:48.298927 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 13 01:16:48.312633 kernel: BTRFS info (device vda6): auto enabling async discard
Sep 13 01:16:48.312980 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 13 01:16:48.316923 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 13 01:16:48.396746 initrd-setup-root[836]: cut: /sysroot/etc/passwd: No such file or directory
Sep 13 01:16:48.407915 initrd-setup-root[843]: cut: /sysroot/etc/group: No such file or directory
Sep 13 01:16:48.417955 initrd-setup-root[850]: cut: /sysroot/etc/shadow: No such file or directory
Sep 13 01:16:48.430594 initrd-setup-root[857]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 13 01:16:48.542606 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 13 01:16:48.551609 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 13 01:16:48.554630 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 13 01:16:48.565483 kernel: BTRFS info (device vda6): last unmount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417
Sep 13 01:16:48.595946 ignition[925]: INFO : Ignition 2.19.0
Sep 13 01:16:48.595946 ignition[925]: INFO : Stage: mount
Sep 13 01:16:48.598670 ignition[925]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 01:16:48.598670 ignition[925]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:16:48.598670 ignition[925]: INFO : mount: mount passed
Sep 13 01:16:48.598670 ignition[925]: INFO : Ignition finished successfully
Sep 13 01:16:48.600212 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 13 01:16:48.601542 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 13 01:16:48.669336 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 13 01:16:49.383736 systemd-networkd[771]: eth0: Gained IPv6LL
Sep 13 01:16:50.890291 systemd-networkd[771]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8d3e:24:19ff:fee6:34fa/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8d3e:24:19ff:fee6:34fa/64 assigned by NDisc.
Sep 13 01:16:50.890306 systemd-networkd[771]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Sep 13 01:16:55.461898 coreos-metadata[810]: Sep 13 01:16:55.461 WARN failed to locate config-drive, using the metadata service API instead
Sep 13 01:16:55.484421 coreos-metadata[810]: Sep 13 01:16:55.484 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Sep 13 01:16:55.498847 coreos-metadata[810]: Sep 13 01:16:55.498 INFO Fetch successful
Sep 13 01:16:55.500901 coreos-metadata[810]: Sep 13 01:16:55.499 INFO wrote hostname srv-qlx5f.gb1.brightbox.com to /sysroot/etc/hostname
Sep 13 01:16:55.502045 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Sep 13 01:16:55.502191 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Sep 13 01:16:55.515015 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 13 01:16:55.523105 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 13 01:16:55.549492 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 scanned by mount (941)
Sep 13 01:16:55.555411 kernel: BTRFS info (device vda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417
Sep 13 01:16:55.555474 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 13 01:16:55.557906 kernel: BTRFS info (device vda6): using free space tree
Sep 13 01:16:55.563466 kernel: BTRFS info (device vda6): auto enabling async discard
Sep 13 01:16:55.564959 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 13 01:16:55.600785 ignition[959]: INFO : Ignition 2.19.0
Sep 13 01:16:55.600785 ignition[959]: INFO : Stage: files
Sep 13 01:16:55.602563 ignition[959]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 01:16:55.602563 ignition[959]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:16:55.602563 ignition[959]: DEBUG : files: compiled without relabeling support, skipping
Sep 13 01:16:55.608642 ignition[959]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 13 01:16:55.608642 ignition[959]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 13 01:16:55.612240 ignition[959]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 13 01:16:55.613467 ignition[959]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 13 01:16:55.614864 unknown[959]: wrote ssh authorized keys file for user: core
Sep 13 01:16:55.615850 ignition[959]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 13 01:16:55.617020 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1"
Sep 13 01:16:55.618461 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1"
Sep 13 01:16:55.618461 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 13 01:16:55.618461 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Sep 13 01:16:55.808832 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK
Sep 13 01:16:56.060399 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 13 01:16:56.063455 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 13 01:16:56.081551 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 13 01:16:56.081551 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 13 01:16:56.081551 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Sep 13 01:16:56.511459 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK
Sep 13 01:16:59.809587 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 13 01:16:59.809587 ignition[959]: INFO : files: op(c): [started] processing unit "containerd.service"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(c): op(d): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(c): op(d): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(c): [finished] processing unit "containerd.service"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(e): [started] processing unit "prepare-helm.service"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(e): op(f): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(e): op(f): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(e): [finished] processing unit "prepare-helm.service"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(10): [started] setting preset to enabled for "prepare-helm.service"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: op(10): [finished] setting preset to enabled for "prepare-helm.service"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: createResultFile: createFiles: op(11): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: createResultFile: createFiles: op(11): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 13 01:16:59.815758 ignition[959]: INFO : files: files passed
Sep 13 01:16:59.815758 ignition[959]: INFO : Ignition finished successfully
Sep 13 01:16:59.816535 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 13 01:16:59.828667 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 13 01:16:59.833503 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 13 01:16:59.836747 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 13 01:16:59.837562 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 13 01:16:59.856013 initrd-setup-root-after-ignition[988]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 01:16:59.856013 initrd-setup-root-after-ignition[988]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 01:16:59.859486 initrd-setup-root-after-ignition[992]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 01:16:59.861770 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 13 01:16:59.863199 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 13 01:16:59.869645 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 13 01:16:59.898456 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 13 01:16:59.898611 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 13 01:16:59.899961 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 13 01:16:59.900722 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 13 01:16:59.902315 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 13 01:16:59.903674 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 13 01:16:59.936042 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 13 01:16:59.945648 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 13 01:16:59.958206 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 13 01:16:59.959111 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 13 01:16:59.960827 systemd[1]: Stopped target timers.target - Timer Units.
Sep 13 01:16:59.962207 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 13 01:16:59.962376 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 13 01:16:59.964218 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 13 01:16:59.965300 systemd[1]: Stopped target basic.target - Basic System.
Sep 13 01:16:59.966722 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 13 01:16:59.968076 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 13 01:16:59.969476 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 13 01:16:59.970984 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 13 01:16:59.972554 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 13 01:16:59.974122 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 13 01:16:59.975573 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 13 01:16:59.977103 systemd[1]: Stopped target swap.target - Swaps.
Sep 13 01:16:59.978550 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 13 01:16:59.978709 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 13 01:16:59.980509 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 13 01:16:59.981423 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 13 01:16:59.982923 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 13 01:16:59.983100 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 13 01:16:59.984587 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 13 01:16:59.984792 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 13 01:16:59.986597 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 13 01:16:59.986766 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 13 01:16:59.988454 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 13 01:16:59.988606 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 13 01:16:59.996727 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 13 01:17:00.000619 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 13 01:17:00.001362 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 13 01:17:00.003514 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 13 01:17:00.011802 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 13 01:17:00.012782 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 13 01:17:00.021397 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 13 01:17:00.022479 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 13 01:17:00.030906 ignition[1012]: INFO : Ignition 2.19.0
Sep 13 01:17:00.030906 ignition[1012]: INFO : Stage: umount
Sep 13 01:17:00.032748 ignition[1012]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 01:17:00.032748 ignition[1012]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 13 01:17:00.035874 ignition[1012]: INFO : umount: umount passed
Sep 13 01:17:00.035874 ignition[1012]: INFO : Ignition finished successfully
Sep 13 01:17:00.037839 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 13 01:17:00.038799 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 13 01:17:00.040729 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 13 01:17:00.040813 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 13 01:17:00.042278 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 13 01:17:00.042344 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 13 01:17:00.043198 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 13 01:17:00.043263 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 13 01:17:00.045858 systemd[1]: Stopped target network.target - Network.
Sep 13 01:17:00.047767 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 13 01:17:00.047852 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 13 01:17:00.049168 systemd[1]: Stopped target paths.target - Path Units.
Sep 13 01:17:00.050424 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 13 01:17:00.054530 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 13 01:17:00.055344 systemd[1]: Stopped target slices.target - Slice Units.
Sep 13 01:17:00.057005 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 13 01:17:00.058306 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 13 01:17:00.058378 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 13 01:17:00.059626 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 13 01:17:00.059690 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 13 01:17:00.060887 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 13 01:17:00.060961 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 13 01:17:00.062277 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 13 01:17:00.062349 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 13 01:17:00.063845 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 13 01:17:00.065630 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 13 01:17:00.068522 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 13 01:17:00.069289 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 13 01:17:00.069422 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 13 01:17:00.069545 systemd-networkd[771]: eth0: DHCPv6 lease lost
Sep 13 01:17:00.075241 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 13 01:17:00.075400 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 13 01:17:00.077352 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 13 01:17:00.077891 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 13 01:17:00.079190 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 13 01:17:00.079259 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 13 01:17:00.096581 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 13 01:17:00.097239 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 13 01:17:00.097317 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 13 01:17:00.099504 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 13 01:17:00.101386 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 13 01:17:00.101655 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 13 01:17:00.114959 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 13 01:17:00.115217 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 13 01:17:00.119013 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 13 01:17:00.119125 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 13 01:17:00.120248 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 13 01:17:00.120301 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 13 01:17:00.121746 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 13 01:17:00.121819 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 13 01:17:00.123880 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 13 01:17:00.123944 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 13 01:17:00.125297 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 13 01:17:00.125363 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 13 01:17:00.137041 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 13 01:17:00.137814 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 13 01:17:00.137887 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 13 01:17:00.138628 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 13 01:17:00.138700 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 13 01:17:00.140191 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 13 01:17:00.140261 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 13 01:17:00.142872 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 13 01:17:00.142941 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 13 01:17:00.148019 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 13 01:17:00.148096 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 13 01:17:00.149644 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 13 01:17:00.149714 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 13 01:17:00.151405 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 13 01:17:00.151486 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 01:17:00.153803 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 13 01:17:00.153962 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 13 01:17:00.155867 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 13 01:17:00.155999 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 13 01:17:00.158027 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 13 01:17:00.165626 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 13 01:17:00.175044 systemd[1]: Switching root.
Sep 13 01:17:00.208909 systemd-journald[202]: Journal stopped
Sep 13 01:17:01.575746 systemd-journald[202]: Received SIGTERM from PID 1 (systemd).
Sep 13 01:17:01.575853 kernel: SELinux: policy capability network_peer_controls=1
Sep 13 01:17:01.575896 kernel: SELinux: policy capability open_perms=1
Sep 13 01:17:01.575923 kernel: SELinux: policy capability extended_socket_class=1
Sep 13 01:17:01.575942 kernel: SELinux: policy capability always_check_network=0
Sep 13 01:17:01.575960 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 13 01:17:01.575978 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 13 01:17:01.576015 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 13 01:17:01.576035 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 13 01:17:01.576058 kernel: audit: type=1403 audit(1757726220.504:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 13 01:17:01.576083 systemd[1]: Successfully loaded SELinux policy in 53.727ms.
Sep 13 01:17:01.576113 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 21.201ms.
Sep 13 01:17:01.576147 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 13 01:17:01.576168 systemd[1]: Detected virtualization kvm.
Sep 13 01:17:01.576188 systemd[1]: Detected architecture x86-64.
Sep 13 01:17:01.576231 systemd[1]: Detected first boot.
Sep 13 01:17:01.576254 systemd[1]: Hostname set to .
Sep 13 01:17:01.576272 systemd[1]: Initializing machine ID from VM UUID.
Sep 13 01:17:01.576292 zram_generator::config[1077]: No configuration found.
Sep 13 01:17:01.576319 systemd[1]: Populated /etc with preset unit settings.
Sep 13 01:17:01.576340 systemd[1]: Queued start job for default target multi-user.target.
Sep 13 01:17:01.576366 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 13 01:17:01.576389 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 13 01:17:01.577478 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 13 01:17:01.577516 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 13 01:17:01.577536 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 13 01:17:01.577555 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 13 01:17:01.577583 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 13 01:17:01.577603 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 13 01:17:01.577624 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 13 01:17:01.577643 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 13 01:17:01.577664 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 13 01:17:01.577699 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 13 01:17:01.577722 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 13 01:17:01.577741 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 13 01:17:01.577761 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 13 01:17:01.577779 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 13 01:17:01.577798 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 13 01:17:01.577817 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 13 01:17:01.577836 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 13 01:17:01.577868 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 13 01:17:01.577889 systemd[1]: Reached target slices.target - Slice Units.
Sep 13 01:17:01.577908 systemd[1]: Reached target swap.target - Swaps.
Sep 13 01:17:01.577940 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 13 01:17:01.577959 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 13 01:17:01.578011 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 13 01:17:01.578049 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 13 01:17:01.578070 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 13 01:17:01.578098 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 13 01:17:01.578126 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 13 01:17:01.578160 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 13 01:17:01.578180 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 13 01:17:01.578201 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 13 01:17:01.578233 systemd[1]: Mounting media.mount - External Media Directory...
Sep 13 01:17:01.578254 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:17:01.578274 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 13 01:17:01.578294 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 13 01:17:01.578314 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 13 01:17:01.578333 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 13 01:17:01.578353 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 01:17:01.578372 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 13 01:17:01.578391 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 13 01:17:01.578422 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 01:17:01.579536 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 13 01:17:01.579564 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 13 01:17:01.579597 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 13 01:17:01.579616 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 13 01:17:01.579662 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 13 01:17:01.579727 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling.
Sep 13 01:17:01.579757 systemd[1]: systemd-journald.service: (This warning is only shown for the first unit using IP firewalling.)
Sep 13 01:17:01.579791 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 13 01:17:01.579812 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 13 01:17:01.579868 systemd-journald[1181]: Collecting audit messages is disabled.
Sep 13 01:17:01.579906 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 13 01:17:01.579927 systemd-journald[1181]: Journal started
Sep 13 01:17:01.579963 systemd-journald[1181]: Runtime Journal (/run/log/journal/dc4728508d8a4c41a3c2d3845d5d9520) is 4.7M, max 38.0M, 33.2M free.
Sep 13 01:17:01.588459 kernel: fuse: init (API version 7.39)
Sep 13 01:17:01.593457 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 13 01:17:01.622721 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 13 01:17:01.622839 kernel: loop: module loaded
Sep 13 01:17:01.630597 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:17:01.660507 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 13 01:17:01.657524 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 13 01:17:01.658381 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 13 01:17:01.660596 systemd[1]: Mounted media.mount - External Media Directory.
Sep 13 01:17:01.661561 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 13 01:17:01.663462 kernel: ACPI: bus type drm_connector registered
Sep 13 01:17:01.663703 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 13 01:17:01.664580 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 13 01:17:01.665822 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 13 01:17:01.667978 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 13 01:17:01.669209 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 13 01:17:01.669462 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 13 01:17:01.670908 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 01:17:01.671157 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 01:17:01.672299 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 13 01:17:01.672587 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 13 01:17:01.673646 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 13 01:17:01.673869 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 13 01:17:01.674992 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 13 01:17:01.675227 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 13 01:17:01.676344 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 13 01:17:01.680717 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 13 01:17:01.681843 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 13 01:17:01.684289 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 13 01:17:01.685611 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 13 01:17:01.699244 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 13 01:17:01.705568 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 13 01:17:01.712539 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 13 01:17:01.713508 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 13 01:17:01.732688 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 13 01:17:01.738643 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 13 01:17:01.739458 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 13 01:17:01.752645 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 13 01:17:01.753586 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 13 01:17:01.759613 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 13 01:17:01.771083 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 13 01:17:01.780590 systemd-journald[1181]: Time spent on flushing to /var/log/journal/dc4728508d8a4c41a3c2d3845d5d9520 is 61.159ms for 1126 entries.
Sep 13 01:17:01.780590 systemd-journald[1181]: System Journal (/var/log/journal/dc4728508d8a4c41a3c2d3845d5d9520) is 8.0M, max 584.8M, 576.8M free.
Sep 13 01:17:01.866543 systemd-journald[1181]: Received client request to flush runtime journal.
Sep 13 01:17:01.779565 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 13 01:17:01.784581 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 13 01:17:01.785743 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 13 01:17:01.791815 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 13 01:17:01.848962 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 13 01:17:01.872530 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 13 01:17:01.884376 systemd-tmpfiles[1229]: ACLs are not supported, ignoring.
Sep 13 01:17:01.884403 systemd-tmpfiles[1229]: ACLs are not supported, ignoring.
Sep 13 01:17:01.902590 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 13 01:17:01.913674 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 13 01:17:01.954070 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 13 01:17:01.964669 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Sep 13 01:17:01.985200 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 13 01:17:01.996801 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 13 01:17:02.003693 udevadm[1248]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Sep 13 01:17:02.026646 systemd-tmpfiles[1251]: ACLs are not supported, ignoring.
Sep 13 01:17:02.026671 systemd-tmpfiles[1251]: ACLs are not supported, ignoring.
Sep 13 01:17:02.035127 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 13 01:17:02.472875 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 13 01:17:02.484649 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 13 01:17:02.516493 systemd-udevd[1257]: Using default interface naming scheme 'v255'.
Sep 13 01:17:02.543701 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 13 01:17:02.555738 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 13 01:17:02.584592 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 13 01:17:02.665673 systemd[1]: Found device dev-ttyS0.device - /dev/ttyS0.
Sep 13 01:17:02.706467 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1266)
Sep 13 01:17:02.715847 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 13 01:17:02.766468 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 13 01:17:02.792485 kernel: ACPI: button: Power Button [PWRF]
Sep 13 01:17:02.843628 systemd-networkd[1265]: lo: Link UP
Sep 13 01:17:02.844728 kernel: mousedev: PS/2 mouse device common for all mice
Sep 13 01:17:02.843641 systemd-networkd[1265]: lo: Gained carrier
Sep 13 01:17:02.846183 systemd-networkd[1265]: Enumeration completed
Sep 13 01:17:02.846334 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 13 01:17:02.849070 systemd-networkd[1265]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 13 01:17:02.849083 systemd-networkd[1265]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 13 01:17:02.851371 systemd-networkd[1265]: eth0: Link UP
Sep 13 01:17:02.851384 systemd-networkd[1265]: eth0: Gained carrier
Sep 13 01:17:02.851400 systemd-networkd[1265]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 13 01:17:02.855711 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 13 01:17:02.865535 systemd-networkd[1265]: eth0: DHCPv4 address 10.230.52.250/30, gateway 10.230.52.249 acquired from 10.230.52.249
Sep 13 01:17:02.874381 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 13 01:17:02.877713 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Sep 13 01:17:02.878012 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 13 01:17:02.904459 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Sep 13 01:17:02.958419 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 13 01:17:02.968720 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 13 01:17:03.122976 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Sep 13 01:17:03.134721 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Sep 13 01:17:03.205057 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 01:17:03.223138 lvm[1295]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 13 01:17:03.258911 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Sep 13 01:17:03.260580 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 13 01:17:03.266639 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Sep 13 01:17:03.274569 lvm[1300]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 13 01:17:03.306903 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Sep 13 01:17:03.308508 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 13 01:17:03.309414 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 13 01:17:03.309595 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 13 01:17:03.310258 systemd[1]: Reached target machines.target - Containers.
Sep 13 01:17:03.312528 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Sep 13 01:17:03.323710 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 13 01:17:03.328611 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 13 01:17:03.329572 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 01:17:03.330716 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 13 01:17:03.336600 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Sep 13 01:17:03.344608 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 13 01:17:03.354695 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 13 01:17:03.375811 kernel: loop0: detected capacity change from 0 to 221472
Sep 13 01:17:03.381520 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 13 01:17:03.386174 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 13 01:17:03.390053 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Sep 13 01:17:03.407764 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 13 01:17:03.431478 kernel: loop1: detected capacity change from 0 to 140768
Sep 13 01:17:03.477464 kernel: loop2: detected capacity change from 0 to 8
Sep 13 01:17:03.511619 kernel: loop3: detected capacity change from 0 to 142488
Sep 13 01:17:03.572212 kernel: loop4: detected capacity change from 0 to 221472
Sep 13 01:17:03.594696 kernel: loop5: detected capacity change from 0 to 140768
Sep 13 01:17:03.614463 kernel: loop6: detected capacity change from 0 to 8
Sep 13 01:17:03.621484 kernel: loop7: detected capacity change from 0 to 142488
Sep 13 01:17:03.645974 (sd-merge)[1322]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Sep 13 01:17:03.646888 (sd-merge)[1322]: Merged extensions into '/usr'.
Sep 13 01:17:03.654568 systemd[1]: Reloading requested from client PID 1308 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 13 01:17:03.654598 systemd[1]: Reloading...
Sep 13 01:17:03.747486 zram_generator::config[1347]: No configuration found.
Sep 13 01:17:03.950532 ldconfig[1305]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 13 01:17:03.997104 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 01:17:04.087728 systemd[1]: Reloading finished in 432 ms.
Sep 13 01:17:04.113096 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 13 01:17:04.116043 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 13 01:17:04.128740 systemd[1]: Starting ensure-sysext.service...
Sep 13 01:17:04.131642 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 13 01:17:04.151576 systemd[1]: Reloading requested from client PID 1413 ('systemctl') (unit ensure-sysext.service)...
Sep 13 01:17:04.151618 systemd[1]: Reloading...
Sep 13 01:17:04.180885 systemd-tmpfiles[1414]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 13 01:17:04.181464 systemd-tmpfiles[1414]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 13 01:17:04.184905 systemd-tmpfiles[1414]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 13 01:17:04.185329 systemd-tmpfiles[1414]: ACLs are not supported, ignoring.
Sep 13 01:17:04.185533 systemd-tmpfiles[1414]: ACLs are not supported, ignoring.
Sep 13 01:17:04.193231 systemd-tmpfiles[1414]: Detected autofs mount point /boot during canonicalization of boot.
Sep 13 01:17:04.193250 systemd-tmpfiles[1414]: Skipping /boot
Sep 13 01:17:04.211941 systemd-tmpfiles[1414]: Detected autofs mount point /boot during canonicalization of boot.
Sep 13 01:17:04.211962 systemd-tmpfiles[1414]: Skipping /boot
Sep 13 01:17:04.259496 zram_generator::config[1443]: No configuration found.
Sep 13 01:17:04.433757 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 01:17:04.521577 systemd[1]: Reloading finished in 369 ms.
Sep 13 01:17:04.559272 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 13 01:17:04.566343 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 13 01:17:04.580876 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 13 01:17:04.584671 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 13 01:17:04.594613 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 13 01:17:04.606672 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 13 01:17:04.624090 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:17:04.624402 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 01:17:04.630330 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 01:17:04.641666 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 13 01:17:04.649735 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 13 01:17:04.651224 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 01:17:04.652590 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:17:04.664143 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 01:17:04.664381 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 01:17:04.667636 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:17:04.668254 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 01:17:04.670110 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 01:17:04.670864 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:17:04.674679 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 13 01:17:04.688595 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 13 01:17:04.688865 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 13 01:17:04.693086 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 13 01:17:04.693393 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 13 01:17:04.699293 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:17:04.699728 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 01:17:04.706801 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 01:17:04.719830 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 13 01:17:04.720763 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 01:17:04.721059 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 13 01:17:04.723622 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 01:17:04.725938 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 13 01:17:04.741741 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 13 01:17:04.742924 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 01:17:04.743192 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 01:17:04.743754 systemd-networkd[1265]: eth0: Gained IPv6LL
Sep 13 01:17:04.751752 systemd[1]: Finished ensure-sysext.service.
Sep 13 01:17:04.762808 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 13 01:17:04.767505 augenrules[1550]: No rules
Sep 13 01:17:04.765258 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 13 01:17:04.770135 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 13 01:17:04.771343 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 13 01:17:04.773796 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 13 01:17:04.787004 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 13 01:17:04.788789 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 13 01:17:04.798851 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 13 01:17:04.800532 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 13 01:17:04.820878 systemd-resolved[1511]: Positive Trust Anchors:
Sep 13 01:17:04.821404 systemd-resolved[1511]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 13 01:17:04.821556 systemd-resolved[1511]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 13 01:17:04.828998 systemd-resolved[1511]: Using system hostname 'srv-qlx5f.gb1.brightbox.com'.
Sep 13 01:17:04.832324 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 13 01:17:04.836947 systemd[1]: Reached target network.target - Network.
Sep 13 01:17:04.837628 systemd[1]: Reached target network-online.target - Network is Online.
Sep 13 01:17:04.838311 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 13 01:17:04.883285 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 13 01:17:04.885078 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 13 01:17:04.885937 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 13 01:17:04.886802 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 13 01:17:04.887680 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 13 01:17:04.888460 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 13 01:17:04.888513 systemd[1]: Reached target paths.target - Path Units.
Sep 13 01:17:04.889132 systemd[1]: Reached target time-set.target - System Time Set.
Sep 13 01:17:04.890111 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 13 01:17:04.890975 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 13 01:17:04.891749 systemd[1]: Reached target timers.target - Timer Units.
Sep 13 01:17:04.893839 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 13 01:17:04.897125 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 13 01:17:04.900278 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 13 01:17:04.902626 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 13 01:17:04.903357 systemd[1]: Reached target sockets.target - Socket Units.
Sep 13 01:17:04.904019 systemd[1]: Reached target basic.target - Basic System.
Sep 13 01:17:04.904947 systemd[1]: System is tainted: cgroupsv1
Sep 13 01:17:04.905008 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 13 01:17:04.905060 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 13 01:17:04.908598 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 13 01:17:04.913630 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 13 01:17:04.921680 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 13 01:17:04.925554 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 13 01:17:04.932649 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 13 01:17:04.933733 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 13 01:17:04.945563 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:17:04.954273 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 13 01:17:04.962646 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 13 01:17:04.973530 jq[1573]: false
Sep 13 01:17:04.977583 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 13 01:17:04.977414 dbus-daemon[1572]: [system] SELinux support is enabled
Sep 13 01:17:04.993939 dbus-daemon[1572]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1265 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Sep 13 01:17:04.995620 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 13 01:17:05.003147 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 13 01:17:05.008216 extend-filesystems[1576]: Found loop4
Sep 13 01:17:05.008216 extend-filesystems[1576]: Found loop5
Sep 13 01:17:05.008216 extend-filesystems[1576]: Found loop6
Sep 13 01:17:05.008216 extend-filesystems[1576]: Found loop7
Sep 13 01:17:05.008216 extend-filesystems[1576]: Found vda
Sep 13 01:17:05.008216 extend-filesystems[1576]: Found vda1
Sep 13 01:17:05.008216 extend-filesystems[1576]: Found vda2
Sep 13 01:17:05.020530 extend-filesystems[1576]: Found vda3
Sep 13 01:17:05.020530 extend-filesystems[1576]: Found usr
Sep 13 01:17:05.020530 extend-filesystems[1576]: Found vda4
Sep 13 01:17:05.020530 extend-filesystems[1576]: Found vda6
Sep 13 01:17:05.020530 extend-filesystems[1576]: Found vda7
Sep 13 01:17:05.020530 extend-filesystems[1576]: Found vda9
Sep 13 01:17:05.020530 extend-filesystems[1576]: Checking size of /dev/vda9
Sep 13 01:17:05.024718 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 13 01:17:05.031238 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 13 01:17:05.038105 systemd[1]: Starting update-engine.service - Update Engine...
Sep 13 01:17:05.049533 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 13 01:17:05.054306 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 13 01:17:05.073839 update_engine[1599]: I20250913 01:17:05.073720 1599 main.cc:92] Flatcar Update Engine starting
Sep 13 01:17:05.076520 update_engine[1599]: I20250913 01:17:05.075942 1599 update_check_scheduler.cc:74] Next update check in 9m38s
Sep 13 01:17:05.076905 extend-filesystems[1576]: Resized partition /dev/vda9
Sep 13 01:17:05.079002 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 13 01:17:05.079390 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 13 01:17:05.086044 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 13 01:17:05.086405 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 13 01:17:05.087361 extend-filesystems[1604]: resize2fs 1.47.1 (20-May-2024)
Sep 13 01:17:05.097528 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 13 01:17:05.108848 systemd[1]: motdgen.service: Deactivated successfully.
Sep 13 01:17:05.109217 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 13 01:17:05.115458 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Sep 13 01:17:05.118252 jq[1600]: true Sep 13 01:17:05.145098 (ntainerd)[1615]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 13 01:17:05.175456 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1261) Sep 13 01:17:05.172139 dbus-daemon[1572]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 13 01:17:05.171124 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 13 01:17:05.171267 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 13 01:17:05.172958 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 13 01:17:05.172991 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 13 01:17:05.176315 systemd[1]: Started update-engine.service - Update Engine. Sep 13 01:17:05.190682 jq[1618]: true Sep 13 01:17:05.198633 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 13 01:17:05.201278 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 13 01:17:05.237951 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 13 01:17:05.735027 systemd-timesyncd[1565]: Contacted time server 176.58.127.131:123 (0.flatcar.pool.ntp.org). Sep 13 01:17:05.735141 systemd-resolved[1511]: Clock change detected. Flushing caches. 
Sep 13 01:17:05.735889 systemd-timesyncd[1565]: Initial clock synchronization to Sat 2025-09-13 01:17:05.734030 UTC. Sep 13 01:17:05.771309 tar[1609]: linux-amd64/helm Sep 13 01:17:05.884540 systemd-logind[1593]: Watching system buttons on /dev/input/event2 (Power Button) Sep 13 01:17:05.888061 systemd-logind[1593]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 13 01:17:05.890154 systemd-logind[1593]: New seat seat0. Sep 13 01:17:05.894934 systemd[1]: Started systemd-logind.service - User Login Management. Sep 13 01:17:05.957489 bash[1652]: Updated "/home/core/.ssh/authorized_keys" Sep 13 01:17:05.954729 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 13 01:17:05.967383 systemd[1]: Starting sshkeys.service... Sep 13 01:17:05.979756 containerd[1615]: time="2025-09-13T01:17:05.979556199Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Sep 13 01:17:06.030001 containerd[1615]: time="2025-09-13T01:17:06.029560584Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Sep 13 01:17:06.036028 containerd[1615]: time="2025-09-13T01:17:06.035684300Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.106-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Sep 13 01:17:06.036028 containerd[1615]: time="2025-09-13T01:17:06.035725379Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Sep 13 01:17:06.036028 containerd[1615]: time="2025-09-13T01:17:06.035748664Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." 
type=io.containerd.internal.v1 Sep 13 01:17:06.036260 containerd[1615]: time="2025-09-13T01:17:06.036008763Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Sep 13 01:17:06.036353 containerd[1615]: time="2025-09-13T01:17:06.036329640Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Sep 13 01:17:06.036556 containerd[1615]: time="2025-09-13T01:17:06.036527490Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Sep 13 01:17:06.036676 containerd[1615]: time="2025-09-13T01:17:06.036653863Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Sep 13 01:17:06.040997 containerd[1615]: time="2025-09-13T01:17:06.040073010Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 13 01:17:06.040997 containerd[1615]: time="2025-09-13T01:17:06.040116136Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Sep 13 01:17:06.040997 containerd[1615]: time="2025-09-13T01:17:06.040137215Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Sep 13 01:17:06.040997 containerd[1615]: time="2025-09-13T01:17:06.040152795Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Sep 13 01:17:06.040997 containerd[1615]: time="2025-09-13T01:17:06.040286857Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." 
type=io.containerd.snapshotter.v1 Sep 13 01:17:06.040997 containerd[1615]: time="2025-09-13T01:17:06.040669175Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Sep 13 01:17:06.040997 containerd[1615]: time="2025-09-13T01:17:06.040844703Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 13 01:17:06.040997 containerd[1615]: time="2025-09-13T01:17:06.040867200Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Sep 13 01:17:06.041360 containerd[1615]: time="2025-09-13T01:17:06.041333569Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Sep 13 01:17:06.041521 containerd[1615]: time="2025-09-13T01:17:06.041495941Z" level=info msg="metadata content store policy set" policy=shared Sep 13 01:17:06.049133 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 13 01:17:06.057436 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 13 01:17:06.082500 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Sep 13 01:17:06.102005 extend-filesystems[1604]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 13 01:17:06.102005 extend-filesystems[1604]: old_desc_blocks = 1, new_desc_blocks = 8 Sep 13 01:17:06.102005 extend-filesystems[1604]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Sep 13 01:17:06.101687 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.107192306Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." 
type=io.containerd.gc.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.107350929Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.107405976Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.107436381Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.107465274Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.107716611Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.116620638Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.116926848Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.116974601Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.117025491Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.117048063Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." 
type=io.containerd.service.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.117085887Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Sep 13 01:17:06.120509 containerd[1615]: time="2025-09-13T01:17:06.117108764Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Sep 13 01:17:06.120965 extend-filesystems[1576]: Resized filesystem in /dev/vda9 Sep 13 01:17:06.105297 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.117129211Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121330900Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121376976Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121399343Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121421532Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121483435Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121505905Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." 
type=io.containerd.grpc.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121545079Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121581369Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121638861Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.121664114Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.122023774Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.122055022Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.124994 containerd[1615]: time="2025-09-13T01:17:06.123302308Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.125512 containerd[1615]: time="2025-09-13T01:17:06.123936812Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.125512 containerd[1615]: time="2025-09-13T01:17:06.123970625Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.125512 containerd[1615]: time="2025-09-13T01:17:06.124032149Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.125512 containerd[1615]: time="2025-09-13T01:17:06.124056700Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." 
type=io.containerd.grpc.v1 Sep 13 01:17:06.125512 containerd[1615]: time="2025-09-13T01:17:06.124114872Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Sep 13 01:17:06.125512 containerd[1615]: time="2025-09-13T01:17:06.124227014Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.125512 containerd[1615]: time="2025-09-13T01:17:06.124252447Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.125512 containerd[1615]: time="2025-09-13T01:17:06.124270635Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Sep 13 01:17:06.131220 containerd[1615]: time="2025-09-13T01:17:06.131105779Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Sep 13 01:17:06.131220 containerd[1615]: time="2025-09-13T01:17:06.131154982Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Sep 13 01:17:06.131220 containerd[1615]: time="2025-09-13T01:17:06.131196615Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Sep 13 01:17:06.131220 containerd[1615]: time="2025-09-13T01:17:06.131266849Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Sep 13 01:17:06.131435 containerd[1615]: time="2025-09-13T01:17:06.131287765Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.131435 containerd[1615]: time="2025-09-13T01:17:06.131318165Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." 
type=io.containerd.nri.v1 Sep 13 01:17:06.131435 containerd[1615]: time="2025-09-13T01:17:06.131361098Z" level=info msg="NRI interface is disabled by configuration." Sep 13 01:17:06.131435 containerd[1615]: time="2025-09-13T01:17:06.131381369Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Sep 13 01:17:06.133064 containerd[1615]: time="2025-09-13T01:17:06.131891338Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false 
SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Sep 13 01:17:06.133064 containerd[1615]: time="2025-09-13T01:17:06.132015239Z" level=info msg="Connect containerd service" Sep 13 01:17:06.133064 containerd[1615]: time="2025-09-13T01:17:06.132079452Z" level=info msg="using legacy CRI server" Sep 13 01:17:06.133064 containerd[1615]: time="2025-09-13T01:17:06.132095229Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 13 01:17:06.133064 containerd[1615]: time="2025-09-13T01:17:06.132263465Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Sep 13 01:17:06.136135 containerd[1615]: time="2025-09-13T01:17:06.136098834Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 13 01:17:06.137578 containerd[1615]: time="2025-09-13T01:17:06.136331197Z" level=info msg="Start subscribing containerd event" Sep 13 
01:17:06.137578 containerd[1615]: time="2025-09-13T01:17:06.136422391Z" level=info msg="Start recovering state" Sep 13 01:17:06.137578 containerd[1615]: time="2025-09-13T01:17:06.136523139Z" level=info msg="Start event monitor" Sep 13 01:17:06.137578 containerd[1615]: time="2025-09-13T01:17:06.136555657Z" level=info msg="Start snapshots syncer" Sep 13 01:17:06.137578 containerd[1615]: time="2025-09-13T01:17:06.136582354Z" level=info msg="Start cni network conf syncer for default" Sep 13 01:17:06.137578 containerd[1615]: time="2025-09-13T01:17:06.136595097Z" level=info msg="Start streaming server" Sep 13 01:17:06.139721 containerd[1615]: time="2025-09-13T01:17:06.139690251Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 13 01:17:06.139806 containerd[1615]: time="2025-09-13T01:17:06.139784155Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 13 01:17:06.143855 systemd[1]: Started containerd.service - containerd container runtime. Sep 13 01:17:06.146071 containerd[1615]: time="2025-09-13T01:17:06.146040550Z" level=info msg="containerd successfully booted in 0.168103s" Sep 13 01:17:06.221374 dbus-daemon[1572]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 13 01:17:06.221617 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 13 01:17:06.225546 dbus-daemon[1572]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=1632 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 13 01:17:06.239355 systemd[1]: Starting polkit.service - Authorization Manager... 
Sep 13 01:17:06.259044 locksmithd[1633]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 13 01:17:06.273131 polkitd[1671]: Started polkitd version 121 Sep 13 01:17:06.289446 polkitd[1671]: Loading rules from directory /etc/polkit-1/rules.d Sep 13 01:17:06.293111 polkitd[1671]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 13 01:17:06.296251 polkitd[1671]: Finished loading, compiling and executing 2 rules Sep 13 01:17:06.296847 dbus-daemon[1572]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 13 01:17:06.297135 systemd[1]: Started polkit.service - Authorization Manager. Sep 13 01:17:06.299724 polkitd[1671]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 13 01:17:06.332912 systemd-hostnamed[1632]: Hostname set to (static) Sep 13 01:17:06.342956 systemd-networkd[1265]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8d3e:24:19ff:fee6:34fa/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8d3e:24:19ff:fee6:34fa/64 assigned by NDisc. Sep 13 01:17:06.342968 systemd-networkd[1265]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Sep 13 01:17:07.029925 sshd_keygen[1617]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 13 01:17:07.065622 tar[1609]: linux-amd64/LICENSE Sep 13 01:17:07.065622 tar[1609]: linux-amd64/README.md Sep 13 01:17:07.089604 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 13 01:17:07.108094 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 13 01:17:07.113322 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 13 01:17:07.122524 systemd[1]: issuegen.service: Deactivated successfully. Sep 13 01:17:07.122862 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 13 01:17:07.129548 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Sep 13 01:17:07.157603 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 13 01:17:07.181346 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 13 01:17:07.186437 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 13 01:17:07.188078 systemd[1]: Reached target getty.target - Login Prompts. Sep 13 01:17:07.200291 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 01:17:07.201076 (kubelet)[1714]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 01:17:07.834148 kubelet[1714]: E0913 01:17:07.834086 1714 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 01:17:07.836447 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 01:17:07.836839 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 01:17:09.210967 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 13 01:17:09.220444 systemd[1]: Started sshd@0-10.230.52.250:22-139.178.68.195:38066.service - OpenSSH per-connection server daemon (139.178.68.195:38066). Sep 13 01:17:10.109889 sshd[1725]: Accepted publickey for core from 139.178.68.195 port 38066 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:17:10.112144 sshd[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:17:10.127015 systemd-logind[1593]: New session 1 of user core. Sep 13 01:17:10.129236 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 13 01:17:10.136477 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Sep 13 01:17:10.166134 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 13 01:17:10.175444 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 13 01:17:10.182774 (systemd)[1731]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 13 01:17:10.322560 systemd[1731]: Queued start job for default target default.target. Sep 13 01:17:10.324055 systemd[1731]: Created slice app.slice - User Application Slice. Sep 13 01:17:10.324096 systemd[1731]: Reached target paths.target - Paths. Sep 13 01:17:10.324228 systemd[1731]: Reached target timers.target - Timers. Sep 13 01:17:10.332145 systemd[1731]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 13 01:17:10.340690 systemd[1731]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 13 01:17:10.340768 systemd[1731]: Reached target sockets.target - Sockets. Sep 13 01:17:10.340791 systemd[1731]: Reached target basic.target - Basic System. Sep 13 01:17:10.340850 systemd[1731]: Reached target default.target - Main User Target. Sep 13 01:17:10.340908 systemd[1731]: Startup finished in 149ms. Sep 13 01:17:10.341312 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 13 01:17:10.351621 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 13 01:17:10.978467 systemd[1]: Started sshd@1-10.230.52.250:22-139.178.68.195:44796.service - OpenSSH per-connection server daemon (139.178.68.195:44796). Sep 13 01:17:11.861658 sshd[1743]: Accepted publickey for core from 139.178.68.195 port 44796 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:17:11.863521 sshd[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:17:11.870314 systemd-logind[1593]: New session 2 of user core. Sep 13 01:17:11.880718 systemd[1]: Started session-2.scope - Session 2 of User core. 
Sep 13 01:17:12.216911 login[1713]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 13 01:17:12.224227 systemd-logind[1593]: New session 3 of user core. Sep 13 01:17:12.234927 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 13 01:17:12.240118 login[1711]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 13 01:17:12.252238 systemd-logind[1593]: New session 4 of user core. Sep 13 01:17:12.261047 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 13 01:17:12.486488 sshd[1743]: pam_unix(sshd:session): session closed for user core Sep 13 01:17:12.491944 systemd[1]: sshd@1-10.230.52.250:22-139.178.68.195:44796.service: Deactivated successfully. Sep 13 01:17:12.497249 systemd[1]: session-2.scope: Deactivated successfully. Sep 13 01:17:12.498494 systemd-logind[1593]: Session 2 logged out. Waiting for processes to exit. Sep 13 01:17:12.500603 systemd-logind[1593]: Removed session 2. Sep 13 01:17:12.513697 coreos-metadata[1570]: Sep 13 01:17:12.513 WARN failed to locate config-drive, using the metadata service API instead Sep 13 01:17:12.539549 coreos-metadata[1570]: Sep 13 01:17:12.539 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Sep 13 01:17:12.547226 coreos-metadata[1570]: Sep 13 01:17:12.547 INFO Fetch failed with 404: resource not found Sep 13 01:17:12.547226 coreos-metadata[1570]: Sep 13 01:17:12.547 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 13 01:17:12.548631 coreos-metadata[1570]: Sep 13 01:17:12.548 INFO Fetch successful Sep 13 01:17:12.548782 coreos-metadata[1570]: Sep 13 01:17:12.548 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Sep 13 01:17:12.566287 coreos-metadata[1570]: Sep 13 01:17:12.566 INFO Fetch successful Sep 13 01:17:12.566287 coreos-metadata[1570]: Sep 13 01:17:12.566 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt 
#1 Sep 13 01:17:12.580199 coreos-metadata[1570]: Sep 13 01:17:12.580 INFO Fetch successful Sep 13 01:17:12.580199 coreos-metadata[1570]: Sep 13 01:17:12.580 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Sep 13 01:17:12.597473 coreos-metadata[1570]: Sep 13 01:17:12.597 INFO Fetch successful Sep 13 01:17:12.597473 coreos-metadata[1570]: Sep 13 01:17:12.597 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Sep 13 01:17:12.618602 coreos-metadata[1570]: Sep 13 01:17:12.618 INFO Fetch successful Sep 13 01:17:12.639458 systemd[1]: Started sshd@2-10.230.52.250:22-139.178.68.195:44798.service - OpenSSH per-connection server daemon (139.178.68.195:44798). Sep 13 01:17:12.656247 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 13 01:17:12.659679 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 13 01:17:13.195293 coreos-metadata[1659]: Sep 13 01:17:13.195 WARN failed to locate config-drive, using the metadata service API instead Sep 13 01:17:13.217025 coreos-metadata[1659]: Sep 13 01:17:13.216 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Sep 13 01:17:13.262905 coreos-metadata[1659]: Sep 13 01:17:13.262 INFO Fetch successful Sep 13 01:17:13.263159 coreos-metadata[1659]: Sep 13 01:17:13.263 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 13 01:17:13.308850 coreos-metadata[1659]: Sep 13 01:17:13.308 INFO Fetch successful Sep 13 01:17:13.311140 unknown[1659]: wrote ssh authorized keys file for user: core Sep 13 01:17:13.329229 update-ssh-keys[1793]: Updated "/home/core/.ssh/authorized_keys" Sep 13 01:17:13.330028 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 13 01:17:13.338259 systemd[1]: Finished sshkeys.service. 
Sep 13 01:17:13.340814 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 13 01:17:13.341312 systemd[1]: Startup finished in 17.048s (kernel) + 12.411s (userspace) = 29.460s. Sep 13 01:17:13.534742 sshd[1782]: Accepted publickey for core from 139.178.68.195 port 44798 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ Sep 13 01:17:13.536762 sshd[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 01:17:13.543722 systemd-logind[1593]: New session 5 of user core. Sep 13 01:17:13.550461 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 13 01:17:14.152241 sshd[1782]: pam_unix(sshd:session): session closed for user core Sep 13 01:17:14.156955 systemd[1]: sshd@2-10.230.52.250:22-139.178.68.195:44798.service: Deactivated successfully. Sep 13 01:17:14.159931 systemd-logind[1593]: Session 5 logged out. Waiting for processes to exit. Sep 13 01:17:14.161231 systemd[1]: session-5.scope: Deactivated successfully. Sep 13 01:17:14.161936 systemd-logind[1593]: Removed session 5. Sep 13 01:17:17.857159 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 13 01:17:17.870333 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 01:17:18.039201 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 13 01:17:18.047621 (kubelet)[1817]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:17:18.156418 kubelet[1817]: E0913 01:17:18.156224 1817 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:17:18.160114 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:17:18.160408 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:17:24.307584 systemd[1]: Started sshd@3-10.230.52.250:22-139.178.68.195:48314.service - OpenSSH per-connection server daemon (139.178.68.195:48314).
Sep 13 01:17:25.201570 sshd[1825]: Accepted publickey for core from 139.178.68.195 port 48314 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:17:25.205057 sshd[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:17:25.210899 systemd-logind[1593]: New session 6 of user core.
Sep 13 01:17:25.220492 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 13 01:17:25.819186 sshd[1825]: pam_unix(sshd:session): session closed for user core
Sep 13 01:17:25.824026 systemd[1]: sshd@3-10.230.52.250:22-139.178.68.195:48314.service: Deactivated successfully.
Sep 13 01:17:25.827957 systemd-logind[1593]: Session 6 logged out. Waiting for processes to exit.
Sep 13 01:17:25.829206 systemd[1]: session-6.scope: Deactivated successfully.
Sep 13 01:17:25.830428 systemd-logind[1593]: Removed session 6.
Sep 13 01:17:25.972361 systemd[1]: Started sshd@4-10.230.52.250:22-139.178.68.195:48322.service - OpenSSH per-connection server daemon (139.178.68.195:48322).
Sep 13 01:17:26.855434 sshd[1833]: Accepted publickey for core from 139.178.68.195 port 48322 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:17:26.857561 sshd[1833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:17:26.864233 systemd-logind[1593]: New session 7 of user core.
Sep 13 01:17:26.876574 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 13 01:17:27.472404 sshd[1833]: pam_unix(sshd:session): session closed for user core
Sep 13 01:17:27.478932 systemd[1]: sshd@4-10.230.52.250:22-139.178.68.195:48322.service: Deactivated successfully.
Sep 13 01:17:27.482317 systemd[1]: session-7.scope: Deactivated successfully.
Sep 13 01:17:27.483837 systemd-logind[1593]: Session 7 logged out. Waiting for processes to exit.
Sep 13 01:17:27.485432 systemd-logind[1593]: Removed session 7.
Sep 13 01:17:27.621328 systemd[1]: Started sshd@5-10.230.52.250:22-139.178.68.195:48332.service - OpenSSH per-connection server daemon (139.178.68.195:48332).
Sep 13 01:17:28.357049 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 13 01:17:28.369385 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:17:28.507231 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:17:28.511098 (kubelet)[1855]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:17:28.521349 sshd[1841]: Accepted publickey for core from 139.178.68.195 port 48332 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:17:28.524611 sshd[1841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:17:28.535110 systemd-logind[1593]: New session 8 of user core.
Sep 13 01:17:28.538513 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 13 01:17:28.622675 kubelet[1855]: E0913 01:17:28.622521 1855 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:17:28.626353 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:17:28.626805 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:17:29.140040 sshd[1841]: pam_unix(sshd:session): session closed for user core
Sep 13 01:17:29.144048 systemd-logind[1593]: Session 8 logged out. Waiting for processes to exit.
Sep 13 01:17:29.146555 systemd[1]: sshd@5-10.230.52.250:22-139.178.68.195:48332.service: Deactivated successfully.
Sep 13 01:17:29.149490 systemd[1]: session-8.scope: Deactivated successfully.
Sep 13 01:17:29.150690 systemd-logind[1593]: Removed session 8.
Sep 13 01:17:29.293402 systemd[1]: Started sshd@6-10.230.52.250:22-139.178.68.195:48348.service - OpenSSH per-connection server daemon (139.178.68.195:48348).
Sep 13 01:17:30.176401 sshd[1869]: Accepted publickey for core from 139.178.68.195 port 48348 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:17:30.178405 sshd[1869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:17:30.185159 systemd-logind[1593]: New session 9 of user core.
Sep 13 01:17:30.197865 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 13 01:17:30.664866 sudo[1873]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 13 01:17:30.665378 sudo[1873]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 13 01:17:30.679383 sudo[1873]: pam_unix(sudo:session): session closed for user root
Sep 13 01:17:30.824528 sshd[1869]: pam_unix(sshd:session): session closed for user core
Sep 13 01:17:30.831737 systemd[1]: sshd@6-10.230.52.250:22-139.178.68.195:48348.service: Deactivated successfully.
Sep 13 01:17:30.836255 systemd[1]: session-9.scope: Deactivated successfully.
Sep 13 01:17:30.837696 systemd-logind[1593]: Session 9 logged out. Waiting for processes to exit.
Sep 13 01:17:30.839317 systemd-logind[1593]: Removed session 9.
Sep 13 01:17:30.976415 systemd[1]: Started sshd@7-10.230.52.250:22-139.178.68.195:40028.service - OpenSSH per-connection server daemon (139.178.68.195:40028).
Sep 13 01:17:31.850907 sshd[1878]: Accepted publickey for core from 139.178.68.195 port 40028 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:17:31.853219 sshd[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:17:31.860232 systemd-logind[1593]: New session 10 of user core.
Sep 13 01:17:31.870525 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 13 01:17:32.326162 sudo[1883]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 13 01:17:32.327275 sudo[1883]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 13 01:17:32.333185 sudo[1883]: pam_unix(sudo:session): session closed for user root
Sep 13 01:17:32.341342 sudo[1882]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Sep 13 01:17:32.341765 sudo[1882]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 13 01:17:32.360320 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Sep 13 01:17:32.365294 auditctl[1886]: No rules
Sep 13 01:17:32.365820 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 13 01:17:32.366224 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Sep 13 01:17:32.374400 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 13 01:17:32.414186 augenrules[1905]: No rules
Sep 13 01:17:32.415928 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 13 01:17:32.418523 sudo[1882]: pam_unix(sudo:session): session closed for user root
Sep 13 01:17:32.562327 sshd[1878]: pam_unix(sshd:session): session closed for user core
Sep 13 01:17:32.568707 systemd[1]: sshd@7-10.230.52.250:22-139.178.68.195:40028.service: Deactivated successfully.
Sep 13 01:17:32.570180 systemd-logind[1593]: Session 10 logged out. Waiting for processes to exit.
Sep 13 01:17:32.572147 systemd[1]: session-10.scope: Deactivated successfully.
Sep 13 01:17:32.573763 systemd-logind[1593]: Removed session 10.
Sep 13 01:17:32.716452 systemd[1]: Started sshd@8-10.230.52.250:22-139.178.68.195:40032.service - OpenSSH per-connection server daemon (139.178.68.195:40032).
Sep 13 01:17:33.599171 sshd[1914]: Accepted publickey for core from 139.178.68.195 port 40032 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:17:33.601177 sshd[1914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:17:33.607545 systemd-logind[1593]: New session 11 of user core.
Sep 13 01:17:33.618108 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 13 01:17:34.075611 sudo[1918]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 13 01:17:34.076472 sudo[1918]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 13 01:17:34.552570 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 13 01:17:34.552696 (dockerd)[1933]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 13 01:17:34.978242 dockerd[1933]: time="2025-09-13T01:17:34.977705418Z" level=info msg="Starting up"
Sep 13 01:17:35.215434 systemd[1]: var-lib-docker-metacopy\x2dcheck749264032-merged.mount: Deactivated successfully.
Sep 13 01:17:35.236716 dockerd[1933]: time="2025-09-13T01:17:35.235867226Z" level=info msg="Loading containers: start."
Sep 13 01:17:35.394597 kernel: Initializing XFRM netlink socket
Sep 13 01:17:35.499442 systemd-networkd[1265]: docker0: Link UP
Sep 13 01:17:35.526995 dockerd[1933]: time="2025-09-13T01:17:35.526910173Z" level=info msg="Loading containers: done."
Sep 13 01:17:35.544417 dockerd[1933]: time="2025-09-13T01:17:35.544272947Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 13 01:17:35.544613 dockerd[1933]: time="2025-09-13T01:17:35.544426370Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Sep 13 01:17:35.544670 dockerd[1933]: time="2025-09-13T01:17:35.544607117Z" level=info msg="Daemon has completed initialization"
Sep 13 01:17:35.583149 dockerd[1933]: time="2025-09-13T01:17:35.582616714Z" level=info msg="API listen on /run/docker.sock"
Sep 13 01:17:35.583354 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 13 01:17:36.381224 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 13 01:17:36.873486 containerd[1615]: time="2025-09-13T01:17:36.873383301Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\""
Sep 13 01:17:37.665316 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1377410620.mount: Deactivated successfully.
Sep 13 01:17:38.857307 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 13 01:17:38.867640 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:17:39.078812 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:17:39.091775 (kubelet)[2147]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:17:39.176027 kubelet[2147]: E0913 01:17:39.175798 2147 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:17:39.178816 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:17:39.179155 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:17:40.098596 containerd[1615]: time="2025-09-13T01:17:40.098539261Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:40.100119 containerd[1615]: time="2025-09-13T01:17:40.100076236Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117132"
Sep 13 01:17:40.100636 containerd[1615]: time="2025-09-13T01:17:40.100583248Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:40.106353 containerd[1615]: time="2025-09-13T01:17:40.106260766Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:40.108692 containerd[1615]: time="2025-09-13T01:17:40.108619236Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 3.235128553s"
Sep 13 01:17:40.108793 containerd[1615]: time="2025-09-13T01:17:40.108692954Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\""
Sep 13 01:17:40.109763 containerd[1615]: time="2025-09-13T01:17:40.109733329Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 13 01:17:42.361574 containerd[1615]: time="2025-09-13T01:17:42.359927083Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:42.361574 containerd[1615]: time="2025-09-13T01:17:42.361372763Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716640"
Sep 13 01:17:42.361574 containerd[1615]: time="2025-09-13T01:17:42.361494686Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:42.365570 containerd[1615]: time="2025-09-13T01:17:42.365530335Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:42.367318 containerd[1615]: time="2025-09-13T01:17:42.367283104Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 2.257510247s"
Sep 13 01:17:42.367440 containerd[1615]: time="2025-09-13T01:17:42.367414951Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\""
Sep 13 01:17:42.368089 containerd[1615]: time="2025-09-13T01:17:42.368015683Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 13 01:17:44.145872 containerd[1615]: time="2025-09-13T01:17:44.145780457Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:44.147407 containerd[1615]: time="2025-09-13T01:17:44.147352876Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787706"
Sep 13 01:17:44.148231 containerd[1615]: time="2025-09-13T01:17:44.148156368Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:44.155061 containerd[1615]: time="2025-09-13T01:17:44.154675852Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:44.158889 containerd[1615]: time="2025-09-13T01:17:44.158851933Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 1.79045667s"
Sep 13 01:17:44.158972 containerd[1615]: time="2025-09-13T01:17:44.158898407Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\""
Sep 13 01:17:44.159659 containerd[1615]: time="2025-09-13T01:17:44.159627466Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 13 01:17:45.922230 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount423865903.mount: Deactivated successfully.
Sep 13 01:17:46.727013 containerd[1615]: time="2025-09-13T01:17:46.726198564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:46.727708 containerd[1615]: time="2025-09-13T01:17:46.727222233Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410260"
Sep 13 01:17:46.728573 containerd[1615]: time="2025-09-13T01:17:46.728187635Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:46.730846 containerd[1615]: time="2025-09-13T01:17:46.730790569Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:46.731993 containerd[1615]: time="2025-09-13T01:17:46.731875155Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 2.572205241s"
Sep 13 01:17:46.731993 containerd[1615]: time="2025-09-13T01:17:46.731921621Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\""
Sep 13 01:17:46.733550 containerd[1615]: time="2025-09-13T01:17:46.733230464Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 13 01:17:47.368244 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount338119140.mount: Deactivated successfully.
Sep 13 01:17:48.640018 containerd[1615]: time="2025-09-13T01:17:48.638342200Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:48.640018 containerd[1615]: time="2025-09-13T01:17:48.639634099Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249"
Sep 13 01:17:48.640018 containerd[1615]: time="2025-09-13T01:17:48.639953189Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:48.644023 containerd[1615]: time="2025-09-13T01:17:48.643946439Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:48.645790 containerd[1615]: time="2025-09-13T01:17:48.645610246Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.912328092s"
Sep 13 01:17:48.645790 containerd[1615]: time="2025-09-13T01:17:48.645655854Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 13 01:17:48.655796 containerd[1615]: time="2025-09-13T01:17:48.655713327Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 13 01:17:49.228812 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 13 01:17:49.238829 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:17:49.255681 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1497306251.mount: Deactivated successfully.
Sep 13 01:17:49.257366 containerd[1615]: time="2025-09-13T01:17:49.257267020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:49.258420 containerd[1615]: time="2025-09-13T01:17:49.258378151Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
Sep 13 01:17:49.259605 containerd[1615]: time="2025-09-13T01:17:49.259177278Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:49.262845 containerd[1615]: time="2025-09-13T01:17:49.262811921Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:49.265084 containerd[1615]: time="2025-09-13T01:17:49.265051097Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 609.25888ms"
Sep 13 01:17:49.265253 containerd[1615]: time="2025-09-13T01:17:49.265226407Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 13 01:17:49.267618 containerd[1615]: time="2025-09-13T01:17:49.267410277Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 13 01:17:49.409358 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:17:49.424603 (kubelet)[2244]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 01:17:49.485029 kubelet[2244]: E0913 01:17:49.484248 2244 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 01:17:49.489224 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 01:17:49.489564 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 01:17:49.942276 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount681486060.mount: Deactivated successfully.
Sep 13 01:17:50.882034 update_engine[1599]: I20250913 01:17:50.881834 1599 update_attempter.cc:509] Updating boot flags...
Sep 13 01:17:50.935568 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (2307)
Sep 13 01:17:51.043015 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (2310)
Sep 13 01:17:53.907827 containerd[1615]: time="2025-09-13T01:17:53.907751955Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:53.911494 containerd[1615]: time="2025-09-13T01:17:53.911447178Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910717"
Sep 13 01:17:53.912588 containerd[1615]: time="2025-09-13T01:17:53.912555610Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:53.916503 containerd[1615]: time="2025-09-13T01:17:53.916447957Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:17:53.919999 containerd[1615]: time="2025-09-13T01:17:53.918223003Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 4.650774305s"
Sep 13 01:17:53.919999 containerd[1615]: time="2025-09-13T01:17:53.918268965Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Sep 13 01:17:57.300326 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:17:57.310319 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:17:57.352647 systemd[1]: Reloading requested from client PID 2348 ('systemctl') (unit session-11.scope)...
Sep 13 01:17:57.352687 systemd[1]: Reloading...
Sep 13 01:17:57.552149 zram_generator::config[2383]: No configuration found.
Sep 13 01:17:57.736732 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 01:17:57.840299 systemd[1]: Reloading finished in 486 ms.
Sep 13 01:17:57.912722 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:17:57.916153 systemd[1]: kubelet.service: Deactivated successfully.
Sep 13 01:17:57.916573 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:17:57.925792 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:17:58.281224 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:17:58.291621 (kubelet)[2469]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 13 01:17:58.347006 kubelet[2469]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 13 01:17:58.347006 kubelet[2469]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 13 01:17:58.347006 kubelet[2469]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 13 01:17:58.363053 kubelet[2469]: I0913 01:17:58.362296 2469 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 13 01:17:58.693534 kubelet[2469]: I0913 01:17:58.693468 2469 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 13 01:17:58.693534 kubelet[2469]: I0913 01:17:58.693511 2469 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 13 01:17:58.693922 kubelet[2469]: I0913 01:17:58.693889 2469 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 13 01:17:58.723782 kubelet[2469]: E0913 01:17:58.723367 2469 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.52.250:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError"
Sep 13 01:17:58.723782 kubelet[2469]: I0913 01:17:58.723454 2469 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 13 01:17:58.737872 kubelet[2469]: E0913 01:17:58.737738 2469 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 13 01:17:58.737872 kubelet[2469]: I0913 01:17:58.737838 2469 server.go:1408] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 13 01:17:58.745391 kubelet[2469]: I0913 01:17:58.745298 2469 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 13 01:17:58.749088 kubelet[2469]: I0913 01:17:58.749038 2469 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 13 01:17:58.749317 kubelet[2469]: I0913 01:17:58.749253 2469 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 13 01:17:58.749589 kubelet[2469]: I0913 01:17:58.749309 2469 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-qlx5f.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":1}
Sep 13 01:17:58.749853 kubelet[2469]: I0913 01:17:58.749610 2469 topology_manager.go:138] "Creating topology manager with none policy"
Sep 13 01:17:58.749853 kubelet[2469]: I0913 01:17:58.749628 2469 container_manager_linux.go:300] "Creating device plugin manager"
Sep 13 01:17:58.749853 kubelet[2469]: I0913 01:17:58.749832 2469 state_mem.go:36] "Initialized new in-memory state store"
Sep 13 01:17:58.754241 kubelet[2469]: I0913 01:17:58.754167 2469 kubelet.go:408] "Attempting to sync node with API server"
Sep 13 01:17:58.754241 kubelet[2469]: I0913 01:17:58.754217 2469 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 13 01:17:58.754397 kubelet[2469]: I0913 01:17:58.754288 2469 kubelet.go:314] "Adding apiserver pod source"
Sep 13 01:17:58.754397 kubelet[2469]: I0913 01:17:58.754370 2469 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 13 01:17:58.757060 kubelet[2469]: W0913 01:17:58.756927 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.52.250:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-qlx5f.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.52.250:6443: connect: connection refused
Sep 13 01:17:58.757060 kubelet[2469]: E0913 01:17:58.757035 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.52.250:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-qlx5f.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError"
Sep 13 01:17:58.758514 kubelet[2469]: W0913 01:17:58.758476 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.52.250:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.52.250:6443: connect: connection refused
Sep 13 01:17:58.761014 kubelet[2469]: E0913 01:17:58.758665 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.52.250:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError"
Sep 13 01:17:58.761014 kubelet[2469]: I0913 01:17:58.758828 2469 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 13 01:17:58.766186 kubelet[2469]: I0913 01:17:58.766162 2469 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 13 01:17:58.767601 kubelet[2469]: W0913 01:17:58.767141 2469 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 13 01:17:58.769154 kubelet[2469]: I0913 01:17:58.769135 2469 server.go:1274] "Started kubelet"
Sep 13 01:17:58.770626 kubelet[2469]: I0913 01:17:58.770285 2469 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 13 01:17:58.772007 kubelet[2469]: I0913 01:17:58.771964 2469 server.go:449] "Adding debug handlers to kubelet server"
Sep 13 01:17:58.774011 kubelet[2469]: I0913 01:17:58.773248 2469 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 13 01:17:58.774011 kubelet[2469]: I0913 01:17:58.773670 2469 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 13 01:17:58.775720 kubelet[2469]: E0913 01:17:58.774263 2469 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.52.250:6443/api/v1/namespaces/default/events\": dial tcp 10.230.52.250:6443: connect: connection refused"
event="&Event{ObjectMeta:{srv-qlx5f.gb1.brightbox.com.1864b2bcda66d829 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-qlx5f.gb1.brightbox.com,UID:srv-qlx5f.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-qlx5f.gb1.brightbox.com,},FirstTimestamp:2025-09-13 01:17:58.769104937 +0000 UTC m=+0.472995561,LastTimestamp:2025-09-13 01:17:58.769104937 +0000 UTC m=+0.472995561,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-qlx5f.gb1.brightbox.com,}" Sep 13 01:17:58.777130 kubelet[2469]: I0913 01:17:58.777094 2469 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 13 01:17:58.777545 kubelet[2469]: I0913 01:17:58.777519 2469 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 13 01:17:58.787931 kubelet[2469]: I0913 01:17:58.787570 2469 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 13 01:17:58.788053 kubelet[2469]: E0913 01:17:58.787969 2469 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-qlx5f.gb1.brightbox.com\" not found" Sep 13 01:17:58.789074 kubelet[2469]: E0913 01:17:58.789035 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.52.250:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-qlx5f.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.52.250:6443: connect: connection refused" interval="200ms" Sep 13 01:17:58.791789 kubelet[2469]: I0913 01:17:58.791767 2469 factory.go:221] Registration of the containerd container factory successfully Sep 13 01:17:58.793036 kubelet[2469]: I0913 01:17:58.791889 2469 factory.go:221] Registration of the systemd container factory successfully 
Sep 13 01:17:58.793036 kubelet[2469]: I0913 01:17:58.792012 2469 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 13 01:17:58.794281 kubelet[2469]: I0913 01:17:58.794242 2469 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 13 01:17:58.794373 kubelet[2469]: I0913 01:17:58.794362 2469 reconciler.go:26] "Reconciler: start to sync state" Sep 13 01:17:58.802512 kubelet[2469]: W0913 01:17:58.802466 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.52.250:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.52.250:6443: connect: connection refused Sep 13 01:17:58.802659 kubelet[2469]: E0913 01:17:58.802633 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.52.250:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:17:58.804953 kubelet[2469]: E0913 01:17:58.804925 2469 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 13 01:17:58.828176 kubelet[2469]: I0913 01:17:58.828105 2469 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 13 01:17:58.832370 kubelet[2469]: I0913 01:17:58.832339 2469 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 13 01:17:58.832509 kubelet[2469]: I0913 01:17:58.832390 2469 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 13 01:17:58.832509 kubelet[2469]: I0913 01:17:58.832449 2469 kubelet.go:2321] "Starting kubelet main sync loop" Sep 13 01:17:58.832612 kubelet[2469]: E0913 01:17:58.832527 2469 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 13 01:17:58.833406 kubelet[2469]: W0913 01:17:58.833362 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.52.250:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.52.250:6443: connect: connection refused Sep 13 01:17:58.833507 kubelet[2469]: E0913 01:17:58.833422 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.52.250:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:17:58.846268 kubelet[2469]: I0913 01:17:58.846244 2469 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 13 01:17:58.846268 kubelet[2469]: I0913 01:17:58.846267 2469 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 13 01:17:58.846429 kubelet[2469]: I0913 01:17:58.846293 2469 state_mem.go:36] "Initialized new in-memory state store" Sep 13 01:17:58.848322 kubelet[2469]: I0913 01:17:58.848285 2469 policy_none.go:49] "None policy: Start" Sep 13 01:17:58.849203 kubelet[2469]: I0913 01:17:58.849176 2469 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 13 01:17:58.849256 kubelet[2469]: I0913 01:17:58.849208 2469 state_mem.go:35] "Initializing new in-memory state store" Sep 13 01:17:58.856612 kubelet[2469]: I0913 01:17:58.855775 2469 
manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 13 01:17:58.856612 kubelet[2469]: I0913 01:17:58.856057 2469 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 13 01:17:58.856612 kubelet[2469]: I0913 01:17:58.856094 2469 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 13 01:17:58.867204 kubelet[2469]: I0913 01:17:58.867154 2469 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 13 01:17:58.871659 kubelet[2469]: E0913 01:17:58.871620 2469 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-qlx5f.gb1.brightbox.com\" not found" Sep 13 01:17:58.958906 kubelet[2469]: I0913 01:17:58.958779 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:58.960235 kubelet[2469]: E0913 01:17:58.960102 2469 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.52.250:6443/api/v1/nodes\": dial tcp 10.230.52.250:6443: connect: connection refused" node="srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:58.990186 kubelet[2469]: E0913 01:17:58.990117 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.52.250:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-qlx5f.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.52.250:6443: connect: connection refused" interval="400ms" Sep 13 01:17:59.096385 kubelet[2469]: I0913 01:17:59.096285 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f4ae19941f2dbccb22f6f66cb1381890-usr-share-ca-certificates\") pod \"kube-apiserver-srv-qlx5f.gb1.brightbox.com\" (UID: \"f4ae19941f2dbccb22f6f66cb1381890\") " pod="kube-system/kube-apiserver-srv-qlx5f.gb1.brightbox.com" 
Sep 13 01:17:59.096385 kubelet[2469]: I0913 01:17:59.096372 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-ca-certs\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.096636 kubelet[2469]: I0913 01:17:59.096405 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/22fa4739e99238359ccfd652a0b1825c-kubeconfig\") pod \"kube-scheduler-srv-qlx5f.gb1.brightbox.com\" (UID: \"22fa4739e99238359ccfd652a0b1825c\") " pod="kube-system/kube-scheduler-srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.096636 kubelet[2469]: I0913 01:17:59.096458 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f4ae19941f2dbccb22f6f66cb1381890-ca-certs\") pod \"kube-apiserver-srv-qlx5f.gb1.brightbox.com\" (UID: \"f4ae19941f2dbccb22f6f66cb1381890\") " pod="kube-system/kube-apiserver-srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.096636 kubelet[2469]: I0913 01:17:59.096500 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-flexvolume-dir\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.096636 kubelet[2469]: I0913 01:17:59.096532 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-k8s-certs\") pod 
\"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.096636 kubelet[2469]: I0913 01:17:59.096558 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-kubeconfig\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.096866 kubelet[2469]: I0913 01:17:59.096586 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.096866 kubelet[2469]: I0913 01:17:59.096613 2469 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f4ae19941f2dbccb22f6f66cb1381890-k8s-certs\") pod \"kube-apiserver-srv-qlx5f.gb1.brightbox.com\" (UID: \"f4ae19941f2dbccb22f6f66cb1381890\") " pod="kube-system/kube-apiserver-srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.164547 kubelet[2469]: I0913 01:17:59.164372 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.164899 kubelet[2469]: E0913 01:17:59.164848 2469 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.52.250:6443/api/v1/nodes\": dial tcp 10.230.52.250:6443: connect: connection refused" node="srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.244362 containerd[1615]: 
time="2025-09-13T01:17:59.244190728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-qlx5f.gb1.brightbox.com,Uid:f4ae19941f2dbccb22f6f66cb1381890,Namespace:kube-system,Attempt:0,}" Sep 13 01:17:59.257896 containerd[1615]: time="2025-09-13T01:17:59.257614479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-qlx5f.gb1.brightbox.com,Uid:1cdc8dd31ace9f8c24a65c32567035c8,Namespace:kube-system,Attempt:0,}" Sep 13 01:17:59.257896 containerd[1615]: time="2025-09-13T01:17:59.257663064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-qlx5f.gb1.brightbox.com,Uid:22fa4739e99238359ccfd652a0b1825c,Namespace:kube-system,Attempt:0,}" Sep 13 01:17:59.390697 kubelet[2469]: E0913 01:17:59.390635 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.52.250:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-qlx5f.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.52.250:6443: connect: connection refused" interval="800ms" Sep 13 01:17:59.569030 kubelet[2469]: I0913 01:17:59.568419 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.569030 kubelet[2469]: E0913 01:17:59.568866 2469 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.52.250:6443/api/v1/nodes\": dial tcp 10.230.52.250:6443: connect: connection refused" node="srv-qlx5f.gb1.brightbox.com" Sep 13 01:17:59.652943 kubelet[2469]: W0913 01:17:59.652709 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.52.250:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-qlx5f.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.52.250:6443: connect: connection refused Sep 13 01:17:59.652943 kubelet[2469]: E0913 01:17:59.652844 2469 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.52.250:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-qlx5f.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:17:59.820423 kubelet[2469]: W0913 01:17:59.820141 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.52.250:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.52.250:6443: connect: connection refused Sep 13 01:17:59.820423 kubelet[2469]: E0913 01:17:59.820215 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.52.250:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:17:59.875563 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount181395253.mount: Deactivated successfully. 
Sep 13 01:17:59.882470 containerd[1615]: time="2025-09-13T01:17:59.882359817Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:17:59.884196 containerd[1615]: time="2025-09-13T01:17:59.884132864Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 13 01:17:59.887596 containerd[1615]: time="2025-09-13T01:17:59.885372395Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:17:59.887596 containerd[1615]: time="2025-09-13T01:17:59.886489136Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Sep 13 01:17:59.888277 containerd[1615]: time="2025-09-13T01:17:59.888221748Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 13 01:17:59.888403 containerd[1615]: time="2025-09-13T01:17:59.888360062Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:17:59.893620 containerd[1615]: time="2025-09-13T01:17:59.893579236Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 635.773136ms" Sep 13 01:17:59.894939 containerd[1615]: time="2025-09-13T01:17:59.894908882Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} 
labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:17:59.897333 containerd[1615]: time="2025-09-13T01:17:59.897300475Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 652.944974ms" Sep 13 01:17:59.898051 containerd[1615]: time="2025-09-13T01:17:59.898006982Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 01:17:59.915729 containerd[1615]: time="2025-09-13T01:17:59.915689849Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 657.979937ms" Sep 13 01:17:59.935529 kubelet[2469]: W0913 01:17:59.935436 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.52.250:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.52.250:6443: connect: connection refused Sep 13 01:17:59.935791 kubelet[2469]: E0913 01:17:59.935713 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.52.250:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:18:00.148758 kubelet[2469]: W0913 
01:18:00.148677 2469 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.52.250:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.52.250:6443: connect: connection refused Sep 13 01:18:00.149118 kubelet[2469]: E0913 01:18:00.148771 2469 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.52.250:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError" Sep 13 01:18:00.194170 kubelet[2469]: E0913 01:18:00.194041 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.52.250:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-qlx5f.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.52.250:6443: connect: connection refused" interval="1.6s" Sep 13 01:18:00.218531 containerd[1615]: time="2025-09-13T01:18:00.218272203Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:00.223565 containerd[1615]: time="2025-09-13T01:18:00.221377719Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:00.223565 containerd[1615]: time="2025-09-13T01:18:00.222355463Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:00.223565 containerd[1615]: time="2025-09-13T01:18:00.222511421Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:00.223565 containerd[1615]: time="2025-09-13T01:18:00.223217627Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:00.223565 containerd[1615]: time="2025-09-13T01:18:00.223307365Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:00.223565 containerd[1615]: time="2025-09-13T01:18:00.223336814Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:00.223565 containerd[1615]: time="2025-09-13T01:18:00.223461420Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:00.230373 containerd[1615]: time="2025-09-13T01:18:00.229301743Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:00.230373 containerd[1615]: time="2025-09-13T01:18:00.229385619Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:00.230373 containerd[1615]: time="2025-09-13T01:18:00.229425092Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:00.230373 containerd[1615]: time="2025-09-13T01:18:00.229603309Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:00.372932 kubelet[2469]: I0913 01:18:00.372544 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:00.374564 kubelet[2469]: E0913 01:18:00.373089 2469 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.52.250:6443/api/v1/nodes\": dial tcp 10.230.52.250:6443: connect: connection refused" node="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:00.378776 containerd[1615]: time="2025-09-13T01:18:00.378722702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-qlx5f.gb1.brightbox.com,Uid:22fa4739e99238359ccfd652a0b1825c,Namespace:kube-system,Attempt:0,} returns sandbox id \"a2baaefea94652dadde0804953faf31dfd8bd8244519c97b698bff82df7f9672\"" Sep 13 01:18:00.382973 containerd[1615]: time="2025-09-13T01:18:00.382936217Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-qlx5f.gb1.brightbox.com,Uid:f4ae19941f2dbccb22f6f66cb1381890,Namespace:kube-system,Attempt:0,} returns sandbox id \"f2787ab8771e3afec893f14a2bd46f73b2ed8d1703668e6666d77d3e241305ab\"" Sep 13 01:18:00.394430 containerd[1615]: time="2025-09-13T01:18:00.394203694Z" level=info msg="CreateContainer within sandbox \"f2787ab8771e3afec893f14a2bd46f73b2ed8d1703668e6666d77d3e241305ab\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 13 01:18:00.394931 containerd[1615]: time="2025-09-13T01:18:00.394805048Z" level=info msg="CreateContainer within sandbox \"a2baaefea94652dadde0804953faf31dfd8bd8244519c97b698bff82df7f9672\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 13 01:18:00.397684 containerd[1615]: time="2025-09-13T01:18:00.397492751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-qlx5f.gb1.brightbox.com,Uid:1cdc8dd31ace9f8c24a65c32567035c8,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"30ee3ed1a8223afa9be3d878204826a0b55adaf60b665391f6f0825d4a9d5d24\""
Sep 13 01:18:00.400800 containerd[1615]: time="2025-09-13T01:18:00.400615503Z" level=info msg="CreateContainer within sandbox \"30ee3ed1a8223afa9be3d878204826a0b55adaf60b665391f6f0825d4a9d5d24\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 13 01:18:00.424670 containerd[1615]: time="2025-09-13T01:18:00.424523416Z" level=info msg="CreateContainer within sandbox \"f2787ab8771e3afec893f14a2bd46f73b2ed8d1703668e6666d77d3e241305ab\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"2c08445ed7f96b64fe7b90bf4673d843b21956d7f09be54f797535c2501d6414\""
Sep 13 01:18:00.428017 containerd[1615]: time="2025-09-13T01:18:00.426750908Z" level=info msg="StartContainer for \"2c08445ed7f96b64fe7b90bf4673d843b21956d7f09be54f797535c2501d6414\""
Sep 13 01:18:00.430099 containerd[1615]: time="2025-09-13T01:18:00.430067550Z" level=info msg="CreateContainer within sandbox \"30ee3ed1a8223afa9be3d878204826a0b55adaf60b665391f6f0825d4a9d5d24\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"396238e00afc29a034e399aedea3bb4cdc4997e36c0eafae71000f59398f8cb1\""
Sep 13 01:18:00.430448 containerd[1615]: time="2025-09-13T01:18:00.430407112Z" level=info msg="CreateContainer within sandbox \"a2baaefea94652dadde0804953faf31dfd8bd8244519c97b698bff82df7f9672\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6b1ea8efc05f2340d21480dd88bf1e0b3fe59a7c53b09d1f7a1f39c07c2c6e4c\""
Sep 13 01:18:00.433228 containerd[1615]: time="2025-09-13T01:18:00.433199923Z" level=info msg="StartContainer for \"396238e00afc29a034e399aedea3bb4cdc4997e36c0eafae71000f59398f8cb1\""
Sep 13 01:18:00.433471 containerd[1615]: time="2025-09-13T01:18:00.433416776Z" level=info msg="StartContainer for \"6b1ea8efc05f2340d21480dd88bf1e0b3fe59a7c53b09d1f7a1f39c07c2c6e4c\""
Sep 13 01:18:00.601012 containerd[1615]: time="2025-09-13T01:18:00.600889084Z" level=info msg="StartContainer for \"6b1ea8efc05f2340d21480dd88bf1e0b3fe59a7c53b09d1f7a1f39c07c2c6e4c\" returns successfully"
Sep 13 01:18:00.624048 containerd[1615]: time="2025-09-13T01:18:00.623717970Z" level=info msg="StartContainer for \"2c08445ed7f96b64fe7b90bf4673d843b21956d7f09be54f797535c2501d6414\" returns successfully"
Sep 13 01:18:00.640827 containerd[1615]: time="2025-09-13T01:18:00.640769491Z" level=info msg="StartContainer for \"396238e00afc29a034e399aedea3bb4cdc4997e36c0eafae71000f59398f8cb1\" returns successfully"
Sep 13 01:18:00.873579 kubelet[2469]: E0913 01:18:00.873030 2469 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.52.250:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.52.250:6443: connect: connection refused" logger="UnhandledError"
Sep 13 01:18:01.979066 kubelet[2469]: I0913 01:18:01.978817 2469 kubelet_node_status.go:72] "Attempting to register node" node="srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:03.768739 kubelet[2469]: I0913 01:18:03.768149 2469 apiserver.go:52] "Watching apiserver"
Sep 13 01:18:03.773607 kubelet[2469]: I0913 01:18:03.773574 2469 kubelet_node_status.go:75] "Successfully registered node" node="srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:03.774095 kubelet[2469]: E0913 01:18:03.774066 2469 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"srv-qlx5f.gb1.brightbox.com\": node \"srv-qlx5f.gb1.brightbox.com\" not found"
Sep 13 01:18:03.794479 kubelet[2469]: I0913 01:18:03.794417 2469 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 13 01:18:03.883004 kubelet[2469]: E0913 01:18:03.881861 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-node-lease\" not found" interval="3.2s"
Sep 13 01:18:05.191134 kubelet[2469]: W0913 01:18:05.191054 2469 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 13 01:18:06.325583 systemd[1]: Reloading requested from client PID 2748 ('systemctl') (unit session-11.scope)...
Sep 13 01:18:06.325607 systemd[1]: Reloading...
Sep 13 01:18:06.440139 zram_generator::config[2788]: No configuration found.
Sep 13 01:18:06.618075 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 01:18:06.736541 systemd[1]: Reloading finished in 410 ms.
Sep 13 01:18:06.785870 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:18:06.800571 systemd[1]: kubelet.service: Deactivated successfully.
Sep 13 01:18:06.801153 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:18:06.812608 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 01:18:06.995246 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 01:18:07.010596 (kubelet)[2861]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 13 01:18:07.138265 kubelet[2861]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 13 01:18:07.138265 kubelet[2861]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 13 01:18:07.138265 kubelet[2861]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 13 01:18:07.140398 kubelet[2861]: I0913 01:18:07.138794 2861 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 13 01:18:07.148016 kubelet[2861]: I0913 01:18:07.147969 2861 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 13 01:18:07.148172 kubelet[2861]: I0913 01:18:07.148153 2861 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 13 01:18:07.148579 kubelet[2861]: I0913 01:18:07.148558 2861 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 13 01:18:07.150514 kubelet[2861]: I0913 01:18:07.150493 2861 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 13 01:18:07.153491 kubelet[2861]: I0913 01:18:07.153463 2861 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 13 01:18:07.159209 kubelet[2861]: E0913 01:18:07.159173 2861 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 13 01:18:07.159382 kubelet[2861]: I0913 01:18:07.159361 2861 server.go:1408] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 13 01:18:07.165039 kubelet[2861]: I0913 01:18:07.165015 2861 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 13 01:18:07.165787 kubelet[2861]: I0913 01:18:07.165768 2861 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 13 01:18:07.166760 kubelet[2861]: I0913 01:18:07.166695 2861 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 13 01:18:07.168057 kubelet[2861]: I0913 01:18:07.166878 2861 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-qlx5f.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":1}
Sep 13 01:18:07.172711 kubelet[2861]: I0913 01:18:07.172662 2861 topology_manager.go:138] "Creating topology manager with none policy"
Sep 13 01:18:07.173289 kubelet[2861]: I0913 01:18:07.172949 2861 container_manager_linux.go:300] "Creating device plugin manager"
Sep 13 01:18:07.173289 kubelet[2861]: I0913 01:18:07.173087 2861 state_mem.go:36] "Initialized new in-memory state store"
Sep 13 01:18:07.175833 kubelet[2861]: I0913 01:18:07.175804 2861 kubelet.go:408] "Attempting to sync node with API server"
Sep 13 01:18:07.176647 kubelet[2861]: I0913 01:18:07.175929 2861 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 13 01:18:07.176647 kubelet[2861]: I0913 01:18:07.176028 2861 kubelet.go:314] "Adding apiserver pod source"
Sep 13 01:18:07.176647 kubelet[2861]: I0913 01:18:07.176053 2861 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 13 01:18:07.179056 kubelet[2861]: I0913 01:18:07.177912 2861 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 13 01:18:07.179056 kubelet[2861]: I0913 01:18:07.178508 2861 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 13 01:18:07.182890 kubelet[2861]: I0913 01:18:07.182078 2861 server.go:1274] "Started kubelet"
Sep 13 01:18:07.188691 kubelet[2861]: I0913 01:18:07.188638 2861 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 13 01:18:07.196963 kubelet[2861]: I0913 01:18:07.196890 2861 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 13 01:18:07.199693 kubelet[2861]: I0913 01:18:07.199671 2861 server.go:449] "Adding debug handlers to kubelet server"
Sep 13 01:18:07.201792 kubelet[2861]: I0913 01:18:07.201753 2861 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 13 01:18:07.202319 kubelet[2861]: I0913 01:18:07.202299 2861 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 13 01:18:07.202748 kubelet[2861]: I0913 01:18:07.202717 2861 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 13 01:18:07.205711 kubelet[2861]: I0913 01:18:07.205689 2861 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 13 01:18:07.206218 kubelet[2861]: E0913 01:18:07.206195 2861 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-qlx5f.gb1.brightbox.com\" not found"
Sep 13 01:18:07.207086 kubelet[2861]: I0913 01:18:07.206928 2861 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 13 01:18:07.207483 kubelet[2861]: I0913 01:18:07.207464 2861 reconciler.go:26] "Reconciler: start to sync state"
Sep 13 01:18:07.213548 kubelet[2861]: I0913 01:18:07.213451 2861 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 13 01:18:07.225012 kubelet[2861]: I0913 01:18:07.224862 2861 factory.go:221] Registration of the containerd container factory successfully
Sep 13 01:18:07.225012 kubelet[2861]: I0913 01:18:07.224889 2861 factory.go:221] Registration of the systemd container factory successfully
Sep 13 01:18:07.231526 kubelet[2861]: I0913 01:18:07.231320 2861 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 13 01:18:07.234419 kubelet[2861]: I0913 01:18:07.234385 2861 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 13 01:18:07.234996 kubelet[2861]: I0913 01:18:07.234541 2861 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 13 01:18:07.234996 kubelet[2861]: I0913 01:18:07.234575 2861 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 13 01:18:07.234996 kubelet[2861]: E0913 01:18:07.234633 2861 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 13 01:18:07.254205 kubelet[2861]: E0913 01:18:07.253359 2861 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 13 01:18:07.320375 kubelet[2861]: I0913 01:18:07.320118 2861 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 13 01:18:07.320375 kubelet[2861]: I0913 01:18:07.320144 2861 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 13 01:18:07.320375 kubelet[2861]: I0913 01:18:07.320171 2861 state_mem.go:36] "Initialized new in-memory state store"
Sep 13 01:18:07.321133 kubelet[2861]: I0913 01:18:07.320646 2861 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 13 01:18:07.321133 kubelet[2861]: I0913 01:18:07.320675 2861 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 13 01:18:07.321133 kubelet[2861]: I0913 01:18:07.320713 2861 policy_none.go:49] "None policy: Start"
Sep 13 01:18:07.323735 kubelet[2861]: I0913 01:18:07.322800 2861 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 13 01:18:07.323735 kubelet[2861]: I0913 01:18:07.322836 2861 state_mem.go:35] "Initializing new in-memory state store"
Sep 13 01:18:07.323735 kubelet[2861]: I0913 01:18:07.323021 2861 state_mem.go:75] "Updated machine memory state"
Sep 13 01:18:07.331007 kubelet[2861]: I0913 01:18:07.328831 2861 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 13 01:18:07.331007 kubelet[2861]: I0913 01:18:07.329198 2861 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 13 01:18:07.331007 kubelet[2861]: I0913 01:18:07.329219 2861 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 13 01:18:07.333570 kubelet[2861]: I0913 01:18:07.333539 2861 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 13 01:18:07.355417 kubelet[2861]: W0913 01:18:07.355370 2861 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 13 01:18:07.356211 kubelet[2861]: W0913 01:18:07.356189 2861 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 13 01:18:07.363109 kubelet[2861]: W0913 01:18:07.363048 2861 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 13 01:18:07.363322 kubelet[2861]: E0913 01:18:07.363168 2861 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-srv-qlx5f.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.410871 kubelet[2861]: I0913 01:18:07.410494 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-flexvolume-dir\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.410871 kubelet[2861]: I0913 01:18:07.410553 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-kubeconfig\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.410871 kubelet[2861]: I0913 01:18:07.410586 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/22fa4739e99238359ccfd652a0b1825c-kubeconfig\") pod \"kube-scheduler-srv-qlx5f.gb1.brightbox.com\" (UID: \"22fa4739e99238359ccfd652a0b1825c\") " pod="kube-system/kube-scheduler-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.410871 kubelet[2861]: I0913 01:18:07.410611 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f4ae19941f2dbccb22f6f66cb1381890-k8s-certs\") pod \"kube-apiserver-srv-qlx5f.gb1.brightbox.com\" (UID: \"f4ae19941f2dbccb22f6f66cb1381890\") " pod="kube-system/kube-apiserver-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.410871 kubelet[2861]: I0913 01:18:07.410638 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-ca-certs\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.411246 kubelet[2861]: I0913 01:18:07.410661 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-k8s-certs\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.411246 kubelet[2861]: I0913 01:18:07.410697 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1cdc8dd31ace9f8c24a65c32567035c8-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-qlx5f.gb1.brightbox.com\" (UID: \"1cdc8dd31ace9f8c24a65c32567035c8\") " pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.411246 kubelet[2861]: I0913 01:18:07.410730 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f4ae19941f2dbccb22f6f66cb1381890-ca-certs\") pod \"kube-apiserver-srv-qlx5f.gb1.brightbox.com\" (UID: \"f4ae19941f2dbccb22f6f66cb1381890\") " pod="kube-system/kube-apiserver-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.411246 kubelet[2861]: I0913 01:18:07.410768 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f4ae19941f2dbccb22f6f66cb1381890-usr-share-ca-certificates\") pod \"kube-apiserver-srv-qlx5f.gb1.brightbox.com\" (UID: \"f4ae19941f2dbccb22f6f66cb1381890\") " pod="kube-system/kube-apiserver-srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.472077 kubelet[2861]: I0913 01:18:07.471547 2861 kubelet_node_status.go:72] "Attempting to register node" node="srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.490024 kubelet[2861]: I0913 01:18:07.488885 2861 kubelet_node_status.go:111] "Node was previously registered" node="srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:07.490024 kubelet[2861]: I0913 01:18:07.489035 2861 kubelet_node_status.go:75] "Successfully registered node" node="srv-qlx5f.gb1.brightbox.com"
Sep 13 01:18:08.177828 kubelet[2861]: I0913 01:18:08.177754 2861 apiserver.go:52] "Watching apiserver"
Sep 13 01:18:08.207745 kubelet[2861]: I0913 01:18:08.207638 2861 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 13 01:18:08.314158 kubelet[2861]: I0913 01:18:08.313987 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-qlx5f.gb1.brightbox.com" podStartSLOduration=1.313952758 podStartE2EDuration="1.313952758s" podCreationTimestamp="2025-09-13 01:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:18:08.312202649 +0000 UTC m=+1.255960305" watchObservedRunningTime="2025-09-13 01:18:08.313952758 +0000 UTC m=+1.257710393"
Sep 13 01:18:08.329683 kubelet[2861]: I0913 01:18:08.328786 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-qlx5f.gb1.brightbox.com" podStartSLOduration=1.328763125 podStartE2EDuration="1.328763125s" podCreationTimestamp="2025-09-13 01:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:18:08.327830747 +0000 UTC m=+1.271588405" watchObservedRunningTime="2025-09-13 01:18:08.328763125 +0000 UTC m=+1.272520778"
Sep 13 01:18:10.376790 kubelet[2861]: I0913 01:18:10.376565 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-qlx5f.gb1.brightbox.com" podStartSLOduration=5.3765432220000005 podStartE2EDuration="5.376543222s" podCreationTimestamp="2025-09-13 01:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:18:08.343563941 +0000 UTC m=+1.287321592" watchObservedRunningTime="2025-09-13 01:18:10.376543222 +0000 UTC m=+3.320300872"
Sep 13 01:18:12.443884 kubelet[2861]: I0913 01:18:12.443828 2861 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 13 01:18:12.446422 containerd[1615]: time="2025-09-13T01:18:12.446375331Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 13 01:18:12.447536 kubelet[2861]: I0913 01:18:12.447035 2861 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 13 01:18:13.358919 kubelet[2861]: I0913 01:18:13.358701 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/570acb4b-5a82-427e-8dd5-4948f850ea2d-kube-proxy\") pod \"kube-proxy-lvvks\" (UID: \"570acb4b-5a82-427e-8dd5-4948f850ea2d\") " pod="kube-system/kube-proxy-lvvks"
Sep 13 01:18:13.358919 kubelet[2861]: I0913 01:18:13.358762 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/570acb4b-5a82-427e-8dd5-4948f850ea2d-xtables-lock\") pod \"kube-proxy-lvvks\" (UID: \"570acb4b-5a82-427e-8dd5-4948f850ea2d\") " pod="kube-system/kube-proxy-lvvks"
Sep 13 01:18:13.358919 kubelet[2861]: I0913 01:18:13.358789 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/570acb4b-5a82-427e-8dd5-4948f850ea2d-lib-modules\") pod \"kube-proxy-lvvks\" (UID: \"570acb4b-5a82-427e-8dd5-4948f850ea2d\") " pod="kube-system/kube-proxy-lvvks"
Sep 13 01:18:13.358919 kubelet[2861]: I0913 01:18:13.358819 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2t7k\" (UniqueName: \"kubernetes.io/projected/570acb4b-5a82-427e-8dd5-4948f850ea2d-kube-api-access-v2t7k\") pod \"kube-proxy-lvvks\" (UID: \"570acb4b-5a82-427e-8dd5-4948f850ea2d\") " pod="kube-system/kube-proxy-lvvks"
Sep 13 01:18:13.560411 kubelet[2861]: I0913 01:18:13.560251 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4a87ddfe-20e8-40cd-92f9-36011882ac68-var-lib-calico\") pod \"tigera-operator-58fc44c59b-lbvns\" (UID: \"4a87ddfe-20e8-40cd-92f9-36011882ac68\") " pod="tigera-operator/tigera-operator-58fc44c59b-lbvns"
Sep 13 01:18:13.560411 kubelet[2861]: I0913 01:18:13.560342 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptpmw\" (UniqueName: \"kubernetes.io/projected/4a87ddfe-20e8-40cd-92f9-36011882ac68-kube-api-access-ptpmw\") pod \"tigera-operator-58fc44c59b-lbvns\" (UID: \"4a87ddfe-20e8-40cd-92f9-36011882ac68\") " pod="tigera-operator/tigera-operator-58fc44c59b-lbvns"
Sep 13 01:18:13.612252 containerd[1615]: time="2025-09-13T01:18:13.611491914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lvvks,Uid:570acb4b-5a82-427e-8dd5-4948f850ea2d,Namespace:kube-system,Attempt:0,}"
Sep 13 01:18:13.654068 containerd[1615]: time="2025-09-13T01:18:13.653891089Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 01:18:13.654850 containerd[1615]: time="2025-09-13T01:18:13.654195877Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 01:18:13.655018 containerd[1615]: time="2025-09-13T01:18:13.654960858Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:18:13.655270 containerd[1615]: time="2025-09-13T01:18:13.655156057Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:18:13.724900 containerd[1615]: time="2025-09-13T01:18:13.724759293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lvvks,Uid:570acb4b-5a82-427e-8dd5-4948f850ea2d,Namespace:kube-system,Attempt:0,} returns sandbox id \"e091b2e487f6419c52e3bf565fb323239ec651737f29ed97d5df31f8f4bd82be\""
Sep 13 01:18:13.730513 containerd[1615]: time="2025-09-13T01:18:13.730381306Z" level=info msg="CreateContainer within sandbox \"e091b2e487f6419c52e3bf565fb323239ec651737f29ed97d5df31f8f4bd82be\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 13 01:18:13.754011 containerd[1615]: time="2025-09-13T01:18:13.753870716Z" level=info msg="CreateContainer within sandbox \"e091b2e487f6419c52e3bf565fb323239ec651737f29ed97d5df31f8f4bd82be\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"aa7c50722872e3b87fa3fc229367a301b66d97a000300c1009edf3f832ea8324\""
Sep 13 01:18:13.755829 containerd[1615]: time="2025-09-13T01:18:13.755696347Z" level=info msg="StartContainer for \"aa7c50722872e3b87fa3fc229367a301b66d97a000300c1009edf3f832ea8324\""
Sep 13 01:18:13.843250 containerd[1615]: time="2025-09-13T01:18:13.843188148Z" level=info msg="StartContainer for \"aa7c50722872e3b87fa3fc229367a301b66d97a000300c1009edf3f832ea8324\" returns successfully"
Sep 13 01:18:13.854669 containerd[1615]: time="2025-09-13T01:18:13.854620834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-lbvns,Uid:4a87ddfe-20e8-40cd-92f9-36011882ac68,Namespace:tigera-operator,Attempt:0,}"
Sep 13 01:18:13.892291 containerd[1615]: time="2025-09-13T01:18:13.891850509Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 01:18:13.893284 containerd[1615]: time="2025-09-13T01:18:13.892930119Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 01:18:13.893284 containerd[1615]: time="2025-09-13T01:18:13.892970864Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:18:13.893922 containerd[1615]: time="2025-09-13T01:18:13.893864176Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:18:14.011337 containerd[1615]: time="2025-09-13T01:18:14.011271150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-lbvns,Uid:4a87ddfe-20e8-40cd-92f9-36011882ac68,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"598bf4f50bf17ad5a62f87a852c466cac6679421f7b75df5e018c8ad077cc5c1\""
Sep 13 01:18:14.017023 containerd[1615]: time="2025-09-13T01:18:14.015221072Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 13 01:18:14.326026 kubelet[2861]: I0913 01:18:14.325823 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-lvvks" podStartSLOduration=1.325791942 podStartE2EDuration="1.325791942s" podCreationTimestamp="2025-09-13 01:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:18:14.322273993 +0000 UTC m=+7.266031652" watchObservedRunningTime="2025-09-13 01:18:14.325791942 +0000 UTC m=+7.269549586"
Sep 13 01:18:15.986417 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1221149028.mount: Deactivated successfully.
Sep 13 01:18:17.034602 containerd[1615]: time="2025-09-13T01:18:17.034446893Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:18:17.036654 containerd[1615]: time="2025-09-13T01:18:17.036605362Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 13 01:18:17.039015 containerd[1615]: time="2025-09-13T01:18:17.037387886Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:18:17.040635 containerd[1615]: time="2025-09-13T01:18:17.040599619Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:18:17.041997 containerd[1615]: time="2025-09-13T01:18:17.041941025Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.026680653s"
Sep 13 01:18:17.042156 containerd[1615]: time="2025-09-13T01:18:17.042115581Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 13 01:18:17.045616 containerd[1615]: time="2025-09-13T01:18:17.045574262Z" level=info msg="CreateContainer within sandbox \"598bf4f50bf17ad5a62f87a852c466cac6679421f7b75df5e018c8ad077cc5c1\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 13 01:18:17.063951 containerd[1615]: time="2025-09-13T01:18:17.063900507Z" level=info msg="CreateContainer within sandbox \"598bf4f50bf17ad5a62f87a852c466cac6679421f7b75df5e018c8ad077cc5c1\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c94f24222f3c1318e4e438a92712fc4d241442720267aa00fd66e42fe6348c6a\""
Sep 13 01:18:17.066710 containerd[1615]: time="2025-09-13T01:18:17.066654505Z" level=info msg="StartContainer for \"c94f24222f3c1318e4e438a92712fc4d241442720267aa00fd66e42fe6348c6a\""
Sep 13 01:18:17.153373 containerd[1615]: time="2025-09-13T01:18:17.153319588Z" level=info msg="StartContainer for \"c94f24222f3c1318e4e438a92712fc4d241442720267aa00fd66e42fe6348c6a\" returns successfully"
Sep 13 01:18:17.362264 kubelet[2861]: I0913 01:18:17.361966 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-lbvns" podStartSLOduration=1.332965145 podStartE2EDuration="4.361931105s" podCreationTimestamp="2025-09-13 01:18:13 +0000 UTC" firstStartedPulling="2025-09-13 01:18:14.014520033 +0000 UTC m=+6.958277678" lastFinishedPulling="2025-09-13 01:18:17.043486003 +0000 UTC m=+9.987243638" observedRunningTime="2025-09-13 01:18:17.350025733 +0000 UTC m=+10.293783386" watchObservedRunningTime="2025-09-13 01:18:17.361931105 +0000 UTC m=+10.305688748"
Sep 13 01:18:22.598405 sudo[1918]: pam_unix(sudo:session): session closed for user root
Sep 13 01:18:22.752288 sshd[1914]: pam_unix(sshd:session): session closed for user core
Sep 13 01:18:22.766962 systemd-logind[1593]: Session 11 logged out. Waiting for processes to exit.
Sep 13 01:18:22.768387 systemd[1]: sshd@8-10.230.52.250:22-139.178.68.195:40032.service: Deactivated successfully.
Sep 13 01:18:22.781320 systemd[1]: session-11.scope: Deactivated successfully.
Sep 13 01:18:22.787018 systemd-logind[1593]: Removed session 11.
Sep 13 01:18:27.777874 kubelet[2861]: I0913 01:18:27.777435 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ccf00ac-9559-44df-91cb-d1d47f57063c-tigera-ca-bundle\") pod \"calico-typha-699b7c5754-79qxb\" (UID: \"7ccf00ac-9559-44df-91cb-d1d47f57063c\") " pod="calico-system/calico-typha-699b7c5754-79qxb"
Sep 13 01:18:27.777874 kubelet[2861]: I0913 01:18:27.777542 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/7ccf00ac-9559-44df-91cb-d1d47f57063c-typha-certs\") pod \"calico-typha-699b7c5754-79qxb\" (UID: \"7ccf00ac-9559-44df-91cb-d1d47f57063c\") " pod="calico-system/calico-typha-699b7c5754-79qxb"
Sep 13 01:18:27.777874 kubelet[2861]: I0913 01:18:27.777581 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4kv\" (UniqueName: \"kubernetes.io/projected/7ccf00ac-9559-44df-91cb-d1d47f57063c-kube-api-access-cd4kv\") pod \"calico-typha-699b7c5754-79qxb\" (UID: \"7ccf00ac-9559-44df-91cb-d1d47f57063c\") " pod="calico-system/calico-typha-699b7c5754-79qxb"
Sep 13 01:18:27.960036 containerd[1615]: time="2025-09-13T01:18:27.959928273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-699b7c5754-79qxb,Uid:7ccf00ac-9559-44df-91cb-d1d47f57063c,Namespace:calico-system,Attempt:0,}"
Sep 13 01:18:28.035537 containerd[1615]: time="2025-09-13T01:18:28.034106597Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 01:18:28.035537 containerd[1615]: time="2025-09-13T01:18:28.034233622Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 01:18:28.035537 containerd[1615]: time="2025-09-13T01:18:28.034256896Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:18:28.035537 containerd[1615]: time="2025-09-13T01:18:28.034413991Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 01:18:28.183270 kubelet[2861]: I0913 01:18:28.182638 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-flexvol-driver-host\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183270 kubelet[2861]: I0913 01:18:28.182701 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-tigera-ca-bundle\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183270 kubelet[2861]: I0913 01:18:28.182743 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-cni-bin-dir\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183270 kubelet[2861]: I0913 01:18:28.182768 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-lib-modules\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183270 kubelet[2861]: I0913 01:18:28.182807 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-var-lib-calico\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183732 kubelet[2861]: I0913 01:18:28.182862 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-xtables-lock\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183732 kubelet[2861]: I0913 01:18:28.182894 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-cni-log-dir\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183732 kubelet[2861]: I0913 01:18:28.182919 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-node-certs\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183732 kubelet[2861]: I0913 01:18:28.182955 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-var-run-calico\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl"
Sep 13 01:18:28.183732 kubelet[2861]: I0913
01:18:28.183024 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-cni-net-dir\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl" Sep 13 01:18:28.184043 kubelet[2861]: I0913 01:18:28.183088 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-policysync\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl" Sep 13 01:18:28.184043 kubelet[2861]: I0913 01:18:28.183118 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzwlj\" (UniqueName: \"kubernetes.io/projected/ffc3284e-28f4-4a62-94fb-9bf40109a8ee-kube-api-access-pzwlj\") pod \"calico-node-hsczl\" (UID: \"ffc3284e-28f4-4a62-94fb-9bf40109a8ee\") " pod="calico-system/calico-node-hsczl" Sep 13 01:18:28.230876 containerd[1615]: time="2025-09-13T01:18:28.230781917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-699b7c5754-79qxb,Uid:7ccf00ac-9559-44df-91cb-d1d47f57063c,Namespace:calico-system,Attempt:0,} returns sandbox id \"af2711a87baffd29c53de89322524ca46eb4cd3d7fc89724589ca9559ac527c0\"" Sep 13 01:18:28.243830 containerd[1615]: time="2025-09-13T01:18:28.242733170Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 13 01:18:28.334185 kubelet[2861]: E0913 01:18:28.333016 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.334185 kubelet[2861]: W0913 01:18:28.333067 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: 
executable file not found in $PATH, output: "" Sep 13 01:18:28.334185 kubelet[2861]: E0913 01:18:28.333125 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.336906 kubelet[2861]: E0913 01:18:28.336690 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.336906 kubelet[2861]: W0913 01:18:28.336716 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.336906 kubelet[2861]: E0913 01:18:28.336752 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.339444 kubelet[2861]: E0913 01:18:28.339193 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.339444 kubelet[2861]: W0913 01:18:28.339223 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.339444 kubelet[2861]: E0913 01:18:28.339246 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.339942 kubelet[2861]: E0913 01:18:28.339795 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.339942 kubelet[2861]: W0913 01:18:28.339813 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.339942 kubelet[2861]: E0913 01:18:28.339830 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.340605 kubelet[2861]: E0913 01:18:28.340299 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.340605 kubelet[2861]: W0913 01:18:28.340317 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.340605 kubelet[2861]: E0913 01:18:28.340346 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.342497 kubelet[2861]: E0913 01:18:28.342267 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.342497 kubelet[2861]: W0913 01:18:28.342286 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.342497 kubelet[2861]: E0913 01:18:28.342304 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.348304 kubelet[2861]: E0913 01:18:28.348183 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.348304 kubelet[2861]: W0913 01:18:28.348213 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.348304 kubelet[2861]: E0913 01:18:28.348236 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.351085 kubelet[2861]: E0913 01:18:28.349483 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3" Sep 13 01:18:28.351085 kubelet[2861]: E0913 01:18:28.350078 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.351085 kubelet[2861]: W0913 01:18:28.350093 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.351085 kubelet[2861]: E0913 01:18:28.350110 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.352482 kubelet[2861]: E0913 01:18:28.351876 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.352482 kubelet[2861]: W0913 01:18:28.351899 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.352482 kubelet[2861]: E0913 01:18:28.351918 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.355802 kubelet[2861]: E0913 01:18:28.355746 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.355802 kubelet[2861]: W0913 01:18:28.355774 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.355802 kubelet[2861]: E0913 01:18:28.355796 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.358439 kubelet[2861]: E0913 01:18:28.358233 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.358439 kubelet[2861]: W0913 01:18:28.358253 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.358439 kubelet[2861]: E0913 01:18:28.358271 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.360553 kubelet[2861]: E0913 01:18:28.360515 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.360827 kubelet[2861]: W0913 01:18:28.360681 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.360827 kubelet[2861]: E0913 01:18:28.360711 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.361392 kubelet[2861]: E0913 01:18:28.361070 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.361392 kubelet[2861]: W0913 01:18:28.361114 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.361392 kubelet[2861]: E0913 01:18:28.361144 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.364183 kubelet[2861]: E0913 01:18:28.364145 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.364386 kubelet[2861]: W0913 01:18:28.364263 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.364386 kubelet[2861]: E0913 01:18:28.364284 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.365067 kubelet[2861]: E0913 01:18:28.364840 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.365067 kubelet[2861]: W0913 01:18:28.364894 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.365067 kubelet[2861]: E0913 01:18:28.364954 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.365874 kubelet[2861]: E0913 01:18:28.365658 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.365874 kubelet[2861]: W0913 01:18:28.365676 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.365874 kubelet[2861]: E0913 01:18:28.365710 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.366753 kubelet[2861]: E0913 01:18:28.366535 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.366753 kubelet[2861]: W0913 01:18:28.366553 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.366753 kubelet[2861]: E0913 01:18:28.366570 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.368729 kubelet[2861]: E0913 01:18:28.368132 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.368729 kubelet[2861]: W0913 01:18:28.368152 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.368729 kubelet[2861]: E0913 01:18:28.368169 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.370218 kubelet[2861]: E0913 01:18:28.370198 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.370357 kubelet[2861]: W0913 01:18:28.370337 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.370465 kubelet[2861]: E0913 01:18:28.370446 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.370820 kubelet[2861]: E0913 01:18:28.370801 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.370939 kubelet[2861]: W0913 01:18:28.370902 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.371170 kubelet[2861]: E0913 01:18:28.371050 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.371529 kubelet[2861]: E0913 01:18:28.371401 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.371529 kubelet[2861]: W0913 01:18:28.371418 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.371529 kubelet[2861]: E0913 01:18:28.371433 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.374270 kubelet[2861]: E0913 01:18:28.374247 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.374498 kubelet[2861]: W0913 01:18:28.374475 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.374748 kubelet[2861]: E0913 01:18:28.374593 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.379765 kubelet[2861]: E0913 01:18:28.379729 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.379765 kubelet[2861]: W0913 01:18:28.379760 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.379998 kubelet[2861]: E0913 01:18:28.379798 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.380186 kubelet[2861]: E0913 01:18:28.380167 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.380321 kubelet[2861]: W0913 01:18:28.380214 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.380321 kubelet[2861]: E0913 01:18:28.380236 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.380536 kubelet[2861]: E0913 01:18:28.380490 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.380536 kubelet[2861]: W0913 01:18:28.380508 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.380536 kubelet[2861]: E0913 01:18:28.380524 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.380790 kubelet[2861]: E0913 01:18:28.380771 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.380790 kubelet[2861]: W0913 01:18:28.380789 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.380929 kubelet[2861]: E0913 01:18:28.380804 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.386583 kubelet[2861]: E0913 01:18:28.386198 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.386583 kubelet[2861]: W0913 01:18:28.386240 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.386583 kubelet[2861]: E0913 01:18:28.386269 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.386893 kubelet[2861]: E0913 01:18:28.386635 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.386893 kubelet[2861]: W0913 01:18:28.386650 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.386893 kubelet[2861]: E0913 01:18:28.386668 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.387273 kubelet[2861]: E0913 01:18:28.387248 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.387273 kubelet[2861]: W0913 01:18:28.387269 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.387392 kubelet[2861]: E0913 01:18:28.387285 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.387603 kubelet[2861]: E0913 01:18:28.387582 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.387603 kubelet[2861]: W0913 01:18:28.387596 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.387711 kubelet[2861]: E0913 01:18:28.387612 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.388932 kubelet[2861]: E0913 01:18:28.388906 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.388932 kubelet[2861]: W0913 01:18:28.388928 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.389111 kubelet[2861]: E0913 01:18:28.388946 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.392132 kubelet[2861]: E0913 01:18:28.392088 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.392132 kubelet[2861]: W0913 01:18:28.392112 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.392132 kubelet[2861]: E0913 01:18:28.392133 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.393352 kubelet[2861]: E0913 01:18:28.393197 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.393631 kubelet[2861]: W0913 01:18:28.393221 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.393631 kubelet[2861]: E0913 01:18:28.393537 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.395756 kubelet[2861]: E0913 01:18:28.395554 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.395756 kubelet[2861]: W0913 01:18:28.395605 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.395756 kubelet[2861]: E0913 01:18:28.395628 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.399530 kubelet[2861]: E0913 01:18:28.397873 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.399530 kubelet[2861]: W0913 01:18:28.397898 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.399530 kubelet[2861]: E0913 01:18:28.397917 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.399530 kubelet[2861]: I0913 01:18:28.397950 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/3a06b43f-3090-4270-8011-d28f2c555ca3-varrun\") pod \"csi-node-driver-4zbwf\" (UID: \"3a06b43f-3090-4270-8011-d28f2c555ca3\") " pod="calico-system/csi-node-driver-4zbwf" Sep 13 01:18:28.403936 kubelet[2861]: E0913 01:18:28.402824 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.403936 kubelet[2861]: W0913 01:18:28.402859 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.403936 kubelet[2861]: E0913 01:18:28.402889 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.403936 kubelet[2861]: I0913 01:18:28.402938 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqj8\" (UniqueName: \"kubernetes.io/projected/3a06b43f-3090-4270-8011-d28f2c555ca3-kube-api-access-fkqj8\") pod \"csi-node-driver-4zbwf\" (UID: \"3a06b43f-3090-4270-8011-d28f2c555ca3\") " pod="calico-system/csi-node-driver-4zbwf" Sep 13 01:18:28.406634 kubelet[2861]: E0913 01:18:28.406084 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.406634 kubelet[2861]: W0913 01:18:28.406113 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.406634 kubelet[2861]: E0913 01:18:28.406133 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.406634 kubelet[2861]: I0913 01:18:28.406162 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a06b43f-3090-4270-8011-d28f2c555ca3-registration-dir\") pod \"csi-node-driver-4zbwf\" (UID: \"3a06b43f-3090-4270-8011-d28f2c555ca3\") " pod="calico-system/csi-node-driver-4zbwf" Sep 13 01:18:28.409898 kubelet[2861]: E0913 01:18:28.409068 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.409898 kubelet[2861]: W0913 01:18:28.409095 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.409898 kubelet[2861]: E0913 01:18:28.409116 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.409898 kubelet[2861]: I0913 01:18:28.409147 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a06b43f-3090-4270-8011-d28f2c555ca3-socket-dir\") pod \"csi-node-driver-4zbwf\" (UID: \"3a06b43f-3090-4270-8011-d28f2c555ca3\") " pod="calico-system/csi-node-driver-4zbwf" Sep 13 01:18:28.412674 kubelet[2861]: E0913 01:18:28.412246 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.412674 kubelet[2861]: W0913 01:18:28.412274 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.412674 kubelet[2861]: E0913 01:18:28.412295 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.412674 kubelet[2861]: I0913 01:18:28.412321 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a06b43f-3090-4270-8011-d28f2c555ca3-kubelet-dir\") pod \"csi-node-driver-4zbwf\" (UID: \"3a06b43f-3090-4270-8011-d28f2c555ca3\") " pod="calico-system/csi-node-driver-4zbwf" Sep 13 01:18:28.424734 kubelet[2861]: E0913 01:18:28.424692 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.425513 kubelet[2861]: W0913 01:18:28.425154 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.425513 kubelet[2861]: E0913 01:18:28.425192 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.426616 kubelet[2861]: E0913 01:18:28.425935 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.426616 kubelet[2861]: W0913 01:18:28.425957 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.426616 kubelet[2861]: E0913 01:18:28.425993 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.427013 kubelet[2861]: E0913 01:18:28.426960 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.427234 kubelet[2861]: W0913 01:18:28.427213 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.427431 kubelet[2861]: E0913 01:18:28.427322 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.428351 kubelet[2861]: E0913 01:18:28.428332 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.428467 kubelet[2861]: W0913 01:18:28.428446 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.428581 kubelet[2861]: E0913 01:18:28.428560 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.429090 kubelet[2861]: E0913 01:18:28.429069 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.429319 kubelet[2861]: W0913 01:18:28.429196 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.429319 kubelet[2861]: E0913 01:18:28.429223 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.429765 kubelet[2861]: E0913 01:18:28.429605 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.429765 kubelet[2861]: W0913 01:18:28.429623 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.429765 kubelet[2861]: E0913 01:18:28.429643 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.430195 kubelet[2861]: E0913 01:18:28.430176 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.430511 kubelet[2861]: W0913 01:18:28.430301 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.430511 kubelet[2861]: E0913 01:18:28.430327 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.431044 kubelet[2861]: E0913 01:18:28.430815 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.431044 kubelet[2861]: W0913 01:18:28.430832 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.431044 kubelet[2861]: E0913 01:18:28.430850 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.431693 kubelet[2861]: E0913 01:18:28.431568 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.431693 kubelet[2861]: W0913 01:18:28.431587 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.431693 kubelet[2861]: E0913 01:18:28.431603 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.432640 kubelet[2861]: E0913 01:18:28.432571 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.432640 kubelet[2861]: W0913 01:18:28.432589 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.432640 kubelet[2861]: E0913 01:18:28.432606 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.433678 containerd[1615]: time="2025-09-13T01:18:28.433364513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hsczl,Uid:ffc3284e-28f4-4a62-94fb-9bf40109a8ee,Namespace:calico-system,Attempt:0,}" Sep 13 01:18:28.523131 kubelet[2861]: E0913 01:18:28.522550 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.523131 kubelet[2861]: W0913 01:18:28.522584 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.523131 kubelet[2861]: E0913 01:18:28.522648 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.524107 kubelet[2861]: E0913 01:18:28.524087 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.524293 kubelet[2861]: W0913 01:18:28.524269 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.524461 kubelet[2861]: E0913 01:18:28.524440 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.527084 kubelet[2861]: E0913 01:18:28.525692 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.527084 kubelet[2861]: W0913 01:18:28.525713 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.527084 kubelet[2861]: E0913 01:18:28.525737 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.527584 kubelet[2861]: E0913 01:18:28.527368 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.527584 kubelet[2861]: W0913 01:18:28.527387 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.527584 kubelet[2861]: E0913 01:18:28.527543 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.527889 kubelet[2861]: E0913 01:18:28.527841 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.527889 kubelet[2861]: W0913 01:18:28.527866 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.528232 kubelet[2861]: E0913 01:18:28.528143 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.530706 kubelet[2861]: E0913 01:18:28.530312 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.530706 kubelet[2861]: W0913 01:18:28.530332 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.531090 kubelet[2861]: E0913 01:18:28.531069 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.533497 kubelet[2861]: E0913 01:18:28.533476 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.533795 kubelet[2861]: W0913 01:18:28.533562 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.533940 kubelet[2861]: E0913 01:18:28.533867 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.542379 kubelet[2861]: E0913 01:18:28.542105 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.542379 kubelet[2861]: W0913 01:18:28.542192 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.543576 kubelet[2861]: E0913 01:18:28.543166 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.543576 kubelet[2861]: E0913 01:18:28.543436 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.543576 kubelet[2861]: W0913 01:18:28.543449 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.544483 kubelet[2861]: E0913 01:18:28.544087 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.544483 kubelet[2861]: E0913 01:18:28.544318 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.544483 kubelet[2861]: W0913 01:18:28.544331 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.545124 kubelet[2861]: E0913 01:18:28.544805 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.547969 kubelet[2861]: E0913 01:18:28.547378 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.547969 kubelet[2861]: W0913 01:18:28.547399 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.547969 kubelet[2861]: E0913 01:18:28.547450 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.549539 kubelet[2861]: E0913 01:18:28.549403 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.549539 kubelet[2861]: W0913 01:18:28.549431 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.550043 kubelet[2861]: E0913 01:18:28.549930 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.550903 kubelet[2861]: E0913 01:18:28.550165 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.550903 kubelet[2861]: W0913 01:18:28.550840 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.552488 kubelet[2861]: E0913 01:18:28.552255 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.552488 kubelet[2861]: W0913 01:18:28.552403 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.552954 kubelet[2861]: E0913 01:18:28.552824 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.553842 kubelet[2861]: E0913 01:18:28.553696 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.554102 kubelet[2861]: E0913 01:18:28.553943 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.554102 kubelet[2861]: W0913 01:18:28.553966 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.554348 kubelet[2861]: E0913 01:18:28.554218 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.555733 kubelet[2861]: E0913 01:18:28.555283 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.555733 kubelet[2861]: W0913 01:18:28.555302 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.555733 kubelet[2861]: E0913 01:18:28.555346 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.558524 kubelet[2861]: E0913 01:18:28.558349 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.558524 kubelet[2861]: W0913 01:18:28.558375 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.560873 kubelet[2861]: E0913 01:18:28.560485 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.560873 kubelet[2861]: E0913 01:18:28.560699 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.560873 kubelet[2861]: W0913 01:18:28.560722 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.562425 kubelet[2861]: E0913 01:18:28.562079 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.562425 kubelet[2861]: W0913 01:18:28.562113 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.562425 kubelet[2861]: E0913 01:18:28.562205 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.562688 kubelet[2861]: E0913 01:18:28.562669 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.566260 kubelet[2861]: E0913 01:18:28.565732 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.566260 kubelet[2861]: W0913 01:18:28.565753 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.566260 kubelet[2861]: E0913 01:18:28.566008 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.570328 kubelet[2861]: E0913 01:18:28.570302 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.570968 kubelet[2861]: W0913 01:18:28.570940 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.571130 kubelet[2861]: E0913 01:18:28.571109 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.578173 kubelet[2861]: E0913 01:18:28.575149 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.578173 kubelet[2861]: W0913 01:18:28.575169 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.578173 kubelet[2861]: E0913 01:18:28.577888 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.578173 kubelet[2861]: W0913 01:18:28.577961 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.580550 kubelet[2861]: E0913 01:18:28.580529 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.581206 kubelet[2861]: W0913 01:18:28.581044 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.581206 kubelet[2861]: E0913 01:18:28.581086 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.581206 kubelet[2861]: E0913 01:18:28.581160 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:28.581646 kubelet[2861]: E0913 01:18:28.581304 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.582602 kubelet[2861]: E0913 01:18:28.582400 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.582602 kubelet[2861]: W0913 01:18:28.582537 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.582602 kubelet[2861]: E0913 01:18:28.582554 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.616020 kubelet[2861]: E0913 01:18:28.615948 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:28.616020 kubelet[2861]: W0913 01:18:28.616003 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:28.616232 kubelet[2861]: E0913 01:18:28.616042 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:28.633910 containerd[1615]: time="2025-09-13T01:18:28.595013373Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:28.633910 containerd[1615]: time="2025-09-13T01:18:28.595148940Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:28.633910 containerd[1615]: time="2025-09-13T01:18:28.595167677Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:28.633910 containerd[1615]: time="2025-09-13T01:18:28.598784386Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:28.716815 containerd[1615]: time="2025-09-13T01:18:28.716749403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hsczl,Uid:ffc3284e-28f4-4a62-94fb-9bf40109a8ee,Namespace:calico-system,Attempt:0,} returns sandbox id \"668369cdc7df55acff8c7122635a4add3bbdd741f2994edfd0dc642ddf189837\"" Sep 13 01:18:29.935207 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3543879697.mount: Deactivated successfully. Sep 13 01:18:30.235203 kubelet[2861]: E0913 01:18:30.235010 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3" Sep 13 01:18:31.976101 containerd[1615]: time="2025-09-13T01:18:31.975845019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:31.978953 containerd[1615]: time="2025-09-13T01:18:31.978819606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 13 01:18:31.981208 containerd[1615]: time="2025-09-13T01:18:31.981175171Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:31.989184 
containerd[1615]: time="2025-09-13T01:18:31.989126049Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:18:31.991368 containerd[1615]: time="2025-09-13T01:18:31.991173919Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.748382586s"
Sep 13 01:18:31.991368 containerd[1615]: time="2025-09-13T01:18:31.991239048Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 13 01:18:31.996870 containerd[1615]: time="2025-09-13T01:18:31.996557190Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 13 01:18:32.022357 containerd[1615]: time="2025-09-13T01:18:32.019967960Z" level=info msg="CreateContainer within sandbox \"af2711a87baffd29c53de89322524ca46eb4cd3d7fc89724589ca9559ac527c0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 13 01:18:32.053792 containerd[1615]: time="2025-09-13T01:18:32.053470002Z" level=info msg="CreateContainer within sandbox \"af2711a87baffd29c53de89322524ca46eb4cd3d7fc89724589ca9559ac527c0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"4e5960ba02983e9d5368a0b83480dd7ec53262cb54d9adda1b34af58c0d34210\""
Sep 13 01:18:32.061250 containerd[1615]: time="2025-09-13T01:18:32.060887343Z" level=info msg="StartContainer for \"4e5960ba02983e9d5368a0b83480dd7ec53262cb54d9adda1b34af58c0d34210\""
Sep 13 01:18:32.209425 containerd[1615]: time="2025-09-13T01:18:32.209371614Z" level=info msg="StartContainer for \"4e5960ba02983e9d5368a0b83480dd7ec53262cb54d9adda1b34af58c0d34210\" returns successfully"
Sep 13 01:18:32.237287 kubelet[2861]: E0913 01:18:32.236632 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3"
Sep 13 01:18:32.531654 kubelet[2861]: E0913 01:18:32.530797 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.531654 kubelet[2861]: W0913 01:18:32.530841 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.531654 kubelet[2861]: E0913 01:18:32.530879 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.535202 kubelet[2861]: E0913 01:18:32.533058 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.535202 kubelet[2861]: W0913 01:18:32.533079 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.535202 kubelet[2861]: E0913 01:18:32.533116 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.538883 kubelet[2861]: E0913 01:18:32.535499 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.538883 kubelet[2861]: W0913 01:18:32.535515 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.538883 kubelet[2861]: E0913 01:18:32.535533 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.538883 kubelet[2861]: E0913 01:18:32.535822 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.538883 kubelet[2861]: W0913 01:18:32.535837 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.538883 kubelet[2861]: E0913 01:18:32.535853 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.543015 kubelet[2861]: E0913 01:18:32.542409 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.543371 kubelet[2861]: W0913 01:18:32.543255 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.543818 kubelet[2861]: E0913 01:18:32.543718 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.545753 kubelet[2861]: E0913 01:18:32.545439 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.545753 kubelet[2861]: W0913 01:18:32.545459 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.545753 kubelet[2861]: E0913 01:18:32.545477 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.548319 kubelet[2861]: E0913 01:18:32.548067 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.548319 kubelet[2861]: W0913 01:18:32.548094 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.548319 kubelet[2861]: E0913 01:18:32.548149 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.549714 kubelet[2861]: E0913 01:18:32.549692 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.549816 kubelet[2861]: W0913 01:18:32.549797 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.551000 kubelet[2861]: E0913 01:18:32.550130 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.555459 kubelet[2861]: E0913 01:18:32.553911 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.555459 kubelet[2861]: W0913 01:18:32.553961 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.555459 kubelet[2861]: E0913 01:18:32.554015 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.555904 kubelet[2861]: E0913 01:18:32.555754 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.555904 kubelet[2861]: W0913 01:18:32.555775 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.555904 kubelet[2861]: E0913 01:18:32.555791 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.557400 kubelet[2861]: E0913 01:18:32.557052 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.557400 kubelet[2861]: W0913 01:18:32.557072 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.557400 kubelet[2861]: E0913 01:18:32.557090 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.560465 kubelet[2861]: E0913 01:18:32.560405 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.561858 kubelet[2861]: W0913 01:18:32.560879 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.561858 kubelet[2861]: E0913 01:18:32.560912 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.562124 kubelet[2861]: E0913 01:18:32.562087 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.563287 kubelet[2861]: W0913 01:18:32.563117 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.563287 kubelet[2861]: E0913 01:18:32.563148 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.564332 kubelet[2861]: E0913 01:18:32.563760 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.568121 kubelet[2861]: W0913 01:18:32.566806 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.568121 kubelet[2861]: E0913 01:18:32.567688 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.571263 kubelet[2861]: E0913 01:18:32.569106 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.571263 kubelet[2861]: W0913 01:18:32.569124 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.571263 kubelet[2861]: E0913 01:18:32.569141 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.585148 kubelet[2861]: E0913 01:18:32.585101 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.585148 kubelet[2861]: W0913 01:18:32.585131 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.586490 kubelet[2861]: E0913 01:18:32.585179 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.586490 kubelet[2861]: E0913 01:18:32.586236 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.586490 kubelet[2861]: W0913 01:18:32.586251 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.586490 kubelet[2861]: E0913 01:18:32.586286 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.587107 kubelet[2861]: E0913 01:18:32.587054 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.587107 kubelet[2861]: W0913 01:18:32.587076 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.588159 kubelet[2861]: E0913 01:18:32.587769 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.588159 kubelet[2861]: W0913 01:18:32.587784 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.588159 kubelet[2861]: E0913 01:18:32.587800 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.588315 kubelet[2861]: E0913 01:18:32.587219 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.590455 kubelet[2861]: E0913 01:18:32.590225 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.590455 kubelet[2861]: W0913 01:18:32.590262 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.590455 kubelet[2861]: E0913 01:18:32.590290 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.590774 kubelet[2861]: E0913 01:18:32.590721 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.593160 kubelet[2861]: W0913 01:18:32.593006 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.593160 kubelet[2861]: E0913 01:18:32.593065 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.593550 kubelet[2861]: E0913 01:18:32.593533 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.593726 kubelet[2861]: W0913 01:18:32.593625 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.593726 kubelet[2861]: E0913 01:18:32.593670 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.596226 kubelet[2861]: E0913 01:18:32.596009 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.596226 kubelet[2861]: W0913 01:18:32.596030 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.596226 kubelet[2861]: E0913 01:18:32.596069 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.596659 kubelet[2861]: E0913 01:18:32.596641 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.597284 kubelet[2861]: W0913 01:18:32.596935 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.597284 kubelet[2861]: E0913 01:18:32.597006 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.617001 kubelet[2861]: E0913 01:18:32.616313 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.617001 kubelet[2861]: W0913 01:18:32.616438 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.617001 kubelet[2861]: E0913 01:18:32.616525 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.618103 kubelet[2861]: E0913 01:18:32.618080 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.618103 kubelet[2861]: W0913 01:18:32.618103 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.622000 kubelet[2861]: E0913 01:18:32.621056 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.622000 kubelet[2861]: W0913 01:18:32.621082 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.623073 kubelet[2861]: E0913 01:18:32.622972 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.623156 kubelet[2861]: E0913 01:18:32.623113 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.623156 kubelet[2861]: W0913 01:18:32.623128 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.623156 kubelet[2861]: E0913 01:18:32.623145 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.625736 kubelet[2861]: E0913 01:18:32.625699 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.625736 kubelet[2861]: W0913 01:18:32.625724 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.626211 kubelet[2861]: E0913 01:18:32.625749 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.632137 kubelet[2861]: E0913 01:18:32.632054 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.632474 kubelet[2861]: E0913 01:18:32.632209 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.632474 kubelet[2861]: W0913 01:18:32.632227 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.633316 kubelet[2861]: E0913 01:18:32.633291 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.634516 kubelet[2861]: E0913 01:18:32.634489 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.634606 kubelet[2861]: W0913 01:18:32.634510 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.634606 kubelet[2861]: E0913 01:18:32.634555 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.635642 kubelet[2861]: E0913 01:18:32.635621 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.635642 kubelet[2861]: W0913 01:18:32.635640 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.635770 kubelet[2861]: E0913 01:18:32.635657 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:32.637430 kubelet[2861]: E0913 01:18:32.637408 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:32.637430 kubelet[2861]: W0913 01:18:32.637428 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:32.637583 kubelet[2861]: E0913 01:18:32.637446 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.496596 kubelet[2861]: I0913 01:18:33.496434 2861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 13 01:18:33.578742 kubelet[2861]: E0913 01:18:33.577558 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.578742 kubelet[2861]: W0913 01:18:33.577591 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.578742 kubelet[2861]: E0913 01:18:33.577623 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.578742 kubelet[2861]: E0913 01:18:33.578052 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.578742 kubelet[2861]: W0913 01:18:33.578077 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.578742 kubelet[2861]: E0913 01:18:33.578106 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.578742 kubelet[2861]: E0913 01:18:33.578491 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.578742 kubelet[2861]: W0913 01:18:33.578543 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.578742 kubelet[2861]: E0913 01:18:33.578583 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.580570 kubelet[2861]: E0913 01:18:33.580545 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.580675 kubelet[2861]: W0913 01:18:33.580566 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.580675 kubelet[2861]: E0913 01:18:33.580630 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.581964 kubelet[2861]: E0913 01:18:33.581931 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.582052 kubelet[2861]: W0913 01:18:33.581994 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.582052 kubelet[2861]: E0913 01:18:33.582018 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.582352 kubelet[2861]: E0913 01:18:33.582333 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.582419 kubelet[2861]: W0913 01:18:33.582352 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.582419 kubelet[2861]: E0913 01:18:33.582391 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.582704 kubelet[2861]: E0913 01:18:33.582686 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.582770 kubelet[2861]: W0913 01:18:33.582706 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.582770 kubelet[2861]: E0913 01:18:33.582722 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.583828 kubelet[2861]: E0913 01:18:33.583797 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.583828 kubelet[2861]: W0913 01:18:33.583815 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.583954 kubelet[2861]: E0913 01:18:33.583843 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.584159 kubelet[2861]: E0913 01:18:33.584146 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.584247 kubelet[2861]: W0913 01:18:33.584159 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.584247 kubelet[2861]: E0913 01:18:33.584173 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.584573 kubelet[2861]: E0913 01:18:33.584554 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.584573 kubelet[2861]: W0913 01:18:33.584572 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.584704 kubelet[2861]: E0913 01:18:33.584586 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.584907 kubelet[2861]: E0913 01:18:33.584888 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.584967 kubelet[2861]: W0913 01:18:33.584907 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.584967 kubelet[2861]: E0913 01:18:33.584923 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.585810 kubelet[2861]: E0913 01:18:33.585790 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.585810 kubelet[2861]: W0913 01:18:33.585810 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.585910 kubelet[2861]: E0913 01:18:33.585825 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.587286 kubelet[2861]: E0913 01:18:33.586531 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.587286 kubelet[2861]: W0913 01:18:33.586564 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.587286 kubelet[2861]: E0913 01:18:33.586579 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.587286 kubelet[2861]: E0913 01:18:33.587083 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.587517 kubelet[2861]: W0913 01:18:33.587311 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.587517 kubelet[2861]: E0913 01:18:33.587329 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.588224 kubelet[2861]: E0913 01:18:33.588195 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.588224 kubelet[2861]: W0913 01:18:33.588223 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.588334 kubelet[2861]: E0913 01:18:33.588240 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.595730 kubelet[2861]: E0913 01:18:33.595542 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.595730 kubelet[2861]: W0913 01:18:33.595565 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.595730 kubelet[2861]: E0913 01:18:33.595590 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.596294 kubelet[2861]: E0913 01:18:33.596120 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.596294 kubelet[2861]: W0913 01:18:33.596138 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.596294 kubelet[2861]: E0913 01:18:33.596164 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.596620 kubelet[2861]: E0913 01:18:33.596602 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.596736 kubelet[2861]: W0913 01:18:33.596718 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.596974 kubelet[2861]: E0913 01:18:33.596828 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.597349 kubelet[2861]: E0913 01:18:33.597197 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.597349 kubelet[2861]: W0913 01:18:33.597214 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.597349 kubelet[2861]: E0913 01:18:33.597238 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.597651 kubelet[2861]: E0913 01:18:33.597633 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.597755 kubelet[2861]: W0913 01:18:33.597736 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.597868 kubelet[2861]: E0913 01:18:33.597849 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 01:18:33.598626 kubelet[2861]: E0913 01:18:33.598524 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 01:18:33.598703 kubelet[2861]: W0913 01:18:33.598663 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 01:18:33.598703 kubelet[2861]: E0913 01:18:33.598693 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:33.599201 kubelet[2861]: E0913 01:18:33.599145 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.599201 kubelet[2861]: W0913 01:18:33.599181 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.599343 kubelet[2861]: E0913 01:18:33.599261 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:33.599691 kubelet[2861]: E0913 01:18:33.599623 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.599691 kubelet[2861]: W0913 01:18:33.599642 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.600059 kubelet[2861]: E0913 01:18:33.600011 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:33.600385 kubelet[2861]: E0913 01:18:33.600355 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.600547 kubelet[2861]: W0913 01:18:33.600379 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.600646 kubelet[2861]: E0913 01:18:33.600624 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:33.601115 kubelet[2861]: E0913 01:18:33.601095 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.601115 kubelet[2861]: W0913 01:18:33.601114 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.601614 kubelet[2861]: E0913 01:18:33.601584 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:33.601805 kubelet[2861]: E0913 01:18:33.601778 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.601869 kubelet[2861]: W0913 01:18:33.601801 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.601996 kubelet[2861]: E0913 01:18:33.601949 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:33.602670 kubelet[2861]: E0913 01:18:33.602640 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.602745 kubelet[2861]: W0913 01:18:33.602664 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.602850 kubelet[2861]: E0913 01:18:33.602807 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:33.603339 kubelet[2861]: E0913 01:18:33.603319 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.603339 kubelet[2861]: W0913 01:18:33.603339 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.603759 kubelet[2861]: E0913 01:18:33.603377 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:33.604132 kubelet[2861]: E0913 01:18:33.604088 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.604132 kubelet[2861]: W0913 01:18:33.604127 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.605047 kubelet[2861]: E0913 01:18:33.604749 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.605047 kubelet[2861]: W0913 01:18:33.604774 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.605504 kubelet[2861]: E0913 01:18:33.605477 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.605504 kubelet[2861]: W0913 01:18:33.605499 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.605616 kubelet[2861]: E0913 01:18:33.605519 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:33.605616 kubelet[2861]: E0913 01:18:33.605592 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:33.606001 kubelet[2861]: E0913 01:18:33.605954 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:33.606520 kubelet[2861]: E0913 01:18:33.606496 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.606520 kubelet[2861]: W0913 01:18:33.606515 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.606625 kubelet[2861]: E0913 01:18:33.606560 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 01:18:33.606889 kubelet[2861]: E0913 01:18:33.606848 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 01:18:33.606889 kubelet[2861]: W0913 01:18:33.606869 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 01:18:33.606889 kubelet[2861]: E0913 01:18:33.606885 2861 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 01:18:33.825190 containerd[1615]: time="2025-09-13T01:18:33.824461096Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:33.828056 containerd[1615]: time="2025-09-13T01:18:33.827185088Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 13 01:18:33.828308 containerd[1615]: time="2025-09-13T01:18:33.828275219Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:33.848754 containerd[1615]: time="2025-09-13T01:18:33.847670592Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:33.849138 containerd[1615]: time="2025-09-13T01:18:33.848670294Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.852069209s" Sep 13 01:18:33.849138 containerd[1615]: time="2025-09-13T01:18:33.849129401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 13 01:18:33.854017 containerd[1615]: time="2025-09-13T01:18:33.853367044Z" level=info msg="CreateContainer within sandbox \"668369cdc7df55acff8c7122635a4add3bbdd741f2994edfd0dc642ddf189837\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 13 01:18:33.880575 containerd[1615]: time="2025-09-13T01:18:33.880519200Z" level=info msg="CreateContainer within sandbox \"668369cdc7df55acff8c7122635a4add3bbdd741f2994edfd0dc642ddf189837\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"63aa0c468c0b62de4b80ddd4af2bc607c849dcf6422db31857ec94acd71bcbe2\"" Sep 13 01:18:33.882124 containerd[1615]: time="2025-09-13T01:18:33.882059106Z" level=info msg="StartContainer for \"63aa0c468c0b62de4b80ddd4af2bc607c849dcf6422db31857ec94acd71bcbe2\"" Sep 13 01:18:33.990250 containerd[1615]: time="2025-09-13T01:18:33.990089083Z" level=info msg="StartContainer for \"63aa0c468c0b62de4b80ddd4af2bc607c849dcf6422db31857ec94acd71bcbe2\" returns successfully" Sep 13 01:18:34.045341 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-63aa0c468c0b62de4b80ddd4af2bc607c849dcf6422db31857ec94acd71bcbe2-rootfs.mount: Deactivated successfully. 
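The repeated `driver-call.go` failures above occur because the kubelet probes the FlexVolume driver at `/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds` before Calico's `flexvol-driver` init container has installed the binary: the executable is missing, the call produces empty output, and parsing that empty string as JSON yields "unexpected end of JSON input". As a hedged sketch (not the actual Calico driver), the FlexVolume call protocol the kubelet expects can be illustrated with a stub that answers `init` with a JSON status object:

```shell
# Hypothetical stand-in for the missing driver binary at
# /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds.
# The kubelet invokes the driver with a subcommand and parses stdout as JSON;
# an empty reply (e.g. binary not found) is what produces the
# "unexpected end of JSON input" errors in the log above.
flexvol_driver() {
  case "$1" in
    init)
      # "init" must report success; "attach": false tells the kubelet this
      # driver does not implement the attach/detach call family.
      echo '{"status":"Success","capabilities":{"attach":false}}'
      ;;
    *)
      # Unimplemented calls should be reported as "Not supported".
      echo '{"status":"Not supported"}'
      return 1
      ;;
  esac
}

flexvol_driver init
```

Once the `flexvol-driver` container (pulled above as `pod2daemon-flexvol:v3.30.3`) copies the real `uds` binary into place, these probe errors stop on the next plugin scan.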
Sep 13 01:18:34.087086 containerd[1615]: time="2025-09-13T01:18:34.071602143Z" level=info msg="shim disconnected" id=63aa0c468c0b62de4b80ddd4af2bc607c849dcf6422db31857ec94acd71bcbe2 namespace=k8s.io Sep 13 01:18:34.087086 containerd[1615]: time="2025-09-13T01:18:34.086922467Z" level=warning msg="cleaning up after shim disconnected" id=63aa0c468c0b62de4b80ddd4af2bc607c849dcf6422db31857ec94acd71bcbe2 namespace=k8s.io Sep 13 01:18:34.087086 containerd[1615]: time="2025-09-13T01:18:34.086956314Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 13 01:18:34.107949 containerd[1615]: time="2025-09-13T01:18:34.107871257Z" level=warning msg="cleanup warnings time=\"2025-09-13T01:18:34Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Sep 13 01:18:34.235596 kubelet[2861]: E0913 01:18:34.235495 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3" Sep 13 01:18:34.504033 containerd[1615]: time="2025-09-13T01:18:34.502624716Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 13 01:18:34.537206 kubelet[2861]: I0913 01:18:34.536747 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-699b7c5754-79qxb" podStartSLOduration=3.78052633 podStartE2EDuration="7.536713291s" podCreationTimestamp="2025-09-13 01:18:27 +0000 UTC" firstStartedPulling="2025-09-13 01:18:28.238843505 +0000 UTC m=+21.182601142" lastFinishedPulling="2025-09-13 01:18:31.995030453 +0000 UTC m=+24.938788103" observedRunningTime="2025-09-13 01:18:32.616500553 +0000 UTC m=+25.560258207" watchObservedRunningTime="2025-09-13 01:18:34.536713291 +0000 UTC 
m=+27.480470941" Sep 13 01:18:36.236851 kubelet[2861]: E0913 01:18:36.236034 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3" Sep 13 01:18:38.236694 kubelet[2861]: E0913 01:18:38.236619 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3" Sep 13 01:18:39.211182 containerd[1615]: time="2025-09-13T01:18:39.211049600Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:39.213118 containerd[1615]: time="2025-09-13T01:18:39.213056345Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 13 01:18:39.214164 containerd[1615]: time="2025-09-13T01:18:39.214095440Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:39.216550 containerd[1615]: time="2025-09-13T01:18:39.216485119Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:39.217882 containerd[1615]: time="2025-09-13T01:18:39.217687878Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag 
\"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.713223526s" Sep 13 01:18:39.217882 containerd[1615]: time="2025-09-13T01:18:39.217736712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 13 01:18:39.222600 containerd[1615]: time="2025-09-13T01:18:39.222429807Z" level=info msg="CreateContainer within sandbox \"668369cdc7df55acff8c7122635a4add3bbdd741f2994edfd0dc642ddf189837\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 13 01:18:39.256840 containerd[1615]: time="2025-09-13T01:18:39.256777139Z" level=info msg="CreateContainer within sandbox \"668369cdc7df55acff8c7122635a4add3bbdd741f2994edfd0dc642ddf189837\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1cfe771cd644882f6ff02101f5cfae5c71ab85c14edd9a90cb6aade48d6e7795\"" Sep 13 01:18:39.259284 containerd[1615]: time="2025-09-13T01:18:39.257623146Z" level=info msg="StartContainer for \"1cfe771cd644882f6ff02101f5cfae5c71ab85c14edd9a90cb6aade48d6e7795\"" Sep 13 01:18:39.392129 containerd[1615]: time="2025-09-13T01:18:39.392069340Z" level=info msg="StartContainer for \"1cfe771cd644882f6ff02101f5cfae5c71ab85c14edd9a90cb6aade48d6e7795\" returns successfully" Sep 13 01:18:40.173461 kubelet[2861]: I0913 01:18:40.172774 2861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 13 01:18:40.243004 kubelet[2861]: E0913 01:18:40.235456 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3" Sep 13 01:18:40.658890 systemd[1]: 
run-containerd-io.containerd.runtime.v2.task-k8s.io-1cfe771cd644882f6ff02101f5cfae5c71ab85c14edd9a90cb6aade48d6e7795-rootfs.mount: Deactivated successfully. Sep 13 01:18:40.663136 containerd[1615]: time="2025-09-13T01:18:40.657292532Z" level=info msg="shim disconnected" id=1cfe771cd644882f6ff02101f5cfae5c71ab85c14edd9a90cb6aade48d6e7795 namespace=k8s.io Sep 13 01:18:40.663562 containerd[1615]: time="2025-09-13T01:18:40.663161144Z" level=warning msg="cleaning up after shim disconnected" id=1cfe771cd644882f6ff02101f5cfae5c71ab85c14edd9a90cb6aade48d6e7795 namespace=k8s.io Sep 13 01:18:40.663562 containerd[1615]: time="2025-09-13T01:18:40.663187493Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 13 01:18:40.682083 kubelet[2861]: I0913 01:18:40.682046 2861 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 13 01:18:40.749111 kubelet[2861]: I0913 01:18:40.747024 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4zg\" (UniqueName: \"kubernetes.io/projected/b466df6e-e38d-402c-86b5-4227a521ff8c-kube-api-access-lt4zg\") pod \"coredns-7c65d6cfc9-8m5j5\" (UID: \"b466df6e-e38d-402c-86b5-4227a521ff8c\") " pod="kube-system/coredns-7c65d6cfc9-8m5j5" Sep 13 01:18:40.749111 kubelet[2861]: I0913 01:18:40.748811 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b466df6e-e38d-402c-86b5-4227a521ff8c-config-volume\") pod \"coredns-7c65d6cfc9-8m5j5\" (UID: \"b466df6e-e38d-402c-86b5-4227a521ff8c\") " pod="kube-system/coredns-7c65d6cfc9-8m5j5" Sep 13 01:18:40.850895 kubelet[2861]: I0913 01:18:40.849221 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b023c1a-2436-46bd-a84b-0a772635011c-config-volume\") pod \"coredns-7c65d6cfc9-sh5jf\" (UID: 
\"9b023c1a-2436-46bd-a84b-0a772635011c\") " pod="kube-system/coredns-7c65d6cfc9-sh5jf" Sep 13 01:18:40.850895 kubelet[2861]: I0913 01:18:40.849281 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njtrb\" (UniqueName: \"kubernetes.io/projected/9ed5b42c-f6b8-4b84-aec8-d89c53dde12b-kube-api-access-njtrb\") pod \"calico-apiserver-794c96ccd7-9sdrw\" (UID: \"9ed5b42c-f6b8-4b84-aec8-d89c53dde12b\") " pod="calico-apiserver/calico-apiserver-794c96ccd7-9sdrw" Sep 13 01:18:40.850895 kubelet[2861]: I0913 01:18:40.849318 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9kd8\" (UniqueName: \"kubernetes.io/projected/21caebb3-60d2-40b7-849a-7ed3e6b8a990-kube-api-access-j9kd8\") pod \"goldmane-7988f88666-9kgmq\" (UID: \"21caebb3-60d2-40b7-849a-7ed3e6b8a990\") " pod="calico-system/goldmane-7988f88666-9kgmq" Sep 13 01:18:40.850895 kubelet[2861]: I0913 01:18:40.849363 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21caebb3-60d2-40b7-849a-7ed3e6b8a990-goldmane-ca-bundle\") pod \"goldmane-7988f88666-9kgmq\" (UID: \"21caebb3-60d2-40b7-849a-7ed3e6b8a990\") " pod="calico-system/goldmane-7988f88666-9kgmq" Sep 13 01:18:40.850895 kubelet[2861]: I0913 01:18:40.849389 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8b7ec64d-f2d5-435c-a825-32b5603eece4-calico-apiserver-certs\") pod \"calico-apiserver-794c96ccd7-d8jh4\" (UID: \"8b7ec64d-f2d5-435c-a825-32b5603eece4\") " pod="calico-apiserver/calico-apiserver-794c96ccd7-d8jh4" Sep 13 01:18:40.852536 kubelet[2861]: I0913 01:18:40.849417 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/9ed5b42c-f6b8-4b84-aec8-d89c53dde12b-calico-apiserver-certs\") pod \"calico-apiserver-794c96ccd7-9sdrw\" (UID: \"9ed5b42c-f6b8-4b84-aec8-d89c53dde12b\") " pod="calico-apiserver/calico-apiserver-794c96ccd7-9sdrw" Sep 13 01:18:40.852536 kubelet[2861]: I0913 01:18:40.849455 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2777e08f-0ded-483c-bbf8-a0c22f090d0b-tigera-ca-bundle\") pod \"calico-kube-controllers-7f77cfb5c7-t6k5v\" (UID: \"2777e08f-0ded-483c-bbf8-a0c22f090d0b\") " pod="calico-system/calico-kube-controllers-7f77cfb5c7-t6k5v" Sep 13 01:18:40.852536 kubelet[2861]: I0913 01:18:40.849490 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzk28\" (UniqueName: \"kubernetes.io/projected/2777e08f-0ded-483c-bbf8-a0c22f090d0b-kube-api-access-dzk28\") pod \"calico-kube-controllers-7f77cfb5c7-t6k5v\" (UID: \"2777e08f-0ded-483c-bbf8-a0c22f090d0b\") " pod="calico-system/calico-kube-controllers-7f77cfb5c7-t6k5v" Sep 13 01:18:40.852536 kubelet[2861]: I0913 01:18:40.849528 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/21caebb3-60d2-40b7-849a-7ed3e6b8a990-goldmane-key-pair\") pod \"goldmane-7988f88666-9kgmq\" (UID: \"21caebb3-60d2-40b7-849a-7ed3e6b8a990\") " pod="calico-system/goldmane-7988f88666-9kgmq" Sep 13 01:18:40.852536 kubelet[2861]: I0913 01:18:40.849571 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhvs\" (UniqueName: \"kubernetes.io/projected/e4f2ee13-e591-4f60-a716-9a0d904df543-kube-api-access-7vhvs\") pod \"whisker-84f64fc779-xslb8\" (UID: \"e4f2ee13-e591-4f60-a716-9a0d904df543\") " pod="calico-system/whisker-84f64fc779-xslb8" Sep 13 01:18:40.853296 kubelet[2861]: I0913 01:18:40.849644 
2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5fsf\" (UniqueName: \"kubernetes.io/projected/9b023c1a-2436-46bd-a84b-0a772635011c-kube-api-access-m5fsf\") pod \"coredns-7c65d6cfc9-sh5jf\" (UID: \"9b023c1a-2436-46bd-a84b-0a772635011c\") " pod="kube-system/coredns-7c65d6cfc9-sh5jf" Sep 13 01:18:40.853296 kubelet[2861]: I0913 01:18:40.849683 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f2ee13-e591-4f60-a716-9a0d904df543-whisker-ca-bundle\") pod \"whisker-84f64fc779-xslb8\" (UID: \"e4f2ee13-e591-4f60-a716-9a0d904df543\") " pod="calico-system/whisker-84f64fc779-xslb8" Sep 13 01:18:40.853296 kubelet[2861]: I0913 01:18:40.849708 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21caebb3-60d2-40b7-849a-7ed3e6b8a990-config\") pod \"goldmane-7988f88666-9kgmq\" (UID: \"21caebb3-60d2-40b7-849a-7ed3e6b8a990\") " pod="calico-system/goldmane-7988f88666-9kgmq" Sep 13 01:18:40.853296 kubelet[2861]: I0913 01:18:40.849736 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brg4j\" (UniqueName: \"kubernetes.io/projected/8b7ec64d-f2d5-435c-a825-32b5603eece4-kube-api-access-brg4j\") pod \"calico-apiserver-794c96ccd7-d8jh4\" (UID: \"8b7ec64d-f2d5-435c-a825-32b5603eece4\") " pod="calico-apiserver/calico-apiserver-794c96ccd7-d8jh4" Sep 13 01:18:40.853296 kubelet[2861]: I0913 01:18:40.849796 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e4f2ee13-e591-4f60-a716-9a0d904df543-whisker-backend-key-pair\") pod \"whisker-84f64fc779-xslb8\" (UID: \"e4f2ee13-e591-4f60-a716-9a0d904df543\") " 
pod="calico-system/whisker-84f64fc779-xslb8" Sep 13 01:18:41.054023 containerd[1615]: time="2025-09-13T01:18:41.052570573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8m5j5,Uid:b466df6e-e38d-402c-86b5-4227a521ff8c,Namespace:kube-system,Attempt:0,}" Sep 13 01:18:41.097243 containerd[1615]: time="2025-09-13T01:18:41.096724867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-794c96ccd7-9sdrw,Uid:9ed5b42c-f6b8-4b84-aec8-d89c53dde12b,Namespace:calico-apiserver,Attempt:0,}" Sep 13 01:18:41.097243 containerd[1615]: time="2025-09-13T01:18:41.096996848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-sh5jf,Uid:9b023c1a-2436-46bd-a84b-0a772635011c,Namespace:kube-system,Attempt:0,}" Sep 13 01:18:41.097243 containerd[1615]: time="2025-09-13T01:18:41.097196053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f77cfb5c7-t6k5v,Uid:2777e08f-0ded-483c-bbf8-a0c22f090d0b,Namespace:calico-system,Attempt:0,}" Sep 13 01:18:41.097579 containerd[1615]: time="2025-09-13T01:18:41.097406192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84f64fc779-xslb8,Uid:e4f2ee13-e591-4f60-a716-9a0d904df543,Namespace:calico-system,Attempt:0,}" Sep 13 01:18:41.116866 containerd[1615]: time="2025-09-13T01:18:41.116158392Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-9kgmq,Uid:21caebb3-60d2-40b7-849a-7ed3e6b8a990,Namespace:calico-system,Attempt:0,}" Sep 13 01:18:41.119608 containerd[1615]: time="2025-09-13T01:18:41.119521105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-794c96ccd7-d8jh4,Uid:8b7ec64d-f2d5-435c-a825-32b5603eece4,Namespace:calico-apiserver,Attempt:0,}" Sep 13 01:18:41.522178 containerd[1615]: time="2025-09-13T01:18:41.522035283Z" level=error msg="Failed to destroy network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.536669 containerd[1615]: time="2025-09-13T01:18:41.536601901Z" level=error msg="Failed to destroy network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.539804 containerd[1615]: time="2025-09-13T01:18:41.539754774Z" level=error msg="encountered an error cleaning up failed sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.542080 containerd[1615]: time="2025-09-13T01:18:41.541072650Z" level=error msg="encountered an error cleaning up failed sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.552282 containerd[1615]: time="2025-09-13T01:18:41.552219116Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-sh5jf,Uid:9b023c1a-2436-46bd-a84b-0a772635011c,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 
13 01:18:41.553613 containerd[1615]: time="2025-09-13T01:18:41.553572135Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8m5j5,Uid:b466df6e-e38d-402c-86b5-4227a521ff8c,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.565431 containerd[1615]: time="2025-09-13T01:18:41.565374395Z" level=error msg="Failed to destroy network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.568446 kubelet[2861]: E0913 01:18:41.568381 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.568953 kubelet[2861]: E0913 01:18:41.568535 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-sh5jf" Sep 13 01:18:41.568953 kubelet[2861]: E0913 01:18:41.568581 2861 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-sh5jf" Sep 13 01:18:41.568953 kubelet[2861]: E0913 01:18:41.568650 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-sh5jf_kube-system(9b023c1a-2436-46bd-a84b-0a772635011c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-sh5jf_kube-system(9b023c1a-2436-46bd-a84b-0a772635011c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-sh5jf" podUID="9b023c1a-2436-46bd-a84b-0a772635011c" Sep 13 01:18:41.572974 kubelet[2861]: E0913 01:18:41.569217 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.572974 kubelet[2861]: E0913 01:18:41.569252 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8m5j5" Sep 13 01:18:41.572974 kubelet[2861]: E0913 01:18:41.569277 2861 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8m5j5" Sep 13 01:18:41.573379 kubelet[2861]: E0913 01:18:41.569311 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-8m5j5_kube-system(b466df6e-e38d-402c-86b5-4227a521ff8c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-8m5j5_kube-system(b466df6e-e38d-402c-86b5-4227a521ff8c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8m5j5" podUID="b466df6e-e38d-402c-86b5-4227a521ff8c" Sep 13 01:18:41.574291 containerd[1615]: time="2025-09-13T01:18:41.573622267Z" level=error msg="encountered an error cleaning up failed sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.577699 containerd[1615]: time="2025-09-13T01:18:41.576869265Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-794c96ccd7-d8jh4,Uid:8b7ec64d-f2d5-435c-a825-32b5603eece4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.577853 kubelet[2861]: E0913 01:18:41.577072 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.577853 kubelet[2861]: E0913 01:18:41.577127 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-794c96ccd7-d8jh4" Sep 13 01:18:41.577853 kubelet[2861]: E0913 01:18:41.577155 2861 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-794c96ccd7-d8jh4" Sep 13 01:18:41.578045 kubelet[2861]: E0913 01:18:41.577206 2861 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-794c96ccd7-d8jh4_calico-apiserver(8b7ec64d-f2d5-435c-a825-32b5603eece4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-794c96ccd7-d8jh4_calico-apiserver(8b7ec64d-f2d5-435c-a825-32b5603eece4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-794c96ccd7-d8jh4" podUID="8b7ec64d-f2d5-435c-a825-32b5603eece4" Sep 13 01:18:41.593707 kubelet[2861]: I0913 01:18:41.593466 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:18:41.599499 containerd[1615]: time="2025-09-13T01:18:41.598247344Z" level=error msg="Failed to destroy network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.603005 containerd[1615]: time="2025-09-13T01:18:41.601932079Z" level=error msg="encountered an error cleaning up failed sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.603198 containerd[1615]: time="2025-09-13T01:18:41.603160288Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-794c96ccd7-9sdrw,Uid:9ed5b42c-f6b8-4b84-aec8-d89c53dde12b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.622423 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:18:41.617807 systemd-resolved[1511]: Under memory pressure, flushing caches. Sep 13 01:18:41.617920 systemd-resolved[1511]: Flushed all caches. Sep 13 01:18:41.636922 kubelet[2861]: E0913 01:18:41.636406 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.636922 kubelet[2861]: E0913 01:18:41.636506 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-794c96ccd7-9sdrw" Sep 13 01:18:41.636922 kubelet[2861]: E0913 01:18:41.636547 2861 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-794c96ccd7-9sdrw" Sep 13 01:18:41.639587 kubelet[2861]: E0913 01:18:41.636612 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-794c96ccd7-9sdrw_calico-apiserver(9ed5b42c-f6b8-4b84-aec8-d89c53dde12b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-794c96ccd7-9sdrw_calico-apiserver(9ed5b42c-f6b8-4b84-aec8-d89c53dde12b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-794c96ccd7-9sdrw" podUID="9ed5b42c-f6b8-4b84-aec8-d89c53dde12b" Sep 13 01:18:41.640894 containerd[1615]: time="2025-09-13T01:18:41.640713669Z" level=error msg="Failed to destroy network for sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.641967 containerd[1615]: time="2025-09-13T01:18:41.641932441Z" level=error msg="encountered an error cleaning up failed sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.644725 kubelet[2861]: I0913 01:18:41.644698 2861 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Sep 13 01:18:41.645048 containerd[1615]: time="2025-09-13T01:18:41.644902755Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-9kgmq,Uid:21caebb3-60d2-40b7-849a-7ed3e6b8a990,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.646130 containerd[1615]: time="2025-09-13T01:18:41.645495480Z" level=info msg="StopPodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\"" Sep 13 01:18:41.646822 kubelet[2861]: E0913 01:18:41.646331 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.646822 kubelet[2861]: E0913 01:18:41.646403 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-9kgmq" Sep 13 01:18:41.646822 kubelet[2861]: E0913 01:18:41.646436 2861 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-9kgmq" Sep 13 01:18:41.647025 kubelet[2861]: E0913 01:18:41.646486 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-9kgmq_calico-system(21caebb3-60d2-40b7-849a-7ed3e6b8a990)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-9kgmq_calico-system(21caebb3-60d2-40b7-849a-7ed3e6b8a990)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-9kgmq" podUID="21caebb3-60d2-40b7-849a-7ed3e6b8a990" Sep 13 01:18:41.647127 containerd[1615]: time="2025-09-13T01:18:41.646974741Z" level=info msg="Ensure that sandbox 7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1 in task-service has been cleanup successfully" Sep 13 01:18:41.647563 containerd[1615]: time="2025-09-13T01:18:41.647202413Z" level=info msg="StopPodSandbox for \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\"" Sep 13 01:18:41.647563 containerd[1615]: time="2025-09-13T01:18:41.647393869Z" level=info msg="Ensure that sandbox a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3 in task-service has been cleanup successfully" Sep 13 01:18:41.662081 containerd[1615]: time="2025-09-13T01:18:41.657942569Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 13 01:18:41.691853 containerd[1615]: time="2025-09-13T01:18:41.691801602Z" level=error msg="Failed to destroy 
network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.697054 containerd[1615]: time="2025-09-13T01:18:41.696318941Z" level=error msg="encountered an error cleaning up failed sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.697054 containerd[1615]: time="2025-09-13T01:18:41.696383935Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84f64fc779-xslb8,Uid:e4f2ee13-e591-4f60-a716-9a0d904df543,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.701468 kubelet[2861]: E0913 01:18:41.697090 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.701468 kubelet[2861]: E0913 01:18:41.697195 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-84f64fc779-xslb8" Sep 13 01:18:41.701468 kubelet[2861]: E0913 01:18:41.697235 2861 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-84f64fc779-xslb8" Sep 13 01:18:41.700548 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a-shm.mount: Deactivated successfully. Sep 13 01:18:41.702220 kubelet[2861]: E0913 01:18:41.698408 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-84f64fc779-xslb8_calico-system(e4f2ee13-e591-4f60-a716-9a0d904df543)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-84f64fc779-xslb8_calico-system(e4f2ee13-e591-4f60-a716-9a0d904df543)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-84f64fc779-xslb8" podUID="e4f2ee13-e591-4f60-a716-9a0d904df543" Sep 13 01:18:41.720787 containerd[1615]: time="2025-09-13T01:18:41.720579429Z" level=error msg="Failed to destroy network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.724096 containerd[1615]: time="2025-09-13T01:18:41.723800692Z" level=error msg="encountered an error cleaning up failed sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.728537 containerd[1615]: time="2025-09-13T01:18:41.724190346Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f77cfb5c7-t6k5v,Uid:2777e08f-0ded-483c-bbf8-a0c22f090d0b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.728677 kubelet[2861]: E0913 01:18:41.724730 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.728677 kubelet[2861]: E0913 01:18:41.725563 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-7f77cfb5c7-t6k5v" Sep 13 01:18:41.728677 kubelet[2861]: E0913 01:18:41.725646 2861 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7f77cfb5c7-t6k5v" Sep 13 01:18:41.724572 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914-shm.mount: Deactivated successfully. Sep 13 01:18:41.729051 kubelet[2861]: E0913 01:18:41.725747 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7f77cfb5c7-t6k5v_calico-system(2777e08f-0ded-483c-bbf8-a0c22f090d0b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7f77cfb5c7-t6k5v_calico-system(2777e08f-0ded-483c-bbf8-a0c22f090d0b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7f77cfb5c7-t6k5v" podUID="2777e08f-0ded-483c-bbf8-a0c22f090d0b" Sep 13 01:18:41.756154 containerd[1615]: time="2025-09-13T01:18:41.753968415Z" level=error msg="StopPodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\" failed" error="failed to destroy network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.756503 kubelet[2861]: E0913 01:18:41.754491 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Sep 13 01:18:41.756503 kubelet[2861]: E0913 01:18:41.754646 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"} Sep 13 01:18:41.756503 kubelet[2861]: E0913 01:18:41.754768 2861 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"b466df6e-e38d-402c-86b5-4227a521ff8c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:18:41.756503 kubelet[2861]: E0913 01:18:41.754818 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"b466df6e-e38d-402c-86b5-4227a521ff8c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-7c65d6cfc9-8m5j5" podUID="b466df6e-e38d-402c-86b5-4227a521ff8c" Sep 13 01:18:41.778437 containerd[1615]: time="2025-09-13T01:18:41.778225698Z" level=error msg="StopPodSandbox for \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\" failed" error="failed to destroy network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:41.778898 kubelet[2861]: E0913 01:18:41.778662 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:18:41.778898 kubelet[2861]: E0913 01:18:41.778749 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3"} Sep 13 01:18:41.778898 kubelet[2861]: E0913 01:18:41.778822 2861 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9b023c1a-2436-46bd-a84b-0a772635011c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:18:41.778898 kubelet[2861]: E0913 01:18:41.778860 2861 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"KillPodSandbox\" for \"9b023c1a-2436-46bd-a84b-0a772635011c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-sh5jf" podUID="9b023c1a-2436-46bd-a84b-0a772635011c" Sep 13 01:18:42.242737 containerd[1615]: time="2025-09-13T01:18:42.242044763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4zbwf,Uid:3a06b43f-3090-4270-8011-d28f2c555ca3,Namespace:calico-system,Attempt:0,}" Sep 13 01:18:42.335049 containerd[1615]: time="2025-09-13T01:18:42.334838153Z" level=error msg="Failed to destroy network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.335858 containerd[1615]: time="2025-09-13T01:18:42.335640408Z" level=error msg="encountered an error cleaning up failed sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.335858 containerd[1615]: time="2025-09-13T01:18:42.335708161Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4zbwf,Uid:3a06b43f-3090-4270-8011-d28f2c555ca3,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.336146 kubelet[2861]: E0913 01:18:42.336034 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.336248 kubelet[2861]: E0913 01:18:42.336169 2861 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-4zbwf" Sep 13 01:18:42.336248 kubelet[2861]: E0913 01:18:42.336212 2861 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-4zbwf" Sep 13 01:18:42.336361 kubelet[2861]: E0913 01:18:42.336277 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-4zbwf_calico-system(3a06b43f-3090-4270-8011-d28f2c555ca3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-4zbwf_calico-system(3a06b43f-3090-4270-8011-d28f2c555ca3)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3" Sep 13 01:18:42.339737 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235-shm.mount: Deactivated successfully. Sep 13 01:18:42.648732 kubelet[2861]: I0913 01:18:42.648695 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:18:42.650448 containerd[1615]: time="2025-09-13T01:18:42.650227749Z" level=info msg="StopPodSandbox for \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\"" Sep 13 01:18:42.651409 containerd[1615]: time="2025-09-13T01:18:42.651382627Z" level=info msg="Ensure that sandbox db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589 in task-service has been cleanup successfully" Sep 13 01:18:42.652659 kubelet[2861]: I0913 01:18:42.652621 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:18:42.655973 containerd[1615]: time="2025-09-13T01:18:42.654599555Z" level=info msg="StopPodSandbox for \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\"" Sep 13 01:18:42.658020 containerd[1615]: time="2025-09-13T01:18:42.657914969Z" level=info msg="Ensure that sandbox 134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a in task-service has been cleanup successfully" Sep 13 01:18:42.660250 kubelet[2861]: I0913 01:18:42.659060 2861 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:18:42.661923 containerd[1615]: time="2025-09-13T01:18:42.661510301Z" level=info msg="StopPodSandbox for \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\"" Sep 13 01:18:42.662087 containerd[1615]: time="2025-09-13T01:18:42.662045207Z" level=info msg="Ensure that sandbox 99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524 in task-service has been cleanup successfully" Sep 13 01:18:42.666168 kubelet[2861]: I0913 01:18:42.666097 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:18:42.669272 containerd[1615]: time="2025-09-13T01:18:42.668148553Z" level=info msg="StopPodSandbox for \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\"" Sep 13 01:18:42.669272 containerd[1615]: time="2025-09-13T01:18:42.668626817Z" level=info msg="Ensure that sandbox ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235 in task-service has been cleanup successfully" Sep 13 01:18:42.674244 kubelet[2861]: I0913 01:18:42.673885 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:18:42.675632 containerd[1615]: time="2025-09-13T01:18:42.675593022Z" level=info msg="StopPodSandbox for \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\"" Sep 13 01:18:42.678293 containerd[1615]: time="2025-09-13T01:18:42.678202825Z" level=info msg="Ensure that sandbox 879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914 in task-service has been cleanup successfully" Sep 13 01:18:42.678858 kubelet[2861]: I0913 01:18:42.678833 2861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:18:42.679634 containerd[1615]: 
time="2025-09-13T01:18:42.679592120Z" level=info msg="StopPodSandbox for \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\"" Sep 13 01:18:42.679862 containerd[1615]: time="2025-09-13T01:18:42.679825650Z" level=info msg="Ensure that sandbox b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3 in task-service has been cleanup successfully" Sep 13 01:18:42.790666 containerd[1615]: time="2025-09-13T01:18:42.790469747Z" level=error msg="StopPodSandbox for \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\" failed" error="failed to destroy network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.791925 kubelet[2861]: E0913 01:18:42.790805 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:18:42.791925 kubelet[2861]: E0913 01:18:42.790869 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524"} Sep 13 01:18:42.791925 kubelet[2861]: E0913 01:18:42.790918 2861 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ed5b42c-f6b8-4b84-aec8-d89c53dde12b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:18:42.791925 kubelet[2861]: E0913 01:18:42.790952 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ed5b42c-f6b8-4b84-aec8-d89c53dde12b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-794c96ccd7-9sdrw" podUID="9ed5b42c-f6b8-4b84-aec8-d89c53dde12b" Sep 13 01:18:42.802439 containerd[1615]: time="2025-09-13T01:18:42.802373260Z" level=error msg="StopPodSandbox for \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\" failed" error="failed to destroy network for sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.803179 containerd[1615]: time="2025-09-13T01:18:42.802621857Z" level=error msg="StopPodSandbox for \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\" failed" error="failed to destroy network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.803533 kubelet[2861]: E0913 01:18:42.803463 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for 
sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:18:42.803634 kubelet[2861]: E0913 01:18:42.803549 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3"} Sep 13 01:18:42.803634 kubelet[2861]: E0913 01:18:42.803605 2861 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8b7ec64d-f2d5-435c-a825-32b5603eece4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:18:42.804114 kubelet[2861]: E0913 01:18:42.803638 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8b7ec64d-f2d5-435c-a825-32b5603eece4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-794c96ccd7-d8jh4" podUID="8b7ec64d-f2d5-435c-a825-32b5603eece4" Sep 13 01:18:42.804114 kubelet[2861]: E0913 01:18:42.803816 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:18:42.804114 kubelet[2861]: E0913 01:18:42.803852 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589"} Sep 13 01:18:42.804114 kubelet[2861]: E0913 01:18:42.803886 2861 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"21caebb3-60d2-40b7-849a-7ed3e6b8a990\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:18:42.804350 kubelet[2861]: E0913 01:18:42.803917 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"21caebb3-60d2-40b7-849a-7ed3e6b8a990\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-9kgmq" podUID="21caebb3-60d2-40b7-849a-7ed3e6b8a990" Sep 13 01:18:42.816361 containerd[1615]: time="2025-09-13T01:18:42.816085196Z" level=error msg="StopPodSandbox for \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\" failed" error="failed to destroy network for 
sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.816523 kubelet[2861]: E0913 01:18:42.816358 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:18:42.816523 kubelet[2861]: E0913 01:18:42.816434 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914"} Sep 13 01:18:42.816523 kubelet[2861]: E0913 01:18:42.816484 2861 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"2777e08f-0ded-483c-bbf8-a0c22f090d0b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:18:42.816523 kubelet[2861]: E0913 01:18:42.816513 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"2777e08f-0ded-483c-bbf8-a0c22f090d0b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7f77cfb5c7-t6k5v" podUID="2777e08f-0ded-483c-bbf8-a0c22f090d0b" Sep 13 01:18:42.820029 containerd[1615]: time="2025-09-13T01:18:42.819810934Z" level=error msg="StopPodSandbox for \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\" failed" error="failed to destroy network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.820378 kubelet[2861]: E0913 01:18:42.820328 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:18:42.820378 kubelet[2861]: E0913 01:18:42.820390 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a"} Sep 13 01:18:42.820697 kubelet[2861]: E0913 01:18:42.820426 2861 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e4f2ee13-e591-4f60-a716-9a0d904df543\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/\"" Sep 13 01:18:42.820697 kubelet[2861]: E0913 01:18:42.820463 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e4f2ee13-e591-4f60-a716-9a0d904df543\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-84f64fc779-xslb8" podUID="e4f2ee13-e591-4f60-a716-9a0d904df543" Sep 13 01:18:42.821475 containerd[1615]: time="2025-09-13T01:18:42.821047485Z" level=error msg="StopPodSandbox for \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\" failed" error="failed to destroy network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:42.821574 kubelet[2861]: E0913 01:18:42.821324 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:18:42.821574 kubelet[2861]: E0913 01:18:42.821367 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235"} Sep 13 01:18:42.821574 kubelet[2861]: E0913 01:18:42.821404 2861 
kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3a06b43f-3090-4270-8011-d28f2c555ca3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:18:42.821574 kubelet[2861]: E0913 01:18:42.821432 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3a06b43f-3090-4270-8011-d28f2c555ca3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-4zbwf" podUID="3a06b43f-3090-4270-8011-d28f2c555ca3" Sep 13 01:18:51.578793 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount526955471.mount: Deactivated successfully. Sep 13 01:18:51.665541 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:18:51.654437 systemd-resolved[1511]: Under memory pressure, flushing caches. Sep 13 01:18:51.654475 systemd-resolved[1511]: Flushed all caches. 
Sep 13 01:18:51.675645 containerd[1615]: time="2025-09-13T01:18:51.675358710Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 10.015321826s" Sep 13 01:18:51.676413 containerd[1615]: time="2025-09-13T01:18:51.676374843Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 13 01:18:51.676971 containerd[1615]: time="2025-09-13T01:18:51.662512328Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 13 01:18:51.677103 containerd[1615]: time="2025-09-13T01:18:51.677052902Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:51.694740 containerd[1615]: time="2025-09-13T01:18:51.694675538Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:51.696717 containerd[1615]: time="2025-09-13T01:18:51.695966399Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:51.769213 containerd[1615]: time="2025-09-13T01:18:51.769112559Z" level=info msg="CreateContainer within sandbox \"668369cdc7df55acff8c7122635a4add3bbdd741f2994edfd0dc642ddf189837\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 13 01:18:51.849485 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1614551765.mount: 
Deactivated successfully. Sep 13 01:18:51.857555 containerd[1615]: time="2025-09-13T01:18:51.856431531Z" level=info msg="CreateContainer within sandbox \"668369cdc7df55acff8c7122635a4add3bbdd741f2994edfd0dc642ddf189837\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"10133b9378be78c00a3efbaa0e2d019e1616b66844159d9fdc2e7da331a814e3\"" Sep 13 01:18:51.860458 containerd[1615]: time="2025-09-13T01:18:51.859745688Z" level=info msg="StartContainer for \"10133b9378be78c00a3efbaa0e2d019e1616b66844159d9fdc2e7da331a814e3\"" Sep 13 01:18:52.176094 containerd[1615]: time="2025-09-13T01:18:52.176019398Z" level=info msg="StartContainer for \"10133b9378be78c00a3efbaa0e2d019e1616b66844159d9fdc2e7da331a814e3\" returns successfully" Sep 13 01:18:52.237383 containerd[1615]: time="2025-09-13T01:18:52.237327445Z" level=info msg="StopPodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\"" Sep 13 01:18:52.347491 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 13 01:18:52.349416 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 13 01:18:52.371691 containerd[1615]: time="2025-09-13T01:18:52.371611778Z" level=error msg="StopPodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\" failed" error="failed to destroy network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 01:18:52.372637 kubelet[2861]: E0913 01:18:52.372567 2861 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Sep 13 01:18:52.373543 kubelet[2861]: E0913 01:18:52.372658 2861 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"} Sep 13 01:18:52.373543 kubelet[2861]: E0913 01:18:52.372728 2861 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"b466df6e-e38d-402c-86b5-4227a521ff8c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 01:18:52.374348 kubelet[2861]: E0913 01:18:52.374292 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"b466df6e-e38d-402c-86b5-4227a521ff8c\" 
with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8m5j5" podUID="b466df6e-e38d-402c-86b5-4227a521ff8c" Sep 13 01:18:52.641004 containerd[1615]: time="2025-09-13T01:18:52.639077765Z" level=info msg="StopPodSandbox for \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\"" Sep 13 01:18:52.808106 kubelet[2861]: I0913 01:18:52.787797 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hsczl" podStartSLOduration=1.792190699 podStartE2EDuration="24.766095679s" podCreationTimestamp="2025-09-13 01:18:28 +0000 UTC" firstStartedPulling="2025-09-13 01:18:28.721429477 +0000 UTC m=+21.665187111" lastFinishedPulling="2025-09-13 01:18:51.695334462 +0000 UTC m=+44.639092091" observedRunningTime="2025-09-13 01:18:52.765566069 +0000 UTC m=+45.709323723" watchObservedRunningTime="2025-09-13 01:18:52.766095679 +0000 UTC m=+45.709853315" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:52.895 [INFO][4106] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:52.896 [INFO][4106] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" iface="eth0" netns="/var/run/netns/cni-9f50d3c4-2f1a-a17d-644b-99cec9d0d2e9" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:52.909 [INFO][4106] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" iface="eth0" netns="/var/run/netns/cni-9f50d3c4-2f1a-a17d-644b-99cec9d0d2e9" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:52.913 [INFO][4106] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" iface="eth0" netns="/var/run/netns/cni-9f50d3c4-2f1a-a17d-644b-99cec9d0d2e9" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:52.913 [INFO][4106] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:52.913 [INFO][4106] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:53.218 [INFO][4131] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:53.222 [INFO][4131] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:53.223 [INFO][4131] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:53.237 [WARNING][4131] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:53.237 [INFO][4131] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:53.243 [INFO][4131] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:53.247671 containerd[1615]: 2025-09-13 01:18:53.245 [INFO][4106] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:18:53.250722 containerd[1615]: time="2025-09-13T01:18:53.248236308Z" level=info msg="TearDown network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\" successfully" Sep 13 01:18:53.250722 containerd[1615]: time="2025-09-13T01:18:53.248289589Z" level=info msg="StopPodSandbox for \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\" returns successfully" Sep 13 01:18:53.255445 systemd[1]: run-netns-cni\x2d9f50d3c4\x2d2f1a\x2da17d\x2d644b\x2d99cec9d0d2e9.mount: Deactivated successfully. 
Sep 13 01:18:53.329617 kubelet[2861]: I0913 01:18:53.329482 2861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vhvs\" (UniqueName: \"kubernetes.io/projected/e4f2ee13-e591-4f60-a716-9a0d904df543-kube-api-access-7vhvs\") pod \"e4f2ee13-e591-4f60-a716-9a0d904df543\" (UID: \"e4f2ee13-e591-4f60-a716-9a0d904df543\") " Sep 13 01:18:53.332656 kubelet[2861]: I0913 01:18:53.332247 2861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f2ee13-e591-4f60-a716-9a0d904df543-whisker-ca-bundle\") pod \"e4f2ee13-e591-4f60-a716-9a0d904df543\" (UID: \"e4f2ee13-e591-4f60-a716-9a0d904df543\") " Sep 13 01:18:53.332656 kubelet[2861]: I0913 01:18:53.332310 2861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e4f2ee13-e591-4f60-a716-9a0d904df543-whisker-backend-key-pair\") pod \"e4f2ee13-e591-4f60-a716-9a0d904df543\" (UID: \"e4f2ee13-e591-4f60-a716-9a0d904df543\") " Sep 13 01:18:53.348684 kubelet[2861]: I0913 01:18:53.347386 2861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f2ee13-e591-4f60-a716-9a0d904df543-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e4f2ee13-e591-4f60-a716-9a0d904df543" (UID: "e4f2ee13-e591-4f60-a716-9a0d904df543"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 13 01:18:53.355146 kubelet[2861]: I0913 01:18:53.354938 2861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4f2ee13-e591-4f60-a716-9a0d904df543-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e4f2ee13-e591-4f60-a716-9a0d904df543" (UID: "e4f2ee13-e591-4f60-a716-9a0d904df543"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 13 01:18:53.357411 systemd[1]: var-lib-kubelet-pods-e4f2ee13\x2de591\x2d4f60\x2da716\x2d9a0d904df543-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d7vhvs.mount: Deactivated successfully. Sep 13 01:18:53.358315 kubelet[2861]: I0913 01:18:53.358057 2861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4f2ee13-e591-4f60-a716-9a0d904df543-kube-api-access-7vhvs" (OuterVolumeSpecName: "kube-api-access-7vhvs") pod "e4f2ee13-e591-4f60-a716-9a0d904df543" (UID: "e4f2ee13-e591-4f60-a716-9a0d904df543"). InnerVolumeSpecName "kube-api-access-7vhvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 13 01:18:53.362890 systemd[1]: var-lib-kubelet-pods-e4f2ee13\x2de591\x2d4f60\x2da716\x2d9a0d904df543-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 13 01:18:53.432772 kubelet[2861]: I0913 01:18:53.432702 2861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vhvs\" (UniqueName: \"kubernetes.io/projected/e4f2ee13-e591-4f60-a716-9a0d904df543-kube-api-access-7vhvs\") on node \"srv-qlx5f.gb1.brightbox.com\" DevicePath \"\"" Sep 13 01:18:53.432772 kubelet[2861]: I0913 01:18:53.432772 2861 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f2ee13-e591-4f60-a716-9a0d904df543-whisker-ca-bundle\") on node \"srv-qlx5f.gb1.brightbox.com\" DevicePath \"\"" Sep 13 01:18:53.434140 kubelet[2861]: I0913 01:18:53.432789 2861 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e4f2ee13-e591-4f60-a716-9a0d904df543-whisker-backend-key-pair\") on node \"srv-qlx5f.gb1.brightbox.com\" DevicePath \"\"" Sep 13 01:18:53.702420 systemd-resolved[1511]: Under memory pressure, flushing caches. Sep 13 01:18:53.702432 systemd-resolved[1511]: Flushed all caches. 
Sep 13 01:18:53.706046 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:18:54.038840 kubelet[2861]: I0913 01:18:54.038497 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r85g9\" (UniqueName: \"kubernetes.io/projected/3d47803f-2bbc-46fa-b77a-0fc18c62dfb3-kube-api-access-r85g9\") pod \"whisker-69479b54b-nqqrf\" (UID: \"3d47803f-2bbc-46fa-b77a-0fc18c62dfb3\") " pod="calico-system/whisker-69479b54b-nqqrf" Sep 13 01:18:54.038840 kubelet[2861]: I0913 01:18:54.038591 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d47803f-2bbc-46fa-b77a-0fc18c62dfb3-whisker-ca-bundle\") pod \"whisker-69479b54b-nqqrf\" (UID: \"3d47803f-2bbc-46fa-b77a-0fc18c62dfb3\") " pod="calico-system/whisker-69479b54b-nqqrf" Sep 13 01:18:54.038840 kubelet[2861]: I0913 01:18:54.038636 2861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3d47803f-2bbc-46fa-b77a-0fc18c62dfb3-whisker-backend-key-pair\") pod \"whisker-69479b54b-nqqrf\" (UID: \"3d47803f-2bbc-46fa-b77a-0fc18c62dfb3\") " pod="calico-system/whisker-69479b54b-nqqrf" Sep 13 01:18:54.222203 containerd[1615]: time="2025-09-13T01:18:54.221477377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69479b54b-nqqrf,Uid:3d47803f-2bbc-46fa-b77a-0fc18c62dfb3,Namespace:calico-system,Attempt:0,}" Sep 13 01:18:54.239395 containerd[1615]: time="2025-09-13T01:18:54.237962162Z" level=info msg="StopPodSandbox for \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\"" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.520 [INFO][4259] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 
01:18:54.523 [INFO][4259] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" iface="eth0" netns="/var/run/netns/cni-e74d4767-1406-6f99-a213-f18bf58dc192" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.525 [INFO][4259] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" iface="eth0" netns="/var/run/netns/cni-e74d4767-1406-6f99-a213-f18bf58dc192" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.530 [INFO][4259] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" iface="eth0" netns="/var/run/netns/cni-e74d4767-1406-6f99-a213-f18bf58dc192" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.530 [INFO][4259] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.530 [INFO][4259] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.667 [INFO][4296] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.667 [INFO][4296] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.667 [INFO][4296] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.690 [WARNING][4296] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.690 [INFO][4296] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.695 [INFO][4296] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:54.717837 containerd[1615]: 2025-09-13 01:18:54.709 [INFO][4259] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:18:54.725491 containerd[1615]: time="2025-09-13T01:18:54.718765223Z" level=info msg="TearDown network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\" successfully" Sep 13 01:18:54.725491 containerd[1615]: time="2025-09-13T01:18:54.718808129Z" level=info msg="StopPodSandbox for \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\" returns successfully" Sep 13 01:18:54.725491 containerd[1615]: time="2025-09-13T01:18:54.724035535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4zbwf,Uid:3a06b43f-3090-4270-8011-d28f2c555ca3,Namespace:calico-system,Attempt:1,}" Sep 13 01:18:54.775645 systemd[1]: run-netns-cni\x2de74d4767\x2d1406\x2d6f99\x2da213\x2df18bf58dc192.mount: Deactivated successfully. 
Sep 13 01:18:54.956540 systemd-networkd[1265]: cali4587859a3ad: Link UP Sep 13 01:18:54.970116 systemd-networkd[1265]: cali4587859a3ad: Gained carrier Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.453 [INFO][4242] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.504 [INFO][4242] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0 whisker-69479b54b- calico-system 3d47803f-2bbc-46fa-b77a-0fc18c62dfb3 899 0 2025-09-13 01:18:53 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:69479b54b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-qlx5f.gb1.brightbox.com whisker-69479b54b-nqqrf eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali4587859a3ad [] [] }} ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Namespace="calico-system" Pod="whisker-69479b54b-nqqrf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.504 [INFO][4242] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Namespace="calico-system" Pod="whisker-69479b54b-nqqrf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.662 [INFO][4291] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" HandleID="k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" Sep 13 01:18:55.079781 
containerd[1615]: 2025-09-13 01:18:54.678 [INFO][4291] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" HandleID="k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a8220), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-qlx5f.gb1.brightbox.com", "pod":"whisker-69479b54b-nqqrf", "timestamp":"2025-09-13 01:18:54.662615683 +0000 UTC"}, Hostname:"srv-qlx5f.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.678 [INFO][4291] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.695 [INFO][4291] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.697 [INFO][4291] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-qlx5f.gb1.brightbox.com' Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.731 [INFO][4291] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.768 [INFO][4291] ipam/ipam.go 394: Looking up existing affinities for host host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.810 [INFO][4291] ipam/ipam.go 511: Trying affinity for 192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.828 [INFO][4291] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.838 [INFO][4291] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.838 [INFO][4291] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.855 [INFO][4291] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.875 [INFO][4291] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.890 [INFO][4291] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.193/26] block=192.168.98.192/26 handle="k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.890 [INFO][4291] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.193/26] handle="k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.890 [INFO][4291] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:55.079781 containerd[1615]: 2025-09-13 01:18:54.892 [INFO][4291] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.193/26] IPv6=[] ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" HandleID="k8s-pod-network.5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" Sep 13 01:18:55.087633 containerd[1615]: 2025-09-13 01:18:54.906 [INFO][4242] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Namespace="calico-system" Pod="whisker-69479b54b-nqqrf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0", GenerateName:"whisker-69479b54b-", Namespace:"calico-system", SelfLink:"", UID:"3d47803f-2bbc-46fa-b77a-0fc18c62dfb3", ResourceVersion:"899", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"69479b54b", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"", Pod:"whisker-69479b54b-nqqrf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali4587859a3ad", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:55.087633 containerd[1615]: 2025-09-13 01:18:54.906 [INFO][4242] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.193/32] ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Namespace="calico-system" Pod="whisker-69479b54b-nqqrf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" Sep 13 01:18:55.087633 containerd[1615]: 2025-09-13 01:18:54.906 [INFO][4242] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4587859a3ad ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Namespace="calico-system" Pod="whisker-69479b54b-nqqrf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" Sep 13 01:18:55.087633 containerd[1615]: 2025-09-13 01:18:54.979 [INFO][4242] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Namespace="calico-system" Pod="whisker-69479b54b-nqqrf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" Sep 13 01:18:55.087633 containerd[1615]: 2025-09-13 01:18:54.992 [INFO][4242] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Namespace="calico-system" Pod="whisker-69479b54b-nqqrf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0", GenerateName:"whisker-69479b54b-", Namespace:"calico-system", SelfLink:"", UID:"3d47803f-2bbc-46fa-b77a-0fc18c62dfb3", ResourceVersion:"899", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"69479b54b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c", Pod:"whisker-69479b54b-nqqrf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali4587859a3ad", MAC:"06:aa:90:1c:bd:d3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:55.087633 containerd[1615]: 2025-09-13 01:18:55.051 [INFO][4242] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c" Namespace="calico-system" Pod="whisker-69479b54b-nqqrf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--69479b54b--nqqrf-eth0" Sep 13 01:18:55.206687 containerd[1615]: time="2025-09-13T01:18:55.202474160Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:55.206687 containerd[1615]: time="2025-09-13T01:18:55.202619047Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:55.206687 containerd[1615]: time="2025-09-13T01:18:55.202643268Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:55.206687 containerd[1615]: time="2025-09-13T01:18:55.202847563Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:55.272616 kubelet[2861]: I0913 01:18:55.272486 2861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4f2ee13-e591-4f60-a716-9a0d904df543" path="/var/lib/kubelet/pods/e4f2ee13-e591-4f60-a716-9a0d904df543/volumes" Sep 13 01:18:55.285150 systemd-networkd[1265]: calib437f1c4d48: Link UP Sep 13 01:18:55.286382 systemd-networkd[1265]: calib437f1c4d48: Gained carrier Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:54.867 [INFO][4304] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:54.915 [INFO][4304] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0 csi-node-driver- calico-system 3a06b43f-3090-4270-8011-d28f2c555ca3 905 0 2025-09-13 01:18:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver 
controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-qlx5f.gb1.brightbox.com csi-node-driver-4zbwf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib437f1c4d48 [] [] }} ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Namespace="calico-system" Pod="csi-node-driver-4zbwf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:54.915 [INFO][4304] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Namespace="calico-system" Pod="csi-node-driver-4zbwf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.037 [INFO][4318] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" HandleID="k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.038 [INFO][4318] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" HandleID="k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5ba0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-qlx5f.gb1.brightbox.com", "pod":"csi-node-driver-4zbwf", "timestamp":"2025-09-13 01:18:55.037530494 +0000 UTC"}, 
Hostname:"srv-qlx5f.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.038 [INFO][4318] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.038 [INFO][4318] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.038 [INFO][4318] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-qlx5f.gb1.brightbox.com' Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.056 [INFO][4318] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.090 [INFO][4318] ipam/ipam.go 394: Looking up existing affinities for host host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.108 [INFO][4318] ipam/ipam.go 511: Trying affinity for 192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.114 [INFO][4318] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.118 [INFO][4318] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.120 [INFO][4318] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 
2025-09-13 01:18:55.132 [INFO][4318] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202 Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.150 [INFO][4318] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.256 [INFO][4318] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.194/26] block=192.168.98.192/26 handle="k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.258 [INFO][4318] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.194/26] handle="k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.258 [INFO][4318] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 13 01:18:55.341443 containerd[1615]: 2025-09-13 01:18:55.258 [INFO][4318] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.194/26] IPv6=[] ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" HandleID="k8s-pod-network.f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:55.342489 containerd[1615]: 2025-09-13 01:18:55.270 [INFO][4304] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Namespace="calico-system" Pod="csi-node-driver-4zbwf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3a06b43f-3090-4270-8011-d28f2c555ca3", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-4zbwf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.194/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib437f1c4d48", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:55.342489 containerd[1615]: 2025-09-13 01:18:55.274 [INFO][4304] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.194/32] ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Namespace="calico-system" Pod="csi-node-driver-4zbwf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:55.342489 containerd[1615]: 2025-09-13 01:18:55.274 [INFO][4304] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib437f1c4d48 ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Namespace="calico-system" Pod="csi-node-driver-4zbwf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:55.342489 containerd[1615]: 2025-09-13 01:18:55.308 [INFO][4304] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Namespace="calico-system" Pod="csi-node-driver-4zbwf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:55.342489 containerd[1615]: 2025-09-13 01:18:55.309 [INFO][4304] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Namespace="calico-system" Pod="csi-node-driver-4zbwf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3a06b43f-3090-4270-8011-d28f2c555ca3", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202", Pod:"csi-node-driver-4zbwf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib437f1c4d48", MAC:"46:e3:e6:b8:13:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:55.342489 containerd[1615]: 2025-09-13 01:18:55.338 [INFO][4304] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202" Namespace="calico-system" Pod="csi-node-driver-4zbwf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:18:55.386335 containerd[1615]: time="2025-09-13T01:18:55.383335058Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:55.386335 containerd[1615]: time="2025-09-13T01:18:55.383462479Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:55.386335 containerd[1615]: time="2025-09-13T01:18:55.383513295Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:55.386335 containerd[1615]: time="2025-09-13T01:18:55.385443621Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:55.612743 containerd[1615]: time="2025-09-13T01:18:55.611953118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69479b54b-nqqrf,Uid:3d47803f-2bbc-46fa-b77a-0fc18c62dfb3,Namespace:calico-system,Attempt:0,} returns sandbox id \"5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c\"" Sep 13 01:18:55.618671 containerd[1615]: time="2025-09-13T01:18:55.618601293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4zbwf,Uid:3a06b43f-3090-4270-8011-d28f2c555ca3,Namespace:calico-system,Attempt:1,} returns sandbox id \"f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202\"" Sep 13 01:18:55.620069 containerd[1615]: time="2025-09-13T01:18:55.620026505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 13 01:18:55.887174 kernel: bpftool[4458]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 13 01:18:56.238233 containerd[1615]: time="2025-09-13T01:18:56.237874674Z" level=info msg="StopPodSandbox for \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\"" Sep 13 01:18:56.323435 systemd-networkd[1265]: vxlan.calico: Link UP Sep 13 01:18:56.323446 systemd-networkd[1265]: vxlan.calico: Gained carrier Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.384 [INFO][4481] 
cni-plugin/k8s.go 640: Cleaning up netns ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.389 [INFO][4481] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" iface="eth0" netns="/var/run/netns/cni-7738228a-9aba-76a4-7300-4e08ca28fe4f" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.389 [INFO][4481] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" iface="eth0" netns="/var/run/netns/cni-7738228a-9aba-76a4-7300-4e08ca28fe4f" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.389 [INFO][4481] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" iface="eth0" netns="/var/run/netns/cni-7738228a-9aba-76a4-7300-4e08ca28fe4f" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.390 [INFO][4481] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.390 [INFO][4481] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.460 [INFO][4502] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.466 [INFO][4502] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.466 [INFO][4502] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.487 [WARNING][4502] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.487 [INFO][4502] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.490 [INFO][4502] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:56.507340 containerd[1615]: 2025-09-13 01:18:56.499 [INFO][4481] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:18:56.521057 containerd[1615]: time="2025-09-13T01:18:56.514862340Z" level=info msg="TearDown network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\" successfully" Sep 13 01:18:56.521057 containerd[1615]: time="2025-09-13T01:18:56.514930756Z" level=info msg="StopPodSandbox for \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\" returns successfully" Sep 13 01:18:56.515829 systemd[1]: run-netns-cni\x2d7738228a\x2d9aba\x2d76a4\x2d7300\x2d4e08ca28fe4f.mount: Deactivated successfully. 
Sep 13 01:18:56.523144 containerd[1615]: time="2025-09-13T01:18:56.522814598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-sh5jf,Uid:9b023c1a-2436-46bd-a84b-0a772635011c,Namespace:kube-system,Attempt:1,}" Sep 13 01:18:56.754893 systemd-networkd[1265]: caliecaa0ce9c23: Link UP Sep 13 01:18:56.756083 systemd-networkd[1265]: caliecaa0ce9c23: Gained carrier Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.607 [INFO][4529] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0 coredns-7c65d6cfc9- kube-system 9b023c1a-2436-46bd-a84b-0a772635011c 920 0 2025-09-13 01:18:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-qlx5f.gb1.brightbox.com coredns-7c65d6cfc9-sh5jf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliecaa0ce9c23 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sh5jf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.607 [INFO][4529] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sh5jf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.657 [INFO][4541] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" 
HandleID="k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.658 [INFO][4541] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" HandleID="k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-qlx5f.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-sh5jf", "timestamp":"2025-09-13 01:18:56.657657529 +0000 UTC"}, Hostname:"srv-qlx5f.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.658 [INFO][4541] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.658 [INFO][4541] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.658 [INFO][4541] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-qlx5f.gb1.brightbox.com' Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.670 [INFO][4541] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.689 [INFO][4541] ipam/ipam.go 394: Looking up existing affinities for host host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.702 [INFO][4541] ipam/ipam.go 511: Trying affinity for 192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.707 [INFO][4541] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.711 [INFO][4541] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.711 [INFO][4541] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.716 [INFO][4541] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0 Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.726 [INFO][4541] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.735 [INFO][4541] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.195/26] block=192.168.98.192/26 handle="k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.735 [INFO][4541] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.195/26] handle="k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.735 [INFO][4541] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:56.784917 containerd[1615]: 2025-09-13 01:18:56.735 [INFO][4541] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.195/26] IPv6=[] ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" HandleID="k8s-pod-network.cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.787449 containerd[1615]: 2025-09-13 01:18:56.740 [INFO][4529] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sh5jf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"9b023c1a-2436-46bd-a84b-0a772635011c", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-sh5jf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliecaa0ce9c23", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:56.787449 containerd[1615]: 2025-09-13 01:18:56.741 [INFO][4529] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.195/32] ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sh5jf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.787449 containerd[1615]: 2025-09-13 01:18:56.741 [INFO][4529] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliecaa0ce9c23 ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sh5jf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.787449 containerd[1615]: 
2025-09-13 01:18:56.757 [INFO][4529] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sh5jf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.787449 containerd[1615]: 2025-09-13 01:18:56.758 [INFO][4529] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sh5jf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"9b023c1a-2436-46bd-a84b-0a772635011c", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0", Pod:"coredns-7c65d6cfc9-sh5jf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"caliecaa0ce9c23", MAC:"86:e7:a8:e7:47:b7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:56.787449 containerd[1615]: 2025-09-13 01:18:56.778 [INFO][4529] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sh5jf" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:18:56.818893 containerd[1615]: time="2025-09-13T01:18:56.818462125Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:56.818893 containerd[1615]: time="2025-09-13T01:18:56.818586033Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:56.818893 containerd[1615]: time="2025-09-13T01:18:56.818609803Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:56.818893 containerd[1615]: time="2025-09-13T01:18:56.818757690Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:56.840173 systemd-networkd[1265]: cali4587859a3ad: Gained IPv6LL Sep 13 01:18:56.954106 containerd[1615]: time="2025-09-13T01:18:56.953867560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-sh5jf,Uid:9b023c1a-2436-46bd-a84b-0a772635011c,Namespace:kube-system,Attempt:1,} returns sandbox id \"cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0\"" Sep 13 01:18:56.964366 containerd[1615]: time="2025-09-13T01:18:56.964307443Z" level=info msg="CreateContainer within sandbox \"cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 01:18:57.009523 containerd[1615]: time="2025-09-13T01:18:57.009347432Z" level=info msg="CreateContainer within sandbox \"cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f0041bdafd3ede2afaf7fed0ef3c0ad7a4892a984a7a64993732473addeff2c6\"" Sep 13 01:18:57.012612 containerd[1615]: time="2025-09-13T01:18:57.012569675Z" level=info msg="StartContainer for \"f0041bdafd3ede2afaf7fed0ef3c0ad7a4892a984a7a64993732473addeff2c6\"" Sep 13 01:18:57.114788 containerd[1615]: time="2025-09-13T01:18:57.114564894Z" level=info msg="StartContainer for \"f0041bdafd3ede2afaf7fed0ef3c0ad7a4892a984a7a64993732473addeff2c6\" returns successfully" Sep 13 01:18:57.243577 containerd[1615]: time="2025-09-13T01:18:57.243370461Z" level=info msg="StopPodSandbox for \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\"" Sep 13 01:18:57.249585 containerd[1615]: time="2025-09-13T01:18:57.244825385Z" level=info msg="StopPodSandbox for \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\"" Sep 13 01:18:57.250821 containerd[1615]: time="2025-09-13T01:18:57.244867148Z" level=info msg="StopPodSandbox for \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\"" Sep 13 
01:18:57.251304 containerd[1615]: time="2025-09-13T01:18:57.244898064Z" level=info msg="StopPodSandbox for \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\"" Sep 13 01:18:57.289548 systemd-networkd[1265]: calib437f1c4d48: Gained IPv6LL Sep 13 01:18:57.422119 systemd-networkd[1265]: vxlan.calico: Gained IPv6LL Sep 13 01:18:57.521435 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount416282867.mount: Deactivated successfully. Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.581 [INFO][4700] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.581 [INFO][4700] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" iface="eth0" netns="/var/run/netns/cni-43cb58ad-f39a-fb9f-deb2-0e5c99e18e9e" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.582 [INFO][4700] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" iface="eth0" netns="/var/run/netns/cni-43cb58ad-f39a-fb9f-deb2-0e5c99e18e9e" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.583 [INFO][4700] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" iface="eth0" netns="/var/run/netns/cni-43cb58ad-f39a-fb9f-deb2-0e5c99e18e9e" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.583 [INFO][4700] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.583 [INFO][4700] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.684 [INFO][4734] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.685 [INFO][4734] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.685 [INFO][4734] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.699 [WARNING][4734] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.699 [INFO][4734] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.709 [INFO][4734] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:57.723327 containerd[1615]: 2025-09-13 01:18:57.716 [INFO][4700] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:18:57.723327 containerd[1615]: time="2025-09-13T01:18:57.721892647Z" level=info msg="TearDown network for sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\" successfully" Sep 13 01:18:57.723327 containerd[1615]: time="2025-09-13T01:18:57.721929652Z" level=info msg="StopPodSandbox for \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\" returns successfully" Sep 13 01:18:57.731277 containerd[1615]: time="2025-09-13T01:18:57.725037319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-9kgmq,Uid:21caebb3-60d2-40b7-849a-7ed3e6b8a990,Namespace:calico-system,Attempt:1,}" Sep 13 01:18:57.729110 systemd[1]: run-netns-cni\x2d43cb58ad\x2df39a\x2dfb9f\x2ddeb2\x2d0e5c99e18e9e.mount: Deactivated successfully. 
Sep 13 01:18:57.825008 kubelet[2861]: I0913 01:18:57.822175 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-sh5jf" podStartSLOduration=44.822138758 podStartE2EDuration="44.822138758s" podCreationTimestamp="2025-09-13 01:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:18:57.818085097 +0000 UTC m=+50.761842756" watchObservedRunningTime="2025-09-13 01:18:57.822138758 +0000 UTC m=+50.765896402" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.620 [INFO][4705] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.621 [INFO][4705] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" iface="eth0" netns="/var/run/netns/cni-469b7509-b71d-97a6-f831-2d65669a4219" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.622 [INFO][4705] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" iface="eth0" netns="/var/run/netns/cni-469b7509-b71d-97a6-f831-2d65669a4219" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.624 [INFO][4705] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" iface="eth0" netns="/var/run/netns/cni-469b7509-b71d-97a6-f831-2d65669a4219" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.624 [INFO][4705] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.624 [INFO][4705] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.837 [INFO][4743] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.837 [INFO][4743] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.837 [INFO][4743] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.884 [WARNING][4743] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.885 [INFO][4743] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.893 [INFO][4743] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:57.924906 containerd[1615]: 2025-09-13 01:18:57.916 [INFO][4705] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:18:57.924906 containerd[1615]: time="2025-09-13T01:18:57.922168950Z" level=info msg="TearDown network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\" successfully" Sep 13 01:18:57.924906 containerd[1615]: time="2025-09-13T01:18:57.922211895Z" level=info msg="StopPodSandbox for \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\" returns successfully" Sep 13 01:18:57.927946 systemd[1]: run-netns-cni\x2d469b7509\x2db71d\x2d97a6\x2df831\x2d2d65669a4219.mount: Deactivated successfully. 
Sep 13 01:18:57.943355 containerd[1615]: time="2025-09-13T01:18:57.943180904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-794c96ccd7-d8jh4,Uid:8b7ec64d-f2d5-435c-a825-32b5603eece4,Namespace:calico-apiserver,Attempt:1,}" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.620 [INFO][4707] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.625 [INFO][4707] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" iface="eth0" netns="/var/run/netns/cni-f0c25cf4-472d-b777-5cd2-988df38eb7ac" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.626 [INFO][4707] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" iface="eth0" netns="/var/run/netns/cni-f0c25cf4-472d-b777-5cd2-988df38eb7ac" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.627 [INFO][4707] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" iface="eth0" netns="/var/run/netns/cni-f0c25cf4-472d-b777-5cd2-988df38eb7ac" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.627 [INFO][4707] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.627 [INFO][4707] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.844 [INFO][4744] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.845 [INFO][4744] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.893 [INFO][4744] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.922 [WARNING][4744] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.923 [INFO][4744] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.933 [INFO][4744] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:57.951026 containerd[1615]: 2025-09-13 01:18:57.945 [INFO][4707] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:18:57.951026 containerd[1615]: time="2025-09-13T01:18:57.951279514Z" level=info msg="TearDown network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\" successfully" Sep 13 01:18:57.951026 containerd[1615]: time="2025-09-13T01:18:57.951344771Z" level=info msg="StopPodSandbox for \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\" returns successfully" Sep 13 01:18:57.959538 containerd[1615]: time="2025-09-13T01:18:57.957249476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f77cfb5c7-t6k5v,Uid:2777e08f-0ded-483c-bbf8-a0c22f090d0b,Namespace:calico-system,Attempt:1,}" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.615 [INFO][4706] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.620 [INFO][4706] cni-plugin/dataplane_linux.go 559: 
Deleting workload's device in netns. ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" iface="eth0" netns="/var/run/netns/cni-97b74750-7bb3-30cc-bb1d-54a7b8980107" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.621 [INFO][4706] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" iface="eth0" netns="/var/run/netns/cni-97b74750-7bb3-30cc-bb1d-54a7b8980107" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.626 [INFO][4706] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" iface="eth0" netns="/var/run/netns/cni-97b74750-7bb3-30cc-bb1d-54a7b8980107" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.626 [INFO][4706] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.626 [INFO][4706] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.896 [INFO][4741] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.907 [INFO][4741] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.933 [INFO][4741] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.967 [WARNING][4741] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.967 [INFO][4741] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.975 [INFO][4741] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:58.005107 containerd[1615]: 2025-09-13 01:18:57.987 [INFO][4706] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:18:58.005107 containerd[1615]: time="2025-09-13T01:18:58.001543861Z" level=info msg="TearDown network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\" successfully" Sep 13 01:18:58.005107 containerd[1615]: time="2025-09-13T01:18:58.001589739Z" level=info msg="StopPodSandbox for \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\" returns successfully" Sep 13 01:18:58.010588 containerd[1615]: time="2025-09-13T01:18:58.010110657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-794c96ccd7-9sdrw,Uid:9ed5b42c-f6b8-4b84-aec8-d89c53dde12b,Namespace:calico-apiserver,Attempt:1,}" Sep 13 01:18:58.247282 systemd-networkd[1265]: caliecaa0ce9c23: Gained IPv6LL Sep 13 01:18:58.352802 systemd-networkd[1265]: cali2b4a2839923: Link UP Sep 13 01:18:58.355637 systemd-networkd[1265]: cali2b4a2839923: Gained carrier Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.180 [INFO][4808] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0 calico-apiserver-794c96ccd7- calico-apiserver 9ed5b42c-f6b8-4b84-aec8-d89c53dde12b 934 0 2025-09-13 01:18:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:794c96ccd7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-qlx5f.gb1.brightbox.com calico-apiserver-794c96ccd7-9sdrw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2b4a2839923 [] [] }} ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-9sdrw" 
WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.181 [INFO][4808] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-9sdrw" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.252 [INFO][4831] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" HandleID="k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.252 [INFO][4831] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" HandleID="k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad4a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-qlx5f.gb1.brightbox.com", "pod":"calico-apiserver-794c96ccd7-9sdrw", "timestamp":"2025-09-13 01:18:58.252314143 +0000 UTC"}, Hostname:"srv-qlx5f.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.252 [INFO][4831] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.252 [INFO][4831] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.252 [INFO][4831] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-qlx5f.gb1.brightbox.com' Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.268 [INFO][4831] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.280 [INFO][4831] ipam/ipam.go 394: Looking up existing affinities for host host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.290 [INFO][4831] ipam/ipam.go 511: Trying affinity for 192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.296 [INFO][4831] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.300 [INFO][4831] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.300 [INFO][4831] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.303 [INFO][4831] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.311 [INFO][4831] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.192/26 
handle="k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.325 [INFO][4831] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.196/26] block=192.168.98.192/26 handle="k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.325 [INFO][4831] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.196/26] handle="k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.325 [INFO][4831] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:58.429232 containerd[1615]: 2025-09-13 01:18:58.325 [INFO][4831] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.196/26] IPv6=[] ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" HandleID="k8s-pod-network.0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.432585 containerd[1615]: 2025-09-13 01:18:58.335 [INFO][4808] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-9sdrw" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0", GenerateName:"calico-apiserver-794c96ccd7-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ed5b42c-f6b8-4b84-aec8-d89c53dde12b", ResourceVersion:"934", 
Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"794c96ccd7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-794c96ccd7-9sdrw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2b4a2839923", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:58.432585 containerd[1615]: 2025-09-13 01:18:58.336 [INFO][4808] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.196/32] ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-9sdrw" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.432585 containerd[1615]: 2025-09-13 01:18:58.336 [INFO][4808] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b4a2839923 ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-9sdrw" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.432585 containerd[1615]: 2025-09-13 
01:18:58.358 [INFO][4808] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-9sdrw" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.432585 containerd[1615]: 2025-09-13 01:18:58.360 [INFO][4808] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-9sdrw" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0", GenerateName:"calico-apiserver-794c96ccd7-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ed5b42c-f6b8-4b84-aec8-d89c53dde12b", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"794c96ccd7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f", Pod:"calico-apiserver-794c96ccd7-9sdrw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.98.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2b4a2839923", MAC:"16:b5:53:56:0f:60", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:58.432585 containerd[1615]: 2025-09-13 01:18:58.398 [INFO][4808] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-9sdrw" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:18:58.448924 containerd[1615]: time="2025-09-13T01:18:58.448870256Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:58.449853 containerd[1615]: time="2025-09-13T01:18:58.449321544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 13 01:18:58.474902 containerd[1615]: time="2025-09-13T01:18:58.474818986Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:58.481120 containerd[1615]: time="2025-09-13T01:18:58.481085346Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:18:58.485833 containerd[1615]: time="2025-09-13T01:18:58.485607594Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.865520927s" Sep 13 01:18:58.485833 containerd[1615]: time="2025-09-13T01:18:58.485652710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 13 01:18:58.503830 containerd[1615]: time="2025-09-13T01:18:58.503637028Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 13 01:18:58.512850 containerd[1615]: time="2025-09-13T01:18:58.512674002Z" level=info msg="CreateContainer within sandbox \"5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 13 01:18:58.531515 systemd[1]: run-netns-cni\x2df0c25cf4\x2d472d\x2db777\x2d5cd2\x2d988df38eb7ac.mount: Deactivated successfully. Sep 13 01:18:58.531735 systemd[1]: run-netns-cni\x2d97b74750\x2d7bb3\x2d30cc\x2dbb1d\x2d54a7b8980107.mount: Deactivated successfully. Sep 13 01:18:58.566109 containerd[1615]: time="2025-09-13T01:18:58.560662422Z" level=info msg="CreateContainer within sandbox \"5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"5c939778f74ac79edf768b897692d5958ad8f999b91a7c86e280c63d9529780b\"" Sep 13 01:18:58.574371 containerd[1615]: time="2025-09-13T01:18:58.573888052Z" level=info msg="StartContainer for \"5c939778f74ac79edf768b897692d5958ad8f999b91a7c86e280c63d9529780b\"" Sep 13 01:18:58.590210 systemd-networkd[1265]: cali7890faf4299: Link UP Sep 13 01:18:58.590605 systemd-networkd[1265]: cali7890faf4299: Gained carrier Sep 13 01:18:58.608677 containerd[1615]: time="2025-09-13T01:18:58.595481668Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:58.608677 containerd[1615]: time="2025-09-13T01:18:58.595585851Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:58.608677 containerd[1615]: time="2025-09-13T01:18:58.595605402Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:58.608677 containerd[1615]: time="2025-09-13T01:18:58.595841263Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.070 [INFO][4761] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0 goldmane-7988f88666- calico-system 21caebb3-60d2-40b7-849a-7ed3e6b8a990 931 0 2025-09-13 01:18:27 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-qlx5f.gb1.brightbox.com goldmane-7988f88666-9kgmq eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7890faf4299 [] [] }} ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Namespace="calico-system" Pod="goldmane-7988f88666-9kgmq" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.070 [INFO][4761] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Namespace="calico-system" Pod="goldmane-7988f88666-9kgmq" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 
01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.256 [INFO][4818] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" HandleID="k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.257 [INFO][4818] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" HandleID="k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000322410), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-qlx5f.gb1.brightbox.com", "pod":"goldmane-7988f88666-9kgmq", "timestamp":"2025-09-13 01:18:58.256657606 +0000 UTC"}, Hostname:"srv-qlx5f.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.257 [INFO][4818] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.326 [INFO][4818] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.334 [INFO][4818] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-qlx5f.gb1.brightbox.com' Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.383 [INFO][4818] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.421 [INFO][4818] ipam/ipam.go 394: Looking up existing affinities for host host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.437 [INFO][4818] ipam/ipam.go 511: Trying affinity for 192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.441 [INFO][4818] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.446 [INFO][4818] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.446 [INFO][4818] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.449 [INFO][4818] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3 Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.471 [INFO][4818] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.486 [INFO][4818] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.197/26] block=192.168.98.192/26 handle="k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.495 [INFO][4818] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.197/26] handle="k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.499 [INFO][4818] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:58.649567 containerd[1615]: 2025-09-13 01:18:58.499 [INFO][4818] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.197/26] IPv6=[] ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" HandleID="k8s-pod-network.b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:58.651227 containerd[1615]: 2025-09-13 01:18:58.544 [INFO][4761] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Namespace="calico-system" Pod="goldmane-7988f88666-9kgmq" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"21caebb3-60d2-40b7-849a-7ed3e6b8a990", ResourceVersion:"931", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", 
"pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-7988f88666-9kgmq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7890faf4299", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:58.651227 containerd[1615]: 2025-09-13 01:18:58.545 [INFO][4761] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.197/32] ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Namespace="calico-system" Pod="goldmane-7988f88666-9kgmq" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:58.651227 containerd[1615]: 2025-09-13 01:18:58.545 [INFO][4761] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7890faf4299 ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Namespace="calico-system" Pod="goldmane-7988f88666-9kgmq" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:58.651227 containerd[1615]: 2025-09-13 01:18:58.597 [INFO][4761] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Namespace="calico-system" Pod="goldmane-7988f88666-9kgmq" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:58.651227 containerd[1615]: 
2025-09-13 01:18:58.601 [INFO][4761] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Namespace="calico-system" Pod="goldmane-7988f88666-9kgmq" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"21caebb3-60d2-40b7-849a-7ed3e6b8a990", ResourceVersion:"931", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3", Pod:"goldmane-7988f88666-9kgmq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7890faf4299", MAC:"8e:08:1a:42:33:40", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:58.651227 containerd[1615]: 2025-09-13 01:18:58.638 [INFO][4761] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3" Namespace="calico-system" Pod="goldmane-7988f88666-9kgmq" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:18:58.726443 systemd-networkd[1265]: calie6d1875203a: Link UP Sep 13 01:18:58.731252 systemd-networkd[1265]: calie6d1875203a: Gained carrier Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.199 [INFO][4798] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0 calico-kube-controllers-7f77cfb5c7- calico-system 2777e08f-0ded-483c-bbf8-a0c22f090d0b 932 0 2025-09-13 01:18:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7f77cfb5c7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-qlx5f.gb1.brightbox.com calico-kube-controllers-7f77cfb5c7-t6k5v eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie6d1875203a [] [] }} ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Namespace="calico-system" Pod="calico-kube-controllers-7f77cfb5c7-t6k5v" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.199 [INFO][4798] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Namespace="calico-system" Pod="calico-kube-controllers-7f77cfb5c7-t6k5v" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.399 [INFO][4839] ipam/ipam_plugin.go 225: Calico CNI IPAM request count 
IPv4=1 IPv6=0 ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" HandleID="k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.399 [INFO][4839] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" HandleID="k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000636a10), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-qlx5f.gb1.brightbox.com", "pod":"calico-kube-controllers-7f77cfb5c7-t6k5v", "timestamp":"2025-09-13 01:18:58.398722416 +0000 UTC"}, Hostname:"srv-qlx5f.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.400 [INFO][4839] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.496 [INFO][4839] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.496 [INFO][4839] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-qlx5f.gb1.brightbox.com' Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.523 [INFO][4839] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.594 [INFO][4839] ipam/ipam.go 394: Looking up existing affinities for host host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.622 [INFO][4839] ipam/ipam.go 511: Trying affinity for 192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.627 [INFO][4839] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.641 [INFO][4839] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.641 [INFO][4839] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.647 [INFO][4839] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.658 [INFO][4839] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.691 [INFO][4839] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.198/26] block=192.168.98.192/26 handle="k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.691 [INFO][4839] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.198/26] handle="k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.691 [INFO][4839] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:58.861920 containerd[1615]: 2025-09-13 01:18:58.691 [INFO][4839] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.198/26] IPv6=[] ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" HandleID="k8s-pod-network.2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:58.865651 containerd[1615]: 2025-09-13 01:18:58.708 [INFO][4798] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Namespace="calico-system" Pod="calico-kube-controllers-7f77cfb5c7-t6k5v" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0", GenerateName:"calico-kube-controllers-7f77cfb5c7-", Namespace:"calico-system", SelfLink:"", UID:"2777e08f-0ded-483c-bbf8-a0c22f090d0b", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f77cfb5c7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-7f77cfb5c7-t6k5v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie6d1875203a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:58.865651 containerd[1615]: 2025-09-13 01:18:58.709 [INFO][4798] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.198/32] ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Namespace="calico-system" Pod="calico-kube-controllers-7f77cfb5c7-t6k5v" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:58.865651 containerd[1615]: 2025-09-13 01:18:58.710 [INFO][4798] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie6d1875203a ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Namespace="calico-system" Pod="calico-kube-controllers-7f77cfb5c7-t6k5v" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:58.865651 containerd[1615]: 2025-09-13 01:18:58.731 [INFO][4798] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Namespace="calico-system" Pod="calico-kube-controllers-7f77cfb5c7-t6k5v" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:58.865651 containerd[1615]: 2025-09-13 01:18:58.753 [INFO][4798] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Namespace="calico-system" Pod="calico-kube-controllers-7f77cfb5c7-t6k5v" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0", GenerateName:"calico-kube-controllers-7f77cfb5c7-", Namespace:"calico-system", SelfLink:"", UID:"2777e08f-0ded-483c-bbf8-a0c22f090d0b", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f77cfb5c7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de", Pod:"calico-kube-controllers-7f77cfb5c7-t6k5v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.198/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie6d1875203a", MAC:"9e:b3:7a:08:2a:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:58.865651 containerd[1615]: 2025-09-13 01:18:58.830 [INFO][4798] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de" Namespace="calico-system" Pod="calico-kube-controllers-7f77cfb5c7-t6k5v" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:18:58.910641 containerd[1615]: time="2025-09-13T01:18:58.908582872Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:58.910641 containerd[1615]: time="2025-09-13T01:18:58.908688031Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:58.910641 containerd[1615]: time="2025-09-13T01:18:58.908707680Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:58.910641 containerd[1615]: time="2025-09-13T01:18:58.908866495Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:58.978024 systemd-networkd[1265]: cali29470fe9153: Link UP Sep 13 01:18:59.000315 systemd-networkd[1265]: cali29470fe9153: Gained carrier Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.171 [INFO][4785] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0 calico-apiserver-794c96ccd7- calico-apiserver 8b7ec64d-f2d5-435c-a825-32b5603eece4 933 0 2025-09-13 01:18:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:794c96ccd7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-qlx5f.gb1.brightbox.com calico-apiserver-794c96ccd7-d8jh4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali29470fe9153 [] [] }} ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-d8jh4" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.171 [INFO][4785] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-d8jh4" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.439 [INFO][4829] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" HandleID="k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" 
Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.439 [INFO][4829] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" HandleID="k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000616200), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-qlx5f.gb1.brightbox.com", "pod":"calico-apiserver-794c96ccd7-d8jh4", "timestamp":"2025-09-13 01:18:58.43962985 +0000 UTC"}, Hostname:"srv-qlx5f.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.440 [INFO][4829] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.694 [INFO][4829] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.694 [INFO][4829] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-qlx5f.gb1.brightbox.com' Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.823 [INFO][4829] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.861 [INFO][4829] ipam/ipam.go 394: Looking up existing affinities for host host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.879 [INFO][4829] ipam/ipam.go 511: Trying affinity for 192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.884 [INFO][4829] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.891 [INFO][4829] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.891 [INFO][4829] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.904 [INFO][4829] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.911 [INFO][4829] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.927 [INFO][4829] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.199/26] block=192.168.98.192/26 handle="k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.936 [INFO][4829] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.199/26] handle="k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.936 [INFO][4829] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:18:59.052409 containerd[1615]: 2025-09-13 01:18:58.936 [INFO][4829] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.199/26] IPv6=[] ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" HandleID="k8s-pod-network.2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:59.053856 containerd[1615]: 2025-09-13 01:18:58.946 [INFO][4785] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-d8jh4" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0", GenerateName:"calico-apiserver-794c96ccd7-", Namespace:"calico-apiserver", SelfLink:"", UID:"8b7ec64d-f2d5-435c-a825-32b5603eece4", ResourceVersion:"933", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"794c96ccd7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-794c96ccd7-d8jh4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali29470fe9153", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:59.053856 containerd[1615]: 2025-09-13 01:18:58.946 [INFO][4785] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.199/32] ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-d8jh4" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:59.053856 containerd[1615]: 2025-09-13 01:18:58.946 [INFO][4785] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali29470fe9153 ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-d8jh4" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:59.053856 containerd[1615]: 2025-09-13 01:18:59.016 [INFO][4785] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Namespace="calico-apiserver" 
Pod="calico-apiserver-794c96ccd7-d8jh4" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:59.053856 containerd[1615]: 2025-09-13 01:18:59.019 [INFO][4785] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-d8jh4" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0", GenerateName:"calico-apiserver-794c96ccd7-", Namespace:"calico-apiserver", SelfLink:"", UID:"8b7ec64d-f2d5-435c-a825-32b5603eece4", ResourceVersion:"933", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"794c96ccd7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a", Pod:"calico-apiserver-794c96ccd7-d8jh4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali29470fe9153", MAC:"2a:5c:74:cb:94:df", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:18:59.053856 containerd[1615]: 2025-09-13 01:18:59.040 [INFO][4785] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a" Namespace="calico-apiserver" Pod="calico-apiserver-794c96ccd7-d8jh4" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:18:59.097013 containerd[1615]: time="2025-09-13T01:18:59.094197696Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:59.097013 containerd[1615]: time="2025-09-13T01:18:59.094346659Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:59.097013 containerd[1615]: time="2025-09-13T01:18:59.094371068Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:59.097013 containerd[1615]: time="2025-09-13T01:18:59.095238802Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:59.154035 containerd[1615]: time="2025-09-13T01:18:59.151359604Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:18:59.154035 containerd[1615]: time="2025-09-13T01:18:59.151433648Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:18:59.154035 containerd[1615]: time="2025-09-13T01:18:59.151492809Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:59.159356 containerd[1615]: time="2025-09-13T01:18:59.159040472Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:18:59.340034 containerd[1615]: time="2025-09-13T01:18:59.338768446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f77cfb5c7-t6k5v,Uid:2777e08f-0ded-483c-bbf8-a0c22f090d0b,Namespace:calico-system,Attempt:1,} returns sandbox id \"2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de\"" Sep 13 01:18:59.351931 containerd[1615]: time="2025-09-13T01:18:59.351557583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-9kgmq,Uid:21caebb3-60d2-40b7-849a-7ed3e6b8a990,Namespace:calico-system,Attempt:1,} returns sandbox id \"b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3\"" Sep 13 01:18:59.361902 containerd[1615]: time="2025-09-13T01:18:59.361830521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-794c96ccd7-9sdrw,Uid:9ed5b42c-f6b8-4b84-aec8-d89c53dde12b,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f\"" Sep 13 01:18:59.379559 containerd[1615]: time="2025-09-13T01:18:59.379493788Z" level=info msg="StartContainer for \"5c939778f74ac79edf768b897692d5958ad8f999b91a7c86e280c63d9529780b\" returns successfully" Sep 13 01:18:59.409174 containerd[1615]: time="2025-09-13T01:18:59.408280882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-794c96ccd7-d8jh4,Uid:8b7ec64d-f2d5-435c-a825-32b5603eece4,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a\"" Sep 13 01:18:59.597081 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:18:59.595771 systemd-resolved[1511]: Under memory pressure, flushing caches. 
Sep 13 01:18:59.595859 systemd-resolved[1511]: Flushed all caches. Sep 13 01:19:00.039091 systemd-networkd[1265]: cali7890faf4299: Gained IPv6LL Sep 13 01:19:00.422531 systemd-networkd[1265]: cali2b4a2839923: Gained IPv6LL Sep 13 01:19:00.778848 containerd[1615]: time="2025-09-13T01:19:00.778611162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:00.780317 containerd[1615]: time="2025-09-13T01:19:00.780262240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 13 01:19:00.781037 containerd[1615]: time="2025-09-13T01:19:00.780821618Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:00.783923 containerd[1615]: time="2025-09-13T01:19:00.783890352Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:00.785283 containerd[1615]: time="2025-09-13T01:19:00.785052306Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.281365867s" Sep 13 01:19:00.785283 containerd[1615]: time="2025-09-13T01:19:00.785123077Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 13 01:19:00.787372 containerd[1615]: time="2025-09-13T01:19:00.787294789Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 13 01:19:00.790544 containerd[1615]: time="2025-09-13T01:19:00.790230281Z" level=info msg="CreateContainer within sandbox \"f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 13 01:19:00.808190 systemd-networkd[1265]: calie6d1875203a: Gained IPv6LL Sep 13 01:19:00.814685 containerd[1615]: time="2025-09-13T01:19:00.814571713Z" level=info msg="CreateContainer within sandbox \"f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e1fc97e40d4739c69954577fc557ade93e2075e7fe676fc98b966eb4cdd30d98\"" Sep 13 01:19:00.815669 containerd[1615]: time="2025-09-13T01:19:00.815638957Z" level=info msg="StartContainer for \"e1fc97e40d4739c69954577fc557ade93e2075e7fe676fc98b966eb4cdd30d98\"" Sep 13 01:19:00.880404 systemd[1]: run-containerd-runc-k8s.io-e1fc97e40d4739c69954577fc557ade93e2075e7fe676fc98b966eb4cdd30d98-runc.kajDVW.mount: Deactivated successfully. 
Sep 13 01:19:00.935205 systemd-networkd[1265]: cali29470fe9153: Gained IPv6LL Sep 13 01:19:00.940413 containerd[1615]: time="2025-09-13T01:19:00.940299958Z" level=info msg="StartContainer for \"e1fc97e40d4739c69954577fc557ade93e2075e7fe676fc98b966eb4cdd30d98\" returns successfully" Sep 13 01:19:05.629626 containerd[1615]: time="2025-09-13T01:19:05.629558748Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:05.634012 containerd[1615]: time="2025-09-13T01:19:05.632223267Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 13 01:19:05.634012 containerd[1615]: time="2025-09-13T01:19:05.633558928Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:05.638301 containerd[1615]: time="2025-09-13T01:19:05.638008121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:05.640291 containerd[1615]: time="2025-09-13T01:19:05.640245130Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.852648654s" Sep 13 01:19:05.640435 containerd[1615]: time="2025-09-13T01:19:05.640296125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" 
Sep 13 01:19:05.653803 containerd[1615]: time="2025-09-13T01:19:05.653749337Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 13 01:19:05.691224 containerd[1615]: time="2025-09-13T01:19:05.689604578Z" level=info msg="CreateContainer within sandbox \"2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 13 01:19:05.725313 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2772290062.mount: Deactivated successfully. Sep 13 01:19:05.737615 containerd[1615]: time="2025-09-13T01:19:05.737508265Z" level=info msg="CreateContainer within sandbox \"2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ee1d983b0fe8af28d38a552cd04df7689abaa4f1021eb36ea26592b8132e9881\"" Sep 13 01:19:05.739000 containerd[1615]: time="2025-09-13T01:19:05.738707584Z" level=info msg="StartContainer for \"ee1d983b0fe8af28d38a552cd04df7689abaa4f1021eb36ea26592b8132e9881\"" Sep 13 01:19:05.921835 containerd[1615]: time="2025-09-13T01:19:05.921559659Z" level=info msg="StartContainer for \"ee1d983b0fe8af28d38a552cd04df7689abaa4f1021eb36ea26592b8132e9881\" returns successfully" Sep 13 01:19:05.970167 kubelet[2861]: I0913 01:19:05.969666 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7f77cfb5c7-t6k5v" podStartSLOduration=31.660605703999998 podStartE2EDuration="37.969632068s" podCreationTimestamp="2025-09-13 01:18:28 +0000 UTC" firstStartedPulling="2025-09-13 01:18:59.343124927 +0000 UTC m=+52.286882562" lastFinishedPulling="2025-09-13 01:19:05.652151283 +0000 UTC m=+58.595908926" observedRunningTime="2025-09-13 01:19:05.967720251 +0000 UTC m=+58.911477900" watchObservedRunningTime="2025-09-13 01:19:05.969632068 +0000 UTC m=+58.913389752" Sep 13 01:19:07.489166 containerd[1615]: 
time="2025-09-13T01:19:07.485866142Z" level=info msg="StopPodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\"" Sep 13 01:19:07.593572 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:19:07.597046 systemd-resolved[1511]: Under memory pressure, flushing caches. Sep 13 01:19:07.597087 systemd-resolved[1511]: Flushed all caches. Sep 13 01:19:07.647397 containerd[1615]: time="2025-09-13T01:19:07.647350880Z" level=info msg="StopPodSandbox for \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\"" Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:07.870 [WARNING][5270] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0", GenerateName:"calico-kube-controllers-7f77cfb5c7-", Namespace:"calico-system", SelfLink:"", UID:"2777e08f-0ded-483c-bbf8-a0c22f090d0b", ResourceVersion:"999", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f77cfb5c7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", 
ContainerID:"2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de", Pod:"calico-kube-controllers-7f77cfb5c7-t6k5v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie6d1875203a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:07.873 [INFO][5270] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:07.873 [INFO][5270] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" iface="eth0" netns="" Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:07.873 [INFO][5270] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:07.873 [INFO][5270] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:08.052 [INFO][5279] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:08.053 [INFO][5279] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:08.053 [INFO][5279] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:08.078 [WARNING][5279] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:08.078 [INFO][5279] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:08.091 [INFO][5279] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:08.175346 containerd[1615]: 2025-09-13 01:19:08.115 [INFO][5270] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:19:08.182174 containerd[1615]: time="2025-09-13T01:19:08.178079038Z" level=info msg="TearDown network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\" successfully" Sep 13 01:19:08.182174 containerd[1615]: time="2025-09-13T01:19:08.178133340Z" level=info msg="StopPodSandbox for \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\" returns successfully" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:07.905 [INFO][5262] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:07.906 [INFO][5262] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" iface="eth0" netns="/var/run/netns/cni-17ee4589-a474-6178-d515-87d88ebabd18" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:07.908 [INFO][5262] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" iface="eth0" netns="/var/run/netns/cni-17ee4589-a474-6178-d515-87d88ebabd18" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:07.911 [INFO][5262] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" iface="eth0" netns="/var/run/netns/cni-17ee4589-a474-6178-d515-87d88ebabd18" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:07.912 [INFO][5262] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:07.913 [INFO][5262] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:08.223 [INFO][5284] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:08.225 [INFO][5284] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:08.225 [INFO][5284] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:08.269 [WARNING][5284] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:08.269 [INFO][5284] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:08.285 [INFO][5284] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:08.308325 containerd[1615]: 2025-09-13 01:19:08.302 [INFO][5262] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Sep 13 01:19:08.316138 systemd[1]: run-netns-cni\x2d17ee4589\x2da474\x2d6178\x2dd515\x2d87d88ebabd18.mount: Deactivated successfully. 
Sep 13 01:19:08.321140 containerd[1615]: time="2025-09-13T01:19:08.321090953Z" level=info msg="TearDown network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\" successfully" Sep 13 01:19:08.321279 containerd[1615]: time="2025-09-13T01:19:08.321255135Z" level=info msg="StopPodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\" returns successfully" Sep 13 01:19:08.338541 containerd[1615]: time="2025-09-13T01:19:08.338491271Z" level=info msg="RemovePodSandbox for \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\"" Sep 13 01:19:08.345360 containerd[1615]: time="2025-09-13T01:19:08.345315154Z" level=info msg="Forcibly stopping sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\"" Sep 13 01:19:08.350853 containerd[1615]: time="2025-09-13T01:19:08.350780279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8m5j5,Uid:b466df6e-e38d-402c-86b5-4227a521ff8c,Namespace:kube-system,Attempt:1,}" Sep 13 01:19:08.977128 systemd-networkd[1265]: cali331404c00dd: Link UP Sep 13 01:19:08.977510 systemd-networkd[1265]: cali331404c00dd: Gained carrier Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.613 [WARNING][5310] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0", GenerateName:"calico-kube-controllers-7f77cfb5c7-", Namespace:"calico-system", SelfLink:"", UID:"2777e08f-0ded-483c-bbf8-a0c22f090d0b", ResourceVersion:"999", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f77cfb5c7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"2b1e00bac2f6f3ea516dc06f23d44a573f3f04015303f32cb705afddadee15de", Pod:"calico-kube-controllers-7f77cfb5c7-t6k5v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie6d1875203a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.613 [INFO][5310] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.613 [INFO][5310] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" iface="eth0" netns="" Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.613 [INFO][5310] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.613 [INFO][5310] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.863 [INFO][5321] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.863 [INFO][5321] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.923 [INFO][5321] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.959 [WARNING][5321] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.981 [INFO][5321] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" HandleID="k8s-pod-network.879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--kube--controllers--7f77cfb5c7--t6k5v-eth0" Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:08.998 [INFO][5321] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:09.046137 containerd[1615]: 2025-09-13 01:19:09.006 [INFO][5310] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914" Sep 13 01:19:09.050724 containerd[1615]: time="2025-09-13T01:19:09.050530127Z" level=info msg="TearDown network for sandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\" successfully" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.574 [INFO][5299] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0 coredns-7c65d6cfc9- kube-system b466df6e-e38d-402c-86b5-4227a521ff8c 1003 0 2025-09-13 01:18:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-qlx5f.gb1.brightbox.com coredns-7c65d6cfc9-8m5j5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali331404c00dd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8m5j5" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.575 [INFO][5299] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8m5j5" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.779 [INFO][5319] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" HandleID="k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.779 [INFO][5319] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" HandleID="k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032f240), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-qlx5f.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-8m5j5", "timestamp":"2025-09-13 01:19:08.778966462 +0000 UTC"}, Hostname:"srv-qlx5f.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.780 [INFO][5319] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.780 [INFO][5319] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.780 [INFO][5319] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-qlx5f.gb1.brightbox.com' Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.814 [INFO][5319] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.851 [INFO][5319] ipam/ipam.go 394: Looking up existing affinities for host host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.864 [INFO][5319] ipam/ipam.go 511: Trying affinity for 192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.872 [INFO][5319] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.876 [INFO][5319] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.876 [INFO][5319] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.891 [INFO][5319] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778 Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.903 [INFO][5319] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.98.192/26 
handle="k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.922 [INFO][5319] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.98.200/26] block=192.168.98.192/26 handle="k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.922 [INFO][5319] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.200/26] handle="k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" host="srv-qlx5f.gb1.brightbox.com" Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.922 [INFO][5319] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:09.068640 containerd[1615]: 2025-09-13 01:19:08.923 [INFO][5319] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.200/26] IPv6=[] ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" HandleID="k8s-pod-network.b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:09.071564 containerd[1615]: 2025-09-13 01:19:08.936 [INFO][5299] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8m5j5" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b466df6e-e38d-402c-86b5-4227a521ff8c", ResourceVersion:"1003", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 
1, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-8m5j5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali331404c00dd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:09.071564 containerd[1615]: 2025-09-13 01:19:08.938 [INFO][5299] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.200/32] ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8m5j5" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:09.071564 containerd[1615]: 2025-09-13 01:19:08.938 [INFO][5299] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali331404c00dd ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" 
Namespace="kube-system" Pod="coredns-7c65d6cfc9-8m5j5" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:09.071564 containerd[1615]: 2025-09-13 01:19:08.974 [INFO][5299] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8m5j5" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:09.071564 containerd[1615]: 2025-09-13 01:19:08.980 [INFO][5299] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8m5j5" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b466df6e-e38d-402c-86b5-4227a521ff8c", ResourceVersion:"1003", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778", Pod:"coredns-7c65d6cfc9-8m5j5", Endpoint:"eth0", 
ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali331404c00dd", MAC:"0a:07:27:5f:17:03", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:09.071564 containerd[1615]: 2025-09-13 01:19:09.033 [INFO][5299] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8m5j5" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0" Sep 13 01:19:09.178952 containerd[1615]: time="2025-09-13T01:19:09.169047284Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 01:19:09.178952 containerd[1615]: time="2025-09-13T01:19:09.169154696Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 01:19:09.178952 containerd[1615]: time="2025-09-13T01:19:09.169183693Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:19:09.178952 containerd[1615]: time="2025-09-13T01:19:09.169434068Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 01:19:09.186350 containerd[1615]: time="2025-09-13T01:19:09.186162027Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 13 01:19:09.197493 containerd[1615]: time="2025-09-13T01:19:09.197430097Z" level=info msg="RemovePodSandbox \"879bc40636212eb62f44e0606c46ab4974312ebbf6324c6ef76b9fcd7530f914\" returns successfully" Sep 13 01:19:09.221017 containerd[1615]: time="2025-09-13T01:19:09.220168581Z" level=info msg="StopPodSandbox for \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\"" Sep 13 01:19:09.492971 containerd[1615]: time="2025-09-13T01:19:09.492924109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8m5j5,Uid:b466df6e-e38d-402c-86b5-4227a521ff8c,Namespace:kube-system,Attempt:1,} returns sandbox id \"b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778\"" Sep 13 01:19:09.516437 containerd[1615]: time="2025-09-13T01:19:09.516390907Z" level=info msg="CreateContainer within sandbox \"b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 01:19:09.613711 containerd[1615]: time="2025-09-13T01:19:09.613651188Z" level=info msg="CreateContainer within sandbox \"b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4abb94e98e73a38286ba197d8f85726097e7796624b4201c76545cbf20d0445b\"" Sep 13 01:19:09.619248 containerd[1615]: time="2025-09-13T01:19:09.618765456Z" level=info msg="StartContainer for \"4abb94e98e73a38286ba197d8f85726097e7796624b4201c76545cbf20d0445b\"" Sep 13 01:19:09.636920 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount491956814.mount: Deactivated 
successfully. Sep 13 01:19:09.657685 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:19:09.639303 systemd-resolved[1511]: Under memory pressure, flushing caches. Sep 13 01:19:09.639312 systemd-resolved[1511]: Flushed all caches. Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.520 [WARNING][5386] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"9b023c1a-2436-46bd-a84b-0a772635011c", ResourceVersion:"938", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0", Pod:"coredns-7c65d6cfc9-sh5jf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliecaa0ce9c23", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.543 [INFO][5386] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.543 [INFO][5386] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" iface="eth0" netns="" Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.543 [INFO][5386] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.543 [INFO][5386] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.635 [INFO][5406] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.636 [INFO][5406] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.651 [INFO][5406] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.694 [WARNING][5406] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.696 [INFO][5406] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.704 [INFO][5406] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:09.718144 containerd[1615]: 2025-09-13 01:19:09.709 [INFO][5386] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:19:09.724533 containerd[1615]: time="2025-09-13T01:19:09.718666607Z" level=info msg="TearDown network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\" successfully" Sep 13 01:19:09.724533 containerd[1615]: time="2025-09-13T01:19:09.719935185Z" level=info msg="StopPodSandbox for \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\" returns successfully" Sep 13 01:19:09.739048 containerd[1615]: time="2025-09-13T01:19:09.738401787Z" level=info msg="RemovePodSandbox for \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\"" Sep 13 01:19:09.739048 containerd[1615]: time="2025-09-13T01:19:09.738453747Z" level=info msg="Forcibly stopping sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\"" Sep 13 01:19:10.012007 containerd[1615]: time="2025-09-13T01:19:10.011829927Z" level=info msg="StartContainer for \"4abb94e98e73a38286ba197d8f85726097e7796624b4201c76545cbf20d0445b\" returns successfully" Sep 13 01:19:10.022344 systemd-networkd[1265]: cali331404c00dd: Gained IPv6LL Sep 13 01:19:10.129213 kubelet[2861]: I0913 01:19:10.128580 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-8m5j5" podStartSLOduration=57.128192695 podStartE2EDuration="57.128192695s" podCreationTimestamp="2025-09-13 01:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 01:19:10.10138196 +0000 UTC m=+63.045139627" watchObservedRunningTime="2025-09-13 01:19:10.128192695 +0000 UTC m=+63.071950345" Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.154 [WARNING][5434] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"9b023c1a-2436-46bd-a84b-0a772635011c", ResourceVersion:"938", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"cd9032fc5df04b7435f22c9b3c04320bd2f3530bd2542b8120d5bf98477f48f0", Pod:"coredns-7c65d6cfc9-sh5jf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliecaa0ce9c23", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:10.420973 containerd[1615]: 
2025-09-13 01:19:10.156 [INFO][5434] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.156 [INFO][5434] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" iface="eth0" netns="" Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.156 [INFO][5434] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.156 [INFO][5434] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.378 [INFO][5463] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.381 [INFO][5463] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.381 [INFO][5463] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.401 [WARNING][5463] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.401 [INFO][5463] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" HandleID="k8s-pod-network.a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--sh5jf-eth0" Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.406 [INFO][5463] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:10.420973 containerd[1615]: 2025-09-13 01:19:10.414 [INFO][5434] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3" Sep 13 01:19:10.420973 containerd[1615]: time="2025-09-13T01:19:10.420852057Z" level=info msg="TearDown network for sandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\" successfully" Sep 13 01:19:10.431817 containerd[1615]: time="2025-09-13T01:19:10.431327262Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:19:10.432022 containerd[1615]: time="2025-09-13T01:19:10.431995064Z" level=info msg="RemovePodSandbox \"a212a585937f1ea98e8ee640ea00a29bddca45b4cc53c485290cf6a0467807c3\" returns successfully" Sep 13 01:19:10.433788 containerd[1615]: time="2025-09-13T01:19:10.433760012Z" level=info msg="StopPodSandbox for \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\"" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.638 [WARNING][5480] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.638 [INFO][5480] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.639 [INFO][5480] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" iface="eth0" netns="" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.639 [INFO][5480] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.639 [INFO][5480] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.803 [INFO][5487] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.804 [INFO][5487] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.805 [INFO][5487] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.816 [WARNING][5487] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.816 [INFO][5487] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.819 [INFO][5487] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:10.844435 containerd[1615]: 2025-09-13 01:19:10.836 [INFO][5480] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:19:10.844435 containerd[1615]: time="2025-09-13T01:19:10.844171600Z" level=info msg="TearDown network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\" successfully" Sep 13 01:19:10.844435 containerd[1615]: time="2025-09-13T01:19:10.844207503Z" level=info msg="StopPodSandbox for \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\" returns successfully" Sep 13 01:19:10.846421 containerd[1615]: time="2025-09-13T01:19:10.845283165Z" level=info msg="RemovePodSandbox for \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\"" Sep 13 01:19:10.846421 containerd[1615]: time="2025-09-13T01:19:10.845331201Z" level=info msg="Forcibly stopping sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\"" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.152 [WARNING][5501] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" WorkloadEndpoint="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.162 [INFO][5501] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.162 [INFO][5501] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" iface="eth0" netns="" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.162 [INFO][5501] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.162 [INFO][5501] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.409 [INFO][5508] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.409 [INFO][5508] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.409 [INFO][5508] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.425 [WARNING][5508] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.425 [INFO][5508] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" HandleID="k8s-pod-network.134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Workload="srv--qlx5f.gb1.brightbox.com-k8s-whisker--84f64fc779--xslb8-eth0" Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.427 [INFO][5508] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:11.464015 containerd[1615]: 2025-09-13 01:19:11.436 [INFO][5501] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a" Sep 13 01:19:11.466174 containerd[1615]: time="2025-09-13T01:19:11.465277652Z" level=info msg="TearDown network for sandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\" successfully" Sep 13 01:19:11.500061 containerd[1615]: time="2025-09-13T01:19:11.497488133Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:19:11.500061 containerd[1615]: time="2025-09-13T01:19:11.497876345Z" level=info msg="RemovePodSandbox \"134af4953303474cae3b769a2b0105b61c517667109c058e9a9f83c38907084a\" returns successfully" Sep 13 01:19:11.505533 containerd[1615]: time="2025-09-13T01:19:11.505499461Z" level=info msg="StopPodSandbox for \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\"" Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.678 [WARNING][5544] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3a06b43f-3090-4270-8011-d28f2c555ca3", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202", Pod:"csi-node-driver-4zbwf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib437f1c4d48", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.678 [INFO][5544] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.678 [INFO][5544] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" iface="eth0" netns="" Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.678 [INFO][5544] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.678 [INFO][5544] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.849 [INFO][5552] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.851 [INFO][5552] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.852 [INFO][5552] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.874 [WARNING][5552] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.875 [INFO][5552] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.878 [INFO][5552] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:11.894834 containerd[1615]: 2025-09-13 01:19:11.889 [INFO][5544] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:19:11.894834 containerd[1615]: time="2025-09-13T01:19:11.893485142Z" level=info msg="TearDown network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\" successfully" Sep 13 01:19:11.894834 containerd[1615]: time="2025-09-13T01:19:11.893520132Z" level=info msg="StopPodSandbox for \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\" returns successfully" Sep 13 01:19:11.906014 containerd[1615]: time="2025-09-13T01:19:11.895072857Z" level=info msg="RemovePodSandbox for \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\"" Sep 13 01:19:11.906014 containerd[1615]: time="2025-09-13T01:19:11.895116804Z" level=info msg="Forcibly stopping sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\"" Sep 13 01:19:12.110219 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3275671591.mount: Deactivated successfully. 
Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.091 [WARNING][5566] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3a06b43f-3090-4270-8011-d28f2c555ca3", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202", Pod:"csi-node-driver-4zbwf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib437f1c4d48", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.091 [INFO][5566] cni-plugin/k8s.go 640: Cleaning up netns 
ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.091 [INFO][5566] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" iface="eth0" netns="" Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.091 [INFO][5566] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.091 [INFO][5566] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.271 [INFO][5573] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.273 [INFO][5573] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.273 [INFO][5573] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.287 [WARNING][5573] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.287 [INFO][5573] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" HandleID="k8s-pod-network.ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Workload="srv--qlx5f.gb1.brightbox.com-k8s-csi--node--driver--4zbwf-eth0" Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.291 [INFO][5573] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:12.306089 containerd[1615]: 2025-09-13 01:19:12.299 [INFO][5566] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235" Sep 13 01:19:12.306089 containerd[1615]: time="2025-09-13T01:19:12.302812406Z" level=info msg="TearDown network for sandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\" successfully" Sep 13 01:19:12.315871 containerd[1615]: time="2025-09-13T01:19:12.314732224Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:19:12.315871 containerd[1615]: time="2025-09-13T01:19:12.314809068Z" level=info msg="RemovePodSandbox \"ca8a9d62a73caf8edcae4e41d6fddec03af33a64292c58a38b7c51625c109235\" returns successfully" Sep 13 01:19:12.317903 containerd[1615]: time="2025-09-13T01:19:12.317524169Z" level=info msg="StopPodSandbox for \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\"" Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.512 [WARNING][5592] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0", GenerateName:"calico-apiserver-794c96ccd7-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ed5b42c-f6b8-4b84-aec8-d89c53dde12b", ResourceVersion:"950", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"794c96ccd7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f", Pod:"calico-apiserver-794c96ccd7-9sdrw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2b4a2839923", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.512 [INFO][5592] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.512 [INFO][5592] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" iface="eth0" netns="" Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.512 [INFO][5592] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.512 [INFO][5592] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.667 [INFO][5599] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.670 [INFO][5599] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.671 [INFO][5599] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.708 [WARNING][5599] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.708 [INFO][5599] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.719 [INFO][5599] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:12.728442 containerd[1615]: 2025-09-13 01:19:12.725 [INFO][5592] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:19:12.731860 containerd[1615]: time="2025-09-13T01:19:12.729627185Z" level=info msg="TearDown network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\" successfully" Sep 13 01:19:12.731860 containerd[1615]: time="2025-09-13T01:19:12.729661368Z" level=info msg="StopPodSandbox for \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\" returns successfully" Sep 13 01:19:12.737596 containerd[1615]: time="2025-09-13T01:19:12.734926654Z" level=info msg="RemovePodSandbox for \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\"" Sep 13 01:19:12.737596 containerd[1615]: time="2025-09-13T01:19:12.735529097Z" level=info msg="Forcibly stopping sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\"" Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:12.981 [WARNING][5613] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0", GenerateName:"calico-apiserver-794c96ccd7-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ed5b42c-f6b8-4b84-aec8-d89c53dde12b", ResourceVersion:"950", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"794c96ccd7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f", Pod:"calico-apiserver-794c96ccd7-9sdrw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2b4a2839923", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:12.982 [INFO][5613] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:12.982 [INFO][5613] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" iface="eth0" netns="" Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:12.982 [INFO][5613] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:12.982 [INFO][5613] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:13.051 [INFO][5621] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:13.053 [INFO][5621] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:13.053 [INFO][5621] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:13.071 [WARNING][5621] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:13.071 [INFO][5621] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" HandleID="k8s-pod-network.99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--9sdrw-eth0" Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:13.076 [INFO][5621] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:13.098441 containerd[1615]: 2025-09-13 01:19:13.085 [INFO][5613] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524" Sep 13 01:19:13.111003 containerd[1615]: time="2025-09-13T01:19:13.107494906Z" level=info msg="TearDown network for sandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\" successfully" Sep 13 01:19:13.202000 containerd[1615]: time="2025-09-13T01:19:13.200554986Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:19:13.202000 containerd[1615]: time="2025-09-13T01:19:13.200644856Z" level=info msg="RemovePodSandbox \"99ee8a30dc008271878ab7ce72006fa0ed63c81ef036183c272025fec10c8524\" returns successfully" Sep 13 01:19:13.203928 containerd[1615]: time="2025-09-13T01:19:13.203893628Z" level=info msg="StopPodSandbox for \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\"" Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.447 [WARNING][5636] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0", GenerateName:"calico-apiserver-794c96ccd7-", Namespace:"calico-apiserver", SelfLink:"", UID:"8b7ec64d-f2d5-435c-a825-32b5603eece4", ResourceVersion:"961", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"794c96ccd7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a", Pod:"calico-apiserver-794c96ccd7-d8jh4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali29470fe9153", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.448 [INFO][5636] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.448 [INFO][5636] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" iface="eth0" netns="" Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.448 [INFO][5636] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.448 [INFO][5636] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.522 [INFO][5644] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.524 [INFO][5644] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.526 [INFO][5644] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.538 [WARNING][5644] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.538 [INFO][5644] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.545 [INFO][5644] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:13.570006 containerd[1615]: 2025-09-13 01:19:13.560 [INFO][5636] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:19:13.570006 containerd[1615]: time="2025-09-13T01:19:13.566256885Z" level=info msg="TearDown network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\" successfully" Sep 13 01:19:13.570006 containerd[1615]: time="2025-09-13T01:19:13.567069148Z" level=info msg="StopPodSandbox for \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\" returns successfully" Sep 13 01:19:13.579062 containerd[1615]: time="2025-09-13T01:19:13.573791040Z" level=info msg="RemovePodSandbox for \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\"" Sep 13 01:19:13.579062 containerd[1615]: time="2025-09-13T01:19:13.573829984Z" level=info msg="Forcibly stopping sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\"" Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.707 [WARNING][5659] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0", GenerateName:"calico-apiserver-794c96ccd7-", Namespace:"calico-apiserver", SelfLink:"", UID:"8b7ec64d-f2d5-435c-a825-32b5603eece4", ResourceVersion:"961", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"794c96ccd7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a", Pod:"calico-apiserver-794c96ccd7-d8jh4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali29470fe9153", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.712 [INFO][5659] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.713 [INFO][5659] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" iface="eth0" netns="" Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.713 [INFO][5659] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.713 [INFO][5659] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.802 [INFO][5666] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.802 [INFO][5666] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.803 [INFO][5666] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.822 [WARNING][5666] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.822 [INFO][5666] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" HandleID="k8s-pod-network.b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Workload="srv--qlx5f.gb1.brightbox.com-k8s-calico--apiserver--794c96ccd7--d8jh4-eth0" Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.831 [INFO][5666] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:13.843561 containerd[1615]: 2025-09-13 01:19:13.838 [INFO][5659] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3" Sep 13 01:19:13.843561 containerd[1615]: time="2025-09-13T01:19:13.843448582Z" level=info msg="TearDown network for sandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\" successfully" Sep 13 01:19:13.850024 containerd[1615]: time="2025-09-13T01:19:13.849682506Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:19:13.850024 containerd[1615]: time="2025-09-13T01:19:13.849753476Z" level=info msg="RemovePodSandbox \"b67aa27d69833f7bd7be9db8582f7e7c48ee9458ab2f44f67591c0da011575b3\" returns successfully" Sep 13 01:19:13.852635 containerd[1615]: time="2025-09-13T01:19:13.852520161Z" level=info msg="StopPodSandbox for \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\"" Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:13.976 [WARNING][5680] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"21caebb3-60d2-40b7-849a-7ed3e6b8a990", ResourceVersion:"955", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3", Pod:"goldmane-7988f88666-9kgmq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali7890faf4299", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:13.976 [INFO][5680] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:13.976 [INFO][5680] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" iface="eth0" netns="" Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:13.976 [INFO][5680] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:13.976 [INFO][5680] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:14.090 [INFO][5687] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:14.090 [INFO][5687] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:14.090 [INFO][5687] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:14.102 [WARNING][5687] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:14.102 [INFO][5687] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:14.105 [INFO][5687] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:14.111429 containerd[1615]: 2025-09-13 01:19:14.108 [INFO][5680] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:19:14.111429 containerd[1615]: time="2025-09-13T01:19:14.111233845Z" level=info msg="TearDown network for sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\" successfully" Sep 13 01:19:14.111429 containerd[1615]: time="2025-09-13T01:19:14.111267315Z" level=info msg="StopPodSandbox for \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\" returns successfully" Sep 13 01:19:14.113378 containerd[1615]: time="2025-09-13T01:19:14.113337752Z" level=info msg="RemovePodSandbox for \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\"" Sep 13 01:19:14.113449 containerd[1615]: time="2025-09-13T01:19:14.113416217Z" level=info msg="Forcibly stopping sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\"" Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.226 [WARNING][5701] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"21caebb3-60d2-40b7-849a-7ed3e6b8a990", ResourceVersion:"955", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3", Pod:"goldmane-7988f88666-9kgmq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7890faf4299", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.226 [INFO][5701] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.226 [INFO][5701] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" iface="eth0" netns="" Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.227 [INFO][5701] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.227 [INFO][5701] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.407 [INFO][5708] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.408 [INFO][5708] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.408 [INFO][5708] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.439 [WARNING][5708] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.439 [INFO][5708] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" HandleID="k8s-pod-network.db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Workload="srv--qlx5f.gb1.brightbox.com-k8s-goldmane--7988f88666--9kgmq-eth0" Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.454 [INFO][5708] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 01:19:14.469770 containerd[1615]: 2025-09-13 01:19:14.462 [INFO][5701] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589" Sep 13 01:19:14.469770 containerd[1615]: time="2025-09-13T01:19:14.469067476Z" level=info msg="TearDown network for sandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\" successfully" Sep 13 01:19:14.479929 containerd[1615]: time="2025-09-13T01:19:14.479847779Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 13 01:19:14.480323 containerd[1615]: time="2025-09-13T01:19:14.480268340Z" level=info msg="RemovePodSandbox \"db43e34c6f518b80fc951c824ef75acda9968614ad0b0a0b7839b0e4bcc3b589\" returns successfully" Sep 13 01:19:14.665335 containerd[1615]: time="2025-09-13T01:19:14.665228489Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 13 01:19:14.707004 containerd[1615]: time="2025-09-13T01:19:14.705239306Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:14.725167 containerd[1615]: time="2025-09-13T01:19:14.721493601Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:14.725167 containerd[1615]: time="2025-09-13T01:19:14.722093874Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 9.067671343s" Sep 13 01:19:14.725167 containerd[1615]: time="2025-09-13T01:19:14.722167966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 13 01:19:14.727694 containerd[1615]: time="2025-09-13T01:19:14.727654216Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:14.750940 containerd[1615]: time="2025-09-13T01:19:14.750879799Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 01:19:14.758701 containerd[1615]: time="2025-09-13T01:19:14.758417897Z" level=info msg="CreateContainer within sandbox \"b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 13 01:19:14.796883 containerd[1615]: time="2025-09-13T01:19:14.796711303Z" level=info msg="CreateContainer within sandbox \"b1d23758ca10a37e7cbb50c5850dffb62d7c119002ad01b9ce753774b15847a3\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"5bdd921bb1f101143ee18e00b12fae4ed83ebfc9de1ef6f34bffaae54593fac5\"" Sep 13 01:19:14.801499 containerd[1615]: time="2025-09-13T01:19:14.798135975Z" level=info msg="StartContainer for \"5bdd921bb1f101143ee18e00b12fae4ed83ebfc9de1ef6f34bffaae54593fac5\"" Sep 13 01:19:15.047097 containerd[1615]: time="2025-09-13T01:19:15.043872940Z" level=info msg="StartContainer for \"5bdd921bb1f101143ee18e00b12fae4ed83ebfc9de1ef6f34bffaae54593fac5\" returns successfully" Sep 13 01:19:15.384113 kubelet[2861]: I0913 01:19:15.383955 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-9kgmq" podStartSLOduration=32.995344051000004 podStartE2EDuration="48.359956546s" podCreationTimestamp="2025-09-13 01:18:27 +0000 UTC" firstStartedPulling="2025-09-13 01:18:59.366815973 +0000 UTC m=+52.310573609" lastFinishedPulling="2025-09-13 01:19:14.731428462 +0000 UTC m=+67.675186104" observedRunningTime="2025-09-13 01:19:15.359628671 +0000 UTC m=+68.303386319" watchObservedRunningTime="2025-09-13 01:19:15.359956546 +0000 UTC m=+68.303714184" Sep 13 01:19:15.785549 systemd[1]: run-containerd-runc-k8s.io-5bdd921bb1f101143ee18e00b12fae4ed83ebfc9de1ef6f34bffaae54593fac5-runc.4EqBZZ.mount: Deactivated successfully. Sep 13 01:19:17.583871 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:19:17.574518 systemd-resolved[1511]: Under memory pressure, flushing caches. 
Sep 13 01:19:17.574582 systemd-resolved[1511]: Flushed all caches. Sep 13 01:19:19.635380 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:19:19.622330 systemd-resolved[1511]: Under memory pressure, flushing caches. Sep 13 01:19:19.622340 systemd-resolved[1511]: Flushed all caches. Sep 13 01:19:20.597759 containerd[1615]: time="2025-09-13T01:19:20.597531306Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:20.607659 containerd[1615]: time="2025-09-13T01:19:20.607583723Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 13 01:19:20.617124 containerd[1615]: time="2025-09-13T01:19:20.616961512Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:20.618796 containerd[1615]: time="2025-09-13T01:19:20.618687693Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 5.867735839s" Sep 13 01:19:20.619445 containerd[1615]: time="2025-09-13T01:19:20.618794047Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 13 01:19:20.623012 containerd[1615]: time="2025-09-13T01:19:20.622087999Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:20.703080 
containerd[1615]: time="2025-09-13T01:19:20.701864094Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 13 01:19:20.753212 containerd[1615]: time="2025-09-13T01:19:20.753139646Z" level=info msg="CreateContainer within sandbox \"0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 13 01:19:20.792414 containerd[1615]: time="2025-09-13T01:19:20.792352900Z" level=info msg="CreateContainer within sandbox \"0c1376ea00c8cfa46d73c969d346bab944dfcb17e28fe2af016bf1211f816e6f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"13874a3557ba6660d3a5e75d9d11b258ad658d9d3627c802ffd6b8a8e6cd0d9f\"" Sep 13 01:19:20.795021 containerd[1615]: time="2025-09-13T01:19:20.794868311Z" level=info msg="StartContainer for \"13874a3557ba6660d3a5e75d9d11b258ad658d9d3627c802ffd6b8a8e6cd0d9f\"" Sep 13 01:19:21.048911 containerd[1615]: time="2025-09-13T01:19:21.046968994Z" level=info msg="StartContainer for \"13874a3557ba6660d3a5e75d9d11b258ad658d9d3627c802ffd6b8a8e6cd0d9f\" returns successfully" Sep 13 01:19:21.679191 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:19:21.678146 systemd-resolved[1511]: Under memory pressure, flushing caches. Sep 13 01:19:21.678192 systemd-resolved[1511]: Flushed all caches. 
Sep 13 01:19:22.200082 kubelet[2861]: I0913 01:19:22.193225 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-794c96ccd7-9sdrw" podStartSLOduration=37.89944514 podStartE2EDuration="59.169196667s" podCreationTimestamp="2025-09-13 01:18:23 +0000 UTC" firstStartedPulling="2025-09-13 01:18:59.371882201 +0000 UTC m=+52.315639837" lastFinishedPulling="2025-09-13 01:19:20.641633709 +0000 UTC m=+73.585391364" observedRunningTime="2025-09-13 01:19:22.115841302 +0000 UTC m=+75.059598968" watchObservedRunningTime="2025-09-13 01:19:22.169196667 +0000 UTC m=+75.112954310" Sep 13 01:19:25.477738 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount347113113.mount: Deactivated successfully. Sep 13 01:19:25.554314 containerd[1615]: time="2025-09-13T01:19:25.554243770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:25.558794 containerd[1615]: time="2025-09-13T01:19:25.556289847Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 13 01:19:25.558794 containerd[1615]: time="2025-09-13T01:19:25.557690116Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:25.569453 containerd[1615]: time="2025-09-13T01:19:25.569207361Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 01:19:25.572178 containerd[1615]: time="2025-09-13T01:19:25.572132939Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 4.870147914s" Sep 13 01:19:25.573132 containerd[1615]: time="2025-09-13T01:19:25.572185165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 13 01:19:25.662682 systemd-journald[1181]: Under memory pressure, flushing caches. Sep 13 01:19:25.663100 containerd[1615]: time="2025-09-13T01:19:25.640173865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 01:19:25.663100 containerd[1615]: time="2025-09-13T01:19:25.649380531Z" level=info msg="CreateContainer within sandbox \"5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 13 01:19:25.649933 systemd-resolved[1511]: Under memory pressure, flushing caches. Sep 13 01:19:25.649971 systemd-resolved[1511]: Flushed all caches. 
Sep 13 01:19:25.732467 containerd[1615]: time="2025-09-13T01:19:25.730077850Z" level=info msg="CreateContainer within sandbox \"5e76aafdbb5b90901f769f7e1331b09d1c5f0f8d4d3c15c5626a0faab60aed1c\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"5cb63349102db637f8795d9e9f22b76017267086fbb8f7f9188665566a3af06c\""
Sep 13 01:19:25.732467 containerd[1615]: time="2025-09-13T01:19:25.731245904Z" level=info msg="StartContainer for \"5cb63349102db637f8795d9e9f22b76017267086fbb8f7f9188665566a3af06c\""
Sep 13 01:19:26.163749 containerd[1615]: time="2025-09-13T01:19:26.163641664Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:19:26.163749 containerd[1615]: time="2025-09-13T01:19:26.163710010Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 13 01:19:26.173644 containerd[1615]: time="2025-09-13T01:19:26.173182610Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 532.951514ms"
Sep 13 01:19:26.173644 containerd[1615]: time="2025-09-13T01:19:26.173252643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\""
Sep 13 01:19:26.181202 containerd[1615]: time="2025-09-13T01:19:26.179232764Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 13 01:19:26.199341 containerd[1615]: time="2025-09-13T01:19:26.197337720Z" level=info msg="StartContainer for \"5cb63349102db637f8795d9e9f22b76017267086fbb8f7f9188665566a3af06c\" returns successfully"
Sep 13 01:19:26.201293 containerd[1615]: time="2025-09-13T01:19:26.199962756Z" level=info msg="CreateContainer within sandbox \"2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 13 01:19:26.240971 containerd[1615]: time="2025-09-13T01:19:26.240692689Z" level=info msg="CreateContainer within sandbox \"2c00ba48d129271e56c2d0a55f6fa2b2b3a8e70714702a99c55c6b5ed6f7b66a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8fd3e6f3bb5c07bbbe85e623ea0f315a9e34a230503316dad3adb02cfc422349\""
Sep 13 01:19:26.242588 containerd[1615]: time="2025-09-13T01:19:26.241791328Z" level=info msg="StartContainer for \"8fd3e6f3bb5c07bbbe85e623ea0f315a9e34a230503316dad3adb02cfc422349\""
Sep 13 01:19:26.309108 systemd[1]: Started sshd@9-10.230.52.250:22-139.178.68.195:59338.service - OpenSSH per-connection server daemon (139.178.68.195:59338).
Sep 13 01:19:26.929017 containerd[1615]: time="2025-09-13T01:19:26.928933863Z" level=info msg="StartContainer for \"8fd3e6f3bb5c07bbbe85e623ea0f315a9e34a230503316dad3adb02cfc422349\" returns successfully"
Sep 13 01:19:27.320029 kubelet[2861]: I0913 01:19:27.319818 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-69479b54b-nqqrf" podStartSLOduration=4.346361872 podStartE2EDuration="34.319785947s" podCreationTimestamp="2025-09-13 01:18:53 +0000 UTC" firstStartedPulling="2025-09-13 01:18:55.618857824 +0000 UTC m=+48.562615456" lastFinishedPulling="2025-09-13 01:19:25.592281883 +0000 UTC m=+78.536039531" observedRunningTime="2025-09-13 01:19:27.238375801 +0000 UTC m=+80.182133461" watchObservedRunningTime="2025-09-13 01:19:27.319785947 +0000 UTC m=+80.263543587"
Sep 13 01:19:27.409128 sshd[5914]: Accepted publickey for core from 139.178.68.195 port 59338 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:19:27.412869 sshd[5914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:19:27.495882 systemd-logind[1593]: New session 12 of user core.
Sep 13 01:19:27.504913 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 13 01:19:27.703889 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:19:27.697341 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:19:27.697417 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:19:29.031310 sshd[5914]: pam_unix(sshd:session): session closed for user core
Sep 13 01:19:29.054183 systemd[1]: sshd@9-10.230.52.250:22-139.178.68.195:59338.service: Deactivated successfully.
Sep 13 01:19:29.062150 systemd[1]: session-12.scope: Deactivated successfully.
Sep 13 01:19:29.062915 systemd-logind[1593]: Session 12 logged out. Waiting for processes to exit.
Sep 13 01:19:29.093994 systemd-logind[1593]: Removed session 12.
Sep 13 01:19:29.583078 containerd[1615]: time="2025-09-13T01:19:29.580973815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:19:29.590015 containerd[1615]: time="2025-09-13T01:19:29.586698301Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542"
Sep 13 01:19:29.592996 containerd[1615]: time="2025-09-13T01:19:29.590248847Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:19:29.613828 containerd[1615]: time="2025-09-13T01:19:29.608216323Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 01:19:29.613828 containerd[1615]: time="2025-09-13T01:19:29.611724513Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 3.432447773s"
Sep 13 01:19:29.613828 containerd[1615]: time="2025-09-13T01:19:29.611777493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\""
Sep 13 01:19:29.714319 containerd[1615]: time="2025-09-13T01:19:29.714256714Z" level=info msg="CreateContainer within sandbox \"f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 13 01:19:29.755203 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:19:29.734384 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:19:29.734401 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:19:29.825916 containerd[1615]: time="2025-09-13T01:19:29.825854507Z" level=info msg="CreateContainer within sandbox \"f30921b1a7f33a505f2e55577254fd2caf0c69f861dc1c87b198e040ab242202\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3c22316a38d5b822fb5c31245f3d244daddd6a9cd26b057d1172bec3d0bc0d48\""
Sep 13 01:19:29.863699 containerd[1615]: time="2025-09-13T01:19:29.863058599Z" level=info msg="StartContainer for \"3c22316a38d5b822fb5c31245f3d244daddd6a9cd26b057d1172bec3d0bc0d48\""
Sep 13 01:19:30.125638 containerd[1615]: time="2025-09-13T01:19:30.125282440Z" level=info msg="StartContainer for \"3c22316a38d5b822fb5c31245f3d244daddd6a9cd26b057d1172bec3d0bc0d48\" returns successfully"
Sep 13 01:19:30.267572 kubelet[2861]: I0913 01:19:30.267484 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-794c96ccd7-d8jh4" podStartSLOduration=40.502506973 podStartE2EDuration="1m7.267438945s" podCreationTimestamp="2025-09-13 01:18:23 +0000 UTC" firstStartedPulling="2025-09-13 01:18:59.411787549 +0000 UTC m=+52.355545186" lastFinishedPulling="2025-09-13 01:19:26.176719517 +0000 UTC m=+79.120477158" observedRunningTime="2025-09-13 01:19:27.328399844 +0000 UTC m=+80.272157514" watchObservedRunningTime="2025-09-13 01:19:30.267438945 +0000 UTC m=+83.211196579"
Sep 13 01:19:30.269425 kubelet[2861]: I0913 01:19:30.267772 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-4zbwf" podStartSLOduration=28.263495368 podStartE2EDuration="1m2.267720915s" podCreationTimestamp="2025-09-13 01:18:28 +0000 UTC" firstStartedPulling="2025-09-13 01:18:55.625513153 +0000 UTC m=+48.569270782" lastFinishedPulling="2025-09-13 01:19:29.6297387 +0000 UTC m=+82.573496329" observedRunningTime="2025-09-13 01:19:30.262265717 +0000 UTC m=+83.206023372" watchObservedRunningTime="2025-09-13 01:19:30.267720915 +0000 UTC m=+83.211478561"
Sep 13 01:19:30.693326 kubelet[2861]: I0913 01:19:30.692169 2861 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 13 01:19:30.696142 kubelet[2861]: I0913 01:19:30.694419 2861 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 13 01:19:31.798732 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:19:31.782499 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:19:31.782511 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:19:33.830459 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:19:33.838289 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:19:33.830472 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:19:34.207682 systemd[1]: Started sshd@10-10.230.52.250:22-139.178.68.195:48574.service - OpenSSH per-connection server daemon (139.178.68.195:48574).
Sep 13 01:19:35.231101 sshd[6039]: Accepted publickey for core from 139.178.68.195 port 48574 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:19:35.234005 sshd[6039]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:19:35.246250 systemd-logind[1593]: New session 13 of user core.
Sep 13 01:19:35.252842 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 13 01:19:36.896735 sshd[6039]: pam_unix(sshd:session): session closed for user core
Sep 13 01:19:36.905931 systemd-logind[1593]: Session 13 logged out. Waiting for processes to exit.
Sep 13 01:19:36.909884 systemd[1]: sshd@10-10.230.52.250:22-139.178.68.195:48574.service: Deactivated successfully.
Sep 13 01:19:36.937624 systemd[1]: session-13.scope: Deactivated successfully.
Sep 13 01:19:36.945180 systemd-logind[1593]: Removed session 13.
Sep 13 01:19:37.929246 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:19:37.926372 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:19:37.926418 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:19:42.050315 systemd[1]: Started sshd@11-10.230.52.250:22-139.178.68.195:40978.service - OpenSSH per-connection server daemon (139.178.68.195:40978).
Sep 13 01:19:43.011690 sshd[6099]: Accepted publickey for core from 139.178.68.195 port 40978 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:19:43.015813 sshd[6099]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:19:43.032921 systemd-logind[1593]: New session 14 of user core.
Sep 13 01:19:43.038526 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 13 01:19:43.635801 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:19:43.639195 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:19:43.639224 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:19:44.342419 sshd[6099]: pam_unix(sshd:session): session closed for user core
Sep 13 01:19:44.353596 systemd-logind[1593]: Session 14 logged out. Waiting for processes to exit.
Sep 13 01:19:44.360359 systemd[1]: sshd@11-10.230.52.250:22-139.178.68.195:40978.service: Deactivated successfully.
Sep 13 01:19:44.379619 systemd[1]: session-14.scope: Deactivated successfully.
Sep 13 01:19:44.385323 systemd-logind[1593]: Removed session 14.
Sep 13 01:19:44.490382 systemd[1]: Started sshd@12-10.230.52.250:22-139.178.68.195:40986.service - OpenSSH per-connection server daemon (139.178.68.195:40986).
Sep 13 01:19:45.155422 systemd[1]: Started sshd@13-10.230.52.250:22-194.0.234.19:31402.service - OpenSSH per-connection server daemon (194.0.234.19:31402).
Sep 13 01:19:45.397155 sshd[6117]: Accepted publickey for core from 139.178.68.195 port 40986 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:19:45.398866 sshd[6117]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:19:45.411041 systemd-logind[1593]: New session 15 of user core.
Sep 13 01:19:45.417304 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 13 01:19:45.831075 sshd[6119]: Invalid user admin from 194.0.234.19 port 31402
Sep 13 01:19:45.921392 sshd[6119]: Connection closed by invalid user admin 194.0.234.19 port 31402 [preauth]
Sep 13 01:19:45.929086 systemd[1]: sshd@13-10.230.52.250:22-194.0.234.19:31402.service: Deactivated successfully.
Sep 13 01:19:46.446249 sshd[6117]: pam_unix(sshd:session): session closed for user core
Sep 13 01:19:46.456621 systemd[1]: sshd@12-10.230.52.250:22-139.178.68.195:40986.service: Deactivated successfully.
Sep 13 01:19:46.468355 systemd[1]: session-15.scope: Deactivated successfully.
Sep 13 01:19:46.470249 systemd-logind[1593]: Session 15 logged out. Waiting for processes to exit.
Sep 13 01:19:46.475687 systemd-logind[1593]: Removed session 15.
Sep 13 01:19:46.591342 systemd[1]: Started sshd@14-10.230.52.250:22-139.178.68.195:40990.service - OpenSSH per-connection server daemon (139.178.68.195:40990).
Sep 13 01:19:47.537264 sshd[6134]: Accepted publickey for core from 139.178.68.195 port 40990 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:19:47.541241 sshd[6134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:19:47.551456 systemd-logind[1593]: New session 16 of user core.
Sep 13 01:19:47.559602 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 13 01:19:48.333255 sshd[6134]: pam_unix(sshd:session): session closed for user core
Sep 13 01:19:48.342273 systemd-logind[1593]: Session 16 logged out. Waiting for processes to exit.
Sep 13 01:19:48.343546 systemd[1]: sshd@14-10.230.52.250:22-139.178.68.195:40990.service: Deactivated successfully.
Sep 13 01:19:48.348298 systemd[1]: session-16.scope: Deactivated successfully.
Sep 13 01:19:48.351266 systemd-logind[1593]: Removed session 16.
Sep 13 01:19:53.484425 systemd[1]: Started sshd@15-10.230.52.250:22-139.178.68.195:51320.service - OpenSSH per-connection server daemon (139.178.68.195:51320).
Sep 13 01:19:54.443066 sshd[6155]: Accepted publickey for core from 139.178.68.195 port 51320 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:19:54.444317 sshd[6155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:19:54.456513 systemd-logind[1593]: New session 17 of user core.
Sep 13 01:19:54.462900 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 13 01:19:55.621402 sshd[6155]: pam_unix(sshd:session): session closed for user core
Sep 13 01:19:55.628967 systemd[1]: sshd@15-10.230.52.250:22-139.178.68.195:51320.service: Deactivated successfully.
Sep 13 01:19:55.636549 systemd-logind[1593]: Session 17 logged out. Waiting for processes to exit.
Sep 13 01:19:55.638761 systemd[1]: session-17.scope: Deactivated successfully.
Sep 13 01:19:55.643320 systemd-logind[1593]: Removed session 17.
Sep 13 01:19:55.667213 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:19:55.663248 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:19:55.663287 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:00.767374 systemd[1]: Started sshd@16-10.230.52.250:22-139.178.68.195:50550.service - OpenSSH per-connection server daemon (139.178.68.195:50550).
Sep 13 01:20:01.271535 systemd[1]: run-containerd-runc-k8s.io-10133b9378be78c00a3efbaa0e2d019e1616b66844159d9fdc2e7da331a814e3-runc.HxFo4u.mount: Deactivated successfully.
Sep 13 01:20:01.725114 sshd[6170]: Accepted publickey for core from 139.178.68.195 port 50550 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:20:01.729231 sshd[6170]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:20:01.744233 systemd-logind[1593]: New session 18 of user core.
Sep 13 01:20:01.753952 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 13 01:20:03.035129 sshd[6170]: pam_unix(sshd:session): session closed for user core
Sep 13 01:20:03.043083 systemd[1]: sshd@16-10.230.52.250:22-139.178.68.195:50550.service: Deactivated successfully.
Sep 13 01:20:03.054397 systemd-logind[1593]: Session 18 logged out. Waiting for processes to exit.
Sep 13 01:20:03.055151 systemd[1]: session-18.scope: Deactivated successfully.
Sep 13 01:20:03.057054 systemd-logind[1593]: Removed session 18.
Sep 13 01:20:03.662180 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:03.655774 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:03.655787 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:08.184320 systemd[1]: Started sshd@17-10.230.52.250:22-139.178.68.195:50558.service - OpenSSH per-connection server daemon (139.178.68.195:50558).
Sep 13 01:20:09.144926 sshd[6208]: Accepted publickey for core from 139.178.68.195 port 50558 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:20:09.148543 sshd[6208]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:20:09.161073 systemd-logind[1593]: New session 19 of user core.
Sep 13 01:20:09.169543 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 13 01:20:10.284860 sshd[6208]: pam_unix(sshd:session): session closed for user core
Sep 13 01:20:10.297163 systemd-logind[1593]: Session 19 logged out. Waiting for processes to exit.
Sep 13 01:20:10.299948 systemd[1]: sshd@17-10.230.52.250:22-139.178.68.195:50558.service: Deactivated successfully.
Sep 13 01:20:10.310864 systemd[1]: session-19.scope: Deactivated successfully.
Sep 13 01:20:10.321039 systemd-logind[1593]: Removed session 19.
Sep 13 01:20:10.440362 systemd[1]: Started sshd@18-10.230.52.250:22-139.178.68.195:47094.service - OpenSSH per-connection server daemon (139.178.68.195:47094).
Sep 13 01:20:11.239294 systemd[1]: run-containerd-runc-k8s.io-ee1d983b0fe8af28d38a552cd04df7689abaa4f1021eb36ea26592b8132e9881-runc.SZnb1k.mount: Deactivated successfully.
Sep 13 01:20:11.350013 sshd[6225]: Accepted publickey for core from 139.178.68.195 port 47094 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:20:11.358730 sshd[6225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:20:11.379196 systemd-logind[1593]: New session 20 of user core.
Sep 13 01:20:11.382791 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 13 01:20:12.524915 sshd[6225]: pam_unix(sshd:session): session closed for user core
Sep 13 01:20:12.542827 systemd[1]: sshd@18-10.230.52.250:22-139.178.68.195:47094.service: Deactivated successfully.
Sep 13 01:20:12.546190 systemd-logind[1593]: Session 20 logged out. Waiting for processes to exit.
Sep 13 01:20:12.551789 systemd[1]: session-20.scope: Deactivated successfully.
Sep 13 01:20:12.556381 systemd-logind[1593]: Removed session 20.
Sep 13 01:20:12.673505 systemd[1]: Started sshd@19-10.230.52.250:22-139.178.68.195:47102.service - OpenSSH per-connection server daemon (139.178.68.195:47102).
Sep 13 01:20:13.598698 sshd[6275]: Accepted publickey for core from 139.178.68.195 port 47102 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:20:13.606329 sshd[6275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:20:13.619361 systemd-logind[1593]: New session 21 of user core.
Sep 13 01:20:13.625396 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 13 01:20:13.959639 systemd[1]: run-containerd-runc-k8s.io-5bdd921bb1f101143ee18e00b12fae4ed83ebfc9de1ef6f34bffaae54593fac5-runc.dd9Sic.mount: Deactivated successfully.
Sep 13 01:20:14.656818 containerd[1615]: time="2025-09-13T01:20:14.604175050Z" level=info msg="StopPodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\""
Sep 13 01:20:15.645315 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:15.623563 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.294 [WARNING][6317] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b466df6e-e38d-402c-86b5-4227a521ff8c", ResourceVersion:"1025", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 13, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778", Pod:"coredns-7c65d6cfc9-8m5j5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali331404c00dd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.300 [INFO][6317] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.301 [INFO][6317] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" iface="eth0" netns=""
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.301 [INFO][6317] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.301 [INFO][6317] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.542 [INFO][6324] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0"
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.544 [INFO][6324] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.544 [INFO][6324] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.590 [WARNING][6324] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0"
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.590 [INFO][6324] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0"
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.602 [INFO][6324] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 01:20:15.670095 containerd[1615]: 2025-09-13 01:20:15.606 [INFO][6317] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"
Sep 13 01:20:15.623574 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:15.687355 containerd[1615]: time="2025-09-13T01:20:15.685213402Z" level=info msg="TearDown network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\" successfully"
Sep 13 01:20:15.687355 containerd[1615]: time="2025-09-13T01:20:15.687229826Z" level=info msg="StopPodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\" returns successfully"
Sep 13 01:20:15.796333 containerd[1615]: time="2025-09-13T01:20:15.795073637Z" level=info msg="RemovePodSandbox for \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\""
Sep 13 01:20:15.799293 containerd[1615]: time="2025-09-13T01:20:15.799262440Z" level=info msg="Forcibly stopping sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\""
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:15.990 [WARNING][6340] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b466df6e-e38d-402c-86b5-4227a521ff8c", ResourceVersion:"1025", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 1, 18, 13, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-qlx5f.gb1.brightbox.com", ContainerID:"b143134d02caf4c741d507bbe29826c7df6e2a2d45db3d4d1d3e15d7e857e778", Pod:"coredns-7c65d6cfc9-8m5j5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali331404c00dd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:15.993 [INFO][6340] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:15.993 [INFO][6340] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" iface="eth0" netns=""
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:15.993 [INFO][6340] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:15.993 [INFO][6340] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:16.094 [INFO][6347] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0"
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:16.097 [INFO][6347] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:16.097 [INFO][6347] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:16.148 [WARNING][6347] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0"
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:16.148 [INFO][6347] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" HandleID="k8s-pod-network.7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1" Workload="srv--qlx5f.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--8m5j5-eth0"
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:16.158 [INFO][6347] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 01:20:16.171330 containerd[1615]: 2025-09-13 01:20:16.164 [INFO][6340] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1"
Sep 13 01:20:16.171330 containerd[1615]: time="2025-09-13T01:20:16.169753530Z" level=info msg="TearDown network for sandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\" successfully"
Sep 13 01:20:16.254735 containerd[1615]: time="2025-09-13T01:20:16.254466640Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Sep 13 01:20:16.254735 containerd[1615]: time="2025-09-13T01:20:16.254635799Z" level=info msg="RemovePodSandbox \"7b1a0aa046d5b64a533388b195042a93995893a58852d3dd7ab517953de8bbd1\" returns successfully"
Sep 13 01:20:17.702384 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:17.691030 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:17.691044 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:19.090559 sshd[6275]: pam_unix(sshd:session): session closed for user core
Sep 13 01:20:19.157681 systemd[1]: sshd@19-10.230.52.250:22-139.178.68.195:47102.service: Deactivated successfully.
Sep 13 01:20:19.169678 systemd[1]: session-21.scope: Deactivated successfully.
Sep 13 01:20:19.169739 systemd-logind[1593]: Session 21 logged out. Waiting for processes to exit.
Sep 13 01:20:19.202132 systemd-logind[1593]: Removed session 21.
Sep 13 01:20:19.238653 systemd[1]: Started sshd@20-10.230.52.250:22-139.178.68.195:47104.service - OpenSSH per-connection server daemon (139.178.68.195:47104).
Sep 13 01:20:19.725296 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:19.736126 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:19.736158 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:20.194445 sshd[6372]: Accepted publickey for core from 139.178.68.195 port 47104 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:20:20.198867 sshd[6372]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:20:20.251776 systemd-logind[1593]: New session 22 of user core.
Sep 13 01:20:20.257550 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 13 01:20:21.783594 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:21.773254 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:21.773274 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:22.413108 kubelet[2861]: E0913 01:20:22.413025 2861 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.131s"
Sep 13 01:20:23.515108 sshd[6372]: pam_unix(sshd:session): session closed for user core
Sep 13 01:20:23.563469 systemd[1]: sshd@20-10.230.52.250:22-139.178.68.195:47104.service: Deactivated successfully.
Sep 13 01:20:23.583767 systemd[1]: session-22.scope: Deactivated successfully.
Sep 13 01:20:23.584415 systemd-logind[1593]: Session 22 logged out. Waiting for processes to exit.
Sep 13 01:20:23.595897 systemd-logind[1593]: Removed session 22.
Sep 13 01:20:23.684386 systemd[1]: Started sshd@21-10.230.52.250:22-139.178.68.195:34200.service - OpenSSH per-connection server daemon (139.178.68.195:34200).
Sep 13 01:20:23.823212 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:23.820953 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:23.820992 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:24.674075 sshd[6410]: Accepted publickey for core from 139.178.68.195 port 34200 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:20:24.678782 sshd[6410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:20:24.706784 systemd-logind[1593]: New session 23 of user core.
Sep 13 01:20:24.717103 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 13 01:20:25.957488 sshd[6410]: pam_unix(sshd:session): session closed for user core
Sep 13 01:20:25.962884 systemd-logind[1593]: Session 23 logged out. Waiting for processes to exit.
Sep 13 01:20:25.965226 systemd[1]: sshd@21-10.230.52.250:22-139.178.68.195:34200.service: Deactivated successfully.
Sep 13 01:20:25.974337 systemd[1]: session-23.scope: Deactivated successfully.
Sep 13 01:20:25.979337 systemd-logind[1593]: Removed session 23.
Sep 13 01:20:29.708481 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:29.708091 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:29.708108 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:31.113455 systemd[1]: Started sshd@22-10.230.52.250:22-139.178.68.195:53718.service - OpenSSH per-connection server daemon (139.178.68.195:53718).
Sep 13 01:20:32.176514 sshd[6450]: Accepted publickey for core from 139.178.68.195 port 53718 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:20:32.200297 sshd[6450]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:20:32.233263 systemd-logind[1593]: New session 24 of user core.
Sep 13 01:20:32.238607 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 13 01:20:33.803344 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:33.798473 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:33.798486 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:34.104143 sshd[6450]: pam_unix(sshd:session): session closed for user core
Sep 13 01:20:34.116062 systemd-logind[1593]: Session 24 logged out. Waiting for processes to exit.
Sep 13 01:20:34.122025 systemd[1]: sshd@22-10.230.52.250:22-139.178.68.195:53718.service: Deactivated successfully.
Sep 13 01:20:34.134539 systemd[1]: session-24.scope: Deactivated successfully.
Sep 13 01:20:34.141356 systemd-logind[1593]: Removed session 24.
Sep 13 01:20:39.266451 systemd[1]: Started sshd@23-10.230.52.250:22-139.178.68.195:53720.service - OpenSSH per-connection server daemon (139.178.68.195:53720).
Sep 13 01:20:40.224020 sshd[6485]: Accepted publickey for core from 139.178.68.195 port 53720 ssh2: RSA SHA256:nCFR9BVD/sBsaMzu6piX/nSqoN/UcYzTi/UCsy9A7bQ
Sep 13 01:20:40.227165 sshd[6485]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 13 01:20:40.239781 systemd-logind[1593]: New session 25 of user core.
Sep 13 01:20:40.247375 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 13 01:20:41.678962 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:41.670295 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:41.670307 systemd-resolved[1511]: Flushed all caches.
Sep 13 01:20:41.897462 sshd[6485]: pam_unix(sshd:session): session closed for user core
Sep 13 01:20:41.911565 systemd[1]: sshd@23-10.230.52.250:22-139.178.68.195:53720.service: Deactivated successfully.
Sep 13 01:20:41.918064 systemd-logind[1593]: Session 25 logged out. Waiting for processes to exit.
Sep 13 01:20:41.918718 systemd[1]: session-25.scope: Deactivated successfully.
Sep 13 01:20:41.921311 systemd-logind[1593]: Removed session 25.
Sep 13 01:20:43.731209 systemd-journald[1181]: Under memory pressure, flushing caches.
Sep 13 01:20:43.719393 systemd-resolved[1511]: Under memory pressure, flushing caches.
Sep 13 01:20:43.719404 systemd-resolved[1511]: Flushed all caches.