Jan 30 19:13:20.036070 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Wed Jan 29 10:09:32 -00 2025
Jan 30 19:13:20.036119 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=befc9792b021bef43c896e00e1d5172b6224dbafc9b6c92b267e5e544378e681
Jan 30 19:13:20.036134 kernel: BIOS-provided physical RAM map:
Jan 30 19:13:20.036151 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 30 19:13:20.036162 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 30 19:13:20.036172 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 30 19:13:20.036184 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Jan 30 19:13:20.036195 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Jan 30 19:13:20.036206 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jan 30 19:13:20.036217 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jan 30 19:13:20.036227 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 30 19:13:20.036238 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 30 19:13:20.036258 kernel: NX (Execute Disable) protection: active
Jan 30 19:13:20.036269 kernel: APIC: Static calls initialized
Jan 30 19:13:20.036282 kernel: SMBIOS 2.8 present.
Jan 30 19:13:20.036294 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Jan 30 19:13:20.036306 kernel: Hypervisor detected: KVM
Jan 30 19:13:20.036322 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 30 19:13:20.036334 kernel: kvm-clock: using sched offset of 4262586926 cycles
Jan 30 19:13:20.036347 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 30 19:13:20.036359 kernel: tsc: Detected 2499.998 MHz processor
Jan 30 19:13:20.036371 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 30 19:13:20.036384 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 30 19:13:20.036395 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Jan 30 19:13:20.036407 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 30 19:13:20.036419 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 30 19:13:20.036435 kernel: Using GB pages for direct mapping
Jan 30 19:13:20.036447 kernel: ACPI: Early table checksum verification disabled
Jan 30 19:13:20.036459 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 30 19:13:20.036471 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 30 19:13:20.036483 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 30 19:13:20.036495 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 30 19:13:20.036507 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Jan 30 19:13:20.036518 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 30 19:13:20.036530 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 30 19:13:20.036546 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 30 19:13:20.036559 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 30 19:13:20.036570 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Jan 30 19:13:20.036582 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Jan 30 19:13:20.036594 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Jan 30 19:13:20.036612 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Jan 30 19:13:20.036625 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Jan 30 19:13:20.036641 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Jan 30 19:13:20.036654 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Jan 30 19:13:20.036666 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 30 19:13:20.036678 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Jan 30 19:13:20.036691 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 30 19:13:20.036703 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Jan 30 19:13:20.036715 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 30 19:13:20.036732 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Jan 30 19:13:20.036744 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 30 19:13:20.036756 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Jan 30 19:13:20.036769 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 30 19:13:20.036802 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Jan 30 19:13:20.036817 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 30 19:13:20.036830 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Jan 30 19:13:20.036842 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 30 19:13:20.036854 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Jan 30 19:13:20.036866 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 30 19:13:20.036885 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Jan 30 19:13:20.036898 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 30 19:13:20.036911 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 30 19:13:20.036923 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Jan 30 19:13:20.036936 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Jan 30 19:13:20.036948 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Jan 30 19:13:20.036961 kernel: Zone ranges:
Jan 30 19:13:20.036973 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 30 19:13:20.036985 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Jan 30 19:13:20.037002 kernel: Normal empty
Jan 30 19:13:20.037015 kernel: Movable zone start for each node
Jan 30 19:13:20.037027 kernel: Early memory node ranges
Jan 30 19:13:20.037042 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Jan 30 19:13:20.037055 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Jan 30 19:13:20.037067 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Jan 30 19:13:20.037080 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 30 19:13:20.037102 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 30 19:13:20.037117 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Jan 30 19:13:20.037129 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 30 19:13:20.037147 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 30 19:13:20.037160 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 30 19:13:20.037172 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 30 19:13:20.037185 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 30 19:13:20.037197 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 30 19:13:20.037209 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 30 19:13:20.037222 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 30 19:13:20.037234 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 30 19:13:20.037246 kernel: TSC deadline timer available
Jan 30 19:13:20.037264 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Jan 30 19:13:20.037277 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 30 19:13:20.037289 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jan 30 19:13:20.037301 kernel: Booting paravirtualized kernel on KVM
Jan 30 19:13:20.037313 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 30 19:13:20.037326 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Jan 30 19:13:20.037338 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 30 19:13:20.037350 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 30 19:13:20.037363 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Jan 30 19:13:20.037379 kernel: kvm-guest: PV spinlocks enabled
Jan 30 19:13:20.037392 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 30 19:13:20.037406 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=befc9792b021bef43c896e00e1d5172b6224dbafc9b6c92b267e5e544378e681
Jan 30 19:13:20.037419 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 30 19:13:20.037431 kernel: random: crng init done
Jan 30 19:13:20.037444 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 30 19:13:20.037456 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 30 19:13:20.037468 kernel: Fallback order for Node 0: 0
Jan 30 19:13:20.037486 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Jan 30 19:13:20.037498 kernel: Policy zone: DMA32
Jan 30 19:13:20.037511 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 30 19:13:20.037523 kernel: software IO TLB: area num 16.
Jan 30 19:13:20.037543 kernel: Memory: 1901524K/2096616K available (12288K kernel code, 2301K rwdata, 22728K rodata, 42844K init, 2348K bss, 194832K reserved, 0K cma-reserved)
Jan 30 19:13:20.037556 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Jan 30 19:13:20.037568 kernel: Kernel/User page tables isolation: enabled
Jan 30 19:13:20.037581 kernel: ftrace: allocating 37921 entries in 149 pages
Jan 30 19:13:20.037593 kernel: ftrace: allocated 149 pages with 4 groups
Jan 30 19:13:20.037611 kernel: Dynamic Preempt: voluntary
Jan 30 19:13:20.037628 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 30 19:13:20.037642 kernel: rcu: RCU event tracing is enabled.
Jan 30 19:13:20.037654 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Jan 30 19:13:20.037667 kernel: Trampoline variant of Tasks RCU enabled.
Jan 30 19:13:20.037701 kernel: Rude variant of Tasks RCU enabled.
Jan 30 19:13:20.037719 kernel: Tracing variant of Tasks RCU enabled.
Jan 30 19:13:20.037732 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 30 19:13:20.037745 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Jan 30 19:13:20.037758 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Jan 30 19:13:20.037771 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 30 19:13:20.040856 kernel: Console: colour VGA+ 80x25
Jan 30 19:13:20.040894 kernel: printk: console [tty0] enabled
Jan 30 19:13:20.040908 kernel: printk: console [ttyS0] enabled
Jan 30 19:13:20.040921 kernel: ACPI: Core revision 20230628
Jan 30 19:13:20.040933 kernel: APIC: Switch to symmetric I/O mode setup
Jan 30 19:13:20.040945 kernel: x2apic enabled
Jan 30 19:13:20.040975 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 30 19:13:20.040988 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Jan 30 19:13:20.041001 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Jan 30 19:13:20.041014 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 30 19:13:20.041039 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 30 19:13:20.041052 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 30 19:13:20.041065 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 30 19:13:20.041078 kernel: Spectre V2 : Mitigation: Retpolines
Jan 30 19:13:20.041091 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 30 19:13:20.041123 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Jan 30 19:13:20.041137 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Jan 30 19:13:20.041150 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 30 19:13:20.041163 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 30 19:13:20.041175 kernel: MDS: Mitigation: Clear CPU buffers
Jan 30 19:13:20.041188 kernel: MMIO Stale Data: Unknown: No mitigations
Jan 30 19:13:20.041201 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 30 19:13:20.041214 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 30 19:13:20.041227 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 30 19:13:20.041240 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 30 19:13:20.041253 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 30 19:13:20.041271 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Jan 30 19:13:20.041284 kernel: Freeing SMP alternatives memory: 32K
Jan 30 19:13:20.041297 kernel: pid_max: default: 32768 minimum: 301
Jan 30 19:13:20.041310 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 30 19:13:20.041323 kernel: landlock: Up and running.
Jan 30 19:13:20.041336 kernel: SELinux: Initializing.
Jan 30 19:13:20.041349 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 30 19:13:20.041362 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 30 19:13:20.041374 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Jan 30 19:13:20.041387 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 30 19:13:20.041401 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 30 19:13:20.041419 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 30 19:13:20.041432 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Jan 30 19:13:20.041445 kernel: signal: max sigframe size: 1776
Jan 30 19:13:20.041458 kernel: rcu: Hierarchical SRCU implementation.
Jan 30 19:13:20.041472 kernel: rcu: Max phase no-delay instances is 400.
Jan 30 19:13:20.041485 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Jan 30 19:13:20.041498 kernel: smp: Bringing up secondary CPUs ...
Jan 30 19:13:20.041511 kernel: smpboot: x86: Booting SMP configuration:
Jan 30 19:13:20.041524 kernel: .... node #0, CPUs: #1
Jan 30 19:13:20.041541 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Jan 30 19:13:20.041555 kernel: smp: Brought up 1 node, 2 CPUs
Jan 30 19:13:20.041568 kernel: smpboot: Max logical packages: 16
Jan 30 19:13:20.041581 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Jan 30 19:13:20.041594 kernel: devtmpfs: initialized
Jan 30 19:13:20.041607 kernel: x86/mm: Memory block size: 128MB
Jan 30 19:13:20.041620 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 30 19:13:20.041645 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Jan 30 19:13:20.041658 kernel: pinctrl core: initialized pinctrl subsystem
Jan 30 19:13:20.041674 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 30 19:13:20.041687 kernel: audit: initializing netlink subsys (disabled)
Jan 30 19:13:20.041700 kernel: audit: type=2000 audit(1738264398.453:1): state=initialized audit_enabled=0 res=1
Jan 30 19:13:20.041724 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 30 19:13:20.041736 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 30 19:13:20.041749 kernel: cpuidle: using governor menu
Jan 30 19:13:20.041761 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 30 19:13:20.041773 kernel: dca service started, version 1.12.1
Jan 30 19:13:20.041785 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Jan 30 19:13:20.041819 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Jan 30 19:13:20.041836 kernel: PCI: Using configuration type 1 for base access
Jan 30 19:13:20.041848 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 30 19:13:20.041861 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 30 19:13:20.041873 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 30 19:13:20.041897 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 30 19:13:20.041910 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 30 19:13:20.041923 kernel: ACPI: Added _OSI(Module Device)
Jan 30 19:13:20.041935 kernel: ACPI: Added _OSI(Processor Device)
Jan 30 19:13:20.041967 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Jan 30 19:13:20.041979 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 30 19:13:20.041992 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 30 19:13:20.042004 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Jan 30 19:13:20.042029 kernel: ACPI: Interpreter enabled
Jan 30 19:13:20.042041 kernel: ACPI: PM: (supports S0 S5)
Jan 30 19:13:20.042054 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 30 19:13:20.042066 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 30 19:13:20.042091 kernel: PCI: Using E820 reservations for host bridge windows
Jan 30 19:13:20.042122 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 30 19:13:20.042136 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 30 19:13:20.042379 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 30 19:13:20.042562 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 30 19:13:20.042729 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 30 19:13:20.042749 kernel: PCI host bridge to bus 0000:00
Jan 30 19:13:20.052213 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 30 19:13:20.052444 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 30 19:13:20.052613 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 30 19:13:20.052834 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Jan 30 19:13:20.052996 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 30 19:13:20.053169 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Jan 30 19:13:20.053360 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 30 19:13:20.053572 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Jan 30 19:13:20.053773 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Jan 30 19:13:20.054002 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Jan 30 19:13:20.054192 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Jan 30 19:13:20.054364 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Jan 30 19:13:20.054534 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 30 19:13:20.054735 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Jan 30 19:13:20.054938 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Jan 30 19:13:20.055169 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Jan 30 19:13:20.055381 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Jan 30 19:13:20.055577 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Jan 30 19:13:20.055755 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Jan 30 19:13:20.055991 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Jan 30 19:13:20.056189 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Jan 30 19:13:20.056374 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Jan 30 19:13:20.056548 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Jan 30 19:13:20.056729 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Jan 30 19:13:20.056929 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Jan 30 19:13:20.057151 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Jan 30 19:13:20.057334 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Jan 30 19:13:20.057520 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Jan 30 19:13:20.057692 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Jan 30 19:13:20.064192 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Jan 30 19:13:20.064384 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Jan 30 19:13:20.064566 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Jan 30 19:13:20.064744 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Jan 30 19:13:20.064949 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Jan 30 19:13:20.065165 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Jan 30 19:13:20.065341 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Jan 30 19:13:20.065552 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Jan 30 19:13:20.065724 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Jan 30 19:13:20.065995 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Jan 30 19:13:20.066189 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 30 19:13:20.066385 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Jan 30 19:13:20.066590 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Jan 30 19:13:20.066761 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Jan 30 19:13:20.066974 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Jan 30 19:13:20.067164 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Jan 30 19:13:20.067358 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Jan 30 19:13:20.067574 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Jan 30 19:13:20.067757 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 30 19:13:20.075354 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 30 19:13:20.075552 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 30 19:13:20.075746 kernel: pci_bus 0000:02: extended config space not accessible
Jan 30 19:13:20.075985 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Jan 30 19:13:20.076194 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Jan 30 19:13:20.076370 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 30 19:13:20.076564 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 30 19:13:20.076755 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Jan 30 19:13:20.076987 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Jan 30 19:13:20.077182 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 30 19:13:20.077351 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 30 19:13:20.077527 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 30 19:13:20.077763 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Jan 30 19:13:20.077975 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Jan 30 19:13:20.078195 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 30 19:13:20.078400 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 30 19:13:20.078569 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 30 19:13:20.078744 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 30 19:13:20.078986 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 30 19:13:20.079179 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 30 19:13:20.079384 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 30 19:13:20.079598 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 30 19:13:20.079864 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 30 19:13:20.080036 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 30 19:13:20.080247 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 30 19:13:20.080446 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 30 19:13:20.080631 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 30 19:13:20.080888 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 30 19:13:20.081055 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 30 19:13:20.081306 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 30 19:13:20.081485 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 30 19:13:20.081661 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 30 19:13:20.081682 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 30 19:13:20.081696 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 30 19:13:20.081711 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 30 19:13:20.081733 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 30 19:13:20.081747 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 30 19:13:20.081809 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 30 19:13:20.081826 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 30 19:13:20.081839 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 30 19:13:20.081853 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 30 19:13:20.081867 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 30 19:13:20.081880 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 30 19:13:20.081893 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 30 19:13:20.081924 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 30 19:13:20.081939 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 30 19:13:20.081959 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 30 19:13:20.081972 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 30 19:13:20.081986 kernel: iommu: Default domain type: Translated
Jan 30 19:13:20.081999 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 30 19:13:20.082022 kernel: PCI: Using ACPI for IRQ routing
Jan 30 19:13:20.082035 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 30 19:13:20.082049 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 30 19:13:20.082068 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Jan 30 19:13:20.082298 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 30 19:13:20.082479 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 30 19:13:20.082703 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 30 19:13:20.082725 kernel: vgaarb: loaded
Jan 30 19:13:20.082739 kernel: clocksource: Switched to clocksource kvm-clock
Jan 30 19:13:20.082753 kernel: VFS: Disk quotas dquot_6.6.0
Jan 30 19:13:20.082767 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 30 19:13:20.083837 kernel: pnp: PnP ACPI init
Jan 30 19:13:20.084032 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jan 30 19:13:20.084063 kernel: pnp: PnP ACPI: found 5 devices
Jan 30 19:13:20.084077 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 30 19:13:20.084091 kernel: NET: Registered PF_INET protocol family
Jan 30 19:13:20.084121 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 30 19:13:20.084135 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 30 19:13:20.084148 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 30 19:13:20.084162 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 30 19:13:20.084183 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 30 19:13:20.084197 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 30 19:13:20.084210 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 30 19:13:20.084224 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 30 19:13:20.084237 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 30 19:13:20.084251 kernel: NET: Registered PF_XDP protocol family
Jan 30 19:13:20.084415 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Jan 30 19:13:20.084581 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 30 19:13:20.084757 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 30 19:13:20.084945 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 30 19:13:20.085125 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 30 19:13:20.085292 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 30 19:13:20.085459 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 30 19:13:20.085625 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 30 19:13:20.087930 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Jan 30 19:13:20.088140 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Jan 30 19:13:20.088359 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Jan 30 19:13:20.088531 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Jan 30 19:13:20.088701 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Jan 30 19:13:20.088894 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Jan 30 19:13:20.089062 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Jan 30 19:13:20.089251 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Jan 30 19:13:20.089453 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 30 19:13:20.089657 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 30 19:13:20.091917 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 30 19:13:20.092131 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jan 30 19:13:20.092310 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 30 19:13:20.092479 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 30 19:13:20.092648 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 30 19:13:20.093373 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jan 30 19:13:20.093557 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 30 19:13:20.093755 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 30 19:13:20.093962 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 30 19:13:20.094155 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jan 30 19:13:20.094324 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 30 19:13:20.094504 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 30 19:13:20.094684 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 30 19:13:20.094932 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jan 30 19:13:20.095127 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 30 19:13:20.095315 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 30 19:13:20.095484 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 30 19:13:20.095662 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jan 30 19:13:20.095921 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 30 19:13:20.096185 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 30 19:13:20.096358 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 30 19:13:20.096533 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jan 30 19:13:20.096700 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 30 19:13:20.096913 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 30 19:13:20.097076 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 30 19:13:20.097280 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jan 30 19:13:20.097456 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 30 19:13:20.097639 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 30 19:13:20.097858 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 30 19:13:20.098066 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jan 30 19:13:20.098259 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 30 19:13:20.098425 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 30 19:13:20.098595 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jan 30 19:13:20.098756 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jan 30 19:13:20.098938 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 30 19:13:20.099113 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Jan 30 19:13:20.099267 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jan 30 19:13:20.099417 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Jan 30 19:13:20.099603 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jan 30 19:13:20.099822 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Jan 30 19:13:20.099991 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 30 19:13:20.100177 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Jan 30 19:13:20.100358 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Jan 30 19:13:20.100516 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Jan 30 19:13:20.100672 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 30 19:13:20.100900 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Jan 30 19:13:20.101060 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Jan 30 19:13:20.101230 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 30 19:13:20.101415 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Jan 30 19:13:20.101572 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Jan 30 19:13:20.101729 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 30 19:13:20.101950 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Jan 30 19:13:20.102162 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Jan 30 19:13:20.102322 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 30 19:13:20.102514 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Jan 30 19:13:20.102683 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Jan 30 19:13:20.102887 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 30 19:13:20.103055 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Jan 30 19:13:20.103225 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Jan 30 19:13:20.103382 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 30 19:13:20.103557 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Jan 30 19:13:20.103724 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Jan 30 19:13:20.103920 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 30 19:13:20.103943 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jan 30 19:13:20.103958 kernel: PCI: CLS 0 bytes, default 64
Jan 30 19:13:20.103972 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 30 19:13:20.103985 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB)
Jan 30 19:13:20.103999 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Jan 30 19:13:20.104014 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Jan 30 19:13:20.104040 kernel: Initialise system trusted keyrings
Jan 30 19:13:20.104062 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Jan 30 19:13:20.104076 kernel: Key type asymmetric registered
Jan 30 19:13:20.104090 kernel: Asymmetric key parser 'x509' registered
Jan 30 19:13:20.104116 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Jan 30 19:13:20.104131 kernel: io scheduler mq-deadline registered
Jan 30 19:13:20.104145 kernel: io scheduler kyber registered
Jan 30 19:13:20.104159 kernel: io scheduler bfq registered
Jan 30 19:13:20.104328 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jan 30 19:13:20.104498 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jan 30 19:13:20.104678 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 30 19:13:20.104899 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jan 30 19:13:20.105089 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jan 30 19:13:20.105279 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 30 19:13:20.105446 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jan 30 19:13:20.105610 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jan 30 19:13:20.105784 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 30 19:13:20.105989 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jan 30 19:13:20.106168 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jan 30 19:13:20.106334 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 30 19:13:20.106502 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jan 30 19:13:20.106671 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jan 30 19:13:20.106892 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 30 19:13:20.107081 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jan 30 19:13:20.107272 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jan 30 19:13:20.107438 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 30 19:13:20.107604 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jan 30 19:13:20.107768 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jan 30 19:13:20.107989 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 30 19:13:20.108273 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jan 30 19:13:20.108441 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jan 30 19:13:20.108608 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 30 19:13:20.108629 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Jan 30 19:13:20.108645 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jan 30 19:13:20.108667 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jan 30 19:13:20.108682 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 30 19:13:20.108697 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 30 19:13:20.108711 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 30 19:13:20.108725 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 30 19:13:20.108739 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 30 19:13:20.108963 kernel: rtc_cmos 00:03: RTC can wake from S4
Jan 30 19:13:20.108987 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Jan 30 19:13:20.109162 kernel: rtc_cmos 00:03: registered as rtc0
Jan 30 19:13:20.109319 kernel: rtc_cmos 00:03: setting system clock to 2025-01-30T19:13:19 UTC (1738264399)
Jan 30 19:13:20.109473 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Jan 30 19:13:20.109494 kernel: intel_pstate: CPU model not supported
Jan 30 19:13:20.109515 kernel: NET: Registered PF_INET6 protocol family
Jan 30 19:13:20.109530 kernel: Segment Routing with IPv6
Jan 30 19:13:20.109544 kernel: In-situ OAM (IOAM) with IPv6
Jan 30 19:13:20.109558 kernel: NET: Registered PF_PACKET protocol family
Jan 30 19:13:20.109573 kernel: Key type dns_resolver registered
Jan 30 19:13:20.109591 kernel: IPI shorthand broadcast: enabled
Jan 30 19:13:20.109606 kernel: sched_clock: Marking stable (1161003603, 237862336)->(1630539458, -231673519)
Jan 30 19:13:20.109620 kernel: registered taskstats version 1
Jan 30 19:13:20.109634 kernel: Loading compiled-in X.509 certificates
Jan 30 19:13:20.109648 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: 1efdcbe72fc44d29e4e6411cf9a3e64046be4375'
Jan 30 19:13:20.109661 kernel: Key type .fscrypt registered
Jan 30 19:13:20.109675 kernel: Key type fscrypt-provisioning registered
Jan 30 19:13:20.109689 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 30 19:13:20.109703 kernel: ima: Allocated hash algorithm: sha1
Jan 30 19:13:20.109723 kernel: ima: No architecture policies found
Jan 30 19:13:20.109736 kernel: clk: Disabling unused clocks
Jan 30 19:13:20.109750 kernel: Freeing unused kernel image (initmem) memory: 42844K
Jan 30 19:13:20.109764 kernel: Write protecting the kernel read-only data: 36864k
Jan 30 19:13:20.109778 kernel: Freeing unused kernel image (rodata/data gap) memory: 1848K
Jan 30 19:13:20.109820 kernel: Run /init as init process
Jan 30 19:13:20.109835 kernel: with arguments:
Jan 30 19:13:20.109850 kernel: /init
Jan 30 19:13:20.109864 kernel: with environment:
Jan 30 19:13:20.109885 kernel: HOME=/
Jan 30 19:13:20.109911 kernel: TERM=linux
Jan 30 19:13:20.109924 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Jan 30 19:13:20.109941 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 30 19:13:20.109971 systemd[1]: Detected virtualization kvm.
Jan 30 19:13:20.109986 systemd[1]: Detected architecture x86-64.
Jan 30 19:13:20.110000 systemd[1]: Running in initrd.
Jan 30 19:13:20.110015 systemd[1]: No hostname configured, using default hostname.
Jan 30 19:13:20.110034 systemd[1]: Hostname set to .
Jan 30 19:13:20.110050 systemd[1]: Initializing machine ID from VM UUID.
Jan 30 19:13:20.110065 systemd[1]: Queued start job for default target initrd.target.
Jan 30 19:13:20.110080 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 30 19:13:20.110108 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 30 19:13:20.110125 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 30 19:13:20.110140 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 30 19:13:20.110162 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 30 19:13:20.110177 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 30 19:13:20.110194 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 30 19:13:20.110210 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 30 19:13:20.110235 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 30 19:13:20.110252 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 30 19:13:20.110267 systemd[1]: Reached target paths.target - Path Units.
Jan 30 19:13:20.110288 systemd[1]: Reached target slices.target - Slice Units.
Jan 30 19:13:20.110304 systemd[1]: Reached target swap.target - Swaps.
Jan 30 19:13:20.110318 systemd[1]: Reached target timers.target - Timer Units.
Jan 30 19:13:20.110333 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 30 19:13:20.110348 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 30 19:13:20.110363 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 30 19:13:20.110378 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 30 19:13:20.110393 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 30 19:13:20.110408 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 30 19:13:20.110429 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 30 19:13:20.110443 systemd[1]: Reached target sockets.target - Socket Units.
Jan 30 19:13:20.110458 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 30 19:13:20.110481 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 30 19:13:20.110496 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 30 19:13:20.110511 systemd[1]: Starting systemd-fsck-usr.service...
Jan 30 19:13:20.110526 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 30 19:13:20.110541 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 30 19:13:20.110556 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 30 19:13:20.110576 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 30 19:13:20.110591 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 30 19:13:20.110662 systemd-journald[200]: Collecting audit messages is disabled.
Jan 30 19:13:20.110712 systemd[1]: Finished systemd-fsck-usr.service.
Jan 30 19:13:20.110734 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 30 19:13:20.110750 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 30 19:13:20.110764 kernel: Bridge firewalling registered
Jan 30 19:13:20.110822 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 30 19:13:20.110850 systemd-journald[200]: Journal started
Jan 30 19:13:20.110878 systemd-journald[200]: Runtime Journal (/run/log/journal/b2ca20dfef264b19afedd89300a8953e) is 4.7M, max 38.0M, 33.2M free.
Jan 30 19:13:20.027809 systemd-modules-load[201]: Inserted module 'overlay'
Jan 30 19:13:20.091414 systemd-modules-load[201]: Inserted module 'br_netfilter'
Jan 30 19:13:20.136809 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 30 19:13:20.138191 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 30 19:13:20.146068 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 30 19:13:20.149971 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 30 19:13:20.159990 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 30 19:13:20.164274 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 30 19:13:20.176177 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 30 19:13:20.177353 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 30 19:13:20.182862 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 30 19:13:20.191237 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 30 19:13:20.194688 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 30 19:13:20.206988 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 30 19:13:20.209837 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 30 19:13:20.224721 dracut-cmdline[232]: dracut-dracut-053
Jan 30 19:13:20.229134 dracut-cmdline[232]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=befc9792b021bef43c896e00e1d5172b6224dbafc9b6c92b267e5e544378e681
Jan 30 19:13:20.267306 systemd-resolved[235]: Positive Trust Anchors:
Jan 30 19:13:20.267329 systemd-resolved[235]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 30 19:13:20.267375 systemd-resolved[235]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 30 19:13:20.272722 systemd-resolved[235]: Defaulting to hostname 'linux'.
Jan 30 19:13:20.274774 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 30 19:13:20.275887 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 30 19:13:20.358836 kernel: SCSI subsystem initialized
Jan 30 19:13:20.370864 kernel: Loading iSCSI transport class v2.0-870.
Jan 30 19:13:20.383807 kernel: iscsi: registered transport (tcp)
Jan 30 19:13:20.409887 kernel: iscsi: registered transport (qla4xxx)
Jan 30 19:13:20.409951 kernel: QLogic iSCSI HBA Driver
Jan 30 19:13:20.465006 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 30 19:13:20.471976 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 30 19:13:20.503857 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 30 19:13:20.503972 kernel: device-mapper: uevent: version 1.0.3
Jan 30 19:13:20.506102 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 30 19:13:20.555878 kernel: raid6: sse2x4 gen() 13543 MB/s
Jan 30 19:13:20.573840 kernel: raid6: sse2x2 gen() 9382 MB/s
Jan 30 19:13:20.592523 kernel: raid6: sse2x1 gen() 9887 MB/s
Jan 30 19:13:20.592597 kernel: raid6: using algorithm sse2x4 gen() 13543 MB/s
Jan 30 19:13:20.611539 kernel: raid6: .... xor() 7735 MB/s, rmw enabled
Jan 30 19:13:20.611602 kernel: raid6: using ssse3x2 recovery algorithm
Jan 30 19:13:20.637819 kernel: xor: automatically using best checksumming function avx
Jan 30 19:13:20.836855 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 30 19:13:20.850610 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 30 19:13:20.857018 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 30 19:13:20.885660 systemd-udevd[419]: Using default interface naming scheme 'v255'.
Jan 30 19:13:20.892868 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 30 19:13:20.900995 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 30 19:13:20.926145 dracut-pre-trigger[425]: rd.md=0: removing MD RAID activation
Jan 30 19:13:20.966330 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 30 19:13:20.972000 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 30 19:13:21.090067 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 30 19:13:21.101015 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 30 19:13:21.119743 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 30 19:13:21.124910 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 30 19:13:21.127527 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 30 19:13:21.129196 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 30 19:13:21.138034 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 30 19:13:21.163458 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 30 19:13:21.240822 kernel: cryptd: max_cpu_qlen set to 1000
Jan 30 19:13:21.245858 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
Jan 30 19:13:21.344009 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Jan 30 19:13:21.344238 kernel: libata version 3.00 loaded.
Jan 30 19:13:21.344262 kernel: AVX version of gcm_enc/dec engaged.
Jan 30 19:13:21.344281 kernel: ahci 0000:00:1f.2: version 3.0
Jan 30 19:13:21.351968 kernel: AES CTR mode by8 optimization enabled
Jan 30 19:13:21.351994 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Jan 30 19:13:21.352015 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Jan 30 19:13:21.352245 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Jan 30 19:13:21.352447 kernel: ACPI: bus type USB registered
Jan 30 19:13:21.352469 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 30 19:13:21.352487 kernel: usbcore: registered new interface driver usbfs
Jan 30 19:13:21.352505 kernel: GPT:17805311 != 125829119
Jan 30 19:13:21.352523 kernel: usbcore: registered new interface driver hub
Jan 30 19:13:21.352541 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 30 19:13:21.352565 kernel: usbcore: registered new device driver usb
Jan 30 19:13:21.352590 kernel: GPT:17805311 != 125829119
Jan 30 19:13:21.352609 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 30 19:13:21.352627 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 30 19:13:21.352645 kernel: scsi host0: ahci
Jan 30 19:13:21.352874 kernel: scsi host1: ahci
Jan 30 19:13:21.353090 kernel: scsi host2: ahci
Jan 30 19:13:21.353281 kernel: scsi host3: ahci
Jan 30 19:13:21.353476 kernel: scsi host4: ahci
Jan 30 19:13:21.353668 kernel: scsi host5: ahci
Jan 30 19:13:21.357298 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38
Jan 30 19:13:21.357322 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38
Jan 30 19:13:21.357341 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38
Jan 30 19:13:21.357359 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38
Jan 30 19:13:21.357378 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38
Jan 30 19:13:21.357396 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38
Jan 30 19:13:21.272414 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 30 19:13:21.431595 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 scanned by (udev-worker) (463)
Jan 30 19:13:21.431630 kernel: BTRFS: device fsid 64bb5b5a-85cc-41cc-a02b-2cfaa3e93b0a devid 1 transid 38 /dev/vda3 scanned by (udev-worker) (476)
Jan 30 19:13:21.272674 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 30 19:13:21.280045 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 30 19:13:21.280844 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 30 19:13:21.281027 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 30 19:13:21.281774 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 30 19:13:21.287142 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 30 19:13:21.439266 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Jan 30 19:13:21.443726 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 30 19:13:21.456540 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Jan 30 19:13:21.463910 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Jan 30 19:13:21.469754 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Jan 30 19:13:21.470617 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Jan 30 19:13:21.487183 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 30 19:13:21.492333 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 30 19:13:21.498261 disk-uuid[557]: Primary Header is updated.
Jan 30 19:13:21.498261 disk-uuid[557]: Secondary Entries is updated.
Jan 30 19:13:21.498261 disk-uuid[557]: Secondary Header is updated.
Jan 30 19:13:21.511844 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 30 19:13:21.521914 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 30 19:13:21.521644 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 19:13:21.661318 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 30 19:13:21.661388 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 30 19:13:21.661410 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 30 19:13:21.664760 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 30 19:13:21.664817 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 30 19:13:21.666844 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 30 19:13:21.683840 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Jan 30 19:13:21.702110 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Jan 30 19:13:21.702335 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 30 19:13:21.702576 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Jan 30 19:13:21.702801 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Jan 30 19:13:21.703042 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Jan 30 19:13:21.703489 kernel: hub 1-0:1.0: USB hub found Jan 30 19:13:21.703720 kernel: hub 1-0:1.0: 4 ports detected Jan 30 19:13:21.703946 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 30 19:13:21.704207 kernel: hub 2-0:1.0: USB hub found Jan 30 19:13:21.704417 kernel: hub 2-0:1.0: 4 ports detected Jan 30 19:13:21.933927 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 30 19:13:22.084843 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 30 19:13:22.090967 kernel: usbcore: registered new interface driver usbhid Jan 30 19:13:22.091019 kernel: usbhid: USB HID core driver Jan 30 19:13:22.098418 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 30 19:13:22.098465 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Jan 30 19:13:22.524811 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 30 19:13:22.526691 disk-uuid[558]: The operation has completed successfully. Jan 30 19:13:22.578560 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 30 19:13:22.578731 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 30 19:13:22.601006 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 30 19:13:22.605109 sh[583]: Success Jan 30 19:13:22.621996 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Jan 30 19:13:22.683177 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 30 19:13:22.692913 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 30 19:13:22.695245 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
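verity-setup.service assembles /dev/mapper/usr as a dm-verity device, so every read from the /usr partition is checked against a hash tree rooted in the hash passed on the kernel command line. A roughly equivalent manual invocation, as a sketch with placeholder devices and hash (the real unit derives these from the command line):

    # Map a verity device "usr" over DATA_DEV, verified against HASH_DEV
    # and ROOT_HASH (all placeholders, illustration only):
    veritysetup open DATA_DEV usr HASH_DEV ROOT_HASH
    # Check the hash tree without creating a mapping:
    veritysetup verify DATA_DEV HASH_DEV ROOT_HASH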
Jan 30 19:13:22.728379 kernel: BTRFS info (device dm-0): first mount of filesystem 64bb5b5a-85cc-41cc-a02b-2cfaa3e93b0a Jan 30 19:13:22.728464 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 30 19:13:22.728489 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 30 19:13:22.732945 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 30 19:13:22.732985 kernel: BTRFS info (device dm-0): using free space tree Jan 30 19:13:22.745427 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 30 19:13:22.746918 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 30 19:13:22.752973 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 30 19:13:22.754972 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 30 19:13:22.776249 kernel: BTRFS info (device vda6): first mount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 30 19:13:22.776312 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 30 19:13:22.776333 kernel: BTRFS info (device vda6): using free space tree Jan 30 19:13:22.783070 kernel: BTRFS info (device vda6): auto enabling async discard Jan 30 19:13:22.796179 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 30 19:13:22.798453 kernel: BTRFS info (device vda6): last unmount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 30 19:13:22.809379 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 30 19:13:22.816613 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 30 19:13:22.912054 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 30 19:13:22.921040 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 30 19:13:22.959676 systemd-networkd[767]: lo: Link UP Jan 30 19:13:22.959689 systemd-networkd[767]: lo: Gained carrier Jan 30 19:13:22.965060 systemd-networkd[767]: Enumeration completed Jan 30 19:13:22.965598 systemd-networkd[767]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 30 19:13:22.965603 systemd-networkd[767]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 30 19:13:22.966947 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 30 19:13:22.967962 systemd[1]: Reached target network.target - Network. Jan 30 19:13:22.968943 systemd-networkd[767]: eth0: Link UP Jan 30 19:13:22.968949 systemd-networkd[767]: eth0: Gained carrier Jan 30 19:13:22.968961 systemd-networkd[767]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 30 19:13:22.981026 ignition[684]: Ignition 2.19.0 Jan 30 19:13:22.981063 ignition[684]: Stage: fetch-offline Jan 30 19:13:22.981189 ignition[684]: no configs at "/usr/lib/ignition/base.d" Jan 30 19:13:22.983881 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
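The BTRFS warning above shows the initrd still mounting with the deprecated 'nologreplay' option; current kernels want the rescue= spelling. Side by side, the two invocations would be (illustrative only, using the device and mount point from this boot):

    # Deprecated spelling (what produced the warning):
    mount -o ro,nologreplay /dev/mapper/usr /sysusr/usr
    # Current spelling:
    mount -o ro,rescue=nologreplay /dev/mapper/usr /sysusr/usr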
Jan 30 19:13:22.981217 ignition[684]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 30 19:13:22.981414 ignition[684]: parsed url from cmdline: "" Jan 30 19:13:22.981421 ignition[684]: no config URL provided Jan 30 19:13:22.981432 ignition[684]: reading system config file "/usr/lib/ignition/user.ign" Jan 30 19:13:22.981449 ignition[684]: no config at "/usr/lib/ignition/user.ign" Jan 30 19:13:22.981458 ignition[684]: failed to fetch config: resource requires networking Jan 30 19:13:22.981739 ignition[684]: Ignition finished successfully Jan 30 19:13:22.993099 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 30 19:13:22.998909 systemd-networkd[767]: eth0: DHCPv4 address 10.244.22.2/30, gateway 10.244.22.1 acquired from 10.244.22.1 Jan 30 19:13:23.017649 ignition[774]: Ignition 2.19.0 Jan 30 19:13:23.017674 ignition[774]: Stage: fetch Jan 30 19:13:23.019269 ignition[774]: no configs at "/usr/lib/ignition/base.d" Jan 30 19:13:23.019299 ignition[774]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 30 19:13:23.019467 ignition[774]: parsed url from cmdline: "" Jan 30 19:13:23.019474 ignition[774]: no config URL provided Jan 30 19:13:23.019485 ignition[774]: reading system config file "/usr/lib/ignition/user.ign" Jan 30 19:13:23.019501 ignition[774]: no config at "/usr/lib/ignition/user.ign" Jan 30 19:13:23.019621 ignition[774]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 30 19:13:23.019680 ignition[774]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 30 19:13:23.019729 ignition[774]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 30 19:13:23.034769 ignition[774]: GET result: OK Jan 30 19:13:23.035527 ignition[774]: parsing config with SHA512: 23954b3646a48434abe4ce907a2e775cc10cb1f6680ff99989ceb11053382fa8ddd654ef3a2f785a7a46129d787fa7c193e5a1877286957494b4c02ed2403f52 Jan 30 19:13:23.041974 unknown[774]: fetched base config from "system" Jan 30 19:13:23.041992 unknown[774]: fetched base config from "system" Jan 30 19:13:23.042496 ignition[774]: fetch: fetch complete Jan 30 19:13:23.042002 unknown[774]: fetched user config from "openstack" Jan 30 19:13:23.042505 ignition[774]: fetch: fetch passed Jan 30 19:13:23.045436 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 30 19:13:23.042638 ignition[774]: Ignition finished successfully Jan 30 19:13:23.055105 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 30 19:13:23.077184 ignition[781]: Ignition 2.19.0 Jan 30 19:13:23.077212 ignition[781]: Stage: kargs Jan 30 19:13:23.077486 ignition[781]: no configs at "/usr/lib/ignition/base.d" Jan 30 19:13:23.077507 ignition[781]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 30 19:13:23.078901 ignition[781]: kargs: kargs passed Jan 30 19:13:23.081080 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 30 19:13:23.078983 ignition[781]: Ignition finished successfully Jan 30 19:13:23.095320 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 30 19:13:23.119826 ignition[788]: Ignition 2.19.0 Jan 30 19:13:23.119850 ignition[788]: Stage: disks Jan 30 19:13:23.120156 ignition[788]: no configs at "/usr/lib/ignition/base.d" Jan 30 19:13:23.122720 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
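The fetch-offline failure above ("resource requires networking") is expected: on OpenStack without a config drive, Ignition has to wait for DHCP and then pull user data from the metadata service, which is exactly what the fetch stage does next. The same endpoint can be queried by hand from inside the instance (sketch):

    # Fetch the user-data document the Ignition fetch stage retrieved:
    curl http://169.254.169.254/openstack/latest/user_data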
Jan 30 19:13:23.120178 ignition[788]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 30 19:13:23.121523 ignition[788]: disks: disks passed Jan 30 19:13:23.124914 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 30 19:13:23.121597 ignition[788]: Ignition finished successfully Jan 30 19:13:23.126573 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 30 19:13:23.127346 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 30 19:13:23.128019 systemd[1]: Reached target sysinit.target - System Initialization. Jan 30 19:13:23.128688 systemd[1]: Reached target basic.target - Basic System. Jan 30 19:13:23.135996 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 30 19:13:23.160876 systemd-fsck[797]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 30 19:13:23.163935 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 30 19:13:23.175144 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 30 19:13:23.297815 kernel: EXT4-fs (vda9): mounted filesystem 9f41abed-fd12-4e57-bcd4-5c0ef7f8a1bf r/w with ordered data mode. Quota mode: none. Jan 30 19:13:23.299122 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 30 19:13:23.300629 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 30 19:13:23.306936 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 30 19:13:23.316294 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 30 19:13:23.319602 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 30 19:13:23.322961 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 30 19:13:23.324019 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 30 19:13:23.324082 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 30 19:13:23.327791 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 30 19:13:23.339653 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 30 19:13:23.341538 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/vda6 scanned by mount (805) Jan 30 19:13:23.346804 kernel: BTRFS info (device vda6): first mount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 30 19:13:23.346920 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 30 19:13:23.348279 kernel: BTRFS info (device vda6): using free space tree Jan 30 19:13:23.358015 kernel: BTRFS info (device vda6): auto enabling async discard Jan 30 19:13:23.362363 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 30 19:13:23.451003 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory Jan 30 19:13:23.460831 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory Jan 30 19:13:23.467326 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory Jan 30 19:13:23.477048 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory Jan 30 19:13:23.582047 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 30 19:13:23.587920 systemd[1]: Starting ignition-mount.service - Ignition (mount)... 
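The "cut: /sysroot/etc/passwd: No such file or directory" lines above are harmless on a first boot: initrd-setup-root uses cut(1) to harvest any existing account entries from the target root before seeding defaults, and on a fresh image those files do not exist yet. A sketch of the kind of extraction involved (assumed shape, not the actual script):

    # Pull login names out of an existing passwd file, if present:
    cut -d: -f1 /sysroot/etc/passwd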
Jan 30 19:13:23.589992 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 30 19:13:23.609829 kernel: BTRFS info (device vda6): last unmount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 30 19:13:23.630285 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 30 19:13:23.639965 ignition[924]: INFO : Ignition 2.19.0 Jan 30 19:13:23.639965 ignition[924]: INFO : Stage: mount Jan 30 19:13:23.642202 ignition[924]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 30 19:13:23.642202 ignition[924]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 30 19:13:23.642202 ignition[924]: INFO : mount: mount passed Jan 30 19:13:23.642202 ignition[924]: INFO : Ignition finished successfully Jan 30 19:13:23.643156 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 30 19:13:23.724561 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 30 19:13:24.858654 systemd-networkd[767]: eth0: Gained IPv6LL Jan 30 19:13:26.102715 systemd-networkd[767]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:580:24:19ff:fef4:1602/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:580:24:19ff:fef4:1602/64 assigned by NDisc. Jan 30 19:13:26.102736 systemd-networkd[767]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 30 19:13:30.507434 coreos-metadata[807]: Jan 30 19:13:30.507 WARN failed to locate config-drive, using the metadata service API instead Jan 30 19:13:30.531472 coreos-metadata[807]: Jan 30 19:13:30.531 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 30 19:13:30.547652 coreos-metadata[807]: Jan 30 19:13:30.547 INFO Fetch successful Jan 30 19:13:30.549187 coreos-metadata[807]: Jan 30 19:13:30.548 INFO wrote hostname srv-ehdo1.gb1.brightbox.com to /sysroot/etc/hostname Jan 30 19:13:30.552051 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 30 19:13:30.553500 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 30 19:13:30.568127 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 30 19:13:30.592116 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 30 19:13:30.610830 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by mount (940) Jan 30 19:13:30.614339 kernel: BTRFS info (device vda6): first mount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 30 19:13:30.614376 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 30 19:13:30.616010 kernel: BTRFS info (device vda6): using free space tree Jan 30 19:13:30.621823 kernel: BTRFS info (device vda6): auto enabling async discard Jan 30 19:13:30.624072 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
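flatcar-openstack-hostname.service could not locate a config drive, so it fell back to the EC2-style metadata API and wrote the returned name into /sysroot/etc/hostname, as logged above. The same lookup by hand (sketch):

    # Query the hostname the agent fetched:
    curl http://169.254.169.254/latest/meta-data/hostname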
Jan 30 19:13:30.652717 ignition[958]: INFO : Ignition 2.19.0 Jan 30 19:13:30.652717 ignition[958]: INFO : Stage: files Jan 30 19:13:30.654553 ignition[958]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 30 19:13:30.654553 ignition[958]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 30 19:13:30.654553 ignition[958]: DEBUG : files: compiled without relabeling support, skipping Jan 30 19:13:30.657472 ignition[958]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 30 19:13:30.657472 ignition[958]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 30 19:13:30.659773 ignition[958]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 30 19:13:30.659773 ignition[958]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 30 19:13:30.662115 ignition[958]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 30 19:13:30.660902 unknown[958]: wrote ssh authorized keys file for user: core Jan 30 19:13:30.664300 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 30 19:13:30.664300 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jan 30 19:13:30.825283 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 30 19:13:31.150502 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 30 19:13:31.152167 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 30 19:13:31.152167 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 30 19:13:31.152167 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 30 19:13:31.152167 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 30 19:13:31.152167 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 30 19:13:31.152167 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 30 19:13:31.152167 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 30 19:13:31.152167 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 30 19:13:31.167541 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 30 19:13:31.167541 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 30 19:13:31.167541 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Jan 30 19:13:31.167541 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Jan 30 19:13:31.167541 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Jan 30 19:13:31.167541 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1 Jan 30 19:13:31.808893 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 30 19:13:33.986476 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Jan 30 19:13:33.986476 ignition[958]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 30 19:13:33.994427 ignition[958]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 30 19:13:33.994427 ignition[958]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 30 19:13:33.994427 ignition[958]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 30 19:13:33.994427 ignition[958]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 30 19:13:33.994427 ignition[958]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 30 19:13:33.994427 ignition[958]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 30 19:13:33.994427 ignition[958]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 30 19:13:33.994427 ignition[958]: INFO : files: files passed Jan 30 19:13:33.994427 ignition[958]: INFO : Ignition finished successfully Jan 30 19:13:33.995525 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 30 19:13:34.006087 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 30 19:13:34.011015 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 30 19:13:34.027432 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 30 19:13:34.027660 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 30 19:13:34.040824 initrd-setup-root-after-ignition[986]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 30 19:13:34.040824 initrd-setup-root-after-ignition[986]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 30 19:13:34.043712 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 30 19:13:34.045933 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 30 19:13:34.047100 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 30 19:13:34.054004 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 30 19:13:34.095740 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 30 19:13:34.095969 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 30 19:13:34.098194 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Jan 30 19:13:34.099340 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 30 19:13:34.101012 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 30 19:13:34.114129 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 30 19:13:34.131664 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 30 19:13:34.139074 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 30 19:13:34.153403 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 30 19:13:34.155335 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 30 19:13:34.156270 systemd[1]: Stopped target timers.target - Timer Units. Jan 30 19:13:34.157835 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 30 19:13:34.158038 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 30 19:13:34.159832 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 30 19:13:34.160823 systemd[1]: Stopped target basic.target - Basic System. Jan 30 19:13:34.162343 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 30 19:13:34.163822 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 30 19:13:34.165242 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 30 19:13:34.166800 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 30 19:13:34.168512 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 30 19:13:34.170110 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 30 19:13:34.171578 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 30 19:13:34.173188 systemd[1]: Stopped target swap.target - Swaps. Jan 30 19:13:34.174703 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 30 19:13:34.174942 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 30 19:13:34.176764 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 30 19:13:34.177761 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 30 19:13:34.179210 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 30 19:13:34.179397 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 30 19:13:34.180752 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 30 19:13:34.180949 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 30 19:13:34.183046 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 30 19:13:34.183219 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 30 19:13:34.184970 systemd[1]: ignition-files.service: Deactivated successfully. Jan 30 19:13:34.185125 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 30 19:13:34.200633 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 30 19:13:34.204099 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 30 19:13:34.205286 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 30 19:13:34.205542 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 30 19:13:34.208362 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
Jan 30 19:13:34.208611 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 30 19:13:34.218748 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 30 19:13:34.218918 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 30 19:13:34.232642 ignition[1010]: INFO : Ignition 2.19.0 Jan 30 19:13:34.241005 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 30 19:13:34.242987 ignition[1010]: INFO : Stage: umount Jan 30 19:13:34.242987 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 30 19:13:34.242987 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 30 19:13:34.242987 ignition[1010]: INFO : umount: umount passed Jan 30 19:13:34.242987 ignition[1010]: INFO : Ignition finished successfully Jan 30 19:13:34.242535 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 30 19:13:34.242708 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 30 19:13:34.244096 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 30 19:13:34.244241 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 30 19:13:34.246346 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 30 19:13:34.246429 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 30 19:13:34.247833 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 30 19:13:34.247921 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 30 19:13:34.249175 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 30 19:13:34.249242 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 30 19:13:34.250529 systemd[1]: Stopped target network.target - Network. Jan 30 19:13:34.251882 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 30 19:13:34.251962 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 30 19:13:34.253367 systemd[1]: Stopped target paths.target - Path Units. Jan 30 19:13:34.254746 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 30 19:13:34.256858 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 30 19:13:34.257929 systemd[1]: Stopped target slices.target - Slice Units. Jan 30 19:13:34.259299 systemd[1]: Stopped target sockets.target - Socket Units. Jan 30 19:13:34.260707 systemd[1]: iscsid.socket: Deactivated successfully. Jan 30 19:13:34.260773 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 30 19:13:34.262099 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 30 19:13:34.262163 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 30 19:13:34.263504 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 30 19:13:34.263585 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 30 19:13:34.264892 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 30 19:13:34.264960 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 30 19:13:34.266445 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 30 19:13:34.266513 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 30 19:13:34.268396 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 30 19:13:34.270113 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... 
Jan 30 19:13:34.275965 systemd-networkd[767]: eth0: DHCPv6 lease lost Jan 30 19:13:34.278463 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 30 19:13:34.278694 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 30 19:13:34.280993 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 30 19:13:34.281169 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 30 19:13:34.285375 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 30 19:13:34.285467 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 30 19:13:34.294283 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 30 19:13:34.297539 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 30 19:13:34.297624 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 30 19:13:34.299200 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 30 19:13:34.299270 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 30 19:13:34.300772 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 30 19:13:34.300887 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 30 19:13:34.302207 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 30 19:13:34.302273 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 30 19:13:34.304019 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 30 19:13:34.313341 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 30 19:13:34.313557 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 30 19:13:34.324101 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 30 19:13:34.324181 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 30 19:13:34.324976 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 30 19:13:34.325032 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 30 19:13:34.325991 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 30 19:13:34.326061 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 30 19:13:34.328685 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 30 19:13:34.328757 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 30 19:13:34.330337 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 30 19:13:34.330408 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 30 19:13:34.339032 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 30 19:13:34.340734 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 30 19:13:34.340829 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 30 19:13:34.343404 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 30 19:13:34.343474 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 30 19:13:34.344594 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 30 19:13:34.344661 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. 
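With the initrd services stopped and the udev sockets closed, the initrd is ready to hand control to the real root. The switch that follows is conventionally performed by initrd-switch-root.service with roughly this invocation (a sketch of the usual unit, not quoted from this system):

    # Pivot from the initramfs into the prepared root at /sysroot:
    systemctl --no-block switch-root /sysroot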
Jan 30 19:13:34.345473 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 30 19:13:34.345540 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 19:13:34.348540 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 30 19:13:34.348715 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 30 19:13:34.350453 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 30 19:13:34.350579 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 30 19:13:34.352991 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 30 19:13:34.361077 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 30 19:13:34.370620 systemd[1]: Switching root. Jan 30 19:13:34.407842 systemd-journald[200]: Journal stopped Jan 30 19:13:35.825213 systemd-journald[200]: Received SIGTERM from PID 1 (systemd). Jan 30 19:13:35.825310 kernel: SELinux: policy capability network_peer_controls=1 Jan 30 19:13:35.825336 kernel: SELinux: policy capability open_perms=1 Jan 30 19:13:35.825374 kernel: SELinux: policy capability extended_socket_class=1 Jan 30 19:13:35.825400 kernel: SELinux: policy capability always_check_network=0 Jan 30 19:13:35.825425 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 30 19:13:35.825445 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 30 19:13:35.825471 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 30 19:13:35.825490 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 30 19:13:35.825508 kernel: audit: type=1403 audit(1738264414.640:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 30 19:13:35.825539 systemd[1]: Successfully loaded SELinux policy in 50.497ms. Jan 30 19:13:35.825588 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 21.312ms. Jan 30 19:13:35.825612 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 30 19:13:35.825634 systemd[1]: Detected virtualization kvm. Jan 30 19:13:35.825655 systemd[1]: Detected architecture x86-64. Jan 30 19:13:35.825676 systemd[1]: Detected first boot. Jan 30 19:13:35.825695 systemd[1]: Hostname set to <srv-ehdo1.gb1.brightbox.com>. Jan 30 19:13:35.825722 systemd[1]: Initializing machine ID from VM UUID. Jan 30 19:13:35.825743 zram_generator::config[1053]: No configuration found. Jan 30 19:13:35.825791 systemd[1]: Populated /etc with preset unit settings. Jan 30 19:13:35.825817 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 30 19:13:35.825849 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 30 19:13:35.825881 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 30 19:13:35.825904 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 30 19:13:35.825924 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 30 19:13:35.825945 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 30 19:13:35.825965 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 30 19:13:35.825986 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. 
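"Initializing machine ID from VM UUID" above means this first boot derived /etc/machine-id from the hypervisor-exposed DMI product UUID rather than generating a random one. Both ends of that mapping can be inspected on a running guest (generic sketch):

    # Hypervisor-provided UUID (KVM/SMBIOS):
    cat /sys/class/dmi/id/product_uuid
    # Resulting machine ID:
    cat /etc/machine-id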
Jan 30 19:13:35.826981 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 30 19:13:35.827013 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 30 19:13:35.827034 systemd[1]: Created slice user.slice - User and Session Slice. Jan 30 19:13:35.827054 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 30 19:13:35.827075 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 30 19:13:35.827095 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 30 19:13:35.827116 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 30 19:13:35.827136 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 30 19:13:35.827174 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 30 19:13:35.827196 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 30 19:13:35.827217 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 30 19:13:35.827237 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 30 19:13:35.827257 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 30 19:13:35.827277 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 30 19:13:35.827298 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 30 19:13:35.827331 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 30 19:13:35.827354 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 30 19:13:35.827375 systemd[1]: Reached target slices.target - Slice Units. Jan 30 19:13:35.827396 systemd[1]: Reached target swap.target - Swaps. Jan 30 19:13:35.827417 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 30 19:13:35.827437 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 30 19:13:35.827457 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 30 19:13:35.827489 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 30 19:13:35.827536 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 30 19:13:35.827560 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 30 19:13:35.827580 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 30 19:13:35.827601 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 30 19:13:35.827622 systemd[1]: Mounting media.mount - External Media Directory... Jan 30 19:13:35.827642 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 19:13:35.827662 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 30 19:13:35.827711 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 30 19:13:35.827734 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 30 19:13:35.827756 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). 
Jan 30 19:13:35.827814 systemd[1]: Reached target machines.target - Containers. Jan 30 19:13:35.827851 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 30 19:13:35.827875 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 30 19:13:35.827896 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 30 19:13:35.827917 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 30 19:13:35.827952 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 30 19:13:35.827974 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 30 19:13:35.827994 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 30 19:13:35.828022 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 30 19:13:35.828043 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 30 19:13:35.828070 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 30 19:13:35.828091 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 30 19:13:35.828112 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 30 19:13:35.828143 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 30 19:13:35.828165 systemd[1]: Stopped systemd-fsck-usr.service. Jan 30 19:13:35.828185 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 30 19:13:35.828215 kernel: loop: module loaded Jan 30 19:13:35.828236 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 30 19:13:35.828257 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 30 19:13:35.828279 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 30 19:13:35.828298 kernel: ACPI: bus type drm_connector registered Jan 30 19:13:35.828318 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 30 19:13:35.828338 systemd[1]: verity-setup.service: Deactivated successfully. Jan 30 19:13:35.828370 systemd[1]: Stopped verity-setup.service. Jan 30 19:13:35.828392 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 19:13:35.828418 kernel: fuse: init (API version 7.39) Jan 30 19:13:35.828476 systemd-journald[1153]: Collecting audit messages is disabled. Jan 30 19:13:35.828512 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 30 19:13:35.828534 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 30 19:13:35.828555 systemd[1]: Mounted media.mount - External Media Directory. Jan 30 19:13:35.828590 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 30 19:13:35.828612 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 30 19:13:35.828633 systemd-journald[1153]: Journal started Jan 30 19:13:35.828677 systemd-journald[1153]: Runtime Journal (/run/log/journal/b2ca20dfef264b19afedd89300a8953e) is 4.7M, max 38.0M, 33.2M free. Jan 30 19:13:35.428080 systemd[1]: Queued start job for default target multi-user.target. Jan 30 19:13:35.445835 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. 
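The modprobe@configfs/dm_mod/drm/efi_pstore/fuse/loop jobs above are all instances of a single systemd template unit; the instance name after "@" becomes the module to load. The stock template looks approximately like this (abridged sketch, not copied from this system):

    # /usr/lib/systemd/system/modprobe@.service (approximate)
    [Unit]
    Description=Load Kernel Module %i
    DefaultDependencies=no

    [Service]
    Type=oneshot
    ExecStart=-/sbin/modprobe -abq %I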
Jan 30 19:13:35.446479 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 30 19:13:35.831916 systemd[1]: Started systemd-journald.service - Journal Service. Jan 30 19:13:35.833272 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 30 19:13:35.834590 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 30 19:13:35.835723 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 30 19:13:35.837059 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 30 19:13:35.837283 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 30 19:13:35.838637 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 30 19:13:35.838917 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 30 19:13:35.840152 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 30 19:13:35.840356 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 30 19:13:35.841472 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 30 19:13:35.841679 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 30 19:13:35.843089 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 30 19:13:35.843304 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 30 19:13:35.844398 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 30 19:13:35.844595 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 30 19:13:35.845763 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 30 19:13:35.846905 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 30 19:13:35.848357 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 30 19:13:35.863350 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 30 19:13:35.875882 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 30 19:13:35.883887 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 30 19:13:35.884720 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 30 19:13:35.884768 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 30 19:13:35.887805 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 30 19:13:35.894047 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 30 19:13:35.903371 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 30 19:13:35.906269 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 30 19:13:35.909020 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 30 19:13:35.912750 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 30 19:13:35.914570 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 30 19:13:35.920019 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Jan 30 19:13:35.932181 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 30 19:13:35.952022 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 30 19:13:35.961486 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 30 19:13:35.966016 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 30 19:13:35.977018 systemd-journald[1153]: Time spent on flushing to /var/log/journal/b2ca20dfef264b19afedd89300a8953e is 80.497ms for 1137 entries. Jan 30 19:13:35.977018 systemd-journald[1153]: System Journal (/var/log/journal/b2ca20dfef264b19afedd89300a8953e) is 8.0M, max 584.8M, 576.8M free. Jan 30 19:13:36.104521 systemd-journald[1153]: Received client request to flush runtime journal. Jan 30 19:13:36.104608 kernel: loop0: detected capacity change from 0 to 142488 Jan 30 19:13:36.104653 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 30 19:13:36.104687 kernel: loop1: detected capacity change from 0 to 8 Jan 30 19:13:35.974601 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 30 19:13:35.983942 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 30 19:13:35.991350 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 30 19:13:35.994058 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 30 19:13:36.000217 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 30 19:13:36.018098 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 30 19:13:36.043608 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 30 19:13:36.100566 systemd-tmpfiles[1187]: ACLs are not supported, ignoring. Jan 30 19:13:36.100588 systemd-tmpfiles[1187]: ACLs are not supported, ignoring. Jan 30 19:13:36.109092 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 30 19:13:36.119372 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 30 19:13:36.120413 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 30 19:13:36.123980 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 30 19:13:36.136098 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 30 19:13:36.152653 kernel: loop2: detected capacity change from 0 to 218376 Jan 30 19:13:36.179304 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 30 19:13:36.188968 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 30 19:13:36.214757 kernel: loop3: detected capacity change from 0 to 140768 Jan 30 19:13:36.253263 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 30 19:13:36.264155 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 30 19:13:36.266444 udevadm[1208]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. 
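The udevadm warning above flags systemd-udev-settle.service as deprecated and names the LVM units that still pull it in. Reverse dependencies like that can be tracked down with (generic sketch):

    # Show which units pull in the deprecated settle service:
    systemctl list-dependencies --reverse systemd-udev-settle.service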
Jan 30 19:13:36.282721 kernel: loop4: detected capacity change from 0 to 142488 Jan 30 19:13:36.313315 kernel: loop5: detected capacity change from 0 to 8 Jan 30 19:13:36.316085 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Jan 30 19:13:36.316114 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Jan 30 19:13:36.329078 kernel: loop6: detected capacity change from 0 to 218376 Jan 30 19:13:36.337130 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 30 19:13:36.359816 kernel: loop7: detected capacity change from 0 to 140768 Jan 30 19:13:36.381046 (sd-merge)[1212]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Jan 30 19:13:36.382840 (sd-merge)[1212]: Merged extensions into '/usr'. Jan 30 19:13:36.392772 systemd[1]: Reloading requested from client PID 1186 ('systemd-sysext') (unit systemd-sysext.service)... Jan 30 19:13:36.392812 systemd[1]: Reloading... Jan 30 19:13:36.495677 zram_generator::config[1236]: No configuration found. Jan 30 19:13:36.667125 ldconfig[1181]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 30 19:13:36.810354 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 30 19:13:36.878565 systemd[1]: Reloading finished in 483 ms. Jan 30 19:13:36.920799 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 30 19:13:36.930545 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 30 19:13:36.949131 systemd[1]: Starting ensure-sysext.service... Jan 30 19:13:36.960528 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 30 19:13:36.980139 systemd[1]: Reloading requested from client PID 1296 ('systemctl') (unit ensure-sysext.service)... Jan 30 19:13:36.980418 systemd[1]: Reloading... Jan 30 19:13:37.003480 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 30 19:13:37.004108 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 30 19:13:37.008083 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 30 19:13:37.008529 systemd-tmpfiles[1297]: ACLs are not supported, ignoring. Jan 30 19:13:37.008657 systemd-tmpfiles[1297]: ACLs are not supported, ignoring. Jan 30 19:13:37.017678 systemd-tmpfiles[1297]: Detected autofs mount point /boot during canonicalization of boot. Jan 30 19:13:37.017701 systemd-tmpfiles[1297]: Skipping /boot Jan 30 19:13:37.047906 systemd-tmpfiles[1297]: Detected autofs mount point /boot during canonicalization of boot. Jan 30 19:13:37.047926 systemd-tmpfiles[1297]: Skipping /boot Jan 30 19:13:37.082842 zram_generator::config[1324]: No configuration found. Jan 30 19:13:37.272981 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 30 19:13:37.342101 systemd[1]: Reloading finished in 361 ms. Jan 30 19:13:37.364699 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. 
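The (sd-merge) lines above are systemd-sysext overlaying the containerd-flatcar, docker-flatcar, kubernetes, and oem-openstack extension images onto /usr (the loopN capacity changes are those images being attached), after which systemd reloads so units shipped in the extensions become visible. The same state can be inspected and refreshed interactively (sketch):

    # List extension images and their merge status:
    systemd-sysext list
    # Re-merge the /usr overlay after adding or removing images:
    systemd-sysext refresh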
Jan 30 19:13:37.370491 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 30 19:13:37.387055 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Jan 30 19:13:37.398070 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 30 19:13:37.403135 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 30 19:13:37.415409 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 30 19:13:37.421001 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 30 19:13:37.431163 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 30 19:13:37.446398 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 30 19:13:37.451144 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 19:13:37.452141 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 30 19:13:37.455127 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 30 19:13:37.463875 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 30 19:13:37.468234 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 30 19:13:37.469607 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 30 19:13:37.470024 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 19:13:37.480465 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 19:13:37.481223 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 30 19:13:37.481554 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 30 19:13:37.481766 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 19:13:37.488151 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 30 19:13:37.505370 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 19:13:37.505893 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 30 19:13:37.523773 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 30 19:13:37.525746 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 30 19:13:37.526122 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 30 19:13:37.528217 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 30 19:13:37.530730 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Jan 30 19:13:37.530995 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 30 19:13:37.536742 systemd[1]: Finished ensure-sysext.service. Jan 30 19:13:37.541400 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 30 19:13:37.542187 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 30 19:13:37.544261 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 30 19:13:37.545132 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 30 19:13:37.552237 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 30 19:13:37.562858 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 30 19:13:37.565600 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 30 19:13:37.567340 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 30 19:13:37.568309 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 30 19:13:37.570937 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 30 19:13:37.575023 augenrules[1415]: No rules Jan 30 19:13:37.574262 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Jan 30 19:13:37.579420 systemd-udevd[1391]: Using default interface naming scheme 'v255'. Jan 30 19:13:37.592257 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 30 19:13:37.595399 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 30 19:13:37.596548 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 30 19:13:37.615892 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 30 19:13:37.630691 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 30 19:13:37.643039 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 30 19:13:37.759264 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 30 19:13:37.779839 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1441) Jan 30 19:13:37.871333 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 30 19:13:37.873333 systemd[1]: Reached target time-set.target - System Time Set. Jan 30 19:13:37.888910 systemd-resolved[1387]: Positive Trust Anchors: Jan 30 19:13:37.888930 systemd-resolved[1387]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 30 19:13:37.888975 systemd-resolved[1387]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 30 19:13:37.893644 systemd-networkd[1437]: lo: Link UP Jan 30 19:13:37.893656 systemd-networkd[1437]: lo: Gained carrier Jan 30 19:13:37.902490 systemd-networkd[1437]: Enumeration completed Jan 30 19:13:37.902616 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 30 19:13:37.904671 systemd-resolved[1387]: Using system hostname 'srv-ehdo1.gb1.brightbox.com'. Jan 30 19:13:37.907360 systemd-networkd[1437]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 30 19:13:37.907367 systemd-networkd[1437]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 30 19:13:37.912380 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 30 19:13:37.915717 systemd-networkd[1437]: eth0: Link UP Jan 30 19:13:37.915730 systemd-networkd[1437]: eth0: Gained carrier Jan 30 19:13:37.915762 systemd-networkd[1437]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 30 19:13:37.916901 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 30 19:13:37.918111 systemd[1]: Reached target network.target - Network. Jan 30 19:13:37.919701 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 30 19:13:37.944157 systemd-networkd[1437]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 30 19:13:37.949980 systemd-networkd[1437]: eth0: DHCPv4 address 10.244.22.2/30, gateway 10.244.22.1 acquired from 10.244.22.1 Jan 30 19:13:37.951256 systemd-timesyncd[1413]: Network configuration changed, trying to establish connection. Jan 30 19:13:37.968822 kernel: mousedev: PS/2 mouse device common for all mice Jan 30 19:13:37.972863 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jan 30 19:13:37.978839 kernel: ACPI: button: Power Button [PWRF] Jan 30 19:13:38.024760 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 30 19:13:38.034056 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 30 19:13:38.038852 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 30 19:13:38.046387 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Jan 30 19:13:38.046656 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 30 19:13:38.062071 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Jan 30 19:13:38.074549 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 30 19:13:38.142265 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
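The "potentially unpredictable interface name" note means eth0 matched zz-default.network, a catch-all unit that selects interfaces by name glob rather than by a stable property. A sketch of a pinned per-interface unit that avoids the warning (hypothetical file name and MAC address; the DHCP setting mirrors the lease shown in the log):

    cat >/etc/systemd/network/00-eth0.network <<'EOF'
    [Match]
    # Hypothetical MAC; substitute the NIC's real address.
    MACAddress=52:54:00:00:00:00

    [Network]
    DHCP=ipv4
    EOF
    networkctl reload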
Jan 30 19:13:38.296315 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Jan 30 19:13:38.344884 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 30 19:13:38.352089 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Jan 30 19:13:38.373821 lvm[1469]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 30 19:13:38.412478 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Jan 30 19:13:38.414266 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 30 19:13:38.415101 systemd[1]: Reached target sysinit.target - System Initialization. Jan 30 19:13:38.416045 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 30 19:13:38.416933 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 30 19:13:38.418216 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 30 19:13:38.419191 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 30 19:13:38.420016 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 30 19:13:38.420820 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 30 19:13:38.420873 systemd[1]: Reached target paths.target - Path Units. Jan 30 19:13:38.421515 systemd[1]: Reached target timers.target - Timer Units. Jan 30 19:13:38.423165 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 30 19:13:38.425722 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 30 19:13:38.440405 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 30 19:13:38.443150 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 30 19:13:38.444654 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 30 19:13:38.445523 systemd[1]: Reached target sockets.target - Socket Units. Jan 30 19:13:38.446260 systemd[1]: Reached target basic.target - Basic System. Jan 30 19:13:38.446994 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 30 19:13:38.447045 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 30 19:13:38.458979 systemd[1]: Starting containerd.service - containerd container runtime... Jan 30 19:13:38.464335 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 30 19:13:38.465939 lvm[1474]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 30 19:13:38.468682 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 30 19:13:38.472938 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 30 19:13:38.475121 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 30 19:13:38.476898 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 30 19:13:38.483991 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 30 19:13:38.488946 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... 
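docker.socket reaching "Listening" while docker.service has not started illustrates socket activation: systemd holds the listening socket and only launches dockerd when the first client connects. A quick way to observe it once the system is up (the /_ping endpoint is part of the Docker Engine API):

    systemctl is-active docker.service                             # inactive
    curl -s --unix-socket /run/docker.sock http://localhost/_ping  # prints OK, triggering activation
    systemctl is-active docker.service                             # active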
Jan 30 19:13:38.498028 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 30 19:13:38.501887 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 30 19:13:38.514006 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 30 19:13:38.516722 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 30 19:13:38.518482 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 30 19:13:38.520317 systemd[1]: Starting update-engine.service - Update Engine... Jan 30 19:13:38.527506 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 30 19:13:38.531533 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 30 19:13:38.539440 dbus-daemon[1477]: [system] SELinux support is enabled Jan 30 19:13:38.544450 dbus-daemon[1477]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1437 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Jan 30 19:13:38.541155 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 30 19:13:38.553526 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 30 19:13:38.553579 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 30 19:13:38.555023 dbus-daemon[1477]: [system] Successfully activated service 'org.freedesktop.systemd1' Jan 30 19:13:38.563376 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 30 19:13:38.563412 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 30 19:13:38.575856 jq[1478]: false Jan 30 19:13:38.577343 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 30 19:13:38.578131 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 30 19:13:38.600055 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Jan 30 19:13:38.613553 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 30 19:13:38.614481 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
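Several units above are "skipped because of an unmet condition check" rather than failed: Condition*= directives gate whether a unit runs, and a false condition is not an error. The check and its result can be inspected directly; tcsd and its /dev/tpm0 condition are taken from the log:

    systemctl cat tcsd.service                       # shows ConditionPathExists=/dev/tpm0
    systemctl show tcsd.service -p ConditionResult   # yes/no after the last check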
Jan 30 19:13:38.627593 (ntainerd)[1495]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 30 19:13:38.632181 jq[1488]: true Jan 30 19:13:38.645528 tar[1494]: linux-amd64/LICENSE Jan 30 19:13:38.645528 tar[1494]: linux-amd64/helm Jan 30 19:13:38.662899 extend-filesystems[1479]: Found loop4 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found loop5 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found loop6 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found loop7 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found vda Jan 30 19:13:38.662899 extend-filesystems[1479]: Found vda1 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found vda2 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found vda3 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found usr Jan 30 19:13:38.662899 extend-filesystems[1479]: Found vda4 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found vda6 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found vda7 Jan 30 19:13:38.662899 extend-filesystems[1479]: Found vda9 Jan 30 19:13:38.662899 extend-filesystems[1479]: Checking size of /dev/vda9 Jan 30 19:13:38.749610 extend-filesystems[1479]: Resized partition /dev/vda9 Jan 30 19:13:38.752889 update_engine[1487]: I20250130 19:13:38.712131 1487 main.cc:92] Flatcar Update Engine starting Jan 30 19:13:38.752889 update_engine[1487]: I20250130 19:13:38.737543 1487 update_check_scheduler.cc:74] Next update check in 7m0s Jan 30 19:13:38.682245 systemd[1]: motdgen.service: Deactivated successfully. Jan 30 19:13:38.682560 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 30 19:13:38.756924 jq[1511]: true Jan 30 19:13:38.689515 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 30 19:13:38.737287 systemd[1]: Started update-engine.service - Update Engine. Jan 30 19:13:38.755422 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 30 19:13:38.763833 extend-filesystems[1520]: resize2fs 1.47.1 (20-May-2024) Jan 30 19:13:38.788140 systemd-logind[1486]: Watching system buttons on /dev/input/event2 (Power Button) Jan 30 19:13:38.796336 systemd-logind[1486]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 30 19:13:38.796761 systemd-logind[1486]: New seat seat0. Jan 30 19:13:38.800221 systemd[1]: Started systemd-logind.service - User Login Management. Jan 30 19:13:38.808009 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Jan 30 19:13:38.846689 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1438) Jan 30 19:13:38.945642 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Jan 30 19:13:38.945445 dbus-daemon[1477]: [system] Successfully activated service 'org.freedesktop.hostname1' Jan 30 19:13:38.971193 systemd[1]: Starting polkit.service - Authorization Manager... Jan 30 19:13:38.950397 dbus-daemon[1477]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1500 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Jan 30 19:13:38.998077 polkitd[1541]: Started polkitd version 121 Jan 30 19:13:39.011367 bash[1536]: Updated "/home/core/.ssh/authorized_keys" Jan 30 19:13:39.019539 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
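The resize messages above record an online grow of the root filesystem: 1617920 to 15121403 4 KiB blocks, i.e. roughly 6.2 GiB to 57.7 GiB, performed while /dev/vda9 is mounted on /. What extend-filesystems effectively does can be reproduced by hand (growpart from cloud-utils is an assumption; resize2fs supports online growth of a mounted ext4):

    growpart /dev/vda 9    # grow partition 9 to fill the disk
    resize2fs /dev/vda9    # grow the mounted ext4 filesystem into it
    df -h /                # verify the new size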
Jan 30 19:13:39.024762 polkitd[1541]: Loading rules from directory /etc/polkit-1/rules.d Jan 30 19:13:39.025470 polkitd[1541]: Loading rules from directory /usr/share/polkit-1/rules.d Jan 30 19:13:39.028189 systemd[1]: Starting sshkeys.service... Jan 30 19:13:39.050871 polkitd[1541]: Finished loading, compiling and executing 2 rules Jan 30 19:13:39.055943 dbus-daemon[1477]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Jan 30 19:13:39.056177 systemd[1]: Started polkit.service - Authorization Manager. Jan 30 19:13:39.059386 polkitd[1541]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Jan 30 19:13:39.098392 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 30 19:13:39.109401 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 30 19:13:39.123031 systemd-hostnamed[1500]: Hostname set to (static) Jan 30 19:13:39.141271 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Jan 30 19:13:39.163673 extend-filesystems[1520]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 30 19:13:39.163673 extend-filesystems[1520]: old_desc_blocks = 1, new_desc_blocks = 8 Jan 30 19:13:39.163673 extend-filesystems[1520]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Jan 30 19:13:39.176477 extend-filesystems[1479]: Resized filesystem in /dev/vda9 Jan 30 19:13:39.164974 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 30 19:13:39.165879 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 30 19:13:39.183852 locksmithd[1519]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 30 19:13:39.185018 containerd[1495]: time="2025-01-30T19:13:39.184486234Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Jan 30 19:13:39.223401 containerd[1495]: time="2025-01-30T19:13:39.221079020Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224011 containerd[1495]: time="2025-01-30T19:13:39.223966897Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224071 containerd[1495]: time="2025-01-30T19:13:39.224010683Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Jan 30 19:13:39.224071 containerd[1495]: time="2025-01-30T19:13:39.224037295Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 30 19:13:39.224333 containerd[1495]: time="2025-01-30T19:13:39.224302582Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 30 19:13:39.224408 containerd[1495]: time="2025-01-30T19:13:39.224342123Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224480 containerd[1495]: time="2025-01-30T19:13:39.224451200Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." 
error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224533 containerd[1495]: time="2025-01-30T19:13:39.224481201Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224737 containerd[1495]: time="2025-01-30T19:13:39.224705175Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224833 containerd[1495]: time="2025-01-30T19:13:39.224739393Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224833 containerd[1495]: time="2025-01-30T19:13:39.224762846Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224833 containerd[1495]: time="2025-01-30T19:13:39.224810487Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Jan 30 19:13:39.224981 containerd[1495]: time="2025-01-30T19:13:39.224954022Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 30 19:13:39.226210 containerd[1495]: time="2025-01-30T19:13:39.225338327Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 30 19:13:39.226210 containerd[1495]: time="2025-01-30T19:13:39.225522664Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 30 19:13:39.226210 containerd[1495]: time="2025-01-30T19:13:39.225547893Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Jan 30 19:13:39.226210 containerd[1495]: time="2025-01-30T19:13:39.225676225Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Jan 30 19:13:39.226210 containerd[1495]: time="2025-01-30T19:13:39.225769891Z" level=info msg="metadata content store policy set" policy=shared Jan 30 19:13:39.232313 containerd[1495]: time="2025-01-30T19:13:39.232276173Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 30 19:13:39.232395 containerd[1495]: time="2025-01-30T19:13:39.232368491Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 30 19:13:39.232540 containerd[1495]: time="2025-01-30T19:13:39.232402658Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Jan 30 19:13:39.232540 containerd[1495]: time="2025-01-30T19:13:39.232482731Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 30 19:13:39.232540 containerd[1495]: time="2025-01-30T19:13:39.232517220Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Jan 30 19:13:39.233055 containerd[1495]: time="2025-01-30T19:13:39.232703402Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." 
type=io.containerd.monitor.v1 Jan 30 19:13:39.233175 containerd[1495]: time="2025-01-30T19:13:39.233144625Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 30 19:13:39.233347 containerd[1495]: time="2025-01-30T19:13:39.233317348Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Jan 30 19:13:39.233400 containerd[1495]: time="2025-01-30T19:13:39.233351943Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Jan 30 19:13:39.233400 containerd[1495]: time="2025-01-30T19:13:39.233374141Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Jan 30 19:13:39.233485 containerd[1495]: time="2025-01-30T19:13:39.233395442Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Jan 30 19:13:39.233485 containerd[1495]: time="2025-01-30T19:13:39.233424466Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 30 19:13:39.233485 containerd[1495]: time="2025-01-30T19:13:39.233450501Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Jan 30 19:13:39.233485 containerd[1495]: time="2025-01-30T19:13:39.233480585Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 30 19:13:39.233654 containerd[1495]: time="2025-01-30T19:13:39.233503376Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 30 19:13:39.233654 containerd[1495]: time="2025-01-30T19:13:39.233523548Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 30 19:13:39.233654 containerd[1495]: time="2025-01-30T19:13:39.233543524Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 30 19:13:39.233654 containerd[1495]: time="2025-01-30T19:13:39.233563449Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Jan 30 19:13:39.233654 containerd[1495]: time="2025-01-30T19:13:39.233601186Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233654 containerd[1495]: time="2025-01-30T19:13:39.233624310Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233654 containerd[1495]: time="2025-01-30T19:13:39.233643316Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233664482Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233685781Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233705966Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233724910Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." 
type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233744105Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233764969Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233835926Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233861143Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233880826Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233901188Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.233955 containerd[1495]: time="2025-01-30T19:13:39.233925226Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 30 19:13:39.234369 containerd[1495]: time="2025-01-30T19:13:39.233963891Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.234369 containerd[1495]: time="2025-01-30T19:13:39.233986604Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.234369 containerd[1495]: time="2025-01-30T19:13:39.234004488Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 30 19:13:39.236361 containerd[1495]: time="2025-01-30T19:13:39.234833239Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 30 19:13:39.236361 containerd[1495]: time="2025-01-30T19:13:39.234968879Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 30 19:13:39.236361 containerd[1495]: time="2025-01-30T19:13:39.234993171Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 30 19:13:39.236361 containerd[1495]: time="2025-01-30T19:13:39.235015171Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 30 19:13:39.236361 containerd[1495]: time="2025-01-30T19:13:39.235032492Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Jan 30 19:13:39.236361 containerd[1495]: time="2025-01-30T19:13:39.235051849Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Jan 30 19:13:39.236361 containerd[1495]: time="2025-01-30T19:13:39.235073442Z" level=info msg="NRI interface is disabled by configuration." Jan 30 19:13:39.236361 containerd[1495]: time="2025-01-30T19:13:39.235092026Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Jan 30 19:13:39.236671 containerd[1495]: time="2025-01-30T19:13:39.235517544Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 30 19:13:39.236671 containerd[1495]: time="2025-01-30T19:13:39.235622341Z" level=info msg="Connect containerd service" Jan 30 19:13:39.236671 containerd[1495]: time="2025-01-30T19:13:39.235675998Z" level=info msg="using legacy CRI server" Jan 30 19:13:39.236671 containerd[1495]: time="2025-01-30T19:13:39.235692360Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 30 19:13:39.236671 containerd[1495]: time="2025-01-30T19:13:39.235875894Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 30 19:13:39.239816 containerd[1495]: time="2025-01-30T19:13:39.239297906Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 30 19:13:39.239816 
containerd[1495]: time="2025-01-30T19:13:39.239492191Z" level=info msg="Start subscribing containerd event" Jan 30 19:13:39.239816 containerd[1495]: time="2025-01-30T19:13:39.239568284Z" level=info msg="Start recovering state" Jan 30 19:13:39.239816 containerd[1495]: time="2025-01-30T19:13:39.239684472Z" level=info msg="Start event monitor" Jan 30 19:13:39.239816 containerd[1495]: time="2025-01-30T19:13:39.239724524Z" level=info msg="Start snapshots syncer" Jan 30 19:13:39.239816 containerd[1495]: time="2025-01-30T19:13:39.239752521Z" level=info msg="Start cni network conf syncer for default" Jan 30 19:13:39.239816 containerd[1495]: time="2025-01-30T19:13:39.239772087Z" level=info msg="Start streaming server" Jan 30 19:13:39.243389 containerd[1495]: time="2025-01-30T19:13:39.243130410Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 30 19:13:39.243389 containerd[1495]: time="2025-01-30T19:13:39.243315287Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 30 19:13:39.260924 containerd[1495]: time="2025-01-30T19:13:39.259333620Z" level=info msg="containerd successfully booted in 0.076840s" Jan 30 19:13:39.259477 systemd[1]: Started containerd.service - containerd container runtime. Jan 30 19:13:39.284292 sshd_keygen[1509]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 30 19:13:39.314493 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 30 19:13:39.323221 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 30 19:13:39.327968 systemd[1]: Started sshd@0-10.244.22.2:22-139.178.89.65:55722.service - OpenSSH per-connection server daemon (139.178.89.65:55722). Jan 30 19:13:39.341168 systemd[1]: issuegen.service: Deactivated successfully. Jan 30 19:13:39.341777 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 30 19:13:39.360327 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 30 19:13:39.392211 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 30 19:13:39.406651 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 30 19:13:39.417099 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 30 19:13:39.419453 systemd[1]: Reached target getty.target - Login Prompts. Jan 30 19:13:39.691451 tar[1494]: linux-amd64/README.md Jan 30 19:13:39.710661 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 30 19:13:39.898024 systemd-networkd[1437]: eth0: Gained IPv6LL Jan 30 19:13:39.899246 systemd-timesyncd[1413]: Network configuration changed, trying to establish connection. Jan 30 19:13:39.902145 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 30 19:13:39.905439 systemd[1]: Reached target network-online.target - Network is Online. Jan 30 19:13:39.914258 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:13:39.919195 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 30 19:13:39.973884 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 30 19:13:40.245879 sshd[1574]: Accepted publickey for core from 139.178.89.65 port 55722 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:13:40.248955 sshd[1574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:13:40.264666 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
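containerd's "failed to load cni during init" entry above is expected on a fresh node: /etc/cni/net.d is empty until a network add-on installs its config, and the CRI plugin's conf syncer picks it up afterwards. For illustration only, a minimal bridge conflist of the kind such an add-on would drop in (hypothetical name and subnet, not the cluster's eventual network):

    mkdir -p /etc/cni/net.d
    cat >/etc/cni/net.d/10-bridge.conflist <<'EOF'
    {
      "cniVersion": "1.0.0",
      "name": "bridge-net",
      "plugins": [{
        "type": "bridge",
        "bridge": "cni0",
        "isGateway": true,
        "ipMasq": true,
        "ipam": { "type": "host-local", "subnet": "10.85.0.0/16" }
      }]
    }
    EOF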
Jan 30 19:13:40.276681 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 30 19:13:40.280623 systemd-logind[1486]: New session 1 of user core. Jan 30 19:13:40.302087 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 30 19:13:40.314276 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 30 19:13:40.322590 (systemd)[1601]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 30 19:13:40.470858 systemd[1601]: Queued start job for default target default.target. Jan 30 19:13:40.477570 systemd[1601]: Created slice app.slice - User Application Slice. Jan 30 19:13:40.477736 systemd[1601]: Reached target paths.target - Paths. Jan 30 19:13:40.477778 systemd[1601]: Reached target timers.target - Timers. Jan 30 19:13:40.480517 systemd[1601]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 30 19:13:40.519597 systemd[1601]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 30 19:13:40.520123 systemd[1601]: Reached target sockets.target - Sockets. Jan 30 19:13:40.520271 systemd[1601]: Reached target basic.target - Basic System. Jan 30 19:13:40.520362 systemd[1601]: Reached target default.target - Main User Target. Jan 30 19:13:40.520430 systemd[1601]: Startup finished in 186ms. Jan 30 19:13:40.521054 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 30 19:13:40.530163 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 30 19:13:40.917996 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:13:40.935314 (kubelet)[1615]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 19:13:41.180102 systemd[1]: Started sshd@1-10.244.22.2:22-139.178.89.65:38682.service - OpenSSH per-connection server daemon (139.178.89.65:38682). Jan 30 19:13:41.407598 systemd-timesyncd[1413]: Network configuration changed, trying to establish connection. Jan 30 19:13:41.409310 systemd-networkd[1437]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:580:24:19ff:fef4:1602/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:580:24:19ff:fef4:1602/64 assigned by NDisc. Jan 30 19:13:41.409323 systemd-networkd[1437]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 30 19:13:41.563655 kubelet[1615]: E0130 19:13:41.563494 1615 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 19:13:41.566893 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 19:13:41.567132 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 19:13:41.567602 systemd[1]: kubelet.service: Consumed 1.070s CPU time. Jan 30 19:13:42.083249 sshd[1622]: Accepted publickey for core from 139.178.89.65 port 38682 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:13:42.085545 sshd[1622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:13:42.093984 systemd-logind[1486]: New session 2 of user core. Jan 30 19:13:42.111182 systemd[1]: Started session-2.scope - Session 2 of User core. 
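The kubelet exit above is the expected pre-bootstrap state: /var/lib/kubelet/config.yaml is normally written by kubeadm during init/join, and until it exists the unit fails and is restarted on a timer. A minimal hand-written KubeletConfiguration, for illustration only (the real file is generated; cgroupDriver: systemd matches the SystemdCgroup:true runc option visible in the containerd config dump above):

    mkdir -p /var/lib/kubelet
    cat >/var/lib/kubelet/config.yaml <<'EOF'
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    EOF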
Jan 30 19:13:42.458973 systemd-timesyncd[1413]: Network configuration changed, trying to establish connection. Jan 30 19:13:42.702546 sshd[1622]: pam_unix(sshd:session): session closed for user core Jan 30 19:13:42.707008 systemd[1]: sshd@1-10.244.22.2:22-139.178.89.65:38682.service: Deactivated successfully. Jan 30 19:13:42.709812 systemd[1]: session-2.scope: Deactivated successfully. Jan 30 19:13:42.711666 systemd-logind[1486]: Session 2 logged out. Waiting for processes to exit. Jan 30 19:13:42.713166 systemd-logind[1486]: Removed session 2. Jan 30 19:13:42.866262 systemd[1]: Started sshd@2-10.244.22.2:22-139.178.89.65:38684.service - OpenSSH per-connection server daemon (139.178.89.65:38684). Jan 30 19:13:43.754020 sshd[1634]: Accepted publickey for core from 139.178.89.65 port 38684 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:13:43.756150 sshd[1634]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:13:43.763026 systemd-logind[1486]: New session 3 of user core. Jan 30 19:13:43.775192 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 30 19:13:44.375270 sshd[1634]: pam_unix(sshd:session): session closed for user core Jan 30 19:13:44.379534 systemd[1]: sshd@2-10.244.22.2:22-139.178.89.65:38684.service: Deactivated successfully. Jan 30 19:13:44.382967 systemd[1]: session-3.scope: Deactivated successfully. Jan 30 19:13:44.385511 systemd-logind[1486]: Session 3 logged out. Waiting for processes to exit. Jan 30 19:13:44.387417 systemd-logind[1486]: Removed session 3. Jan 30 19:13:44.473569 login[1581]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 30 19:13:44.476395 login[1582]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 30 19:13:44.484938 systemd-logind[1486]: New session 4 of user core. Jan 30 19:13:44.495282 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 30 19:13:44.500076 systemd-logind[1486]: New session 5 of user core. Jan 30 19:13:44.505412 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jan 30 19:13:45.628352 coreos-metadata[1476]: Jan 30 19:13:45.628 WARN failed to locate config-drive, using the metadata service API instead Jan 30 19:13:45.656150 coreos-metadata[1476]: Jan 30 19:13:45.656 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 30 19:13:45.666923 coreos-metadata[1476]: Jan 30 19:13:45.666 INFO Fetch failed with 404: resource not found Jan 30 19:13:45.666923 coreos-metadata[1476]: Jan 30 19:13:45.666 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 30 19:13:45.667628 coreos-metadata[1476]: Jan 30 19:13:45.667 INFO Fetch successful Jan 30 19:13:45.667759 coreos-metadata[1476]: Jan 30 19:13:45.667 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 30 19:13:45.682127 coreos-metadata[1476]: Jan 30 19:13:45.682 INFO Fetch successful Jan 30 19:13:45.682127 coreos-metadata[1476]: Jan 30 19:13:45.682 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 30 19:13:45.718135 coreos-metadata[1476]: Jan 30 19:13:45.718 INFO Fetch successful Jan 30 19:13:45.718371 coreos-metadata[1476]: Jan 30 19:13:45.718 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 30 19:13:45.735765 coreos-metadata[1476]: Jan 30 19:13:45.735 INFO Fetch successful Jan 30 19:13:45.736070 coreos-metadata[1476]: Jan 30 19:13:45.736 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 30 19:13:45.755526 coreos-metadata[1476]: Jan 30 19:13:45.755 INFO Fetch successful Jan 30 19:13:45.795395 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 30 19:13:45.796659 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 30 19:13:46.225275 coreos-metadata[1554]: Jan 30 19:13:46.225 WARN failed to locate config-drive, using the metadata service API instead Jan 30 19:13:46.248737 coreos-metadata[1554]: Jan 30 19:13:46.248 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 30 19:13:46.274064 coreos-metadata[1554]: Jan 30 19:13:46.273 INFO Fetch successful Jan 30 19:13:46.274800 coreos-metadata[1554]: Jan 30 19:13:46.274 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 30 19:13:46.302203 coreos-metadata[1554]: Jan 30 19:13:46.302 INFO Fetch successful Jan 30 19:13:46.304815 unknown[1554]: wrote ssh authorized keys file for user: core Jan 30 19:13:46.328735 update-ssh-keys[1676]: Updated "/home/core/.ssh/authorized_keys" Jan 30 19:13:46.329512 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 30 19:13:46.332102 systemd[1]: Finished sshkeys.service. Jan 30 19:13:46.335194 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 30 19:13:46.337009 systemd[1]: Startup finished in 1.338s (kernel) + 14.888s (initrd) + 11.745s (userspace) = 27.972s. Jan 30 19:13:51.699976 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 30 19:13:51.707074 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:13:51.878289 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
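coreos-metadata first probes for a config-drive, then falls back to the HTTP metadata service; every URL it walks is in the log, including the 404 on the OpenStack-style JSON path before the EC2-style paths succeed. The same endpoints can be queried by hand from inside the instance:

    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/latest/meta-data/local-ipv4
    curl -s http://169.254.169.254/latest/meta-data/public-ipv4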
Jan 30 19:13:51.900550 (kubelet)[1687]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 19:13:51.979609 kubelet[1687]: E0130 19:13:51.979401 1687 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 19:13:51.983334 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 19:13:51.983572 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 19:13:54.536197 systemd[1]: Started sshd@3-10.244.22.2:22-139.178.89.65:40470.service - OpenSSH per-connection server daemon (139.178.89.65:40470). Jan 30 19:13:55.429153 sshd[1696]: Accepted publickey for core from 139.178.89.65 port 40470 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:13:55.431494 sshd[1696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:13:55.438325 systemd-logind[1486]: New session 6 of user core. Jan 30 19:13:55.449152 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 30 19:13:56.045466 sshd[1696]: pam_unix(sshd:session): session closed for user core Jan 30 19:13:56.049828 systemd[1]: sshd@3-10.244.22.2:22-139.178.89.65:40470.service: Deactivated successfully. Jan 30 19:13:56.052382 systemd[1]: session-6.scope: Deactivated successfully. Jan 30 19:13:56.054440 systemd-logind[1486]: Session 6 logged out. Waiting for processes to exit. Jan 30 19:13:56.055897 systemd-logind[1486]: Removed session 6. Jan 30 19:13:56.213503 systemd[1]: Started sshd@4-10.244.22.2:22-139.178.89.65:40484.service - OpenSSH per-connection server daemon (139.178.89.65:40484). Jan 30 19:13:57.097321 sshd[1703]: Accepted publickey for core from 139.178.89.65 port 40484 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:13:57.100030 sshd[1703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:13:57.106635 systemd-logind[1486]: New session 7 of user core. Jan 30 19:13:57.114022 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 30 19:13:57.711942 sshd[1703]: pam_unix(sshd:session): session closed for user core Jan 30 19:13:57.716660 systemd[1]: sshd@4-10.244.22.2:22-139.178.89.65:40484.service: Deactivated successfully. Jan 30 19:13:57.717116 systemd-logind[1486]: Session 7 logged out. Waiting for processes to exit. Jan 30 19:13:57.719019 systemd[1]: session-7.scope: Deactivated successfully. Jan 30 19:13:57.720918 systemd-logind[1486]: Removed session 7. Jan 30 19:13:57.869941 systemd[1]: Started sshd@5-10.244.22.2:22-139.178.89.65:40490.service - OpenSSH per-connection server daemon (139.178.89.65:40490). Jan 30 19:13:58.770307 sshd[1710]: Accepted publickey for core from 139.178.89.65 port 40490 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:13:58.772385 sshd[1710]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:13:58.780005 systemd-logind[1486]: New session 8 of user core. Jan 30 19:13:58.787090 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 30 19:13:59.393543 sshd[1710]: pam_unix(sshd:session): session closed for user core Jan 30 19:13:59.398816 systemd[1]: sshd@5-10.244.22.2:22-139.178.89.65:40490.service: Deactivated successfully. Jan 30 19:13:59.400897 systemd[1]: session-8.scope: Deactivated successfully. Jan 30 19:13:59.401748 systemd-logind[1486]: Session 8 logged out. Waiting for processes to exit. Jan 30 19:13:59.403308 systemd-logind[1486]: Removed session 8. Jan 30 19:13:59.549122 systemd[1]: Started sshd@6-10.244.22.2:22-139.178.89.65:40494.service - OpenSSH per-connection server daemon (139.178.89.65:40494). Jan 30 19:14:00.440923 sshd[1717]: Accepted publickey for core from 139.178.89.65 port 40494 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:14:00.442957 sshd[1717]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:14:00.450668 systemd-logind[1486]: New session 9 of user core. Jan 30 19:14:00.456016 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 30 19:14:00.927976 sudo[1720]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 30 19:14:00.928453 sudo[1720]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 30 19:14:00.941119 sudo[1720]: pam_unix(sudo:session): session closed for user root Jan 30 19:14:01.086216 sshd[1717]: pam_unix(sshd:session): session closed for user core Jan 30 19:14:01.091677 systemd[1]: sshd@6-10.244.22.2:22-139.178.89.65:40494.service: Deactivated successfully. Jan 30 19:14:01.093860 systemd[1]: session-9.scope: Deactivated successfully. Jan 30 19:14:01.094752 systemd-logind[1486]: Session 9 logged out. Waiting for processes to exit. Jan 30 19:14:01.096403 systemd-logind[1486]: Removed session 9. Jan 30 19:14:01.246099 systemd[1]: Started sshd@7-10.244.22.2:22-139.178.89.65:47696.service - OpenSSH per-connection server daemon (139.178.89.65:47696). Jan 30 19:14:02.132695 sshd[1725]: Accepted publickey for core from 139.178.89.65 port 47696 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:14:02.134798 sshd[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:14:02.136017 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 30 19:14:02.144097 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:14:02.150340 systemd-logind[1486]: New session 10 of user core. Jan 30 19:14:02.158107 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 30 19:14:02.320110 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:14:02.339830 (kubelet)[1735]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 19:14:02.400976 kubelet[1735]: E0130 19:14:02.399865 1735 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 19:14:02.402883 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 19:14:02.403180 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
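The sudo record above is the first provisioning step of session 9: switching SELinux to enforcing for the running kernel. setenforce changes only the live mode and does not persist across reboots; checking and flipping it looks like this:

    getenforce      # current mode: Permissive or Enforcing
    setenforce 1    # enforce until reboot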
Jan 30 19:14:02.614200 sudo[1744]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 30 19:14:02.614970 sudo[1744]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 30 19:14:02.621894 sudo[1744]: pam_unix(sudo:session): session closed for user root Jan 30 19:14:02.631018 sudo[1743]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Jan 30 19:14:02.631499 sudo[1743]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 30 19:14:02.652267 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Jan 30 19:14:02.669265 auditctl[1747]: No rules Jan 30 19:14:02.670974 systemd[1]: audit-rules.service: Deactivated successfully. Jan 30 19:14:02.671484 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Jan 30 19:14:02.681146 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Jan 30 19:14:02.718813 augenrules[1765]: No rules Jan 30 19:14:02.720926 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Jan 30 19:14:02.723064 sudo[1743]: pam_unix(sudo:session): session closed for user root Jan 30 19:14:02.867943 sshd[1725]: pam_unix(sshd:session): session closed for user core Jan 30 19:14:02.873710 systemd-logind[1486]: Session 10 logged out. Waiting for processes to exit. Jan 30 19:14:02.874659 systemd[1]: sshd@7-10.244.22.2:22-139.178.89.65:47696.service: Deactivated successfully. Jan 30 19:14:02.877538 systemd[1]: session-10.scope: Deactivated successfully. Jan 30 19:14:02.880298 systemd-logind[1486]: Removed session 10. Jan 30 19:14:03.021553 systemd[1]: Started sshd@8-10.244.22.2:22-139.178.89.65:47698.service - OpenSSH per-connection server daemon (139.178.89.65:47698). Jan 30 19:14:03.923752 sshd[1773]: Accepted publickey for core from 139.178.89.65 port 47698 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:14:03.926052 sshd[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:14:03.934087 systemd-logind[1486]: New session 11 of user core. Jan 30 19:14:03.944096 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 30 19:14:04.401812 sudo[1776]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 30 19:14:04.402269 sudo[1776]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 30 19:14:04.904317 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 30 19:14:04.904536 (dockerd)[1791]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 30 19:14:05.351722 dockerd[1791]: time="2025-01-30T19:14:05.350820446Z" level=info msg="Starting up" Jan 30 19:14:05.475433 systemd[1]: var-lib-docker-metacopy\x2dcheck690468490-merged.mount: Deactivated successfully. Jan 30 19:14:05.495468 dockerd[1791]: time="2025-01-30T19:14:05.495393552Z" level=info msg="Loading containers: start." Jan 30 19:14:05.655918 kernel: Initializing XFRM netlink socket Jan 30 19:14:05.690372 systemd-timesyncd[1413]: Network configuration changed, trying to establish connection. Jan 30 19:14:05.760366 systemd-networkd[1437]: docker0: Link UP Jan 30 19:14:05.784649 dockerd[1791]: time="2025-01-30T19:14:05.784591632Z" level=info msg="Loading containers: done." 
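The sequence above rebuilds the audit ruleset from scratch: the two files under /etc/audit/rules.d are removed, audit-rules is restarted, and both auditctl and augenrules report "No rules" because the compiled set is now empty. The same reload path can be driven manually:

    augenrules --load   # compile /etc/audit/rules.d/*.rules and load the result
    auditctl -l         # list loaded rules; prints "No rules" when empty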
Jan 30 19:14:05.820517 dockerd[1791]: time="2025-01-30T19:14:05.820447823Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 30 19:14:05.820912 dockerd[1791]: time="2025-01-30T19:14:05.820614264Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Jan 30 19:14:05.820912 dockerd[1791]: time="2025-01-30T19:14:05.820855423Z" level=info msg="Daemon has completed initialization" Jan 30 19:14:05.874849 dockerd[1791]: time="2025-01-30T19:14:05.873828027Z" level=info msg="API listen on /run/docker.sock" Jan 30 19:14:05.875298 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 30 19:14:06.410028 systemd-timesyncd[1413]: Contacted time server [2a02:390:56d0:900d:e99::]:123 (2.flatcar.pool.ntp.org). Jan 30 19:14:06.410172 systemd-timesyncd[1413]: Initial clock synchronization to Thu 2025-01-30 19:14:06.409560 UTC. Jan 30 19:14:06.410297 systemd-resolved[1387]: Clock change detected. Flushing caches. Jan 30 19:14:07.254598 containerd[1495]: time="2025-01-30T19:14:07.253818860Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.1\"" Jan 30 19:14:08.070533 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3553027562.mount: Deactivated successfully. Jan 30 19:14:10.011966 containerd[1495]: time="2025-01-30T19:14:10.010893720Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:10.013305 containerd[1495]: time="2025-01-30T19:14:10.013232171Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.1: active requests=0, bytes read=28674832" Jan 30 19:14:10.014681 containerd[1495]: time="2025-01-30T19:14:10.014626405Z" level=info msg="ImageCreate event name:\"sha256:95c0bda56fc4dd44cf1876f15c04427feabe5556394553874934ffd2514eeb0a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:10.018215 containerd[1495]: time="2025-01-30T19:14:10.018135655Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:b88ede8e7c3ce354ca0c45c448c48c094781ce692883ee56f181fa569338c0ac\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:10.020249 containerd[1495]: time="2025-01-30T19:14:10.019770961Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.1\" with image id \"sha256:95c0bda56fc4dd44cf1876f15c04427feabe5556394553874934ffd2514eeb0a\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.1\", repo digest \"registry.k8s.io/kube-apiserver@sha256:b88ede8e7c3ce354ca0c45c448c48c094781ce692883ee56f181fa569338c0ac\", size \"28671624\" in 2.765812527s" Jan 30 19:14:10.020249 containerd[1495]: time="2025-01-30T19:14:10.019869496Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.1\" returns image reference \"sha256:95c0bda56fc4dd44cf1876f15c04427feabe5556394553874934ffd2514eeb0a\"" Jan 30 19:14:10.020887 containerd[1495]: time="2025-01-30T19:14:10.020856187Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.1\"" Jan 30 19:14:11.926546 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
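The PullImage entries above come from a CRI client driving containerd; 28,671,624 bytes for kube-apiserver in about 2.77 s works out to roughly 10 MB/s from registry.k8s.io. The same pull can be issued by hand with crictl against containerd's socket (assuming crictl is installed on the node):

    crictl --runtime-endpoint unix:///run/containerd/containerd.sock \
      pull registry.k8s.io/kube-apiserver:v1.32.1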
Jan 30 19:14:12.197515 containerd[1495]: time="2025-01-30T19:14:12.197327682Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:12.199286 containerd[1495]: time="2025-01-30T19:14:12.199223431Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.1: active requests=0, bytes read=24770719" Jan 30 19:14:12.200515 containerd[1495]: time="2025-01-30T19:14:12.200456502Z" level=info msg="ImageCreate event name:\"sha256:019ee182b58e20da055b173dc0b598fbde321d4bf959e1c2a832908ed7642d35\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:12.206633 containerd[1495]: time="2025-01-30T19:14:12.206563850Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7e86b2b274365bbc5f5d1e08f0d32d8bb04b8484ac6a92484c298dc695025954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:12.208408 containerd[1495]: time="2025-01-30T19:14:12.208153895Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.1\" with image id \"sha256:019ee182b58e20da055b173dc0b598fbde321d4bf959e1c2a832908ed7642d35\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.1\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7e86b2b274365bbc5f5d1e08f0d32d8bb04b8484ac6a92484c298dc695025954\", size \"26258470\" in 2.18725676s" Jan 30 19:14:12.208408 containerd[1495]: time="2025-01-30T19:14:12.208204131Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.1\" returns image reference \"sha256:019ee182b58e20da055b173dc0b598fbde321d4bf959e1c2a832908ed7642d35\"" Jan 30 19:14:12.208841 containerd[1495]: time="2025-01-30T19:14:12.208796975Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.1\"" Jan 30 19:14:12.920029 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 30 19:14:12.937262 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:14:13.120088 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:14:13.138627 (kubelet)[2005]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 19:14:13.227090 kubelet[2005]: E0130 19:14:13.226892 2005 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 19:14:13.230277 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 19:14:13.230558 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 30 19:14:14.110444 containerd[1495]: time="2025-01-30T19:14:14.110366988Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:14.112058 containerd[1495]: time="2025-01-30T19:14:14.112008723Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.1: active requests=0, bytes read=19169767" Jan 30 19:14:14.112807 containerd[1495]: time="2025-01-30T19:14:14.112722641Z" level=info msg="ImageCreate event name:\"sha256:2b0d6572d062c0f590b08c3113e5d9a61e381b3da7845a0289bdbf1faa1b23d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:14.120729 containerd[1495]: time="2025-01-30T19:14:14.120640752Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b8fcbcd2afe44acf368b24b61813686f64be4d7fff224d305d78a05bac38f72e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:14.122125 containerd[1495]: time="2025-01-30T19:14:14.122080842Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.1\" with image id \"sha256:2b0d6572d062c0f590b08c3113e5d9a61e381b3da7845a0289bdbf1faa1b23d1\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.1\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b8fcbcd2afe44acf368b24b61813686f64be4d7fff224d305d78a05bac38f72e\", size \"20657536\" in 1.913220884s" Jan 30 19:14:14.122211 containerd[1495]: time="2025-01-30T19:14:14.122137699Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.1\" returns image reference \"sha256:2b0d6572d062c0f590b08c3113e5d9a61e381b3da7845a0289bdbf1faa1b23d1\"" Jan 30 19:14:14.122954 containerd[1495]: time="2025-01-30T19:14:14.122705204Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.1\"" Jan 30 19:14:16.132334 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4002323435.mount: Deactivated successfully. 
Jan 30 19:14:16.863948 containerd[1495]: time="2025-01-30T19:14:16.863451982Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:16.865969 containerd[1495]: time="2025-01-30T19:14:16.865866821Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.1: active requests=0, bytes read=30909474" Jan 30 19:14:16.866924 containerd[1495]: time="2025-01-30T19:14:16.866848912Z" level=info msg="ImageCreate event name:\"sha256:e29f9c7391fd92d96bc72026fc755b0f9589536e36ecd7102161f1ded087897a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:16.870670 containerd[1495]: time="2025-01-30T19:14:16.870583104Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:0244651801747edf2368222f93a7d17cba6e668a890db72532d6b67a7e06dca5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:16.872288 containerd[1495]: time="2025-01-30T19:14:16.871671202Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.1\" with image id \"sha256:e29f9c7391fd92d96bc72026fc755b0f9589536e36ecd7102161f1ded087897a\", repo tag \"registry.k8s.io/kube-proxy:v1.32.1\", repo digest \"registry.k8s.io/kube-proxy@sha256:0244651801747edf2368222f93a7d17cba6e668a890db72532d6b67a7e06dca5\", size \"30908485\" in 2.748906121s" Jan 30 19:14:16.872288 containerd[1495]: time="2025-01-30T19:14:16.871762326Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.1\" returns image reference \"sha256:e29f9c7391fd92d96bc72026fc755b0f9589536e36ecd7102161f1ded087897a\"" Jan 30 19:14:16.873217 containerd[1495]: time="2025-01-30T19:14:16.872922393Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 30 19:14:17.525428 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount662990929.mount: Deactivated successfully. 
Jan 30 19:14:18.795248 containerd[1495]: time="2025-01-30T19:14:18.795141416Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:18.796895 containerd[1495]: time="2025-01-30T19:14:18.796817631Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Jan 30 19:14:18.797856 containerd[1495]: time="2025-01-30T19:14:18.797761672Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:18.801978 containerd[1495]: time="2025-01-30T19:14:18.801902478Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:18.803821 containerd[1495]: time="2025-01-30T19:14:18.803623195Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.93065939s" Jan 30 19:14:18.803821 containerd[1495]: time="2025-01-30T19:14:18.803671053Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 30 19:14:18.805097 containerd[1495]: time="2025-01-30T19:14:18.805056334Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 30 19:14:19.371135 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4162779918.mount: Deactivated successfully. 
Jan 30 19:14:19.377879 containerd[1495]: time="2025-01-30T19:14:19.377639012Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:19.378909 containerd[1495]: time="2025-01-30T19:14:19.378856055Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Jan 30 19:14:19.380186 containerd[1495]: time="2025-01-30T19:14:19.380108622Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:19.384897 containerd[1495]: time="2025-01-30T19:14:19.384780449Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:19.386560 containerd[1495]: time="2025-01-30T19:14:19.386062119Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 580.640262ms" Jan 30 19:14:19.386560 containerd[1495]: time="2025-01-30T19:14:19.386113678Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 30 19:14:19.388077 containerd[1495]: time="2025-01-30T19:14:19.387759573Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 30 19:14:20.224514 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2255957847.mount: Deactivated successfully. Jan 30 19:14:22.909055 containerd[1495]: time="2025-01-30T19:14:22.908975277Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:22.910824 containerd[1495]: time="2025-01-30T19:14:22.910769070Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551328" Jan 30 19:14:22.911624 containerd[1495]: time="2025-01-30T19:14:22.911583120Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:22.919930 containerd[1495]: time="2025-01-30T19:14:22.919872441Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:22.921876 containerd[1495]: time="2025-01-30T19:14:22.921818207Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.534014302s" Jan 30 19:14:22.922018 containerd[1495]: time="2025-01-30T19:14:22.921989849Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 30 19:14:23.350796 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Jan 30 19:14:23.358161 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:14:23.619982 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:14:23.629475 (kubelet)[2160]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 30 19:14:23.704683 kubelet[2160]: E0130 19:14:23.704593 2160 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 30 19:14:23.707051 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 30 19:14:23.707289 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 30 19:14:24.844118 update_engine[1487]: I20250130 19:14:24.843945 1487 update_attempter.cc:509] Updating boot flags... Jan 30 19:14:24.943897 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2178) Jan 30 19:14:24.989856 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2178) Jan 30 19:14:28.070059 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:14:28.083217 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:14:28.121088 systemd[1]: Reloading requested from client PID 2192 ('systemctl') (unit session-11.scope)... Jan 30 19:14:28.121134 systemd[1]: Reloading... Jan 30 19:14:28.309866 zram_generator::config[2234]: No configuration found. Jan 30 19:14:28.444861 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 30 19:14:28.552111 systemd[1]: Reloading finished in 430 ms. Jan 30 19:14:28.628773 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 30 19:14:28.628945 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 30 19:14:28.629539 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:14:28.635188 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:14:28.779881 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:14:28.792319 (kubelet)[2298]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 30 19:14:28.843514 kubelet[2298]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 19:14:28.843514 kubelet[2298]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 30 19:14:28.843514 kubelet[2298]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 30 19:14:28.844145 kubelet[2298]: I0130 19:14:28.843655 2298 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 30 19:14:29.179666 kubelet[2298]: I0130 19:14:29.179566 2298 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Jan 30 19:14:29.179666 kubelet[2298]: I0130 19:14:29.179614 2298 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 30 19:14:29.180051 kubelet[2298]: I0130 19:14:29.179989 2298 server.go:954] "Client rotation is on, will bootstrap in background" Jan 30 19:14:29.214992 kubelet[2298]: I0130 19:14:29.214462 2298 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 30 19:14:29.217678 kubelet[2298]: E0130 19:14:29.217612 2298 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.244.22.2:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:29.234659 kubelet[2298]: E0130 19:14:29.234556 2298 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Jan 30 19:14:29.234659 kubelet[2298]: I0130 19:14:29.234633 2298 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Jan 30 19:14:29.244141 kubelet[2298]: I0130 19:14:29.244080 2298 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 30 19:14:29.248179 kubelet[2298]: I0130 19:14:29.248100 2298 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 30 19:14:29.248446 kubelet[2298]: I0130 19:14:29.248174 2298 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-ehdo1.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 30 19:14:29.250244 kubelet[2298]: I0130 19:14:29.250190 2298 topology_manager.go:138] "Creating topology manager with none policy" Jan 30 19:14:29.250244 kubelet[2298]: I0130 19:14:29.250222 2298 container_manager_linux.go:304] "Creating device plugin manager" Jan 30 19:14:29.250493 kubelet[2298]: I0130 19:14:29.250458 2298 state_mem.go:36] "Initialized new in-memory state store" Jan 30 19:14:29.254307 kubelet[2298]: I0130 19:14:29.254254 2298 kubelet.go:446] "Attempting to sync node with API server" Jan 30 19:14:29.254407 kubelet[2298]: I0130 19:14:29.254384 2298 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 30 19:14:29.254488 kubelet[2298]: I0130 19:14:29.254433 2298 kubelet.go:352] "Adding apiserver pod source" Jan 30 19:14:29.254488 kubelet[2298]: I0130 19:14:29.254451 2298 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 30 19:14:29.275464 kubelet[2298]: W0130 19:14:29.274899 2298 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.244.22.2:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.244.22.2:6443: connect: connection refused Jan 30 19:14:29.275464 kubelet[2298]: E0130 19:14:29.275002 2298 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.244.22.2:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:29.275464 kubelet[2298]: W0130 
19:14:29.275114 2298 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.244.22.2:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-ehdo1.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.244.22.2:6443: connect: connection refused Jan 30 19:14:29.275464 kubelet[2298]: E0130 19:14:29.275148 2298 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.244.22.2:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-ehdo1.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:29.275464 kubelet[2298]: I0130 19:14:29.275294 2298 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Jan 30 19:14:29.279637 kubelet[2298]: I0130 19:14:29.279130 2298 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 30 19:14:29.279637 kubelet[2298]: W0130 19:14:29.279249 2298 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 30 19:14:29.281400 kubelet[2298]: I0130 19:14:29.281130 2298 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 30 19:14:29.281400 kubelet[2298]: I0130 19:14:29.281205 2298 server.go:1287] "Started kubelet" Jan 30 19:14:29.282984 kubelet[2298]: I0130 19:14:29.282511 2298 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 30 19:14:29.286360 kubelet[2298]: I0130 19:14:29.286335 2298 server.go:490] "Adding debug handlers to kubelet server" Jan 30 19:14:29.287955 kubelet[2298]: I0130 19:14:29.287517 2298 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 19:14:29.288017 kubelet[2298]: I0130 19:14:29.287978 2298 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 19:14:29.291987 kubelet[2298]: I0130 19:14:29.291963 2298 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 19:14:29.292911 kubelet[2298]: E0130 19:14:29.289471 2298 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.244.22.2:6443/api/v1/namespaces/default/events\": dial tcp 10.244.22.2:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-ehdo1.gb1.brightbox.com.181f8e53392fcddd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-ehdo1.gb1.brightbox.com,UID:srv-ehdo1.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-ehdo1.gb1.brightbox.com,},FirstTimestamp:2025-01-30 19:14:29.281164765 +0000 UTC m=+0.484303918,LastTimestamp:2025-01-30 19:14:29.281164765 +0000 UTC m=+0.484303918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-ehdo1.gb1.brightbox.com,}" Jan 30 19:14:29.293562 kubelet[2298]: I0130 19:14:29.293534 2298 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 30 19:14:29.298585 kubelet[2298]: E0130 19:14:29.298549 2298 kubelet_node_status.go:467] "Error getting the current node from lister" err="node 
\"srv-ehdo1.gb1.brightbox.com\" not found" Jan 30 19:14:29.298756 kubelet[2298]: I0130 19:14:29.298735 2298 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 30 19:14:29.301389 kubelet[2298]: I0130 19:14:29.301291 2298 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Jan 30 19:14:29.302887 kubelet[2298]: I0130 19:14:29.301556 2298 reconciler.go:26] "Reconciler: start to sync state" Jan 30 19:14:29.302887 kubelet[2298]: W0130 19:14:29.302512 2298 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.244.22.2:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.244.22.2:6443: connect: connection refused Jan 30 19:14:29.302887 kubelet[2298]: E0130 19:14:29.302576 2298 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.244.22.2:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:29.303395 kubelet[2298]: E0130 19:14:29.303361 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.22.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-ehdo1.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.22.2:6443: connect: connection refused" interval="200ms" Jan 30 19:14:29.304486 kubelet[2298]: I0130 19:14:29.304460 2298 factory.go:221] Registration of the systemd container factory successfully Jan 30 19:14:29.306416 kubelet[2298]: I0130 19:14:29.306388 2298 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 30 19:14:29.315019 kubelet[2298]: I0130 19:14:29.314980 2298 factory.go:221] Registration of the containerd container factory successfully Jan 30 19:14:29.326797 kubelet[2298]: E0130 19:14:29.326754 2298 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 30 19:14:29.342431 kubelet[2298]: I0130 19:14:29.342368 2298 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 30 19:14:29.349908 kubelet[2298]: I0130 19:14:29.349613 2298 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 30 19:14:29.349908 kubelet[2298]: I0130 19:14:29.349666 2298 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 30 19:14:29.349908 kubelet[2298]: I0130 19:14:29.349703 2298 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 30 19:14:29.349908 kubelet[2298]: I0130 19:14:29.349720 2298 kubelet.go:2388] "Starting kubelet main sync loop" Jan 30 19:14:29.349908 kubelet[2298]: E0130 19:14:29.349814 2298 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 30 19:14:29.351444 kubelet[2298]: W0130 19:14:29.351392 2298 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.244.22.2:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.244.22.2:6443: connect: connection refused Jan 30 19:14:29.352628 kubelet[2298]: E0130 19:14:29.352571 2298 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.244.22.2:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:29.354006 kubelet[2298]: I0130 19:14:29.353982 2298 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 30 19:14:29.354146 kubelet[2298]: I0130 19:14:29.354126 2298 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 30 19:14:29.354268 kubelet[2298]: I0130 19:14:29.354249 2298 state_mem.go:36] "Initialized new in-memory state store" Jan 30 19:14:29.357259 kubelet[2298]: I0130 19:14:29.357234 2298 policy_none.go:49] "None policy: Start" Jan 30 19:14:29.357390 kubelet[2298]: I0130 19:14:29.357369 2298 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 30 19:14:29.357541 kubelet[2298]: I0130 19:14:29.357522 2298 state_mem.go:35] "Initializing new in-memory state store" Jan 30 19:14:29.367201 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 30 19:14:29.383151 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 30 19:14:29.387893 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 30 19:14:29.400447 kubelet[2298]: I0130 19:14:29.399479 2298 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 30 19:14:29.400447 kubelet[2298]: I0130 19:14:29.399779 2298 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 30 19:14:29.400447 kubelet[2298]: I0130 19:14:29.399820 2298 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 30 19:14:29.400447 kubelet[2298]: I0130 19:14:29.400302 2298 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 30 19:14:29.402590 kubelet[2298]: E0130 19:14:29.402460 2298 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 30 19:14:29.402590 kubelet[2298]: E0130 19:14:29.402555 2298 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-ehdo1.gb1.brightbox.com\" not found" Jan 30 19:14:29.466513 systemd[1]: Created slice kubepods-burstable-podaae72a3d5f96b5efc2580d509c27c5fd.slice - libcontainer container kubepods-burstable-podaae72a3d5f96b5efc2580d509c27c5fd.slice. 
Jan 30 19:14:29.485866 kubelet[2298]: E0130 19:14:29.485616 2298 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.488472 systemd[1]: Created slice kubepods-burstable-podb26a71cfdde83d612efcaf7666f5e5f9.slice - libcontainer container kubepods-burstable-podb26a71cfdde83d612efcaf7666f5e5f9.slice. Jan 30 19:14:29.502878 kubelet[2298]: E0130 19:14:29.502089 2298 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.504633 kubelet[2298]: I0130 19:14:29.504605 2298 kubelet_node_status.go:76] "Attempting to register node" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.505340 kubelet[2298]: E0130 19:14:29.505306 2298 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.244.22.2:6443/api/v1/nodes\": dial tcp 10.244.22.2:6443: connect: connection refused" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.506018 kubelet[2298]: E0130 19:14:29.505969 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.22.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-ehdo1.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.22.2:6443: connect: connection refused" interval="400ms" Jan 30 19:14:29.507383 systemd[1]: Created slice kubepods-burstable-podb137df1b4582753958de4ab3a878cdd9.slice - libcontainer container kubepods-burstable-podb137df1b4582753958de4ab3a878cdd9.slice. Jan 30 19:14:29.510046 kubelet[2298]: E0130 19:14:29.509988 2298 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603432 kubelet[2298]: I0130 19:14:29.602975 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aae72a3d5f96b5efc2580d509c27c5fd-ca-certs\") pod \"kube-apiserver-srv-ehdo1.gb1.brightbox.com\" (UID: \"aae72a3d5f96b5efc2580d509c27c5fd\") " pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603432 kubelet[2298]: I0130 19:14:29.603067 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-ca-certs\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603432 kubelet[2298]: I0130 19:14:29.603111 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-flexvolume-dir\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603432 kubelet[2298]: I0130 19:14:29.603145 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-kubeconfig\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: 
\"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603432 kubelet[2298]: I0130 19:14:29.603179 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603955 kubelet[2298]: I0130 19:14:29.603208 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b137df1b4582753958de4ab3a878cdd9-kubeconfig\") pod \"kube-scheduler-srv-ehdo1.gb1.brightbox.com\" (UID: \"b137df1b4582753958de4ab3a878cdd9\") " pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603955 kubelet[2298]: I0130 19:14:29.603235 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-k8s-certs\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603955 kubelet[2298]: I0130 19:14:29.603264 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aae72a3d5f96b5efc2580d509c27c5fd-k8s-certs\") pod \"kube-apiserver-srv-ehdo1.gb1.brightbox.com\" (UID: \"aae72a3d5f96b5efc2580d509c27c5fd\") " pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.603955 kubelet[2298]: I0130 19:14:29.603293 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aae72a3d5f96b5efc2580d509c27c5fd-usr-share-ca-certificates\") pod \"kube-apiserver-srv-ehdo1.gb1.brightbox.com\" (UID: \"aae72a3d5f96b5efc2580d509c27c5fd\") " pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.709978 kubelet[2298]: I0130 19:14:29.709927 2298 kubelet_node_status.go:76] "Attempting to register node" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.711301 kubelet[2298]: E0130 19:14:29.710875 2298 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.244.22.2:6443/api/v1/nodes\": dial tcp 10.244.22.2:6443: connect: connection refused" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:29.789427 containerd[1495]: time="2025-01-30T19:14:29.789179473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-ehdo1.gb1.brightbox.com,Uid:aae72a3d5f96b5efc2580d509c27c5fd,Namespace:kube-system,Attempt:0,}" Jan 30 19:14:29.811478 containerd[1495]: time="2025-01-30T19:14:29.811414999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-ehdo1.gb1.brightbox.com,Uid:b137df1b4582753958de4ab3a878cdd9,Namespace:kube-system,Attempt:0,}" Jan 30 19:14:29.811772 containerd[1495]: time="2025-01-30T19:14:29.811417237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-ehdo1.gb1.brightbox.com,Uid:b26a71cfdde83d612efcaf7666f5e5f9,Namespace:kube-system,Attempt:0,}" Jan 30 19:14:29.906960 kubelet[2298]: E0130 
19:14:29.906872 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.22.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-ehdo1.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.22.2:6443: connect: connection refused" interval="800ms" Jan 30 19:14:30.115106 kubelet[2298]: I0130 19:14:30.114564 2298 kubelet_node_status.go:76] "Attempting to register node" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:30.116102 kubelet[2298]: E0130 19:14:30.116071 2298 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.244.22.2:6443/api/v1/nodes\": dial tcp 10.244.22.2:6443: connect: connection refused" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:30.228989 kubelet[2298]: W0130 19:14:30.228938 2298 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.244.22.2:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.244.22.2:6443: connect: connection refused Jan 30 19:14:30.229201 kubelet[2298]: E0130 19:14:30.229002 2298 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.244.22.2:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:30.367543 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount652248798.mount: Deactivated successfully. Jan 30 19:14:30.376890 containerd[1495]: time="2025-01-30T19:14:30.374455935Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 19:14:30.376890 containerd[1495]: time="2025-01-30T19:14:30.376674780Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 30 19:14:30.376890 containerd[1495]: time="2025-01-30T19:14:30.376792345Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 19:14:30.379043 containerd[1495]: time="2025-01-30T19:14:30.378993906Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 19:14:30.379821 containerd[1495]: time="2025-01-30T19:14:30.379623788Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Jan 30 19:14:30.382114 containerd[1495]: time="2025-01-30T19:14:30.382057890Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 30 19:14:30.382518 containerd[1495]: time="2025-01-30T19:14:30.382356519Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 19:14:30.384995 containerd[1495]: time="2025-01-30T19:14:30.384914227Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 30 
19:14:30.386550 containerd[1495]: time="2025-01-30T19:14:30.386236896Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 574.266555ms" Jan 30 19:14:30.387729 containerd[1495]: time="2025-01-30T19:14:30.387676451Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 598.345464ms" Jan 30 19:14:30.393047 kubelet[2298]: W0130 19:14:30.392910 2298 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.244.22.2:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.244.22.2:6443: connect: connection refused Jan 30 19:14:30.393047 kubelet[2298]: E0130 19:14:30.392995 2298 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.244.22.2:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:30.394848 containerd[1495]: time="2025-01-30T19:14:30.394671732Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 583.157618ms" Jan 30 19:14:30.581582 containerd[1495]: time="2025-01-30T19:14:30.581388693Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:14:30.582854 containerd[1495]: time="2025-01-30T19:14:30.581609660Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:14:30.582854 containerd[1495]: time="2025-01-30T19:14:30.581633940Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:30.582854 containerd[1495]: time="2025-01-30T19:14:30.582192956Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:30.588398 containerd[1495]: time="2025-01-30T19:14:30.587937034Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:14:30.588398 containerd[1495]: time="2025-01-30T19:14:30.588028779Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:14:30.588398 containerd[1495]: time="2025-01-30T19:14:30.588084570Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:30.588398 containerd[1495]: time="2025-01-30T19:14:30.588216960Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:30.590509 containerd[1495]: time="2025-01-30T19:14:30.590402367Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:14:30.590638 containerd[1495]: time="2025-01-30T19:14:30.590544328Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:14:30.591226 containerd[1495]: time="2025-01-30T19:14:30.590576102Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:30.591226 containerd[1495]: time="2025-01-30T19:14:30.591095796Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:30.598202 kubelet[2298]: W0130 19:14:30.598124 2298 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.244.22.2:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-ehdo1.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.244.22.2:6443: connect: connection refused Jan 30 19:14:30.598329 kubelet[2298]: E0130 19:14:30.598207 2298 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.244.22.2:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-ehdo1.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:30.611374 kubelet[2298]: W0130 19:14:30.611304 2298 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.244.22.2:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.244.22.2:6443: connect: connection refused Jan 30 19:14:30.611478 kubelet[2298]: E0130 19:14:30.611394 2298 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.244.22.2:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:30.639201 systemd[1]: Started cri-containerd-4dfacad81ab9710a8576d6f6abcb2ad4cf5333a8fcb2e8b032a25c62ab1c1e03.scope - libcontainer container 4dfacad81ab9710a8576d6f6abcb2ad4cf5333a8fcb2e8b032a25c62ab1c1e03. Jan 30 19:14:30.643639 systemd[1]: Started cri-containerd-9283d398d5874922216e36334059478cf7833e975ccefb1928ba2f3339110deb.scope - libcontainer container 9283d398d5874922216e36334059478cf7833e975ccefb1928ba2f3339110deb. Jan 30 19:14:30.646973 systemd[1]: Started cri-containerd-97e3d5738dcac7b45354e3da9a598ad99ab6c36163b5302b985bca78ec465ce8.scope - libcontainer container 97e3d5738dcac7b45354e3da9a598ad99ab6c36163b5302b985bca78ec465ce8. 
Jan 30 19:14:30.708577 kubelet[2298]: E0130 19:14:30.708468 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.22.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-ehdo1.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.22.2:6443: connect: connection refused" interval="1.6s" Jan 30 19:14:30.770953 containerd[1495]: time="2025-01-30T19:14:30.769332493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-ehdo1.gb1.brightbox.com,Uid:b137df1b4582753958de4ab3a878cdd9,Namespace:kube-system,Attempt:0,} returns sandbox id \"9283d398d5874922216e36334059478cf7833e975ccefb1928ba2f3339110deb\"" Jan 30 19:14:30.776568 containerd[1495]: time="2025-01-30T19:14:30.776532051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-ehdo1.gb1.brightbox.com,Uid:aae72a3d5f96b5efc2580d509c27c5fd,Namespace:kube-system,Attempt:0,} returns sandbox id \"4dfacad81ab9710a8576d6f6abcb2ad4cf5333a8fcb2e8b032a25c62ab1c1e03\"" Jan 30 19:14:30.798497 containerd[1495]: time="2025-01-30T19:14:30.798413374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-ehdo1.gb1.brightbox.com,Uid:b26a71cfdde83d612efcaf7666f5e5f9,Namespace:kube-system,Attempt:0,} returns sandbox id \"97e3d5738dcac7b45354e3da9a598ad99ab6c36163b5302b985bca78ec465ce8\"" Jan 30 19:14:30.805210 containerd[1495]: time="2025-01-30T19:14:30.805041704Z" level=info msg="CreateContainer within sandbox \"4dfacad81ab9710a8576d6f6abcb2ad4cf5333a8fcb2e8b032a25c62ab1c1e03\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 30 19:14:30.806022 containerd[1495]: time="2025-01-30T19:14:30.805977597Z" level=info msg="CreateContainer within sandbox \"9283d398d5874922216e36334059478cf7833e975ccefb1928ba2f3339110deb\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 30 19:14:30.806657 containerd[1495]: time="2025-01-30T19:14:30.806624497Z" level=info msg="CreateContainer within sandbox \"97e3d5738dcac7b45354e3da9a598ad99ab6c36163b5302b985bca78ec465ce8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 30 19:14:30.864639 containerd[1495]: time="2025-01-30T19:14:30.864562406Z" level=info msg="CreateContainer within sandbox \"4dfacad81ab9710a8576d6f6abcb2ad4cf5333a8fcb2e8b032a25c62ab1c1e03\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"80ecac930d0abe72a2097e3728ba66d3b648864410071b4f3f55cb925d33bf68\"" Jan 30 19:14:30.866591 containerd[1495]: time="2025-01-30T19:14:30.866074327Z" level=info msg="StartContainer for \"80ecac930d0abe72a2097e3728ba66d3b648864410071b4f3f55cb925d33bf68\"" Jan 30 19:14:30.870977 containerd[1495]: time="2025-01-30T19:14:30.870941663Z" level=info msg="CreateContainer within sandbox \"9283d398d5874922216e36334059478cf7833e975ccefb1928ba2f3339110deb\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"afe00dd947f6c1c645894a236939957f3fa5c402f389f15f074c7c58c2a7079b\"" Jan 30 19:14:30.871564 containerd[1495]: time="2025-01-30T19:14:30.871530950Z" level=info msg="CreateContainer within sandbox \"97e3d5738dcac7b45354e3da9a598ad99ab6c36163b5302b985bca78ec465ce8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"6c19514be5348b4dec8d3f1733ce7a79962cd223bc4d20b14e35fc87ab59e5b2\"" Jan 30 19:14:30.871797 containerd[1495]: time="2025-01-30T19:14:30.871617788Z" level=info msg="StartContainer for 
\"afe00dd947f6c1c645894a236939957f3fa5c402f389f15f074c7c58c2a7079b\"" Jan 30 19:14:30.873345 containerd[1495]: time="2025-01-30T19:14:30.873306869Z" level=info msg="StartContainer for \"6c19514be5348b4dec8d3f1733ce7a79962cd223bc4d20b14e35fc87ab59e5b2\"" Jan 30 19:14:30.921113 kubelet[2298]: I0130 19:14:30.921079 2298 kubelet_node_status.go:76] "Attempting to register node" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:30.921587 kubelet[2298]: E0130 19:14:30.921514 2298 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.244.22.2:6443/api/v1/nodes\": dial tcp 10.244.22.2:6443: connect: connection refused" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:30.929715 systemd[1]: Started cri-containerd-80ecac930d0abe72a2097e3728ba66d3b648864410071b4f3f55cb925d33bf68.scope - libcontainer container 80ecac930d0abe72a2097e3728ba66d3b648864410071b4f3f55cb925d33bf68. Jan 30 19:14:30.943199 systemd[1]: Started cri-containerd-6c19514be5348b4dec8d3f1733ce7a79962cd223bc4d20b14e35fc87ab59e5b2.scope - libcontainer container 6c19514be5348b4dec8d3f1733ce7a79962cd223bc4d20b14e35fc87ab59e5b2. Jan 30 19:14:30.954806 systemd[1]: Started cri-containerd-afe00dd947f6c1c645894a236939957f3fa5c402f389f15f074c7c58c2a7079b.scope - libcontainer container afe00dd947f6c1c645894a236939957f3fa5c402f389f15f074c7c58c2a7079b. Jan 30 19:14:31.051460 containerd[1495]: time="2025-01-30T19:14:31.051259680Z" level=info msg="StartContainer for \"80ecac930d0abe72a2097e3728ba66d3b648864410071b4f3f55cb925d33bf68\" returns successfully" Jan 30 19:14:31.073145 containerd[1495]: time="2025-01-30T19:14:31.072047094Z" level=info msg="StartContainer for \"6c19514be5348b4dec8d3f1733ce7a79962cd223bc4d20b14e35fc87ab59e5b2\" returns successfully" Jan 30 19:14:31.078738 containerd[1495]: time="2025-01-30T19:14:31.078683218Z" level=info msg="StartContainer for \"afe00dd947f6c1c645894a236939957f3fa5c402f389f15f074c7c58c2a7079b\" returns successfully" Jan 30 19:14:31.375818 kubelet[2298]: E0130 19:14:31.371186 2298 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:31.387462 kubelet[2298]: E0130 19:14:31.381328 2298 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:31.396374 kubelet[2298]: E0130 19:14:31.396317 2298 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.244.22.2:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.22.2:6443: connect: connection refused" logger="UnhandledError" Jan 30 19:14:31.397615 kubelet[2298]: E0130 19:14:31.397560 2298 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:32.390664 kubelet[2298]: E0130 19:14:32.390612 2298 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:32.392220 kubelet[2298]: E0130 19:14:32.392189 2298 kubelet.go:3196] "No need to create a mirror pod, since failed to get node 
info from the cluster" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:32.527735 kubelet[2298]: I0130 19:14:32.527687 2298 kubelet_node_status.go:76] "Attempting to register node" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:34.404852 kubelet[2298]: E0130 19:14:34.404785 2298 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-ehdo1.gb1.brightbox.com\" not found" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:34.569896 kubelet[2298]: I0130 19:14:34.566321 2298 kubelet_node_status.go:79] "Successfully registered node" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:34.603914 kubelet[2298]: I0130 19:14:34.603823 2298 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:34.630860 kubelet[2298]: E0130 19:14:34.627899 2298 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-ehdo1.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:34.630860 kubelet[2298]: I0130 19:14:34.627976 2298 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:34.635282 kubelet[2298]: E0130 19:14:34.635199 2298 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:34.635282 kubelet[2298]: I0130 19:14:34.635248 2298 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:34.641550 kubelet[2298]: E0130 19:14:34.641469 2298 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-ehdo1.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:35.275293 kubelet[2298]: I0130 19:14:35.275229 2298 apiserver.go:52] "Watching apiserver" Jan 30 19:14:35.302077 kubelet[2298]: I0130 19:14:35.302013 2298 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Jan 30 19:14:35.615013 kubelet[2298]: I0130 19:14:35.614117 2298 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:35.622294 kubelet[2298]: W0130 19:14:35.621703 2298 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 30 19:14:36.337739 systemd[1]: Reloading requested from client PID 2574 ('systemctl') (unit session-11.scope)... Jan 30 19:14:36.337774 systemd[1]: Reloading... Jan 30 19:14:36.463968 zram_generator::config[2619]: No configuration found. Jan 30 19:14:36.646972 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 30 19:14:36.782188 systemd[1]: Reloading finished in 443 ms. Jan 30 19:14:36.845093 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:14:36.858697 systemd[1]: kubelet.service: Deactivated successfully. 
Jan 30 19:14:36.859147 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:14:36.866247 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 30 19:14:37.072161 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 30 19:14:37.083474 (kubelet)[2677]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 30 19:14:37.180319 kubelet[2677]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 19:14:37.182856 kubelet[2677]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 30 19:14:37.182856 kubelet[2677]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 19:14:37.182856 kubelet[2677]: I0130 19:14:37.180980 2677 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 30 19:14:37.190312 kubelet[2677]: I0130 19:14:37.190250 2677 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Jan 30 19:14:37.190312 kubelet[2677]: I0130 19:14:37.190288 2677 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 30 19:14:37.190611 kubelet[2677]: I0130 19:14:37.190562 2677 server.go:954] "Client rotation is on, will bootstrap in background" Jan 30 19:14:37.193641 kubelet[2677]: I0130 19:14:37.193600 2677 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 30 19:14:37.203762 kubelet[2677]: I0130 19:14:37.203507 2677 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 30 19:14:37.214188 kubelet[2677]: E0130 19:14:37.214144 2677 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Jan 30 19:14:37.214188 kubelet[2677]: I0130 19:14:37.214190 2677 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Jan 30 19:14:37.219631 kubelet[2677]: I0130 19:14:37.219553 2677 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 30 19:14:37.220113 kubelet[2677]: I0130 19:14:37.220053 2677 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 30 19:14:37.220336 kubelet[2677]: I0130 19:14:37.220110 2677 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-ehdo1.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 30 19:14:37.220497 kubelet[2677]: I0130 19:14:37.220342 2677 topology_manager.go:138] "Creating topology manager with none policy" Jan 30 19:14:37.220497 kubelet[2677]: I0130 19:14:37.220359 2677 container_manager_linux.go:304] "Creating device plugin manager" Jan 30 19:14:37.220497 kubelet[2677]: I0130 19:14:37.220411 2677 state_mem.go:36] "Initialized new in-memory state store" Jan 30 19:14:37.220693 kubelet[2677]: I0130 19:14:37.220672 2677 kubelet.go:446] "Attempting to sync node with API server" Jan 30 19:14:37.220749 kubelet[2677]: I0130 19:14:37.220697 2677 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 30 19:14:37.221363 kubelet[2677]: I0130 19:14:37.221316 2677 kubelet.go:352] "Adding apiserver pod source" Jan 30 19:14:37.221363 kubelet[2677]: I0130 19:14:37.221355 2677 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 30 19:14:37.225445 kubelet[2677]: I0130 19:14:37.225191 2677 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Jan 30 19:14:37.226856 kubelet[2677]: I0130 19:14:37.226099 2677 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 30 19:14:37.226856 kubelet[2677]: I0130 19:14:37.226706 2677 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 30 19:14:37.226856 kubelet[2677]: I0130 19:14:37.226744 2677 server.go:1287] "Started kubelet" Jan 30 19:14:37.232529 kubelet[2677]: I0130 19:14:37.232505 2677 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 19:14:37.239735 kubelet[2677]: I0130 19:14:37.239683 2677 server.go:169] 
"Starting to listen" address="0.0.0.0" port=10250 Jan 30 19:14:37.241159 kubelet[2677]: I0130 19:14:37.241134 2677 server.go:490] "Adding debug handlers to kubelet server" Jan 30 19:14:37.250078 kubelet[2677]: I0130 19:14:37.249841 2677 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 30 19:14:37.250189 kubelet[2677]: E0130 19:14:37.250160 2677 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"srv-ehdo1.gb1.brightbox.com\" not found" Jan 30 19:14:37.250317 kubelet[2677]: I0130 19:14:37.250257 2677 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 19:14:37.250741 kubelet[2677]: I0130 19:14:37.250717 2677 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 19:14:37.252144 kubelet[2677]: I0130 19:14:37.252117 2677 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 30 19:14:37.253393 kubelet[2677]: I0130 19:14:37.253365 2677 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Jan 30 19:14:37.253605 kubelet[2677]: I0130 19:14:37.253579 2677 reconciler.go:26] "Reconciler: start to sync state" Jan 30 19:14:37.257702 kubelet[2677]: I0130 19:14:37.257669 2677 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 30 19:14:37.259337 kubelet[2677]: I0130 19:14:37.259314 2677 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 30 19:14:37.259457 kubelet[2677]: I0130 19:14:37.259438 2677 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 30 19:14:37.259593 kubelet[2677]: I0130 19:14:37.259572 2677 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 30 19:14:37.259704 kubelet[2677]: I0130 19:14:37.259686 2677 kubelet.go:2388] "Starting kubelet main sync loop" Jan 30 19:14:37.260861 kubelet[2677]: E0130 19:14:37.260453 2677 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 30 19:14:37.270192 kubelet[2677]: I0130 19:14:37.270158 2677 factory.go:221] Registration of the systemd container factory successfully Jan 30 19:14:37.270494 kubelet[2677]: I0130 19:14:37.270463 2677 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 30 19:14:37.276298 kubelet[2677]: I0130 19:14:37.276255 2677 factory.go:221] Registration of the containerd container factory successfully Jan 30 19:14:37.279147 kubelet[2677]: E0130 19:14:37.279115 2677 kubelet.go:1561] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346263 2677 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346293 2677 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346331 2677 state_mem.go:36] "Initialized new in-memory state store" Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346555 2677 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346583 2677 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346623 2677 policy_none.go:49] "None policy: Start" Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346638 2677 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346656 2677 state_mem.go:35] "Initializing new in-memory state store" Jan 30 19:14:37.346999 kubelet[2677]: I0130 19:14:37.346796 2677 state_mem.go:75] "Updated machine memory state" Jan 30 19:14:37.356521 kubelet[2677]: I0130 19:14:37.356486 2677 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 30 19:14:37.356848 kubelet[2677]: I0130 19:14:37.356804 2677 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 30 19:14:37.358146 kubelet[2677]: I0130 19:14:37.358077 2677 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 30 19:14:37.358685 kubelet[2677]: I0130 19:14:37.358405 2677 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 30 19:14:37.362578 kubelet[2677]: I0130 19:14:37.362103 2677 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.364921 kubelet[2677]: I0130 19:14:37.364112 2677 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.364921 kubelet[2677]: I0130 19:14:37.364468 2677 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.365389 kubelet[2677]: E0130 19:14:37.365362 2677 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 30 19:14:37.381747 kubelet[2677]: W0130 19:14:37.381617 2677 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 30 19:14:37.382725 kubelet[2677]: W0130 19:14:37.381965 2677 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 30 19:14:37.382725 kubelet[2677]: W0130 19:14:37.382209 2677 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 30 19:14:37.382725 kubelet[2677]: E0130 19:14:37.382258 2677 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-ehdo1.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.487175 kubelet[2677]: I0130 19:14:37.487130 2677 kubelet_node_status.go:76] "Attempting to register node" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.506254 kubelet[2677]: I0130 19:14:37.506070 2677 kubelet_node_status.go:125] "Node was previously registered" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.506254 kubelet[2677]: I0130 19:14:37.506187 2677 kubelet_node_status.go:79] "Successfully registered node" node="srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.556823 kubelet[2677]: I0130 19:14:37.556667 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aae72a3d5f96b5efc2580d509c27c5fd-usr-share-ca-certificates\") pod \"kube-apiserver-srv-ehdo1.gb1.brightbox.com\" (UID: \"aae72a3d5f96b5efc2580d509c27c5fd\") " pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.556823 kubelet[2677]: I0130 19:14:37.556742 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.556823 kubelet[2677]: I0130 19:14:37.556778 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b137df1b4582753958de4ab3a878cdd9-kubeconfig\") pod \"kube-scheduler-srv-ehdo1.gb1.brightbox.com\" (UID: \"b137df1b4582753958de4ab3a878cdd9\") " pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.556823 kubelet[2677]: I0130 19:14:37.556851 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aae72a3d5f96b5efc2580d509c27c5fd-ca-certs\") pod \"kube-apiserver-srv-ehdo1.gb1.brightbox.com\" (UID: \"aae72a3d5f96b5efc2580d509c27c5fd\") " pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.556823 kubelet[2677]: I0130 19:14:37.556894 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aae72a3d5f96b5efc2580d509c27c5fd-k8s-certs\") pod \"kube-apiserver-srv-ehdo1.gb1.brightbox.com\" (UID: \"aae72a3d5f96b5efc2580d509c27c5fd\") " 
pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.557367 kubelet[2677]: I0130 19:14:37.556950 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-ca-certs\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.557367 kubelet[2677]: I0130 19:14:37.556980 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-flexvolume-dir\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.557367 kubelet[2677]: I0130 19:14:37.557026 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-k8s-certs\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:37.557367 kubelet[2677]: I0130 19:14:37.557059 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b26a71cfdde83d612efcaf7666f5e5f9-kubeconfig\") pod \"kube-controller-manager-srv-ehdo1.gb1.brightbox.com\" (UID: \"b26a71cfdde83d612efcaf7666f5e5f9\") " pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:38.224853 kubelet[2677]: I0130 19:14:38.224727 2677 apiserver.go:52] "Watching apiserver" Jan 30 19:14:38.254532 kubelet[2677]: I0130 19:14:38.254440 2677 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Jan 30 19:14:38.321403 kubelet[2677]: I0130 19:14:38.321000 2677 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:38.322785 kubelet[2677]: I0130 19:14:38.322013 2677 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:38.328426 kubelet[2677]: W0130 19:14:38.328132 2677 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 30 19:14:38.328426 kubelet[2677]: E0130 19:14:38.328195 2677 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-ehdo1.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:38.336307 kubelet[2677]: W0130 19:14:38.336268 2677 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 30 19:14:38.336497 kubelet[2677]: E0130 19:14:38.336342 2677 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-ehdo1.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" Jan 30 19:14:38.451039 kubelet[2677]: I0130 19:14:38.450263 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-srv-ehdo1.gb1.brightbox.com" podStartSLOduration=1.4502242810000001 podStartE2EDuration="1.450224281s" podCreationTimestamp="2025-01-30 19:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 19:14:38.387734667 +0000 UTC m=+1.268958518" watchObservedRunningTime="2025-01-30 19:14:38.450224281 +0000 UTC m=+1.331448116" Jan 30 19:14:38.517510 kubelet[2677]: I0130 19:14:38.517301 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-ehdo1.gb1.brightbox.com" podStartSLOduration=3.517268048 podStartE2EDuration="3.517268048s" podCreationTimestamp="2025-01-30 19:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 19:14:38.451800214 +0000 UTC m=+1.333024068" watchObservedRunningTime="2025-01-30 19:14:38.517268048 +0000 UTC m=+1.398491892" Jan 30 19:14:38.577035 kubelet[2677]: I0130 19:14:38.576869 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-ehdo1.gb1.brightbox.com" podStartSLOduration=1.5768242959999998 podStartE2EDuration="1.576824296s" podCreationTimestamp="2025-01-30 19:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 19:14:38.535947227 +0000 UTC m=+1.417171080" watchObservedRunningTime="2025-01-30 19:14:38.576824296 +0000 UTC m=+1.458048145" Jan 30 19:14:42.413360 kubelet[2677]: I0130 19:14:42.413073 2677 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 30 19:14:42.415295 kubelet[2677]: I0130 19:14:42.414450 2677 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 30 19:14:42.415591 containerd[1495]: time="2025-01-30T19:14:42.414126092Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Jan 30 19:14:43.198658 kubelet[2677]: I0130 19:14:43.196254 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchnr\" (UniqueName: \"kubernetes.io/projected/04e527eb-3013-4b74-8dd7-857c9879bb20-kube-api-access-wchnr\") pod \"kube-proxy-67nxv\" (UID: \"04e527eb-3013-4b74-8dd7-857c9879bb20\") " pod="kube-system/kube-proxy-67nxv" Jan 30 19:14:43.198658 kubelet[2677]: I0130 19:14:43.196328 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/04e527eb-3013-4b74-8dd7-857c9879bb20-kube-proxy\") pod \"kube-proxy-67nxv\" (UID: \"04e527eb-3013-4b74-8dd7-857c9879bb20\") " pod="kube-system/kube-proxy-67nxv" Jan 30 19:14:43.198658 kubelet[2677]: I0130 19:14:43.196371 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/04e527eb-3013-4b74-8dd7-857c9879bb20-xtables-lock\") pod \"kube-proxy-67nxv\" (UID: \"04e527eb-3013-4b74-8dd7-857c9879bb20\") " pod="kube-system/kube-proxy-67nxv" Jan 30 19:14:43.198658 kubelet[2677]: I0130 19:14:43.196401 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04e527eb-3013-4b74-8dd7-857c9879bb20-lib-modules\") pod \"kube-proxy-67nxv\" (UID: \"04e527eb-3013-4b74-8dd7-857c9879bb20\") " pod="kube-system/kube-proxy-67nxv" Jan 30 19:14:43.208055 systemd[1]: Created slice kubepods-besteffort-pod04e527eb_3013_4b74_8dd7_857c9879bb20.slice - libcontainer container kubepods-besteffort-pod04e527eb_3013_4b74_8dd7_857c9879bb20.slice. Jan 30 19:14:43.464134 systemd[1]: Created slice kubepods-besteffort-pod10007f7b_59c2_4a3d_a27e_45ad2ab19bc6.slice - libcontainer container kubepods-besteffort-pod10007f7b_59c2_4a3d_a27e_45ad2ab19bc6.slice. Jan 30 19:14:43.499245 kubelet[2677]: I0130 19:14:43.499072 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd4r7\" (UniqueName: \"kubernetes.io/projected/10007f7b-59c2-4a3d-a27e-45ad2ab19bc6-kube-api-access-rd4r7\") pod \"tigera-operator-7d68577dc5-j5kxn\" (UID: \"10007f7b-59c2-4a3d-a27e-45ad2ab19bc6\") " pod="tigera-operator/tigera-operator-7d68577dc5-j5kxn" Jan 30 19:14:43.499245 kubelet[2677]: I0130 19:14:43.499146 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/10007f7b-59c2-4a3d-a27e-45ad2ab19bc6-var-lib-calico\") pod \"tigera-operator-7d68577dc5-j5kxn\" (UID: \"10007f7b-59c2-4a3d-a27e-45ad2ab19bc6\") " pod="tigera-operator/tigera-operator-7d68577dc5-j5kxn" Jan 30 19:14:43.524434 containerd[1495]: time="2025-01-30T19:14:43.522679343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-67nxv,Uid:04e527eb-3013-4b74-8dd7-857c9879bb20,Namespace:kube-system,Attempt:0,}" Jan 30 19:14:43.569633 containerd[1495]: time="2025-01-30T19:14:43.569291570Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:14:43.569633 containerd[1495]: time="2025-01-30T19:14:43.569401416Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:14:43.569633 containerd[1495]: time="2025-01-30T19:14:43.569425056Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:43.569633 containerd[1495]: time="2025-01-30T19:14:43.569562120Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:43.617258 systemd[1]: Started cri-containerd-18e8c046dc04d9e2438d4db0d9efe48c37b811bcde8336b119fbbd130c58bd45.scope - libcontainer container 18e8c046dc04d9e2438d4db0d9efe48c37b811bcde8336b119fbbd130c58bd45. Jan 30 19:14:43.679122 containerd[1495]: time="2025-01-30T19:14:43.678941680Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-67nxv,Uid:04e527eb-3013-4b74-8dd7-857c9879bb20,Namespace:kube-system,Attempt:0,} returns sandbox id \"18e8c046dc04d9e2438d4db0d9efe48c37b811bcde8336b119fbbd130c58bd45\"" Jan 30 19:14:43.688129 containerd[1495]: time="2025-01-30T19:14:43.687698242Z" level=info msg="CreateContainer within sandbox \"18e8c046dc04d9e2438d4db0d9efe48c37b811bcde8336b119fbbd130c58bd45\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 30 19:14:43.711043 containerd[1495]: time="2025-01-30T19:14:43.710435444Z" level=info msg="CreateContainer within sandbox \"18e8c046dc04d9e2438d4db0d9efe48c37b811bcde8336b119fbbd130c58bd45\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"320af0a05000eafd3d573cc884d9dece0e22b421f0f6a9f36a6bf110d59bb562\"" Jan 30 19:14:43.711596 containerd[1495]: time="2025-01-30T19:14:43.711551226Z" level=info msg="StartContainer for \"320af0a05000eafd3d573cc884d9dece0e22b421f0f6a9f36a6bf110d59bb562\"" Jan 30 19:14:43.775927 containerd[1495]: time="2025-01-30T19:14:43.775781124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d68577dc5-j5kxn,Uid:10007f7b-59c2-4a3d-a27e-45ad2ab19bc6,Namespace:tigera-operator,Attempt:0,}" Jan 30 19:14:43.778556 systemd[1]: Started cri-containerd-320af0a05000eafd3d573cc884d9dece0e22b421f0f6a9f36a6bf110d59bb562.scope - libcontainer container 320af0a05000eafd3d573cc884d9dece0e22b421f0f6a9f36a6bf110d59bb562. Jan 30 19:14:43.780200 sudo[1776]: pam_unix(sudo:session): session closed for user root Jan 30 19:14:43.844192 containerd[1495]: time="2025-01-30T19:14:43.843510533Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:14:43.844192 containerd[1495]: time="2025-01-30T19:14:43.843609103Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:14:43.844192 containerd[1495]: time="2025-01-30T19:14:43.843628980Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:43.844192 containerd[1495]: time="2025-01-30T19:14:43.843918418Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:43.855763 containerd[1495]: time="2025-01-30T19:14:43.854430526Z" level=info msg="StartContainer for \"320af0a05000eafd3d573cc884d9dece0e22b421f0f6a9f36a6bf110d59bb562\" returns successfully" Jan 30 19:14:43.875052 systemd[1]: Started cri-containerd-932058a8a04e1df09a935419fd3762231431cae69d2709bd17759fe207b838f9.scope - libcontainer container 932058a8a04e1df09a935419fd3762231431cae69d2709bd17759fe207b838f9. Jan 30 19:14:43.928771 sshd[1773]: pam_unix(sshd:session): session closed for user core Jan 30 19:14:43.936552 systemd[1]: sshd@8-10.244.22.2:22-139.178.89.65:47698.service: Deactivated successfully. Jan 30 19:14:43.940424 systemd[1]: session-11.scope: Deactivated successfully. Jan 30 19:14:43.942877 systemd[1]: session-11.scope: Consumed 7.282s CPU time, 141.2M memory peak, 0B memory swap peak. Jan 30 19:14:43.945181 systemd-logind[1486]: Session 11 logged out. Waiting for processes to exit. Jan 30 19:14:43.948441 systemd-logind[1486]: Removed session 11. Jan 30 19:14:43.960665 containerd[1495]: time="2025-01-30T19:14:43.959960492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d68577dc5-j5kxn,Uid:10007f7b-59c2-4a3d-a27e-45ad2ab19bc6,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"932058a8a04e1df09a935419fd3762231431cae69d2709bd17759fe207b838f9\"" Jan 30 19:14:43.964021 containerd[1495]: time="2025-01-30T19:14:43.963929742Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Jan 30 19:14:44.358652 kubelet[2677]: I0130 19:14:44.358358 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-67nxv" podStartSLOduration=1.35833486 podStartE2EDuration="1.35833486s" podCreationTimestamp="2025-01-30 19:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 19:14:44.355172778 +0000 UTC m=+7.236396641" watchObservedRunningTime="2025-01-30 19:14:44.35833486 +0000 UTC m=+7.239558708" Jan 30 19:14:47.526669 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3987493847.mount: Deactivated successfully. 
Jan 30 19:14:48.337869 containerd[1495]: time="2025-01-30T19:14:48.337555795Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:48.338863 containerd[1495]: time="2025-01-30T19:14:48.338800860Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21762497" Jan 30 19:14:48.339718 containerd[1495]: time="2025-01-30T19:14:48.339635028Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:48.343227 containerd[1495]: time="2025-01-30T19:14:48.342656195Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:14:48.343998 containerd[1495]: time="2025-01-30T19:14:48.343939990Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 4.379939285s" Jan 30 19:14:48.343998 containerd[1495]: time="2025-01-30T19:14:48.343986361Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Jan 30 19:14:48.397067 containerd[1495]: time="2025-01-30T19:14:48.397015388Z" level=info msg="CreateContainer within sandbox \"932058a8a04e1df09a935419fd3762231431cae69d2709bd17759fe207b838f9\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 30 19:14:48.414537 containerd[1495]: time="2025-01-30T19:14:48.414477074Z" level=info msg="CreateContainer within sandbox \"932058a8a04e1df09a935419fd3762231431cae69d2709bd17759fe207b838f9\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9f90bcd9f7d060895733eead395f5d0c537c9e9968a81a681cd37a09904ec11c\"" Jan 30 19:14:48.416730 containerd[1495]: time="2025-01-30T19:14:48.416697120Z" level=info msg="StartContainer for \"9f90bcd9f7d060895733eead395f5d0c537c9e9968a81a681cd37a09904ec11c\"" Jan 30 19:14:48.456128 systemd[1]: Started cri-containerd-9f90bcd9f7d060895733eead395f5d0c537c9e9968a81a681cd37a09904ec11c.scope - libcontainer container 9f90bcd9f7d060895733eead395f5d0c537c9e9968a81a681cd37a09904ec11c. 
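
[Editor's note] A quick sanity check on the pull logged above: containerd reported bytes read=21762497 for the operator image over a wall time of 4.379939285s, roughly 4.74 MiB/s. The arithmetic:

    // Back-of-envelope check on the numbers containerd logged for the pull.
    package main

    import "fmt"

    func main() {
        const bytesRead = 21762497  // "bytes read=21762497"
        const seconds = 4.379939285 // "in 4.379939285s"
        fmt.Printf("%.2f MiB/s\n", bytesRead/seconds/(1<<20)) // ≈ 4.74 MiB/s
    }
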
Jan 30 19:14:48.504867 containerd[1495]: time="2025-01-30T19:14:48.504793669Z" level=info msg="StartContainer for \"9f90bcd9f7d060895733eead395f5d0c537c9e9968a81a681cd37a09904ec11c\" returns successfully" Jan 30 19:14:49.375420 kubelet[2677]: I0130 19:14:49.374605 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7d68577dc5-j5kxn" podStartSLOduration=1.976292751 podStartE2EDuration="6.374584523s" podCreationTimestamp="2025-01-30 19:14:43 +0000 UTC" firstStartedPulling="2025-01-30 19:14:43.961876482 +0000 UTC m=+6.843100318" lastFinishedPulling="2025-01-30 19:14:48.360168249 +0000 UTC m=+11.241392090" observedRunningTime="2025-01-30 19:14:49.374370953 +0000 UTC m=+12.255594805" watchObservedRunningTime="2025-01-30 19:14:49.374584523 +0000 UTC m=+12.255808369" Jan 30 19:14:51.927155 systemd[1]: Created slice kubepods-besteffort-pod8419f904_f69d_45d8_8090_5b35c75935f8.slice - libcontainer container kubepods-besteffort-pod8419f904_f69d_45d8_8090_5b35c75935f8.slice. Jan 30 19:14:51.952402 kubelet[2677]: I0130 19:14:51.952335 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/8419f904-f69d-45d8-8090-5b35c75935f8-typha-certs\") pod \"calico-typha-6975c8487b-b8fh2\" (UID: \"8419f904-f69d-45d8-8090-5b35c75935f8\") " pod="calico-system/calico-typha-6975c8487b-b8fh2" Jan 30 19:14:51.954044 kubelet[2677]: I0130 19:14:51.952458 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8419f904-f69d-45d8-8090-5b35c75935f8-tigera-ca-bundle\") pod \"calico-typha-6975c8487b-b8fh2\" (UID: \"8419f904-f69d-45d8-8090-5b35c75935f8\") " pod="calico-system/calico-typha-6975c8487b-b8fh2" Jan 30 19:14:51.954044 kubelet[2677]: I0130 19:14:51.952496 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lbtf\" (UniqueName: \"kubernetes.io/projected/8419f904-f69d-45d8-8090-5b35c75935f8-kube-api-access-6lbtf\") pod \"calico-typha-6975c8487b-b8fh2\" (UID: \"8419f904-f69d-45d8-8090-5b35c75935f8\") " pod="calico-system/calico-typha-6975c8487b-b8fh2" Jan 30 19:14:52.040306 systemd[1]: Created slice kubepods-besteffort-pod4ab0dba4_7df4_4368_89f4_b0279789aa7a.slice - libcontainer container kubepods-besteffort-pod4ab0dba4_7df4_4368_89f4_b0279789aa7a.slice. 
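
[Editor's note] The pod_startup_latency_tracker lines are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and for pods that pulled an image, podStartSLOduration additionally subtracts the firstStartedPulling..lastFinishedPulling window. Recomputing the tigera-operator numbers from the timestamps above (they match the logged values up to the tracker's float rounding):

    // Recomputing the startup-latency tracker's numbers from the log timestamps.
    package main

    import (
        "fmt"
        "time"
    )

    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-01-30 19:14:43 +0000 UTC")            // podCreationTimestamp
        observed := mustParse("2025-01-30 19:14:49.374584523 +0000 UTC") // watchObservedRunningTime
        pullStart := mustParse("2025-01-30 19:14:43.961876482 +0000 UTC") // firstStartedPulling
        pullEnd := mustParse("2025-01-30 19:14:48.360168249 +0000 UTC")   // lastFinishedPulling

        e2e := observed.Sub(created)
        slo := e2e - pullEnd.Sub(pullStart)
        fmt.Println(e2e) // 6.374584523s, the logged podStartE2EDuration
        fmt.Println(slo) // 1.976292756s ≈ the logged podStartSLOduration=1.976292751
    }
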
Jan 30 19:14:52.052971 kubelet[2677]: I0130 19:14:52.052919 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-lib-modules\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.052971 kubelet[2677]: I0130 19:14:52.052982 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-xtables-lock\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053216 kubelet[2677]: I0130 19:14:52.053012 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-policysync\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053216 kubelet[2677]: I0130 19:14:52.053042 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-cni-bin-dir\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053216 kubelet[2677]: I0130 19:14:52.053069 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-cni-log-dir\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053216 kubelet[2677]: I0130 19:14:52.053111 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ab0dba4-7df4-4368-89f4-b0279789aa7a-tigera-ca-bundle\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053216 kubelet[2677]: I0130 19:14:52.053164 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-var-run-calico\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053458 kubelet[2677]: I0130 19:14:52.053192 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-var-lib-calico\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053458 kubelet[2677]: I0130 19:14:52.053220 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-flexvol-driver-host\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053458 kubelet[2677]: I0130 19:14:52.053266 2677 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4ab0dba4-7df4-4368-89f4-b0279789aa7a-node-certs\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053458 kubelet[2677]: I0130 19:14:52.053292 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4ab0dba4-7df4-4368-89f4-b0279789aa7a-cni-net-dir\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.053458 kubelet[2677]: I0130 19:14:52.053340 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t74rn\" (UniqueName: \"kubernetes.io/projected/4ab0dba4-7df4-4368-89f4-b0279789aa7a-kube-api-access-t74rn\") pod \"calico-node-w4qr5\" (UID: \"4ab0dba4-7df4-4368-89f4-b0279789aa7a\") " pod="calico-system/calico-node-w4qr5" Jan 30 19:14:52.166086 kubelet[2677]: E0130 19:14:52.166038 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:52.166086 kubelet[2677]: W0130 19:14:52.166089 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:52.166300 kubelet[2677]: E0130 19:14:52.166173 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:52.185082 kubelet[2677]: E0130 19:14:52.184968 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:52.185082 kubelet[2677]: W0130 19:14:52.184997 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:52.185082 kubelet[2677]: E0130 19:14:52.185026 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 19:14:52.231484 kubelet[2677]: E0130 19:14:52.231405 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5" Jan 30 19:14:52.235163 containerd[1495]: time="2025-01-30T19:14:52.235073665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6975c8487b-b8fh2,Uid:8419f904-f69d-45d8-8090-5b35c75935f8,Namespace:calico-system,Attempt:0,}"
[... the "Failed to unmarshal output" / "FlexVolume: driver call failed" / "Error dynamically probing plugins" triplet above repeats continuously from Jan 30 19:14:52.245 through Jan 30 19:14:52.285 as kubelet re-probes the empty nodeagent~uds plugin directory; only the interleaved csi-node-driver-xwmcj volume messages are kept below ...]
Jan 30 19:14:52.270358 kubelet[2677]: I0130 19:14:52.270215 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eaeacb08-27c1-40ef-baaf-66029c9f99c5-socket-dir\") pod \"csi-node-driver-xwmcj\" (UID: \"eaeacb08-27c1-40ef-baaf-66029c9f99c5\") " pod="calico-system/csi-node-driver-xwmcj" Jan 30 19:14:52.272498 kubelet[2677]: I0130 19:14:52.270756 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bgt5\" (UniqueName: \"kubernetes.io/projected/eaeacb08-27c1-40ef-baaf-66029c9f99c5-kube-api-access-4bgt5\") pod \"csi-node-driver-xwmcj\" (UID: \"eaeacb08-27c1-40ef-baaf-66029c9f99c5\") " pod="calico-system/csi-node-driver-xwmcj" Jan 30 19:14:52.273689 kubelet[2677]: I0130 19:14:52.273628 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eaeacb08-27c1-40ef-baaf-66029c9f99c5-kubelet-dir\") pod \"csi-node-driver-xwmcj\" (UID: \"eaeacb08-27c1-40ef-baaf-66029c9f99c5\") " pod="calico-system/csi-node-driver-xwmcj" Jan 30 19:14:52.280694 kubelet[2677]: I0130 19:14:52.279671 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/eaeacb08-27c1-40ef-baaf-66029c9f99c5-varrun\") pod \"csi-node-driver-xwmcj\" (UID: \"eaeacb08-27c1-40ef-baaf-66029c9f99c5\") " pod="calico-system/csi-node-driver-xwmcj"
Error: unexpected end of JSON input" Jan 30 19:14:52.285405 kubelet[2677]: I0130 19:14:52.285290 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eaeacb08-27c1-40ef-baaf-66029c9f99c5-registration-dir\") pod \"csi-node-driver-xwmcj\" (UID: \"eaeacb08-27c1-40ef-baaf-66029c9f99c5\") " pod="calico-system/csi-node-driver-xwmcj" Jan 30 19:14:52.286772 kubelet[2677]: E0130 19:14:52.286172 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:52.286772 kubelet[2677]: W0130 19:14:52.286201 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:52.286772 kubelet[2677]: E0130 19:14:52.286219 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:52.287969 kubelet[2677]: E0130 19:14:52.287234 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:52.287969 kubelet[2677]: W0130 19:14:52.287249 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:52.287969 kubelet[2677]: E0130 19:14:52.287264 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:52.287969 kubelet[2677]: E0130 19:14:52.287536 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:52.287969 kubelet[2677]: W0130 19:14:52.287550 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:52.287969 kubelet[2677]: E0130 19:14:52.287565 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:52.325242 containerd[1495]: time="2025-01-30T19:14:52.324716429Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:14:52.326651 containerd[1495]: time="2025-01-30T19:14:52.326512531Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:14:52.326909 containerd[1495]: time="2025-01-30T19:14:52.326549677Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:14:52.327651 containerd[1495]: time="2025-01-30T19:14:52.327481382Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
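The kubelet triple repeated above comes from its FlexVolume prober: it execs "<driver> init" under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, the nodeagent~uds/uds binary is missing, so stdout is empty and decoding it as JSON fails. A minimal Go sketch of that causal chain (not kubelet's actual source; the driver path is taken from the log, and DriverStatus is trimmed to illustrative fields):

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus mirrors the general FlexVolume response shape
// ({"status": "...", "capabilities": {...}}), trimmed for brevity.
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func callDriver(path string, args ...string) (*DriverStatus, error) {
	out, err := exec.Command(path, args...).CombinedOutput()
	if err != nil {
		// A missing binary fails here (the W0130 "driver call failed" line).
		fmt.Printf("FlexVolume: driver call failed: %v, output: %q\n", err, out)
	}
	var st DriverStatus
	if jerr := json.Unmarshal(out, &st); jerr != nil {
		// json.Unmarshal of empty output returns exactly
		// "unexpected end of JSON input" (the E0130 driver-call line).
		return nil, fmt.Errorf("failed to unmarshal output for command: %s, output: %q, error: %v", args[0], out, jerr)
	}
	return &st, nil
}

func main() {
	_, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds", "init")
	if err != nil {
		fmt.Println(err)
	}
}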
Jan 30 19:14:52.355915 containerd[1495]: time="2025-01-30T19:14:52.351349755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-w4qr5,Uid:4ab0dba4-7df4-4368-89f4-b0279789aa7a,Namespace:calico-system,Attempt:0,}"
Jan 30 19:14:52.372875 systemd[1]: Started cri-containerd-efb45fa081660fac4d8bbd76f87ef7e0fcd80b3c00adcb7f2a7efef601785a5a.scope - libcontainer container efb45fa081660fac4d8bbd76f87ef7e0fcd80b3c00adcb7f2a7efef601785a5a.
Jan 30 19:14:52.390168 kubelet[2677]: E0130 19:14:52.390063 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 30 19:14:52.390168 kubelet[2677]: W0130 19:14:52.390098 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 30 19:14:52.390168 kubelet[2677]: E0130 19:14:52.390149 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 30 19:14:52.444892 containerd[1495]: time="2025-01-30T19:14:52.439643059Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 30 19:14:52.444892 containerd[1495]: time="2025-01-30T19:14:52.440026416Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 30 19:14:52.444892 containerd[1495]: time="2025-01-30T19:14:52.440045361Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 19:14:52.444892 containerd[1495]: time="2025-01-30T19:14:52.441010261Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 30 19:14:52.505145 systemd[1]: Started cri-containerd-4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069.scope - libcontainer container 4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069.
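The systemd lines above show the pattern this node uses for container cgroups: each sandbox or container runs in a transient unit named "cri-containerd-<id>.scope", where <id> is the 64-hex-character containerd ID. A small Go helper, grounded only in the naming visible in this log, that reconstructs the unit name from an ID:

package main

import (
	"fmt"
	"regexp"
)

// containerd IDs in this log are 64 lowercase hex characters.
var idRE = regexp.MustCompile(`^[0-9a-f]{64}$`)

func scopeUnit(containerID string) (string, error) {
	if !idRE.MatchString(containerID) {
		return "", fmt.Errorf("not a containerd ID: %q", containerID)
	}
	return "cri-containerd-" + containerID + ".scope", nil
}

func main() {
	u, err := scopeUnit("4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069")
	if err != nil {
		panic(err)
	}
	fmt.Println(u) // matches the "Started cri-containerd-...scope" line above
}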
Jan 30 19:14:52.638895 containerd[1495]: time="2025-01-30T19:14:52.638337415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6975c8487b-b8fh2,Uid:8419f904-f69d-45d8-8090-5b35c75935f8,Namespace:calico-system,Attempt:0,} returns sandbox id \"efb45fa081660fac4d8bbd76f87ef7e0fcd80b3c00adcb7f2a7efef601785a5a\""
Jan 30 19:14:52.640509 containerd[1495]: time="2025-01-30T19:14:52.640393298Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-w4qr5,Uid:4ab0dba4-7df4-4368-89f4-b0279789aa7a,Namespace:calico-system,Attempt:0,} returns sandbox id \"4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069\""
Jan 30 19:14:52.643975 containerd[1495]: time="2025-01-30T19:14:52.643916265Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\""
Jan 30 19:14:54.263177 kubelet[2677]: E0130 19:14:54.261179 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5"
Jan 30 19:14:54.537817 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4179147507.mount: Deactivated successfully.
Jan 30 19:14:55.573756 containerd[1495]: time="2025-01-30T19:14:55.573697243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 30 19:14:55.575529 containerd[1495]: time="2025-01-30T19:14:55.574928263Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363"
Jan 30 19:14:55.575704 containerd[1495]: time="2025-01-30T19:14:55.575664750Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 30 19:14:55.578691 containerd[1495]: time="2025-01-30T19:14:55.578649559Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 30 19:14:55.580125 containerd[1495]: time="2025-01-30T19:14:55.580083211Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.936122173s"
Jan 30 19:14:55.580207 containerd[1495]: time="2025-01-30T19:14:55.580130409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\""
Jan 30 19:14:55.612595 containerd[1495]: time="2025-01-30T19:14:55.612525798Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Jan 30 19:14:55.646230 containerd[1495]: time="2025-01-30T19:14:55.646167340Z" level=info msg="CreateContainer within sandbox \"efb45fa081660fac4d8bbd76f87ef7e0fcd80b3c00adcb7f2a7efef601785a5a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jan 30 19:14:55.678601 containerd[1495]: time="2025-01-30T19:14:55.677759781Z" level=info msg="CreateContainer within sandbox \"efb45fa081660fac4d8bbd76f87ef7e0fcd80b3c00adcb7f2a7efef601785a5a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d98cc58f919df40af2b5a0ba5b7ebb1ad4e48ff9fac8abc00c1629dde8213d6d\""
Jan 30 19:14:55.679897 containerd[1495]: time="2025-01-30T19:14:55.678764370Z" level=info msg="StartContainer for \"d98cc58f919df40af2b5a0ba5b7ebb1ad4e48ff9fac8abc00c1629dde8213d6d\""
Jan 30 19:14:55.736039 systemd[1]: Started cri-containerd-d98cc58f919df40af2b5a0ba5b7ebb1ad4e48ff9fac8abc00c1629dde8213d6d.scope - libcontainer container d98cc58f919df40af2b5a0ba5b7ebb1ad4e48ff9fac8abc00c1629dde8213d6d.
Jan 30 19:14:55.810688 containerd[1495]: time="2025-01-30T19:14:55.810639567Z" level=info msg="StartContainer for \"d98cc58f919df40af2b5a0ba5b7ebb1ad4e48ff9fac8abc00c1629dde8213d6d\" returns successfully"
Jan 30 19:14:56.261259 kubelet[2677]: E0130 19:14:56.260796 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5"
Jan 30 19:14:56.391810 kubelet[2677]: E0130 19:14:56.391762 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 30 19:14:56.392047 kubelet[2677]: W0130 19:14:56.391852 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 30 19:14:56.392047 kubelet[2677]: E0130 19:14:56.391910 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
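The probing errors recur because nothing ever answers the "init" call with valid JSON. Under the FlexVolume call convention, a driver that handles "init" and prints a success status would silence them. A minimal stand-in driver sketch in Go (illustrative only; not Calico's real uds binary, which the flexvol-driver container below is responsible for installing):

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		// The success shape kubelet's driver-call expects to unmarshal.
		resp := map[string]interface{}{
			"status":       "Success",
			"capabilities": map[string]bool{"attach": false},
		}
		out, _ := json.Marshal(resp) // fixed shape, cannot fail
		fmt.Println(string(out))
		return
	}
	// Unimplemented calls report "Not supported" per the FlexVolume convention.
	fmt.Println(`{"status":"Not supported"}`)
}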
Jan 30 19:14:57.258434 containerd[1495]: time="2025-01-30T19:14:57.258357678Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 30 19:14:57.260511 containerd[1495]: time="2025-01-30T19:14:57.260351980Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121"
Jan 30 19:14:57.262405 containerd[1495]: time="2025-01-30T19:14:57.261590827Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 30 19:14:57.264597 containerd[1495]: time="2025-01-30T19:14:57.264537720Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 30 19:14:57.266390 containerd[1495]: time="2025-01-30T19:14:57.265639461Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.652821155s"
Jan 30 19:14:57.266390 containerd[1495]: time="2025-01-30T19:14:57.265692320Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\""
Jan 30 19:14:57.269130 containerd[1495]: time="2025-01-30T19:14:57.269005066Z" level=info msg="CreateContainer within sandbox \"4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 30 19:14:57.334331 containerd[1495]: time="2025-01-30T19:14:57.334097518Z" level=info msg="CreateContainer within sandbox \"4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251\""
Jan 30 19:14:57.336161 containerd[1495]: time="2025-01-30T19:14:57.335150077Z" level=info msg="StartContainer for \"8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251\""
Jan 30 19:14:57.398131 kubelet[2677]: I0130 19:14:57.398081 2677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 30 19:14:57.401093 systemd[1]: Started cri-containerd-8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251.scope - libcontainer container 8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251.
Jan 30 19:14:57.407977 kubelet[2677]: E0130 19:14:57.406284 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 30 19:14:57.407977 kubelet[2677]: W0130 19:14:57.406344 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 30 19:14:57.407977 kubelet[2677]: E0130 19:14:57.406370 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 30 19:14:57.412423 kubelet[2677]: E0130 19:14:57.412307 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 19:14:57.412695 kubelet[2677]: E0130 19:14:57.412646 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.412695 kubelet[2677]: W0130 19:14:57.412694 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.412813 kubelet[2677]: E0130 19:14:57.412713 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.413856 kubelet[2677]: E0130 19:14:57.413111 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.413856 kubelet[2677]: W0130 19:14:57.413136 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.413856 kubelet[2677]: E0130 19:14:57.413153 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.413856 kubelet[2677]: E0130 19:14:57.413516 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.413856 kubelet[2677]: W0130 19:14:57.413531 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.413856 kubelet[2677]: E0130 19:14:57.413545 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.414268 kubelet[2677]: E0130 19:14:57.413913 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.414268 kubelet[2677]: W0130 19:14:57.413928 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.414268 kubelet[2677]: E0130 19:14:57.413943 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.436360 kubelet[2677]: E0130 19:14:57.436147 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.436360 kubelet[2677]: W0130 19:14:57.436176 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.436360 kubelet[2677]: E0130 19:14:57.436199 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 19:14:57.437510 kubelet[2677]: E0130 19:14:57.436660 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.437510 kubelet[2677]: W0130 19:14:57.436684 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.437510 kubelet[2677]: E0130 19:14:57.436701 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.437510 kubelet[2677]: E0130 19:14:57.437355 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.437510 kubelet[2677]: W0130 19:14:57.437372 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.437510 kubelet[2677]: E0130 19:14:57.437389 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.438801 kubelet[2677]: E0130 19:14:57.438458 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.438801 kubelet[2677]: W0130 19:14:57.438481 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.438801 kubelet[2677]: E0130 19:14:57.438501 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.438801 kubelet[2677]: E0130 19:14:57.438776 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.438801 kubelet[2677]: W0130 19:14:57.438792 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.440500 kubelet[2677]: E0130 19:14:57.438888 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.440500 kubelet[2677]: E0130 19:14:57.439441 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.440500 kubelet[2677]: W0130 19:14:57.439456 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.440500 kubelet[2677]: E0130 19:14:57.439661 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 19:14:57.441779 kubelet[2677]: E0130 19:14:57.441745 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.441779 kubelet[2677]: W0130 19:14:57.441771 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.442304 kubelet[2677]: E0130 19:14:57.441790 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.442561 kubelet[2677]: E0130 19:14:57.442537 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.442561 kubelet[2677]: W0130 19:14:57.442560 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.445796 kubelet[2677]: E0130 19:14:57.442578 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.445796 kubelet[2677]: E0130 19:14:57.444796 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.445796 kubelet[2677]: W0130 19:14:57.444813 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.445796 kubelet[2677]: E0130 19:14:57.444856 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.445796 kubelet[2677]: E0130 19:14:57.445789 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.446187 kubelet[2677]: W0130 19:14:57.445862 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.446187 kubelet[2677]: E0130 19:14:57.445896 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.446645 kubelet[2677]: E0130 19:14:57.446615 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.446645 kubelet[2677]: W0130 19:14:57.446639 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.447084 kubelet[2677]: E0130 19:14:57.447049 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 19:14:57.447169 kubelet[2677]: E0130 19:14:57.447141 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.447169 kubelet[2677]: W0130 19:14:57.447157 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.447421 kubelet[2677]: E0130 19:14:57.447332 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.447507 kubelet[2677]: E0130 19:14:57.447490 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.447564 kubelet[2677]: W0130 19:14:57.447507 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.447564 kubelet[2677]: E0130 19:14:57.447543 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.448256 kubelet[2677]: E0130 19:14:57.448227 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.448256 kubelet[2677]: W0130 19:14:57.448250 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.448404 kubelet[2677]: E0130 19:14:57.448276 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.448568 kubelet[2677]: E0130 19:14:57.448541 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.448568 kubelet[2677]: W0130 19:14:57.448564 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.448692 kubelet[2677]: E0130 19:14:57.448599 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.448981 kubelet[2677]: E0130 19:14:57.448918 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.448981 kubelet[2677]: W0130 19:14:57.448934 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.448981 kubelet[2677]: E0130 19:14:57.448969 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 30 19:14:57.449635 kubelet[2677]: E0130 19:14:57.449496 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.449635 kubelet[2677]: W0130 19:14:57.449511 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.449635 kubelet[2677]: E0130 19:14:57.449536 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.449813 kubelet[2677]: E0130 19:14:57.449789 2677 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 30 19:14:57.449813 kubelet[2677]: W0130 19:14:57.449811 2677 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 30 19:14:57.449996 kubelet[2677]: E0130 19:14:57.449855 2677 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 30 19:14:57.461444 containerd[1495]: time="2025-01-30T19:14:57.461387686Z" level=info msg="StartContainer for \"8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251\" returns successfully" Jan 30 19:14:57.507517 systemd[1]: cri-containerd-8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251.scope: Deactivated successfully. Jan 30 19:14:57.624801 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251-rootfs.mount: Deactivated successfully. 
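An aside on the repeated driver-call.go / plugins.go errors above: they come from the kubelet's periodic FlexVolume plugin probe, which executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init and unmarshals the driver's stdout as JSON. The executable does not exist yet (the flexvol-driver container started above is what installs it), so the output is the empty string and Go's encoding/json reports "unexpected end of JSON input". Below is a minimal sketch of what the probe expects a driver to print, assuming only the documented FlexVolume call convention; the struct and field set are illustrative, not Calico's actual uds driver.

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON shape the kubelet tries to unmarshal from a
// FlexVolume driver's stdout; an empty stdout is what produces the
// "unexpected end of JSON input" errors in the journal above.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// The probe seen in the log is: <driver> init
	if len(os.Args) > 1 && os.Args[1] == "init" {
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
		return
	}
	// Calls the driver does not implement answer "Not supported".
	out, _ := json.Marshal(driverStatus{Status: "Not supported"})
	fmt.Println(string(out))
	os.Exit(1)
}

Once the real driver binary is in place the probe's next pass parses cleanly, which is consistent with the error burst tapering off after 19:14:57 above.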
Jan 30 19:14:57.818262 containerd[1495]: time="2025-01-30T19:14:57.803946905Z" level=info msg="shim disconnected" id=8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251 namespace=k8s.io Jan 30 19:14:57.818262 containerd[1495]: time="2025-01-30T19:14:57.817965254Z" level=warning msg="cleaning up after shim disconnected" id=8bbc0a3d56b808523922dc8cc47812669550314310075d5556812d9b7f1c8251 namespace=k8s.io Jan 30 19:14:57.818262 containerd[1495]: time="2025-01-30T19:14:57.818011658Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 30 19:14:58.260973 kubelet[2677]: E0130 19:14:58.260870 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5" Jan 30 19:14:58.405645 containerd[1495]: time="2025-01-30T19:14:58.405529535Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 30 19:14:58.426661 kubelet[2677]: I0130 19:14:58.426487 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6975c8487b-b8fh2" podStartSLOduration=4.45772536 podStartE2EDuration="7.426465718s" podCreationTimestamp="2025-01-30 19:14:51 +0000 UTC" firstStartedPulling="2025-01-30 19:14:52.642931888 +0000 UTC m=+15.524155728" lastFinishedPulling="2025-01-30 19:14:55.611672246 +0000 UTC m=+18.492896086" observedRunningTime="2025-01-30 19:14:56.408546189 +0000 UTC m=+19.289770051" watchObservedRunningTime="2025-01-30 19:14:58.426465718 +0000 UTC m=+21.307689562" Jan 30 19:15:00.260995 kubelet[2677]: E0130 19:15:00.260882 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5" Jan 30 19:15:02.261607 kubelet[2677]: E0130 19:15:02.260891 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5" Jan 30 19:15:04.260640 kubelet[2677]: E0130 19:15:04.260544 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5" Jan 30 19:15:05.351727 containerd[1495]: time="2025-01-30T19:15:05.351637917Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:05.353764 containerd[1495]: time="2025-01-30T19:15:05.353569927Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Jan 30 19:15:05.354797 containerd[1495]: time="2025-01-30T19:15:05.354723930Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:05.357785 containerd[1495]: 
time="2025-01-30T19:15:05.357743321Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:05.359245 containerd[1495]: time="2025-01-30T19:15:05.359209398Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 6.953315659s" Jan 30 19:15:05.359382 containerd[1495]: time="2025-01-30T19:15:05.359355022Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Jan 30 19:15:05.364071 containerd[1495]: time="2025-01-30T19:15:05.363909162Z" level=info msg="CreateContainer within sandbox \"4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 30 19:15:05.386746 containerd[1495]: time="2025-01-30T19:15:05.386692664Z" level=info msg="CreateContainer within sandbox \"4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f\"" Jan 30 19:15:05.390584 containerd[1495]: time="2025-01-30T19:15:05.390528488Z" level=info msg="StartContainer for \"40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f\"" Jan 30 19:15:05.470089 systemd[1]: Started cri-containerd-40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f.scope - libcontainer container 40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f. Jan 30 19:15:05.523999 containerd[1495]: time="2025-01-30T19:15:05.523938390Z" level=info msg="StartContainer for \"40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f\" returns successfully" Jan 30 19:15:06.262862 kubelet[2677]: E0130 19:15:06.261507 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5" Jan 30 19:15:06.524376 systemd[1]: cri-containerd-40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f.scope: Deactivated successfully. Jan 30 19:15:06.569706 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f-rootfs.mount: Deactivated successfully. 
Jan 30 19:15:06.577991 containerd[1495]: time="2025-01-30T19:15:06.577491110Z" level=info msg="shim disconnected" id=40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f namespace=k8s.io Jan 30 19:15:06.577991 containerd[1495]: time="2025-01-30T19:15:06.577601546Z" level=warning msg="cleaning up after shim disconnected" id=40b2593e25db03c5101c18ef3a46aed1f3dc23b26ec4cbd696ddd6cd810d882f namespace=k8s.io Jan 30 19:15:06.577991 containerd[1495]: time="2025-01-30T19:15:06.577671609Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 30 19:15:06.580015 kubelet[2677]: I0130 19:15:06.579976 2677 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Jan 30 19:15:06.665895 systemd[1]: Created slice kubepods-besteffort-pod310a4559_09ca_47db_a381_43206f870195.slice - libcontainer container kubepods-besteffort-pod310a4559_09ca_47db_a381_43206f870195.slice. Jan 30 19:15:06.682205 systemd[1]: Created slice kubepods-burstable-pode5cdbf5e_4fb8_4a95_8254_b6bc2709291a.slice - libcontainer container kubepods-burstable-pode5cdbf5e_4fb8_4a95_8254_b6bc2709291a.slice. Jan 30 19:15:06.698362 systemd[1]: Created slice kubepods-besteffort-poda19cd91b_5cda_4b1f_99ed_5c7be5d66c14.slice - libcontainer container kubepods-besteffort-poda19cd91b_5cda_4b1f_99ed_5c7be5d66c14.slice. Jan 30 19:15:06.711374 systemd[1]: Created slice kubepods-burstable-pod895b74bc_3470_4c3c_b993_a72d4beb91c4.slice - libcontainer container kubepods-burstable-pod895b74bc_3470_4c3c_b993_a72d4beb91c4.slice. Jan 30 19:15:06.711925 kubelet[2677]: I0130 19:15:06.711410 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgzmh\" (UniqueName: \"kubernetes.io/projected/e5cdbf5e-4fb8-4a95-8254-b6bc2709291a-kube-api-access-fgzmh\") pod \"coredns-668d6bf9bc-ct7bq\" (UID: \"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a\") " pod="kube-system/coredns-668d6bf9bc-ct7bq" Jan 30 19:15:06.711925 kubelet[2677]: I0130 19:15:06.711467 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5cdbf5e-4fb8-4a95-8254-b6bc2709291a-config-volume\") pod \"coredns-668d6bf9bc-ct7bq\" (UID: \"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a\") " pod="kube-system/coredns-668d6bf9bc-ct7bq" Jan 30 19:15:06.711925 kubelet[2677]: I0130 19:15:06.711502 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dcf02077-f75c-473d-88b8-2156144d8423-calico-apiserver-certs\") pod \"calico-apiserver-86bd64cf58-mfbfs\" (UID: \"dcf02077-f75c-473d-88b8-2156144d8423\") " pod="calico-apiserver/calico-apiserver-86bd64cf58-mfbfs" Jan 30 19:15:06.711925 kubelet[2677]: I0130 19:15:06.711534 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/895b74bc-3470-4c3c-b993-a72d4beb91c4-config-volume\") pod \"coredns-668d6bf9bc-snjjb\" (UID: \"895b74bc-3470-4c3c-b993-a72d4beb91c4\") " pod="kube-system/coredns-668d6bf9bc-snjjb" Jan 30 19:15:06.711925 kubelet[2677]: I0130 19:15:06.711562 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/310a4559-09ca-47db-a381-43206f870195-tigera-ca-bundle\") pod \"calico-kube-controllers-55fd5b7757-8rgqt\" (UID: \"310a4559-09ca-47db-a381-43206f870195\") " 
pod="calico-system/calico-kube-controllers-55fd5b7757-8rgqt" Jan 30 19:15:06.713484 kubelet[2677]: I0130 19:15:06.711591 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckdnc\" (UniqueName: \"kubernetes.io/projected/310a4559-09ca-47db-a381-43206f870195-kube-api-access-ckdnc\") pod \"calico-kube-controllers-55fd5b7757-8rgqt\" (UID: \"310a4559-09ca-47db-a381-43206f870195\") " pod="calico-system/calico-kube-controllers-55fd5b7757-8rgqt" Jan 30 19:15:06.713484 kubelet[2677]: I0130 19:15:06.711621 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a19cd91b-5cda-4b1f-99ed-5c7be5d66c14-calico-apiserver-certs\") pod \"calico-apiserver-86bd64cf58-cm96v\" (UID: \"a19cd91b-5cda-4b1f-99ed-5c7be5d66c14\") " pod="calico-apiserver/calico-apiserver-86bd64cf58-cm96v" Jan 30 19:15:06.713484 kubelet[2677]: I0130 19:15:06.711656 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4blc\" (UniqueName: \"kubernetes.io/projected/895b74bc-3470-4c3c-b993-a72d4beb91c4-kube-api-access-k4blc\") pod \"coredns-668d6bf9bc-snjjb\" (UID: \"895b74bc-3470-4c3c-b993-a72d4beb91c4\") " pod="kube-system/coredns-668d6bf9bc-snjjb" Jan 30 19:15:06.713484 kubelet[2677]: I0130 19:15:06.711686 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvnr9\" (UniqueName: \"kubernetes.io/projected/dcf02077-f75c-473d-88b8-2156144d8423-kube-api-access-tvnr9\") pod \"calico-apiserver-86bd64cf58-mfbfs\" (UID: \"dcf02077-f75c-473d-88b8-2156144d8423\") " pod="calico-apiserver/calico-apiserver-86bd64cf58-mfbfs" Jan 30 19:15:06.713484 kubelet[2677]: I0130 19:15:06.711716 2677 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2c2\" (UniqueName: \"kubernetes.io/projected/a19cd91b-5cda-4b1f-99ed-5c7be5d66c14-kube-api-access-2x2c2\") pod \"calico-apiserver-86bd64cf58-cm96v\" (UID: \"a19cd91b-5cda-4b1f-99ed-5c7be5d66c14\") " pod="calico-apiserver/calico-apiserver-86bd64cf58-cm96v" Jan 30 19:15:06.723202 systemd[1]: Created slice kubepods-besteffort-poddcf02077_f75c_473d_88b8_2156144d8423.slice - libcontainer container kubepods-besteffort-poddcf02077_f75c_473d_88b8_2156144d8423.slice. 
Jan 30 19:15:06.976351 containerd[1495]: time="2025-01-30T19:15:06.976258788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55fd5b7757-8rgqt,Uid:310a4559-09ca-47db-a381-43206f870195,Namespace:calico-system,Attempt:0,}" Jan 30 19:15:06.989871 containerd[1495]: time="2025-01-30T19:15:06.989338664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ct7bq,Uid:e5cdbf5e-4fb8-4a95-8254-b6bc2709291a,Namespace:kube-system,Attempt:0,}" Jan 30 19:15:07.007520 containerd[1495]: time="2025-01-30T19:15:07.007158116Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86bd64cf58-cm96v,Uid:a19cd91b-5cda-4b1f-99ed-5c7be5d66c14,Namespace:calico-apiserver,Attempt:0,}" Jan 30 19:15:07.039473 containerd[1495]: time="2025-01-30T19:15:07.039106191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86bd64cf58-mfbfs,Uid:dcf02077-f75c-473d-88b8-2156144d8423,Namespace:calico-apiserver,Attempt:0,}" Jan 30 19:15:07.040988 containerd[1495]: time="2025-01-30T19:15:07.039457535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-snjjb,Uid:895b74bc-3470-4c3c-b993-a72d4beb91c4,Namespace:kube-system,Attempt:0,}" Jan 30 19:15:07.202027 kubelet[2677]: I0130 19:15:07.201297 2677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 19:15:07.439597 containerd[1495]: time="2025-01-30T19:15:07.438640927Z" level=error msg="Failed to destroy network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.441545 containerd[1495]: time="2025-01-30T19:15:07.440696784Z" level=error msg="Failed to destroy network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.441545 containerd[1495]: time="2025-01-30T19:15:07.441001634Z" level=error msg="Failed to destroy network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.442059 containerd[1495]: time="2025-01-30T19:15:07.441726791Z" level=error msg="Failed to destroy network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.446106 containerd[1495]: time="2025-01-30T19:15:07.446043138Z" level=error msg="encountered an error cleaning up failed sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.446330 containerd[1495]: time="2025-01-30T19:15:07.446276793Z" level=error msg="encountered an error cleaning up failed sandbox 
\"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.446420 containerd[1495]: time="2025-01-30T19:15:07.446357843Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ct7bq,Uid:e5cdbf5e-4fb8-4a95-8254-b6bc2709291a,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.446760 containerd[1495]: time="2025-01-30T19:15:07.446600159Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86bd64cf58-mfbfs,Uid:dcf02077-f75c-473d-88b8-2156144d8423,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.455965 kubelet[2677]: E0130 19:15:07.454922 2677 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.455965 kubelet[2677]: E0130 19:15:07.454941 2677 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.455965 kubelet[2677]: E0130 19:15:07.455041 2677 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86bd64cf58-mfbfs" Jan 30 19:15:07.455965 kubelet[2677]: E0130 19:15:07.455041 2677 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ct7bq" Jan 30 19:15:07.456723 kubelet[2677]: E0130 19:15:07.455089 2677 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86bd64cf58-mfbfs" Jan 30 19:15:07.456723 kubelet[2677]: E0130 19:15:07.455093 2677 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ct7bq" Jan 30 19:15:07.456723 kubelet[2677]: E0130 19:15:07.455197 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-86bd64cf58-mfbfs_calico-apiserver(dcf02077-f75c-473d-88b8-2156144d8423)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-86bd64cf58-mfbfs_calico-apiserver(dcf02077-f75c-473d-88b8-2156144d8423)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86bd64cf58-mfbfs" podUID="dcf02077-f75c-473d-88b8-2156144d8423" Jan 30 19:15:07.458220 kubelet[2677]: E0130 19:15:07.455263 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-ct7bq_kube-system(e5cdbf5e-4fb8-4a95-8254-b6bc2709291a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-ct7bq_kube-system(e5cdbf5e-4fb8-4a95-8254-b6bc2709291a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-ct7bq" podUID="e5cdbf5e-4fb8-4a95-8254-b6bc2709291a" Jan 30 19:15:07.458220 kubelet[2677]: E0130 19:15:07.457973 2677 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.458220 kubelet[2677]: E0130 19:15:07.458039 2677 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-snjjb" Jan 30 19:15:07.458408 containerd[1495]: time="2025-01-30T19:15:07.457342125Z" level=error msg="encountered an error cleaning up failed sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\", marking sandbox state as SANDBOX_UNKNOWN" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.458408 containerd[1495]: time="2025-01-30T19:15:07.457465217Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-snjjb,Uid:895b74bc-3470-4c3c-b993-a72d4beb91c4,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.458408 containerd[1495]: time="2025-01-30T19:15:07.457701384Z" level=error msg="Failed to destroy network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.460736 kubelet[2677]: E0130 19:15:07.458066 2677 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-snjjb" Jan 30 19:15:07.460736 kubelet[2677]: E0130 19:15:07.458108 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-snjjb_kube-system(895b74bc-3470-4c3c-b993-a72d4beb91c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-snjjb_kube-system(895b74bc-3470-4c3c-b993-a72d4beb91c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-snjjb" podUID="895b74bc-3470-4c3c-b993-a72d4beb91c4" Jan 30 19:15:07.460736 kubelet[2677]: E0130 19:15:07.458661 2677 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.460994 containerd[1495]: time="2025-01-30T19:15:07.458462186Z" level=error msg="encountered an error cleaning up failed sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.460994 containerd[1495]: time="2025-01-30T19:15:07.458522585Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86bd64cf58-cm96v,Uid:a19cd91b-5cda-4b1f-99ed-5c7be5d66c14,Namespace:calico-apiserver,Attempt:0,} 
failed, error" error="failed to setup network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.460994 containerd[1495]: time="2025-01-30T19:15:07.446055767Z" level=error msg="encountered an error cleaning up failed sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.460994 containerd[1495]: time="2025-01-30T19:15:07.459071889Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55fd5b7757-8rgqt,Uid:310a4559-09ca-47db-a381-43206f870195,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.461641 kubelet[2677]: E0130 19:15:07.458723 2677 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86bd64cf58-cm96v" Jan 30 19:15:07.461641 kubelet[2677]: E0130 19:15:07.458816 2677 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86bd64cf58-cm96v" Jan 30 19:15:07.461641 kubelet[2677]: E0130 19:15:07.458888 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-86bd64cf58-cm96v_calico-apiserver(a19cd91b-5cda-4b1f-99ed-5c7be5d66c14)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-86bd64cf58-cm96v_calico-apiserver(a19cd91b-5cda-4b1f-99ed-5c7be5d66c14)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86bd64cf58-cm96v" podUID="a19cd91b-5cda-4b1f-99ed-5c7be5d66c14" Jan 30 19:15:07.461823 kubelet[2677]: E0130 19:15:07.459327 2677 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.461823 kubelet[2677]: E0130 19:15:07.459366 2677 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-55fd5b7757-8rgqt" Jan 30 19:15:07.461823 kubelet[2677]: E0130 19:15:07.459390 2677 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-55fd5b7757-8rgqt" Jan 30 19:15:07.462330 kubelet[2677]: E0130 19:15:07.459454 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-55fd5b7757-8rgqt_calico-system(310a4559-09ca-47db-a381-43206f870195)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-55fd5b7757-8rgqt_calico-system(310a4559-09ca-47db-a381-43206f870195)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-55fd5b7757-8rgqt" podUID="310a4559-09ca-47db-a381-43206f870195" Jan 30 19:15:07.585121 containerd[1495]: time="2025-01-30T19:15:07.585053239Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 30 19:15:07.590616 kubelet[2677]: I0130 19:15:07.588443 2677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:07.606853 containerd[1495]: time="2025-01-30T19:15:07.606775198Z" level=info msg="StopPodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\"" Jan 30 19:15:07.612966 kubelet[2677]: I0130 19:15:07.612921 2677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:07.614654 containerd[1495]: time="2025-01-30T19:15:07.614614322Z" level=info msg="Ensure that sandbox 85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38 in task-service has been cleanup successfully" Jan 30 19:15:07.617485 containerd[1495]: time="2025-01-30T19:15:07.617401587Z" level=info msg="StopPodSandbox for \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\"" Jan 30 19:15:07.619184 kubelet[2677]: I0130 19:15:07.619142 2677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:07.619497 containerd[1495]: time="2025-01-30T19:15:07.619462074Z" level=info msg="Ensure that sandbox e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c in task-service has been cleanup successfully" Jan 30 
19:15:07.622460 containerd[1495]: time="2025-01-30T19:15:07.622195745Z" level=info msg="StopPodSandbox for \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\"" Jan 30 19:15:07.626462 containerd[1495]: time="2025-01-30T19:15:07.626417028Z" level=info msg="Ensure that sandbox 8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8 in task-service has been cleanup successfully" Jan 30 19:15:07.636633 kubelet[2677]: I0130 19:15:07.635916 2677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:07.637033 containerd[1495]: time="2025-01-30T19:15:07.636985886Z" level=info msg="StopPodSandbox for \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\"" Jan 30 19:15:07.637287 containerd[1495]: time="2025-01-30T19:15:07.637252250Z" level=info msg="Ensure that sandbox 2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c in task-service has been cleanup successfully" Jan 30 19:15:07.646028 kubelet[2677]: I0130 19:15:07.645972 2677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:07.647358 containerd[1495]: time="2025-01-30T19:15:07.646691666Z" level=info msg="StopPodSandbox for \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\"" Jan 30 19:15:07.647358 containerd[1495]: time="2025-01-30T19:15:07.646976557Z" level=info msg="Ensure that sandbox db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b in task-service has been cleanup successfully" Jan 30 19:15:07.735768 containerd[1495]: time="2025-01-30T19:15:07.735598736Z" level=error msg="StopPodSandbox for \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\" failed" error="failed to destroy network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.736661 kubelet[2677]: E0130 19:15:07.736434 2677 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:07.737022 kubelet[2677]: E0130 19:15:07.736816 2677 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8"} Jan 30 19:15:07.737621 kubelet[2677]: E0130 19:15:07.737350 2677 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"dcf02077-f75c-473d-88b8-2156144d8423\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 30 19:15:07.737621 kubelet[2677]: E0130 19:15:07.737500 2677 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"dcf02077-f75c-473d-88b8-2156144d8423\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86bd64cf58-mfbfs" podUID="dcf02077-f75c-473d-88b8-2156144d8423" Jan 30 19:15:07.740483 containerd[1495]: time="2025-01-30T19:15:07.740040094Z" level=error msg="StopPodSandbox for \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\" failed" error="failed to destroy network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.740590 kubelet[2677]: E0130 19:15:07.740325 2677 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:07.740590 kubelet[2677]: E0130 19:15:07.740369 2677 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c"} Jan 30 19:15:07.740590 kubelet[2677]: E0130 19:15:07.740407 2677 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"a19cd91b-5cda-4b1f-99ed-5c7be5d66c14\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 30 19:15:07.740590 kubelet[2677]: E0130 19:15:07.740440 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"a19cd91b-5cda-4b1f-99ed-5c7be5d66c14\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86bd64cf58-cm96v" podUID="a19cd91b-5cda-4b1f-99ed-5c7be5d66c14" Jan 30 19:15:07.755721 containerd[1495]: time="2025-01-30T19:15:07.755100904Z" level=error msg="StopPodSandbox for \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\" failed" error="failed to destroy network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.755924 kubelet[2677]: E0130 
19:15:07.755508 2677 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:07.755924 kubelet[2677]: E0130 19:15:07.755576 2677 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c"} Jan 30 19:15:07.755924 kubelet[2677]: E0130 19:15:07.755629 2677 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"895b74bc-3470-4c3c-b993-a72d4beb91c4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 30 19:15:07.755924 kubelet[2677]: E0130 19:15:07.755664 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"895b74bc-3470-4c3c-b993-a72d4beb91c4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-snjjb" podUID="895b74bc-3470-4c3c-b993-a72d4beb91c4" Jan 30 19:15:07.757222 containerd[1495]: time="2025-01-30T19:15:07.756201638Z" level=error msg="StopPodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\" failed" error="failed to destroy network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.757642 kubelet[2677]: E0130 19:15:07.757393 2677 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:07.757642 kubelet[2677]: E0130 19:15:07.757494 2677 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38"} Jan 30 19:15:07.757642 kubelet[2677]: E0130 19:15:07.757552 2677 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 30 19:15:07.757642 kubelet[2677]: E0130 19:15:07.757603 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-ct7bq" podUID="e5cdbf5e-4fb8-4a95-8254-b6bc2709291a" Jan 30 19:15:07.775150 containerd[1495]: time="2025-01-30T19:15:07.775067407Z" level=error msg="StopPodSandbox for \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\" failed" error="failed to destroy network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:07.775761 kubelet[2677]: E0130 19:15:07.775562 2677 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:07.775761 kubelet[2677]: E0130 19:15:07.775627 2677 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b"} Jan 30 19:15:07.775761 kubelet[2677]: E0130 19:15:07.775675 2677 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"310a4559-09ca-47db-a381-43206f870195\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 30 19:15:07.775761 kubelet[2677]: E0130 19:15:07.775710 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"310a4559-09ca-47db-a381-43206f870195\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-55fd5b7757-8rgqt" podUID="310a4559-09ca-47db-a381-43206f870195" Jan 30 19:15:08.279401 systemd[1]: Created slice kubepods-besteffort-podeaeacb08_27c1_40ef_baaf_66029c9f99c5.slice - libcontainer container kubepods-besteffort-podeaeacb08_27c1_40ef_baaf_66029c9f99c5.slice. 
Jan 30 19:15:08.290740 containerd[1495]: time="2025-01-30T19:15:08.290431544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwmcj,Uid:eaeacb08-27c1-40ef-baaf-66029c9f99c5,Namespace:calico-system,Attempt:0,}" Jan 30 19:15:08.401232 containerd[1495]: time="2025-01-30T19:15:08.401110095Z" level=error msg="Failed to destroy network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:08.403431 containerd[1495]: time="2025-01-30T19:15:08.403364463Z" level=error msg="encountered an error cleaning up failed sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:08.403513 containerd[1495]: time="2025-01-30T19:15:08.403487414Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwmcj,Uid:eaeacb08-27c1-40ef-baaf-66029c9f99c5,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:08.404764 kubelet[2677]: E0130 19:15:08.403904 2677 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:08.404982 kubelet[2677]: E0130 19:15:08.403998 2677 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xwmcj" Jan 30 19:15:08.404982 kubelet[2677]: E0130 19:15:08.404933 2677 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xwmcj" Jan 30 19:15:08.405409 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc-shm.mount: Deactivated successfully. 
Jan 30 19:15:08.406354 kubelet[2677]: E0130 19:15:08.406124 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-xwmcj_calico-system(eaeacb08-27c1-40ef-baaf-66029c9f99c5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-xwmcj_calico-system(eaeacb08-27c1-40ef-baaf-66029c9f99c5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5" Jan 30 19:15:08.651418 kubelet[2677]: I0130 19:15:08.651021 2677 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:08.654290 containerd[1495]: time="2025-01-30T19:15:08.652160025Z" level=info msg="StopPodSandbox for \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\"" Jan 30 19:15:08.654290 containerd[1495]: time="2025-01-30T19:15:08.652410271Z" level=info msg="Ensure that sandbox 2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc in task-service has been cleanup successfully" Jan 30 19:15:08.697214 containerd[1495]: time="2025-01-30T19:15:08.697114369Z" level=error msg="StopPodSandbox for \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\" failed" error="failed to destroy network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:08.697854 kubelet[2677]: E0130 19:15:08.697613 2677 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:08.697854 kubelet[2677]: E0130 19:15:08.697695 2677 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc"} Jan 30 19:15:08.697854 kubelet[2677]: E0130 19:15:08.697748 2677 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"eaeacb08-27c1-40ef-baaf-66029c9f99c5\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 30 19:15:08.697854 kubelet[2677]: E0130 19:15:08.697784 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"eaeacb08-27c1-40ef-baaf-66029c9f99c5\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xwmcj" podUID="eaeacb08-27c1-40ef-baaf-66029c9f99c5" Jan 30 19:15:17.575970 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3423231648.mount: Deactivated successfully. Jan 30 19:15:17.735939 containerd[1495]: time="2025-01-30T19:15:17.722861754Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 30 19:15:17.748919 containerd[1495]: time="2025-01-30T19:15:17.747951247Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:17.787787 containerd[1495]: time="2025-01-30T19:15:17.787694733Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:17.788676 containerd[1495]: time="2025-01-30T19:15:17.788617233Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:17.792533 containerd[1495]: time="2025-01-30T19:15:17.792302778Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 10.202436844s" Jan 30 19:15:17.792533 containerd[1495]: time="2025-01-30T19:15:17.792364612Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 30 19:15:17.840095 containerd[1495]: time="2025-01-30T19:15:17.839332335Z" level=info msg="CreateContainer within sandbox \"4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 30 19:15:17.927472 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4226132981.mount: Deactivated successfully. Jan 30 19:15:17.939692 containerd[1495]: time="2025-01-30T19:15:17.939619321Z" level=info msg="CreateContainer within sandbox \"4fbad2d10240e845d0b202bf0e33e6fb60297c1a42708bdb78bb79680a94d069\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"d843e4067bfb307c87fe7c6e67fb59381c720c98994c512e5054f90c2246c25f\"" Jan 30 19:15:17.946070 containerd[1495]: time="2025-01-30T19:15:17.945686149Z" level=info msg="StartContainer for \"d843e4067bfb307c87fe7c6e67fb59381c720c98994c512e5054f90c2246c25f\"" Jan 30 19:15:18.173186 systemd[1]: Started cri-containerd-d843e4067bfb307c87fe7c6e67fb59381c720c98994c512e5054f90c2246c25f.scope - libcontainer container d843e4067bfb307c87fe7c6e67fb59381c720c98994c512e5054f90c2246c25f. 
Jan 30 19:15:18.248250 containerd[1495]: time="2025-01-30T19:15:18.247636988Z" level=info msg="StartContainer for \"d843e4067bfb307c87fe7c6e67fb59381c720c98994c512e5054f90c2246c25f\" returns successfully" Jan 30 19:15:18.271392 containerd[1495]: time="2025-01-30T19:15:18.271121711Z" level=info msg="StopPodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\"" Jan 30 19:15:18.386733 containerd[1495]: time="2025-01-30T19:15:18.386605481Z" level=error msg="StopPodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\" failed" error="failed to destroy network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 30 19:15:18.413260 kubelet[2677]: E0130 19:15:18.412985 2677 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:18.413260 kubelet[2677]: E0130 19:15:18.413097 2677 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38"} Jan 30 19:15:18.413260 kubelet[2677]: E0130 19:15:18.413155 2677 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 30 19:15:18.413260 kubelet[2677]: E0130 19:15:18.413205 2677 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-ct7bq" podUID="e5cdbf5e-4fb8-4a95-8254-b6bc2709291a" Jan 30 19:15:18.522390 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 30 19:15:18.523541 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
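The wireguard module load at the end of the stretch above is presumably calico-node probing kernel support for Calico's optional WireGuard encryption as it initializes; nothing in the log shows encryption actually being enabled, so that explanation is an inference. A trivial check for the same kernel state (the sysfs path is the standard location for a loaded module):

package main

import (
    "fmt"
    "os"
)

func main() {
    // A loaded kernel module shows up under /sys/module/<name>.
    if _, err := os.Stat("/sys/module/wireguard"); err == nil {
        fmt.Println("wireguard module is loaded")
    } else {
        fmt.Println("wireguard module not loaded:", err)
    }
}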
Jan 30 19:15:19.263868 containerd[1495]: time="2025-01-30T19:15:19.263220796Z" level=info msg="StopPodSandbox for \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\"" Jan 30 19:15:19.373120 kubelet[2677]: I0130 19:15:19.366816 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-w4qr5" podStartSLOduration=2.21570051 podStartE2EDuration="27.363768487s" podCreationTimestamp="2025-01-30 19:14:52 +0000 UTC" firstStartedPulling="2025-01-30 19:14:52.645360492 +0000 UTC m=+15.526584328" lastFinishedPulling="2025-01-30 19:15:17.793428463 +0000 UTC m=+40.674652305" observedRunningTime="2025-01-30 19:15:18.732328959 +0000 UTC m=+41.613552817" watchObservedRunningTime="2025-01-30 19:15:19.363768487 +0000 UTC m=+42.244992335" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.365 [INFO][3888] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.366 [INFO][3888] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" iface="eth0" netns="/var/run/netns/cni-69accdbb-82b2-2e77-4a5b-3d02eb2f2d87" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.367 [INFO][3888] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" iface="eth0" netns="/var/run/netns/cni-69accdbb-82b2-2e77-4a5b-3d02eb2f2d87" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.369 [INFO][3888] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" iface="eth0" netns="/var/run/netns/cni-69accdbb-82b2-2e77-4a5b-3d02eb2f2d87" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.369 [INFO][3888] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.369 [INFO][3888] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.577 [INFO][3900] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.580 [INFO][3900] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.580 [INFO][3900] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.598 [WARNING][3900] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.598 [INFO][3900] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.601 [INFO][3900] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:19.606033 containerd[1495]: 2025-01-30 19:15:19.603 [INFO][3888] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:19.611177 containerd[1495]: time="2025-01-30T19:15:19.606643424Z" level=info msg="TearDown network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\" successfully" Jan 30 19:15:19.611177 containerd[1495]: time="2025-01-30T19:15:19.606683751Z" level=info msg="StopPodSandbox for \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\" returns successfully" Jan 30 19:15:19.611177 containerd[1495]: time="2025-01-30T19:15:19.609938599Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86bd64cf58-mfbfs,Uid:dcf02077-f75c-473d-88b8-2156144d8423,Namespace:calico-apiserver,Attempt:1,}" Jan 30 19:15:19.610684 systemd[1]: run-netns-cni\x2d69accdbb\x2d82b2\x2d2e77\x2d4a5b\x2d3d02eb2f2d87.mount: Deactivated successfully. Jan 30 19:15:19.745669 systemd[1]: run-containerd-runc-k8s.io-d843e4067bfb307c87fe7c6e67fb59381c720c98994c512e5054f90c2246c25f-runc.eAtvfR.mount: Deactivated successfully. 
Jan 30 19:15:19.906813 systemd-networkd[1437]: calic880996c1cf: Link UP Jan 30 19:15:19.921225 systemd-networkd[1437]: calic880996c1cf: Gained carrier Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.683 [INFO][3907] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.705 [INFO][3907] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0 calico-apiserver-86bd64cf58- calico-apiserver dcf02077-f75c-473d-88b8-2156144d8423 745 0 2025-01-30 19:14:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:86bd64cf58 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-ehdo1.gb1.brightbox.com calico-apiserver-86bd64cf58-mfbfs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic880996c1cf [] []}} ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-mfbfs" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.705 [INFO][3907] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-mfbfs" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.791 [INFO][3928] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" HandleID="k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.813 [INFO][3928] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" HandleID="k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00047e3f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-ehdo1.gb1.brightbox.com", "pod":"calico-apiserver-86bd64cf58-mfbfs", "timestamp":"2025-01-30 19:15:19.79150827 +0000 UTC"}, Hostname:"srv-ehdo1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.813 [INFO][3928] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.813 [INFO][3928] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.813 [INFO][3928] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-ehdo1.gb1.brightbox.com' Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.818 [INFO][3928] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.828 [INFO][3928] ipam/ipam.go 372: Looking up existing affinities for host host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.840 [INFO][3928] ipam/ipam.go 489: Trying affinity for 192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.845 [INFO][3928] ipam/ipam.go 155: Attempting to load block cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.852 [INFO][3928] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.853 [INFO][3928] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.47.64/26 handle="k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.857 [INFO][3928] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1 Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.871 [INFO][3928] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.47.64/26 handle="k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.880 [INFO][3928] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.47.65/26] block=192.168.47.64/26 handle="k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.880 [INFO][3928] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.47.65/26] handle="k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.880 [INFO][3928] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 30 19:15:19.944082 containerd[1495]: 2025-01-30 19:15:19.880 [INFO][3928] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.47.65/26] IPv6=[] ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" HandleID="k8s-pod-network.d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.950363 containerd[1495]: 2025-01-30 19:15:19.884 [INFO][3907] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-mfbfs" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0", GenerateName:"calico-apiserver-86bd64cf58-", Namespace:"calico-apiserver", SelfLink:"", UID:"dcf02077-f75c-473d-88b8-2156144d8423", ResourceVersion:"745", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86bd64cf58", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-86bd64cf58-mfbfs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.47.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic880996c1cf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:19.950363 containerd[1495]: 2025-01-30 19:15:19.884 [INFO][3907] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.47.65/32] ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-mfbfs" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.950363 containerd[1495]: 2025-01-30 19:15:19.885 [INFO][3907] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic880996c1cf ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-mfbfs" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.950363 containerd[1495]: 2025-01-30 19:15:19.909 [INFO][3907] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-mfbfs" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.950363 containerd[1495]: 2025-01-30 19:15:19.911 [INFO][3907] cni-plugin/k8s.go 414: 
Added Mac, interface name, and active container ID to endpoint ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-mfbfs" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0", GenerateName:"calico-apiserver-86bd64cf58-", Namespace:"calico-apiserver", SelfLink:"", UID:"dcf02077-f75c-473d-88b8-2156144d8423", ResourceVersion:"745", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86bd64cf58", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1", Pod:"calico-apiserver-86bd64cf58-mfbfs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.47.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic880996c1cf", MAC:"4a:63:53:22:57:cb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:19.950363 containerd[1495]: 2025-01-30 19:15:19.937 [INFO][3907] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-mfbfs" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:19.999314 containerd[1495]: time="2025-01-30T19:15:19.998382047Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:15:19.999813 containerd[1495]: time="2025-01-30T19:15:19.999751769Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:15:20.000057 containerd[1495]: time="2025-01-30T19:15:19.999944964Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:20.003037 containerd[1495]: time="2025-01-30T19:15:20.000711991Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:20.031258 systemd[1]: Started cri-containerd-d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1.scope - libcontainer container d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1. 
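The IPAM trace above shows the standard Calico flow: take the host-wide lock, find this host's affine block (192.168.47.64/26), and hand out the first free address — .65 here, and .66 for the next pod below. A toy bitmap allocator over that same /26, just to make the sequence concrete; Calico's real IPAM is transactional against its datastore, and treating the block's 0th address as already taken is an assumption made only to match the observed sequence:

package main

import (
    "fmt"
    "net/netip"
)

func main() {
    block := netip.MustParsePrefix("192.168.47.64/26") // this host's affine block, per the log
    used := map[netip.Addr]bool{block.Addr(): true}    // assumption: .64 itself is never handed out

    alloc := func() netip.Addr {
        for a := block.Addr(); block.Contains(a); a = a.Next() {
            if !used[a] {
                used[a] = true
                return a
            }
        }
        panic("block exhausted")
    }

    fmt.Println(alloc()) // 192.168.47.65 — calico-apiserver-86bd64cf58-mfbfs
    fmt.Println(alloc()) // 192.168.47.66 — calico-kube-controllers-55fd5b7757-8rgqt
}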
Jan 30 19:15:20.101561 containerd[1495]: time="2025-01-30T19:15:20.101304310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86bd64cf58-mfbfs,Uid:dcf02077-f75c-473d-88b8-2156144d8423,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1\"" Jan 30 19:15:20.105547 containerd[1495]: time="2025-01-30T19:15:20.104962154Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 30 19:15:20.262675 containerd[1495]: time="2025-01-30T19:15:20.262611329Z" level=info msg="StopPodSandbox for \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\"" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.335 [INFO][4010] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.337 [INFO][4010] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" iface="eth0" netns="/var/run/netns/cni-01bdee4f-6dac-f5fd-cab9-dc7d39fe7351" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.338 [INFO][4010] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" iface="eth0" netns="/var/run/netns/cni-01bdee4f-6dac-f5fd-cab9-dc7d39fe7351" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.339 [INFO][4010] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" iface="eth0" netns="/var/run/netns/cni-01bdee4f-6dac-f5fd-cab9-dc7d39fe7351" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.339 [INFO][4010] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.339 [INFO][4010] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.398 [INFO][4032] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.400 [INFO][4032] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.400 [INFO][4032] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.412 [WARNING][4032] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.412 [INFO][4032] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.415 [INFO][4032] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:20.423659 containerd[1495]: 2025-01-30 19:15:20.419 [INFO][4010] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:20.424789 containerd[1495]: time="2025-01-30T19:15:20.424464565Z" level=info msg="TearDown network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\" successfully" Jan 30 19:15:20.424789 containerd[1495]: time="2025-01-30T19:15:20.424536473Z" level=info msg="StopPodSandbox for \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\" returns successfully" Jan 30 19:15:20.426428 containerd[1495]: time="2025-01-30T19:15:20.426385830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55fd5b7757-8rgqt,Uid:310a4559-09ca-47db-a381-43206f870195,Namespace:calico-system,Attempt:1,}" Jan 30 19:15:20.617798 systemd[1]: run-netns-cni\x2d01bdee4f\x2d6dac\x2df5fd\x2dcab9\x2ddc7d39fe7351.mount: Deactivated successfully. 
Jan 30 19:15:20.693336 systemd-networkd[1437]: cali8d5b74b40a9: Link UP Jan 30 19:15:20.696210 systemd-networkd[1437]: cali8d5b74b40a9: Gained carrier Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.498 [INFO][4076] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.530 [INFO][4076] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0 calico-kube-controllers-55fd5b7757- calico-system 310a4559-09ca-47db-a381-43206f870195 753 0 2025-01-30 19:14:52 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:55fd5b7757 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-ehdo1.gb1.brightbox.com calico-kube-controllers-55fd5b7757-8rgqt eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8d5b74b40a9 [] []}} ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Namespace="calico-system" Pod="calico-kube-controllers-55fd5b7757-8rgqt" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.530 [INFO][4076] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Namespace="calico-system" Pod="calico-kube-controllers-55fd5b7757-8rgqt" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.595 [INFO][4107] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" HandleID="k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.617 [INFO][4107] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" HandleID="k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00048f670), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-ehdo1.gb1.brightbox.com", "pod":"calico-kube-controllers-55fd5b7757-8rgqt", "timestamp":"2025-01-30 19:15:20.595111656 +0000 UTC"}, Hostname:"srv-ehdo1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.617 [INFO][4107] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.617 [INFO][4107] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.617 [INFO][4107] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-ehdo1.gb1.brightbox.com' Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.629 [INFO][4107] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.641 [INFO][4107] ipam/ipam.go 372: Looking up existing affinities for host host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.650 [INFO][4107] ipam/ipam.go 489: Trying affinity for 192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.655 [INFO][4107] ipam/ipam.go 155: Attempting to load block cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.658 [INFO][4107] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.658 [INFO][4107] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.47.64/26 handle="k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.661 [INFO][4107] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.671 [INFO][4107] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.47.64/26 handle="k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.680 [INFO][4107] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.47.66/26] block=192.168.47.64/26 handle="k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.681 [INFO][4107] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.47.66/26] handle="k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.681 [INFO][4107] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 30 19:15:20.735592 containerd[1495]: 2025-01-30 19:15:20.681 [INFO][4107] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.47.66/26] IPv6=[] ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" HandleID="k8s-pod-network.cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.737435 containerd[1495]: 2025-01-30 19:15:20.685 [INFO][4076] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Namespace="calico-system" Pod="calico-kube-controllers-55fd5b7757-8rgqt" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0", GenerateName:"calico-kube-controllers-55fd5b7757-", Namespace:"calico-system", SelfLink:"", UID:"310a4559-09ca-47db-a381-43206f870195", ResourceVersion:"753", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55fd5b7757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-55fd5b7757-8rgqt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.47.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8d5b74b40a9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:20.737435 containerd[1495]: 2025-01-30 19:15:20.685 [INFO][4076] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.47.66/32] ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Namespace="calico-system" Pod="calico-kube-controllers-55fd5b7757-8rgqt" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.737435 containerd[1495]: 2025-01-30 19:15:20.685 [INFO][4076] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8d5b74b40a9 ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Namespace="calico-system" Pod="calico-kube-controllers-55fd5b7757-8rgqt" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.737435 containerd[1495]: 2025-01-30 19:15:20.699 [INFO][4076] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Namespace="calico-system" Pod="calico-kube-controllers-55fd5b7757-8rgqt" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 
19:15:20.737435 containerd[1495]: 2025-01-30 19:15:20.701 [INFO][4076] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Namespace="calico-system" Pod="calico-kube-controllers-55fd5b7757-8rgqt" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0", GenerateName:"calico-kube-controllers-55fd5b7757-", Namespace:"calico-system", SelfLink:"", UID:"310a4559-09ca-47db-a381-43206f870195", ResourceVersion:"753", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55fd5b7757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc", Pod:"calico-kube-controllers-55fd5b7757-8rgqt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.47.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8d5b74b40a9", MAC:"2a:2c:a6:16:20:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:20.737435 containerd[1495]: 2025-01-30 19:15:20.733 [INFO][4076] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc" Namespace="calico-system" Pod="calico-kube-controllers-55fd5b7757-8rgqt" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:20.822746 containerd[1495]: time="2025-01-30T19:15:20.820947457Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:15:20.822746 containerd[1495]: time="2025-01-30T19:15:20.821038586Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:15:20.822746 containerd[1495]: time="2025-01-30T19:15:20.821074169Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:20.822746 containerd[1495]: time="2025-01-30T19:15:20.821283355Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:20.888009 systemd[1]: Started cri-containerd-cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc.scope - libcontainer container cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc. 
Jan 30 19:15:21.108996 containerd[1495]: time="2025-01-30T19:15:21.108906367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55fd5b7757-8rgqt,Uid:310a4559-09ca-47db-a381-43206f870195,Namespace:calico-system,Attempt:1,} returns sandbox id \"cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc\"" Jan 30 19:15:21.308602 kernel: bpftool[4207]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 30 19:15:21.667490 systemd-networkd[1437]: vxlan.calico: Link UP Jan 30 19:15:21.667529 systemd-networkd[1437]: vxlan.calico: Gained carrier Jan 30 19:15:21.680517 systemd-networkd[1437]: calic880996c1cf: Gained IPv6LL Jan 30 19:15:21.937077 systemd-networkd[1437]: cali8d5b74b40a9: Gained IPv6LL Jan 30 19:15:22.266503 containerd[1495]: time="2025-01-30T19:15:22.265923941Z" level=info msg="StopPodSandbox for \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\"" Jan 30 19:15:22.266503 containerd[1495]: time="2025-01-30T19:15:22.265924109Z" level=info msg="StopPodSandbox for \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\"" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.373 [INFO][4319] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.373 [INFO][4319] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" iface="eth0" netns="/var/run/netns/cni-fa66e268-202e-3466-320a-429f7ea23865" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.375 [INFO][4319] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" iface="eth0" netns="/var/run/netns/cni-fa66e268-202e-3466-320a-429f7ea23865" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.377 [INFO][4319] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" iface="eth0" netns="/var/run/netns/cni-fa66e268-202e-3466-320a-429f7ea23865" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.377 [INFO][4319] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.377 [INFO][4319] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.465 [INFO][4334] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.465 [INFO][4334] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.465 [INFO][4334] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.484 [WARNING][4334] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.484 [INFO][4334] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.487 [INFO][4334] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:22.497857 containerd[1495]: 2025-01-30 19:15:22.494 [INFO][4319] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:22.501852 containerd[1495]: time="2025-01-30T19:15:22.498727571Z" level=info msg="TearDown network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\" successfully" Jan 30 19:15:22.501852 containerd[1495]: time="2025-01-30T19:15:22.498773691Z" level=info msg="StopPodSandbox for \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\" returns successfully" Jan 30 19:15:22.501852 containerd[1495]: time="2025-01-30T19:15:22.499737979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-snjjb,Uid:895b74bc-3470-4c3c-b993-a72d4beb91c4,Namespace:kube-system,Attempt:1,}" Jan 30 19:15:22.500962 systemd[1]: run-netns-cni\x2dfa66e268\x2d202e\x2d3466\x2d320a\x2d429f7ea23865.mount: Deactivated successfully. Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.398 [INFO][4326] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.398 [INFO][4326] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" iface="eth0" netns="/var/run/netns/cni-26833097-a81e-6eb7-0d8e-8dc5d290fc65" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.398 [INFO][4326] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" iface="eth0" netns="/var/run/netns/cni-26833097-a81e-6eb7-0d8e-8dc5d290fc65" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.399 [INFO][4326] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" iface="eth0" netns="/var/run/netns/cni-26833097-a81e-6eb7-0d8e-8dc5d290fc65" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.400 [INFO][4326] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.400 [INFO][4326] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.490 [INFO][4338] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.492 [INFO][4338] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.492 [INFO][4338] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.517 [WARNING][4338] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.518 [INFO][4338] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.525 [INFO][4338] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:22.545210 containerd[1495]: 2025-01-30 19:15:22.539 [INFO][4326] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:22.548302 containerd[1495]: time="2025-01-30T19:15:22.545815837Z" level=info msg="TearDown network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\" successfully" Jan 30 19:15:22.548302 containerd[1495]: time="2025-01-30T19:15:22.545871081Z" level=info msg="StopPodSandbox for \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\" returns successfully" Jan 30 19:15:22.558815 systemd[1]: run-netns-cni\x2d26833097\x2da81e\x2d6eb7\x2d0d8e\x2d8dc5d290fc65.mount: Deactivated successfully. 
Jan 30 19:15:22.561980 containerd[1495]: time="2025-01-30T19:15:22.561609284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86bd64cf58-cm96v,Uid:a19cd91b-5cda-4b1f-99ed-5c7be5d66c14,Namespace:calico-apiserver,Attempt:1,}" Jan 30 19:15:22.912606 systemd-networkd[1437]: cali7ca68738d10: Link UP Jan 30 19:15:22.916523 systemd-networkd[1437]: cali7ca68738d10: Gained carrier Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.701 [INFO][4349] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0 coredns-668d6bf9bc- kube-system 895b74bc-3470-4c3c-b993-a72d4beb91c4 763 0 2025-01-30 19:14:43 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-ehdo1.gb1.brightbox.com coredns-668d6bf9bc-snjjb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7ca68738d10 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-snjjb" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.702 [INFO][4349] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-snjjb" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.805 [INFO][4377] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" HandleID="k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.834 [INFO][4377] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" HandleID="k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003127f0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-ehdo1.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-snjjb", "timestamp":"2025-01-30 19:15:22.805388387 +0000 UTC"}, Hostname:"srv-ehdo1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.834 [INFO][4377] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.835 [INFO][4377] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.835 [INFO][4377] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-ehdo1.gb1.brightbox.com' Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.842 [INFO][4377] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.853 [INFO][4377] ipam/ipam.go 372: Looking up existing affinities for host host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.866 [INFO][4377] ipam/ipam.go 489: Trying affinity for 192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.869 [INFO][4377] ipam/ipam.go 155: Attempting to load block cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.875 [INFO][4377] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.875 [INFO][4377] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.47.64/26 handle="k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.878 [INFO][4377] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9 Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.887 [INFO][4377] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.47.64/26 handle="k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.899 [INFO][4377] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.47.67/26] block=192.168.47.64/26 handle="k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.899 [INFO][4377] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.47.67/26] handle="k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.899 [INFO][4377] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 30 19:15:22.952472 containerd[1495]: 2025-01-30 19:15:22.899 [INFO][4377] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.47.67/26] IPv6=[] ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" HandleID="k8s-pod-network.7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.954719 containerd[1495]: 2025-01-30 19:15:22.904 [INFO][4349] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-snjjb" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"895b74bc-3470-4c3c-b993-a72d4beb91c4", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-snjjb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.47.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7ca68738d10", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:22.954719 containerd[1495]: 2025-01-30 19:15:22.904 [INFO][4349] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.47.67/32] ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-snjjb" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.954719 containerd[1495]: 2025-01-30 19:15:22.904 [INFO][4349] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7ca68738d10 ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-snjjb" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.954719 containerd[1495]: 2025-01-30 19:15:22.918 [INFO][4349] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-snjjb" 
WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:22.954719 containerd[1495]: 2025-01-30 19:15:22.921 [INFO][4349] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-snjjb" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"895b74bc-3470-4c3c-b993-a72d4beb91c4", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9", Pod:"coredns-668d6bf9bc-snjjb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.47.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7ca68738d10", MAC:"4a:fa:24:8c:0a:5b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:22.954719 containerd[1495]: 2025-01-30 19:15:22.948 [INFO][4349] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-snjjb" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:23.023995 systemd-networkd[1437]: vxlan.calico: Gained IPv6LL Jan 30 19:15:23.109596 containerd[1495]: time="2025-01-30T19:15:23.099359257Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:15:23.109596 containerd[1495]: time="2025-01-30T19:15:23.099443480Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:15:23.109596 containerd[1495]: time="2025-01-30T19:15:23.099464117Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:23.109596 containerd[1495]: time="2025-01-30T19:15:23.099602036Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:23.138804 systemd-networkd[1437]: calidee7242abcd: Link UP Jan 30 19:15:23.139209 systemd-networkd[1437]: calidee7242abcd: Gained carrier Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.698 [INFO][4359] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0 calico-apiserver-86bd64cf58- calico-apiserver a19cd91b-5cda-4b1f-99ed-5c7be5d66c14 764 0 2025-01-30 19:14:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:86bd64cf58 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-ehdo1.gb1.brightbox.com calico-apiserver-86bd64cf58-cm96v eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidee7242abcd [] []}} ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-cm96v" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.698 [INFO][4359] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-cm96v" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.827 [INFO][4373] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" HandleID="k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.850 [INFO][4373] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" HandleID="k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005c44e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-ehdo1.gb1.brightbox.com", "pod":"calico-apiserver-86bd64cf58-cm96v", "timestamp":"2025-01-30 19:15:22.826565944 +0000 UTC"}, Hostname:"srv-ehdo1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.850 [INFO][4373] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.899 [INFO][4373] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.900 [INFO][4373] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-ehdo1.gb1.brightbox.com' Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.955 [INFO][4373] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:22.979 [INFO][4373] ipam/ipam.go 372: Looking up existing affinities for host host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.011 [INFO][4373] ipam/ipam.go 489: Trying affinity for 192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.022 [INFO][4373] ipam/ipam.go 155: Attempting to load block cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.044 [INFO][4373] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.044 [INFO][4373] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.47.64/26 handle="k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.051 [INFO][4373] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.089 [INFO][4373] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.47.64/26 handle="k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.118 [INFO][4373] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.47.68/26] block=192.168.47.64/26 handle="k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.120 [INFO][4373] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.47.68/26] handle="k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.120 [INFO][4373] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 30 19:15:23.168376 containerd[1495]: 2025-01-30 19:15:23.120 [INFO][4373] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.47.68/26] IPv6=[] ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" HandleID="k8s-pod-network.48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:23.170735 containerd[1495]: 2025-01-30 19:15:23.127 [INFO][4359] cni-plugin/k8s.go 386: Populated endpoint ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-cm96v" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0", GenerateName:"calico-apiserver-86bd64cf58-", Namespace:"calico-apiserver", SelfLink:"", UID:"a19cd91b-5cda-4b1f-99ed-5c7be5d66c14", ResourceVersion:"764", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86bd64cf58", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-86bd64cf58-cm96v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.47.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidee7242abcd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:23.170735 containerd[1495]: 2025-01-30 19:15:23.128 [INFO][4359] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.47.68/32] ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-cm96v" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:23.170735 containerd[1495]: 2025-01-30 19:15:23.128 [INFO][4359] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidee7242abcd ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-cm96v" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:23.170735 containerd[1495]: 2025-01-30 19:15:23.139 [INFO][4359] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-cm96v" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:23.170735 containerd[1495]: 2025-01-30 19:15:23.141 [INFO][4359] cni-plugin/k8s.go 414: 
Added Mac, interface name, and active container ID to endpoint ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-cm96v" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0", GenerateName:"calico-apiserver-86bd64cf58-", Namespace:"calico-apiserver", SelfLink:"", UID:"a19cd91b-5cda-4b1f-99ed-5c7be5d66c14", ResourceVersion:"764", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86bd64cf58", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a", Pod:"calico-apiserver-86bd64cf58-cm96v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.47.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidee7242abcd", MAC:"62:bf:b4:de:b8:aa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:23.170735 containerd[1495]: 2025-01-30 19:15:23.163 [INFO][4359] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a" Namespace="calico-apiserver" Pod="calico-apiserver-86bd64cf58-cm96v" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:23.187174 systemd[1]: Started cri-containerd-7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9.scope - libcontainer container 7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9. Jan 30 19:15:23.268224 containerd[1495]: time="2025-01-30T19:15:23.268076771Z" level=info msg="StopPodSandbox for \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\"" Jan 30 19:15:23.289500 containerd[1495]: time="2025-01-30T19:15:23.288646916Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:15:23.308348 containerd[1495]: time="2025-01-30T19:15:23.290583514Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:15:23.308348 containerd[1495]: time="2025-01-30T19:15:23.290629520Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:23.308348 containerd[1495]: time="2025-01-30T19:15:23.291004549Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:23.368100 systemd[1]: Started cri-containerd-48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a.scope - libcontainer container 48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a. Jan 30 19:15:23.389069 containerd[1495]: time="2025-01-30T19:15:23.389006503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-snjjb,Uid:895b74bc-3470-4c3c-b993-a72d4beb91c4,Namespace:kube-system,Attempt:1,} returns sandbox id \"7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9\"" Jan 30 19:15:23.398454 containerd[1495]: time="2025-01-30T19:15:23.398378793Z" level=info msg="CreateContainer within sandbox \"7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 30 19:15:23.426541 containerd[1495]: time="2025-01-30T19:15:23.426393403Z" level=info msg="CreateContainer within sandbox \"7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"28a5e5fd9823b378ca3b8f9ace2b8df74448e6eadcd25a714a2a0713928eeebb\"" Jan 30 19:15:23.430979 containerd[1495]: time="2025-01-30T19:15:23.430943643Z" level=info msg="StartContainer for \"28a5e5fd9823b378ca3b8f9ace2b8df74448e6eadcd25a714a2a0713928eeebb\"" Jan 30 19:15:23.553644 systemd[1]: Started cri-containerd-28a5e5fd9823b378ca3b8f9ace2b8df74448e6eadcd25a714a2a0713928eeebb.scope - libcontainer container 28a5e5fd9823b378ca3b8f9ace2b8df74448e6eadcd25a714a2a0713928eeebb. Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.487 [INFO][4484] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.490 [INFO][4484] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" iface="eth0" netns="/var/run/netns/cni-f95b43bc-2736-42ef-7e91-489ef7fd0ab9" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.491 [INFO][4484] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" iface="eth0" netns="/var/run/netns/cni-f95b43bc-2736-42ef-7e91-489ef7fd0ab9" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.491 [INFO][4484] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" iface="eth0" netns="/var/run/netns/cni-f95b43bc-2736-42ef-7e91-489ef7fd0ab9" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.491 [INFO][4484] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.491 [INFO][4484] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.613 [INFO][4518] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.614 [INFO][4518] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.615 [INFO][4518] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.640 [WARNING][4518] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.640 [INFO][4518] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.645 [INFO][4518] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:23.655467 containerd[1495]: 2025-01-30 19:15:23.652 [INFO][4484] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:23.660025 containerd[1495]: time="2025-01-30T19:15:23.659958530Z" level=info msg="TearDown network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\" successfully" Jan 30 19:15:23.660645 containerd[1495]: time="2025-01-30T19:15:23.660612837Z" level=info msg="StopPodSandbox for \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\" returns successfully" Jan 30 19:15:23.667294 systemd[1]: run-netns-cni\x2df95b43bc\x2d2736\x2d42ef\x2d7e91\x2d489ef7fd0ab9.mount: Deactivated successfully. 
Jan 30 19:15:23.668386 containerd[1495]: time="2025-01-30T19:15:23.668336780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwmcj,Uid:eaeacb08-27c1-40ef-baaf-66029c9f99c5,Namespace:calico-system,Attempt:1,}" Jan 30 19:15:23.723140 containerd[1495]: time="2025-01-30T19:15:23.722607873Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86bd64cf58-cm96v,Uid:a19cd91b-5cda-4b1f-99ed-5c7be5d66c14,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a\"" Jan 30 19:15:23.751544 containerd[1495]: time="2025-01-30T19:15:23.751444335Z" level=info msg="StartContainer for \"28a5e5fd9823b378ca3b8f9ace2b8df74448e6eadcd25a714a2a0713928eeebb\" returns successfully" Jan 30 19:15:23.870924 kubelet[2677]: I0130 19:15:23.870559 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-snjjb" podStartSLOduration=40.870504826 podStartE2EDuration="40.870504826s" podCreationTimestamp="2025-01-30 19:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 19:15:23.866882395 +0000 UTC m=+46.748106263" watchObservedRunningTime="2025-01-30 19:15:23.870504826 +0000 UTC m=+46.751728676" Jan 30 19:15:24.112492 systemd-networkd[1437]: cali7ca68738d10: Gained IPv6LL Jan 30 19:15:24.179960 systemd-networkd[1437]: calif99e5ec88ec: Link UP Jan 30 19:15:24.183580 systemd-networkd[1437]: calif99e5ec88ec: Gained carrier Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:23.856 [INFO][4555] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0 csi-node-driver- calico-system eaeacb08-27c1-40ef-baaf-66029c9f99c5 776 0 2025-01-30 19:14:52 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:84cddb44f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-ehdo1.gb1.brightbox.com csi-node-driver-xwmcj eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif99e5ec88ec [] []}} ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Namespace="calico-system" Pod="csi-node-driver-xwmcj" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:23.858 [INFO][4555] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Namespace="calico-system" Pod="csi-node-driver-xwmcj" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.027 [INFO][4570] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" HandleID="k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.052 [INFO][4570] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" 
HandleID="k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000246810), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-ehdo1.gb1.brightbox.com", "pod":"csi-node-driver-xwmcj", "timestamp":"2025-01-30 19:15:24.027007841 +0000 UTC"}, Hostname:"srv-ehdo1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.053 [INFO][4570] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.053 [INFO][4570] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.053 [INFO][4570] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-ehdo1.gb1.brightbox.com' Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.060 [INFO][4570] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.077 [INFO][4570] ipam/ipam.go 372: Looking up existing affinities for host host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.096 [INFO][4570] ipam/ipam.go 489: Trying affinity for 192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.100 [INFO][4570] ipam/ipam.go 155: Attempting to load block cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.110 [INFO][4570] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.110 [INFO][4570] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.47.64/26 handle="k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.115 [INFO][4570] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5 Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.125 [INFO][4570] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.47.64/26 handle="k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.147 [INFO][4570] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.47.69/26] block=192.168.47.64/26 handle="k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.147 [INFO][4570] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.47.69/26] handle="k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.147 [INFO][4570] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 30 19:15:24.238031 containerd[1495]: 2025-01-30 19:15:24.147 [INFO][4570] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.47.69/26] IPv6=[] ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" HandleID="k8s-pod-network.8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:24.242038 containerd[1495]: 2025-01-30 19:15:24.154 [INFO][4555] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Namespace="calico-system" Pod="csi-node-driver-xwmcj" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eaeacb08-27c1-40ef-baaf-66029c9f99c5", ResourceVersion:"776", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"84cddb44f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-xwmcj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.47.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif99e5ec88ec", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:24.242038 containerd[1495]: 2025-01-30 19:15:24.155 [INFO][4555] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.47.69/32] ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Namespace="calico-system" Pod="csi-node-driver-xwmcj" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:24.242038 containerd[1495]: 2025-01-30 19:15:24.155 [INFO][4555] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif99e5ec88ec ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Namespace="calico-system" Pod="csi-node-driver-xwmcj" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:24.242038 containerd[1495]: 2025-01-30 19:15:24.187 [INFO][4555] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Namespace="calico-system" Pod="csi-node-driver-xwmcj" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:24.242038 containerd[1495]: 2025-01-30 19:15:24.189 [INFO][4555] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Namespace="calico-system" Pod="csi-node-driver-xwmcj" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eaeacb08-27c1-40ef-baaf-66029c9f99c5", ResourceVersion:"776", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"84cddb44f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5", Pod:"csi-node-driver-xwmcj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.47.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif99e5ec88ec", MAC:"66:5f:2d:25:48:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:24.242038 containerd[1495]: 2025-01-30 19:15:24.227 [INFO][4555] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5" Namespace="calico-system" Pod="csi-node-driver-xwmcj" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:24.306146 containerd[1495]: time="2025-01-30T19:15:24.305925722Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:15:24.306146 containerd[1495]: time="2025-01-30T19:15:24.306003706Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:15:24.306146 containerd[1495]: time="2025-01-30T19:15:24.306021181Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:24.306982 containerd[1495]: time="2025-01-30T19:15:24.306907096Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:24.347866 systemd[1]: Started cri-containerd-8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5.scope - libcontainer container 8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5. 
Jan 30 19:15:24.464254 containerd[1495]: time="2025-01-30T19:15:24.464166064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwmcj,Uid:eaeacb08-27c1-40ef-baaf-66029c9f99c5,Namespace:calico-system,Attempt:1,} returns sandbox id \"8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5\"" Jan 30 19:15:24.560158 systemd-networkd[1437]: calidee7242abcd: Gained IPv6LL Jan 30 19:15:25.458610 containerd[1495]: time="2025-01-30T19:15:25.458490469Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:25.464966 containerd[1495]: time="2025-01-30T19:15:25.464904941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 30 19:15:25.470859 containerd[1495]: time="2025-01-30T19:15:25.470766711Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:25.473982 containerd[1495]: time="2025-01-30T19:15:25.473901617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:25.475441 containerd[1495]: time="2025-01-30T19:15:25.475095252Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 5.369778163s" Jan 30 19:15:25.475441 containerd[1495]: time="2025-01-30T19:15:25.475155084Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 30 19:15:25.478401 containerd[1495]: time="2025-01-30T19:15:25.477168994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 30 19:15:25.479725 containerd[1495]: time="2025-01-30T19:15:25.479451438Z" level=info msg="CreateContainer within sandbox \"d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 30 19:15:25.513479 containerd[1495]: time="2025-01-30T19:15:25.513419508Z" level=info msg="CreateContainer within sandbox \"d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a20eafa891dc47131781dff3399e51a6efc3ebb3ca8f66365231ecfb8565860c\"" Jan 30 19:15:25.515900 containerd[1495]: time="2025-01-30T19:15:25.514481432Z" level=info msg="StartContainer for \"a20eafa891dc47131781dff3399e51a6efc3ebb3ca8f66365231ecfb8565860c\"" Jan 30 19:15:25.566651 systemd[1]: run-containerd-runc-k8s.io-a20eafa891dc47131781dff3399e51a6efc3ebb3ca8f66365231ecfb8565860c-runc.qIyfLu.mount: Deactivated successfully. Jan 30 19:15:25.576051 systemd[1]: Started cri-containerd-a20eafa891dc47131781dff3399e51a6efc3ebb3ca8f66365231ecfb8565860c.scope - libcontainer container a20eafa891dc47131781dff3399e51a6efc3ebb3ca8f66365231ecfb8565860c. 
Jan 30 19:15:25.642931 containerd[1495]: time="2025-01-30T19:15:25.642814712Z" level=info msg="StartContainer for \"a20eafa891dc47131781dff3399e51a6efc3ebb3ca8f66365231ecfb8565860c\" returns successfully" Jan 30 19:15:25.881896 kubelet[2677]: I0130 19:15:25.853412 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-86bd64cf58-mfbfs" podStartSLOduration=29.480525796 podStartE2EDuration="34.853387756s" podCreationTimestamp="2025-01-30 19:14:51 +0000 UTC" firstStartedPulling="2025-01-30 19:15:20.10399341 +0000 UTC m=+42.985217245" lastFinishedPulling="2025-01-30 19:15:25.476855351 +0000 UTC m=+48.358079205" observedRunningTime="2025-01-30 19:15:25.850689523 +0000 UTC m=+48.731913400" watchObservedRunningTime="2025-01-30 19:15:25.853387756 +0000 UTC m=+48.734611603" Jan 30 19:15:26.160260 systemd-networkd[1437]: calif99e5ec88ec: Gained IPv6LL Jan 30 19:15:26.841800 kubelet[2677]: I0130 19:15:26.841306 2677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 19:15:28.573202 containerd[1495]: time="2025-01-30T19:15:28.572696692Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:28.574919 containerd[1495]: time="2025-01-30T19:15:28.574301658Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 30 19:15:28.574919 containerd[1495]: time="2025-01-30T19:15:28.574846049Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:28.577930 containerd[1495]: time="2025-01-30T19:15:28.577866808Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:28.579192 containerd[1495]: time="2025-01-30T19:15:28.579016107Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 3.100079314s" Jan 30 19:15:28.579192 containerd[1495]: time="2025-01-30T19:15:28.579061994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 30 19:15:28.581533 containerd[1495]: time="2025-01-30T19:15:28.581311485Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 30 19:15:28.605219 containerd[1495]: time="2025-01-30T19:15:28.605152673Z" level=info msg="CreateContainer within sandbox \"cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 30 19:15:28.626695 containerd[1495]: time="2025-01-30T19:15:28.626643328Z" level=info msg="CreateContainer within sandbox \"cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"35fe3ac00fa2e7668b7c281efc329181098a05398f3f75844e354fdf44f5218a\"" Jan 30 
19:15:28.629618 containerd[1495]: time="2025-01-30T19:15:28.628041448Z" level=info msg="StartContainer for \"35fe3ac00fa2e7668b7c281efc329181098a05398f3f75844e354fdf44f5218a\"" Jan 30 19:15:28.675529 systemd[1]: Started cri-containerd-35fe3ac00fa2e7668b7c281efc329181098a05398f3f75844e354fdf44f5218a.scope - libcontainer container 35fe3ac00fa2e7668b7c281efc329181098a05398f3f75844e354fdf44f5218a. Jan 30 19:15:28.755982 containerd[1495]: time="2025-01-30T19:15:28.755677932Z" level=info msg="StartContainer for \"35fe3ac00fa2e7668b7c281efc329181098a05398f3f75844e354fdf44f5218a\" returns successfully" Jan 30 19:15:28.884565 kubelet[2677]: I0130 19:15:28.884272 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-55fd5b7757-8rgqt" podStartSLOduration=29.416300157 podStartE2EDuration="36.883363662s" podCreationTimestamp="2025-01-30 19:14:52 +0000 UTC" firstStartedPulling="2025-01-30 19:15:21.113563489 +0000 UTC m=+43.994787331" lastFinishedPulling="2025-01-30 19:15:28.580626984 +0000 UTC m=+51.461850836" observedRunningTime="2025-01-30 19:15:28.880324384 +0000 UTC m=+51.761548244" watchObservedRunningTime="2025-01-30 19:15:28.883363662 +0000 UTC m=+51.764587510" Jan 30 19:15:28.945778 containerd[1495]: time="2025-01-30T19:15:28.945533604Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:28.947097 containerd[1495]: time="2025-01-30T19:15:28.946986498Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 30 19:15:28.954590 containerd[1495]: time="2025-01-30T19:15:28.954393629Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 373.040876ms" Jan 30 19:15:28.954590 containerd[1495]: time="2025-01-30T19:15:28.954531446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 30 19:15:28.958049 containerd[1495]: time="2025-01-30T19:15:28.957569293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 30 19:15:28.963627 containerd[1495]: time="2025-01-30T19:15:28.963584970Z" level=info msg="CreateContainer within sandbox \"48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 30 19:15:28.984324 containerd[1495]: time="2025-01-30T19:15:28.984229537Z" level=info msg="CreateContainer within sandbox \"48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2891c525b4995a849428f347e17f244bd4795e015d625ada98383d1f5b1dde48\"" Jan 30 19:15:28.987866 containerd[1495]: time="2025-01-30T19:15:28.986764396Z" level=info msg="StartContainer for \"2891c525b4995a849428f347e17f244bd4795e015d625ada98383d1f5b1dde48\"" Jan 30 19:15:29.044138 systemd[1]: Started cri-containerd-2891c525b4995a849428f347e17f244bd4795e015d625ada98383d1f5b1dde48.scope - libcontainer container 2891c525b4995a849428f347e17f244bd4795e015d625ada98383d1f5b1dde48. 
Jan 30 19:15:29.119895 containerd[1495]: time="2025-01-30T19:15:29.119047982Z" level=info msg="StartContainer for \"2891c525b4995a849428f347e17f244bd4795e015d625ada98383d1f5b1dde48\" returns successfully" Jan 30 19:15:29.269452 containerd[1495]: time="2025-01-30T19:15:29.267373327Z" level=info msg="StopPodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\"" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.360 [INFO][4808] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.361 [INFO][4808] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" iface="eth0" netns="/var/run/netns/cni-da24d0ae-58eb-00e7-dd15-47456416cb8f" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.362 [INFO][4808] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" iface="eth0" netns="/var/run/netns/cni-da24d0ae-58eb-00e7-dd15-47456416cb8f" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.362 [INFO][4808] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" iface="eth0" netns="/var/run/netns/cni-da24d0ae-58eb-00e7-dd15-47456416cb8f" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.362 [INFO][4808] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.363 [INFO][4808] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.402 [INFO][4814] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.404 [INFO][4814] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.404 [INFO][4814] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.416 [WARNING][4814] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.417 [INFO][4814] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.419 [INFO][4814] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 30 19:15:29.424104 containerd[1495]: 2025-01-30 19:15:29.420 [INFO][4808] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:29.424104 containerd[1495]: time="2025-01-30T19:15:29.423917277Z" level=info msg="TearDown network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\" successfully" Jan 30 19:15:29.424104 containerd[1495]: time="2025-01-30T19:15:29.423954576Z" level=info msg="StopPodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\" returns successfully" Jan 30 19:15:29.427326 containerd[1495]: time="2025-01-30T19:15:29.426348438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ct7bq,Uid:e5cdbf5e-4fb8-4a95-8254-b6bc2709291a,Namespace:kube-system,Attempt:1,}" Jan 30 19:15:29.601758 systemd[1]: run-netns-cni\x2dda24d0ae\x2d58eb\x2d00e7\x2ddd15\x2d47456416cb8f.mount: Deactivated successfully. Jan 30 19:15:29.644393 systemd-networkd[1437]: cali62864393510: Link UP Jan 30 19:15:29.645509 systemd-networkd[1437]: cali62864393510: Gained carrier Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.515 [INFO][4823] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0 coredns-668d6bf9bc- kube-system e5cdbf5e-4fb8-4a95-8254-b6bc2709291a 824 0 2025-01-30 19:14:43 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-ehdo1.gb1.brightbox.com coredns-668d6bf9bc-ct7bq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali62864393510 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Namespace="kube-system" Pod="coredns-668d6bf9bc-ct7bq" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.516 [INFO][4823] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Namespace="kube-system" Pod="coredns-668d6bf9bc-ct7bq" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.566 [INFO][4834] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" HandleID="k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.581 [INFO][4834] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" HandleID="k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050af0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-ehdo1.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-ct7bq", "timestamp":"2025-01-30 19:15:29.566163077 +0000 UTC"}, Hostname:"srv-ehdo1.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.581 [INFO][4834] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.581 [INFO][4834] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.581 [INFO][4834] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-ehdo1.gb1.brightbox.com' Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.585 [INFO][4834] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.591 [INFO][4834] ipam/ipam.go 372: Looking up existing affinities for host host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.607 [INFO][4834] ipam/ipam.go 489: Trying affinity for 192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.611 [INFO][4834] ipam/ipam.go 155: Attempting to load block cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.617 [INFO][4834] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.47.64/26 host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.617 [INFO][4834] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.47.64/26 handle="k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.620 [INFO][4834] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40 Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.626 [INFO][4834] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.47.64/26 handle="k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.636 [INFO][4834] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.47.70/26] block=192.168.47.64/26 handle="k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.637 [INFO][4834] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.47.70/26] handle="k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" host="srv-ehdo1.gb1.brightbox.com" Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.637 [INFO][4834] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 30 19:15:29.668387 containerd[1495]: 2025-01-30 19:15:29.638 [INFO][4834] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.47.70/26] IPv6=[] ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" HandleID="k8s-pod-network.60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.673181 containerd[1495]: 2025-01-30 19:15:29.640 [INFO][4823] cni-plugin/k8s.go 386: Populated endpoint ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Namespace="kube-system" Pod="coredns-668d6bf9bc-ct7bq" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-ct7bq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.47.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali62864393510", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:29.673181 containerd[1495]: 2025-01-30 19:15:29.640 [INFO][4823] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.47.70/32] ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Namespace="kube-system" Pod="coredns-668d6bf9bc-ct7bq" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.673181 containerd[1495]: 2025-01-30 19:15:29.640 [INFO][4823] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali62864393510 ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Namespace="kube-system" Pod="coredns-668d6bf9bc-ct7bq" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.673181 containerd[1495]: 2025-01-30 19:15:29.645 [INFO][4823] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Namespace="kube-system" Pod="coredns-668d6bf9bc-ct7bq" 
WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.673181 containerd[1495]: 2025-01-30 19:15:29.647 [INFO][4823] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Namespace="kube-system" Pod="coredns-668d6bf9bc-ct7bq" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40", Pod:"coredns-668d6bf9bc-ct7bq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.47.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali62864393510", MAC:"76:a3:7c:59:9b:d4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:29.673181 containerd[1495]: 2025-01-30 19:15:29.663 [INFO][4823] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40" Namespace="kube-system" Pod="coredns-668d6bf9bc-ct7bq" WorkloadEndpoint="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:29.736433 containerd[1495]: time="2025-01-30T19:15:29.735943538Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 30 19:15:29.736433 containerd[1495]: time="2025-01-30T19:15:29.736046109Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 30 19:15:29.736433 containerd[1495]: time="2025-01-30T19:15:29.736087958Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:29.737497 containerd[1495]: time="2025-01-30T19:15:29.737005959Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 30 19:15:29.783471 systemd[1]: Started cri-containerd-60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40.scope - libcontainer container 60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40. Jan 30 19:15:29.855809 containerd[1495]: time="2025-01-30T19:15:29.855532141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ct7bq,Uid:e5cdbf5e-4fb8-4a95-8254-b6bc2709291a,Namespace:kube-system,Attempt:1,} returns sandbox id \"60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40\"" Jan 30 19:15:29.862259 containerd[1495]: time="2025-01-30T19:15:29.862208408Z" level=info msg="CreateContainer within sandbox \"60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 30 19:15:29.890890 containerd[1495]: time="2025-01-30T19:15:29.890817699Z" level=info msg="CreateContainer within sandbox \"60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"2213536008720e448f3fcd65fd475ad40a1302e7b9467ef3a4623a1b48220c60\"" Jan 30 19:15:29.891619 containerd[1495]: time="2025-01-30T19:15:29.891555583Z" level=info msg="StartContainer for \"2213536008720e448f3fcd65fd475ad40a1302e7b9467ef3a4623a1b48220c60\"" Jan 30 19:15:29.896903 kubelet[2677]: I0130 19:15:29.896712 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-86bd64cf58-cm96v" podStartSLOduration=33.666778086 podStartE2EDuration="38.896682049s" podCreationTimestamp="2025-01-30 19:14:51 +0000 UTC" firstStartedPulling="2025-01-30 19:15:23.727032794 +0000 UTC m=+46.608256636" lastFinishedPulling="2025-01-30 19:15:28.956936757 +0000 UTC m=+51.838160599" observedRunningTime="2025-01-30 19:15:29.894452892 +0000 UTC m=+52.775676768" watchObservedRunningTime="2025-01-30 19:15:29.896682049 +0000 UTC m=+52.777905898" Jan 30 19:15:29.954108 systemd[1]: Started cri-containerd-2213536008720e448f3fcd65fd475ad40a1302e7b9467ef3a4623a1b48220c60.scope - libcontainer container 2213536008720e448f3fcd65fd475ad40a1302e7b9467ef3a4623a1b48220c60. 
Jan 30 19:15:30.028457 containerd[1495]: time="2025-01-30T19:15:30.028389837Z" level=info msg="StartContainer for \"2213536008720e448f3fcd65fd475ad40a1302e7b9467ef3a4623a1b48220c60\" returns successfully" Jan 30 19:15:30.794883 containerd[1495]: time="2025-01-30T19:15:30.794409486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:30.796931 containerd[1495]: time="2025-01-30T19:15:30.796505009Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 30 19:15:30.797764 containerd[1495]: time="2025-01-30T19:15:30.797674360Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:30.801791 containerd[1495]: time="2025-01-30T19:15:30.801473338Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:30.805212 containerd[1495]: time="2025-01-30T19:15:30.805171118Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.847533516s" Jan 30 19:15:30.805324 containerd[1495]: time="2025-01-30T19:15:30.805219336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 30 19:15:30.810861 containerd[1495]: time="2025-01-30T19:15:30.810519218Z" level=info msg="CreateContainer within sandbox \"8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 30 19:15:30.842612 containerd[1495]: time="2025-01-30T19:15:30.842079917Z" level=info msg="CreateContainer within sandbox \"8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d39117fa387a22906e644f1ce6ae0dbb14df51077f612bd4f7b16033e22807c9\"" Jan 30 19:15:30.845367 containerd[1495]: time="2025-01-30T19:15:30.845226964Z" level=info msg="StartContainer for \"d39117fa387a22906e644f1ce6ae0dbb14df51077f612bd4f7b16033e22807c9\"" Jan 30 19:15:30.892703 kubelet[2677]: I0130 19:15:30.892663 2677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 19:15:30.916042 systemd[1]: Started cri-containerd-d39117fa387a22906e644f1ce6ae0dbb14df51077f612bd4f7b16033e22807c9.scope - libcontainer container d39117fa387a22906e644f1ce6ae0dbb14df51077f612bd4f7b16033e22807c9. 
Jan 30 19:15:30.926880 kubelet[2677]: I0130 19:15:30.926733 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-ct7bq" podStartSLOduration=47.926708059 podStartE2EDuration="47.926708059s" podCreationTimestamp="2025-01-30 19:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-30 19:15:30.920776711 +0000 UTC m=+53.802000588" watchObservedRunningTime="2025-01-30 19:15:30.926708059 +0000 UTC m=+53.807931909" Jan 30 19:15:31.018326 containerd[1495]: time="2025-01-30T19:15:31.018272952Z" level=info msg="StartContainer for \"d39117fa387a22906e644f1ce6ae0dbb14df51077f612bd4f7b16033e22807c9\" returns successfully" Jan 30 19:15:31.021380 containerd[1495]: time="2025-01-30T19:15:31.021190985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 30 19:15:31.088176 systemd-networkd[1437]: cali62864393510: Gained IPv6LL Jan 30 19:15:34.063409 containerd[1495]: time="2025-01-30T19:15:34.063288192Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:34.065930 containerd[1495]: time="2025-01-30T19:15:34.065767230Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 30 19:15:34.066880 containerd[1495]: time="2025-01-30T19:15:34.066579427Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:34.073257 containerd[1495]: time="2025-01-30T19:15:34.073185999Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 30 19:15:34.074549 containerd[1495]: time="2025-01-30T19:15:34.074505023Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 3.053241653s" Jan 30 19:15:34.074630 containerd[1495]: time="2025-01-30T19:15:34.074558707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 30 19:15:34.079349 containerd[1495]: time="2025-01-30T19:15:34.079291491Z" level=info msg="CreateContainer within sandbox \"8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 30 19:15:34.108571 containerd[1495]: time="2025-01-30T19:15:34.106631743Z" level=info msg="CreateContainer within sandbox \"8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"99cf283c5f4d1d2dc049a9d384671264475ea33d2fc4a83d9535ead12bd7fdb1\"" Jan 30 19:15:34.107548 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3940679580.mount: Deactivated successfully. 
Jan 30 19:15:34.110324 containerd[1495]: time="2025-01-30T19:15:34.109378118Z" level=info msg="StartContainer for \"99cf283c5f4d1d2dc049a9d384671264475ea33d2fc4a83d9535ead12bd7fdb1\"" Jan 30 19:15:34.172288 systemd[1]: Started cri-containerd-99cf283c5f4d1d2dc049a9d384671264475ea33d2fc4a83d9535ead12bd7fdb1.scope - libcontainer container 99cf283c5f4d1d2dc049a9d384671264475ea33d2fc4a83d9535ead12bd7fdb1. Jan 30 19:15:34.222723 containerd[1495]: time="2025-01-30T19:15:34.221645155Z" level=info msg="StartContainer for \"99cf283c5f4d1d2dc049a9d384671264475ea33d2fc4a83d9535ead12bd7fdb1\" returns successfully" Jan 30 19:15:34.638550 kubelet[2677]: I0130 19:15:34.638263 2677 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 30 19:15:34.638550 kubelet[2677]: I0130 19:15:34.638406 2677 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 30 19:15:37.303621 containerd[1495]: time="2025-01-30T19:15:37.303198930Z" level=info msg="StopPodSandbox for \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\"" Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.417 [WARNING][5040] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0", GenerateName:"calico-apiserver-86bd64cf58-", Namespace:"calico-apiserver", SelfLink:"", UID:"a19cd91b-5cda-4b1f-99ed-5c7be5d66c14", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86bd64cf58", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a", Pod:"calico-apiserver-86bd64cf58-cm96v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.47.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidee7242abcd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.418 [INFO][5040] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.418 [INFO][5040] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" iface="eth0" netns="" Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.418 [INFO][5040] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.418 [INFO][5040] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.452 [INFO][5046] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.453 [INFO][5046] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.453 [INFO][5046] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.464 [WARNING][5046] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.464 [INFO][5046] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.466 [INFO][5046] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:37.471068 containerd[1495]: 2025-01-30 19:15:37.469 [INFO][5040] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:37.473144 containerd[1495]: time="2025-01-30T19:15:37.471130706Z" level=info msg="TearDown network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\" successfully" Jan 30 19:15:37.473144 containerd[1495]: time="2025-01-30T19:15:37.471174283Z" level=info msg="StopPodSandbox for \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\" returns successfully" Jan 30 19:15:37.547583 containerd[1495]: time="2025-01-30T19:15:37.547466329Z" level=info msg="RemovePodSandbox for \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\"" Jan 30 19:15:37.547583 containerd[1495]: time="2025-01-30T19:15:37.547551940Z" level=info msg="Forcibly stopping sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\"" Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.609 [WARNING][5065] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0", GenerateName:"calico-apiserver-86bd64cf58-", Namespace:"calico-apiserver", SelfLink:"", UID:"a19cd91b-5cda-4b1f-99ed-5c7be5d66c14", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86bd64cf58", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"48597486a55a1776cabc3362da09abb876483df5e931ebb1b653672fe89b306a", Pod:"calico-apiserver-86bd64cf58-cm96v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.47.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidee7242abcd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.610 [INFO][5065] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.610 [INFO][5065] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" iface="eth0" netns="" Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.610 [INFO][5065] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.610 [INFO][5065] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.646 [INFO][5071] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.647 [INFO][5071] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.647 [INFO][5071] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.655 [WARNING][5071] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.655 [INFO][5071] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" HandleID="k8s-pod-network.2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--cm96v-eth0" Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.659 [INFO][5071] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:37.668976 containerd[1495]: 2025-01-30 19:15:37.663 [INFO][5065] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c" Jan 30 19:15:37.668976 containerd[1495]: time="2025-01-30T19:15:37.667979684Z" level=info msg="TearDown network for sandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\" successfully" Jan 30 19:15:37.673793 containerd[1495]: time="2025-01-30T19:15:37.673752149Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 30 19:15:37.685477 containerd[1495]: time="2025-01-30T19:15:37.685427476Z" level=info msg="RemovePodSandbox \"2763435b23687ebd6667008c59ad73278980e8d706868f321b581e2d0cf41f2c\" returns successfully" Jan 30 19:15:37.686519 containerd[1495]: time="2025-01-30T19:15:37.686479691Z" level=info msg="StopPodSandbox for \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\"" Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.749 [WARNING][5089] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0", GenerateName:"calico-apiserver-86bd64cf58-", Namespace:"calico-apiserver", SelfLink:"", UID:"dcf02077-f75c-473d-88b8-2156144d8423", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86bd64cf58", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1", Pod:"calico-apiserver-86bd64cf58-mfbfs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.47.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic880996c1cf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.749 [INFO][5089] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.749 [INFO][5089] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" iface="eth0" netns="" Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.749 [INFO][5089] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.749 [INFO][5089] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.811 [INFO][5095] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.811 [INFO][5095] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.811 [INFO][5095] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.830 [WARNING][5095] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.830 [INFO][5095] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.833 [INFO][5095] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:37.839752 containerd[1495]: 2025-01-30 19:15:37.836 [INFO][5089] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:37.840724 containerd[1495]: time="2025-01-30T19:15:37.839986343Z" level=info msg="TearDown network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\" successfully" Jan 30 19:15:37.840724 containerd[1495]: time="2025-01-30T19:15:37.840036535Z" level=info msg="StopPodSandbox for \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\" returns successfully" Jan 30 19:15:37.841595 containerd[1495]: time="2025-01-30T19:15:37.841513935Z" level=info msg="RemovePodSandbox for \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\"" Jan 30 19:15:37.841595 containerd[1495]: time="2025-01-30T19:15:37.841571672Z" level=info msg="Forcibly stopping sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\"" Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.902 [WARNING][5115] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0", GenerateName:"calico-apiserver-86bd64cf58-", Namespace:"calico-apiserver", SelfLink:"", UID:"dcf02077-f75c-473d-88b8-2156144d8423", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86bd64cf58", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"d0051797be05dac7b8485e16f838076dbb7f40d1f377d1d238b0fae08a5826a1", Pod:"calico-apiserver-86bd64cf58-mfbfs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.47.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic880996c1cf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.903 [INFO][5115] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.903 [INFO][5115] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" iface="eth0" netns="" Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.904 [INFO][5115] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.904 [INFO][5115] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.938 [INFO][5121] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.938 [INFO][5121] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.938 [INFO][5121] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.949 [WARNING][5121] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.949 [INFO][5121] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" HandleID="k8s-pod-network.8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--apiserver--86bd64cf58--mfbfs-eth0" Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.950 [INFO][5121] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:37.955438 containerd[1495]: 2025-01-30 19:15:37.952 [INFO][5115] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8" Jan 30 19:15:37.955438 containerd[1495]: time="2025-01-30T19:15:37.955401638Z" level=info msg="TearDown network for sandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\" successfully" Jan 30 19:15:37.960159 containerd[1495]: time="2025-01-30T19:15:37.960094547Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 30 19:15:37.960249 containerd[1495]: time="2025-01-30T19:15:37.960195079Z" level=info msg="RemovePodSandbox \"8410ac0b079bacce2d3531f339c38b4712778f9c3ac6247b0f5bf5d73fd229f8\" returns successfully" Jan 30 19:15:37.961433 containerd[1495]: time="2025-01-30T19:15:37.960929322Z" level=info msg="StopPodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\"" Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.021 [WARNING][5139] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40", Pod:"coredns-668d6bf9bc-ct7bq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.47.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali62864393510", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.022 [INFO][5139] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.022 [INFO][5139] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" iface="eth0" netns="" Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.022 [INFO][5139] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.022 [INFO][5139] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.056 [INFO][5145] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.057 [INFO][5145] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.057 [INFO][5145] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.073 [WARNING][5145] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.073 [INFO][5145] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.075 [INFO][5145] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:38.080661 containerd[1495]: 2025-01-30 19:15:38.078 [INFO][5139] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:38.082658 containerd[1495]: time="2025-01-30T19:15:38.081916521Z" level=info msg="TearDown network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\" successfully" Jan 30 19:15:38.082658 containerd[1495]: time="2025-01-30T19:15:38.081999014Z" level=info msg="StopPodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\" returns successfully" Jan 30 19:15:38.082939 containerd[1495]: time="2025-01-30T19:15:38.082887790Z" level=info msg="RemovePodSandbox for \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\"" Jan 30 19:15:38.083035 containerd[1495]: time="2025-01-30T19:15:38.082940164Z" level=info msg="Forcibly stopping sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\"" Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.137 [WARNING][5163] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e5cdbf5e-4fb8-4a95-8254-b6bc2709291a", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"60068d467ceed8e40e66dac1ae2a40653947100a4b6365f9043e7cff06816e40", Pod:"coredns-668d6bf9bc-ct7bq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.47.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali62864393510", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.138 [INFO][5163] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.138 [INFO][5163] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" iface="eth0" netns="" Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.138 [INFO][5163] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.138 [INFO][5163] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.167 [INFO][5169] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.168 [INFO][5169] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.168 [INFO][5169] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.178 [WARNING][5169] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.178 [INFO][5169] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" HandleID="k8s-pod-network.85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--ct7bq-eth0" Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.180 [INFO][5169] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:38.185069 containerd[1495]: 2025-01-30 19:15:38.182 [INFO][5163] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38" Jan 30 19:15:38.187072 containerd[1495]: time="2025-01-30T19:15:38.185767551Z" level=info msg="TearDown network for sandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\" successfully" Jan 30 19:15:38.190162 containerd[1495]: time="2025-01-30T19:15:38.190091212Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 30 19:15:38.190285 containerd[1495]: time="2025-01-30T19:15:38.190172717Z" level=info msg="RemovePodSandbox \"85c6aad1176783be1467ac1da2f39faa768210850c0a19d6c89d539579351b38\" returns successfully" Jan 30 19:15:38.191511 containerd[1495]: time="2025-01-30T19:15:38.191084709Z" level=info msg="StopPodSandbox for \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\"" Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.257 [WARNING][5187] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0", GenerateName:"calico-kube-controllers-55fd5b7757-", Namespace:"calico-system", SelfLink:"", UID:"310a4559-09ca-47db-a381-43206f870195", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55fd5b7757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc", Pod:"calico-kube-controllers-55fd5b7757-8rgqt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.47.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8d5b74b40a9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.258 [INFO][5187] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.258 [INFO][5187] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" iface="eth0" netns="" Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.258 [INFO][5187] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.258 [INFO][5187] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.288 [INFO][5193] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.289 [INFO][5193] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.289 [INFO][5193] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.298 [WARNING][5193] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.298 [INFO][5193] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.300 [INFO][5193] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:38.305368 containerd[1495]: 2025-01-30 19:15:38.303 [INFO][5187] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:38.309407 containerd[1495]: time="2025-01-30T19:15:38.305365990Z" level=info msg="TearDown network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\" successfully" Jan 30 19:15:38.309407 containerd[1495]: time="2025-01-30T19:15:38.305431992Z" level=info msg="StopPodSandbox for \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\" returns successfully" Jan 30 19:15:38.309407 containerd[1495]: time="2025-01-30T19:15:38.307326178Z" level=info msg="RemovePodSandbox for \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\"" Jan 30 19:15:38.309407 containerd[1495]: time="2025-01-30T19:15:38.307374115Z" level=info msg="Forcibly stopping sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\"" Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.367 [WARNING][5211] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0", GenerateName:"calico-kube-controllers-55fd5b7757-", Namespace:"calico-system", SelfLink:"", UID:"310a4559-09ca-47db-a381-43206f870195", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55fd5b7757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"cf1f3fa325c6154d45f1f742a7ce0726e79e638027c895af3610b42f980664cc", Pod:"calico-kube-controllers-55fd5b7757-8rgqt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.47.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8d5b74b40a9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.367 [INFO][5211] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.367 [INFO][5211] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" iface="eth0" netns="" Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.367 [INFO][5211] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.367 [INFO][5211] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.405 [INFO][5217] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.405 [INFO][5217] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.405 [INFO][5217] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.416 [WARNING][5217] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.417 [INFO][5217] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" HandleID="k8s-pod-network.db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Workload="srv--ehdo1.gb1.brightbox.com-k8s-calico--kube--controllers--55fd5b7757--8rgqt-eth0" Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.419 [INFO][5217] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:38.424770 containerd[1495]: 2025-01-30 19:15:38.422 [INFO][5211] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b" Jan 30 19:15:38.425709 containerd[1495]: time="2025-01-30T19:15:38.424881857Z" level=info msg="TearDown network for sandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\" successfully" Jan 30 19:15:38.445137 containerd[1495]: time="2025-01-30T19:15:38.444980163Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 30 19:15:38.445326 containerd[1495]: time="2025-01-30T19:15:38.445177486Z" level=info msg="RemovePodSandbox \"db47788e29e122f559b6c354d6f9831ca86b7993052768cd681324db6127be7b\" returns successfully" Jan 30 19:15:38.446025 containerd[1495]: time="2025-01-30T19:15:38.445986132Z" level=info msg="StopPodSandbox for \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\"" Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.519 [WARNING][5235] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eaeacb08-27c1-40ef-baaf-66029c9f99c5", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"84cddb44f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5", Pod:"csi-node-driver-xwmcj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.47.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif99e5ec88ec", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.520 [INFO][5235] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.520 [INFO][5235] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" iface="eth0" netns="" Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.520 [INFO][5235] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.520 [INFO][5235] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.551 [INFO][5241] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.551 [INFO][5241] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.551 [INFO][5241] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.561 [WARNING][5241] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.561 [INFO][5241] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.563 [INFO][5241] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:38.570681 containerd[1495]: 2025-01-30 19:15:38.564 [INFO][5235] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:38.570681 containerd[1495]: time="2025-01-30T19:15:38.566780002Z" level=info msg="TearDown network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\" successfully" Jan 30 19:15:38.570681 containerd[1495]: time="2025-01-30T19:15:38.566818397Z" level=info msg="StopPodSandbox for \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\" returns successfully" Jan 30 19:15:38.570681 containerd[1495]: time="2025-01-30T19:15:38.567852324Z" level=info msg="RemovePodSandbox for \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\"" Jan 30 19:15:38.570681 containerd[1495]: time="2025-01-30T19:15:38.567890530Z" level=info msg="Forcibly stopping sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\"" Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.621 [WARNING][5259] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eaeacb08-27c1-40ef-baaf-66029c9f99c5", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"84cddb44f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"8cc83595c8cfb67696637fb94014b34bd597c2d83632501d5fd6998ac186e2c5", Pod:"csi-node-driver-xwmcj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.47.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif99e5ec88ec", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.621 [INFO][5259] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.621 [INFO][5259] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" iface="eth0" netns="" Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.621 [INFO][5259] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.621 [INFO][5259] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.650 [INFO][5265] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.650 [INFO][5265] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.650 [INFO][5265] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.659 [WARNING][5265] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.659 [INFO][5265] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" HandleID="k8s-pod-network.2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Workload="srv--ehdo1.gb1.brightbox.com-k8s-csi--node--driver--xwmcj-eth0" Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.661 [INFO][5265] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:38.665437 containerd[1495]: 2025-01-30 19:15:38.663 [INFO][5259] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc" Jan 30 19:15:38.666162 containerd[1495]: time="2025-01-30T19:15:38.665501113Z" level=info msg="TearDown network for sandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\" successfully" Jan 30 19:15:38.670328 containerd[1495]: time="2025-01-30T19:15:38.670263921Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 30 19:15:38.670510 containerd[1495]: time="2025-01-30T19:15:38.670396596Z" level=info msg="RemovePodSandbox \"2eefc2e608563afbbfa70d13ae41f81ee76df9b1c5855fbbdf6a210b61d3fdfc\" returns successfully" Jan 30 19:15:38.671536 containerd[1495]: time="2025-01-30T19:15:38.671484859Z" level=info msg="StopPodSandbox for \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\"" Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.729 [WARNING][5284] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"895b74bc-3470-4c3c-b993-a72d4beb91c4", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9", Pod:"coredns-668d6bf9bc-snjjb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.47.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7ca68738d10", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.729 [INFO][5284] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.729 [INFO][5284] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" iface="eth0" netns="" Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.729 [INFO][5284] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.729 [INFO][5284] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.756 [INFO][5291] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.756 [INFO][5291] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.756 [INFO][5291] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.767 [WARNING][5291] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.768 [INFO][5291] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.772 [INFO][5291] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:38.775987 containerd[1495]: 2025-01-30 19:15:38.774 [INFO][5284] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:38.777652 containerd[1495]: time="2025-01-30T19:15:38.776051955Z" level=info msg="TearDown network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\" successfully" Jan 30 19:15:38.777652 containerd[1495]: time="2025-01-30T19:15:38.776089258Z" level=info msg="StopPodSandbox for \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\" returns successfully" Jan 30 19:15:38.777652 containerd[1495]: time="2025-01-30T19:15:38.776859349Z" level=info msg="RemovePodSandbox for \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\"" Jan 30 19:15:38.777652 containerd[1495]: time="2025-01-30T19:15:38.776895757Z" level=info msg="Forcibly stopping sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\"" Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.830 [WARNING][5309] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"895b74bc-3470-4c3c-b993-a72d4beb91c4", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.January, 30, 19, 14, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-ehdo1.gb1.brightbox.com", ContainerID:"7a2fc8de411f7d9c8b7a726b7d90d3fafd15dc3ccbecee9b3b3f98dc2774ceb9", Pod:"coredns-668d6bf9bc-snjjb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.47.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7ca68738d10", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.836 [INFO][5309] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.837 [INFO][5309] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" iface="eth0" netns="" Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.837 [INFO][5309] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.837 [INFO][5309] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.887 [INFO][5315] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.887 [INFO][5315] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.887 [INFO][5315] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.906 [WARNING][5315] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.906 [INFO][5315] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" HandleID="k8s-pod-network.e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Workload="srv--ehdo1.gb1.brightbox.com-k8s-coredns--668d6bf9bc--snjjb-eth0" Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.908 [INFO][5315] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 30 19:15:38.912417 containerd[1495]: 2025-01-30 19:15:38.910 [INFO][5309] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c" Jan 30 19:15:38.914403 containerd[1495]: time="2025-01-30T19:15:38.912534406Z" level=info msg="TearDown network for sandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\" successfully" Jan 30 19:15:38.918501 containerd[1495]: time="2025-01-30T19:15:38.918463009Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 30 19:15:38.918602 containerd[1495]: time="2025-01-30T19:15:38.918545084Z" level=info msg="RemovePodSandbox \"e710a00c4de3e1d4288363e4f5d2a3f8ea03afda448ac2dd44c8b54dbf8a993c\" returns successfully" Jan 30 19:15:49.855425 kubelet[2677]: I0130 19:15:49.853754 2677 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-xwmcj" podStartSLOduration=48.248356473 podStartE2EDuration="57.853722999s" podCreationTimestamp="2025-01-30 19:14:52 +0000 UTC" firstStartedPulling="2025-01-30 19:15:24.470678956 +0000 UTC m=+47.351902802" lastFinishedPulling="2025-01-30 19:15:34.076045488 +0000 UTC m=+56.957269328" observedRunningTime="2025-01-30 19:15:34.93186938 +0000 UTC m=+57.813093244" watchObservedRunningTime="2025-01-30 19:15:49.853722999 +0000 UTC m=+72.734946848" Jan 30 19:15:55.741419 kubelet[2677]: I0130 19:15:55.741031 2677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 19:15:56.225001 kubelet[2677]: I0130 19:15:56.224411 2677 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 19:16:09.732124 systemd[1]: Started sshd@9-10.244.22.2:22-139.178.89.65:41648.service - OpenSSH per-connection server daemon (139.178.89.65:41648). Jan 30 19:16:10.741114 sshd[5406]: Accepted publickey for core from 139.178.89.65 port 41648 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:10.746546 sshd[5406]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:10.760552 systemd-logind[1486]: New session 12 of user core. Jan 30 19:16:10.770162 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 30 19:16:12.003184 sshd[5406]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:12.008791 systemd[1]: sshd@9-10.244.22.2:22-139.178.89.65:41648.service: Deactivated successfully. 
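The kubelet pod_startup_latency_tracker entry above reports two durations for csi-node-driver-xwmcj: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image-pull window (lastFinishedPulling minus firstStartedPulling). A worked check using the timestamps from that line (the few-nanosecond drift against the reported 48.248356473s is expected, since kubelet subtracts monotonic readings):

```go
// Reproduces the arithmetic in the kubelet entry: 57.853722999s end-to-end,
// of which ~9.605s was image pulling, leaving ~48.248s counted against the
// pod startup SLO. All timestamps are taken straight from the log line.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-01-30 19:14:52 +0000 UTC")
	firstPull := mustParse("2025-01-30 19:15:24.470678956 +0000 UTC")
	lastPull := mustParse("2025-01-30 19:15:34.076045488 +0000 UTC")
	running := mustParse("2025-01-30 19:15:49.853722999 +0000 UTC")

	e2e := running.Sub(created)     // 57.853722999s
	pull := lastPull.Sub(firstPull) // ~9.605366532s spent pulling images
	slo := e2e - pull               // ~48.248356467s counted against the SLO

	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("image pull window:  ", pull)
	fmt.Println("podStartSLOduration:", slo)
}
```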
Jan 30 19:16:12.012754 systemd[1]: session-12.scope: Deactivated successfully. Jan 30 19:16:12.015626 systemd-logind[1486]: Session 12 logged out. Waiting for processes to exit. Jan 30 19:16:12.019178 systemd-logind[1486]: Removed session 12. Jan 30 19:16:17.162580 systemd[1]: Started sshd@10-10.244.22.2:22-139.178.89.65:41394.service - OpenSSH per-connection server daemon (139.178.89.65:41394). Jan 30 19:16:18.070393 sshd[5424]: Accepted publickey for core from 139.178.89.65 port 41394 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:18.078019 sshd[5424]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:18.090156 systemd-logind[1486]: New session 13 of user core. Jan 30 19:16:18.100077 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 30 19:16:18.798554 sshd[5424]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:18.804910 systemd[1]: sshd@10-10.244.22.2:22-139.178.89.65:41394.service: Deactivated successfully. Jan 30 19:16:18.808948 systemd[1]: session-13.scope: Deactivated successfully. Jan 30 19:16:18.810360 systemd-logind[1486]: Session 13 logged out. Waiting for processes to exit. Jan 30 19:16:18.812737 systemd-logind[1486]: Removed session 13. Jan 30 19:16:23.964290 systemd[1]: Started sshd@11-10.244.22.2:22-139.178.89.65:60250.service - OpenSSH per-connection server daemon (139.178.89.65:60250). Jan 30 19:16:24.894643 sshd[5460]: Accepted publickey for core from 139.178.89.65 port 60250 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:24.897953 sshd[5460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:24.910333 systemd-logind[1486]: New session 14 of user core. Jan 30 19:16:24.916097 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 30 19:16:25.631245 sshd[5460]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:25.637980 systemd[1]: sshd@11-10.244.22.2:22-139.178.89.65:60250.service: Deactivated successfully. Jan 30 19:16:25.641418 systemd[1]: session-14.scope: Deactivated successfully. Jan 30 19:16:25.643525 systemd-logind[1486]: Session 14 logged out. Waiting for processes to exit. Jan 30 19:16:25.645501 systemd-logind[1486]: Removed session 14. Jan 30 19:16:25.790275 systemd[1]: Started sshd@12-10.244.22.2:22-139.178.89.65:60266.service - OpenSSH per-connection server daemon (139.178.89.65:60266). Jan 30 19:16:26.689781 sshd[5475]: Accepted publickey for core from 139.178.89.65 port 60266 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:26.692312 sshd[5475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:26.699087 systemd-logind[1486]: New session 15 of user core. Jan 30 19:16:26.705086 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 30 19:16:27.483474 sshd[5475]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:27.488328 systemd[1]: sshd@12-10.244.22.2:22-139.178.89.65:60266.service: Deactivated successfully. Jan 30 19:16:27.491817 systemd[1]: session-15.scope: Deactivated successfully. Jan 30 19:16:27.494963 systemd-logind[1486]: Session 15 logged out. Waiting for processes to exit. Jan 30 19:16:27.496706 systemd-logind[1486]: Removed session 15. Jan 30 19:16:27.642557 systemd[1]: Started sshd@13-10.244.22.2:22-139.178.89.65:60268.service - OpenSSH per-connection server daemon (139.178.89.65:60268). 
Jan 30 19:16:28.598921 sshd[5486]: Accepted publickey for core from 139.178.89.65 port 60268 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:28.601082 sshd[5486]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:28.608323 systemd-logind[1486]: New session 16 of user core. Jan 30 19:16:28.614061 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 30 19:16:29.369289 sshd[5486]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:29.374267 systemd[1]: sshd@13-10.244.22.2:22-139.178.89.65:60268.service: Deactivated successfully. Jan 30 19:16:29.376820 systemd[1]: session-16.scope: Deactivated successfully. Jan 30 19:16:29.379642 systemd-logind[1486]: Session 16 logged out. Waiting for processes to exit. Jan 30 19:16:29.381620 systemd-logind[1486]: Removed session 16. Jan 30 19:16:34.530401 systemd[1]: Started sshd@14-10.244.22.2:22-139.178.89.65:50946.service - OpenSSH per-connection server daemon (139.178.89.65:50946). Jan 30 19:16:35.420247 sshd[5520]: Accepted publickey for core from 139.178.89.65 port 50946 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:35.422375 sshd[5520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:35.429082 systemd-logind[1486]: New session 17 of user core. Jan 30 19:16:35.436041 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 30 19:16:36.137351 sshd[5520]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:36.141601 systemd[1]: sshd@14-10.244.22.2:22-139.178.89.65:50946.service: Deactivated successfully. Jan 30 19:16:36.144452 systemd[1]: session-17.scope: Deactivated successfully. Jan 30 19:16:36.146807 systemd-logind[1486]: Session 17 logged out. Waiting for processes to exit. Jan 30 19:16:36.148488 systemd-logind[1486]: Removed session 17. Jan 30 19:16:41.302608 systemd[1]: Started sshd@15-10.244.22.2:22-139.178.89.65:58642.service - OpenSSH per-connection server daemon (139.178.89.65:58642). Jan 30 19:16:42.227900 sshd[5541]: Accepted publickey for core from 139.178.89.65 port 58642 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:42.229586 sshd[5541]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:42.238310 systemd-logind[1486]: New session 18 of user core. Jan 30 19:16:42.245238 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 30 19:16:42.960026 sshd[5541]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:42.964686 systemd[1]: sshd@15-10.244.22.2:22-139.178.89.65:58642.service: Deactivated successfully. Jan 30 19:16:42.967721 systemd[1]: session-18.scope: Deactivated successfully. Jan 30 19:16:42.970688 systemd-logind[1486]: Session 18 logged out. Waiting for processes to exit. Jan 30 19:16:42.972345 systemd-logind[1486]: Removed session 18. Jan 30 19:16:48.123595 systemd[1]: Started sshd@16-10.244.22.2:22-139.178.89.65:58652.service - OpenSSH per-connection server daemon (139.178.89.65:58652). Jan 30 19:16:49.034429 sshd[5562]: Accepted publickey for core from 139.178.89.65 port 58652 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:49.037183 sshd[5562]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:49.045177 systemd-logind[1486]: New session 19 of user core. Jan 30 19:16:49.053139 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 30 19:16:49.728781 systemd[1]: run-containerd-runc-k8s.io-d843e4067bfb307c87fe7c6e67fb59381c720c98994c512e5054f90c2246c25f-runc.p6rR25.mount: Deactivated successfully. Jan 30 19:16:49.753045 sshd[5562]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:49.761423 systemd[1]: sshd@16-10.244.22.2:22-139.178.89.65:58652.service: Deactivated successfully. Jan 30 19:16:49.765738 systemd[1]: session-19.scope: Deactivated successfully. Jan 30 19:16:49.770902 systemd-logind[1486]: Session 19 logged out. Waiting for processes to exit. Jan 30 19:16:49.773257 systemd-logind[1486]: Removed session 19. Jan 30 19:16:49.915358 systemd[1]: Started sshd@17-10.244.22.2:22-139.178.89.65:58658.service - OpenSSH per-connection server daemon (139.178.89.65:58658). Jan 30 19:16:50.836150 sshd[5599]: Accepted publickey for core from 139.178.89.65 port 58658 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:50.838969 sshd[5599]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:50.846406 systemd-logind[1486]: New session 20 of user core. Jan 30 19:16:50.852072 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 30 19:16:51.893892 sshd[5599]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:51.907393 systemd[1]: sshd@17-10.244.22.2:22-139.178.89.65:58658.service: Deactivated successfully. Jan 30 19:16:51.910037 systemd[1]: session-20.scope: Deactivated successfully. Jan 30 19:16:51.911418 systemd-logind[1486]: Session 20 logged out. Waiting for processes to exit. Jan 30 19:16:51.913449 systemd-logind[1486]: Removed session 20. Jan 30 19:16:52.052196 systemd[1]: Started sshd@18-10.244.22.2:22-139.178.89.65:38326.service - OpenSSH per-connection server daemon (139.178.89.65:38326). Jan 30 19:16:52.970259 sshd[5610]: Accepted publickey for core from 139.178.89.65 port 38326 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:52.972588 sshd[5610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:52.980175 systemd-logind[1486]: New session 21 of user core. Jan 30 19:16:52.989053 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 30 19:16:54.908564 sshd[5610]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:54.919900 systemd[1]: sshd@18-10.244.22.2:22-139.178.89.65:38326.service: Deactivated successfully. Jan 30 19:16:54.922633 systemd[1]: session-21.scope: Deactivated successfully. Jan 30 19:16:54.923772 systemd-logind[1486]: Session 21 logged out. Waiting for processes to exit. Jan 30 19:16:54.925822 systemd-logind[1486]: Removed session 21. Jan 30 19:16:55.066036 systemd[1]: Started sshd@19-10.244.22.2:22-139.178.89.65:38328.service - OpenSSH per-connection server daemon (139.178.89.65:38328). Jan 30 19:16:55.992823 sshd[5635]: Accepted publickey for core from 139.178.89.65 port 38328 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:55.998320 sshd[5635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:56.007214 systemd-logind[1486]: New session 22 of user core. Jan 30 19:16:56.015139 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 30 19:16:56.938385 sshd[5635]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:56.944131 systemd[1]: sshd@19-10.244.22.2:22-139.178.89.65:38328.service: Deactivated successfully. Jan 30 19:16:56.946604 systemd[1]: session-22.scope: Deactivated successfully. 
Jan 30 19:16:56.948072 systemd-logind[1486]: Session 22 logged out. Waiting for processes to exit. Jan 30 19:16:56.949965 systemd-logind[1486]: Removed session 22. Jan 30 19:16:57.095193 systemd[1]: Started sshd@20-10.244.22.2:22-139.178.89.65:38336.service - OpenSSH per-connection server daemon (139.178.89.65:38336). Jan 30 19:16:57.986364 sshd[5646]: Accepted publickey for core from 139.178.89.65 port 38336 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:16:57.988674 sshd[5646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:16:57.995489 systemd-logind[1486]: New session 23 of user core. Jan 30 19:16:58.000050 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 30 19:16:58.688409 sshd[5646]: pam_unix(sshd:session): session closed for user core Jan 30 19:16:58.694147 systemd[1]: sshd@20-10.244.22.2:22-139.178.89.65:38336.service: Deactivated successfully. Jan 30 19:16:58.697263 systemd[1]: session-23.scope: Deactivated successfully. Jan 30 19:16:58.699153 systemd-logind[1486]: Session 23 logged out. Waiting for processes to exit. Jan 30 19:16:58.700943 systemd-logind[1486]: Removed session 23. Jan 30 19:17:03.850406 systemd[1]: Started sshd@21-10.244.22.2:22-139.178.89.65:33634.service - OpenSSH per-connection server daemon (139.178.89.65:33634). Jan 30 19:17:04.742424 sshd[5691]: Accepted publickey for core from 139.178.89.65 port 33634 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:17:04.744736 sshd[5691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:17:04.752213 systemd-logind[1486]: New session 24 of user core. Jan 30 19:17:04.757065 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 30 19:17:05.484585 sshd[5691]: pam_unix(sshd:session): session closed for user core Jan 30 19:17:05.489966 systemd-logind[1486]: Session 24 logged out. Waiting for processes to exit. Jan 30 19:17:05.490473 systemd[1]: sshd@21-10.244.22.2:22-139.178.89.65:33634.service: Deactivated successfully. Jan 30 19:17:05.495326 systemd[1]: session-24.scope: Deactivated successfully. Jan 30 19:17:05.498538 systemd-logind[1486]: Removed session 24. Jan 30 19:17:10.644261 systemd[1]: Started sshd@22-10.244.22.2:22-139.178.89.65:33650.service - OpenSSH per-connection server daemon (139.178.89.65:33650). Jan 30 19:17:11.601177 sshd[5724]: Accepted publickey for core from 139.178.89.65 port 33650 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:17:11.604464 sshd[5724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:17:11.611330 systemd-logind[1486]: New session 25 of user core. Jan 30 19:17:11.624133 systemd[1]: Started session-25.scope - Session 25 of User core. Jan 30 19:17:12.338735 sshd[5724]: pam_unix(sshd:session): session closed for user core Jan 30 19:17:12.343900 systemd-logind[1486]: Session 25 logged out. Waiting for processes to exit. Jan 30 19:17:12.344407 systemd[1]: sshd@22-10.244.22.2:22-139.178.89.65:33650.service: Deactivated successfully. Jan 30 19:17:12.347279 systemd[1]: session-25.scope: Deactivated successfully. Jan 30 19:17:12.349901 systemd-logind[1486]: Removed session 25. Jan 30 19:17:17.500231 systemd[1]: Started sshd@23-10.244.22.2:22-139.178.89.65:49688.service - OpenSSH per-connection server daemon (139.178.89.65:49688). 
Jan 30 19:17:18.393878 sshd[5738]: Accepted publickey for core from 139.178.89.65 port 49688 ssh2: RSA SHA256:u8+itYrLEk8gleuOQPYU4Ynz962uCQsxC4IoVAtgGFc Jan 30 19:17:18.396192 sshd[5738]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 30 19:17:18.406444 systemd-logind[1486]: New session 26 of user core. Jan 30 19:17:18.414894 systemd[1]: Started session-26.scope - Session 26 of User core. Jan 30 19:17:19.110327 sshd[5738]: pam_unix(sshd:session): session closed for user core Jan 30 19:17:19.116590 systemd[1]: sshd@23-10.244.22.2:22-139.178.89.65:49688.service: Deactivated successfully. Jan 30 19:17:19.120161 systemd[1]: session-26.scope: Deactivated successfully. Jan 30 19:17:19.122168 systemd-logind[1486]: Session 26 logged out. Waiting for processes to exit. Jan 30 19:17:19.124213 systemd-logind[1486]: Removed session 26.
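Each "Started sshd@N-….service - OpenSSH per-connection server daemon" line is systemd accepting the TCP connection itself and spawning one service instance per login, with logind then wrapping the PAM session in a session-N.scope that is torn down when sshd exits. A toy sketch of what such an Accept=yes per-connection service sees, assuming the systemd convention that the first passed descriptor is fd 3 (SD_LISTEN_FDS_START); this is an illustration, not how sshd itself is written:

```go
// Hedged sketch of an Accept=yes per-connection daemon: systemd hands the
// already-accepted TCP connection to the spawned process as fd 3. This toy
// service greets the peer and exits, ending its transient unit the way each
// session-N.scope above ends after logout.
package main

import (
	"fmt"
	"net"
	"os"
)

func main() {
	// SD_LISTEN_FDS_START == 3: first descriptor handed over by systemd.
	f := os.NewFile(3, "connection")
	conn, err := net.FileConn(f)
	if err != nil {
		fmt.Fprintln(os.Stderr, "not socket-activated:", err)
		os.Exit(1)
	}
	defer conn.Close()

	fmt.Fprintf(conn, "hello %s\r\n", conn.RemoteAddr())
}
```

The matching .socket unit would declare the listen address plus Accept=yes, which is what produces one numbered service instance per connection in the log above.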