Clone the repository

git clone --recursive https://github.com/utilityai/llama-cpp-rs
cd llama-cpp-rs
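The --recursive flag matters here: llama-cpp-sys-2 builds the vendored llama.cpp submodule, which stays empty on a plain clone. If the flag was forgotten, the submodule can still be fetched afterwards:

```shell
# Fetch the llama.cpp submodule after a non-recursive clone:
git submodule update --init --recursive
```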

Build

cargo build --release --bin simple --features cuda
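The cuda feature hands part of the build off to CMake and nvcc, so the CUDA toolkit must be visible before compiling. A quick preflight check (the hint text is my own, not output from the build script):

```shell
# Preflight: the cuda feature needs nvcc on PATH; print its location,
# or a hint if it is missing.
command -v nvcc || echo "nvcc not found - install the CUDA toolkit first"
```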

Error 1

/home/etsme/lining/llama-cpp-rs/llama-cpp-sys-2/llama.cpp/ggml/include/ggml.h:207:10: fatal error: 'stdbool.h' file not found

Install the missing dependencies

sudo apt-get install -y build-essential clang libclang-dev

Error 2

  CMake project was already configured. Skipping configuration step.
  running: cd "/home/etsme/lining/llama-cpp-rs/target/release/build/llama-cpp-sys-2-1450ffccd36fc223/out/build" && LC_ALL="C" "cmake" "--build" "/home/etsme/lining/llama-cpp-rs/target/release/build/llama-cpp-sys-2-1450ffccd36fc223/out/build" "--target" "install" "--config" "Release" "--parallel" "64"

  --- stderr
  gmake: Makefile: No such file or directory
  gmake: *** No rule to make target 'Makefile'.  Stop.

  thread 'main' panicked at /home/etsme/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/cmake-0.1.54/src/lib.rs:1119:5:

  command did not execute successfully, got: exit status: 2

  build script failed, must exit now
  note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
  ptxas fatal : Value 'sm_52' is not defined for option 'gpu-name'

Check whether an old nvidia-cuda-toolkit from the Ubuntu repositories is installed; mixing it with a separately installed CUDA toolkit can leave nvcc and ptxas at mismatched versions, which can produce the sm_52 ptxas error above

ryan@mt:~/workspace/ai$ apt-cache policy nvidia-cuda-toolkit
nvidia-cuda-toolkit:
  Installed: (none)
  Candidate: 12.0.140~12.0.1-4build4
  Version table:
     12.0.140~12.0.1-4build4 500
        500 http://mirrors.aliyun.com/ubuntu noble/multiverse amd64 Packages
        100 /var/lib/dpkg/status

If it is installed, remove it

sudo apt remove nvidia-cuda-toolkit
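After removing the distro package, it is worth confirming that nvcc and ptxas now resolve to the same CUDA installation, since a version mismatch between the two is a plausible cause of the sm_52 error (the path shown in the comment is typical for an NVIDIA-installed toolkit, not guaranteed):

```shell
# Both should point into the same CUDA install, e.g. /usr/local/cuda/bin;
# no output for a tool means it is not on PATH at all.
command -v nvcc
command -v ptxas
```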

References