Closed
Description
Git commit
Operating systems
Mac
GGML backends
BLAS
Problem description & steps to reproduce
The build fails with a number of errors; see the log extract below.
Reproduce: [Non-trivial due to hardware availability.] Run the build with MacPorts on powerpc-darwin.
First Bad Commit
I am not sure which commit introduced the breakage; the last one known to build fine is b52edd2 (bisecting is possible but would take a very long time)
Compile command
Executing: cd "/opt/local/var/macports/build/llama.cpp-b7053e9e/work/build" && /opt/local/bin/cmake -G "CodeBlocks - Unix Makefiles" -DCMAKE_BUILD_TYPE=MacPorts -DCMAKE_INSTALL_PREFIX="/opt/local" -DCMAKE_INSTALL_NAME_DIR="/opt/local/lib" -DCMAKE_SYSTEM_PREFIX_PATH="/opt/local;/usr" -DCMAKE_C_COMPILER="$CC" -DCMAKE_CXX_COMPILER="$CXX" -DCMAKE_OBJC_COMPILER="$CC" -DCMAKE_OBJCXX_COMPILER="$CXX" -DCMAKE_POLICY_DEFAULT_CMP0025=NEW -DCMAKE_POLICY_DEFAULT_CMP0060=NEW -DCMAKE_VERBOSE_MAKEFILE=ON -DCMAKE_COLOR_MAKEFILE=ON -DCMAKE_FIND_FRAMEWORK=LAST -DCMAKE_EXPORT_COMPILE_COMMANDS=ON -DCMAKE_MAKE_PROGRAM=/usr/bin/make -DCMAKE_MODULE_PATH="/opt/local/share/cmake/Modules" -DCMAKE_PREFIX_PATH="/opt/local/share/cmake/Modules" -DCMAKE_BUILD_WITH_INSTALL_RPATH:BOOL=ON -DCMAKE_INSTALL_RPATH="/opt/local/lib" -Wno-dev -DGGML_CCACHE=OFF -DGGML_LTO=ON -DGGML_OPENMP=ON -DLLAMA_CURL=ON -DGGML_METAL=OFF -DGGML_METAL_EMBED_LIBRARY=OFF -DGGML_BLAS=ON -DGGML_ACCELLERATE=ON -DGGML_BLAS_VENDOR=Apple -DCMAKE_OSX_ARCHITECTURES="ppc" -DCMAKE_OSX_DEPLOYMENT_TARGET="10.6" -DCMAKE_OSX_SYSROOT="/" /opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378
Relevant log output
[ 99%] Building CXX object tools/server/CMakeFiles/llama-server.dir/server-http.cpp.o
cd /opt/local/var/macports/build/llama.cpp-b7053e9e/work/build/tools/server && /opt/local/bin/g++-mp-14 -DGGML_BACKEND_SHARED -DGGML_SHARED -DGGML_USE_BLAS -DGGML_USE_CPU -DLLAMA_SHARED -DLLAMA_USE_CURL -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/build/tools/server -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/../mtmd -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378 -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/. -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/../vendor -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/src/../include -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/ggml/src/../include -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/mtmd/. -pipe -Os -fpermissive -DNDEBUG -isystem/opt/local/include/LegacySupport -I/opt/local/include -D_GLIBCXX_USE_CXX11_ABI=0 -arch ppc -mmacosx-version-min=10.6 -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -Wextra-semi -MD -MT tools/server/CMakeFiles/llama-server.dir/server-http.cpp.o -MF CMakeFiles/llama-server.dir/server-http.cpp.o.d -o CMakeFiles/llama-server.dir/server-http.cpp.o -c /opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-http.cpp
In file included from /opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-http.cpp:5:
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/../vendor/cpp-httplib/httplib.h:27:2: warning: #warning before C++23 is a GCC extension
27 | #warning \
| ^~~~~~~
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/../vendor/cpp-httplib/httplib.h:27:2: warning: #warning "cpp-httplib doesn't support 32-bit platforms. Please use a 64-bit compiler." [-Wcpp]
[ 99%] Building CXX object tools/server/CMakeFiles/llama-server.dir/server-models.cpp.o
cd /opt/local/var/macports/build/llama.cpp-b7053e9e/work/build/tools/server && /opt/local/bin/g++-mp-14 -DGGML_BACKEND_SHARED -DGGML_SHARED -DGGML_USE_BLAS -DGGML_USE_CPU -DLLAMA_SHARED -DLLAMA_USE_CURL -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/build/tools/server -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/../mtmd -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378 -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/. -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/../vendor -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/src/../include -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/ggml/src/../include -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/mtmd/. -pipe -Os -fpermissive -DNDEBUG -isystem/opt/local/include/LegacySupport -I/opt/local/include -D_GLIBCXX_USE_CXX11_ABI=0 -arch ppc -mmacosx-version-min=10.6 -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -Wextra-semi -MD -MT tools/server/CMakeFiles/llama-server.dir/server-models.cpp.o -MF CMakeFiles/llama-server.dir/server-models.cpp.o.d -o CMakeFiles/llama-server.dir/server-models.cpp.o -c /opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp
In file included from /opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp:7:
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/../vendor/cpp-httplib/httplib.h:27:2: warning: #warning before C++23 is a GCC extension
27 | #warning \
| ^~~~~~~
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/../vendor/cpp-httplib/httplib.h:27:2: warning: #warning "cpp-httplib doesn't support 32-bit platforms. Please use a 64-bit compiler." [-Wcpp]
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp:41:25: error: 'path' in namespace 'std::filesystem' does not name a type
41 | static std::filesystem::path get_server_exec_path() {
| ^~~~
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp: In function 'std::vector<local_model> list_local_models(const std::string&)':
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp:89:27: error: 'exists' is not a member of 'std::filesystem'
89 | if (!std::filesystem::exists(dir) || !std::filesystem::is_directory(dir)) {
| ^~~~~~
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp:89:60: error: 'is_directory' is not a member of 'std::filesystem'
89 | if (!std::filesystem::exists(dir) || !std::filesystem::is_directory(dir)) {
| ^~~~~~~~~~~~
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp: In member function 'void server_presets::render_args(server_model_meta&)':
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp:224:41: error: 'get_server_exec_path' was not declared in this scope
224 | meta.args.insert(meta.args.begin(), get_server_exec_path().string());
| ^~~~~~~~~~~~~~~~~~~~
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp: In constructor 'server_models::server_models(const common_params&, int, char**, char**)':
/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-models.cpp:245:24: error: 'get_server_exec_path' was not declared in this scope
245 | base_args[0] = get_server_exec_path().string();
| ^~~~~~~~~~~~~~~~~~~~
make[2]: *** [tools/server/CMakeFiles/llama-server.dir/server-models.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
[ 99%] Building CXX object tools/server/CMakeFiles/llama-server.dir/server-task.cpp.o
cd /opt/local/var/macports/build/llama.cpp-b7053e9e/work/build/tools/server && /opt/local/bin/g++-mp-14 -DGGML_BACKEND_SHARED -DGGML_SHARED -DGGML_USE_BLAS -DGGML_USE_CPU -DLLAMA_SHARED -DLLAMA_USE_CURL -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/build/tools/server -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/../mtmd -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378 -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/. -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/common/../vendor -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/src/../include -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/ggml/src/../include -I/opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/mtmd/. -pipe -Os -fpermissive -DNDEBUG -isystem/opt/local/include/LegacySupport -I/opt/local/include -D_GLIBCXX_USE_CXX11_ABI=0 -arch ppc -mmacosx-version-min=10.6 -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -Wextra-semi -MD -MT tools/server/CMakeFiles/llama-server.dir/server-task.cpp.o -MF CMakeFiles/llama-server.dir/server-task.cpp.o.d -o CMakeFiles/llama-server.dir/server-task.cpp.o -c /opt/local/var/macports/build/llama.cpp-b7053e9e/work/llama.cpp-7378/tools/server/server-task.cpp
make[2]: Leaving directory `/opt/local/var/macports/build/llama.cpp-b7053e9e/work/build'
make[1]: *** [tools/server/CMakeFiles/llama-server.dir/all] Error 2
make[1]: Leaving directory `/opt/local/var/macports/build/llama.cpp-b7053e9e/work/build'
make: *** [all] Error 2