
Conversation

@yurivict (Contributor)

Make sure to read the contributing guidelines before submitting a PR

cache_directory = std::getenv("LOCALAPPDATA");
#endif // __linux__
#else
# error Unknown architecture
@ngxson (Collaborator)

I have recently seen people using llama.cpp on AIX; I'm not sure whether this will trigger the error on AIX or not.

@mehendarkarprajwal (Contributor), Apr 12, 2025

Hi @ngxson. We have an internal CI, and it broke today because of this. I will be contributing a patch to fix it on AIX. Thank you for thinking about us. :)

@yurivict (Contributor, Author)

> I have recently seen people using llama.cpp on AIX; I'm not sure whether this will trigger the error on AIX or not.

The lack of a cache directory triggers a crash due to an assertion anyway.
This way it fails early, at compile time, and prompts them to submit patches.
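For context, here is a minimal sketch of the kind of platform ladder the excerpt above sits in, assuming a helper along the lines of llama.cpp's cache-directory lookup (the function name and the non-Windows branches below are assumptions, not the exact code from this PR). On any platform without a matching branch, the #error directive stops the build immediately rather than leaving cache_directory empty and tripping an assertion at runtime:

// Minimal sketch, not the exact llama.cpp code: choose a per-platform cache
// directory at compile time and fail the build on platforms with no branch.
#include <cstdlib>
#include <string>

std::string get_cache_directory() {  // hypothetical helper name
    std::string cache_directory;
#if defined(__linux__) || defined(__FreeBSD__)
    // XDG convention: prefer XDG_CACHE_HOME, fall back to ~/.cache
    if (const char * xdg = std::getenv("XDG_CACHE_HOME")) {
        cache_directory = xdg;
    } else if (const char * home = std::getenv("HOME")) {
        cache_directory = std::string(home) + "/.cache";
    }
#elif defined(__APPLE__)
    if (const char * home = std::getenv("HOME")) {
        cache_directory = std::string(home) + "/Library/Caches";
    }
#elif defined(_WIN32)
    if (const char * local = std::getenv("LOCALAPPDATA")) {
        cache_directory = local;
    }
#else
    // No branch matched (e.g. AIX until a branch is added): the build stops
    // here instead of producing a binary that only asserts at runtime.
#   error Unknown architecture
#endif
    return cache_directory;
}

Supporting a platform such as AIX then amounts to adding it to whichever branch matches its conventions, which is the kind of patch mentioned above.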

@ngxson merged commit 68b08f3 into ggml-org:master on Apr 11, 2025. 51 checks passed.

colout pushed a commit to colout/llama.cpp that referenced this pull request on Apr 21, 2025.