Bailey Wallace
Bailey00
AI & ML interests
None yet
Recent Activity
upvoted a collection 4 days ago
DeepSeek-V4
reacted to danielhanchen's post with ❤️ 22 days ago
You don't need to set LLM parameters anymore!
llama.cpp now uses only the context length and compute your local setup needs. Unsloth also auto-applies the correct model settings.
Try in Unsloth Studio - now with precompiled llama.cpp binaries.
GitHub: https://github.com/unslothai/unsloth
Organizations
None yet