HuggingChat macOS is a native AI chat interface designed specifically for macOS users, leveraging the power of open-source language models. It brings the capabilities of advanced AI conversation right to your desktop, offering a seamless and intuitive experience.
HuggingChat offers free access to the following large language models:
- Meta-Llama-3.1-70B-Instruct
- c4ai-command-r-plus-08-2024
- Qwen2.5-72B-Instruct
- Mixtral-8x7B-Instruct-v0.1
- Nous-Hermes-2-Mixtral-8x7B-DPO
- Mistral-7B-Instruct-v0.3
- Phi-3-mini-4k-instruct
What’s New
Version 0.5.0:
What's Changed
- Fixed bugs related to focused inputs (#6), model selection (#5), and other performance issues.
- Dropped the 'Mac'. Just HuggingChat. It's cleaner. (#8)
- Added basic local model inference support for GGUF, with the ability to dynamically switch between local and server models using a new keyboard shortcut, ⌘+Shift+\ (modifiable in Settings).
- Added a 'Components' pane in Settings for model download/import.
- Added the ability to import local text files (e.g., source code and plain text)
- Conversations can now be set to clear at preset intervals
- Added an acknowledgments section to the About page highlighting the open-source tools used within the app.
- Fixed app update server.
What To Test
- Test local generation with various models, with a special focus on user experience.
- Test model management (e.g., import, deletion) in the dedicated Components pane.
- Import text and source-code files with both local and server-side models.
- Test long-context local inference.
What's Next
- Codebase cleanup
- Add two more themes
- Quality of life improvements and stability fixes
- Open sourcing the code
Compatibility
- macOS 12.0 or later
- Apple Silicon or Intel Core processor