🚀 New
- Added support for Llama 3 models on Groq and Perplexity.
- Introduced support for LaTeX formatting.
✨ Improved
- Improved overall application performance, including smoother scrolling, better handling of long messages, and easier multi-paragraph selection when copying.
- Disabled the network proxy when using local inference sources such as Ollama, LMStudio, and MLX.
- Moved explanatory messages for certain settings fields into popovers.
- Enhanced the user interface of the main message view.
🔧 Fixed
- Fixed an issue that prevented some users from using Claude 3 models on Vertex AI.