Provide quick prompt bar like Apple Spotlight search
Provide a quick prompt bar, similar to Apple's Spotlight search, with a customizable global keyboard shortcut. In certain cases, users would benefit from instant assistance from MindMac. This could be built natively, but a better idea is to make use of Alfred and Raycast for this feature: instead of writing a native function, create an API/command-line interface so that ZSH/Bash and other scripting languages and frameworks can execute a binary from MindMac to send commands and receive the responses. The workflow for Alfred ( https://www.alfredapp.com/help/workflows/ ), the plugin for Raycast ( https://www.raycast.com/developer-program ), and integrations for other similar applications could then be developed by the community. Suggested by Sascha
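A minimal sketch of how a script or launcher workflow could talk to such a CLI. The binary name "mindmac" and its "ask" subcommand are assumptions for illustration; no such interface exists yet.

```python
import shutil
import subprocess

def ask_mindmac(prompt: str) -> str:
    """Send a prompt to a hypothetical MindMac CLI and return its reply.

    "mindmac" and the "ask" subcommand are placeholder names, not a shipped
    interface; a real implementation would follow whatever protocol the
    MindMac binary eventually defines.
    """
    if shutil.which("mindmac") is None:
        # Degrade gracefully when the (hypothetical) binary is not installed.
        return "mindmac CLI not installed"
    result = subprocess.run(
        ["mindmac", "ask", prompt],
        capture_output=True,
        text=True,
    )
    return result.stdout.strip()
```

An Alfred workflow or Raycast script command could call a function like this and display the returned text, leaving all model handling inside MindMac itself.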
Add file upload feature
Add a feature that allows users to upload files (PDF, DOC, DOCX, TXT, …) and then ask questions about those documents.
Sync conversations, folders and settings across devices
For users with multiple devices, it would be very convenient to have the same favorites, folders and conversations available on all of them. This would require either a sync function or saving the conversations, folder data and most of the settings in a project file. The project-file option is an alternative if a sync function is too much work at the current stage: the user could then simply store the project file on their own cloud share/drive. Once a project file is defined, MindMac could automatically check it for changes and update it when changes are made. This is for sure a bigger feature, so no stress; just for road-map documentation. ;D Suggested by Sascha
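The project-file idea above can be sketched very simply: reload the file only when its modification time advances, so a copy synced by the user's own cloud drive is picked up automatically. The file name "MindMac.project.json" and its JSON layout are assumptions; MindMac defines no such format today.

```python
import json
from pathlib import Path

# Hypothetical project-file name; not an actual MindMac artifact.
PROJECT = Path("MindMac.project.json")

def load_if_changed(last_mtime: float):
    """Return (data, mtime) if the project file changed since last_mtime,
    otherwise (None, last_mtime). A real app would poll this periodically
    or watch the file with FSEvents."""
    if PROJECT.exists():
        mtime = PROJECT.stat().st_mtime
        if mtime > last_mtime:
            return json.loads(PROJECT.read_text()), mtime
    return None, last_mtime
```

Writing back works the same way in reverse: serialize favorites, folders and conversations to the file after local changes, and let the user's cloud share handle transport between devices.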
Keep last location in the conversation
It would be great if, when you click on a chat, the view returned to your last scroll position in that chat. Suggested by Slush
Remember which model used for the conversation
When switching back and forth between OpenAI and Ollama, previous conversations show the currently selected provider instead of the provider that was actually used to create them. Each conversation should remember the provider and model it was created with.
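One way to fix this is to persist the provider and model on the conversation itself rather than reading the global selection. A minimal sketch; the field names are illustrative, not MindMac's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Conversation:
    """Stores the provider/model at creation time so the UI can restore
    them when the conversation is reopened, regardless of the currently
    selected provider."""
    title: str
    provider: str  # e.g. "OpenAI" or "Ollama"
    model: str     # e.g. "gpt-4" or "llama2"
    messages: list = field(default_factory=list)

def restore_provider(conversation: Conversation) -> tuple:
    """Return the provider/model the conversation was created with."""
    return (conversation.provider, conversation.model)
```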
Allow inline mode to use copy-pasted text as summary/context
At the moment, the inline function does not allow replying to emails or taking the surrounding text into account when using the app. All I can do is ask GPT a question without any context, which makes the usage pointless; I still need to copy and paste context into the app. This app ( https://macaify.com/ ) illustrates a great way to use inline GPT that is immediately helpful. I'd strongly recommend checking it out and implementing some of the same features to make MindMac's offering more complete and time-saving.
Support Ollama for local models
Get up and running with large language models locally by supporting Ollama. Suggested by lu_chin2k