Local LLMs like Mistral, Llama, etc. allow us to run ChatGPT-like large language models on our own computers. Since all the processing happens within our systems, I feel more comfortable feeding them personal data compared to hosted LLMs.
A quick way to get started with local LLMs is to use an application like Ollama. It's very easy to install, but interacting with it involves running commands in a terminal or installing a separate server-based GUI on your system.
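As a minimal sketch of that terminal-style interaction, here's how a request to Ollama's local HTTP API could be put together. This assumes Ollama is running on its default port (11434) and that the `mistral` model has already been pulled with `ollama pull mistral`; the model name and prompt wording are just illustrative choices.

```python
import json

def build_summarize_request(text: str) -> dict:
    # Request body for Ollama's /api/generate endpoint.
    # "stream": False asks for a single JSON response instead of a stream.
    return {
        "model": "mistral",  # assumption: this model is pulled locally
        "prompt": f"Summarize the following text:\n\n{text}",
        "stream": False,
    }

payload = build_summarize_request("Local LLMs run entirely on your own machine.")
print(json.dumps(payload, indent=2))

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# summary = json.loads(urllib.request.urlopen(req).read())["response"]
```

The same thing can be done straight from the terminal with `ollama run mistral "Summarize the following text: …"`, which is exactly the kind of workflow the rest of this post tries to smooth over.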
These are not huge dealbreakers, but wouldn't it be nice if you could select a piece of text in any application and ask the LLM to summarize it?
If this post was enjoyable or useful for you, please share it! If you have comments, questions, or feedback, you can email me. To get new posts, subscribe via the RSS feed.