The Apple M5 Pro chip is rumored to be announced later this week with improved GPU cores and faster inference. With consumer hardware getting better and better, and AI labs squishing models down to fit in tiny amounts of VRAM, it's becoming increasingly feasible to have an assistant that has absorbed the entirety of the internet's knowledge built right into your PC or laptop, all running privately and securely offline. The future is exciting, everyone: we are closing the gap.
Page Assist can already let Ollama models search the internet via SearXNG. But even though it often finds what I'm asking for, it's still lacking. It's just a one-shot search, and it won't try a different search query if the first one doesn't return any helpful results.
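To illustrate the limitation, here is a minimal sketch of what a retry loop could look like, assuming the local model has produced a list of candidate query reformulations and `search` is some function that queries a search backend (e.g. a SearXNG instance, which can expose JSON results via `GET /search?q=...&format=json` when that format is enabled). The function names and structure here are hypothetical, not Page Assist's actual implementation.

```python
from typing import Callable, Optional

def search_with_retries(queries: list[str],
                        search: Callable[[str], list[dict]]) -> Optional[list[dict]]:
    """Try each candidate query in order until one returns results.

    `queries` is an ordered list of reformulations (in practice these
    could be generated by the local model); `search` is any callable
    that returns a list of result dicts, empty when nothing was found.
    """
    for q in queries:
        results = search(q)
        if results:       # stop at the first query that finds anything
            return results
    return None           # every reformulation came back empty
```

Instead of giving up after one empty search, the loop above falls through to the next reformulation, which is exactly the behavior the one-shot approach lacks.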