I’m sure there are a ton of topics on this, and I wish there were some sort of pinned post along these lines. But what is everyone finding to be the best model to run locally for deep thinking? Now that Gemini 3 Pro has nerfed its chain of thought, I’m ready to go fully offline. I’m not worried about overflowing to RAM; I grew up with dial-up. If it takes a few minutes to get to the best answer, that’s better than a few seconds to jump to a conclusion I have to pick apart and revalidate a dozen times over.
EDIT: Grammar.