I don't want this solution delivered in the form of an extension (one practical reason is I use ChatGPT from mobile a lot of the time). I have 0 extensions installed in general.
But I don't want to download extensions; they're too much of a security risk.
I mean, I've been copying chats with my friends and saving them locally since back when online chat was called Instant Messaging.
But I’m at a point where I can’t decide whether the ephemeral UX of chatting with AI (ChatGPT, Claude) might actually be better. Most chats I want to save these days are things like code snippets that I’m not ready to integrate yet.
Theoretically, with an infinite context window, a model would just work fine forever by shoving the entire conversation history into context with each request. But message search/retrieval makes a lot more sense.
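The retrieval idea can be sketched in a few lines: instead of stuffing the whole history into context, score each stored message against the new query and load only the top matches. This is a toy keyword-overlap version under my own assumptions (the `retrieve`/`tokenize` names and the scoring are illustrative, not any real product's API; production systems typically use embeddings instead of word overlap):

```python
import re

def tokenize(text):
    """Lowercase and split into word tokens."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(messages, query, k=2):
    """Return the k stored messages sharing the most words with the query."""
    q = tokenize(query)
    return sorted(messages, key=lambda m: len(q & tokenize(m)), reverse=True)[:k]

# Hypothetical saved chat history.
history = [
    "here is the python snippet for parsing the csv export",
    "answer all questions like a pirate",
    "the linode server costs five dollars a month",
]
print(retrieve(history, "where was that csv parsing snippet?", k=1))
# → ['here is the python snippet for parsing the csv export']
```

The pirate problem below falls out naturally here: an old "answer like a pirate" instruction only gets pulled back into context if the new query actually resembles it.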
I think long-term AI chat is just relatively new as a UI pattern, so it takes time to build conventions around it.
Ex: in 2023 I told GPT to answer all questions like a pirate. I never told it to stop, so if we're loading every historical chat into memory, should it still be answering as a pirate?
Nope, with an infinite context window the LLM would take forever to give you an answer. Therefore it would be useless.
We don't really have such a thing as a context window, it's an artifact of LLM architecture. We are building a ton of technology around it but who's to say it's the right approach?
Maybe the best AIs will only use a very tiny LLM for actual language processing while delegating storage and compression of memories to something that's actually built for that.
https://mashable.com/article/chatgpt-chat-history-search-int...
It also has projects (it says it's for Plus, Team, and Pro users). https://help.openai.com/en/articles/10169521-using-projects-...
Any outputs they generate that one finds useful need to be retained outside their walled garden.
Since then, for whatever reason, it's not available for my account (I'm on a Plus plan).
Just burnout, siloing, and a lack of creativity. We can’t solve these problems in the industry because we are greedy short-term thinkers who believe we’re long-term innovators. To say nothing of believing we are smarter and more entitled than we are.
Claude has a way to star important conversations. I don't think ChatGPT has that.
My only solution so far has been aggressively deleting conversations once I find an answer and know I don't need it for reference.
Not really an excuse though, since a product company's mandate is to create a product that doesn't leave its customers baffled about apparently missing functionality.
You can run it locally, or, as I do, on a $5/month Linode server. I don't want to pay ~$20/month for each LLM provider, so I put $5 to $10 on my Anthropic and OpenAI API accounts every couple of months, and that lasts me plenty long.
You get to save all your chats, change models mid-chat, view code artifacts, create presets, and much more.
If you don't know how to set up something like this, ask ChatGPT or Claude. They will walk you through it, and you will learn a useful skill. It's shockingly easy.
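For a sense of how little is involved, here's a minimal sketch of calling a pay-as-you-go LLM API directly from the standard library, with no subscription. The endpoint and payload shape follow OpenAI's chat completions API; the model name is an assumption (check the provider's docs for current models), and the actual network call is left commented out since it spends API credit:

```python
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def build_request(prompt, model="gpt-4o-mini"):
    """Assemble the HTTP request for a single chat turn.

    The model name above is an assumption -- swap in whatever your
    provider currently offers.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            # Expects your key in the OPENAI_API_KEY environment variable.
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )

# To actually send it (this spends API credit):
# with urllib.request.urlopen(build_request("hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

A self-hosted chat UI is basically this loop plus storage for the transcripts, which is why you get to keep every chat: they live in your own database, not the provider's.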