There’s also an option to bring your own LLM: when the manual option is enabled, fields appear for model name, endpoint, and API token. The page itself warns, though, that local models may not work correctly.
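As a rough sketch of what those three fields usually amount to, here’s how they might map onto a request against an OpenAI-compatible endpoint (the kind a local server like Ollama exposes). This is purely illustrative; the field names and URL path are assumptions, not the product’s actual schema.

```python
import json

def build_chat_request(model: str, endpoint: str, token: str, prompt: str):
    """Assemble (url, headers, body) for a hypothetical OpenAI-compatible
    chat-completions call from the three fields the settings page asks for."""
    url = endpoint.rstrip("/") + "/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {token}",  # API token field
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # model name field
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# Pointing it at a local server would look something like:
url, headers, body = build_chat_request(
    model="llama3",
    endpoint="http://localhost:11434",  # assumed local endpoint
    token="unused",
    prompt="hi",
)
```

Nothing here is sent anywhere; it just shows how little plumbing the three fields imply if the endpoint speaks that protocol.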
It looks like there’s a self-hosting option too, so you won’t have to send your history to someone else’s computer.

It would be really cool if they didn’t do that this time.