3 Comments

DJ:

How can I run this in a server environment?

Simon Willison:

I think Ollama will work on a server too, as will llama.cpp's built-in server - but to be honest I don't know what the best options for that are at the moment.
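
As a rough sketch of the Ollama route: once the server process is exposed on its default port 11434 (for example by starting it with OLLAMA_HOST=0.0.0.0 ollama serve), any client can call its HTTP API. The hostname and model name below are placeholders; /api/generate is Ollama's documented generate endpoint.

```python
# Query a remote Ollama server over its HTTP API (stdlib only).
import json
import urllib.request

# Hypothetical hostname; Ollama listens on port 11434 by default.
OLLAMA_URL = "http://your-server:11434/api/generate"

payload = {
    "model": "llama3",  # any model previously fetched with `ollama pull`
    "prompt": "Explain what a PaaS is in one sentence.",
    "stream": False,    # return a single JSON object instead of a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# The completed text is returned in the "response" field.
print(result["response"])
```

llama.cpp's llama-server takes a similar shape: by default it listens on port 8080 and exposes an OpenAI-compatible /v1/chat/completions endpoint, so existing OpenAI client code can often be pointed at it with only a base-URL change.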

DJ:

There's probably a huge opportunity for some PaaS provider to offer this. If Ollama is as good as you say, it can handle a lot of use cases at a much lower cost than OpenAI's and Anthropic's APIs.
