3 Comments

How can I run this in a server environment?

I think Ollama will work on a server too, as will llama.cpp's server, but to be honest I don't know what the best options for that are at the moment.
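For what it's worth, Ollama exposes an HTTP API once it's running (`ollama serve`), so anything that can reach the port can query it. Here's a minimal sketch, assuming Ollama is listening on its default port 11434 and a model such as `llama3` has already been pulled; the model name is just an example, swap in whatever you're running:

```python
# Minimal sketch: query a running Ollama server over its HTTP API.
# Assumes `ollama serve` is running on the default port 11434 and
# that the model named below has been pulled (`ollama pull llama3`).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

payload = json.dumps({
    "model": "llama3",            # assumption: any locally pulled model works here
    "prompt": "Why is the sky blue?",
    "stream": False,              # return a single JSON object instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["response"])       # the generated completion text
```

On a remote server you'd point the URL at that machine's address instead of localhost, and put something in front of it for auth, since Ollama doesn't handle that itself.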

There's probably a huge opportunity for some PaaS provider to offer this. If Ollama is as good as you say, it can handle a lot of use cases at a much lower cost than OpenAI and Anthropic.