Hosting a model of that size requires ~800GB of VRAM. Even if they released their models, it wouldn't make them obsolete, since most people and many companies couldn't host them anyway.
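For a sense of where a number like ~800GB comes from, here is a minimal back-of-envelope sketch. It assumes a dense model on the order of 400B parameters served at FP16 (2 bytes per weight); the parameter count and precision are my assumptions, not published figures, and a real deployment needs extra headroom for KV cache and activations on top of this.

    # Back-of-envelope weight-memory estimate (assumed figures: ~400B
    # parameters at FP16, i.e. 2 bytes per weight; KV cache and
    # activation memory come on top of this).

    def weight_memory_gb(params_billions: float, bytes_per_param: float = 2) -> float:
        """Memory needed just to hold the weights, in GB."""
        # 1e9 params per billion * bytes each / 1e9 bytes per GB
        return params_billions * bytes_per_param

    print(f"~{weight_memory_gb(400):.0f} GB for weights alone")  # ~800 GB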
Anyone can now provide that service. Why pay OpenAI when you can pay a different provider that is cheaper, or better aligned with your needs, ethics, or legal requirements?
Anyone who has $300,000 per instance, the know-how to set it up, the means to support it, and can outbid OpenAI, yes.
I don’t see that happening on a large scale, just like I don’t see tons of DeepSeek instances being hosted more cheaply than the original any time soon.
If they really are afraid of that, they can always license the model in a way that forbids reselling.
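For context on the per-instance cost mentioned above, a figure around $300,000 is roughly what you get by splitting an ~800GB model across 80GB GPUs; the per-card price here is an assumption for illustration, not a quoted vendor figure.

    # Rough per-instance hardware cost (illustrative assumptions:
    # 80GB of VRAM per GPU, ~$30k per card -- not quoted prices).
    import math

    VRAM_NEEDED_GB = 800
    GPU_VRAM_GB = 80
    GPU_PRICE_USD = 30_000

    gpus_needed = math.ceil(VRAM_NEEDED_GB / GPU_VRAM_GB)   # 10 cards
    hardware_cost = gpus_needed * GPU_PRICE_USD             # ~$300,000

    print(f"{gpus_needed} GPUs -> ~${hardware_cost:,} before servers, networking, power")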