This is something I don't understand: how will all these AI-as-a-service companies survive when hardware gets better and people are able to run LLMs locally? Right now the rent-vs.-buy equation is heavily tilted toward rent, but eventually I could see people buying a desktop they keep at home and running all their personal inference on that one machine. Or even forming inference pools to distribute load among many people.
Do you think this is possible, and what are these companies' plans in that event?
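For scale, the rent-vs.-buy claim can be sketched with a back-of-envelope comparison. All figures below (subscription price, hardware cost, power draw, electricity rate, usage) are purely illustrative assumptions, not real prices:

```python
# Back-of-envelope rent vs. buy comparison.
# Every number here is a hypothetical assumption for illustration.

MONTHS = 36  # assumed useful life of a local inference box

# Rent: a hypothetical $20/month subscription
rent_total = 20 * MONTHS

# Buy: a hypothetical $2500 desktop with a capable GPU,
# drawing ~300 W for 4 hours/day at $0.15/kWh
hardware = 2500
kwh_per_month = 0.3 * 4 * 30  # kW * hours/day * days/month
electricity = kwh_per_month * 0.15 * MONTHS
buy_total = hardware + electricity

print(f"rent over {MONTHS} months: ${rent_total:.0f}")
print(f"buy  over {MONTHS} months: ${buy_total:.0f}")
```

Under these made-up numbers the subscription comes out far cheaper over three years, which matches the "tilted toward rent" observation; the crossover only happens if hardware prices fall sharply or usage is heavy enough to blow past flat-rate subscription limits.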
On the business side, to a first approximation, no one runs their own computers in their office. They use either a colo or a cloud service.
Besides, AI accessed through an API key is not a product that most people buy. They buy products that use AI, like ChatGPT. And OpenAI has no illusions about becoming profitable on $20/month subscriptions, let alone advertising.
The money that AI companies make from selling API access directly (except maybe Anthropic via Claude Code) pales in comparison to what they make selling through cloud providers who then sell to businesses.
> I could see people buying a desktop they keep at home, and having all their personal inference running on that one machine. Or even having inference pools to distribute load among many people
Yes I’m sure my 80 year old mother who uses ChatGPT is going to get together with her sisters and buy computers that they can network together over their 30 Mbps uplink cable modem…
This is so much not how normal people operate.