Since most applications aren't latency sensitive, space and power can be nearly free if you set up the data center somewhere cold, with nearly free electricity and few people around. That leaves you with infrastructure and connectivity costs, but I'd guess electricity prices shouldn't be the issue?
I'd think the cost of internet would be the big issue, even if you can afford the AI hardware.
In rural or low-population areas it takes forever for fiber to roll out, and if you're selling access to your hardware infrastructure you really want a direct connection to the nearest IX so you can offer customers the best speeds for accessing data. The IX would probably be one of the few places where you could get 400G or higher direct fiber. But if you're hooking up to an IX, chances are you're not an end user but an autonomous system, already signing NDAs with the other autonomous systems in the exchange so you can peer and announce routes over BGP.
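For anyone who hasn't touched this stuff: "announcing" just means telling your BGP peers which IP prefixes your AS originates, so they know to route traffic for those addresses toward you. Here's a rough conceptual sketch in Python of what one announcement carries; the field names, ASN, and addresses are illustrative placeholders, not a real BGP implementation (real deployments run something like BIRD or FRR):

    from dataclasses import dataclass

    # Conceptual sketch of the information a BGP route announcement carries.
    # Purely illustrative; not an actual BGP speaker.

    @dataclass
    class BgpAnnouncement:
        prefix: str          # IP block you claim reachability for
        origin_asn: int      # your autonomous system number
        as_path: list[int]   # ASes the route has traversed (loop prevention)
        next_hop: str        # where peers forward traffic for this prefix

    # Hypothetical datacenter AS announcing its block to an IX peer,
    # using documentation address ranges and a private-use ASN:
    announcement = BgpAnnouncement(
        prefix="203.0.113.0/24",
        origin_asn=64512,
        as_path=[64512],
        next_hop="192.0.2.1",
    )
    print(f"AS{announcement.origin_asn} announces {announcement.prefix} "
          f"via next hop {announcement.next_hop}")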
(Source: my old high school networking class, where I got sick of my shitty internet and looked into how I could get fiber from an exchange. I'm probably mistaken on some of this; it was years ago and may well be wrong or outdated by now.)
Assuming rural areas have less fiber availability isn't always a good assumption.
In NW Washington state at least, the rural counties (Whatcom, Island, Skagit, etc.) have had a robust market in dark fiber for over two decades.
The normal telcos weren't responsive to need, so private carriers picked up the slack. When I was last involved in this market, you could get a P2P strand, including reasonable buildout, for less than the cost of a T1 line with a two-year commit.
The tiny four-branch credit union I worked for had dedicated fiber loops between all our locations, no big deal. It was great.
Ambient cooling can only go so far. At the end of the day, if you have a rack of GPUs drawing 6,000 watts per node, you're going to need some very serious active cooling regardless of your location. You'll save a little, but it's a small percentage of your overall costs.
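To put rough numbers on it (the nodes-per-rack figure and the allowed air temperature rise here are assumptions for illustration):

    # Back-of-envelope rack heat load and required airflow.
    # Node count, power draw, and temperature rise are assumed for illustration.

    nodes_per_rack = 8
    watts_per_node = 6_000
    heat_load_kw = nodes_per_rack * watts_per_node / 1_000   # 48 kW per rack

    # Air removes heat at Q = rho * cp * flow * dT, so required volumetric flow:
    rho_air = 1.2      # kg/m^3 at roughly 20 C
    cp_air = 1.005     # kJ/(kg*K)
    delta_t = 15       # K rise from inlet to exhaust (assumed budget)

    flow_m3_s = heat_load_kw / (rho_air * cp_air * delta_t)
    flow_cfm = flow_m3_s * 2118.88   # 1 m^3/s is about 2118.88 CFM

    print(f"Rack heat load: {heat_load_kw:.0f} kW")
    print(f"Required airflow: {flow_m3_s:.2f} m^3/s (~{flow_cfm:,.0f} CFM)")
    # Roughly 2.7 m^3/s (~5,600 CFM) for a single rack. Cold climate or not,
    # moving that much heat takes serious fans, containment, or liquid cooling.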
In industrial manufacturing, recovering waste heat is a very common junior-engineer task: a simple $50-100k project with a 1-2 year payback period is a classic first-year assignment for a recent grad (quick payback math below).
Surely someone in the trillion dollar datacenter industry can figure out a way to take waste heat and use it in a profitable way, right?
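The payback arithmetic on those projects is dead simple. A sketch with assumed numbers; the capex midpoint comes from the range above, but the recovered heat, run hours, and value per thermal kWh are placeholders, not real quotes:

    # Simple-payback estimate for a waste-heat recovery project.
    # All inputs besides the capex range above are assumptions.

    capex = 75_000           # USD, midpoint of the $50-100k range
    recovered_kw = 200       # thermal kW actually put to use (assumed)
    hours_per_year = 8_000   # near-continuous operation (assumed)
    value_per_kwh = 0.05     # USD per thermal kWh displaced, e.g. vs gas heat (assumed)

    annual_savings = recovered_kw * hours_per_year * value_per_kwh
    payback_years = capex / annual_savings

    print(f"Annual savings: ${annual_savings:,.0f}")
    print(f"Simple payback: {payback_years:.1f} years")
    # 200 kW * 8,000 h * $0.05 = $80k/yr, so under a year on $75k capex,
    # which lines up with the 1-2 year figure once you derate the inputs.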
I'd guess there's not enough energy density in the waste heat to do anything useful with it, especially once you move it out of the clean areas of the facility where it's produced to somewhere you could actually use it at scale.
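The physics mostly backs that up: data center exhaust is low-grade heat, and the Carnot limit caps how much of it can ever be turned back into work. A quick check, with temperatures assumed for illustration:

    # Carnot limit on converting low-grade datacenter heat to work.
    # Temperatures are illustrative assumptions.

    t_hot_c = 45    # server exhaust / return-water temperature (assumed)
    t_cold_c = 20   # ambient heat sink (assumed)

    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15

    carnot_limit = 1 - t_cold_k / t_hot_k
    print(f"Carnot limit: {carnot_limit:.1%}")   # about 7.9%
    # A real heat engine gets maybe half of that, so well under 5% of the
    # waste heat comes back as electricity. Direct uses like district heating
    # or greenhouses skip the conversion loss but need a customer next door.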