
> There is absolutely no way anyone is going to be making any money offering $2 H100s unless they stole them and they get free space/power...

At the highest power settings, a PCIe H100 consumes 350-400 W (the SXM variant can draw up to 700 W, but take 400 W for this estimate). Add another 200 W for CPU/RAM. Assume you have an incredibly inefficient cooling system, so you also need 600 W of cooling.

Google tells me US energy prices average around 17 cents/kWh - and that's what you'd pay even without locating your data centre somewhere with cheap electricity.

17 cents/kWh * 1.2 kW * 1 hour is only 20.4 cents/hour.
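
The same back-of-envelope in Python (a sketch using only the figures above):

    # Hourly electricity cost for one fully loaded H100 box
    ELECTRICITY_USD_PER_KWH = 0.17   # rough US average from above
    DRAW_KW = 1.2                    # 400 W GPU + 200 W CPU/RAM + 600 W cooling

    hourly_power_cost = ELECTRICITY_USD_PER_KWH * DRAW_KW
    print(f"${hourly_power_cost:.3f}/hour")  # -> $0.204/hour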



That's just the power. If one expects an H100 to run for three years at full load, that's 24 x 365 x 3 = 26,280 hours. Assuming a price of $25K per H100, that means about $1/h to amortize the purchase. Hence the "unless they stole them", I guess.
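
A sketch of the same amortization arithmetic:

    GPU_PRICE_USD = 25_000
    HOURS_3Y = 24 * 365 * 3          # 26,280 hours at full load

    amortization_per_hour = GPU_PRICE_USD / HOURS_3Y
    print(f"${amortization_per_hour:.2f}/hour")  # -> ~$0.95/hour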

Factor in space, networking, cooling, security, etc., and $2 really does seem undoable.


None of that matters if you already bought the H100 and have no use for it. You might as well recoup as much money as you can on it.


> You might as well recoup as much money as you can on it.

Depending on how fast their value depreciates, selling them might recoup more money than renting them out - and it avoids being exposed to three years of various risks.

Selling now at a 40% loss gets you back the equivalent of 60c/h over three years, without incurring the other costs (DC, power, network, security) or the risks.
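
Sketching that comparison (resale fraction as stated above):

    GPU_PRICE_USD = 25_000
    HOURS_3Y = 24 * 365 * 3

    sale_proceeds = 0.60 * GPU_PRICE_USD        # selling at a 40% loss
    equivalent_rate = sale_proceeds / HOURS_3Y  # -> ~$0.57/h, the "60c/h" above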


If you already have the H100s, renting access to them at a loss isn't better. Throwing them in the trash will lose you less money.


That's not how this works.

Imagine I own a factory, and I've just spent $50k on a widget-making machine. The machine has a useful life of 25,000 widgets.

In addition to the cost of the machine, each widget needs $0.20 of raw materials and operator time. So $5k over the life of the machine - if I choose to run the machine.

But it turns out the widget-making machine was a bad investment. The market price of widgets is now only $2.

If I throw the machine in the trash on day 1 without having produced a single widget, I've spent $50k and earned $0 so I've lost $50k.

If I buy $5k of raw materials and produce 25k widgets which sell for $50k, I've spent $55k and earned $50k so I've lost $5k. It's still a loss, sure, but a much smaller one.
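
The two scenarios as a quick sketch:

    MACHINE_COST = 50_000
    WIDGETS = 25_000
    MARGINAL_COST = 0.20   # raw materials + operator time per widget
    PRICE = 2.00           # market price per widget

    trash_it = -MACHINE_COST                                    # -> -$50,000
    run_it = -MACHINE_COST + WIDGETS * (PRICE - MARGINAL_COST)  # -> -$5,000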


The concept you're looking for is "marginal cost". The initial $50,000 for the machine has already been spent - the only calculation left is that each new widget costs 20 cents to make (that's the marginal cost) and generates $2.00 in revenue. At this point, making widgets is highly profitable.


And for GPUs the math is even more stark, because rather than a 25k-widget lifespan, the lifespan is the time until GPUs improve enough to make the current one irrelevant.


GGP already showed the marginal power cost is well below $2.


There is so much more to lifecycle sustainment cost than that.

Rackspace. Networking. Physical safety. Physical security. Sales staff. Support staff. Legal. Finance. HR. Support staff for those folks.

That’s just off the top of my head. Sitting down for a couple of days at the very least, like a business should, would likely reveal significant further costs that $2 won’t cover.


These are all costs of any server hosting business. Other commenters have already shown that $2/hr for a racked 1U server at 400W is perfectly sustainable.


Just because you already have all of those costs doesn't make them go away. If you're cross-subsidising the H100 access with the rest of a profitable business, that's a choice you can make, but it doesn't mean the H100s are suddenly profitable at $2: you still need the rest of the business to be profitable in order to afford losing money here.


So do you terminate all of the above right now, or continue selling at a loss (which still extends the runway) and wait for better times? Similar situations occasionally occur in pretty much every market out there.

The market doesn't care how much you're losing, it will set a price and it's up to you to take it, or leave it.


No, if it's only a "loss" due to counting amortization of the sunk cost of initial acquisition, throwing them in the trash will lose you more money. The only way to avoid that sunk cost is to travel back in time and not buy them, and, yeah, if you can do that instead, maybe you should (but the time-travel technology will make you more money than the H100s would ever cost, so maybe don't bother).


Amortization curves for GPUs are 5-7 years per my GPU-rich contacts. Even after they cease to be top of the line, they're still useful for inference. So you can halve that $1/h.


Haven’t electricity costs been increasing, though? Eventually those two curves should death-cross.


You are not looking at the full economics of the situation.

There are very few data centers left that can do 45 kW+ rack density, which translates to 32 H100/MI300X GPUs in a rack.

In most data centers you're looking at 1 or 2 boxes of 8 GPUs per rack. As a result, it isn't just the price of power; it's whatever the data center wants to charge you.

Then you factor in cooling on top of that...


For the full math one has to include the cost of infrastructure financing, which is tied to interest rates. Given how young most of these H100 shops are, I assume they pay more to service their debts than for power.


> I assume that they pay more to service their debts than for power.

Well yes, because for GPU datacentres, fixed/capital costs make up a much higher fraction of total expenses than they do for CPU datacentres - to such an extent that power usage barely even matters. A $20k GPU that uses 1 kW (which is way more than it would in reality) 24x7 would cost $1.3k per year to run at $0.15 per kWh; that's almost insignificant compared to depreciation.

The premise was that nobody could make any money renting H100s for $2 unless they stole them and got free space/power. That makes no sense whatsoever when you can get 2x AMD EPYC™ 9454P servers, at 2x408 W full-system draw, for around $0.70/h in a German data center.
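
Checking both numbers in Python (the German electricity rate below is an assumption, not from the comment):

    # Yearly power for a box drawing a constant 1 kW at $0.15/kWh
    yearly_power = 1.0 * 24 * 365 * 0.15   # -> ~$1,314, i.e. ~$1.3k/year

    # Rough margin on the $0.70/h EPYC benchmark
    epyc_draw_kw = 2 * 0.408                       # two full systems at 408 W each
    assumed_rate = 0.25                            # assumed $/kWh for a German DC
    power_per_hour = epyc_draw_kw * assumed_rate   # -> ~$0.20/h
    margin = 0.70 - power_per_hour                 # -> ~$0.50/h for everything else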



