Imagine I own a factory, and I've just spent $50k on a widget-making machine. The machine has a useful life of 25,000 widgets.
In addition to the cost of the machine, each widget needs $0.20 of raw materials and operator time. So $5k over the life of the machine, if I choose to run it.
But it turns out the widget-making machine was a bad investment. The market price of widgets is now only $2.
If I throw the machine in the trash on day 1 without having produced a single widget, I've spent $50k and earned $0, so I've lost $50k.
If I buy $5k of raw materials and produce 25k widgets that sell for $50k, I've spent $55k and earned $50k, so I've lost $5k. It's still a loss, sure, but a much smaller one.
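Here's that comparison as a quick sketch, using only the numbers above (a toy illustration, not a full accounting model):

```python
# Toy comparison of the two options, using the numbers from the post.
MACHINE_COST = 50_000     # already spent either way
PRICE = 2.00              # market price per widget
MARGINAL_COST = 0.20      # raw materials + operator time per widget
LIFETIME_WIDGETS = 25_000

# Option A: scrap the machine on day 1, earning nothing.
loss_scrap = MACHINE_COST                          # $50,000 lost

# Option B: run the machine for its full life.
revenue = LIFETIME_WIDGETS * PRICE                 # $50,000
running_costs = LIFETIME_WIDGETS * MARGINAL_COST   # $5,000
loss_run = MACHINE_COST + running_costs - revenue  # $5,000 lost

print(f"Scrap it: lose ${loss_scrap:,.0f}")
print(f"Run it:   lose ${loss_run:,.0f}")
```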
The concept you're looking for is "marginal cost". The initial $50,000 for the machine is already spent (a sunk cost). The only calculation left is that each new widget costs 20 cents to make (that's the marginal cost) and generates $2.00 in revenue, a margin of $1.80 per widget. At the margin, making widgets is highly profitable.
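The decision rule falls out directly (again just a sketch, with the thread's numbers):

```python
# The sunk cost never enters the decision; compare price to marginal cost.
price = 2.00          # revenue per additional widget
marginal_cost = 0.20  # cost per additional widget
margin = price - marginal_cost  # $1.80 of contribution per widget

# Produce as long as each extra widget more than pays for itself.
produce = price > marginal_cost  # True here, so keep the machine running
print(f"Margin per widget: ${margin:.2f}; produce? {produce}")
```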
And for GPUs, the math is even starker, because rather than a 25,000-widget lifespan, the lifespan is the time until newer GPUs improve enough to make the current one irrelevant.
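The same rule applies; only the horizon changes. A minimal sketch, where the purchase price, power cost, rental rate, and obsolescence window are all made-up illustrative numbers, not figures from the thread:

```python
# All numbers here are assumptions for illustration only.
gpu_cost = 30_000              # sunk: already paid, irrelevant to the decision
power_per_hour = 0.50          # marginal cost to keep the card running ($/h)
rental_per_hour = 2.00         # what the market currently pays ($/h)
hours_until_obsolete = 20_000  # assumed window before newer GPUs kill demand

# Keep renting the card out as long as each hour earns more than it costs;
# obsolescence just caps how many such hours remain.
if rental_per_hour > power_per_hour:
    recovered = (rental_per_hour - power_per_hour) * hours_until_obsolete
    print(f"Run it: ${recovered:,.0f} recovered against ${gpu_cost:,} sunk")
else:
    print("Shut it down: each hour loses money at the margin")
```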