Training takes a few months, but I expect they'll do extensive testing before releasing it to the public.
I also believe they will delay the release of GPT-5 as long as possible, because it will be underwhelming (at least compared to the GPT-3.5 hype). They may time the release to coincide with a new release from Google, their main competitor.
They are the main driver of a bubble that has greatly benefited Microsoft, NVidia, and other hyperscalers, and if they release a model that shows we're in the "diminishing returns" phase, it will crash a big part of the industry, not least NVidia.
Companies are buying H100s and investing in expensive AI talent because they believe progress will continue quickly; if progress stalls for LLMs, there will be a huge drop in sales and CAPEX across the industry.
There are still many up-and-coming projects that rely on NVidia hardware for training, like Tesla's Autopilot and others, but the bulk of H100 investment in recent years has been driven by LLMs.
Also, all the new AI talent will move on to something new, and hopefully we'll see more discoveries and potential uses, but we're definitely at peak LLM.
(ps: just my opinion)