When fusion arrives in the next 40 years: "Lithium from sea water would last 60 million years, however, and a more complicated fusion process using only deuterium would have fuel for 150 billion years." [calculated at 1995 global power output]

https://en.wikipedia.org/wiki/Fusion_power#Energy_source
Why will fusion arrive in the next 40 years? I feel like that's what was stated 40 years ago too. Are there 0 barriers at this point and it's just a matter of obtaining funding and starting to build plants and by then there will be a big enough track record for widespread deployment?
Despite this attitude being memed, multiple efforts to refine fusion reactors to the point they are useful for energy purposes have made fairly constant (albeit slow and incremental) progress over the years. Furthermore, all the science and engineering concepts are sound based on the physics models.
One thing people don't get is that we only need a tiny fraction of net positive output, because of the insanely large amount of fuel available to us. 2-3% net positive output is all we're talking about, and that will be enough to economically replace most other energy sources (though it will take time to ramp up, of course).
Basically it's just a matter of time and continued advances in materials science. Purity of materials and perfection of manufacturing are the primary barriers. Operational consistency is a secondary one (and continued improvement in microcontroller tech and data analysis/AI/etc. will help here).
One example: in focus fusion, progress has definitely been incremental, but at basically every step it has been hindered by exactly the same problem: the materials and apparatus have to be nearly perfect to increase plasma duration sufficiently and to avoid contamination of the self-sustaining plasmoid. And indeed, progress on that front has been continuous. Once it's proven, it will be a matter of perfecting the manufacturing process, automation, etc. to ensure that degree of material quality and device engineering, and voila, we'll have cheap scalable power.
Based on what I've been watching for the last 15 years, I think within 40 years is quite reasonable. In fact, I think it would not be a surprise to see consistent 0.5% net positive or better in the next 5-10 years, whether it is focus fusion or tokamaks or whatever.
TL;DR: Fusion is a 100% sound energy generation avenue that is simply waiting for its "welding" moment, by which I mean the point where gas production suddenly became economically viable due to the invention of welding (which was motivated at least partly by the desire to improve gasoline production).
That is certainly an exaggeration on the low end. By some estimates we don't have more than about 100 years of nuclear power. It's not really beneficial to give these wildly overoptimistic estimates of recoverable resources.
Seems like there’s no trade-off to over-engineering health and safety in this context. An N100 would be adequate, but the P100 additionally provides a broader spectrum of air pollution exposure reduction, which in the context of infection control helps protect the mucosal barriers from damage and preserve immune fitness, preventing the T-cell depletion caused by inhaling vehicular pollution, industrial/commercial combustion emissions, smoke, and so on.
Ableton Live Suite. Started on 9, went to 10, I’m now beta testing 11. After adjusting / finding a workflow I really like it. Picked originally based on recommendations and compatibility with other members of my band. I have written Max4live plugins. Also written synths in NI Reaktor.
Had to be Win and Mac (some platform freedom) due to availability of high-end VSTs I wanted and now have (FabFilter, Serum, Massive, etc).
I also use ChucK for some sample development programming.
This looks really neat but I unfortunately don't have the necessary background knowledge to understand what this is even visualizing. Is there a tutorial to help acquire the necessary math/background knowledge to be able to comprehend what's being modeled?
For example, what are "Power", "Alpha", "n", and "d"?
Also "Type I" and "Type II" errors? What's the intuition as to how these all relate each other?
Is there a blog post or chapter that explains all of this? Or would this require significant learning, i.e. a full semester course or a textbook?
I'm going to try my best to explain these in an intuitive way. Statistics has lots of terms with names that are arbitrary and confusing.
To set the context, we are trying to use data to help us test a hypothesis. An example might be: "if we give this pill to a person, they will be cured of their disease". Statisticians test this by setting up two groups: Group A gets nothing (or a placebo), Group B gets the pill.
In statistics, you assume the "Null Hypothesis", in other words, that there is no difference between the two groups. You use hypothesis testing to help you "reject" the null hypothesis, to say that the groups are actually different. If the groups are different, that means the pill cures the disease. So we take a bunch of data about the two groups, run some math on that data, and use the result of that math to help us decide if we can reject the null hypothesis.
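To make the "run some math" step concrete, here's a minimal sketch in Python (my own illustration, not something from the linked visualization; the cough counts and sample sizes are invented) of comparing the two groups with a two-sample t-test:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    group_a = rng.normal(loc=10.0, scale=3.0, size=50)  # placebo: ~10 coughs/day
    group_b = rng.normal(loc=8.0, scale=3.0, size=50)   # pill: ~8 coughs/day

    # The t-test asks: how surprising would a difference this large be if the
    # null hypothesis (no difference between groups) were true?
    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    # A small p-value (commonly < 0.05) is taken as grounds to reject the null.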
Statistics is a bunch of tradeoffs between certainty, making the wrong call, and data volume. The terms you have mentioned are either "knobs" (tradeoffs) we can turn or measures that help us understand our results.
Here's what those terms mean:
Type 1 Error: also known as "False Positive". You thought the pill cured the disease, but it does not.
Type 2 Error: also known as "False Negative". You thought the pill did nothing, but it actually works.
Power: the chance of avoiding a Type 2 error (false negative). The higher your power, the lower the chance you incorrectly conclude your pill is ineffective.
N: The number of "observations", in our case, the number of patients in the trial for our pill.
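A quick simulation can tie power, n, and the two error types together. This is just a sketch I'm adding for intuition (invented effect sizes, normal data, a plain t-test), not necessarily how the visualization computes things:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    alpha, n, trials = 0.05, 50, 2000

    def rejects_null(true_effect):
        a = rng.normal(0.0, 1.0, n)            # placebo group
        b = rng.normal(true_effect, 1.0, n)    # pill group, shifted by true_effect
        return stats.ttest_ind(a, b).pvalue < alpha

    # Type 1 rate: the pill truly does nothing, but we "detect" an effect anyway.
    type1 = np.mean([rejects_null(0.0) for _ in range(trials)])

    # Power: the pill truly works (effect = 0.5 SD); how often do we detect it?
    power = np.mean([rejects_null(0.5) for _ in range(trials)])

    print(f"Type 1 rate ~ {type1:.2f} (close to alpha = {alpha})")
    print(f"power ~ {power:.2f}, so Type 2 rate ~ {1 - power:.2f}")
    # Raising n (or the true effect size) increases power, i.e. fewer Type 2 errors.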
The others are a little trickier to explain.
Alpha: Statisticians use a "confidence interval" as a way to communicate how uncertain they are about a particular result. In our trial we might say "patients were 15% less likely to have the disease after taking the pill, give or take 2%". We don't think the decrease is exactly 15% (what we observed) but is instead somewhere in that neighborhood. Alpha is a measure of the chance the real effect is OUTSIDE of your confidence interval. So in this case, the chance the effect is < 13% or > 17%.
Cohen's D: In our trial, we might measure "the number of times the patient coughed in a day" in addition to "do they have the disease anymore yes/no". In order to compare our two groups, we might look at the average number of coughs per day in group A vs group B. This is called measuring the "difference in means". Cohen's D is a formula that scales that difference in means by the spread (standard deviation) of the data, so effect sizes can be compared across measurements.
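If it helps, Cohen's d is easy to compute by hand. A minimal sketch with made-up cough counts, using the usual pooled-standard-deviation form:

    import numpy as np

    rng = np.random.default_rng(2)
    coughs_a = rng.normal(10.0, 3.0, 50)   # group A (placebo)
    coughs_b = rng.normal(8.0, 3.0, 50)    # group B (pill)

    def cohens_d(a, b):
        na, nb = len(a), len(b)
        pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)

    print(f"Cohen's d = {cohens_d(coughs_a, coughs_b):.2f}")
    # Common rules of thumb: ~0.2 "small", ~0.5 "medium", ~0.8 "large".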
> Alpha: Statisticians use a "confidence interval" as a way to communicate how uncertain they are about a particular result. In our trial we might say "patients were 15% less likely to have the disease after taking the pill, give or take 2%". We don't think the decrease is exactly 15% (what we observed) but is instead somewhere in that neighborhood. Alpha is a measure of the chance the real effect is OUTSIDE of your confidence interval. So in this case, the chance the effect is < 13% or > 17%.
I know this sounds intuitive, but it is wrong.
The true effect is not a random variable.
The random variable is the statistic.
When we say "95% confidence interval", we are referring to the fact that, over repeated sampling, 95% of the confidence intervals constructed this way will contain the true effect, not the chance that the true value is in the specific confidence interval you constructed.
Edit: The latter is either 0 or 1 but you don't get to find out in the context of a single test.
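You can see this frequentist meaning directly by simulation. A rough sketch (my own, assuming normal data and a t-based interval for a mean): build many 95% CIs from independent samples and count how many contain the true value.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    true_mean, n, reps = 5.0, 40, 5000

    hits = 0
    for _ in range(reps):
        sample = rng.normal(true_mean, 2.0, n)
        half_width = stats.t.ppf(0.975, df=n - 1) * sample.std(ddof=1) / np.sqrt(n)
        hits += (sample.mean() - half_width <= true_mean <= sample.mean() + half_width)

    print(f"coverage ~ {hits / reps:.3f}")  # close to 0.95
    # Any single interval either contains the true mean or it doesn't (0 or 1);
    # "95%" is the long-run fraction of such intervals that do.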
I absolutely loathe statistics terminology. It’s such a roadblock for people and an example of erudite traditionalism not being challenged enough.
(although the current terms in question are very mild and not knowing them probably speaks to how the vocab in general blocks people from learning stats rather than these particular terms doing it)
It’ll probably require 2-3 semesters of stats to really understand - most intro courses barely cover the basics and don’t even reach power. You really have to apply regression to many real datasets to truly understand the concepts.
Yep, sales engineer. I was an infrastructure consultant and SRE before that.
It happened through meeting the sales teams of the vendors I worked with. Eventually one of them asked me if I wanted to apply for a sales engineer role. He then referred me and vouched. Always be nice to your vendors and partners.
Without a doubt. For comparison a 13” MacBook Pro can drive two 4K displays plus the internal display at 2880x1800, all at 60Hz, from an integrated GPU.
You can probably run as many 4K monitors as the 1650 has video ports.