Hacker News

I personally do not like how his solution boils down to "just learn more," which may work at the individual level, but not as a general solution to awful software.

You will never be able to force developers worldwide to start writing everything in C/Assembly, or even to care beyond "it performs fine on my machine". Individuals can have fun micro-optimizing their applications, but overall, we have the software we have because of compromises we find somewhat acceptable.

More likely, the solution will be technical: making great/simple/efficient code the path of least resistance.



Watch some of the intros to his performance aware programming videos. He doesn’t want everyone to use C or Assembly. He also doesn’t want everyone micro-optimizing things.

>compromises we find somewhat acceptable

His entire point is that most developers aren’t actually aware of the compromises they are making. Which is why he calls it “performance aware programming” and not “performance-first” programming.


I have actually watched many of his videos, and as an individual I very much like his advice. What I am saying, however, is that this has nothing whatsoever to do with improving software at scale.

But my point still stands: Casey focuses solely on the cultural aspect but completely ignores the technical one. He says that developers became lazy/uninformed, but why did that happen? Why would anything he currently says solve it?


I don’t think you can blame him for not having a large scale systemic solution to the problem.

Imagine if there was a chef on YouTube telling everyone how bad it is that we are eating over-processed food, and he made videos showing how easy it is to make your own food at home.

Would it be reasonable to comment to people who share the chef's message that you don't like how his cooking videos don't solve the root problem, that processed food is too cheap and tasty?

And here’s the thing: he doesn’t have to solve the whole problem himself. Many developers have no idea that there’s even a problem. If he spreads the message about what’s possible and more developers become “performance aware,” maybe it causes more people to expect performance from their libraries, frameworks, and languages.

Maybe some of these newly performance aware programmers are the next generation of language and library developers and it inspires one of them to create the technological solution you’re hypothesizing.


But he says that he has a large-scale systemic solution, it is literally the reason for his videos.

He isn't doing it so I can get my CPU trivia, but because he believes it will result in better software. He's been at it for years, and so far I don't think things are getting better.

My opinion is that better software will not come from CPU-aware people, but from people having had enough of all the BS and switching to simpler solutions. You speak about languages/libraries; what makes you think it would come from those?

I do believe he is smart; it's just unfortunate that he puts his talent on the wrong path. It's not like he does not understand the impact of simplicity/working independently, as that is how he made his "fast terminal" showcase: by trying to bypass system libraries as much as possible, turning it into a simple (or at least less theoretical) problem.


This is such an absolutely wild take to me. Screw those Mythbusters guys trying to explain the scientific method to people. That's not a systemic solution to fixing scientific illiteracy in the US. Sesame Street teaching kids to count? Those guys are way off the mark because it's not going to measurably improve economic performance. You'll never actually change anything just by teaching people stuff.

I'm 100% certain that Casey Muratori doesn't think that his paywalled course is going to convince literally every programmer to care about performance, or to stop Facebook from building slow apps.

He's trying to convince some small percentage of people that performance is something they should be thinking about when they program, and that they should understand the tradeoffs they are making when they choose to use some slow method, algorithm, library, language, or system because it provides some other benefit.

Suppose someone writes a book like "math for dummies" that says on the back cover, "I think basic math is a useful skill for everyone to have, and if we all learned basic math the world would be a better place!"

That's not the author stating their true belief that their book is going to literally teach the majority of the world basic math. That's an aspirational goal or in the worst case it's sales copy.

Publicly calling that person out for wasting their time is really something else.

Creating straw men to attack like "You will never be able to force developers worldwide to start writing everything in C/Assembly" when he explicitly states that this isn't his goal makes me think you're just looking for some reason to shit on the guy.

>You speak about language/libraries, what make you think it would come from those?

I have no idea when, if, or how a technical solution will be found.


"Math for dummies" book writers don't argue that mathematicians are stupid/incompetent because they do not spend enough time on the basics.

It also has nothing to do with teaching people to count, to read, or the scientific method. Every. Single. Piece of software I use is bloated. I am unaware of any software consuming only what it should; nothing is close to the theoretical limit. This is not a knowledge issue. If you have such software in mind, feel free to name it.

Casey continuously says that the problem with software is that developers do not understand performance, and so that the problem will be solved once they do. I am not overly familiar with his paid course, but he did stuff before that. He has essentially the same opinion as Jonathan Blow.

The course being paywalled isn't really an argument for either of us; on his side it is most likely a compromise. Just because this isn't the grand plan to convince millions of developers doesn't mean he doesn't intend for it to help his cause.

Casey believing the problem at scale to be cultural isn't some guess, I actually asked him: https://x.com/cmuratori/status/1687138791356833793

> that this isn't his goal makes me think you're just looking for some reason to shit on the guy.

You can consider it hyperbole if you want, but essentially what I am saying is that he wants to put the burden on the developers, without really changing the underlying structure.

I am not shitting on him for the sake of it. I really do like his (and Jonathan's) observations, and I will continue to watch them. I just find it unfortunate that both waste their time on solutions that will not affect anything at scale (one with a custom language, the other with a specialized course).


>"Math for dummies" book writers don't argue that mathematicians are stupid/incompetent because they do not spend enough time on the basics.

His argument isn't that they don't spend enough time on the basics. His argument is that they don't know the basics even exist.

This is true in my experience. I've interviewed scores of candidates over the years who just have absolutely no mental model of what is going on under whatever abstraction layer they generally work in.

Making tradeoffs to sacrifice performance for things like developer speed is fine. But if you're not aware you are making those tradeoffs, I might not call you stupid, but I will call you ignorant.

>Every. Single. Software I use is bloated, I am unaware of any software solely consuming that it should, nothing is close to the theoretical limit, this is not a knowledge issue.

It's not binary. There's a continuum. No software is ever going to use exactly the minimum theoretically possible number of instructions and memory over the entire program.

But some software is worse and some is better.

It is possible to believe that the problem can be solved by changing the culture, while simultaneously believing that you aren't going to be able to completely change the culture. While also believing that you can shift the culture ever so slightly such that you make it just a little bit better.

You can also hold all of the above beliefs while believing that there is a systemic underlying issue that you have no idea how to solve.

From my observations (having watched a good bit of early handmade hero and a good bit of his paid course) the above is pretty close to his beliefs.

It's also fairly close to my own thoughts on the subject. The underlying issue is that software is an industry that primarily competes on features, not quality. I think the reasons for that are numerous--everything from software being an immature industry, to the distorting effects of the cheap money that has flowed into the industry over the previous few decades.

I believe this will work itself out to some degree over time as the industry becomes more mature and we start to see more diminishing returns for just adding new features. Developers will start to look towards other things to differentiate themselves and quality/performance could be one of them.

I don't think there's anything I or Casey can do to change the market dynamics, and I can't think of a technical fix to this issue. I suspect Casey can't either.

I also think that there are niches today where performance and quality can outcompete more features.

Given all the above, I think pushing to convince a small percentage of people that spending a bit more time thinking about performance is a rational goal. Maybe it's the best you can do. But maybe you end up with just a few less wasted cycles in the world. Maybe you end up with a few more pieces of software that feel snappy and pleasant to use than you otherwise would have.


I completely agree that some (if not the majority of) developers aren't aware of what's happening under the abstraction layers, but then I have to ask: is it the developer's fault, or the abstraction's?

If you were to ask a new programmer to make a very simple calculator which would then be distributed to various people using various devices, what would they use? How long would it take them? How much would it consume? Does this cost have anything to do with the programmer being unaware that CPUs have multiple cores or that memory access is slow? Theoretically, I would struggle to find a way to make this calculator take up more than 10MB of memory (which is already more than Mario 64), both as a CLI and as a GUI. You literally have 4-byte instructions for add/sub/mul/div and a framebuffer; it is not like I am talking about micro-optimization. This should be the default and simplest path.

Discord takes around 400MB on my machine and will happily take a whole 3GHz+ core from me if I start scrolling. If I were to give a new programmer a method to query messages, and another to render to a framebuffer through software rendering, would they even manage to match Discord/Chromium's bloat? It seems to me it would require some genuine effort.

You could tell me that this bloat comes from fancy/convenient features, but it does not change the fact that programmers are always exposed to complex problems, even ones that are theoretically easily composable and therefore friendlier to change.

If you were to ask me for less theory and a more practical example, I would say that each program should be written/compiled to a potentially different format, with the common point of being easily interpretable (stack machine, Turing machine, lambda calculus, cellular automata). Each platform (OS, hardware) should come with a program that exposes every single potential IO action, and running an actual program requires finding an interpreter for the specific format (either made yourself in an hour at most, or downloaded) and mapping your exposed IO calls through the platform app, in the same way you would configure keybinds in a video game.

- Developers are always exposed to their lowest layer

- Programs are always cross-platform, even to future ones

- Programs are stable, increasing the chance of being optimized over time

- Heavily limit dependencies (and therefore additional abstractions)

In this example, none of it depends on understanding how CPUs work. It also does not require a change in market dynamics: individuals can start, and slowly make it gain relevance, as anything written this way cannot (easily) break and become abandonware.

Ultimately, even unaware programmers are able to explain their webapp in a few words; the problem is that they cannot easily map those words to code, and have to work around frameworks which they cannot really give up on. Android/iOS hello-world templates illustrate it nicely.


I 100% agree that you don't necessarily need to know how a computer works to make a less bloated discord app.

But I do think that knowing the limit of what is possible with respect to performance makes you appreciate just how bloated and slow something like Discord is. If you've only ever done web development or Electron apps, and you've only ever used bloated apps, you have no idea that you can do better.

You don't have to teach people about the limits. You could build a Discord client on top of "a method to query messages, and another to render to a framebuffer through software rendering" and show them how much faster it could be and how much less memory it could use.

But the thing is Casey likes teaching at a lower level than that. He likes teaching about the domain he finds interesting and that he's an expert in. He thinks he can make the world more like the world he wants to live in, and he thinks he can do it by doing something he likes doing and is capable of doing.

That doesn't mean there aren't other faster or more impactful solutions to less bloated software, but he's not necessarily the guy to work on those solutions, and there's nothing wrong with that. Judging by the number of people who watch his videos and the frequency with which the topics he promotes are discussed, he's making an impact--maybe not the optimal impact he could be making, but that's hardly a fair thing to judge someone on.

> Each platform (OSes, hardware) should come with a program that expose every single potential IO action, and to run an actual program requires finding an interpreter for the specific format (either made yourself in an hour at most, or downloaded) and mapping your exposed IO calls through the platform app, in the same way you would configure keybinds in a video game.

Have you looked at the Roc language? It has a lot of similarities to what you're describing.


> You don't have to teach people about the limits, you could build a discord client on top of a "a method to query messages, and another to render to a framebuffer through software rendering" and show them how much faster it could be how much less memory it could use.

The problem is exactly that this is not that simple. I believe there is a huge gap between the theory of a chat application and the way they are implemented now. I do not believe it is about better abstraction layers; personally, I have always found explanations using raw sockets and bits the simplest.

If you were to ask a programmer to write a Discord client, you would very likely end up with a giant mess that somehow breaks a few months later at most (if ever), even though they would be able to flawlessly walk through the program flow step by step if you asked.

Discussions about efficient code kind of avoid the topic. I don't believe we are in a situation where multithreading is relevant; the bloat is on another level. The bloat is generally not related to whatever language you use or the O(1)-vs-O(n) algorithm you choose, but more likely to the overall control flow of the program, which you already understand in your head but cannot figure out how to act on, and to the inability to make simple problems simple to solve (like the calculator, a text editor, or even games).

Now, you are probably right about Casey ultimately doing what he likes, and even if suboptimal, it makes other people aware. Although I would believe that this benefit comes somewhat in spite of himself.

> Have you looked at the Roc language? It has a lot of similarities to what you're describing.

Gave it a look and unfortunately, I don't think I have seen the similarities? It has environment/IO access, can call host-language functions, and doesn't really strive to be reimplemented by independent people.


>Gave it a look and unfortunately, I don't think I have seen the similarities?

I was specifically thinking about this https://www.roc-lang.org/platforms

I'm not an expert. I haven't actually used it, but I have been following it a bit. Roc splits things into apps and platforms. When you implement a platform you're responsible for implementing all (or the subset you want to provide) of the I/O primitives and memory allocators.

>Discussions about efficient code kind of avoid the topic, I don't believe we are in a situation where multithreading is relevant, the bloat is on another level

Casey does talk about other kinds of bloat--things like structuring code so blocking operations can be batched, etc.


The real problem is that we have a broken, anti-performance culture. We have allowed "premature optimization is the root of all evil" to morph into "optimization is the root of all evil." That single quote has done untold damage to the software industry. We don't need to pass laws to force all programmers worldwide to do anything; fixing our culture will be enough. In my view, that's what Casey is trying to do.


I don't believe this is the root cause: computers got faster, and software quickly settled into the state of "runs good enough". I'm calling Wirth's law on it.

"Clean code" is indeed often a bad idea, but you are overestimating its impact. Even software written by people who care very much about performance consumes way more than it theoretically should.

Plus, if it were that simple, people would have already rewritten all the bad software.

Your message is exactly the reason why I do not like Casey: he is brainwashing everyone into thinking this is a culture problem, while nobody tries to solve it technically.


The free market is preventing technical solutions. People generally buy based on features first and everything else second. This allows for a precarious situation in the software market: the company producing the most bloat the fastest wins the biggest market share and sees no need to invest in proper fixes. Everyone who cares about software quality too much gets outcompeted almost immediately.

And since software can be rebuilt and replicated at virtually zero cost, there is no intrinsic pressure to keep unit costs down, as there is in manufacturing, where it tends to keep physical products simple.


It doesn't have to come from the free market; FOSS is hardly exempt from awfully slow/unstable software. Nobody has yet figured out how to make writing good software the default/path of least resistance.


Wirth's law doesn't bolster your point. He observed that software is getting slower at a more rapid rate than computers are getting faster. Which is the whole point. We write increasingly slow, increasingly shitty code each year. I read and hear this attitude all the time, basically "if you optimize something, you're bad at your job; only juniors try to do that". That's a culture problem.

It's frankly insulting you think Casey brainwashed me into this stance, when it's been obvious to me since long before I'd ever heard of him. IDGAF if code is clean or not. I care that Jira can display a new ticket in less than 15 seconds. I care that vscode actually keeps up with the characters I type. None of this software is remotely close to "runs good enough".


What I am saying is that this is a natural phenomenon assuming no technical solution. People will tend to optimize their software based on the performance of their hardware.

I completely agree that many apps are horrendously slow, but given that alternatives are hard-pressed to arrive, I can only conclude they are considered "good enough" for our current tech level.

The difficulty involved in rewriting modern apps is one of the reasons I would give for slow software. You can't really complain about the number of independent web browsers once you look at the spec. Ensuring the software we use is easily reimplementable by one or a few developers in a few days would go a long way toward improving performance.

Another reason would be the constant need to rewrite working code: to work on new platforms, to support some new trendy framework, and so on. You cannot properly optimize without some sort of stability.


Jira runs good enough to get idiot managers to pay for it, which is what it's designed for. And, yeah, micro-optimising (or micro-pessimising, who knows, since the changes are usually just made on vibes anyway) random bits of code that probably aren't even on any kind of hot path, while compromising maintainability, is something only juniors and people who are bad at their job do. It's easy to forget how common security flaws and outright crashes were in the "good old days" - frankly, even today the industry is right not to prioritise performance given how much we struggle with correctness.

A lot of code is slow and could be faster, often much faster. This is more often because of people who thought they should bypass the abstractions and clean code and do something clever and low-level than the opposite. Cases where you actually gain performance on a realistic-sized codebase by doing that are essentially nonexistent. The problem isn't too many abstractions; it's using the wrong algorithm or the wrong data structure (which, sure, sometimes happens in a library or OS layer, but the answer to that isn't to bypass the abstraction, it's to fix it), and that's easier to spot and fix when the code is clean.



