Let me play devil's advocate here.
Ok, let's say you optimize that TODO list app to only use 16 MB of RAM. What did you gain by that? Would you buy a smartphone that has less RAM now?
16MB still seems massive for this kind of app. I ran Visual Studio 4, not an app, but an entire app factory, on a 66MHz 486 with 16MB RAM. And it was snappy. A TODO list app that uses system UI elements could be significantly smaller.
What do I gain if more developers take this approach? Lightning fast performance. Faster backups. Decreased battery drain => longer battery service lifetime => more time in between hardware refreshes. Improved security posture due to orders of magnitude less SLOC. Improved reliability from decreased complexity.
Easier to run your todo list at the same time as applications that need the RAM for raw function. Maybe that’s CAD, maybe that’s A/V production, maybe it’s a context window.
It’s been convenient that we can regularly throw better hardware at our constraints. But our convenience, much less our personal economics, is not necessarily what markets will optimize for, much like developers of Electron apps aren’t optimizing for user resources.
Technically no (except for the gradual performance drop they introduce, plus the occasional TPM bullshit), but of course in practice, companies see this as a choice between spending money on back-porting security fixes to a growing range of hardware, and making money by not doing that and forcing everyone to buy new hardware instead.
My "new" computer is a laptop with an 8th-gen i5 and 8 GB of RAM salvaged from the Windows 11-incompatibility heap (actually 3 of them, so I have 16 GB in it). I installed Kubuntu on it and it runs extremely well. I can even install Windows 11 in a VM if I really need it. I'll probably be buying new (old) RAM for it and maybe a bigger drive.
I’m running Windows 10 ESU on a 13 year old PC without issues. While it’s admittedly near the end of its life (mostly just due to Windows 11, though I might repurpose it for Linux), I’m expecting the next one to also last a decade or longer.
So is my wife; her laptop is still decent today, but doesn't support Win 11. I'm not worried about Microsoft as much as certain other companies killing it, similarly to how she was forced to update to Windows 10 in the first place because, one day, all of a sudden, her web browser refused to run on Windows 7.
So we can't ever escape market forces? You're right, of course: if software gets less bloated, vendors will "value-optimize" hardware, so in the end computers will keep being barely usable, as they are today.
This year's average phone is already going to have less RAM than last year's average phone - so anything that reduces the footprint of the apps (and even more importantly, websites) we're using can only be a good thing. Plus it extends the usable life of current hardware.
Sure, but the price increase will be smaller, because there's less RAM. Also, the need to keep buying new computers will decrease, because this year's computer isn't much better than last year's (but now we can run more/better software!).
Less bloat is 100% always a good thing, no matter what the market conditions are.
So is the “binary” nature of today’s switches the core objection? We routinely simulate non-binary, continuous, and probabilistic systems using binary hardware. Neuroscientific models, fluid solvers, analog circuit simulators, etc., all run on the same “binary switches,” and produce behavior that cannot meaningfully be described as binary, only the substrate is.
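The point about binary substrates producing non-binary behavior is easy to make concrete. Here's a minimal sketch (the function name and all parameter values are my own, chosen purely for illustration): an analog RC charging circuit with optional Brownian noise, a continuous and probabilistic system, stepped forward with Euler-Maruyama integration on ordinary binary floating-point hardware.

```python
import math
import random

def simulate_rc(v_in=1.0, r=1e3, c=1e-6, dt=1e-5, steps=2000, noise=0.0, seed=0):
    """Euler-Maruyama integration of dV = (v_in - V)/(R*C) dt + noise dW.

    A continuous, optionally stochastic analog-circuit model, simulated
    on a purely binary substrate.
    """
    rng = random.Random(seed)
    v = 0.0
    trace = []
    for _ in range(steps):
        v += dt * (v_in - v) / (r * c)              # deterministic RC charging
        v += noise * rng.gauss(0.0, math.sqrt(dt))  # Brownian-noise increment
        trace.append(v)
    return trace

# Deterministic run: converges to the input voltage after many time constants.
clean = simulate_rc()
# Noisy run: wanders randomly around the same trajectory.
noisy = simulate_rc(noise=0.05, seed=42)
```

The trajectory is continuous-valued, and random whenever `noise > 0`; nothing about the output is meaningfully "binary," even though every operation bottoms out in binary switches.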
I was shocked to discover how incredibly poorly IndexedDB works. I always thought it would be fast and snappy if a bit alien. But nope, it's incredibly bad!
Despite being a native browser feature it's incredibly slow, and the way it fetches records by non-primary keys forces you to either load your entire dataset into RAM at once or iterate through it record by record in a slow callback. Something as trivial as 10k records can bring your webapp to a crawl.
I've built some pretty intensive stuff in IndexedDB, and it was the only thing I've ever done using native browser features that I could get to consistently crash the browsers I tested it on (granted, this was many years ago). On top of that, the API is so ugly. I cannot believe IndexedDB won over WebSQL (when every browser already embeds SQLite). What a shame.
There are several "American" cars that would be interesting for our market, but when they talk about importing American cars, it's usually not the kind of car you Americans think of (e.g., Toyotas), so not much for us to worry about...
I think companies always prioritized their own interests.
A company can increase its profits (1) by improving their products and services, so that they'll get more customers or customers willing to pay more, or (2) by increasing how much of their revenue is profit by (e.g.) cutting corners on quality or raising prices or selling customers' personal information to third parties.
Either of those can work. Yes, a noble idealistic company might choose #1 over #2 out of virtue, but I think that if most companies picked #1 in the past it's because they thought they'd get richer that way.
I think what's happened is that, for some reason, #2 has become easier or more profitable relative to #1 over time. Or maybe it wasn't so clearly understood back then that #2 was a live option, and #1 seemed safer; but now everyone knows that you can get away with #2, so they do.
We can have stable user-friendly software. We had a nice sweet spot in the early 2000s with Windows XP and Mac OS X: stable operating systems built on workstation-quality kernels (NT and Mach/BSD, respectively), and a userland that respected the user by providing a distraction-free experience and not trying to upsell them. Users of workstations already experienced this in the 1990s (NeXT, Sun, SGI, HP, and PCs running IBM OS/2 or Windows NT), but it wasn’t until the 2000s that workstation-grade operating systems became readily available to home users, with both Windows XP and Mac OS X 10.0 being released in 2001.
We do of course still have this in modern computing with Linux/KDE. Stable, snappy, and does exactly what you ask. The computer doesn't get in your way, nor does it try to get you to do something else. It just does what you tell it to do, immediately.
Yup, desktop Linux and other FOSS systems like ReactOS and Haiku are the last bastions of personal computing that haven’t been made into platforms that nag and upsell us.
> first version of Linux - which I did not have the honor to use but I can imagine how user friendly it was
My first accounts were on Linux 1.x. It was glorious. Simple, sensible, and with manuals one command away. And it allowed you to just get things done. And there were tools. So many tools. '80s home computers, DOS crap, and Macs that couldn't even open a file if it hadn't been tagged as the property of some application... hells I would never have to be a part of anymore. Except for work and school. But for personal computing, a brighter future was coming. In the 30+ years since, I've never had to step away.
It's not impossible that an AI was asked to sprinkle in a few typos for effect, but perhaps it really is just written by a person who really loves emojis.
What an absurd take. If we use FLOPS as a crude measure, the Air would be comparable to the leading supercomputers of ~1999/2000. There are many reasons why that's a very poor comparison, but ignoring the absolute insanity of the raw compute available in a pocketable, thin, battery-powered handheld that you can buy literally this week is ridiculous. Modern smartphones are nothing short of sci-fi when compared to even recent living memory. We're simply used to them due to their sheer ubiquity.
The A19 GPU doesn't even have hardware support for FP64, which is the precision used for TOP500. No, it is not comparable to leading supercomputers of 1999/2000.