
Let me be the devil's advocate here. OK, let's say you optimize that TODO list app to only use 16 MB of RAM. What did you gain by that? Would you buy a smartphone that has less RAM now?

16MB still seems massive for this kind of app. I ran Visual Studio 4, not an app, but an entire app factory, on a 66MHz 486 with 16MB RAM. And it was snappy. A TODO list app that uses system UI elements could be significantly smaller.

What do I gain if more developers take this approach? Lightning fast performance. Faster backups. Decreased battery drain => longer battery service lifetime => more time in between hardware refreshes. Improved security posture due to orders of magnitude less SLOC. Improved reliability from decreased complexity.


16MB is less than a display buffer for a 4k display. It is never ever going to happen again just due to hardware realities.

Less RAM usage doesn't equal better performance or faster software. It might actually mean the opposite, if you're not caching things in RAM.

Easier to run your todo list at the same time as applications that need the RAM for raw function. Maybe that’s CAD, maybe that’s A/V production, maybe it’s a context window.

It’s been convenient that we can throw better hardware at our constraints regularly. But our convenience, much less our personal economics, is not necessarily what markets will generally optimize for, much like developers of Electron apps aren’t optimizing for user resources.


It’s the upgrade treadmill you would stop using, and stick to the initial entry device.

If only there wasn't a security update treadmill forcing everyone to do regular hardware upgrades.

Of course, as long as we're in the dreamland, most of these security upgrades do not actually require a hardware upgrade.

Technically no (except for the gradual performance drop they introduce, + occasional TPM bullshit), but of course in practice, companies see this as a choice of spending money on back-porting security fixes to a growing range of hardware, vs. making money by not doing that and forcing everyone to buy new hardware instead.

My "new" computer is a laptop with an 8th-gen i5 and 8 gb ram salvaged from the Windows 11-incompatibility heap (actually 3 of them, so I have 16gb in it). I installed Kubuntu on it and it runs extremely well. I can even install Windows 11 in a VM if I really need it. I'll probably be buying new (old) ram for it and maybe a bigger drive.

I’m running Windows 10 ESU on a 13 year old PC without issues. While it’s admittedly near the end of its life (mostly just due to Windows 11, though I might repurpose it for Linux), I’m expecting the next one to also last a decade or longer.

So is my wife; her laptop is still decent today, but doesn't support Win 11. I'm not worried about Microsoft as much as certain other competitors killing it, similarly to how she was forced to update to Windows 10 in the first place because, one day, all of a sudden, her web browser decided to refuse to run on Windows 7.

We can't ever escape market forces? You're right: of course, if software gets less bloated, vendors will "value-optimize" hardware, so in the end computers keep being barely usable, as they are today.

This year's average phone is already going to have less RAM than last year's average phone - so anything that reduces the footprint of the apps (and even more importantly, websites) we're using can only be a good thing. Plus it extends the usable life of current hardware.

Sure, but the price increase will be smaller, because less RAM. Also, the need to keep buying new computers will decrease, because this year's computer isn't much better than last year's (but now we can run more/better software!)

Less bloat is 100% always a good thing, no matter what the market conditions are.


It would be nice for browser tabs and apps to reload less often.

Back in the day, when you saw as many ads or popups as some websites show today, it usually meant you had at least 3 viruses on the computer.

So is the “binary” nature of today’s switches the core objection? We routinely simulate non-binary, continuous, and probabilistic systems using binary hardware. Neuroscientific models, fluid solvers, analog circuit simulators, etc., all run on the same “binary switches,” and produce behavior that cannot meaningfully be described as binary, only the substrate is.
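
As a toy illustration of that point (mine, not the commenter's): a handful of floating-point operations running on "binary switches" already yields continuous, probabilistic behavior, e.g. a crude leaky integrate-and-fire neuron driven by noisy input (arbitrary parameters, purely illustrative):

    // Toy model: continuous + probabilistic dynamics on binary hardware.
    function simulateNeuron(steps: number, dt = 0.1): number[] {
      const tau = 10;      // membrane time constant
      const threshold = 1; // spike threshold
      let v = 0;           // membrane potential (a continuous quantity)
      const spikeTimes: number[] = [];
      for (let t = 0; t < steps; t++) {
        const input = 0.15 + 0.05 * (Math.random() - 0.5); // noisy drive
        v += dt * (-v / tau + input);                       // Euler step of dv/dt = -v/tau + I(t)
        if (v >= threshold) {                               // spike, then reset
          spikeTimes.push(t * dt);
          v = 0;
        }
      }
      return spikeTimes; // the behavior isn't meaningfully "binary"; only the substrate is
    }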


Binary logic (aka a computer) can be used to model or simulate anything that has a clear mathematical definition.

Currently, "intelligence" is lacking a clear mathematical definition.


> Currently, "intelligence" is lacking a clear mathematical definition.

On the contrary, there are many. You just don't like them. E.g. skill of prediction.
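
One common way to make "skill of prediction" concrete (my gloss, not the commenter's) is average log-loss over observed outcomes: the lower the loss, the better the predictor.

    // Average log-loss of a predictor over a sequence of observed outcomes.
    // probs[i] is the probability the predictor assigned to the outcome that actually occurred.
    function averageLogLoss(probs: number[]): number {
      const total = probs.reduce((sum, p) => sum - Math.log(p), 0);
      return total / probs.length; // lower = better predictive skill
    }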


Yes, I don't like them --- because they only offer language prediction --- not to be confused with intelligence.


What is the advantage of using something like this instead of the IndexedDB browser feature?


You get all the features of postgres.


Run your backend tests against this in memory, and tests can run in parallel instead of using a single real Postgres instance.
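
The thread doesn't name the project here, but assuming it's something like ElectricSQL's PGlite, a rough sketch of the per-test in-memory idea could look like this (treat the exact API as an assumption, not the library's documented interface):

    import { PGlite } from "@electric-sql/pglite"; // assumed package name

    // Each test spins up its own throwaway in-memory Postgres, so tests can run
    // in parallel without sharing or resetting one real Postgres instance.
    async function withFreshDb<T>(fn: (db: PGlite) => Promise<T>): Promise<T> {
      const db = new PGlite(); // no data directory given => in-memory
      try {
        await db.exec("CREATE TABLE todos (id serial PRIMARY KEY, title text NOT NULL)");
        return await fn(db);
      } finally {
        await db.close();
      }
    }

    // Usage in a test:
    // await withFreshDb(async (db) => {
    //   await db.query("INSERT INTO todos (title) VALUES ($1)", ["write tests"]);
    //   const { rows } = await db.query("SELECT count(*)::int AS n FROM todos");
    //   // expect rows[0].n === 1
    // });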


I was shocked to discover how incredibly poorly IndexedDB works. I always thought it would be fast and snappy if a bit alien. But nope, it's incredibly bad!

Despite being a native feature of the browser it's incredibly slow, and the way it fetches records based on non-primary keys forces you to either load your entire dataset into RAM at once or iterate through it record-by-record in a slow callback. Something as trivial as 10k records can bring your webapp to a crawl.
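
To make that concrete (illustrative only; the "todos" store and "status" index names are hypothetical), this is roughly what reading by a non-primary-key index looks like: either index.getAll(), which materializes the whole result set in RAM, or a cursor that hands you one record per async callback.

    // Walking an IndexedDB index with a cursor: one record per event-loop callback.
    function readTodosByStatus(db: IDBDatabase, status: string): Promise<unknown[]> {
      return new Promise((resolve, reject) => {
        const results: unknown[] = [];
        const store = db.transaction("todos", "readonly").objectStore("todos");
        const request = store.index("status").openCursor(IDBKeyRange.only(status));
        request.onsuccess = () => {
          const cursor = request.result;
          if (cursor) {
            results.push(cursor.value); // one record per callback
            cursor.continue();
          } else {
            resolve(results); // cursor exhausted
          }
        };
        request.onerror = () => reject(request.error);
      });
    }

At 10k+ records, those per-record callbacks (or the all-at-once getAll) are where the crawl comes from.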


I've built some pretty intensive stuff in indexeddb and it was the only thing I've ever done, using native browser features, that I could get to consistently crash the browsers I tested it on (granted, this was many years ago). On top of that, the API is so ugly. I cannot believe indexeddb won over websql (when every browser ever already embeds sqlite). What a shame.


I wonder if those issues are resolved by using the Dexie.js wrapper, because I've had no problems with that.


It does not; I've also used Dexie.js. Your use case has most likely been too small to run into the very annoying walls.


Well not for long, if the US pressures Europe to go back on these


There are several "American" cars that are interesting for our market. When they talk about importing American cars (ex. Toyotas), it's usually not the kind of car you Americans think about, and not much for us to worry about ...


Well, now you can set it up better like that.


Back then, the focus was on optimising for the user. Now, however, companies prioritise their own interests over the user.


I think companies always prioritized their own interests.

A company can increase its profits (1) by improving their products and services, so that they'll get more customers or customers willing to pay more, or (2) by increasing how much of their revenue is profit by (e.g.) cutting corners on quality or raising prices or selling customers' personal information to third parties.

Either of those can work. Yes, a noble idealistic company might choose #1 over #2 out of virtue, but I think that if most companies picked #1 in the past it's because they thought they'd get richer that way.

I think what's happened is that for some reason #2 has become easier or more profitable, relative to #1, over time. Or maybe it used not to be so clearly understood that #2 was a live option, and #1 seemed safer, but now everyone knows that you can get away with #2 so they do that.



Indeed, the good old days when "optimizing for the user" got us... Windows 3.1 (released April 6, 1992, ref https://en.wikipedia.org/wiki/List_of_Microsoft_Windows_vers...) or the first version of Linux, which I did not have the honor to use, but I can imagine how user friendly it was considering what I ended up using a couple of years later (https://en.wikipedia.org/wiki/History_of_Linux)

/s


We can have stable user-friendly software. We had a nice sweet spot in the early 2000s with Windows XP and Mac OS X: stable operating systems built on workstation-quality kernels (NT and Mach/BSD, respectively), and a userland that respected the user by providing distraction-free experiences and not trying to upsell them. Users of workstations already experienced this in the 1990s (NeXT, Sun, SGI, HP, and PCs running IBM OS/2 or Windows NT), but it wasn’t until the 2000s that workstation-grade operating systems became readily available to home users, with both Windows XP and Mac OS X 10.0 being released in 2001.


We do of course still have this in modern computing with Linux/KDE. Stable, snappy, and does exactly what you ask. The computer doesn't get in your way, nor does it try to get you to do something else. It just does what you tell it to do, immediately.


Yup, desktop Linux and other FOSS systems like ReactOS and Haiku are the last bastions of personal computing that haven’t been made into platforms that nag and upsell us.


There are myriad ways to optimise for the user, user friendliness is only one of them.

As the old joke went "Unix is user friendly, it's particular about who its friends are".


> first version of Linux - which I did not have the honor to use but I can imagine how user friendly it was

My first accounts were on Linux 1.x. It was glorious. Simple, sensible, and with manuals one command away. And it allowed you to just get things done. And there were tools. So many tools. 80's home computers and DOS crap and Macs that couldn't even open a file if it hadn't been tagged as the property of some application... Hells I would never have to be a part of any more. Except for work and school. But for personal computing, a brighter future was coming. In 30+ years since I've never had to step away.


With all the emojis in the source code, one can instantly recognize it as AI slop. :)


I thought that at first too, until I read the first bullet point:

" Stores all you sensitive data "

That's a grammar error I don't expect an LLM to make?


There are a couple more in the README too.

It's not impossible that an AI was asked to sprinkle in a few typos for effect, but perhaps it really is just written by a person who really loves emojis.


Maybe they wanted to intentionally make it look like AI as an added pun


I've seen major React ecosystem packages with emoji readmes nearly 10 years back. Your super serious bank app may have emoji in their bundle.


Geez, Rust is fast!


Maybe a supercomputer from a 1980s perspective.


What an absurd take. If we use FLOPS as a crude measure, the Air would be comparable to the leading supercomputers of ~1999/2000. There are many reasons why that's a very poor comparison, but ignoring the absolute insanity of the raw compute available in a pocketable, thin, battery-powered handheld that you can buy literally this week is ridiculous. Modern smartphones are nothing short of sci-fi compared to even recent living memory. We're simply used to them due to their sheer ubiquity.


The A19 GPU doesn't even have hardware support for FP64, which is the precision used for TOP500. No, it is not comparable to leading supercomputers of 1999/2000.

