Hacker News

You seem to be missing the crux of the issue I was trying to address — even today, loading 50,000 items into a frontend application would be a very, very niche edge case. 10 years ago even more so. Tacking on arbitrary reference points from Wikipedia doesn't change any of that.


> You seem to be missing the crux of the issue I was trying to address — even today, loading 50,000 items into a frontend application would be a very, very niche edge case.

It's a medium-sized inventory; if you need fast or offline access, loading it onto the client makes a lot of sense. I'm sure there are plenty of other things that easily reach 50k items and that you'd want to index or cross-reference somehow.
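To make the "index or cross-reference" point concrete, here's a minimal sketch of indexing a 50k-item inventory in client memory with plain `Map`s. The item shape (`id`, `sku`, `name`) is a hypothetical example, not anything from the thread:

```javascript
// Sketch: index ~50k inventory items in client memory for fast lookup.
// Item shape (id, sku, name) is hypothetical.
function buildIndexes(items) {
  const bySku = new Map();   // unique key -> item
  const byName = new Map();  // cross-reference: name -> items sharing it
  for (const item of items) {
    bySku.set(item.sku, item);
    const group = byName.get(item.name) ?? [];
    group.push(item);
    byName.set(item.name, group);
  }
  return { bySku, byName };
}

// 50k synthetic items; building both indexes takes milliseconds.
const items = Array.from({ length: 50000 }, (_, i) => ({
  id: i,
  sku: `SKU-${i}`,
  name: `Widget ${i % 1000}`,
}));
const { bySku, byName } = buildIndexes(items);
console.log(bySku.get("SKU-123").id);       // 123
console.log(byName.get("Widget 7").length); // 50
```

At this scale the one-time indexing cost is negligible next to the download itself, and every subsequent lookup is O(1) with no round trip.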


I can think of plenty of tasks where a 2009 frontend will potentially be exposed to workloads in the 10k-100k elements range:

* Editing objects in a 3D scene (e.g. an AAA open-world game or a CGI film)

* Plotting events in a long-running log on a graph

* Scraping and presenting data from web sources

The common thread here is that most of your data lives in application memory rather than in a formal database system, so you shoulder the full burden of managing it properly. In most cases the solution is to put a real backend behind it and filter, paginate, or otherwise reduce what gets presented there, because data at that scale won't be usable by humans to begin with. But sometimes you do have a reason to explicitly want a "big list of everything," and just as often you end up with the big list of everything by accident. It comes with the territory of report-generation tasks.
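The "filter and paginate on the backend" approach above can be sketched as a pure function that a hypothetical endpoint would call, so the client never receives the big list of everything. The row shape and parameter names are illustrative assumptions:

```javascript
// Sketch: server-side filter + pagination so only one small page
// ever crosses the wire. `filter`, `page`, and `pageSize` are
// hypothetical query parameters.
function queryPage(rows, { filter = () => true, page = 1, pageSize = 50 } = {}) {
  const matched = rows.filter(filter);
  const start = (page - 1) * pageSize;
  return {
    total: matched.length, // lets the client render "page X of Y"
    page,
    pageSize,
    rows: matched.slice(start, start + pageSize),
  };
}

// 100k log-like rows on the server; each response stays tiny.
const rows = Array.from({ length: 100000 }, (_, i) => ({
  id: i,
  level: i % 10 ? "info" : "error",
}));
const result = queryPage(rows, {
  filter: (r) => r.level === "error",
  page: 2,
  pageSize: 25,
});
console.log(result.total);      // 10000
console.log(result.rows[0].id); // 250
```

Returning `total` alongside the page is the usual compromise: the human only ever sees a screenful, but the UI can still communicate how much data exists.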


50k items actually seems quite small. Front-loading data can make your app feel a lot snappier at the cost of a slightly longer initial load. It may be worth it in many situations.
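The front-loading trade-off can be sketched as: pay one up-front fetch, then serve every later read synchronously from memory. `fetchAll` here is a stand-in for a real network call, and the store class is a hypothetical illustration:

```javascript
// Sketch of front-loading: one slightly longer initial load, then
// every read is an in-memory hit instead of a round trip.
// `fetchAll` simulates fetching the full 50k-item dataset.
async function fetchAll() {
  return Array.from({ length: 50000 }, (_, i) => ({ id: i, name: `Item ${i}` }));
}

class FrontLoadedStore {
  async init() {
    const items = await fetchAll(); // the one expensive step
    this.byId = new Map(items.map((it) => [it.id, it]));
  }
  get(id) {
    return this.byId.get(id); // instant, synchronous, offline-friendly
  }
}

const store = new FrontLoadedStore();
store.init().then(() => {
  console.log(store.get(42).name); // "Item 42"
});
```

The catch is staleness: everything read after `init()` reflects the dataset as of load time, so this fits data that changes rarely or can tolerate a refresh cycle.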



