Plus the data itself (encoded in load-db.js) at another 101KB.
Though looking at the first couple of links on the current front page that aren't local (Ask HN posts are small because this site is tight) or video content, etc., the payloads are 4.2 MB (a Washington Post article), 2.2 MB (something on the Economist), 4.5 MB (Wired), 4.7 MB (the-odin), ... If that 780K+101K is doing something genuinely useful, it perhaps isn't that big compared to the bad standard set elsewhere!
For serious use, storing the DB engine and data locally rather than re-downloading them on each visit (which the devtools network profiler suggests is happening) would be a good idea, preferably transferring only a diff when there is an update.
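A minimal sketch of that idea using the standard Cache Storage API (the /db-version endpoint and file names here are made up for illustration; the real site may be structured differently):

    // Cache the engine and data locally; re-fetch only when the server
    // reports a new data version. /db-version is a hypothetical endpoint.
    const CACHE_NAME = 'hn-search-db';

    async function loadDatabase() {
      const cache = await caches.open(CACHE_NAME);

      const remoteVersion = await fetch('/db-version').then(r => r.text());
      const cached = await cache.match('/db-version');
      const cachedVersion = cached ? await cached.text() : null;

      if (cachedVersion !== remoteVersion) {
        // First visit or data changed: refresh the cached copies.
        // A real diff scheme would fetch only the changed rows here.
        await cache.addAll(['/db-engine.wasm', '/load-db.js']);
        await cache.put('/db-version', new Response(remoteVersion));
      }

      // Subsequent visits are served from disk, not the network.
      const engine = await cache.match('/db-engine.wasm');
      const data = await cache.match('/load-db.js');
      return { engine, data };
    }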
A 700 KB dependency isn't too bad with a modern browser and connection, depending on how much else is loaded and how it is loaded.
Developers often have an out-of-date perspective on what counts as big these days and what the realistic performance impact is, especially with HTTP/2 and multiple assets loading asynchronously.
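As an illustration of the "how it is loaded" part (placeholder names, not the site's actual code), the bundle can be fetched lazily so it never blocks first paint:

    // Start downloading the heavy engine only when the user shows
    // intent, e.g. by focusing the search box. './db-engine.js' is a
    // placeholder module name.
    let enginePromise = null;

    function getEngine() {
      if (!enginePromise) {
        enginePromise = import('./db-engine.js');
      }
      return enginePromise;
    }

    document.querySelector('#search')
      .addEventListener('focus', () => { getEngine(); }, { once: true });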