DB2 was crazy good for certain use cases but very weird. For one, the pattern for DB2 efficiency was pretty much the exact opposite of every other database. Every other database would say "Normalize your tables, use BCNF, blah blah, small reference tables, special indices etc".
DB2, the pattern was "denormalize everything into one gigantic wide table". If you did that it was insanely fast for the time and could handle very large datasets.
I have not had much experience with DB2, but given that the relational data model and normalization were invented at IBM (by Codd) and DB2 was IBM's implementation of those concepts, it seems strange that DB2 performed poorly with a normalized data model.
My recollection was that DB2 did not support multi-version concurrency control the way Oracle and Postgres did. The result was a lot of lock contention with DB2 if you were not careful. MVCC was eventually added to DB2, but by then it was too late.
DB2 had/has excellent data compression capabilities, achieving ratios for OLTP that would only be equaled by later OLAP columnar systems.
For raw performance needs, many financial-services schemas were going to be denormalized anyway. Compression was a great way to claw back some of the resulting storage inefficiency.
I bought some DOS games wrapped in DOSBox on GOG, and I'm not sure if GOG ships a bad version or a bad config, but it's a pain in the ass - you can't resize the window to actually see anything on a 4K screen, there's no obvious way to switch to fullscreen and back, etc.
It's one thing to be able to emulate DOS games (something which worked 20+ years ago), it's another thing to offer reasonable ergonomics in a modern environment...
I run the GOG installer, then copy the game files to where the virtual C: is for my DOSBox-X (in a git repo). Then there is usually a small BAT script to write to launch the game (what is required can be figured out from the config GOG installed). I play all the games from within DOSBox-X, so I start that up in fullscreen and then run games as if running real DOS (from the COMMAND.COM prompt).
It is of course possible to launch games from outside of DOSBox the way GOG does it, using host OS (non DOS) launch scripts and config for each game, but I prefer to have more like a virtual DOS fantasy console with all games installed. It also means DOSBox is fully self-contained, with no dependencies on the host OS. Once set up there is no maintenance. There is no code rot in DOS of course, so when something has been set up it will always work.
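A launcher like that usually only needs a few lines. Here is a sketch (the directory and EXE names are made up for illustration; the real ones come from whatever config GOG installed):

```
@ECHO OFF
REM Hypothetical launcher inside the DOSBox-X virtual C: drive.
REM Replace MYGAME/GAME.EXE with the paths from GOG's config.
C:
CD \GAMES\MYGAME
GAME.EXE
```

Drop one of these per game next to COMMAND.COM's PATH and every game launches from the prompt like on a real DOS box.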
Yeah, this is key - in order to get out of fullscreen you have to find and edit some ini file, and then it's still doing the mouse-capture thing, which I think is also a setting. All this faffing about just to get it into a usable state is pretty user-hostile.
I want the window to behave like any other window, and the mouse pointer to work transparently in and out of it - when I hover over the DOSBox window, change the pointer but keep the same mouse speed, momentum, etc.
I think this would be really hard to do in an emulated environment - it might even require patching each game executable to get the mouse speed right (not sure?). The modern-environment integration, like you say, is what I'm after too.
It's possible in practice: that's how DAI worked originally. It's just not very competitive where the main customer -- traders -- want a lot of liquidity and razor thin spread.
IBM's market cap is $225B; Microsoft's is $2.9T. IBM literally lost its market to Microsoft in the '80s and '90s specifically because it was too focused on enterprise...
Microslop built a much bigger market capture before pivoting to B2B as its focus. It could make that shift because it's so entrenched in society that not much is needed to maintain the safe revenue stream.
Any article about biodegradable plastics should start with advantages over cellophane/cellulose.
People figured out how to make it a hundred years ago; it's already used for food packaging, its properties are well known, and it's abundant and cheap - made from trees and other plants.
The article starts as if it's some breakthrough miracle which is unheard of. I can literally just buy compostable bags for organic waste made of corn starch on Amazon. It's already a product.
Journalists demonstrate less awareness than an 8B LLM. A scientist tells you about a new plastic? Ask them how it's better than what's already on the market.
It's not that the "journalist" didn't think to ask; it's that this is a PR piece sent out to media outlets by the university that did the research. Nearly all universities have a PR team that sends fluff pieces to the media to promote the university's work.
The person who wrote this is being paid not to ask tough and important questions around this research.
My understanding is that cellophane generally does biodegrade in most settings. Polylactic acid (those cornstarch-derived bags) mostly biodegrades in hot enough compost or (after several years) in ambient-temperature soil, but not very well in cooler water (One study: "The half-life period of degradation [of polylactic acid in artificial seawater] is 12 [days at 90° C] or 468 days [at 60° C]").
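If those reported half-lives describe roughly first-order decay (an assumption on my part; the study may fit a different model), the surviving fraction at any point is easy to compute:

```python
def fraction_remaining(days: float, half_life_days: float) -> float:
    """First-order decay: fraction of material left after `days`,
    given a measured half-life in days."""
    return 0.5 ** (days / half_life_days)

# Half-lives quoted above for PLA in artificial seawater:
# 12 days at 90 degrees C, 468 days at 60 degrees C.
print(fraction_remaining(365, 468))  # roughly 58% still left after a year
print(fraction_remaining(365, 12))   # effectively all gone at the hot extreme
```

So even at the warmer of the two lab temperatures, well over half the PLA survives the first year.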
Those temperatures are certainly hard to find in nature, outside of hot springs! Even if this is an error and we are talking about 90°F/60°F, the higher temperature is pretty much constrained to the tropics, so we're talking a year+ to degrade in real conditions. It is better than centuries, but not exactly rapid?
Yeah, I imagine it's considerably slower at ambient ocean temperature. Don't throw your PLA bags in the ocean or a river. Here's a different paper:
> For example, PLA is not biodegradable in freshwater and seawater at low temperatures [32,36–39]. There are two primary reasons for this: (i) The hydrophobic nature of PLA, which does not easily absorb water [40–42]. In aqueous environments, the lack of hydrophilicity diminishes the hydrolysis process, which is crucial for the initial breakdown of PLA into smaller, more degradable fragments. (ii) Resistance to enzymatic attack; the enzymes that degrade PLA are not prevalent or active under typical freshwater and seawater conditions [39,43,44]. The microbial communities in these environments may not produce the necessary enzymes in sufficient quantities or at the required activity levels to effectively breakdown PLA. Additionally, the relatively stable and crystalline domains of PLA can further resist enzymatic degradation.
Also:
> It should be emphasized that neat PLA cannot be classified as a completely biodegradable polymer, as it generates microplastics (MPs) during biodegradation.
My father got me a second-hand computer with an Am386DX-40 somewhere around 1997, IIRC - an upgrade from an older 286.
It was two generations old at that time but still a lot of fun: it could run a lot of games (incl. DOOM, of course), handle programming (largely Turbo Pascal 7), and do some word processing under Windows 3.11.
I didn't bother with Win95, though.
I used it up until 1999, when I finally got a then-modern computer with Windows 98. But in some ways MS-DOS felt more capable - I really knew what each file was for, what the computer was doing, etc. The entire machine was fully comprehensible. You really don't get that with Windows unless you're Russinovich or something.
You can train a model with GPT-2 level of capability for $20-$100.
But, guess what, that's exactly what thousands of AI researchers have been doing for the past 5+ years. They've been training smallish models. And while these smallish models might be good for classification and whatnot, people strongly prefer big-ass frontier models for code generation.