MSIs are so bloated and horribly designed, it isn't even funny. Why does Windows Installer insist on storing a copy of the entire .msi - and any patches - in C:\Windows\Installer? From what I've observed, the original MSI is required to perform repairs and reconfiguration, or to set up new-user defaults for that app. This seems horribly inefficient to me, and on a heavily used system it can easily waste space on the C: drive.
And occasionally you run into issues with the MSI database corrupting, where you can't install/upgrade/uninstall something (we see this commonly with Adobe apps), so you'd need to resort to third-party tools like MSI Zapper to get rid of all the references from the database. What a messy system.
I miss the days when programs used to use NSIS (Nullsoft) installers - they were tiny, and super easy to make and automate.
To perform an uninstall, of course, you need the MSI database parts (but you wouldn't need the entire binary content). To perform a reinstall (aka repair) you DO need both the database and the content.
I haven't heard of MSIs being bloated. I assume they contain exactly what they need to contain (though I'm unsure about compression methods).
In the past when storage was scarce it seems like a strange design but these days I can’t say it feels so wrong.
> These days it’s more and more common to completely ignore the windows installer system and install per-user to app data, which has the benefit of allowing a better self-updating experience without elevated permissions, and lower risk of pollution of system files.
On the flip side, this is a nightmare to manage as sysadmins, who try to maintain a tight ship using AppLocker policies. So many apps these days have a standard installer, but then they download an update and try to run the newer version from AppData, and of course, it gets blocked and we get calls from angry users saying that their app no longer works. Of course, we could whitelist the digitally-signed executable, but some apps aren't even digitally signed (or only partially signed), and sometimes the digital signature changes completely... it's a mess.
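For what it's worth, when the app *is* properly signed, generating a publisher rule from the binary itself is not much work with the built-in AppLocker cmdlets - a minimal sketch (the path is hypothetical; this assumes the exe carries a valid, stable signature, which as noted above often isn't the case):

```powershell
# Inspect the publisher info AppLocker sees for the updater binary
# (example path - point it at the real exe under AppData)
$exe = "$env:LOCALAPPDATA\SomeVendor\Update.exe"
Get-AppLockerFileInformation -Path $exe

# Generate a publisher rule from that file and merge it into the
# local AppLocker policy (requires elevation)
Get-AppLockerFileInformation -Path $exe |
    New-AppLockerPolicy -RuleType Publisher -User Everyone |
    Set-AppLockerPolicy -Merge
```

The catch is exactly what's described above: a publisher rule only survives updates if the vendor keeps signing with the same certificate, otherwise you're back to chasing hashes.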
Then there's the problem of dumping large binaries and entire applications into the AppData folders, which bloats up user profiles. This can be a bit of an issue with certain roaming profile systems like Citrix User Profile Manager, which by default works in a blacklist mode (i.e., you have to explicitly blacklist paths that you don't want to roam). If you don't stay on top of this and add new AppData subfolders to the exclusion list, then you'll find all your large Chrome updates or whatever (with several versioned folders) syncing back up to the profile server, wasting space and bandwidth, and increasing logon, backup, and AV scan times. In a large organization with several thousand users, this is a disaster waiting to happen.
Also, using AppData to store entire apps is just plain wrong - that's NOT what it was meant for. AppData was meant for storing app data, and the apps themselves are supposed to be stored in Program Files.
These self-updating apps are the worst thing that could ever happen in a corporate environment.
I see two main camps of modern application installers, if you want to call them modern. One is Squirrel, which is userland-based (Microsoft Teams, Spotify, and Slack use it, IIRC) and is all but impossible to deal with as an admin.
The other camp is Omaha, which Chrome uses (along with Edge) and which runs as a service to handle updates and installs.
Neither of these solves the actual problem; they just push it down the road a bit - you'll find .exe and .msi installers for both flavors of these applications (usually a bootstrap that installs the updater, which then installs the application).
I think one push for userland installers was that "you don't need admin", and to some developers user profiles feel like the right place to install an app. However, I absolutely agree - this makes it a nightmare to deal with as an admin for Windows devices.
I really do not like how Squirrel handles updates: any app that uses it gives you an in-your-face update experience, vs. Omaha, which does background updates where you, as a user, typically have no idea anything even happened.
As far as code signing goes, I went on a tangent one day trying to figure out why most code isn't signed, and I think it's just too expensive and/or complicated for developers who aren't forced into actually needing it - by an enterprise, or an app store requiring it.
Squirrel doesn't have any installation GUI or specific process. It downloads and installs version N+1 in the background, and the next start runs that version. Any decision to show even a notification that this happened is entirely up to the developer. I think it's near impossible to show an "installer" in the traditional sense for the initial install - you can show a splash screen at initial install, but that's it.
As far as userland installers go, they are all (almost by definition) the same in the end.
I can see the concern but from the user perspective what’s the solution assuming I want the one week release cycle of dozens of apps?
In the end I think the idea that apps install elevated but run under lower user privs is now completely outdated. One can’t and shouldn’t separate the idea of the user and the application maintainer.
> Also, using AppData to store entire apps is just plan wrong - that's NOT what it was meant for.
It doesn’t seem conceptually wrong to use program files for system wide (not per user) executables and localappdata for per user program files, regardless of whether they are program binaries or program data. Since they aren’t roaming and typically not backed up, what’s the harm in having teams.exe in there?
If Chrome dumps large volumes of program binaries into AppData\Roaming (does it really?), then that sounds really bad. Similarly for any system that would roam anything outside of AppData\Roaming.
The problem is, some of these missing cmdlets are pretty basic/essential (speaking as a sysadmin), so it's a bit ridiculous that they're still left out. I mean, sure, I could just call the native *nix binary, but then I'd lose out on the objects, which then defeats the whole point of PowerShell. Plus, I'd have to rewrite all my existing scripts/scriptlets/libraries, which kills the portability.
I could write wrappers around the native binaries, or build PSCustomObjects on the fly in the pipeline, but I don't want to waste my time doing this when these cmdlets are all pretty standard on Windows, and we really shouldn't be expected to do Microsoft's work.
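The wrapper approach, for context, looks something like this - a hand-rolled sketch that turns `df -P` output into objects (the function and property names here are my own choice, not any standard):

```powershell
function Get-DiskUsage {
    # Parse the POSIX-format output of the native df binary into
    # objects, skipping the header line
    df -P -k | Select-Object -Skip 1 | ForEach-Object {
        $f = -split $_   # unary -split: break on runs of whitespace
        [PSCustomObject]@{
            Filesystem = $f[0]
            SizeKB     = [long]$f[1]
            UsedKB     = [long]$f[2]
            AvailKB    = [long]$f[3]
            UsePercent = [int]($f[4].TrimEnd('%'))
            MountPoint = $f[5]
        }
    }
}

# Now the pipeline works on properties, not text:
Get-DiskUsage | Where-Object UsePercent -gt 80 | Sort-Object AvailKB
```

Which is exactly the busywork being objected to: one of these per binary, each one fragile against output format changes.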
For instance, Test-NetConnection, Resolve-DnsName (and other net commands), Register-ScheduledJob (background jobs), Get-Partition (and other disk management stuff) - I could go on, but these and a lot of other essential cmdlets are missing.
More than that, my biggest issue is that a lot of my favorite modules and scripts (from the PowerShell Gallery and other places) don't work, and many of my scripts have dependencies on them - which means I'd have to rewrite them all to make things work, and some of them may never have a hope of working on Linux/macOS due to missing .NET libs/Windows APIs.
Basically, PowerShell on non-Windows systems is not fun. It's neither here nor there - i.e., neither does it provide all the functionality of Windows PowerShell, nor does it fit in like a proper *nix shell. And even then, it doesn't object-ify any popular *nix native binaries, so you lose out on all the object-oriented benefits unless you spend time writing wrappers or constantly building PSCOs in the pipeline.
At the end of the day, if I've resigned myself to just parsing all the text from *nix binaries, then I might as well work in a shell that is a first-class *nix citizen, like fish or zsh, which is more suitable for that sort of workflow, instead of working in a confusing shell that's neither here nor there.
You're equating the shell with a tool orchestrator (well, maybe that's what it is for you). It's much more than that, and while pwsh might be a bit "neither here nor there" in that domain, it's totally "here" in all the other domains.
As one example, the pwsh language makes one much more efficient, with scripts that are way easier to grasp and a lot shorter.
Your questioning of tool support cherry-picks some cases - I, for example, would never use jq in place of the pwsh alternatives, which are much more capable and readable. And for some formats, like CSV, there is really no standard binary on Linux AFAIK. People basically jump to Python et al. for anything more serious than a couple of lines.
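For CSV specifically, the pwsh story really is a short pipeline - Import-Csv hands you typed-ish objects directly, no parsing step (the file name and column names below are made up for illustration):

```powershell
# servers.csv with columns: Name,Role,MemoryGB (hypothetical file)
Import-Csv .\servers.csv |
    Where-Object { [int]$_.MemoryGB -ge 32 } |
    Sort-Object Name |
    Select-Object Name, Role
```

Doing the same robustly with cut/awk means hand-rolling quoting and embedded-comma handling, which is exactly where people give up and reach for Python.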
*Allegedly. We do not yet know that this is from the actual Linux Foundation - it could be some patent troll looking for a quick buck, or maybe a rival seller getting pissy.
There are thousands of T-shirts with literal "Linux" written on them[1], with no relation to the Foundation, so it's extremely unlikely that this takedown is from them.
You know what would be cool? A defragmentation game. It's basically Tetris but in a circular layout, and you get more points the better optimized it is - with frequently used data blocks towards the outer tracks and old/archive file types towards the inner tracks.
I always enjoyed watching graphical defragmenters do their magic back in the day and would unironically love to play a defragging game.
It's top-down 2D. You are alone in a large, empty warehouse. At regular intervals, a truck drops off a load of assorted objects at the back, and a few people ask you for some specific objects at the counter in front. Your role is to arrange the objects in the warehouse such that you can fill the requests in the least amount of time, and you can do this in any way that works for you (by color, function, name, etc).
As the game goes on, the truck drops off more and more different kinds of objects (usually ones that fit into more than one category), and the requests get more and more complicated.
At the end you get a time-lapse of the whole warehouse over the whole game and you can see your strategy evolving over time.
Pretty well executed and simple. Heard about it from one of its devs in a HN comment actually.
I like the idea, but I noticed that there seems to be some kind of issue with it registering clicks. I don't know if I'm just sometimes clicking slightly outside of the hitboxes or what, but I find myself "dragging" stuff by accident when the game doesn't register that I "dropped" whatever object I had selected.
There used to be a few old ones back in the day that I recall trying, which were more direct simulacrums than these. Not sure if anyone else can find them!
The "career" in question could be autorouter development.
That said, I'd suspect it's similar to CS: autoprogrammers are "known" to not work. If you can understand and articulate the business logic in a concise way, understand and articulate all the components of the system and how they may interfere with each other, understand and articulate the system's nominal and practical input ranges, etc., then sure something might autoprogram the code for you, but it's not the autoprogrammer doing the real work.
Well, ASIC design is an even more niche EE application. How does someone in neuroscience end up needing those skills? Did you major in EE in undergrad?
It's not really related in concept, but I feel like playing various puzzles with pentominoes would feel a bit similar to this, since they're spatial reasoning puzzles. I adore pentominoes and spent a large portion of my childhood playing with them, and recently rediscovered them as an adult.
For me it was tangrams, for which I've only seen one somewhat odd (in a "feels outsourced" kind of way) computer game (not that you can really improve on the physical version). I first thought you said pantomimes, which could also be interesting :).
Definitely played with tangrams too, and funnily enough, I had a bunch of animal-shaped puzzles that I guess you could call pantomimes, haha. If you enjoyed tangrams, I can't recommend picking up a set of pentominoes enough - and I definitely agree that a physical set is the right call. There are 12 in total, so you have 60 individual squares (pentominoes are 5-square tiles); popular figures are 6x10, 5x12 (a bit harder than 6x10), and 3x20 (very challenging). You can also get cubes and do 2x5x6 (really hard) or 3x4x5 (even harder).
Recently though, I've been doing 8x8 - 4, which means picking 4 squares to remove from an 8x8 square and then filling in the rest (I have some wooden boards I picked up from an Etsy seller, but you can just as easily trace the outline on paper and color in the forbidden cells, which I've also done). It has been delightful!
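The arithmetic behind all of these figures is the same: 12 pentominoes × 5 squares each = 60 cells, and every shape mentioned has exactly that area - a quick sanity check:

```powershell
$areas = [ordered]@{
    '6x10'    = 6 * 10
    '5x12'    = 5 * 12
    '3x20'    = 3 * 20
    '2x5x6'   = 2 * 5 * 6
    '3x4x5'   = 3 * 4 * 5
    '8x8 - 4' = 8 * 8 - 4
}
$areas.GetEnumerator() | ForEach-Object { "{0,-8} -> {1} cells" -f $_.Key, $_.Value }
# every one of them comes out to 60 = 12 pentominoes x 5 squares
```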
I found JkDefrag, and it was dot-based instead of block-based. That one became the coolest - until it was discontinued... now it's hard to find... But MyDefrag (its successor) has become the best defragmenter.
> A distro that is made to be resilient and simple, that has wizards for everything, sensible defaults and sturdy guardrails, to the point Linux pros would turn up their nose, is certainly technically possible
There are already distros like that - pretty much most immutable distros are aiming to be that way. Eg, SteamOS, or Fedora's Silverblue.
It's because popular PC manufacturers don't bundle Linux (for the masses, so not counting developer machines). Your average Joe isn't going to care or bother about installing an operating system on their device - they'll look at the brand, the looks of the device, and the price. They may not even know what an "operating system" is in the first place.
The Steam Deck's popularity shows this: most people who bought the Deck did so because it's a good handheld gaming device, and they don't care that it's running Linux (at least, prior to making the purchase decision).
Furthermore, the Steam Deck shows that Linux is accepted by the masses when it's marketed as a purpose-built device. Take Chromebooks, for instance, which are basically a nerfed Linux - people still buy them in spite of all their limitations.
Now imagine a big OEM marketing and championing a full-fledged distro, which is just as user friendly and secure as ChromeOS but allows you to do much more - with the right marketing, budget and polish, it could take off.
I really like what System76 is doing with their Pop!_OS, but unfortunately they're still just a niche brand, and without the right marketing and partnerships, they will continue to remain a niche player. But imagine if they spent $$$ on marketing (including viral campaigns on new social media) and partnered up with the likes of Best Buy and Amazon to sell their PCs - they could really take off.
Cool, but did you ever hear of and play BioMenace? Everyone always talks about Keen, but literally no one ever mentions BioMenace, which was built on the same engine as Keen. :(
BioMenace was on a computer of the parallel group (pupils were put into groups A, B, C, etc., who would have their classes separately). One of my best friends excelled at that game. I enjoyed watching him play it.
In hindsight, we were pretty lucky to have computers at school at that time, even if they were old ones. I still haven't found all of the games that were on those machines in the DOS game archives.