- SuperTux 2. It was recently revamped and the quality skyrocketed: much better controls and artwork.
- SuperTux Advance. This one is really great too.
- ReTux (more Wario-like than Mario-like).
- NetHack/SLASH'EM. Roguelikes bound more to interaction/exploration/mechanics than to combat, but SLASH'EM makes combat crazy with the Doppelganger Monk, which basically plays like a shonen manga role (Dragon Ball or Naruto, depending on your age).
- DCSS. Basically the opposite of NetHack/SLASH'EM: much more combat oriented. Where SLASH'EM's combinatorics let you play the Monk a la Jackie Chan, this is more like an ARPG turned roguelike.
- Frotz/Lectrote/WinFrotz/whatever Z-machine interpreter, plus "All Things Devour". Spiritwrak, too. Great libre text adventures, still engaging because of their weird mechanics.
- Frozen Bubble
- OpenArena.
- FreeDoom, better if you compile the daily builds with DeuTex.
- FreeCiv.
- OpenTTD is standalone enough these days.
- Minetest plus tons of subgames such as Glitch, NodeCore...
- Oolite
- Speed Dreams. If the controls are hard, try the arcade mode. If they are still hard, get SuperTuxKart, pick some real-life car from the add-ons and grab all the SD tracks from the in-game downloader; there are several.
Heads up: it was recently renamed Luanti to get away from the "it's Minecraft but worse" perception. They also un-bundled the built-in game and are positioning it as a game engine these days.
I recommend looking into Age of Mending. It's still in alpha, but if it's ever finished it'll give Minecraft a good run for its money, especially among the builder-minded players.
Luanti had an annoying habit of regenerating terrain while I was still walking on it. In my book it is very much still "Minecraft but worse". I was on a very weak ARM device, though.
OpenArena even has a browser version these days but sadly it doesn't seem to have any active servers anymore. I had progressed to the point where I could strafe jump and rocket jump all day.
Mono used to have libwine embedded. You know, libwine exists as a library for running and compiling Win32 code natively under Unix: instead of PE binaries you get ELF Linux ones, with nearly the same outcome.
Every time I tried following along with the winelib/winemaker documentation, I ended up with an ELF that still had to be invoked through "wine" to run; nothing that could self-load any of the Wine dependencies.
This. An old netbook can emulate a PDP-10 with ITS, Maclisp and some DECnet/TCP-IP clients and barely suffer any lag...
Also, the Amigas have AmiSSL, and it will run on a 68040 or on an FPGA with the same constraints. IRC over TLS, Gemini, JS-less web, Usenet, email... none of it requiring tons of GB.
Nowadays even the Artemis crew can't properly launch Outlook. If I were the IT manager I'd just set up Claws Mail/Thunderbird with file attachments, msmtp+isync as backends (caching and batch sending/receiving email, you know, high-end technology inspired by the 80s), and NNCP to relay packets wherever cuts in the space link are expected, so NNCP can just push packets on demand.
The cost? My Atom N270 junk can run NNCP, and it's written in damn Golang. Any user can understand Thunderbird/Claws Mail; they wouldn't need to set up anything, the IT manager would configure it all and the mail client would run seamlessly, you know, with a fancy GUI for everything.
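For the curious, a minimal sketch of that msmtp+isync backend. Hostnames, account names, paths and the password command are placeholders, not anyone's real setup:

```
# ~/.mbsyncrc -- isync/mbsync caches IMAP mail in a local Maildir
IMAPAccount crew
Host imap.example.org
User crew@example.org
PassCmd "pass show crew-mail"
SSLType IMAPS

IMAPStore crew-remote
Account crew

MaildirStore crew-local
Path ~/Mail/crew/
Inbox ~/Mail/crew/INBOX

Channel crew
Far :crew-remote:
Near :crew-local:
Patterns *
Create Both

# ~/.msmtprc -- msmtp handles the outgoing side
defaults
auth on
tls on

account crew
host smtp.example.org
port 587
from crew@example.org
user crew@example.org
passwordeval "pass show crew-mail"

account default : crew
```

Then the mail client only ever talks to the local Maildir and the sendmail-compatible msmtp binary, which is what makes batching and store-and-forward relays like NNCP possible behind its back.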
Yet we are suffering the 'wonders' of vibe coding and Electron programmers pushing fancy technology where the old stack would just work, because it has been tested like crazy.
> Also the Amiga's have AmiSSL and it will run on a 68040 or some FPGA with same constraints. IRC over TLS, Gemini, JS-less web, Usenet, EMail... not requiring tons of GB.
AmiSSL came out long after the C64 was a relic, and it requires hardware an order of magnitude more powerful than the C64 ;)
Back in the day people had BASIC and some machines had Forth, and it was like
print "Hello world"
or
." Hello world " / .( Hello world )
for Forth.
By comparison, given how they optimized games for 8- and 16-bit machines, I should be able to compile Cataclysm DDA:BN on my potato netbook, and yet it needs GIGABYTES of RAM to compile. It's crazy that you need damn swap for something that needed far less RAM 15 years ago for the same features.
If the game were reimplemented in Golang it wouldn't feel many times slower. But no, we are suffering the worst of both sides of the coin: a language that should have been replaced by Inferno (from the Plan 9 people, the creators of C and Unix and now of Golang, its cousin), with horrible compile times, horrible and incompatible ABIs, featuritis, crazy template syntax and, if you are lucky, memory safety.
Meanwhile, I wish the Inferno fork, Purgatorio, had a seamless mode (no virtual desktops) so you could fire up an application in a VM integrated with the host window manager, a la Java, and that's it. Limbo+Tk+SQLite would have been incredible for CRUD/RAD software once the GUI was polished up a little, with sticky menus like Tcl/Tk and friends. In the end, if you know Golang you could learn Limbo's syntax (same channels, too) with ease.
BASIC was slow in the 80s. Games for the C64 (and similar machines) were written in machine code.
> By comparison, giving how they optimized the games for 8 and 16 bit machines I should have been able to compile Cataclysm DDA:BN under my potato netbook and yet it needs GIGABYTES of RAM to compile, it crazy that you need damn swap for something it required far less RAM 15 years ago for the same features.
That’s not crazy. You’re comparing an interpreted, line-delimited ASCII language with a compiler that converts structured ASCII into machine code.
The two processes are as different from one another as driving a bus is to being a passenger on it.
I don’t understand what your point is in the next two paragraphs, or what Go, Tcl, UNIX or Inferno have to do with the C64 or modern software. So you’ll have to help me out there.
Some BASICs were slow. Dartmouth BASIC was compiled to machine code before execution; it was only to make BASIC fit as the machine's shell on 8-bit computers that the option shipped in ROM was an interpreter.
Machines running CP/M had BASIC compilers available, and professional devs could buy BASIC compilers for 8- and 16-bit home computers; when the 16-bit home machines arrived, BASIC compilers became common again.
It was only during the glory 8-bit days, with most users being students on limited budgets, plus the pain of switching tapes all the time, that BASIC compilers were a no-go for most. Thus we ended up using DATA blocks for Assembly code within our budget or, if lucky enough to own a hex monitor or a proper assembler, used those instead.
I’m aware you can compile BASIC but we aren’t talking about those systems. We are talking specifically about the ones which you just admitted were slow.
Compare Limbo+Tk under Inferno with current C#/Java. Or C++ against Plan 9 C.
We have impressive CPUs running really crappy software.
Remember Claude Code asking for 66GB for a damn CLI AI agent, to do something NetBSD on a VAX (real or emulated) from 1978 could do with ncurses in milliseconds every time you spawn NetHack or any other ncurses tool/game.
On speed: Forth on the ACE was faster than BASIC running on the ZX80, so it wasn't about using a text-parsed language. Forth was fast, but people were ready for neither RPN nor managing the stack; people thought in an algebraic way.
But that was an 'obsolete' mindset, because once you hit high school you were supposed to split big problems into smaller tasks (equations). To implement a second-degree equation solver in Forth you wouldn't juggle the stack; you'd create discrete functions (words) for the discriminant part and so on.
In the end you'd just manage a couple of stack items per step.
If Forth had won instead of BASIC, then instead of allowing spaghetti code as normal procedure we would have been asked to decompose code into small functions as the right thing to do, from the start.
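To make the decomposition concrete (in Go rather than Forth, since the thread keeps coming back to it; the function names are mine), the discriminant gets its own tiny function, the way a Forth programmer would factor it into a short word:

```go
package main

import (
	"fmt"
	"math"
)

// discriminant computes b^2 - 4ac as its own small, testable step.
func discriminant(a, b, c float64) float64 {
	return b*b - 4*a*c
}

// roots solves ax^2 + bx + c = 0, assuming two real roots exist.
func roots(a, b, c float64) (float64, float64) {
	d := math.Sqrt(discriminant(a, b, c))
	return (-b + d) / (2 * a), (-b - d) / (2 * a)
}

func main() {
	r1, r2 := roots(1, -3, 2) // x^2 - 3x + 2 = 0
	fmt.Println(r1, r2)       // prints: 2 1
}
```

Each step handles at most a couple of values at a time, which is exactly the discipline stack juggling forces on you in Forth.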
Most dialects of BASIC actually had functions too. They just weren’t popularised because line numbers were still essential for line editing on home micros.
> On speed, Forth for the ACE was faster than Basic running under the ZX80. So, it wasn't about using a text-parsed language.
Forth and BASIC are completely different languages and you’re arguing a different point to the one I made too.
Also I don’t see much value in hypothetical arguments like “if Forth won instead of BASIC” because it didn’t and thus we are talking about actual systems people owned.
I mean, I could list a plethora of technologies I’d have preferred to dominate: Pascal and LISP being two big examples. But the C64 wasn’t a lisp machine and people aren’t writing modern software in Pascal. So they’re completely moot to the conversation.
They were different languages, but both came in ROM and with similar storage options (cassette/floppy).
On Pascal: Delphi was used for tons of RAD software in the 90s, both in the enterprise and at home, with zillions of shareware (and shovelware) titles. And Lazarus/FPC+SQLite3 today is not bad at all.
On Lisp: it was used in niche places such as game engines, Emacs (Org Mode today is a beast), a whole GNU-supported distro configured in Scheme, and Maxima, among others.
Still, the so-called low-level C++ is an example of picking the wrong route. C++ and Qt 5/6 can be performant enough, but for a roguelike the compile-time cost is atrocious, and by design Go with its GC would fix 90% of the problems and even gain more portability.
I’m very aware of Lazarus, Delphi and Emacs. But they’re exceptions rather than industry norms.
And thus pointing them out misses the point I was making when, ironically, I was pointing out how you’re missing the original point of this discussion.
My point was about performance. Yes, picking BASIC over Forth was the worst choice back in the day, and you could say low-level stuff was done in assembler.
Fine. But the current choice for 'low level' stuff is C++, and I maintain that most C++ compilers either have huge compile times (GCC) or are much better but still eat RAM like crazy (Clang), and except for a few pieces of software the performance boost over Go doesn't look that huge for most tasks, outside Chromium/Electron and Qt.
For what software is doing 90% of the time, Go plus a nice UI toolkit would be enough to cover most tasks while giving you a safe language, even for bloated proprietary IM clones such as Discord and Slack.
Because, ironically, most of that optimized C++ code exists to run bloated runtimes like Electron, tossing out everything C++ gives you, since most Electron software implements half an OS in every application.
With KDE and Qt you are at least sharing code, even under Flatpak, which somehow deduplicates things a little. With Electron you are running separate, isolated silos with no awareness of each other; you are basically running several 'desktop environments' at once.
You could say: hey, Go statically links everything, so there's no gain from shared libraries... until you find the Go toolchain can still do a better job, using less RAM on average than tons of this stuff.
With Electron you are often shipping a whole debugging environment with your app, loaded and running, and the graphical software performs far worse than the 'bloated' KDE 3 software that did bells and whistles in a Kopete chat window on an AMD Athlon. Qt 3 tools felt snappy. Seeing Electron-based software everywhere has the appeal of running every GUI under Tcl/Tk on a Pentium, modulo video decoders and the like. It would crawl next to pure Win32/Xlib on a Pentium 90 if everything were a Tk window with debugging options enabled.
So these are our current times: you've got an i7 with 16GB of RAM and you barely see any improvement with modern 'apps' over an i3 with 2GB of RAM running native software.
You’re talking about compiler footprint and runtime footprint in the same conversation but they’re entirely different processes (obviously) and I don’t think it makes any sense to compare the two.
C++ is vastly more performant than Go. I love Go as a language, but let’s not get carried away here about Go’s performance.
It also makes no sense to talk about Electron as C++. The problem with Electron isn’t that it was written in C++; it’s that it’s ostensibly an entire operating system running inside a virtual machine executing JIT code.
You talked about using Go for UI work, but have you actually tried it? I’ve written a terminal emulator in Go, and UI performance was a big problem. Almost everything requires either cgo (thus causing portability problems) or tricks like WASM or dynamic calls that introduce huge performance overheads. This is something I benchmarked with SDL, so I have first-hand experience.
Then you have the issue that GUI operations need to be owned by one OS thread, which makes it awkward to write idiomatic Go that calls GUI widgets.
And then you have a crap-load of edge cases for memory leaks, where Go’s GC will clear the Go pointers but any allocations made outside Go still need to be manually deallocated.
In the end I threw out all the SDL code. It was slow to develop, hard to make pretty, and hard to maintain; it worked well but was just far too limiting. So I switched to Wails, which basically displays a WebKit window (on macOS), so it has a lower footprint than Electron, lets you write native Go code, and is super easy to build UIs with. I hate myself for doing it, but it was by far the best option available, depressingly.
I know C++ is far more performant than Go, but for some games and software C++ isn't needed at all, e.g. nchat with tdlib (that library should be a native Go one by itself; it's not rocket science). These could run comparably on low-end machines with barely any performance loss.
In those cases there's nothing to gain with C++, because even compared to C, most C++ software (save for Dillo and niche cases) won't run as snappily as the C equivalents. Rewriting them in Golang won't make them unusable, for sure.
On the GUI side there's Fyne, but what Go truly needs is a default UI, promoted by the Go developers, written in the spirit of Tk. Tk itself would be good enough; even Limbo for Inferno (Go's inspiration) borrowed it from Tcl. Nothing fancy, but fast and usable enough for most entry-level tasks.
Python ships it by default because it weighs near nil, and most platforms share a similar syntax for packing the widgets. It's not fancy, and on mobile you need dedicated code and theming, but again, if people managed to build AndroWish as a proof of concept, Golang could do it better...
Another good use case for Go would be Mosh. C++ and Protobuf? Go should have been good for this. The C++ Mosh would be snappier (you can feel this with software like Bombadillo and Amfora vs Telescope), but on 'basic' modern machines (the first 64-bit machines with Core Duo or AMD64 processors) there would be almost no delay for the user.
Yes, sorry, 32-bit machines: by 2030 and beyond I expect those to be like 16-bit DOS machines in 1999. Everyone moved on, and 32-bit machines were cheap enough. Nowadays it's the same; I own an Atom N270 and I love it, but I don't expect to reuse it as a client or for Go programming (except for eForth) in four years. I'd expect to compute everything on the low-end 64-bit machines I own.
But it would make a good Go test case, for sure: if it runs fast on the Atom, it will shine on amd64.
With the current crisis, everyone should expect to refurbish and keep 'older' machines just in case. And be sure that long compile times will have to be cut in half, even if you use ccache. RAM and storage will be expensive and current practices will be pretty much discarded. Yes, C++ will be used in those times, but Golang too. Forget Electron/Chromium being used as a standalone toolkit outside of being the engine of a browser.
And if oil/gas usage is throttled for the common folk, EVs and electric heating will reach crazy numbers. Telecoms and data centers, in turn, will have their prices skyrocket so the rising power draw doesn't black out a whole country/state. Expect computing power caps, throttled resolutions for internet media/video/RDP content, even bandwidth caps (unless you pay a premium price, that is) and tons of changes. React developers using 66GB of RAM for Claude Code... forget it.
Either they rebase their software onto Go... or they have already lost.
Some Pokémon Crystal ROMs pack a huge amount of gaming into very few MB: Z80-ish ASM, KBs of RAM.
The Z-machine games, ditto. A few KB, and an impressively simulated environment that runs even on 8-bit machines hosting a virtual machine. Of course z3 games have fewer parsing/object-interaction features than z8 games, but anything from a 16-bit machine up (nothing special today; a DOS PC would count) will run z8 games and give you pretty complex text adventures. Compare Tristam Island or the first Zork I-III with Spiritwrak, where a whole subway is simulated, or with Anchorhead.
And you can write the games with Inform 6 and Inform6lib on maybe a 286 or 386 with DOS and any text editor. Check the Inform Beginner's Guide and DM4.pdf.
And not just DOS: Windows, Linux, BSD, Macs... even Android under Termux, where the games run under Frotz for Termux, Lectrote or Fabularium. Under iOS, too.
NetHack/SLASH'EM weigh a few MB and have tons of replayability. Written in C, it will even run on a 68020 System 7 Mac... emulated under 9front with a 720 CPU as the host. It flies on a 486 and up.
Meanwhile, Cataclysm DDA uses C++ and needs a huge chunk of RAM and a fast CPU to compile today. Some high-end Pentium 4 with 512MB of RAM would run it well enough, but you'd need to cross-compile it.
If I had the skills I would rewrite (no AI/LLMs, please) CDDA:BN in Golang. The compile times would plummet and CPU usage would be nearly the same. Of course, the GC would shine here, pruning tons of unused data from generated worlds.
The fantasy computer by Hundred Rabbits? I love their philosophy, and I'm glad Varvara exists, but I'm personally not up for programming assembly for a 4-color screen, and I'm sure many others feel the same.
Northern Spaniard here. Bring a sauna-loving Finn to one of those climate-change-induced days at 43°C we now get once or twice each summer... on the Atlantic coast, in Bilbao, which sits inside a valley.
I've been in saunas at 60-70°C, and the feeling inside was much more bearable, thanks to the lack of humidity, than 43°C in a climate closer to the UK's than to inner/Mediterranean Spain's.