
As a youngster who has had to pick up a saw and chisel to repair an old chair, I think you're looking back with rose-tinted glasses a bit. I've never been astonished by legacy code, as much as bemused by the lack of object orientation, data normalization, etc. The technology changes, of course, but there have been real advances in craftsmanship and technique.

There's probably some 'cleverness' lost in the new ways vs. the old. I've seen some pretty novel approaches to what should be simple tasks in legacy code, and I do recognize that for what it is. But as impressive as that cleverness can be, I think it's one of the reasons why a lot of legacy code persists and nobody wants to touch it. Among many other reasons.


The truth is that most code, whether old or new, is terrible. The odds of any old code you might end up having to look at being terrible are very high.


Any of my code more than 5 years old is terrible. This has always been true for me.


Could be survivorship bias: all legacy code that isn't terrible gets rewritten, while all terrible legacy code isn't touched by anyone and therefore remains.


Yeah, but this guy wrote the first native C++ compiler and then the D programming language.


Did not know that until you mentioned it! Sometimes even legends post here on HN!


Walter is a pretty regular presence around here, but I agree, it's like Arnold Schwarzenegger dropping in at Reddit to hang out.


To da choppah!


Huge disagree on this. Old code tends to be much more terrible because the tools were terrible. They gave little incentive and absolutely no help to write even passable code. We end up with a mess of poorly documented, sparingly commented spaghetti code that sometimes works, often by accident.

Nowadays, we have linters to point out common classes of mistakes, programming languages with actual type systems to enforce invariants, integrated testing tools, etc.
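
For instance, here's a minimal sketch (Python, with invented names) of the kind of invariant a modern type checker enforces for free, something the old toolchains gave you no help with:

  from typing import NewType

  # Hypothetical example: two kinds of integer IDs get distinct types.
  UserId = NewType("UserId", int)
  OrderId = NewType("OrderId", int)

  def cancel_order(order_id: OrderId) -> None:
      print(f"cancelling order {order_id}")

  cancel_order(OrderId(7))   # fine
  cancel_order(UserId(42))   # a checker like mypy rejects this mix-up

The mix-up gets caught mechanically, before any reviewer ever sees the code.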

Note that, yes, we do still have new, terrible codebases. But at least the tooling nowadays has raised the bar, enforcing a minimum floor of quality that, while very low, is still oh so much higher than it used to be.


> Huge disagree on this. Old code tends to be much more terrible because the tools were terrible.

I've looked at 1990s code in the Windows NT kernel. It was wonderful.

The oldest source file I went through was from 1993. Perfectly readable and understandable.

One of the best modern code bases I ever worked in didn't even have a linter set up. The principal dev reviewed every single commit and enforced a consistency across the code base that was better than any tool ever could have.


> The principal dev reviewed every single commit and enforced a consistency across the code base that was better than any tool ever could have.

How was it so much better that it justified that level of busywork on the part of the senior member of the team (and busywork for everyone else, fixing the style nits enforced in review)? I would have guessed that taking a few hours to install and configure a formatting linter, freeing up the principal dev's time to focus on other things, would have been hugely high-leverage.


It ensured not only consistency of style, but also consistency of ideas. Every file was structured similarly, impedance mismatches were minimized, and work across the entire code base was organized and unified. Junior engineers got a chance to talk to the principal developer about every commit they made, and accordingly their abilities as software engineers skyrocketed.

Developers quickly set their IDEs to follow the team's coding guidelines, so style wasn't really a problem.


Reading every commit seems like the best short-term way to maintain code quality and keep the whole project in your head?


This is nonsense. The code quality is determined by the author, just like today.

> Nowadays, we have linters

I was using a C linter in 1986. The tooling is better now but it still comes down to the author.


We had "coding standards," and if you didn't code it to the right format, you'd get fussed at. I didn't start professionally until the mid 90s, so I'm a little younger than the OP. I've been coding since the mid 80s though as a youngster.

Putting BEGIN..END around a one-line block was always a point of contention, since it wasn't required. Less code vs. arguably more readable code (Pascal):

  if Bla then
  begin
    DoStuff;
  end;
vs

  if Bla then
    DoStuff;


> The code quality is determined by the author, just like today.

This is nonsense. The worst authors can still produce bad code with the best tools and the best authors can still produce good code with the worst tools, but most authors are somewhere in the middle, and tooling makes a huge difference to the average.


I agree with the first sentence, and re: the second, I also was using a C linter in 1986.


> As a youngster ... as much as bemused by the lack of object orientation.

And as an oldster you'd be happy again to see the lack of object orientation, rather than the five-level misabstracted inheritance hierarchy that someone built not on purpose, but because university taught it as the modern way to automatically end up with sanely structured code... and sure, sure, every tool can be used well or misused, and never overgeneralize 8-)

I may be wrong, but youngsters and oldsters both have their own rose-tinted glasses.


I think you might be missing the parent comment's point. The way I read it, they aren't saying that old code is better, just that you used to build everything up from scratch (or to a much greater extent than nowadays) instead of using whatever framework is in fashion this week.


You also have to understand one thing (source: me, a software developer since the late 80s in Europe): there was no internet, and most of the people working in the field had no CS degree.

This meant that neat new tricks (and stupid old mistakes) were done and (re)discovered all over the place, ALL THE TIME. You might have decided that data normalization was a good idea, or you might have got the idea from one of your mentors... but you could not know that precisely the same thing was being done/taught on the other side of the street, or even two floors below your office.


Frankly, we still don't.

It amazes me how many interesting private (never published) ideas are in any proprietary codebase that has had high-caliber engineers working on it.


Data normalisation was huge in the past because of storage concerns, but it makes less sense now. There are still concerns about inconsistent data and duplication, but often it's more performant not to normalise to the extreme extent we learned to in the 90s with all the normal forms. It's still a good concept, just no longer a strict rule.

The inconsistency can be covered with stored procedures and prepared statements, which are also a great way to improve security (against SQL injection).
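
To illustrate the injection point (a minimal sketch using Python's built-in sqlite3, with a made-up table): a parameterized statement sends the value separately from the SQL text, so user input can never be parsed as SQL.

  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
  conn.execute("INSERT INTO users VALUES (1, 'alice')")

  evil = "alice' OR '1'='1"
  # Vulnerable: string interpolation lets the input rewrite the query.
  #   conn.execute(f"SELECT * FROM users WHERE name = '{evil}'")
  # Safe: the ? placeholder keeps the value out of the SQL parser.
  rows = conn.execute("SELECT * FROM users WHERE name = ?", (evil,)).fetchall()
  print(rows)  # [] -- the malicious string matched no real name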

So IMO excessive normalisation is no longer the be-all and end-all in this day and age, just as an RDBMS isn't always the way to go. Sometimes object storage is better.

But anyway, I'm surprised you saw less normalisation in old code. Personally, I saw much more then than now.


This so reminds me of a conversation with my dad.

When I told him about SQL databases: relational stuff, normalization. Awesome.

He told me that was all fine and dandy, but just too slow, this newfangled SQL database stuff. He used databases where you simply accessed rows by key. If you needed to access something by a different key, you just made a different table where the same data was arranged by that key instead. Super performant.
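
That pattern is essentially hand-maintained secondary indexes. A sketch of the idea in Python (invented names, dicts standing in for the keyed tables):

  # Same records stored twice, keyed differently, kept in sync by the
  # application itself rather than by a query planner.
  employees_by_id = {}
  employees_by_dept = {}

  def insert(emp):
      employees_by_id[emp["id"]] = emp
      employees_by_dept.setdefault(emp["dept"], []).append(emp)

  insert({"id": 7, "name": "Ada", "dept": "ENG"})
  insert({"id": 9, "name": "Grace", "dept": "ENG"})

  print(employees_by_id[7])        # direct lookup by primary key
  print(employees_by_dept["ENG"])  # direct lookup by the second key

Every write has to touch every table, which is exactly the trade-off NoSQL stores make you think about again.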

Of course my dad also programmed in languages like 370 assembler.

Funny how the young folks today talk about NoSQL databases indeed.


SQL databases were a dumb misstep, it's always baffled me how they ever caught on. In 10 or 20 years we'll look back on them the same way we look on C++/Java-style OO today.


It's not really such a dumb idea, actually. It just depends on what you favour. There are plenty of nice things about SQL databases. I suppose it's the regular back and forth between one extreme and another, driven both by "we need something new to work on" and by a changing technology landscape.

Relational databases really weren't all that practical back in the day with the hardware that was available. Putting thought into defining how you were going to query your data was unavoidable, though. Relational algebra wasn't a thing from the start, either.

If someone comes along and tells you that you don't have to know all this up-front, that you can come up with whatever query you want to ask about the data you have and you will be able to, isn't that awesome? Ad hoc, just like that. No need to carefully transform the data you have, ensure you keep it all up-to-date in multiple places, etc. Of course data volumes grow, and even the newer hardware you have can soon no longer handle what you've got in a timeframe that you like. Indexing will be a thing. Of course even indexes grow way too huge to really perform, but hardware to the rescue, where at least all the indexes you need frequently will fit in RAM. Lots of caching going on too for your regular workloads.

Guess where the story is going? Well, of course there's the old analytical vs. transactional load thing, i.e. your "Data Warehouse" is a separate database that is optimized for the pre-defined queries again, actually de-normalizing lots of things and sitting on different hardware so as not to disturb warm caches for the transactional load, etc. And yes, finally NoSQL again, i.e. back to the roots: put more thought into how you're going to query this, as your globe-spanning SaaS load won't fit onto the hardware you have available. Of course this brings problems, because we're just so good at predicting what kind of query we'll want to ask about our data. Databases like MongoDB, Cassandra, AWS DocumentDB, etc. then grow indexes supporting arbitrary querying... There's a hole in my bucket, dear Liza, dear Liza... :)

[I'm sure this nice story line isn't completely correct or faithful to exact timelines, but it illustrates the point]


A different index, surely, not a different table? This sounds like an ISAM database to me, where you'd have to do lookups manually one by one, picking the right index for each yourself.


To be honest, I don't really remember much of what he told me any more, and I can't go back and ask him any longer. It's possible, but I can't tell you yes or no for sure. It just evoked the memory of that conversation. Same with the Pick and MUMPS the other reply mentions. Doesn't ring a bell, but it seems possible.


Your dad probably knew all about Pick and MUMPS though


It's complicated.

Technically speaking, if there's any decay in your teeth, that's a cavity. The deeper it is, the faster it'll get worse. When you get that filling, though, is a judgement call that a lot of dentists sadly make for you. Is it best to go ahead and get those fillings now? Probably, as far as your dental health is concerned. Can it be delayed and done later? Almost certainly. Just a bigger filling later. Or maybe a crown if it gets too far gone.

Also, how fast your decay progresses depends on a variety of factors. For some people, it won't be long before those questionable cavities become serious problems. For others, it can take many years, if they decay much further at all. I think it's largely genetic, but it also depends on oral care, diet habits, etc.

So, no, I don't think most dentists are just seeing dollar signs and putting fillings onto teeth without cavities. It's not outright fraud or malpractice. But some dentists are sympathetic to your financial situation, while others think dental health should be a priority above everything, no matter the cost.

So I think it's a mix between dentists' values/philosophies about dental care and the lack of science about how much the speed of tooth decay varies from person to person (and an inability to measure that).


It's not that cut and dried. Just because you have a small cavity doesn't mean it's only going to get worse. That's the reason for fluoride in our water, in our toothpaste, and in the varnish the dentist applies at the end of your cleaning (which, for some reason, insurance never wants to pay for).

Fluoride treatments and good dental hygiene can reverse caries. Caries should only be treated if they're large, or if they're still growing after discovery and alerting the patient.


And did you know dental plaque and tartar actually protect your teeth? And that it's just out of vanity that we want them removed?

https://pubmed.ncbi.nlm.nih.gov/17016887/


Just because having plaque on your teeth will block acids doesn't mean it's actually a good idea not to clean your teeth.

I guess if you could show a study showing that leaving plaque/tartar on teeth leads to better health outcomes I might be interested.


The reason it's a good idea to clean your teeth is that if you don't, they'll stink and become discolored. It's not because it keeps them healthy: cleaning your teeth strips some protection from bacteria and acids away from them while deepening the pockets around them, creating a nice place for disease to live.

The idea that intensely cleaning your teeth (and keeping them white) is healthy is an intuitive leap that marketers take advantage of, just like the bad intuition that makes people clean their faces intensely to get rid of skin problems. The reality is more complicated. Clean, pure, healthy, white.

Fossils have better teeth than we do, but we have prettier, less fragrant teeth.


>Fossils have better teeth than we do, but we have prettier, less fragrant teeth.

That's because those fossils had way less access to sugar, which is the main reason we need so much more dental hygiene nowadays.


You all think sugar causes tooth decay? No, sorry. It's lipids. Not all bacteria use glucose; most use fatty acids for energy.

https://www.dentaleconomics.com/science-tech/oral-medicine-a...

It is more probable that the sugar throws off lipid metabolism, and that changes the oral microbiome.


If you leave the plaque and tartar on the teeth without fixing the fundamental problem (immune health), then of course there will not be better health outcomes.

Caries are a sign of an immune disorder or imbalance. As long as dentists only scrape people's teeth and do not integrate their patients' health into their practice, you will keep having plaque and tartar to remove as your body fights this battle.

https://www.lupus.org/news/people-with-lupus-exhibit-increas...


>Caries are a sign of an immune disorder or imbalance

Or just terrible habits, like drinking liters of soda every day. Besides the diabetes, the mix of sugar and acid is terrible for dental health, and contrary to the myth, brushing your teeth will not save them, at all, if one persists in terrible habits.


Sugar does not directly cause oral microbiome imbalance. Lipids do.

https://www.dentaleconomics.com/science-tech/oral-medicine-a...


An in vitro study shows that plaque protects against externally applied acid. OK. Now what about the acid produced by the bacteria hiding behind the plaque?


Just because dental caries show up with plaque and tartar in no way means that plaque and tartar are causing the caries. These are mechanisms our body uses to protect our teeth from unbalanced oral bacteria. Just like the microbiome of the gut, we have one in our mouth. Removing plaque and tartar does not stop caries.

Caries are initiated by direct demineralization of the enamel of teeth due to lactic acid and other organic acids which accumulate in dental plaque. What is tartar?


Tartar is a form of hardened dental plaque. It is caused by precipitation of minerals from saliva and gingival crevicular fluid in plaque on the teeth. This process of precipitation kills the bacterial cells within dental plaque.


> Is it best to go ahead and get those fillings now? Probably, as far as your dental health is concerned.

Modern science says no: you should apply topical fluoride for most early cavities, and if the teeth get a modern pronamel treatment (cough, stuck behind USDA approval) they should be able to recover quickly.


Could you add more regarding pronamel, so I can begin my negotiations with The Algorithm?



I think the poster might be referring to "Sensodyne Repair & Protect with NovaMin"


> Is it best to go ahead and get those fillings now? Probably, as far as your dental health is concerned.

This is mostly untrue. Fillings don't last forever, so prematurely filling cavities "starts the clock" on the longevity of the filling. When replacing a broken filling, the dentist has to remove more of the tooth. Eventually you will need a root canal and/or crown. So to maximize your health, the dentist still has to make a judgment call on how early or late to fill.

I say mostly because mercury amalgam fillings do seem like they can last effectively forever (20+ years), but they have fallen out of favor.


A bug bounty program is aimed at finding individual instances of a security hole in your technical architecture. It's like finding a weak spot in a ship's hull and punching a hole through it.

A security consulting firm would do more for you. They'd basically be telling you how to make your entire hull stronger. And one of the things they might tell you to do is start a bug bounty program. They would also likely put things in place for the real security problem in your org: social engineering. Among other things.

And more than that, spending X dollars on a security consulting firm demonstrates that you did some due diligence in securing customer data. And that goes a long way in a courtroom.


What kind of structure would you build in the wild, if you'd never seen a house, or hut, or lean-to? Think you'd just arrive at these ideas on your own? Think you'd even consider building your own at all? Or would you just keep walking until you found a cave?

We depend more on our collective intelligence and teachings than you think.


> What kind of structure would you build in the wild, if you'd never seen a house, or hut, or lean-to?

For an instructive example: a crow's nest of sticks, leaves, etc. Especially if that human happens to come upon a crow's nest in the wild. Notice that the human can probably scale that design to fit their own larger stature.

Now imagine a crow coming upon a little 1x1 dugout crawdad battle arena, or a little lattice structure of sticks enclosing a 9-year-old's caddisfly larva collection. If you saw a crow observe and then build that, you'd have quite a research paper on your hands.

But we're probably going to get stuck on physical differences, so let's change the subject and just give both animals a stick.

One of them uses it to extract a morsel of food from a hard to reach area.

The other uses it to mark time according to either the moon cycle or possibly something else that lasts that long. I.e., this animal has created a calendar.

Using historical evidence, please tell me which animal performed which feat.

The thing is, I really do find the intelligence of crows fascinating! I just don't get the desire to pair that with the "humans aren't such hot shit" trope.


I think you are conflating how a socially raised human would behave in the wild with how a feral human would behave. Despite nearly 8 billion people on Earth, we have very little data on human cognitive development when isolated from other humans.

