
I think you prevent these "corrupt file corrupts important memory" bugs in two ways: 1) use a memory-safe language, and 2) write fuzz tests just like you'd write unit tests.

1 is becoming extremely viable, and infrastructure for 2 is starting to be included with modern programming languages, so it will soon become the norm.



> write fuzz tests just like you'd write unit tests

You're missing a key point the GP made: operating systems are big. There's also a finite number of person-hours in a development cycle. Third, fuzzing is a complicated and open-ended process that's difficult to analyze.

Fuzzing is not a simple subject. Testing fuzzed input requires a lot of instrumentation, which can skew measurements if that instrumentation isn't enabled in production builds. Not all code is easily tested with fuzzed input even when it can be properly instrumented.

Because there's a limited amount of time in a day, not every working moment can be spent on the near-infinite permutations of fuzzed input for every component on a system. Even automated test suites take time to run, and then can only test known workflows or code paths.

An OS has thousands of components with a complex graph of interactions among them. There's vast surface area for bugs and limited time to explore it all. Just switching languages and adding more tests won't make bugs go away.


Being big makes it challenging, but we're also talking about big companies. Apple (macOS), Microsoft (Windows), and the Linux vendors (IBM, Google, Amazon, and a long tail) are big enough that they already invest a lot in security and can invest even more. This is where central tooling teams come into play.


I was at Apple for more than 15 years in SWE. The scale of OS builds isn't just big; it's this side of intractable. They have tooling and build teams. It's not an easily solved problem, and you can't just throw money at it to make it so.

The OS has thousands of components with a combinatorial number of interactions and states. Multiple individually bug-free components can create an exploitable situation in combination. The same is true for Windows, Linux, or any large software project.

Unit tests are not magic. Neither is fuzzing. Even if you've got great fuzzing infrastructure and coverage, it still takes non-zero time for an engineer to analyze, understand, and then fix a problem. Many components in an OS require a fair amount of domain knowledge, so engineers aren't fungible. You can't grab Sally from the Mail team and give her a bunch of CoreMedia bugs to fix.

Keep in mind that all the companies you mention invest significantly in security and still just barely keep ahead of exploits, and often can't. Hackers don't have the same burden. They don't need to find every bug; they only need to find a profitable one (depending on the hat they're wearing). They also don't need to maintain their exploit across future versions; they just move on to the next one.


You want all of every OS rewritten in Rust?

Boy I’ve got bad news for you


It is quite telling that every single time one talks about secure OSes, someone has to bring up Rust.

The parent only mentioned "use a memory safe language", yet Rust it must be.

No, it could have been JOVIAL, NEWP, PL/I, BLISS, Modula-2, Modula-3, Oberon, Object Pascal, D, Nim, System C#, Ada, Swift, ....


That is why I keep mentioning Ada, in the hope of balancing the "memory-safe language" selection bias.

I only mention it a few times, whenever the topic of memory-safe languages and Rust comes up. And I have already been asked to stop.


> And I have already been asked to stop.

Don't worry about it.

It's important to bring it up, because there are quite a few people who don't want to adopt Rust for one reason or another but want a mature lower-level language alternative to what they're using now.

I try to mention Ada when people are looking for something with its characteristics but don't appear to have considered it. I try not to hijack threads, interject, or be rude or overbearing, though it has probably read that way at least once.

> That is why I keep mentioning Ada, in the hope of balancing the "memory-safe language" selection bias.

Ada suffers from a lack of proper marketing and from being misunderstood. The Ariane 5 disaster gets mentioned as a reason not to use it, but not Ariane 5's track record since then as one of the most reliable payload rockets in history (including a streak of 80+ consecutive successful launches), which is why it was picked to launch the James Webb Space Telescope. Ada is usually mentioned alongside COBOL or Modula-2, so I expected a bloated, bureaucratic, grotesque language, not a modern one with decent tooling and a package manager. While it may have been "complicated" when released, the newer versions, especially Ada 2012, have really polished up the language.

I've been productive in it since the early months of using it, and it has exceeded most of my expectations.


A good example would be NVIDIA picking up Ada for autonomous-vehicle firmware.


I guess there are multiple reasons why mentioning memory-safe languages brings Rust into the conversation.

One is the naive one: someone who lacks the proper background and for whatever reason thinks it is the only way, thus Rust.

Then we have those who are aware, but think all the other safety approaches are invalid for whatever reason, thus Rust.

Then we have those who feel somewhat threatened by the security awareness that Rust brings into the picture, so whenever we talk about security, there is a Pavlovian reaction that it must be Rust.

Picking any language that embraces bounds checking enabled by default, and proper string/arrays, is already a major improvement.


> Picking any language that embraces bounds checking enabled by default, and proper string/arrays, is already a major improvement.

Even bog-standard C can be made safe, and include bounds checking, if you create and consistently use an API that supports that mindset; see, e.g., Microsoft's strsafe library (not that I'm saying it is the epitome of such a thing).

However, the will and the effort have to be there. C is an incredibly flexible and capable language and can be as safe as you want it to be.

In the last Random ASCII article I read, the author found an easily exploitable buffer overflow in iOS simply by checking the code to see where it used memmove, and worked out the exploit from there, correctly guessing that there was no bounds checking. So it seems the usual culprits are fairly easy to spot, but somebody needs to replace the worst APIs with something safer (memmove, memcpy, memset, malloc, a = b).


Any "safe" string library for C that takes pointer and length as separate arguments is safe in name only.

Microsoft does it, because it comes along with SAL[0], which is kind of Microsoft's own Frama-C.

Also, as long as WG14 doesn't care, everyone will keep passing char* around while hoping it is actually null-terminated and points to the right place in memory.

Technically, ISO C could get safer types, or something like SAL/FORTIFY, but as you say, without the will it will never happen.

[0] - https://docs.microsoft.com/en-us/cpp/code-quality/using-sal-...


Please keep fighting. The automatic association here of memory-safe languages with Rust annoys me as well, just like the one of mixing languages on a VM with WebAssembly. Both features have been available for decades on C# and the CLR, and on Java and the JVM, to name a few.


The quote was "corrupt file corrupts important memory", which is exactly what WUFFS is for. In Rust you can still screw up badly enough to have that bug; it's harder, but not impossible. In WUFFS, the stupidest file-format parsing you can do is merely wrong and can't corrupt anything. In WUFFS, the possible outcomes of incompetently reading an image file include: the colours in the image are wrong, the image is just a solid black or white rectangle because they forgot to actually render it, bits of it are smashed up somehow, and so on. Notice how it's always about the image? WUFFS can't express programs that do other stuff.

General-purpose languages are cool, but their unlimited expressiveness is exactly what you don't want for security. Why is this PNG loader able to make network connections and control my camera? You know a bad guy will find a way to force it to use those capabilities, right?


Apple is already writing more of the OS in Swift: https://blog.timac.org/2021/1219-state-of-swift-and-swiftui-...


Most of them are apps, not what I'd consider the "OS".


To be fair, they need to start somewhere, and if userspace gets more secure, it's an easier sell to non-believers.

Think of it a bit like Android, versus the uphill battle .NET fought on Windows against WinDev's love for C++/COM.


Directionally, yes, I want more parts of more OSes written in Rust and other languages that allow building safe abstractions over unsafe code. This is table stakes in 2022.


Rust? You mean Swift, right?


Could you elaborate on the "bad news" part?

Is there something Rust inherently can't do but C/C++ can?


Exist 20 years ago when the OS was written.


Even ColorSync, which the top of this thread mentioned, is older than that. It was first released in 1993 (https://en.wikipedia.org/wiki/List_of_macOS_components#Color...)

The work to remove all AT&T code from BSD started in 1989, so other parts of MacOS probably are over 30 years old, too.

I don’t think anything of the original Mac OS remains, much of it having been written in 68k assembly and Pascal.


Yeah, Rust didn't exist back then. And rewriting a whole OS is an impossibly high ask.



