I remember working on a codebase where everything had null check pyramids like
if (foo != null) {
    if (foo.getBar() != null) {
        Bar bar = foo.getBar();
        if (bar.getBaz() != null) {
            for (Quux q : bar.getBaz()) {
                if (q.getFoobar() != null) { ... }
            }
        }
    }
}
but that was a long time ago, in a codebase that operated by passing around humongous mega-objects with hundreds of fields (running into the limit on how many parameters a constructor can take was a constant headache). It solved a lot of its problems by printing a stack trace and shrugging, and objects would be half-constructed and methods would return null in all sorts of scenarios.
In comparison, the entire codebase for Marginalia Search has about 65 null checks in total. A decent chunk of them deal with older APIs that sometimes idiomatically use nulls to communicate things (java.util.Map, BufferedReader and the Servlet API in particular); the rest deal with the inherently noisy nature of crawl data. So while there are a few places where nulls appear in the data, they are almost always contained to the module that produced them and don't cross interface boundaries. It's very far removed from some pinnacle of code quality, but it gets a few engineering principles right that the other codebase didn't.
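A pyramid like the one above can usually be flattened into a single Optional chain. A sketch, with hypothetical record types standing in for the original getters:

```java
import java.util.List;
import java.util.Objects;
import java.util.Optional;

// Made-up domain types mirroring the pyramid's getters.
record Quux(String foobar) {}
record Bar(List<Quux> baz) {}
record Foo(Bar bar) {}

public class FlattenPyramid {
    // Four levels of null checks collapse into one Optional chain.
    static List<String> foobars(Foo foo) {
        return Optional.ofNullable(foo)
                .map(Foo::bar)
                .map(Bar::baz)
                .orElse(List.of())   // treat any missing level as "no results"
                .stream()
                .map(Quux::foobar)
                .filter(Objects::nonNull)
                .toList();
    }

    public static void main(String[] args) {
        System.out.println(foobars(null));  // prints []
        System.out.println(foobars(new Foo(new Bar(List.of(new Quux("x"))))));  // prints [x]
    }
}
```

Whether this reads better than the pyramid is a matter of taste, but it does make the "absent at any level means no results" policy explicit in one place.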
Maybe this is a product of when I was educated, but in the early Java days, null-checking everything was promoted as good practice. Even today, depending on what your teacher said was correct and on your coding background (e.g. whether you came from C++, or read 'Effective Java'), you might still null check everything.
Personally, I lean towards "use nulls sparingly and be very explicit about where null values are possible", prefer immutable objects, etc. This reduces unnecessary null-check noise and makes things more robust. In Java, I even prefer to add @Nullable to fields to shift the default mindset to "assume non-null unless a @Nullable annotation exists", though another developer once remarked in a code review that "that is not how Java is designed and @Nullable is just noise".
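A minimal sketch of that convention. The @Nullable marker here is hand-rolled purely for illustration (real projects would pull in something like the JSpecify or JetBrains annotations), and UserProfile is a made-up type:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Objects;
import java.util.Optional;

// Hand-rolled marker for illustration only; use JSpecify/JetBrains in practice.
@Retention(RetentionPolicy.CLASS)
@Target({ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER})
@interface Nullable {}

// Immutable object: fields are assumed non-null unless annotated.
final class UserProfile {
    private final String username;       // never null, enforced in the constructor
    private final @Nullable String bio;  // explicitly allowed to be absent

    UserProfile(String username, @Nullable String bio) {
        this.username = Objects.requireNonNull(username, "username");
        this.bio = bio;
    }

    String username() { return username; }

    // The one nullable field never escapes as a raw null.
    Optional<String> bio() { return Optional.ofNullable(bio); }
}

public class NullableSketch {
    public static void main(String[] args) {
        UserProfile p = new UserProfile("ann", null);
        System.out.println(p.bio().orElse("(no bio)"));  // prints (no bio)
    }
}
```

The point of the annotation is less about tooling and more about flipping the reader's default assumption: an unannotated field is a promise.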
Having nullability as part of the type makes everything more explicit, so I'm a huge fan. Beyond just the perf and safety benefits, it's a win for domain modeling.
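Even in plain Java, some of that explicitness can be recovered by putting absence in the return type. A sketch (CONFIG and the method names are made up):

```java
import java.util.Map;
import java.util.Optional;

public class ExplicitAbsence {
    static final Map<String, String> CONFIG = Map.of("host", "localhost");

    // Absence is invisible in the signature: callers must remember to check.
    static String portImplicit() {
        return CONFIG.get("port");  // may be null
    }

    // Absence is part of the return type: callers are forced to handle it.
    static Optional<String> portExplicit() {
        return Optional.ofNullable(CONFIG.get("port"));
    }

    public static void main(String[] args) {
        System.out.println(portExplicit().orElse("8080"));  // prints 8080
    }
}
```

Languages with nullable types bake this into every signature; in Java it has to be an API design choice, which is part of why the convention discussion above matters.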