
Using this programming style requires a rather powerful type system if you want to go past the simple examples the author shows and encode more complex validity conditions. I am still learning these things, but AFAIU the extreme edition of all this is the Calculus of Constructions used in Coq and Lean, where you can encode theorems and algorithm specifications as very complicated types. Proving a theorem corresponds to constructing an element of the theorem itself (seen as a type), and if you construct an element of a type encoding a certain specification, Coq can extract for you a program that implements that specification.
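A minimal sketch of this propositions-as-types idea in Lean 4 (standard Lean, not anything from the article; the names `my_add_comm` and `double` are made up for illustration):

```lean
-- The statement `∀ a b : Nat, a + b = b + a` is a type;
-- writing the theorem means constructing a term of that type.
theorem my_add_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- Specifications work the same way: this subtype's values carry
-- a proof about themselves, so the result provably equals 2 * n.
def double (n : Nat) : { m : Nat // m = 2 * n } :=
  ⟨2 * n, rfl⟩
```

Here `Nat.add_comm` is the library proof being reused as the term, and `rfl` discharges the specification by computation.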

This is what I understood, at least. Things are quite complicated!



A pragmatic interpretation of this is to:

* Subclass or compose String whenever possible (e.g. Id, Name, Address, AddressLine1)

* Subclass or compose Number/Matrix types whenever possible (e.g. Users, Packages, Widgets)

* Use immutable collections (e.g. the Google Guava library in Java)
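The same wrapper-type discipline can be sketched in Python with `typing.NewType` (the names `UserId`, `Name`, and `greet` here are hypothetical, not from the comment):

```python
from typing import NewType

# Distinct wrapper types: a type checker (e.g. mypy) rejects mixing
# them up, even though both are plain strings at runtime.
UserId = NewType("UserId", str)
Name = NewType("Name", str)

def greet(name: Name) -> str:
    return f"Hello, {name}"

uid = UserId("u-123")
# greet(uid)  # a type checker flags this: UserId is not a Name
print(greet(Name("Ada")))  # Hello, Ada

# Immutable collections: frozenset/tuple instead of set/list.
ADMINS = frozenset({UserId("u-1"), UserId("u-2")})
```

The runtime cost is zero (`NewType` is the identity function); all the checking happens in the static analyzer.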

I have built very powerful software with small teams using these principles.

At scale, your day-to-day programming becomes easier because the compiler does most of the error checking.

It is also very slow to program this way. You write A LOT of boilerplate, and there are only so many code-generation libraries you can throw at it before it becomes a pain to set up your IDE.

But it is worthwhile for complex applications. We did not have bugs.


Ghosts of Departed Proofs provides a fairly lightweight way to generate proofs from arbitrary code and track them in the type system. See this article for example: https://ocharles.org.uk/blog/posts/2019-08-09-who-authorized...

The post shows only simple "positive" assertions, without any combinatorial logic over the proofs, but the idea is similar to programming by contract.
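The real technique relies on Haskell's phantom types, but the core idea can be loosely sketched in Python: one trusted function is the only producer of an evidence-carrying wrapper, and downstream code demands the wrapper instead of re-checking the property. (The names `SortedList`, `sort_list`, and `binary_search` are invented for this sketch.)

```python
import bisect
from dataclasses import dataclass

@dataclass(frozen=True)
class SortedList:
    """Evidence wrapper: by convention, only `sort_list` constructs one,
    so holding a SortedList acts as a 'proof' the items are sorted."""
    items: tuple

def sort_list(xs: list) -> SortedList:
    return SortedList(tuple(sorted(xs)))

def binary_search(s: SortedList, target) -> bool:
    # Safe to assume sortedness: the wrapper type carried the evidence.
    i = bisect.bisect_left(s.items, target)
    return i < len(s.items) and s.items[i] == target

print(binary_search(sort_list([3, 1, 2]), 2))  # True
```

Unlike Haskell, nothing stops a Python caller from constructing `SortedList` directly; the wrapper documents and centralizes the invariant rather than enforcing it at compile time.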


> Using this programming style requires a rather powerful type system, if you want to go past the simple examples the author shows and encode more complex validity conditions.

You need a strong type system if you want to encode complex constraints at compile time, but you can still encode a surprising number of conditions in weaker type systems. And if you can't enforce a constraint you care about at compile time, capturing that constraint in an ADT using sum types and runtime assertions can still provide a lot of value to anyone reading or maintaining your code.

This sort of style requires a lot more diligence and verbosity than the same code in Haskell would, but I think dynamically typed programs can make a lot more use of their type system than they usually do.

At the very least, humans can try to pick up the slack where a compiler is insufficient, and while designing your data model in Ruby as if it were in Haskell may not help the interpreter at all, it can give the programmer a lot more context about how the code is intended to work, and what assumptions it is making about its data.


I think it's a pragmatic style even in languages with much less powerful type systems. You might have to do more of the work yourself -- manually creating types, for example -- but the principle is good.

I think it's advice you don't have to go all in on -- you get benefits even if you use it just a little.



