More math in code

So why does math work? Or rather, why does it work out? Why are you never surprised by a corner case where 1037 + 1 = 1039 instead of the expected 1038? Of course, you could define such math. You just don’t, because it’s not that useful. Why, then, do we define such programs?

Haskell is mathy in its structure. And that’s not only due to the high level of abstraction of some of its concepts. The key is that the concepts state and obey rules, just like addition obeys a set of rules and never surprises you with its result. The amount of deviation is minimal, which results in clean blocks that fit together to form larger constructs. Like Legos.

The most common example of not-enough-math is mutability. Consider:

f(x) == f(x)

This is always true in mathematics. Not so in programs. And I’m not even talking about side effects like network calls or database writes. Often someone will make f mutate its argument so the next call to f does something else. The effect? Code like the following will not work!

assert(f(x) > 0)
return f(x)

Far-fetched? What if f(x) is order_total(order)? And it mangles prices to apply discounts?
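A minimal sketch of that failure mode in Python (the order shape and discount rule are hypothetical, invented for illustration):

```python
# Hypothetical order_total that mutates its argument: it applies
# the discount by rewriting prices in place, so each call discounts again.

def order_total(order):
    if order.get("coupon"):
        # Bug-prone: mutating the order means the next call
        # sees already-discounted prices.
        order["prices"] = [p * 0.9 for p in order["prices"]]
    return sum(order["prices"])

order = {"prices": [100.0, 50.0], "coupon": True}
first = order_total(order)   # 135.0 — 10% off 150
second = order_total(order)  # 121.5 — discounted a second time!
assert first != second       # f(x) == f(x) no longer holds
```

The assert-then-return snippet above fails for exactly this reason: the check runs against one total, and the caller gets a different one.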

I think this extends to the level of how we define features. Every corner case is costly. The cost does not stem from how long it takes to add another if for that case. Rather, it comes later, when you think something like “this code always does X”. But that may not be true. A couple of corner cases exist where something other than X happens. So your prediction of the behavior will be incorrect.

What’s even worse is that when your features start depending on each other, a corner case somewhere at the bottom will result in unpredictable behavior at the top. Needless to say, it will be difficult to find the culprit in that situation.

Systems that follow a mathy philosophy turn out to compose features well. Think Linux, Git, or Vim. In all of these you get some basic features and ways to compose them into workflows. And since the basic features are well-defined thanks to their compactness, you end up with a resilient and flexible structure.
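The same composition story can be sketched in Python (the helpers are hypothetical, chosen only to illustrate the shape): because each step is a small pure function with a clear contract, a workflow is just nesting, much like a shell pipeline.

```python
# Small pure functions with well-defined contracts compose freely.

def words(text):          # str -> list of str
    return text.split()

def normalize(tokens):    # list of str -> list of str, lowercased, no punctuation
    return [t.lower().strip(".,!?") for t in tokens]

def counts(tokens):       # list of str -> dict of token -> occurrences
    out = {}
    for t in tokens:
        out[t] = out.get(t, 0) + 1
    return out

# Each piece is independently testable; the workflow is just nesting.
freq = counts(normalize(words("To be, or not to be!")))
assert freq["to"] == 2 and freq["be"] == 2
```

None of the pieces knows about the others; they fit together purely through their input and output types.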

Now, in a real application, that seems like a lofty, unachievable ideal. Deadlines bear down on you; corner cases multiply before your eyes. But giving it up also ends up costing you. Think about every time you had the “AHA!” moment while investigating a bug report. How the pieces combined in a slightly unexpected manner to give a totally unexpected result. How you might have avoided that if the pieces all obeyed a number contract. Or an iterator contract. Or a monad contract.
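As a small illustration of what obeying a contract buys you, here is a Python sketch (the Countdown type is hypothetical): any object that honors the iterator protocol automatically works with every piece of generic code written against that protocol.

```python
# A type that obeys the iterator contract plugs into all generic
# machinery (for-loops, sum, max, list, ...) with no special cases.

class Countdown:
    def __init__(self, start):
        self.start = start

    def __iter__(self):
        n = self.start
        while n > 0:
            yield n
            n -= 1

# No surprises: these functions rely only on the contract.
assert list(Countdown(3)) == [3, 2, 1]
assert sum(Countdown(3)) == 6
assert max(Countdown(3)) == 3
```

The contract is the whole interface; nothing about the consumers needs to know what Countdown is.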

I’m not advocating building every system as painstakingly as Haskell. For every job a tool, and for every tool a job. This is just a note to future self - keep an eye out for opportunities for more math in your code!

