
Technical superstition, and the disassociation of cause and effect in computer "science"

This is a pessimistic post.

"Technical superstition" is the term that I use at work to denote the phenomenon where a piece of technology or procedure exists for no clear reason, except that everyone believes that it needs to. The original purpose of the code or patch or subsystem, and its true motivations, lay shrouded in the mists of organization memory, shielded by the complexities of the system which has grown up around it, like a Burmese jungle around a god-head ruin. Inquiries are met with half-smiles, shrugs, and answers like "Historical reasons; don't worry about it", leaving one with no further enlightenment except about the true depth and breadth of black and sticky mystery that must surely lie behind every kilobyte.

I don't think this is a feature unique to the business of software engineering. It seems clear that the psychological and cultural forces which feed technical superstition also feed many of our own cognitive biases and productivity antipatterns, and these can play huge roles in our lives and our illusions about how the world really works.

I define "superstition" to be any set of actions or behaviors whose stated justification is either suspect, spurious, or nonexistent, and yet the behaviors continue to be performed because there is a belief that doing so will avoid harm.

The pattern is reinforced every time the anticipated harm does not occur. It is a "specious cycle" which is difficult to rationally argue one's way out of; correlation may not imply causation, but it is a prerequisite, say our Anxieties.

One important condition for this kind of cognitive bias is for the stakes to be sufficiently high. The anticipated harm must be one hundred percent unallowable. It should be perceived as so fearsome that any number of small hardships are endurable to prevent its occurrence.

A great "secular" example of this is the commonplace understanding of the relationship between antibiotics and birth control pills. Over the years, in some regions of the world, there is the received wisdom that when a woman is on antibiotics, some form of backup contraception should be employed to prevent accidental pregnancy.

If this is taught as gospel, and men and women employ this practice faithfully throughout their lives, they have an extremely low chance of experiencing an accidental pregnancy. Based on this evidence, they perceive their behavior as rational and sensible, and they teach the practice, as gospel, to the next generation. And so forth.

However, the objective truth is that only a single antibiotic medication, rifampin, typically prescribed for tuberculosis, has ever been clinically shown to have any impact on the effectiveness of the Pill. But when it comes to birth control decisions, the stakes are so high, the consequences of an unwanted pregnancy so monumental, that even medical professionals hesitate to suggest anything except the most conservative course of action. Because why risk it?

In the software world, we see this same sort of magical thinking manifested in situations such as the XML file which, despite what the DTD says, must never be reordered, because of that outage that happened that one time that may have been related to that, we think. The outage was so harmful, and the number of possible causes was so high, and the complexities of the system were so untestable, that it is cheaper to just do a little rain dance around the XML than it is either to replicate the conditions to investigate the true root causes or to accept the risk of another outage.
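
To make the rain dance concrete, here is a hypothetical sketch (the file name, the element names, and the "sacred" ordering are all invented for illustration, not taken from any real system) of what this kind of superstition tends to look like once it has hardened into code:

    # Hypothetical sketch: the schema says sibling order doesn't matter,
    # but nobody has dared to test that claim since the outage.
    import xml.etree.ElementTree as ET

    # Do not alphabetize. Do not "clean this up". Historical reasons.
    SACRED_ORDER = ["cache", "billing", "auth", "logging"]

    def write_settings(settings: dict, path: str = "settings.xml") -> None:
        root = ET.Element("settings")
        for name in SACRED_ORDER:  # the superstitious part: order preserved on faith
            child = ET.SubElement(root, name)
            child.text = str(settings[name])
        ET.ElementTree(root).write(path)

The comments are doing the real work here: they transmit the fear to the next maintainer without transmitting any of the evidence.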

The birth control scenario is different from most software analogues because in medicine there is a direct financial incentive for companies to demonstrate the safety and predictability of their products. In software, the incentives for such non-value-added rigor are very low. From a resourcing point of view, the only perceived costs are the seconds-to-minutes per week required to perform the superstitious behaviors. Well, that, plus the impact on morale.

And that's the point I wanted to come back to. I'm going to go out on a limb here and say that people who work in the computer business tend to be a little more analytic than intuitive. A little more thinking than sensing. A little more science than art. This means that what makes their heart sing is a nice stack of causes which lead to a determinate result.

To ask them to perform tasks with only a woo-woo, hand-wavy explanation is like asking them to join a different industry. The promise of computers is a world where virtually every atom has been intentionally placed by a human, a world where variation and indeterminacy are anathema and bad for business, and where those who are uncomfortable with such things can be paid to stamp them out.

This is all true, as long as the system maintains a certain rationality and simplicity. But modern computing needs can no longer be served by simplicity. The user's hunger for functionality and robustness and elegance provides business incentives for continually increasing levels of complexity, and the rate of growth of these desires necessarily forbids the use of resources for decomplexification efforts.

In other words, it is simply more profitable to burn a few seconds of time here and there to shore up the dikes against our worst fears than it is to ensure that the work is rational and meaningful. And that's not even wrong. That's just capitalism.

But it must be recognized that the result of all of these tiny rational decisions, piled together over a course of years and a panoply of personalities, is an irrational system. In front of this system, we seat people whose career and life satisfaction depend on working in a rational world. Part of their education is to learn the harsh truth that at any realistic level of complexity, the user-facing behavior of the system is, at best, a non-linearly emergent property of the code. The relationship between developer effort and user value is necessarily lost in any such system, and along with it, things like engagement and joy.

To return to my original point, technical superstition is caused by monotonically increasing levels of complexity, which disassociate effects from their causes, both in terms of bugs and features. The resulting superstitious behaviors are individually cheap to perpetuate, and so are only recognized as problems at a management level when lifting their accumulated weight becomes expensive enough to justify systemic attribution. But for those asked to perform these behaviors, they are immediately perceived as demoralizing evidence of the progressive de-rationalization of what at first appears to be a profoundly rational science.

I said this was a pessimistic post because I literally cannot see any sensible path out of the current state of affairs, nor even any way it could have turned out differently.

