— Software Engineering — 3 min read
I work on a team that manages far more software libraries than there are people. I wanted to raise my concern about standardisation across projects/libraries (e.g. READMEs, tools, GitHub workflow file naming conventions). For example, synchronising all the projects to have:
check.yaml if they relate to tests, static analysis, unit tests, or integration tests.
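As a sketch of what such a convention might look like (the job names and `make` targets below are hypothetical, not taken from any real repo), a shared `check.yaml` could be:

```yaml
# .github/workflows/check.yaml — illustrative only
name: check
on: [push, pull_request]
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: make lint   # hypothetical make target for static analysis
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: make test   # hypothetical make target for unit/integration tests
```

The appeal is that every repo's CI looks the same from the outside; the cost, as argued below, is that every repo must keep it that way.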
What are the benefits (and costs) of doing these things? An engineer did create these standards, so there is presumably some benefit worth considering. Generally, standardisation avoids "duplicate" thinking (dummy-proofing).
asdf: developers can easily drop in and out of projects (mercenary style) and run the required runtimes.
For Ruby, though, tools designed specifically for that runtime are already established.
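For context, asdf pins each project's runtimes in a `.tool-versions` file at the repo root, and `asdf install` then fetches those exact versions (the versions below are purely illustrative):

```
# .tool-versions — versions are illustrative
ruby 3.3.0
nodejs 20.11.1
```

This is the "drop in and out" benefit: any developer with asdf installed gets the right runtimes without reading project-specific setup docs.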
It can be detrimental and restrict the improvement and growth that each repo/project could make, compared to repos that don't carry this standardisation cost. Developers have to "argue" their case for breaking the standard, even when the standard was never well justified or discussed. IMHO there are much more important things to think about.
Once the decision is made to create a standard, there is still a cost to designing one that is suitable. For example, creating a consistent README structure means doing the work to design a structure that fits all libraries. Problem alert: that structure might be the lowest common denominator, or contain unnecessary sections.
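To make that trade-off concrete, a shared README skeleton might end up looking like this (the section names are purely illustrative): generic enough to fit every library, and therefore generic enough to say very little about any of them.

```markdown
# <library name>
One-line description.

## Installation
## Usage
## Running the checks
## Contributing
```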
When a standard is put in place, you'll find lots of things that don't conform to it. I've noticed my team lead, who has many years of development experience, spending time on this type of non-technical work. Beyond this one-off cost of synchronising the libraries when the standard is first enacted, I think there is a hidden but much greater cost: maintaining that synchronicity. Developers are restricted in choosing their tools or their approach in a project, which makes the process dummy-proof. Reed Hastings (Netflix co-founder) says "if you dummy-proof the process, you only get dummies to work there".

Importantly, when a new tool or technology comes out, we'd be stuck with the standard, and the onus is on the developer to argue the case for changing it. A huge amount of work falls on us to decide to upgrade the standard and make a new standards leap (upgrade to the new tool). This is unnecessary friction: not all developers voice their concerns like I do, and defaults are powerful.
I believe avoiding this type of cost is, in some ways, a Netflix principle. Read *No Rules Rules* by Reed Hastings :wink:
So I say:
Bad standards are easy to add but expensive to maintain. Good standards are hard to add but save money.
This is part of my engineering principle of keeping things simple. Synchronisation between projects is not simple.
For every instance of standardisation, there should be a clear argument (benefits vs. costs) and a discussion before it is made a standard. Alongside the standard, its justification should live as a document, where future developers can question and remove the justifications if they are no longer necessary. Make it harder for the dummy-proof process designers to create their processes, so that developers keep their autonomy and productivity.