The year without features
Foreword
This is going to be a rant about the current state of software infrastructure as a whole. It has long been brewing in my mind, and by writing it down I am trying to collect those thoughts in an orderly manner (possibly unsuccessfully).
So this is mostly an opinion piece and reflects my world view, but I do hope it has some worth to a reader or two.
Introduction
The modern world (or, well, the world as seen from Germany in the year 2024) depends on computers. Turn them off and civilization is gone. There is hardly a company (or other large organisation) left that could work without computers; at best, very few would manage to switch back to paper and survive it. The paper manufacturers would be among the casualties, which would cause a problem for the survivors.
This is not limited to “computer companies”, which, of course, would be useless, but extends to everything else. Computers provide the means of communication between humans, but also for transmitting orders, bills and information in general. Not being reachable via MS Teams(TM) would be attractive in itself for many people, but if the less hurtful means of communication failed too, there would be no way to organize people anymore, and, in the end, every company or institution is about organized human action. (I am slightly sorry about the MS Teams(TM) bashing here, but one has to ask what kind of bad trip inspired this way of implementing a chat.) Private human interaction would also be interrupted in catastrophic ways, even for people living close enough together to communicate face to face again. We rely on computers and especially on their ability to form networks. This is by now a necessary, irreplaceable infrastructure, in the very sense [Deb Chachra](link to “How infrastructure works”) writes about it: something everybody uses and/or depends on, but seldom actively notices.
Like every other infrastructure, this one, too, must be sustainable if it is to serve humans in the future. We must invest in it to keep it working, to keep it useful and to keep its bad effects at a tolerable level.
And at this point, I am not very optimistic. There are several semi-distinct developments which, to me, are problematic and will only get worse if not addressed appropriately. This post starts with legacy systems and the economic effects related to them.
Some complaints
Legacy systems and technical debt
A legacy system (and I use this rather vague term on purpose) is anything, software or hardware, which is present in the current landscape (and might have been for quite some time), but is problematic with regard to the future.
The problem in abstract terms
The proposition here is: when the differences between two interacting systems become big enough (initially, through developments in one or both of them, or through a changing context), the number of problems and the amount of work grows beyond the sum of both systems taken individually, while the positive effects of the interaction stay constant or decrease. While this might seem obvious in its generality, in practice, especially regarding aging systems, the topic is probably undervalued in its importance. At least that is my point here.
The introduction of new systems, mechanisms or interfaces not only creates direct costs (in the sense of more work, increased complexity or just money), but also, implicitly and unintentionally, increases the costs of already present systems. There is probably an economic term for this idea which I am not aware of. Since interaction between systems is the default (for a broad enough definition of interaction), and since that interaction often does not stop at the borders of a particular organisation, changes outside an organisation will inflict additional costs on it without any agreement or, sometimes, even its knowledge.
Certainly these are not new thoughts, but just my short summary of the problem.
Feature creep and the complexity explosion
The second concern I would like to address is the phenomenon of feature creep, where a particular system, piece of software or hardware gets expanded, piece by piece, to address more edge cases, new problems or even things which are completely out of scope with regard to the original purpose. These kinds of systems will, at different points in time and at different stages, develop some kind of internal inconsistency. At this point (at the latest), even a minimal and elegant original design devolves into a messy porridge of different ideas, where changing one line might have unforeseeable consequences in a completely different location.
Of course, this is nothing new to most, and I do not want to be judgemental about it (although it might sound that way). These things happen: often software is created to scratch a personal itch and is then adapted, because it works. While having perfect knowledge before starting something would be desirable, it is, for most projects, unrealistic, which may be the reason people ridicule the “waterfall model”.
The main problem here is that there is never a cleanup, a refactoring, to make the whole thing consistent once again. The lamentations of software engineers who would actually like to do that are endless; there is never time nor money for solid engineering (at least on most projects). A reason for that might be the lack of insight into these internals: seldom have I encountered a website advertising some specific software with the sentence “Someone started this 12 years ago in college, high on various substances, for a completely different purpose. We fear touching that code, so we incrementally taped other pieces to it in a completely different fashion. Some parts were made for specific customers, we haven’t looked at them in years, and we just pray nobody else does”.
In a certain sense, these are the results of “natural” processes and can hardly be avoided, but there is still the possibility of getting better at dealing with the fallout.
Hype technologies and fashion
Since this is a rant, I am making room here for some of my rather strong opinions on some “newish” developments which I abhor. Every few years a new hype with new buzzwords sweeps through the computer community: once it was big data (extracting actually useful statistics is hard), then blockchain/crypto (I still do not know of an actual good use case), then containerized $something (good for certain scenarios, not so good in most others) and now AI.
Common to all of these is a rather vague definition of what the thing actually is (or even a near-complete lack thereof). The most positive interpretation of this is the hope of finally solving some existing problems; a more cynical one is that, after some time, people learn how the charlatans and frauds look and act, so the latter need to change their appearance to keep performing their scams. The problem, as I see it, is that, at least in some cases, actual resources are being moved from useful things to the current hype and are either simply burned up or fund some grifter’s early retirement.
Apart from the current hypes, the current fashions are another annoyance, the example here being “modern” websites, where, for no apparent reason at all, somebody picked the JavaScript framework of the year for something which could have been small, fast, stable and useful. Certainly there are some use cases where an interactive, reactive website might be THE thing; I honestly won’t deny that. One could be more sceptical, though, about whether this should apply everywhere. For a concrete example, visit the HPE website and try to find a MIB file or documentation for a server. Especially for things like documentation, firmware, etc., a plain folder offered via autoindex was, and probably still is, the best solution, and would be far less annoying, faster and actually usable. But no, instead everything has to have animations and useless stock photos and is horribly slow. Good job there, web devs (or whoever decided that).
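To underline how low the bar is, here is a minimal sketch of such an autoindex-style file server, using nothing but Python’s standard library. The port and the directory path (`/srv/firmware`) are placeholders of my own choosing, not anything a real vendor runs:

```python
# Minimal "autoindex" file server from the Python standard library.
# SimpleHTTPRequestHandler serves files from the given directory and
# renders a plain HTML listing for directories -- no JavaScript anywhere.
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

# /srv/firmware is a placeholder path for the files to be offered.
handler = partial(SimpleHTTPRequestHandler, directory="/srv/firmware")
HTTPServer(("", 8000), handler).serve_forever()
```

The same thing is available as a one-liner (`python3 -m http.server`), and classic web servers offer it as a single configuration switch (nginx’s `autoindex on;`, for instance). None of it needs a framework.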
As far as I am concerned, I would like every website to have a button labelled something like “Technical users” or “Less annoying”, which takes me to a static HTML version without most of the JavaScript, where I can quickly find the useful stuff. The whole marketing shenanigans can stay on the other version. At this point I consider a website with a late-90s/early-2000s look to be a sign of competence.
A proposal
To end this incoherent rambling, I would like to finish with a vague idea of what should be done. The idea is this: the year without features. A coordinated, overall attempt to make things better; a whole year dedicated to preparing for new things (and not actually implementing them). During this time, resources are not allocated to building the next new thing, but to repairing, documenting or even removing the old thing. A season for cleanups, for finally replacing the scary old machine in the corner on which half of the business depends. For finally migrating to the current version of a framework, to a current operating system. To do the refactoring, to fix more bugs, to finally remove the functionality which has been deprecated for 5 years. To get the number of issues on $code_forge to zero (or at least below 100), to homogenize the setup, to remove that one dependency which hangs over one’s head like the executioner’s axe.
I am pretty sure a lot of software engineers, developers, administrators and other people would appreciate this, but the pressure of economic concerns seems to prohibit such things. At least, that is my explanation for why it has not happened and why things are currently as broken as they are. The “normal” state of affairs is not to do this, so maybe it needs to be something exceptional, similar to a debt jubilee: a break from the continuation of the current way of doing things.
Of course, the idea is not wholly consistent: replacing old parts of your system with something new is, well, introducing something new. But if it makes the system less complex, the ideas less divergent, the whole more homogeneous, I would not really count it as a “feature”. The terms here are surely not quite clear in the details and are open to interpretation, but I hope the general idea came across.
At least in Germany, the time around Christmas is often called a time of reflection: a calm time which allows one to look back on the past year and think about the things that happened. While this is often not the case (for many people it is, in fact, more stressful), the general idea of taking some time to reflect on past deeds, mistakes and decisions, and then deriving from that a direction for the future, seems desirable to me. I think this is something the computing community needs, and we should make it happen.
Thank you for reading
In the end, I have to thank you, dear reader, for staying with me to this point; maybe the words above entertained you or even sparked some thoughts. In any case, I am open to feedback and criticism and would like to hear/read where I am wrong or right, or even about better, more concrete and/or more realistic plans. Have a nice day!