Complexity
2024-07-13
I am a strong advocate for pragmatism and simplicity. I struggle with things that are unnecessarily complicated. Complexity, I would argue, is the greatest enemy of engineering.
Let’s take a brief look at the automotive world. You don’t need to be an expert on cars (I myself am far from one :)) to grasp the essence of the following paragraphs.
There was a time when Audi raced diesel engines (the Audi R10 TDI, specifically). In 2006, the R10 became the first diesel-powered car to win the 24 Hours of Le Mans. Such a victory naturally brought a wave of confidence to the company. Inspired by this diesel success, Audi asked itself a rhetorical question: if a diesel engine can win races, why not apply the same principle to everyday cars? The idea seems logical, but the engine itself is worth a closer look.
First of all, the R10 TDI was an extremely complex engine, especially for its time: a DPF (yes, even in racing, due to regulations), 12 cylinders, twin turbochargers, and so on. There were even more intricate details, but the key point is that race cars are optimized for the track, so the typical problems of everyday cars become irrelevant. Audi had a team of engineers who deeply understood the engine and could fix it relatively easily when needed. Operational costs also don't matter much when the ultimate goal is to win Le Mans. So when a vehicle is designed not for everyday driving but for finishing a race, turning it into a daily driver, especially a V12 diesel, is a huge challenge.
But management decided to move forward with the idea and assigned the task to Audi’s brilliant engineers.
What came out of this? The Audi V12 TDI engine, offered in the Audi Q7 SUV. Placing a racing engine in an SUV sounds paradoxical (shouldn't sports engines belong in sports cars?), but only such a platform had enough space for so complex and massive an engineering solution. On paper, the technical specifications looked flawless, yet the engine failed to attract buyers and became one of the worst-selling engines in Audi's history.
Diesel cars were valued for their practicality and efficiency. People didn’t buy them for sportiness but for economical travel from point A to point B. Moreover, no one wanted to buy a V12 diesel car that only a handful of mechanics understood. The engine was simply too complex for its intended purpose.
The moral of the story is this—complex solutions designed for simple tasks are doomed to fail.
I believe we face similar issues in the world of IT engineering, particularly in web technologies.
It all started fairly simply. A product needed a database (for a long time, the de facto choice was an RDBMS) and a server-side application to handle client requests (e.g., written in PHP).
However, as the number of internet users and their needs grew, new challenges emerged. Traditional databases struggled to handle large query loads, a single server was no longer enough to serve all clients (e.g., OS file descriptor limits were reached), and user interfaces were cumbersome, requiring a server response after nearly every action.
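To make the file descriptor problem concrete, here is a small sketch (on POSIX systems only) of how a process can inspect the ceiling that a single busy server eventually hits: every open socket consumes one descriptor, and once the limit is reached, new connections are simply refused.

```python
# Query the per-process open-file limit (sockets count against it).
# POSIX-only: the resource module is unavailable on Windows.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")
# A server accepting more concurrent connections than the soft limit
# will start failing with EMFILE ("too many open files").
```

The soft limit can be raised up to the hard limit (e.g., via `ulimit -n` or `resource.setrlimit`), but there is always a finite ceiling per machine, which is one reason people reached for multiple servers behind a load balancer.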
Naturally, this led to the emergence of various technological solutions. NoSQL databases promised infinite scalability, SPAs enabled interactive websites with minimal server involvement, and distributed architectures (e.g., microservices) became prevalent.
But it’s crucial to understand that these solutions didn’t appear out of nowhere. They arose as complex responses to complex problems—problems that typically plagued giants like Facebook, Google, etc., not your average internet blog.
This led to what I call complexity psychosis. Developers started blindly adopting the technologies used by tech giants. The common justification was (and still is): we need technology X because the old solution Y won't handle a million users; if Google, Facebook, and others use it, it must be a good technology; it will make our system more reliable.
This mentality diminished the engineering aspect of software development. Developers started searching for a holy grail—a technology that could do absolutely everything, regardless of limitations or context.
However, engineering is about finding the most optimal solution given specific constraints (e.g., time, budget, legal regulations, etc.). When we remove the engineering aspect from software development, numerous problems arise. Simple tasks become unnecessarily complex due to the technologies chosen.
Previously, we had a simple RDBMS with a well-researched, documented, and easily understandable concurrency model. But that wasn’t enough for web scale—we needed a NoSQL database for horizontal scaling (what if we get a million users?). Yet, all the complexity that comes with NoSQL was often overlooked; transactions (if the NoSQL database even supported distributed transactions) typically operated at a read uncommitted isolation level, maintaining a sharded cluster required sleepless nights, and ensuring data integrity demanded additional effort.
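To illustrate what that well-understood RDBMS concurrency model buys you, here is a minimal sketch of an atomic transfer using SQLite (whose transactions are serializable by default). The table and amounts are hypothetical; the point is that both updates commit together or not at all, a guarantee that takes real effort to reproduce on a sharded NoSQL cluster.

```python
# A minimal sketch of an atomic transfer in a classic RDBMS,
# using Python's built-in sqlite3 module and an in-memory database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])
conn.commit()

try:
    # The connection as a context manager opens a transaction:
    # it commits on success and rolls back on any exception.
    with conn:
        conn.execute("UPDATE accounts SET balance = balance - 30 "
                     "WHERE name = 'alice'")
        conn.execute("UPDATE accounts SET balance = balance + 30 "
                     "WHERE name = 'bob'")
except sqlite3.Error:
    pass  # on failure, neither UPDATE is visible

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # both rows updated together, or neither
```

With eventual consistency or weak isolation, a reader could observe the debit without the credit; here that intermediate state is never visible outside the transaction.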
This doesn’t mean that technologies like NoSQL are fundamentally bad. But in certain contexts, their benefits are questionable and may create more issues than they solve. These systems, with their many intricacies, require deep analysis and ongoing maintenance to ensure reliability. An engineer should always ask—do we truly need this solution, and is the added layer of complexity justified and manageable?
It’s always wise to start with simple technologies and introduce more complex mechanisms only when necessary. Managing complexity is hard, but it is one of our most important tasks, and one we must approach with a constantly critical mindset. When too much complexity is introduced, the system becomes difficult to understand and unpredictable, and overall productivity drops.
Simplicity wins almost every time.