
Doing Things Right the First Time... in Software Engineering

Over the last few years, I have increasingly come across the phrase doing it right the first time (DIRFT) in various areas of software engineering (infrastructure, cloud, and microservice design as well).

However, when it is shoehorned into software engineering, it damages both efficiency and value delivery.

In the rest of this post, I'll walk through the dangers this approach introduces.

What is DIRFT?

Coined by renowned businessman and quality expert Philip B. Crosby, DIRFT stems from total quality management and focuses on reducing costs and optimizing people and processes to guarantee customer satisfaction, with zero defects in products and processes. It has solid applicability in many different industries and disciplines.

Why do we see it lurking in software engineering?

The motivation to apply DIRFT probably originates from recurring failures, high costs, and failed improvement and transformation initiatives in the past.

After repeated failures in an organization that does not learn, people look for the next recipe and expect this one to deliver more satisfactory, less problematic, and cheaper results.

Failure to learn, failure to apply scientific methods, and failure to actually execute continuous improvement remain, as usual, the elephant herd in the room.

What are the problems with this mindset?

More stressful environment

This might be the most severe impact. When people are negatively affected, other problems automatically follow.

Even though DIRFT itself does not promote day-1 perfectionism, works closely with JIT, and allows a buffer for improvements, people often perceive it as a goal of perfect or near-perfect results that will last and hit efficiency targets from the very beginning. One might say that this misunderstanding is a communication issue. Regardless of the reason, the impact on people can be problematic.

As my friend Altug Bilgin Altintas once said, such an understanding creates pressure on people, which in turn creates counter-pressure, all leading towards a vicious cycle.

Ironically, this is a good way to drive people to make mistakes they would not normally make. People who are indoctrinated to think that anything short of hitting the bullseye is a failure cannot reach their potential, since this trauma and fear paralyzes them.

Pressure and counter-pressure are a catalyst for toxic environments. They create unrealistic expectations of people. Feeling under a constant spotlight is never healthy. Individualism and defensiveness increase out of insecurity and perceived threats.

Of course, we don't recklessly ride off into the sunset. Every decision we make is a risk decision. Artificially treating all of them as high-probability, high-impact risks leads to a rigid environment.

Lack of trust

Trust, and subsequently empowerment, disappears. Since teams are unable to deliver, they never gain the trust of stakeholders. They are seen as the people who let down the rest of the organization, which creates more pressure.

When teams fail to deliver, empowerment is replaced with micromanagement. Ultimately we end up in an environment where people avoid making decisions or taking initiative, and a crap tornado roams among the management layers.

Becoming process-driven rather than value-driven

DIRFT leans very heavily towards processes instead of value. This has an adverse effect on software engineering output, because the organization tends to become more process-driven than value-driven.

To avoid misunderstanding here: being value-driven does not exclude processes. It does, however, subject the processes themselves to continuous improvement, in favor of value.

When the value is a moving target rather than something deterministic or concrete (like an OLED display or an airbag), process-based approaches don't help.

"The map is not the territory" fits this problem perfectly, and fundamental lean/agile principles embrace this reality.

Moving away from empiricism to go back to determinism

DIRFT intrinsically denies the fact that software engineering is an empirical discipline. That is, making the software right is a continuous process of improving the existing value, fed by the new information we learn from frequent exposure to feedback collection points.

This is not a fault of DIRFT itself. It works in many manufacturing and process-oriented domains. It is just that the software engineering domain is not, by nature, compatible with this approach.

Without experimentation and feedback loops, learning does not happen. When learning stops, we fail the empirical nature of software engineering, because our decisions detach from reality and rely more and more on assumptions and wishful thinking. Continuous improvement activities also rely on learning.

Sticking with deterministic methods is ultimately a form of denial.

Premature decisions that assume we know the future

DIRFT can result in making too many premature, upfront decisions because of its reliance on determinism. Why do we resort to that? Since an evolutionary approach is now considered a fault, most decisions are expected to be made correctly in the early phases, almost going back to the waterfall model, where people play the role of Nostradamus.

Best practices are sought for that purpose. However, a best practice does not always have to be adopted right away. Some practices only become applicable when the surrounding conditions make them necessary. Lean principles recommend deferring decisions until they are needed, and best practices can be subject to that. As with many things, there is no silver bullet here; it has to be checked case by case.

Especially when people take it as perfectionism, it feeds the idea that we can and should know the future, and it encourages people to act as if they were omniscient.

Growing silos and ivory towers

As iterative approaches and experimentation become difficult and are seen as somewhat wasteful, decisions tend to come from silos and from experts in ivory towers.

This is also contrary to lean management, lean software development, and agile principles, where decisions are to be made by the empowered local teams who actually deliver the value.

Due to the stressful environment and individualism I mentioned, silos also become fortresses that people try to keep alive. They encourage a "somebody else's job" attitude, rather than teamwork, to avoid backlash.

Severely damaged learning

Learning (and hence continuous improvement) activities are structurally broken as we lure ourselves into determinism. Since frequent experimentation and feedback collection are lacking, diverging from what the customer finds satisfactory becomes a stronger threat.

As work drifts towards waterfall-ish activities, problems are discovered at very late stages, when fixing them is costlier.

Higher lead times, lower efficiency

The intention of DIRFT includes lower lead times. After all, we want to be more efficient, right? However, the empirical nature of software plays its counter-game: no matter how much the team tries to anticipate, drastic last-minute changes will always be needed. This results in larger deliveries as well. Ultimately, lead times increase. Pressure keeps mounting as results are missing and/or far from satisfactory.

Lack of simplicity, abundance of technical debt

DIRFT contradicts core principles of simplicity, as it prioritizes clairvoyance-driven, "just-in-case" upfront complex designs and decisions over simplicity.

Living solutions get more complex over time anyway. Starting with huge complexity carries a great risk of accumulating technical debt early (because we are also adding stuff that we do not really know we will need).

In contrast to manufacturing, we already have the ability to evolve the design and implementation as needed. We actually have a much better chance of protecting simplicity, as the sketch below illustrates.
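
To make the "just-in-case" point concrete, here is a small, hypothetical Python sketch; all names are invented for illustration. The first version adds strategy and registry plumbing for requirements that do not exist yet, while the second solves today's actual need and can still be evolved into the richer design later, if the need ever materializes.

```python
# Hypothetical illustration: a "just-in-case" design vs. a simple one.
# All names here are made up for the sake of the example.
from abc import ABC, abstractmethod


# Speculative, upfront-complex version: abstractions introduced before any
# second implementation or concrete requirement actually exists.
class DiscountStrategy(ABC):
    @abstractmethod
    def apply(self, price: float) -> float: ...


class PercentageDiscount(DiscountStrategy):
    def __init__(self, rate: float):
        self.rate = rate

    def apply(self, price: float) -> float:
        return price * (1 - self.rate)


class DiscountEngine:
    """Registry and strategy plumbing for what is, today, a single use case."""

    def __init__(self):
        self._strategies: dict[str, DiscountStrategy] = {}

    def register(self, name: str, strategy: DiscountStrategy) -> None:
        self._strategies[name] = strategy

    def price_with(self, name: str, price: float) -> float:
        return self._strategies[name].apply(price)


# Simple version that solves today's requirement; it can still be evolved
# into the design above later, if and when the need emerges.
def discounted_price(price: float, rate: float = 0.1) -> float:
    return price * (1 - rate)
```

The point is not that the richer design is wrong, only that committing to it before the need is demonstrated trades simplicity for speculative complexity we may never use.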

Why do organizations fall for it?

When organizations seek solutions to agility problems by renaming business lines to tribes, teams to squads, and operations/IT teams to DevOps teams, and by throwing some technologies at the problem, it rarely works. When such seemingly agile changes fail to deliver results, the focus moves to people and processes.

At first it might make sense to apply DIRFT. After all, it works well in many other places. Why shouldn't it work the same way in software engineering? Didn't we inherit lean principles and apply them to software engineering?

For those who are not from software engineering fields, it might be difficult to see why it will do damage. DIRFT looks like a nice shortcut to leap forward, although it is fundamentally inappropriate in its original form.

Often it ends up going back to a waterfall cycle, where people spend a huge amount of time in design and analysis (and face analysis paralysis), only to be rejected by reality at the very late stages.

When continuous improvement (provided that the data backs the improvements) isn't part of the DNA and everyday activities, iterative and incremental work may seem very costly, and failures keep recurring.

Conclusion

In the pursuit of sustainable success, some organizations try to import DIRFT into software-based environments, instead of being value-driven and solving efficiency issues with scientific approaches.

Some of the downsides and risks mentioned here sound much like what software engineering already suffers from under the waterfall model. (Yes, waterfall is still happening, disguised within lean/agile practices.) The resemblance is uncanny.

The solution doesn't lie in fixing DIRFT for software engineering. We already have scientific, value-oriented approaches and principles that can support us well in the empirical world of software engineering. It is a matter of taking action in accordance with data and those principles.