Enforcing Quality through Tests and DevOps Practices #

Let’s talk about tests. #

There are different types of tests covering the whole software development life cycle (SDLC).

Unit Tests are the smallest and fastest of the tests, validating code inside methods and functions. They cover most of the codebase because they can validate the business logic in a granular way, including the negative cases that produce errors. More advanced implementations of unit tests use generative (property-based) and mutation techniques to check all code paths while reducing boilerplate. The unit under test should be isolated from other units’ business logic. For example, when ClassA.method() calls ClassB.method() and there is a bug inside ClassB.method(), ClassA’s tests should still pass while ClassB’s tests fail. Mocks and stubs provide this isolation by artificially setting the external objects’ behavior while keeping the logic under test intact.
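
A minimal sketch of that isolation, assuming JUnit 5 and Mockito on the classpath; ClassA and ClassB are hypothetical stand-ins, with ClassA simply doubling the value it gets from ClassB:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

// Hypothetical collaborators, defined here only to keep the example self-contained.
class ClassB {
    int method() { return 21; }
}

class ClassA {
    private final ClassB classB;
    ClassA(ClassB classB) { this.classB = classB; }
    int method() { return classB.method() * 2; }
}

class ClassATest {

    @Test
    void methodDoublesTheValueProvidedByClassB() {
        // Stub ClassB so a defect inside its real implementation cannot fail this test.
        ClassB classB = mock(ClassB.class);
        when(classB.method()).thenReturn(21);

        ClassA classA = new ClassA(classB);

        // Only ClassA's own logic (doubling the collaborator's value) is under test.
        assertEquals(42, classA.method());
    }
}
```

If ClassB’s real method() regresses, only ClassB’s own tests turn red; ClassA’s behavior, exercised against the stubbed collaborator, remains verified.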

Integration Tests are broader than Unit Tests. Instead of validating the specifics of methods and business logic, they are designed to validate the configuration that couples components together, units of code integrating with other units, and that glue such as dependency injection or network configuration works as expected. When tests require network access, tools such as in-memory databases and containers improve test predictability.
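
As a sketch of the in-memory database approach, the following integration test assumes the H2 JDBC driver is on the classpath and exercises real SQL and driver wiring instead of mocks; the customer table is a hypothetical example:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.jupiter.api.Test;

class CustomerRepositoryIT {

    @Test
    void schemaAndQueriesWorkTogether() throws Exception {
        // In-memory database keeps the test fast and predictable:
        // no external server, and fresh state on every run.
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:testdb");
             Statement stmt = conn.createStatement()) {

            stmt.execute("CREATE TABLE customer (id INT PRIMARY KEY, name VARCHAR(100))");
            stmt.execute("INSERT INTO customer VALUES (1, 'Ada')");

            try (ResultSet rs = stmt.executeQuery("SELECT name FROM customer WHERE id = 1")) {
                // The test exercises real SQL and JDBC behavior, not mocked responses.
                assertTrue(rs.next());
                assertEquals("Ada", rs.getString("name"));
            }
        }
    }
}
```

The same structure works with container-based databases when the production engine’s behavior must be matched exactly.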

Functional Tests, also called End-to-End Tests, validate the application’s vertical layers, from the contract interface, through the business logic processing, all the way to data persistence in most cases. When implemented correctly, these tests represent the business requirements specified by the product team. BDD (Behavior-Driven Development) is the practice that attempts to standardize functional test implementations through a ubiquitous language and user-story templates, such as “As a [role], I want to [action] so that [benefit]” and Given/When/Then.
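
A minimal sketch of the Given/When/Then structure expressed as a JUnit 5 test; ShoppingCart and the checkout story are hypothetical, and teams often use dedicated BDD tools such as Cucumber to keep the user-story wording itself executable:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;

// Hypothetical domain class used only to keep the example self-contained.
class ShoppingCart {
    private int total;
    void addItem(int price) { total += price; }
    int total() { return total; }
}

class CheckoutFunctionalTest {

    @Test
    @DisplayName("As a shopper, I want to add items to my cart so that I see the running total")
    void addingItemsUpdatesTheTotal() {
        // Given an empty cart
        ShoppingCart cart = new ShoppingCart();

        // When the shopper adds two items
        cart.addItem(30);
        cart.addItem(12);

        // Then the total reflects both items
        assertEquals(42, cart.total());
    }
}
```

The test name and display name carry the business intent, so a failing test points directly at the broken requirement.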

The best practices for enterprise applications favor concise, decoupled services that are reachable through APIs, allowing teams to change implementations without compromising the whole system. For this purpose, Contract Tests are a type of Functional Test that checks the API schema’s specifics, validating the contract for REST or RPC implementations and preventing mismatches that can break communication with the service’s clients.
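
A crude sketch of the idea, assuming a service running locally at a hypothetical URL and using only the JDK’s HttpClient; dedicated tools such as Pact or OpenAPI schema validators check the full contract, including field types and required properties:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.junit.jupiter.api.Test;

class CustomerApiContractTest {

    // Hypothetical endpoint; in practice this points at a locally started instance of the service.
    private static final String CUSTOMER_URL = "http://localhost:8080/customers/1";

    @Test
    void responseHonorsTheAgreedContract() throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(CUSTOMER_URL)).GET().build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // Status code and media type are part of the contract.
        assertEquals(200, response.statusCode());
        assertTrue(response.headers().firstValue("Content-Type").orElse("").startsWith("application/json"));

        // Crude field-presence check; schema-aware tooling validates structure and types.
        assertTrue(response.body().contains("\"id\""));
        assertTrue(response.body().contains("\"name\""));
    }
}
```

Running such a check in the pipeline catches a renamed or removed field before any client discovers it in production.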

Traditionally, the QA team was responsible for writing Functional Tests. However, in a more effective agile approach, the engineers who implement the business logic are accountable for delivering production-ready code that includes BDD-style tests proving that recently created features conform to specifications. Preventing side effects that break existing functionality is an essential feature of these tests; that’s why some prefer to call them Regression Tests, since they verify the existing business logic’s integrity.

All these tests are useless if they do not evolve with the application. For that to happen, engineers need to think about tests as the foundation for delivering quality software. TDD and BDD techniques prescribe that tests should be the first task when writing the business logic. Tests help to guide the implementation without overengineering. The only exceptions to writing tests up-front are exploratory experiments that guide future implementations.

When a bug is detected, the sequence of actions to provide a fix should start with a failing test that reproduces the bug, followed by the implementation fix; the previously failing test now passes, proving that the correction works. Other professions require a similar process, especially those that demand the empirical method. Would you take a medication that has not passed a level of safety approvals? Or employ an accountant who doesn’t care about balanced worksheets?
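
A sketch of that workflow, with a hypothetical InvoiceSplitter whose old implementation crashed on non-terminating divisions; the test below would be written first to reproduce the defect, fail against the old code, and then guard against its return:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.math.BigDecimal;
import java.math.RoundingMode;

import org.junit.jupiter.api.Test;

// Hypothetical class containing the corrected logic. Before the fix, the division
// used the default scale and threw ArithmeticException for amounts like 100.00 / 3.
class InvoiceSplitter {
    BigDecimal splitEvenly(BigDecimal total, int people) {
        return total.divide(BigDecimal.valueOf(people), 2, RoundingMode.HALF_UP);
    }
}

class InvoiceSplitterRegressionTest {

    // Reproduces the reported defect and documents the expected behavior going forward.
    @Test
    void splittingANonTerminatingDivisionDoesNotCrash() {
        InvoiceSplitter splitter = new InvoiceSplitter();

        assertEquals(new BigDecimal("33.33"),
                splitter.splitEvenly(new BigDecimal("100.00"), 3));
    }
}
```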

Software that does not meet its SLAs (Service Level Agreements) provides a poor user experience or fails intermittently. Therefore, performance tests must be part of the delivery pipeline. The fundamental SLA benchmarks need continuous validation against new and existing features before production deployments.
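
A minimal JUnit 5 sketch of an SLA smoke check, assuming a hypothetical 200 ms budget; full load tests with tools such as JMeter, Gatling, or k6 add concurrency and sustained traffic on top of this kind of assertion:

```java
import static org.junit.jupiter.api.Assertions.assertTimeout;

import java.time.Duration;

import org.junit.jupiter.api.Test;

class SearchSlaTest {

    // Hypothetical SLA: a search must answer within 200 ms.
    private static final Duration SLA = Duration.ofMillis(200);

    @Test
    void searchStaysWithinTheAgreedSla() {
        assertTimeout(SLA, () -> {
            // Placeholder for the call under measurement, e.g. a searchService lookup.
            // Thread.sleep stands in for the real work in this sketch.
            Thread.sleep(50);
        });
    }
}
```

Keeping even a coarse check like this in the pipeline turns an SLA regression into a failed build instead of a production incident.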

Culture #

The current software engineering practice recommends integrating operational functions, such as deployments, infrastructure maintenance, and monitoring, closer to the development lifecycle. As technology complexity increases, we need more abstractions to make sense of the whole software ecosystem. This cultural shift requires that engineers be proficient in many parts of the SDLC, taking responsibility for implementations from inception to production.

Advances in virtualization allow scaling in ways that were not possible a decade ago. To deal with the new complexity in a predictable manner, practices such as Infrastructure as Code and Immutable Infrastructure emerged to improve the environments’ predictability and reproducibility.

Infrastructure as Code uses templates and recipes, living alongside the source code, to recreate environments. Immutable Infrastructure is the term used when there are no changes to VM or container state after initialization. In other words, the prescribed image does not get updated by outside agents. It is self-contained and predictable, and it enables the reproduction of defects in different environments, even on local workstations.

CI stands for Continuous Integration: the pipeline continuously reacts to updates in the main repository. Once code is pushed from a local branch to the central repo, the CI pipeline runs tests and static code analysis, such as code inspections and test-coverage checks. CI is the protective shield that prevents suboptimal code from reaching the main source code repository.

CD stands for Continuous Delivery or Continuous Deployment. It is crucial to differentiate the two: delivery means there is a manual approval step before pushing code to production; deployment means code moves to production automatically upon a merge into the main repository.

A successful CD process signifies that the team has reached a high level of maturity. It requires comprehensive automated tests and enforcement of coding best practices. Small, testable stories delivered incrementally with high test coverage are CD’s best friends; otherwise, features become too bulky, increasing the risk of conflicts and the probability of errors.

Modern development pipelines combine CI/CD to enable fast value delivery, while Infrastructure as Code and Immutable Infrastructure create and reproduce environments as needed. Additionally, deployment strategies such as rolling, blue-green, and canary releases allow new features to be rolled out to end users seamlessly.

Focus on Value #

Kanban, XP, Scrum, and other agile methodologies share the same values described in the Agile Manifesto. Furthermore, “Responding to change” is the critical point that ties DevOps, tests, and agile together. We live in an increasingly changing world. To survive, organizations “pivot” strategies to deliver value to their customers, impacting existing software systems. To support these changes without compromising quality, the DevOps culture, combined with rigorous functional, integration, unit, and performance tests, enhances confidence while unlocking competitive advantages for organizations.

Building software is a complex endeavor; DevOps and testing practices help create a culture of quality that can sustain business demand with stability. On the other hand, neglecting them will be expensive in the long run, much like building a house on unstable soil.