Testing Methodology

Intro

This section outlines the testing strategy adopted in the SCDOM platform to ensure robustness, quality, and maintainability across all modules.

Given the platform's modular, event-driven, and contract-based architecture, testing is organized into multiple levels — from fast unit tests validating core business logic to end-to-end and DOT-based tests validating plan generation based on real catalogue data.

Each test layer focuses on a specific concern, enabling independent development and deployment of components, while maintaining confidence in system behavior.

Here you will find a description of all testing levels, recommended practices for custom deployments, and a matrix showing which types of tests apply to which components.

In SCDOM development, code quality is ensured by a set of tests that run automatically. These tests include:

  • Testing business logic of each domain (unit tests)
  • Testing database, broker, API, and other integrations for a single deployment unit (integration tests)
  • Testing end-to-end scenarios (e2e tests)
  • For product catalogue development: a separate level of tests based on the proprietary DOT-based test framework

Levels of tests

Unit tests of each domain

Unit tests are used to test the business logic of each domain. They are run automatically on each commit and pull request. Unit tests are written in JUnit5, Mockito, and AssertJ.

Hexagonal architecture is used to separate the business logic from the infrastructure, which allows us to test the business logic without having to worry about the infrastructure. The tests also follow BDD principles: each unit test covers the whole flow through the application and domain layers, exercising a complete behavior in a single test. This greatly reduces the impact of refactoring on the tests, because only behavior is tested, not the implementation.
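
Below is a minimal sketch of such a behavior-level unit test using JUnit5, Mockito, and AssertJ. The domain types and names are illustrative stand-ins defined inline for the sketch, not the actual SCDOM classes; only the mocking of the outgoing port and the behavior-only assertion reflect the approach described above.

    import static org.assertj.core.api.Assertions.assertThat;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import java.util.List;
    import org.junit.jupiter.api.Test;

    class DecomposeOrderServiceTest {

        // Illustrative stand-ins, defined inline only to keep the sketch self-contained.
        interface ProductCataloguePort { List<String> stepsFor(String productCode); } // outgoing port
        record Order(String id, String productCode) {}
        record FulfillmentPlan(List<String> stepNames) {}

        // Illustrative application service exercised through its incoming port.
        static class DecomposeOrderService {
            private final ProductCataloguePort catalogue;
            DecomposeOrderService(ProductCataloguePort catalogue) { this.catalogue = catalogue; }
            FulfillmentPlan decompose(Order order) {
                return new FulfillmentPlan(catalogue.stepsFor(order.productCode()));
            }
        }

        // The infrastructure boundary (outgoing port) is mocked, so no database or broker is needed.
        private final ProductCataloguePort catalogue = mock(ProductCataloguePort.class);
        private final DecomposeOrderService service = new DecomposeOrderService(catalogue);

        @Test
        void decomposesOrderIntoFulfillmentPlanSteps() {
            when(catalogue.stepsFor("INTERNET_ACCESS")).thenReturn(List.of("RESERVE_RESOURCES", "ACTIVATE_SERVICE"));

            // The whole application and domain flow runs in one test; only behavior is asserted.
            FulfillmentPlan plan = service.decompose(new Order("order-1", "INTERNET_ACCESS"));

            assertThat(plan.stepNames()).containsExactly("RESERVE_RESOURCES", "ACTIVATE_SERVICE");
        }
    }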

These tests, thanks to the true unit-level approach, are ultra-fast (under 1 s) and are the bread and butter of daily development. They are also run in the CI/CD pipeline. The tests are written to be easy to read and understand, so new tests can be added and existing ones refactored with little effort.

SCDOM core domains are well covered by these business unit tests, with coverage of about 80%-90% across all domains.

All tests are run in the CI/CD pipeline after each merge request is merged to the main branch. Additionally, Sonar is used to check code quality and coverage.

Here is a sample Sonar report for the Fulfillment Service (one of the core domains): Sonar Tests Report

Integration tests of each domain

Integration tests are used to test the integration of the domain with the database, message broker, and other external systems. They are run automatically on each commit and pull request. Integration tests are written in JUnit5, Spring Test Framework, TestContainers, and AssertJ.

These tests are not as fast as unit tests, but they are still fast enough to run in the CI/CD pipeline. They are written to be easy to read and understand, so new tests can be added and existing ones refactored with little effort. Such tests may also cover technology-specific concerns (see the sketch after this list), such as:

  • Caches
  • Retry mechanisms
  • Queue topology creation
  • Spring context creation
  • etc.
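
As an illustration, here is a minimal sketch of an integration test that runs a domain's persistence against a real PostgreSQL instance started by TestContainers and wired into the Spring context. The repository and entity names are illustrative, and the sketch assumes the module's Spring Boot application and JPA setup already exist.

    import static org.assertj.core.api.Assertions.assertThat;

    import org.junit.jupiter.api.Test;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.test.context.DynamicPropertyRegistry;
    import org.springframework.test.context.DynamicPropertySource;
    import org.testcontainers.containers.PostgreSQLContainer;
    import org.testcontainers.junit.jupiter.Container;
    import org.testcontainers.junit.jupiter.Testcontainers;

    @SpringBootTest
    @Testcontainers
    class FulfillmentPlanRepositoryIT {

        // A throwaway PostgreSQL instance is started by TestContainers for this test class.
        @Container
        static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:16-alpine");

        // The Spring context is pointed at the container instead of a fixed test database.
        @DynamicPropertySource
        static void datasourceProperties(DynamicPropertyRegistry registry) {
            registry.add("spring.datasource.url", postgres::getJdbcUrl);
            registry.add("spring.datasource.username", postgres::getUsername);
            registry.add("spring.datasource.password", postgres::getPassword);
        }

        @Autowired
        FulfillmentPlanRepository repository; // illustrative Spring Data repository

        @Test
        void persistsAndReloadsAFulfillmentPlan() {
            FulfillmentPlanEntity saved = repository.save(new FulfillmentPlanEntity("order-1"));

            assertThat(repository.findById(saved.getId())).isPresent();
        }
    }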

Those tests are also run in the CI/CD pipeline, and the coverage results are stored in the Sonar instance.

E2E tests of the whole system

E2E tests are used to test the whole system. They are run periodically against the working DEV environment. E2E tests are written in JUnit5, Serenity, Cucumber, and AssertJ, and they run in a separate CI/CD pipeline.

The main purpose of these tests is to check that an order is processed end-to-end (a sketch of the step definitions behind such a scenario follows the list). This includes:

  • Accepting an order
  • Decomposing the order into a fulfillment execution plan
  • Executing the fulfillment execution plan based on mock Step Executors
  • Checking if the Product/Service Order was updated properly
  • Checking mock interactions
  • Validating that expected audit logs are created
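
Here is a minimal sketch of Cucumber step definitions behind such a scenario, written in Java with AssertJ assertions. The OrderApiClient and MockStepExecutors helpers are illustrative assumptions, not the actual SCDOM test harness.

    import static org.assertj.core.api.Assertions.assertThat;

    import io.cucumber.java.en.Given;
    import io.cucumber.java.en.Then;
    import io.cucumber.java.en.When;

    public class OrderProcessingSteps {

        // Illustrative helpers wrapping the DEV environment's order API and the mocked Step Executors.
        private final OrderApiClient orderApi = new OrderApiClient();
        private final MockStepExecutors stepExecutors = new MockStepExecutors();

        private String orderId;

        @Given("a product order for {string} is accepted")
        public void orderIsAccepted(String productCode) {
            orderId = orderApi.submitOrder(productCode);
        }

        @When("the fulfillment execution plan is executed by the mock Step Executors")
        public void planIsExecuted() {
            stepExecutors.completeAllStepsFor(orderId);
        }

        @Then("the Product Order reaches the state {string}")
        public void orderReachesState(String expectedState) {
            assertThat(orderApi.awaitFinalState(orderId)).isEqualTo(expectedState);
        }
    }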

All tests run automatically at least once a day. All test reports are stored in a separate Git project. This is a sample report: System Tests Report 1

A sample list of scenarios run: System Tests Report 2

DOT tests of product catalogue

DOT-based tests are a special test level that allows us to speed up the process of modeling the product catalogue. They are built on the same stack as the e2e tests, but their purpose is different: to check that the product catalogue is processed end-to-end. This includes:

  • Importing and deploying the product catalogue
  • Sending an order to the system
  • Checking the decomposed fulfillment execution plan

The most important aspect is the speed of these tests. They work against the real system but are not as slow as e2e tests, because only the first steps of order processing are performed and the fulfillment itself does not need to execute. To assert the fulfillment execution plan, we needed a convenient way of writing down a graph with node and edge details, which is why the SCDOM DOT framework was created. It is based on Graphviz, a powerful tool for describing and visualizing graphs, and it allows us to write the expected graph in a simple notation and assert the actual result against it.
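
To illustrate the idea, here is how an expected plan might be written down: a directed graph whose nodes are plan steps and whose edges define the execution order, embedded as a Java text block in Graphviz DOT notation. The step names and attributes are illustrative, not the actual SCDOM DOT framework conventions.

    class InternetAccessExpectedPlan {

        // Expected fulfillment execution plan for an illustrative "INTERNET_ACCESS" order:
        // nodes are plan steps, edges are the execution order, attributes carry node details.
        static final String EXPECTED_PLAN = """
                digraph plan {
                  RESERVE_RESOURCES  [executor="inventory"];
                  CONFIGURE_NETWORK  [executor="activation"];
                  NOTIFY_CUSTOMER    [executor="notification"];

                  RESERVE_RESOURCES -> CONFIGURE_NETWORK;
                  CONFIGURE_NETWORK -> NOTIFY_CUSTOMER;
                }
                """;
    }

A DOT test then submits an order and asserts the decomposed plan against such a graph, as described above.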

This is a sample of a failed test using the Serenity framework: Actual graph

Guidance on how to prepare and run execution plan building tests (DOT tests) is available in the documentation.

How to prepare tests for a new deployment

A new deployment may contain various types of deployment units, and the purpose and recommended way of testing differ for each of them. The following table shows the types of deployment units and how to test them.

Recommended testing strategy

Tests target type                  | Unit tests         | Integration tests  | Consumer Driven Contract Tests | E2E tests            | DOT tests
-----------------------------------|--------------------|--------------------|--------------------------------|----------------------|------------------------------
Custom libraries                   | Highly recommended | If needed          | Not needed                     | Do not apply         | Do not apply
Custom microservices based on core | Highly recommended | If needed          | Not needed                     | Do not apply         | Do not apply
Step executors                     | Highly recommended | Highly recommended | Suggested                      | Do not apply         | Do not apply
Other microservices                | Highly recommended | Highly recommended | If needed                      | Do not apply         | Do not apply
Working environment                | Do not apply       | Do not apply       | Do not apply                   | At least smoke tests | For product catalogue testing

We recommend using the product "dev" deployment as a base for creating new tests. This mainly applies to e2e and DOT tests as the framework is already in place.

Rationale behind the table

The assumption is that the core mechanism works and is tested by the product team on a daily basis, so there is no need to re-test the standard flow of an order. During customization, the development team should focus on testing business logic: mainly the custom plugins for order and plan building, and the Step Executors themselves. All of them can be tested in isolation, and the customization team does not need to change the flow after fulfillment plan generation. That is why DOT tests should be enough to cover the crucial area of fulfillment plan generation, while unit and integration tests are sufficient for the last phase of processing: running the Step Executors.
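
As an illustration of testing a Step Executor in isolation, the sketch below uses only JUnit5, Mockito, and AssertJ. The executor and client interfaces are stand-ins defined for the sketch, not the actual SCDOM Step Executor contract.

    import static org.assertj.core.api.Assertions.assertThat;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import org.junit.jupiter.api.Test;

    class ActivateServiceExecutorTest {

        // Stand-in types defined only for this sketch.
        interface ActivationClient { boolean activate(String serviceId); }
        record StepResult(boolean success) {}
        static class ActivateServiceExecutor {
            private final ActivationClient client;
            ActivateServiceExecutor(ActivationClient client) { this.client = client; }
            StepResult execute(String serviceId) { return new StepResult(client.activate(serviceId)); }
        }

        // The external system called by the executor is mocked, so the test runs in isolation.
        private final ActivationClient client = mock(ActivationClient.class);
        private final ActivateServiceExecutor executor = new ActivateServiceExecutor(client);

        @Test
        void reportsSuccessWhenActivationSucceeds() {
            when(client.activate("svc-42")).thenReturn(true);

            assertThat(executor.execute("svc-42").success()).isTrue();
            verify(client).activate("svc-42");
        }
    }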