Unit tests in a DBMS - how we do it in Sportmaster, part two

The first part is here.


Imagine the situation: you are tasked with developing new functionality, and you have your predecessors' work at your disposal. Assuming that nothing obliges you to reuse it, what would you do?

Most often, all the old work gets forgotten and everything starts over from scratch. Nobody likes digging through someone else's code, and if time allows, why not build a system of your own? That is the typical approach, and in many respects a justified one. But on our project we acted differently. We based the future automated testing system on our predecessors' utPLSQL unit tests and then worked in several parallel directions.

  1. Restoring the old unit tests. Restoring here means adapting the tests to the current state of the loyalty system and bringing them in line with utPLSQL conventions.
  2. Solving the problem of understanding what exactly (which methods and processes) we have covered with autotests. You either keep this information in your head or draw conclusions directly from the autotest code. So we decided to create a catalog: we assigned a unique mnemonic code to each autotest, wrote a description, and fixed its settings (for example, under what conditions it should run, or what should happen if the run fails). In essence, we filled in metadata about the autotests and placed it in the standard tables of the utPLSQL schema; a sketch of what such a catalog might look like is shown below.
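    For illustration only, a catalog entry might look like this. The table ut_autotest_catalog and all of its columns are hypothetical names invented for this sketch; in our system the real metadata lives in the standard utPLSQL tables.

      -- Hypothetical catalog of autotest metadata (illustration only)
      CREATE TABLE ut_autotest_catalog (
        mnemo_code    VARCHAR2(30) PRIMARY KEY,  -- unique mnemonic code
        tested_unit   VARCHAR2(128) NOT NULL,    -- unit under test; utPLSQL runs ut_<unit>
        description   VARCHAR2(4000),            -- what the autotest checks
        run_condition VARCHAR2(100),             -- e.g. NIGHTLY or ON_RELEASE
        on_fail       VARCHAR2(100)              -- e.g. NOTIFY_TEAM
      );

      INSERT INTO ut_autotest_catalog
      VALUES ('CHK_DISC', 'check_discounts',
              'Distribution of discounts and bonuses across a receipt',
              'NIGHTLY', 'NOTIFY_TEAM');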
  3. Defining an expansion strategy, i.e. selecting the functionality to be covered by autotests. We decided to focus on three things: new improvements to the system, incidents from production, and the key processes of the system. This way we develop in step with the release, ensuring its higher quality, while simultaneously expanding the regression scope and safeguarding the system in its critical places. The first such bottleneck was the process of distributing discounts and bonuses across a receipt.
  4. Naturally, we started developing new autotests. One of the first release tasks was to evaluate the performance of the loyalty system's predefined selections. The project has a block of rigidly fixed SQL queries that select customers by condition: for example, the list of all customers whose last purchase was in a particular city, or the list of customers whose average purchase amount is above a certain value. With autotests we verified the predefined selections, recorded baseline performance figures, and got load testing as a bonus (a sketch of such a check follows below).
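    A minimal sketch of such a performance check, written in utPLSQL v2 style. The package name, the selection query with its table, and the 2-second baseline are all invented for illustration.

      CREATE OR REPLACE PACKAGE ut_predefined_samples IS
        PROCEDURE ut_city_last_purchase;
      END ut_predefined_samples;
      /
      CREATE OR REPLACE PACKAGE BODY ut_predefined_samples IS
        PROCEDURE ut_city_last_purchase IS
          l_start   PLS_INTEGER;
          l_elapsed NUMBER;       -- seconds
          l_cnt     PLS_INTEGER;
        BEGIN
          l_start := DBMS_UTILITY.GET_TIME;  -- hundredths of a second

          -- the rigidly fixed selection: customers whose last purchase
          -- was made in a given city (hypothetical table and columns)
          SELECT COUNT(*) INTO l_cnt
            FROM clients
           WHERE last_purchase_city = 'Moscow';

          l_elapsed := (DBMS_UTILITY.GET_TIME - l_start) / 100;

          utAssert.this('selection returns rows', l_cnt > 0);
          utAssert.this('selection fits the 2s baseline', l_elapsed <= 2);
        END ut_city_last_purchase;
      END ut_predefined_samples;
      /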
  5. Working with autotests should be convenient. The two most frequent actions are running autotests and generating test data. This is how two auxiliary modules appeared in our system: a launch module and a data generation module.

    The launcher is a single generic procedure with one text input parameter. You can pass it an autotest mnemonic code, a package name, a test name, an autotest setting, or a reserved keyword. The procedure selects and runs all autotests matching the condition, along the lines of the sketch below.
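    A sketch of the launcher idea, reusing the hypothetical catalog above; the procedure name and filtering rules are invented, while utplsql.test is the utPLSQL v2 entry point that runs the ut_<unit> test package for a given unit.

      -- Hypothetical launcher: one procedure, one text parameter
      CREATE OR REPLACE PROCEDURE run_autotests(p_filter IN VARCHAR2) IS
      BEGIN
        -- interpret the parameter as a mnemonic code, a unit name,
        -- or the reserved keyword ALL
        FOR r IN (SELECT tested_unit
                    FROM ut_autotest_catalog
                   WHERE mnemo_code = UPPER(p_filter)
                      OR tested_unit = LOWER(p_filter)
                      OR p_filter = 'ALL')
        LOOP
          utplsql.test(r.tested_unit);  -- runs the ut_<unit> test package
        END LOOP;
      END run_autotests;
      /

    For example, run_autotests('CHK_DISC') would run only the discount distribution tests.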

    The data generation module is a package in which, for each object of the system under test (a table in the database), a dedicated procedure inserts data into it. In these procedures the default values are filled in as fully as possible, so objects are created literally with a snap of the fingers. For ease of use, templates for the generated data were created: for example, a customer of a certain age with a test phone number and a completed purchase. A sketch of one generator procedure follows below.
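    For illustration, one such generator might look like this; the package, table, sequence, and column names are all hypothetical.

      CREATE OR REPLACE PACKAGE test_data_gen IS
        PROCEDURE create_client(
            p_client_id  OUT NUMBER,
            p_birth_date IN  DATE     DEFAULT DATE '1990-01-01',
            p_phone      IN  VARCHAR2 DEFAULT '+7-000-000-00-00',
            p_city       IN  VARCHAR2 DEFAULT 'Moscow');
      END test_data_gen;
      /
      CREATE OR REPLACE PACKAGE BODY test_data_gen IS
        -- every parameter is defaulted, so a valid test client
        -- appears with a single call
        PROCEDURE create_client(
            p_client_id  OUT NUMBER,
            p_birth_date IN  DATE     DEFAULT DATE '1990-01-01',
            p_phone      IN  VARCHAR2 DEFAULT '+7-000-000-00-00',
            p_city       IN  VARCHAR2 DEFAULT 'Moscow') IS
        BEGIN
          INSERT INTO clients (client_id, birth_date, phone, city)
          VALUES (clients_seq.NEXTVAL, p_birth_date, p_phone, p_city)
          RETURNING client_id INTO p_client_id;
        END create_client;
      END test_data_gen;
      /

    A template is then just a wrapper that combines several such calls, e.g. create_client followed by create_purchase.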

  6. Autotests should run regularly and finish within a time that is reasonable for your system. So a daily night run was organized, after which a report on the results is generated and sent to the entire development team by corporate mail. After restoring the old autotests and creating new ones, the total running time was 30 minutes. This performance suited everyone, since the run took place during off-hours.

    But we did have to work on execution speed. The production loyalty system is updated at night. During one of the releases we had to urgently make changes at night. Waiting half an hour for autotest results at three in the morning did not make the person responsible for the release happy (warm greetings to Alexei Vasyukov!), and the next morning a lot of heartfelt words were said about our system. As a result, a 5-minute standard was set.

    To speed things up we used two methods: we started running autotests in three parallel threads, which the architecture of our loyalty system makes very convenient, and we abandoned the approach where an autotest does not create test data for itself but tries to find something suitable already in the system. After these changes, the total running time dropped to 3-4 minutes (the parallel launch is sketched below).
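    One possible shape of the parallel launch, using the standard DBMS_SCHEDULER package. It assumes the launcher understands a reserved keyword per thread (THREAD_1 to THREAD_3), which is an invented convention for this sketch.

      BEGIN
        FOR i IN 1 .. 3 LOOP
          DBMS_SCHEDULER.CREATE_JOB(
            job_name   => 'AUTOTEST_THREAD_' || i,
            job_type   => 'PLSQL_BLOCK',
            job_action => 'BEGIN run_autotests(''THREAD_' || i || '''); END;',
            enabled    => TRUE,   -- start immediately
            auto_drop  => TRUE);  -- remove the job once it finishes
        END LOOP;
      END;
      /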

  7. A project with autotests should be deployable on various environments. At the beginning of the journey there were attempts to write our own batch files, but it became clear that a self-written automated installer is a complete horror, and we turned towards industrial solutions. Since the project contains a lot of code as such (first of all, the autotest code itself) and very little data (mainly the metadata about autotests), it turned out to be very easy to fit into a Liquibase project.

    Liquibase is an open-source, database-independent library for tracking, managing, and applying database schema changes, driven from the command line or from frameworks such as Apache Maven. Its principle of operation is quite simple: there is a project, organized in a certain way, consisting of changes (scripts) that need to be rolled out to the target server, plus control files that determine in what order and with what parameters these changes should be installed.

    At the DBMS level, a special table is created in which Liquibase keeps the log of applied changes. Each change has a computed hash that is compared on every run between the project and the state of the database. Thanks to Liquibase, we can easily roll out changes to our system to any environment. Autotests now run on the test and release environments, as well as on containers (personal developer environments). A sample changeset is sketched below.
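    For illustration, here is what one changeset might look like in Liquibase's formatted SQL changelog syntax (the author label and the package are invented). The runOnChange attribute re-applies the script whenever its checksum changes, which suits CREATE OR REPLACE objects such as test packages; the special log table Liquibase maintains is DATABASECHANGELOG.

      --liquibase formatted sql

      --changeset loyalty_team:ut_check_discounts runOnChange:true splitStatements:false
      CREATE OR REPLACE PACKAGE ut_check_discounts IS
        PROCEDURE ut_distribute_by_receipt;
      END ut_check_discounts;

    Rolling the project onto a target server then comes down to a single liquibase update call.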


So, let's talk about the results of applying our unit testing system.

  1. Of course, first of all, we are convinced that we have started developing better software. Autotests run daily and find dozens of bugs every release. Moreover, some of these errors are only indirectly related to the functionality we actually set out to change. It is highly doubtful that these errors would have been found by manual testing.
  2. The team gained confidence that specific functionality works correctly, above all our critical processes. For example, over the past six months we have had no problems with the distribution of discounts and bonuses across a receipt, despite changes being made every release, whereas in earlier periods errors occurred with some regularity.
  3. We managed to reduce the number of testing iterations. Because autotests are written for new functionality, the analysts (who also act as testers) receive code of higher quality, since it has already been verified.
  4. Some of the automated-testing developments are used by the developers themselves. For example, test data on containers is created using the object generation module.
  5. It is important that developers have come to accept the automated testing system. There is an understanding that it is important and useful. From my own experience I can say that this is far from always the case: autotests need to be written, maintained, and developed, their results analyzed, and often these time costs are simply not worth it. It is much easier to go to production and deal with problems there. On our project, developers line up and ask to have their functionality covered with autotests.

What's next


Let's talk about the development plans of the automated testing project.

Of course, as long as the Sportmaster loyalty system lives and continues to develop, the autotests can be developed almost endlessly as well. Therefore, the main direction of development is expanding coverage.

As the number of autotests grows, their total running time will steadily increase, and we will again have to return to the question of performance. Most likely, the solution will be to increase the number of parallel threads.

But these are the obvious paths of development. If we talk about something less trivial, we highlight the following:

  1. Autotests are currently managed at the DBMS level, i.e. successful work with them requires knowledge of PL/SQL. If needed, system management (for example, launching tests or creating metadata) can be moved out to some kind of admin panel, using Jenkins or something similar.
  2. Everyone loves quantitative and qualitative indicators. For automated testing, such a universal indicator is Code Coverage, the code coverage metric. Using it, we can determine what percentage of the code of the system under test is covered by autotests. Starting from version 12.2, Oracle provides the ability to calculate this metric through the standard DBMS_PLSQL_CODE_COVERAGE package (its use is sketched below).

    Our autotest system is just over a year old, and it may be time to evaluate coverage. On my previous project (not a Sportmaster one), this is exactly what happened. A year into the work on autotests, management set the task of assessing what percentage of the code we had covered. Anything above 1% would have made management happy. We, the developers, expected a result of about 10%. We bolted on code coverage, measured it, and got 20%. To celebrate, we went to claim the reward, but how we went for it and where we ended up afterwards is a completely different story.
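    A minimal sketch of measuring coverage with this package (Oracle 12.2+). START_COVERAGE, STOP_COVERAGE, and the DBMSPCC_* report tables are the standard API; only the run comment and the reuse of the hypothetical launcher are our own.

      DECLARE
        l_run_id NUMBER;
      BEGIN
        -- one-time creation of the DBMSPCC_* coverage tables
        DBMS_PLSQL_CODE_COVERAGE.CREATE_COVERAGE_TABLES;

        l_run_id := DBMS_PLSQL_CODE_COVERAGE.START_COVERAGE(
                      run_comment => 'nightly autotest run');
        run_autotests('ALL');  -- the launcher sketched earlier
        DBMS_PLSQL_CODE_COVERAGE.STOP_COVERAGE;
      END;
      /
      -- share of covered basic blocks per program unit for that run
      SELECT u.name,
             ROUND(100 * SUM(b.covered) / COUNT(*), 1) AS pct_covered
        FROM dbmspcc_runs   r
        JOIN dbmspcc_units  u ON u.run_id = r.run_id
        JOIN dbmspcc_blocks b ON b.run_id = u.run_id
                             AND b.object_id = u.object_id
       WHERE r.run_comment = 'nightly autotest run'
       GROUP BY u.name;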

  3. Autotests can test exposed web services. Oracle allows us to do this, and it would spare us a number of problems. A sketch of such a test follows after this list.
  4. And, of course, our automated testing system can be applied to other projects. The solution we built is universal and only requires the use of Oracle. I have heard that there is interest in automated testing on other Sportmaster projects, and perhaps we will come to them.
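As promised above, here is one possible shape of a web service autotest, again in utPLSQL v2 style and based on the standard UTL_HTTP package; the package name, URL, and endpoint are invented for illustration.

    CREATE OR REPLACE PACKAGE ut_loyalty_ws IS
      PROCEDURE ut_ping_discount_service;
    END ut_loyalty_ws;
    /
    CREATE OR REPLACE PACKAGE BODY ut_loyalty_ws IS
      PROCEDURE ut_ping_discount_service IS
        l_req  UTL_HTTP.REQ;
        l_resp UTL_HTTP.RESP;
      BEGIN
        -- call a hypothetical loyalty service endpoint
        l_req  := UTL_HTTP.BEGIN_REQUEST('http://test-host/loyalty/api/ping');
        l_resp := UTL_HTTP.GET_RESPONSE(l_req);

        utAssert.this('service answers 200', l_resp.status_code = 200);

        UTL_HTTP.END_RESPONSE(l_resp);
      END ut_ping_discount_service;
    END ut_loyalty_ws;
    /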

Conclusions

Let's recap. On the loyalty system project at Sportmaster, we managed to implement an automated testing system. It is based on the utPLSQL solution from Steven Feuerstein. Around utPLSQL sit the autotest code and auxiliary self-written modules: the launcher, the data generation module, and others. The autotests run daily and, most importantly, work and bring benefit. We are convinced that we have begun to release software of higher quality. At the same time, the resulting solution is universal and can be freely applied to any project that needs automated testing on the Oracle DBMS.

P.S. This article turned out to be largely narrative: a lot of text, with only schematic sketches in place of real technical examples. If the topic is of broad interest, we are ready to continue it and return with a follow-up, where we will tell what has changed over the past six months and give real code examples.

Write in the comments if there are points that should be emphasized in the future, or questions that need more detailed coverage.


Source: habr.com
