Unit tests in a DBMS - how we do it at Sportmaster, part one

Hey Habr!

My name is Maxim Ponomarenko and I am a developer at Sportmaster. I have 10 years of experience in IT. I started my career in manual testing, then switched to database development. For the last 4 years, combining the knowledge gained in testing and development, I have been working on test automation at the DBMS level.

I have been part of the Sportmaster team for a little over a year, developing automated testing on one of its major projects. In April, colleagues from Sportmaster Lab and I spoke at a conference in Krasnodar; my talk was called "Unit tests in a DBMS", and now I want to share it with you. There will be a lot of text, so I decided to split the talk into two posts. In the first we will talk about autotests and testing in general, and in the second I will go into more detail about our unit testing system and the results of applying it.


First, some boring theory. What is automated testing? It is testing performed by software, and in modern IT it is used more and more in software development. This is because companies grow, their information systems grow, and so does the amount of functionality that needs to be tested. Manual testing keeps getting more expensive.

I used to work for a large company that released a new version every two months. A whole month of each cycle was spent on a dozen testers manually checking the functionality. By introducing automation, a small team of developers managed to cut testing time down to 2 weeks within a year and a half. We not only increased the speed of testing, but also improved its quality: automated tests run regularly and always perform their full set of checks, so we eliminate the human factor.

Modern IT is characterized by the fact that a developer may be required not only to write product code, but also to write unit tests that check this code.

But what if your system is based primarily on server-side logic? There is no universal solution or established best practice on the market. As a rule, companies solve this problem by building their own home-grown testing system. It is exactly such a home-grown automated testing system that was created on our project, and that is what I will talk about in this report.


Testing loyalty

First, let's talk about the project where we deployed an automated testing system. Our project is the Sportmaster loyalty system (by the way, we already wrote about it in this post).

If your company is large enough, then your loyalty system will have three standard properties:

  • Your system will be heavily loaded
  • Your system will contain complex computational processes
  • Your system will be actively developed.

Let's take these in order. In total, if we count all the Sportmaster brands, we have more than 1000 stores in Russia, Ukraine, China, Kazakhstan and Belarus. These stores make about 300,000 purchases daily; that is, 3-4 checks enter our system every second. Naturally, our loyalty system is highly loaded. And since it is actively used, we must hold it to the highest quality standards, because any error in the software means large financial, reputational and other losses.

At the same time, more than a hundred different promotions run in Sportmaster. Promotions vary widely: there are product promotions, promotions tied to a day of the week, promotions tied to a specific store, promotions based on the check amount, promotions based on the number of items. Quite a lot, in short. Customers also have bonuses and promo codes that are applied at purchase. All this makes the calculation of any order a very non-trivial task.

The algorithm that processes an order is truly complex and scary, and any change to it is a rather risky thing: even seemingly insignificant changes can lead to quite unpredictable effects. But it is precisely such complex computational processes, especially those implementing critical functionality, that are the best candidates for automation. Checking dozens of similar cases by hand is very time-consuming, and since the entry point to the process does not change, once you have described it you can quickly stamp out automated tests and be confident the functionality works.

Since our system is actively used, the business will always want something new from it: to keep up with the times and stay customer-oriented. In our loyalty system, releases come out every two months, which means that every two months we need to run a complete regression of the entire system. At the same time, of course, as anywhere in modern IT, development does not go straight from the developer to production. It originates in the developer's environment, then passes sequentially through the test, release and acceptance stands, and only then reaches production. At least on the test and release environments we need to run a complete regression of the entire system.

The described properties are standard for almost any loyalty system. Let's talk about the features of our project.

Technologically, 90% of our loyalty system's logic is server-side and implemented in Oracle. There is a Delphi client that serves as the administrator's workstation, and there are web services exposed to external applications (e.g., the website). So it is only logical that if we build an automated testing system, we build it in Oracle.

The loyalty system at Sportmaster has existed for more than 7 years and was created by just a few developers: the average number of developers on the project over those 7 years was 3-4 people. But over the past year our team has grown considerably, and now 10 people work on the project. That is, people who are not familiar with its typical tasks, processes and architecture are joining, which increases the risk of missing mistakes.

The project has no dedicated testers on staff. Of course there is testing, but it is done by analysts on top of their other main duties: communicating with business customers and users, developing system requirements, and so on. Even though the testing is done very well (it is especially appropriate to mention this, since some of those analysts may read this report), nothing beats the effectiveness of specialization and concentration on one thing.

Given all of the above, the idea of automating testing on the project, in order to improve the quality of the delivered product and reduce development time, seems very logical. And at different stages of the loyalty system's existence, individual developers did make efforts to cover their code with unit tests. It was a rather fragmented process, though: everyone used their own architecture and methods, and the results were always the same. Tests were developed, used for a while, committed to version control, but at some point they stopped being run and were forgotten. Above all, this happened because the tests were tied to a specific person rather than to the project.

utPLSQL comes to the rescue


Do you know who Steven Feuerstein is?

He is a smart guy who has devoted much of his career to working with Oracle and PL/SQL and has written a large number of books on the topic; one of the best known is "Oracle PL/SQL Programming". It was Steven who developed utPLSQL, which stands for Unit Testing framework for Oracle PL/SQL. utPLSQL was created in 2016, but it continues to be actively developed, with new versions being released; at the time of this talk, the latest version is dated March 24, 2019.
What is it? A standalone open-source project. It weighs a couple of megabytes, including examples and documentation. Physically, it is a separate schema in an Oracle database with a set of packages and tables for organizing unit testing. Installation takes a few seconds, and a distinctive feature of utPLSQL is its ease of use.
Globally, utPLSQL is a mechanism for running unit tests, where a unit test is an ordinary Oracle packaged procedure whose organization follows certain rules. Besides running tests, utPLSQL keeps a log of all your test runs and has an internal reporting system.

Let's look at an example of what unit-test code implemented with this approach looks like.


So, here is the code of a typical package specification with unit tests. What are the mandatory requirements? The package must have the "utp_" prefix, and all procedures containing tests must have exactly the same prefix. The package must contain two standard procedures, "utp_setup" and "utp_teardown": the first is called before each unit test run, the second after it.

"utp_setup" typically prepares our system to run a unit test, such as creating test data. "utp_teardown" - on the contrary, everything returns to the original settings and resets the launch results.

Here is an example of the simplest unit test: it checks that a customer's phone number, as entered, is normalized to the standard form used in our loyalty system. There are no mandatory standards for how to write the test procedures themselves. As a rule, a method of the system under test is called, and the result it returns is compared with the reference one. What matters is that the comparison of the reference and actual results goes through standard utPLSQL methods.

A unit test can contain any number of checks. As the example shows, we make four consecutive calls to the method under test that normalizes the phone number, and evaluate the result after each call. When developing a unit test, keep in mind that some checks do not affect the system at all, while after others the system must be rolled back to its initial state.
For example, in the unit test shown we simply format the input phone number, which does not affect the loyalty system in any way.
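As a sketch of what such a test procedure could look like: the `client_pkg.normalize_phone` function, the package name and the expected "+7..." format are assumptions made for illustration, and `ut.expect(...).to_equal(...)` is the standard assertion entry point in recent utPLSQL versions.

```sql
-- Illustrative package body: a phone-normalization unit test with four
-- checks. Function names and the target format are assumptions.
CREATE OR REPLACE PACKAGE BODY utp_client_phone AS

  PROCEDURE utp_setup IS
  BEGIN
    NULL;  -- nothing to prepare: the test only formats a string
  END;

  PROCEDURE utp_teardown IS
  BEGIN
    NULL;  -- nothing to roll back, for the same reason
  END;

  PROCEDURE utp_normalize_phone IS
  BEGIN
    -- four calls to the method under test, each compared with the
    -- reference result through a standard utPLSQL assertion
    ut.expect(client_pkg.normalize_phone('89991234567')).to_equal('+79991234567');
    ut.expect(client_pkg.normalize_phone('8 (999) 123-45-67')).to_equal('+79991234567');
    ut.expect(client_pkg.normalize_phone('+7 999 123 45 67')).to_equal('+79991234567');
    ut.expect(client_pkg.normalize_phone('9991234567')).to_equal('+79991234567');
  END;

END utp_client_phone;
/
```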

But if we wrote unit tests for the method that creates a new client, then each test run would create a new client in the system, which might affect a subsequent run.
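For such a test, the teardown is where the side effect would be undone, for example like this (the table and column names here are assumptions for illustration):

```sql
-- Illustrative teardown for a client-creation test: remove the client
-- inserted by the test so the next run starts from a clean state.
-- Table and column names are assumptions.
PROCEDURE utp_teardown IS
BEGIN
  DELETE FROM clients
   WHERE phone = '+79991234567';  -- the fixed test client's phone
  COMMIT;
END utp_teardown;
```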


This is how unit tests are run. There are two launch options: run all unit tests in a specific package, or run a specific unit test in a specific package.
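Assuming the `ut.run` entry point of current utPLSQL versions (the package and test names here are illustrative), the two launch options look like this:

```sql
-- Option 1: run every unit test in a specific package
BEGIN
  ut.run('utp_client_phone');
END;
/

-- Option 2: run one specific unit test from that package
BEGIN
  ut.run('utp_client_phone.utp_normalize_phone');
END;
/
```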


Here is an example of the internal reporting system. Based on the results of a run, utPLSQL builds a small report in which we see the result of each individual check and the overall result of the unit test.

6 autotest rules

Before starting to create a new system for automated testing of the loyalty system, together with the management, we determined the principles that our future automated tests should comply with.


  1. Autotests must be effective and useful. We have great developers (they definitely deserve a mention, since some of them will surely see this report) who write wonderful code. But even their wonderful code is not perfect: it has contained, contains and will contain errors. Autotests exist to find those errors. If they don't, then either we are writing bad autotests, or we have come to a dead area that is, in principle, no longer being developed. In both cases we are doing something wrong, and our approach is simply meaningless.
  2. Autotests should be used. It makes no sense to spend a lot of time and effort writing a software product, commit it to a repository and forget about it. Tests should be run, and run as regularly as possible.
  3. Autotests should work stably. Regardless of the time of day, the stand they run on, and other system settings, test runs should produce the same result. As a rule, this is ensured by having the autotests work with special test data under fixed system settings.
  4. Autotests should run at a speed acceptable for your project. This time is determined individually for each system: some can afford a run that takes a whole day, for others it is critical to fit within seconds. I will tell you a little later what speed standards we achieved on our project.
  5. Autotest development should be flexible. It is undesirable to refuse to check some functionality simply because we have never done it before, or for some other reason. utPLSQL does not impose any restrictions on development, and Oracle, in principle, lets you implement a wide variety of things. Most problems have a solution; it is only a question of time and effort.
  6. Deployability. We have several stands where we need to run tests, and on each of them the data dump can be refreshed at any moment. The autotest project must be maintained in such a way that a full or partial installation of it can be performed painlessly.

And in the second post, in a couple of days, I will tell you what we did and what results we achieved.

Source: habr.com
