Database testing

Database testing usually consists of a layered process that includes the user interface (UI) layer, the business layer, the data access layer and the database itself. The UI layer deals with the interface design of the database, while the business layer covers the business logic and strategies that the databases support.

Purposes

Databases, which are collections of interconnected files on a server that store information, may not all deal with the same type of data; that is, databases may be heterogeneous. As a result, many kinds of implementation and integration errors may occur in large database systems, and these negatively affect the system's performance, reliability, consistency and security. It is therefore important to test database systems in order to obtain a system that satisfies the ACID properties (atomicity, consistency, isolation and durability) of a database management system.[1]
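
For example, a test of the atomicity property can wrap several statements in one transaction, force a failure partway through, and check that none of the changes persist. The following minimal sketch uses Python's built-in sqlite3 module with an in-memory database purely for illustration; the accounts table and the simulated failure are invented for the example.

```python
import sqlite3

# Illustrative atomicity check: either both updates commit, or neither does.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER NOT NULL)")
conn.execute("INSERT INTO accounts VALUES (1, 100), (2, 50)")
conn.commit()

try:
    with conn:  # the connection context manager rolls the transaction back on error
        conn.execute("UPDATE accounts SET balance = balance - 30 WHERE id = 1")
        # Simulate a failure between the debit and the matching credit.
        raise RuntimeError("simulated crash between debit and credit")
        conn.execute("UPDATE accounts SET balance = balance + 30 WHERE id = 2")
except RuntimeError:
    pass

balances = dict(conn.execute("SELECT id, balance FROM accounts"))
# If the DBMS is atomic, the partial debit must not be visible afterwards.
assert balances == {1: 100, 2: 50}, f"atomicity violated: {balances}"
```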

One of the most critical layers is the data access layer, which deals with the database directly during communication. Database testing mainly takes place at this layer and involves testing strategies such as quality control and quality assurance of the product's databases.[2] Testing at these different layers is frequently used to maintain the consistency of database systems.

Types of testing and processes

Different database testing methods, such as black-box testing and white-box testing, involve different areas of testing, as described below.

Black-box

Black-box testing involves testing interfaces and the integration of the database, which includes:

  1. Mapping of data (including metadata)
  2. Verifying incoming data
  3. Verifying outgoing data from query functions
  4. Applying techniques such as cause-effect graphing, equivalence partitioning and boundary-value analysis.

With the help of these techniques, the functionality of the database can be tested thoroughly.
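
As an illustration of the boundary-value and equivalence-partitioning techniques above, a black-box test can exercise a query function purely through its interface, without inspecting the SQL behind it. The sketch below is a minimal example in Python with sqlite3; the find_orders_by_amount function, the orders schema and the threshold of 100 are all invented for illustration.

```python
import sqlite3

def find_orders_by_amount(conn, min_amount):
    """Hypothetical query function under test: ids of orders with amount >= min_amount."""
    rows = conn.execute(
        "SELECT id FROM orders WHERE amount >= ? ORDER BY id", (min_amount,)
    )
    return [r[0] for r in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount INTEGER NOT NULL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 99), (2, 100), (3, 101)])

# Boundary-value analysis around the invented threshold of 100:
# values just below, on and just above the boundary.
assert find_orders_by_amount(conn, 100) == [2, 3]
assert find_orders_by_amount(conn, 101) == [3]
assert find_orders_by_amount(conn, 102) == []

# Equivalence partitioning: any representative of the "below every stored
# amount" class should behave the same way, so one value stands for the class.
assert find_orders_by_amount(conn, 0) == [1, 2, 3]
```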

Black-box testing has both advantages and drawbacks. Test case generation is fairly simple: test cases can be created completely independently of software development and at an early stage of development. As a consequence, the programmer gains a better idea of how to design the database application and spends less time on debugging. The cost of developing black-box test cases is also lower than that of white-box test cases. The major drawback of black-box testing is that it is unknown how much of the program is being tested; in addition, certain errors cannot be detected.[3]

White-box

White-box testing mainly deals with the internal structure of the database. The specification details are hidden from the user.

  1. It involves the testing of database triggers and logical views that support database refactoring.
  2. It performs module testing of database functions, triggers, views, SQL queries, etc.
  3. It validates database tables, data models, database schemas, etc.
  4. It checks rules of referential integrity.
  5. It selects default table values to check database consistency.
  6. The techniques used in white-box testing include condition coverage, decision coverage, statement coverage and cyclomatic complexity.

The main advantage of white-box testing in database testing is that coding errors are detected, so internal bugs in the database can be eliminated. Its limitation is that SQL statements are not covered.
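
As a minimal illustration, a white-box test written with knowledge of the schema can exercise a referential-integrity rule and a trigger directly. The sketch below uses Python's sqlite3 module; the two-table schema and the audit trigger are invented for the example and are not meant to represent a complete white-box suite.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces foreign keys only when enabled
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id)
    );
    CREATE TABLE order_audit (order_id INTEGER, logged_at TEXT);
    CREATE TRIGGER trg_order_audit AFTER INSERT ON orders
    BEGIN
        INSERT INTO order_audit VALUES (NEW.id, datetime('now'));
    END;
""")

# Referential integrity: an order pointing at a missing customer must be rejected.
try:
    conn.execute("INSERT INTO orders VALUES (1, 999)")
    raise AssertionError("expected a foreign key violation")
except sqlite3.IntegrityError:
    pass

# Trigger logic: inserting a valid order must write exactly one audit row.
conn.execute("INSERT INTO customers VALUES (1, 'Ada')")
conn.execute("INSERT INTO orders VALUES (1, 1)")
count = conn.execute("SELECT COUNT(*) FROM order_audit WHERE order_id = 1").fetchone()[0]
assert count == 1, f"expected one audit row, found {count}"
```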

The WHODATE approach

When test cases are generated for database testing, the semantics of the SQL statements need to be reflected in them. For this purpose, a technique called the WHite bOx Database Application Technique (WHODATE) is used. In this approach, SQL statements are first converted into general programming language (GPL) statements, and traditional white-box testing is then performed to generate test cases that cover the SQL semantics.[4]
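
The transformation rules of WHODATE itself are defined in the cited paper; the sketch below is only a loose illustration of the general idea, not the published technique. A WHERE clause is rewritten as an equivalent function in a general programming language (Python here), and ordinary white-box criteria such as branch coverage are then applied to the converted predicate. The SQL statement, the predicate and the test values are all invented for the example.

```python
# SQL under test (illustrative only):
#   SELECT * FROM employees WHERE salary > 50000 AND dept = 'R&D'
#
# A rough GPL equivalent of the WHERE predicate, so that conventional
# white-box criteria (branch/condition coverage) can be applied to it.

def where_predicate(salary, dept):
    if salary > 50000:          # condition 1
        if dept == 'R&D':       # condition 2
            return True         # row would be selected
    return False                # row would be rejected

# Test cases derived to cover every branch of the converted predicate;
# the same inputs can then be reused as rows in the database test.
assert where_predicate(60000, 'R&D') is True     # both conditions true
assert where_predicate(60000, 'Sales') is False  # second condition false
assert where_predicate(40000, 'R&D') is False    # first condition false
```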

Four stages

A set-up fixture describes the initial state of the database before testing begins. Once the fixtures are in place, the behaviour of the database is tested against the defined test cases. Depending on the outcome, the test cases are either modified or kept as they are. The tear-down stage either ends the testing or continues with other test cases.[5]

For successful database testing, each individual test commonly executes the following workflow (a minimal code sketch follows the list):

  1. Clean up the database: if testable data is already present in the database, the database needs to be emptied.
  2. Set up the fixture: a tool such as PHPUnit then iterates over the fixtures and inserts them into the database.
  3. Run the test, verify the outcome and tear down: once the database has been reset and the fixtures loaded, the test is run and its output is verified. If the output is as expected, the tear-down stage follows; otherwise the testing is repeated.
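
A minimal version of this clean-up / set-up / verify / tear-down cycle can be written with any xUnit-style framework. PHPUnit provides it for PHP; the sketch below shows the analogous structure in Python's unittest with an in-memory sqlite3 database, where the products schema, fixture rows and query are invented for the example.

```python
import sqlite3
import unittest

class ProductQueryTest(unittest.TestCase):
    def setUp(self):
        # Clean up: every test starts from an empty, known database state.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, price INTEGER)")
        # Set up fixture: insert the rows the test expects to find.
        self.conn.executemany(
            "INSERT INTO products VALUES (?, ?)", [(1, 10), (2, 25), (3, 40)]
        )
        self.conn.commit()

    def test_prices_above_threshold(self):
        # Run the test and verify the outcome against the known fixture.
        rows = self.conn.execute(
            "SELECT id FROM products WHERE price > 20 ORDER BY id"
        ).fetchall()
        self.assertEqual([r[0] for r in rows], [2, 3])

    def tearDown(self):
        # Tear down: release the database so the next test starts clean.
        self.conn.close()

if __name__ == "__main__":
    unittest.main()
```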

Basic techniques

  1. The set-up for database testing is costly and complex to maintain, because database systems constantly change through the expected insert, delete and update operations.
  2. Extra overhead is involved in determining the state of the database transactions (one simple approach is sketched after this list).
  3. After cleaning up the database, new test cases have to be designed.
  4. An SQL generator is needed to transform SQL statements so that the SQL semantics are included in the database test cases.
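
One simple way to handle the state-tracking overhead mentioned in point 2 is to snapshot the tables a test touches before and after the operation under test and compare the two. The sketch below shows this in Python with sqlite3; the snapshot helper and the inventory schema are hypothetical and not part of any particular tool.

```python
import sqlite3

def snapshot(conn, table):
    """Hypothetical helper: capture the full contents of one table as sorted rows."""
    return sorted(conn.execute(f"SELECT * FROM {table}"))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (sku TEXT PRIMARY KEY, qty INTEGER)")
conn.execute("INSERT INTO inventory VALUES ('A-1', 5)")

before = snapshot(conn, "inventory")

# Operation under test: decrement stock for one SKU.
conn.execute("UPDATE inventory SET qty = qty - 1 WHERE sku = 'A-1'")
conn.commit()

after = snapshot(conn, "inventory")

# Comparing snapshots pins down exactly what the transaction changed.
assert before == [('A-1', 5)]
assert after == [('A-1', 4)]
```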

References


  1. Korth, Henry F. Database System Concepts. McGraw-Hill, 2010. ISBN 978-0-07-352332-3.
  2. Ambler, Scott. Agile Database Techniques: Effective Strategies for the Agile Software Developer. Wiley, 2003. ISBN 978-0-471-20283-7.
  3. Pressman, Roger. Software Engineering: A Practitioner's Approach. McGraw-Hill Education, 1994. ISBN 978-0-07-707732-7.
  4. Zhang, Yanchun. Cooperative Databases and Applications '99: The Proceedings of the Second International Symposium on Cooperative Database Systems for Advanced Applications (CODAS '99), Wollongong, Australia, March 27–28, 1999. Springer, 1999. ISBN 978-981-4021-64-7.
  5. Kan, Stephen. Metrics and Models in Software Quality Engineering. Pearson Education. ISBN 978-81-297-0175-6.
  6. InfoWorld. InfoWorld Media Group, Inc., 15 January 1996.