The importance of Unit Testing in BI
Posted by Davide Mauri on SQL Blog
Published on Thu, 01 Apr 2010 16:50:06 GMT
One of the main steps in the process we use internally to develop a BI solution is the implementation of unit tests for your BI data.
As you may already know, I’ve created a simple (for now) tool that leverages NUnit to let us quickly create unit tests without having to resort to Visual Studio Database Professional:
http://queryunit.codeplex.com/
Once you have a tool like this, you can also start to make sure that your BI solution (DWH and cube) is not only structurally sound (that is, the cube or the report gets processed correctly), but also that the logical integrity of your business rules is enforced.
For example, let’s say the customer tells you that they will never create an invoice for a specific product line in 2010, since that product line has been discontinued and will never be sold again. We know that in theory this is true, but much of this business rule’s effectiveness depends on people not making mistakes while inserting new orders/invoices, and on the ERP actually implementing a check for this business logic. Unfortunately, these last two hypotheses are not always true, so you may really find yourself with invoices for a product line that doesn’t exist anymore.
Maybe this kind of situation will be solved in the future using Master Data Management but, meanwhile, how can you give your customers an idea of the quality of their data? How can you check that the logical integrity of the analytical data you produce is exactly what you expect?
Well, Unit Testing of a DWH or a CUBE can be a solution.
Once you have defined your test suite, by writing SQL and MDX queries that check that your data is what you expect it to be, and if you use NUnit (as QueryUnit does), you can then use a tool like NUnit2Report to create a nice HTML report that can be shipped via email to report on data quality.
In addition, since NUnit produces an XML file as a result, you can also import it into a SQL Server database and then monitor the quality of your data over time.
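A minimal sketch of that import step, again using Python and an in-memory SQLite database in place of SQL Server, and a hand-written fragment in the shape of an NUnit 2.x result file (the real file would come from the actual test run; table and column names here are my own invention):

```python
import sqlite3
import xml.etree.ElementTree as ET

# A tiny, hand-made example of an NUnit 2.x result file.
NUNIT_XML = """<test-results date="2010-04-01" total="2" failures="1">
  <test-suite name="DwhChecks" success="False">
    <results>
      <test-case name="NoInvoicesForDiscontinuedLine" success="False" time="0.120"/>
      <test-case name="FactRowCountMatchesSource" success="True" time="0.050"/>
    </results>
  </test-suite>
</test-results>"""

def import_results(xml_text, conn):
    """Flatten every <test-case> into a history table so data quality
    can be tracked run over run."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS TestHistory (
            RunDate  TEXT,
            TestName TEXT,
            Success  INTEGER,
            Seconds  REAL
        )
    """)
    root = ET.fromstring(xml_text)
    run_date = root.get("date")
    for case in root.iter("test-case"):
        conn.execute(
            "INSERT INTO TestHistory VALUES (?, ?, ?, ?)",
            (run_date, case.get("name"),
             1 if case.get("success") == "True" else 0,
             float(case.get("time", 0))),
        )
    # Summary for this run: (passed, total).
    return conn.execute(
        "SELECT SUM(Success), COUNT(*) FROM TestHistory WHERE RunDate = ?",
        (run_date,),
    ).fetchone()

conn = sqlite3.connect(":memory:")
passed, total = import_results(NUNIT_XML, conn)
```

Once each run lands in a history table like this, a trivial query per run date gives you a data-quality trend you can chart for the customer.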
I’ll be speaking about this approach (and more generally about how to “engineer” a BI solution) at the next European SQL PASS:
Adaptive BI Best Practices
http://www.sqlpass.org/summit/eu2010/Agenda/ProgramSessions/AdaptiveBIBestPratices.aspx
I’ll enjoy discussing all of this with you, so see you there!
And remember:
“if it ain’t tested, it’s broken!”
(Sorry, I don’t remember who said that in the first place :-))