Well-written, unambiguous requirements are the first step to a successful BI/ETL implementation, and good testing of these requirements can significantly reduce your project's cost and risk.
Glancing at the diagram below, it is immediately apparent how important it is to start testing from the very first stage – at the requirements. The cost of a missed defect grows exponentially as time and work progress, creating a snowball effect. On the other hand, good, verified requirements reduce errors and cost in the final stage.
However, cost is not the only potential downside to consider when determining test points while building a BI system. Let's look at a healthcare scenario as an example. A data warehouse and BI system might help providers diagnose illnesses, prescribe treatments and determine drug dosages. Information has life and death consequences – it has to be correct! Similarly, a BI system built for a financial institution could determine where and how credits and debits are made, moving large sums of money between accounts. Incorrect mapping or misinformation could even result in criminal liability. These are matters of great importance.
For this reason, requirements should be accompanied not only by Source to Target Mapping (STT) documents, but also by diagrams, data model schemas, traceability matrices and other supporting documents. Ideally, different views of the same requirements should be constructed by different people to prevent the possibility of ambiguous interpretation. By validating requirements in a variety of ways, we can identify the most likely bottlenecks and errors before implementation. For instance, we can:
- identify ambiguous requirements;
- see where requirements have been missed;
- note incorrect mapping;
- eliminate repetitive sources for the same metrics that occur in several requirements.
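Checks like these can be partially automated once the mapping document is exported to a machine-readable form. The sketch below, with hypothetical mapping rows and field names, flags two of the issues listed above: a target with no source mapped, and a metric fed by multiple competing sources.

```python
# Sketch: automated checks over a Source-to-Target Mapping extract.
# The mapping rows and column names below are hypothetical examples.
from collections import defaultdict

mappings = [
    {"source": "crm.customers.cust_id", "target": "dw.dim_customer.customer_key", "rule": "direct"},
    {"source": "erp.clients.client_id", "target": "dw.dim_customer.customer_key", "rule": "direct"},
    {"source": "crm.orders.amount",     "target": "dw.fact_sales.sales_amount",   "rule": "SUM"},
    {"source": "",                      "target": "dw.fact_sales.discount_pct",   "rule": "direct"},
]

def check_mappings(rows):
    """Return a list of human-readable issues found in the mapping rows."""
    issues = []
    by_target = defaultdict(list)
    for r in rows:
        if not r["source"]:  # target column with no mapped source
            issues.append(f"missing source for {r['target']}")
        by_target[r["target"]].append(r["source"])
    for target, sources in by_target.items():
        if len(sources) > 1:  # same metric mapped from several sources
            issues.append(f"multiple sources for {target}: {sources}")
    return issues

for issue in check_mappings(mappings):
    print(issue)
```

Running checks like this against every revision of the mapping document catches regressions as requirements evolve, rather than at implementation time.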
For greater clarity, we could use Enterprise Architect to represent the table mappings, and Excel for analysis and a more detailed representation of the requirements.
In a data warehouse, requirements mostly revolve around reporting. Hence it becomes important to verify whether these reporting requirements can actually be delivered using the data available.
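One simple way to verify this is to compare each report's required fields against a catalog of the columns actually available in the sources. The report names and catalog contents below are illustrative assumptions, not taken from any real project.

```python
# Sketch: confirm each reporting requirement can be served by available data.
# Report names and the source-column catalog are illustrative assumptions.

available_columns = {
    "sales.orders": {"order_id", "order_date", "amount", "region"},
}

report_requirements = {
    "Monthly Revenue by Region": [("sales.orders", "order_date"),
                                  ("sales.orders", "amount"),
                                  ("sales.orders", "region")],
    "Customer Churn": [("crm.customers", "signup_date"),
                       ("crm.customers", "last_active")],
}

def coverage_gaps(requirements, catalog):
    """Map each report to the source columns it needs that don't exist."""
    gaps = {}
    for report, fields in requirements.items():
        missing = [f"{table}.{col}" for table, col in fields
                   if col not in catalog.get(table, set())]
        if missing:
            gaps[report] = missing
    return gaps

print(coverage_gaps(report_requirements, available_columns))
```

A non-empty result shows exactly which reporting requirements cannot be met from the current sources, before any ETL work begins.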
Successful requirements are structured closely around business rules, addressing both functionality and performance. These business rules and requirements provide a solid foundation for the data architect, who designs the data model from them. Once requirements and business rules are available, it is important to draft rough test scripts that validate data model constraints against the defined business rules.
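Such a rough test script can be sketched early, even before the warehouse exists, by running the rule checks against a lightweight stand-in database. The tables, sample rows and business rules below are hypothetical; an in-memory SQLite database plays the role of the warehouse.

```python
# Sketch: a rough test script validating data-model constraints against
# business rules. Tables, rows and rules are hypothetical; an in-memory
# SQLite database stands in for the warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (sale_id INTEGER, customer_key INTEGER, amount REAL);
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO fact_sales VALUES (10, 1, 250.0), (11, 3, -40.0);
""")

# Business rule 1: every fact row must reference an existing customer.
orphans = conn.execute("""
    SELECT COUNT(*) FROM fact_sales f
    LEFT JOIN dim_customer d ON f.customer_key = d.customer_key
    WHERE d.customer_key IS NULL
""").fetchone()[0]

# Business rule 2: sales amounts must be non-negative.
negatives = conn.execute(
    "SELECT COUNT(*) FROM fact_sales WHERE amount < 0").fetchone()[0]

print(f"orphan fact rows: {orphans}, negative amounts: {negatives}")
```

The same queries can later be pointed at the real warehouse, so the draft scripts written at the requirements stage become the seed of the full data-quality test suite.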
Alena leads efforts associated with data quality improvements and functional testing of large business intelligence and data warehouse enhancement projects. As a lead consultant, Alena participates throughout the software development lifecycle, applying her strong knowledge of DW/BI models to help solve complex business problems. In her position, Alena interfaces with cross-functional teams to manage defects and compile test results, communicating issues, risks and status to ensure Coherent's solutions meet business requirements.