Quality Assessment and Testing

Everything around us, everything we interact with, everything we do, has some sort of quality control aspect to it. When we order food, we expect that a tried and tested formula has been used and that if we ordered the same meal again next time, it wouldn't change drastically, and definitely not for the worse. When we buy a new phone, we expect that the specifications advertised are what we will get and that everything will work as it is meant to. When we bake at home (if you bake), we follow the instructions in front of us because we trust that the recipe has been tested and will give us the best quality dessert. You will be hard-pressed to find something that doesn't require some level of quality assessment and testing, and the workflows you build as a data analyst are no different.

ISO 9000 defines quality assurance as "a part of quality management focused on providing confidence that quality requirements will be fulfilled".

When thinking about quality assurance and what it means, there is a wide range of attitudes and personalities. Some people feel that getting a finished product out quickly is more important: one can always go back and fix the problems later. Others believe that taking things 'slow' (which is relative) and quality testing along the way is better. It is important to note that there is no hard and fast rule here, and it should be decided on a case-by-case basis.

Whichever way you lean, quality testing should be done at some point because:

  • You need to make sure what you've built can stand the test of time, i.e. will it still work in the future? (for example, an Alteryx workflow)
  • You want to ensure that the people you are submitting your work to have confidence in you and what you produce
  • It prevents things from breaking later on, because you've already tested them.

There are different methods for quality assurance:

  1. Have a check log. List everything that's meant to happen at each step and what you expect to see to know that it's working. As you go along, tick off each successful step and fix the ones that aren't successful there and then. Once you're happy that everything has been tested, give it to another person to also run tests. They may notice gaps you may have missed, or they may just validate what you've already done.
  2. Create a test script. When you've spoken to your stakeholder (the person/team/company you're building the process for) and you're clear on what the process is meant to achieve, create dummy information to put through the process. Then create a manual, similar to what you might give to your stakeholder (note that it may not be a written manual), and give it to someone else to follow, then let them give you feedback.
  3. I've tried to make this blog applicable to various scenarios; however, if you're specifically building a workflow in Alteryx, use the Browse tool to view a summary of the data and look for gaps in the output. You can also use the Summarize tool or Field Summary tool to get descriptive statistics for the data in each column.
  4. Reconciliation. This is really useful if you're trying to make an improvement to something that already exists. Once you've built the upgrade, run the old data through it and see if you get the same results. Of course, this only works if you're sure the old results are correct😅. Tableau is a great tool for doing some reconciliation.
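The check log in method 1 doesn't have to stay on paper. Here's a minimal sketch of the same idea in Python; the step names, the sample data, and the checks are made up for illustration, so swap in the steps of your own workflow:

```python
# A check log as code: each step pairs a name with a check that should
# pass if that part of the workflow is working. Run them in order and
# fix any failures there and then, just like ticking off a paper list.

def run_check_log(steps):
    """Run each (name, check) pair and record whether it passed."""
    results = {}
    for name, check in steps:
        try:
            passed = bool(check())
        except Exception:
            passed = False  # a crashing check counts as a failure
        results[name] = passed
        print(f"{name}: {'PASS' if passed else 'FAIL - fix before moving on'}")
    return results

# Hypothetical steps for a small data-cleaning workflow
data = [{"id": 1, "value": 10}, {"id": 2, "value": 12}]
steps = [
    ("input loaded", lambda: len(data) > 0),
    ("no missing values", lambda: all(r["value"] is not None for r in data)),
    ("ids are unique", lambda: len({r["id"] for r in data}) == len(data)),
]
results = run_check_log(steps)
```

Once every step passes for you, hand the same list to your second tester so they can look for the gaps you missed.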
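The reconciliation step (method 4) can also be sketched in a few lines. This is an illustrative example, assuming both the old process and the upgrade produce rows keyed by an "id" field; the sample rows here are invented:

```python
# Reconciliation sketch: compare the old process's output with the new
# process's output on the same input, and flag any rows that differ.

def reconcile(old_rows, new_rows, key="id"):
    """Return a list of (key, old_row, new_row) for every mismatch."""
    old = {r[key]: r for r in old_rows}
    new = {r[key]: r for r in new_rows}
    diffs = []
    for k in sorted(old.keys() | new.keys()):
        if old.get(k) != new.get(k):  # covers changed, added, and dropped rows
            diffs.append((k, old.get(k), new.get(k)))
    return diffs

# Invented outputs: the upgraded process disagrees on one total
old_output = [{"id": 1, "total": 100}, {"id": 2, "total": 250}]
new_output = [{"id": 1, "total": 100}, {"id": 2, "total": 255}]

for k, before, after in reconcile(old_output, new_output):
    print(f"Mismatch for id {k}: old={before}, new={after}")
```

An empty result means the upgrade reproduces the old output exactly; anything else is a difference you need to explain before signing off.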

The bottom line is CHECK YOUR WORK!!!!!!

Author:
Angelica Obi
Powered by The Information Lab