
5 Tips to Improve Data Quality

The average person makes over 30,000 decisions every day. How many of these are informed decisions? On the commodity market, 100% of them need to be. To ensure they are, we turn to data.


Data helps us deduce market trends and visualise behaviour. These projections influence the outcomes of market analysis, portfolio optimisation, nomination, forecasting, and trading. But not all data is created equal. Incorrect data is not only costly; it can also spread misinformation that distorts the market.





The good news is that the problem of poor data is avoidable. This is where data quality comes in. Data quality, according to TechTarget, may be defined as “a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date”. Data quality, then, influences the legitimacy of a data-driven system. It ensures that the data-driven decisions you make every day are based on facts.


In this article, I argue that data quality should be part of the conversation from the very beginning. Developing a framework to assess data quality at every step of the project ensures that the final product delivers accurate and reliable results.


This article addresses 5 ways to ensure data quality is at the centre of your project:


1. Selecting trustworthy sources

2. Using data cleansing as a template for all data management

3. Choosing a reliable hosting framework

4. Leaving ample time for testing

5. Prioritising routine maintenance


With these best practices, you can build a framework for data quality management that is adaptable to change.


1. Validate your sources


Authenticating your data’s origins, or the origins of the data you would like to work with, is the first way to include data quality in your project's development.


Understanding where your data comes from helps you verify how trustworthy it is. Reliable data leads to reliable decisions, so thinking critically at the outset helps avoid transparency problems later on.


You will want to define a detailed process to assess source quality. Here are some preliminary considerations:


  • Pre-vetted resources to serve as benchmarks

  • Specifications on existing data reserves

  • Minimum data requirements to complete the project

  • Specifications on public vs private providers

Once you have considered these details, you may then evaluate the data case-by-case.
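
If it helps to make the case-by-case evaluation concrete, here is a minimal sketch of how each assessment could be recorded, assuming a Python workflow; the field names are illustrative, not a standard:

    from dataclasses import dataclass

    # A hypothetical record for assessing one source against the considerations
    # above; adapt the fields to your own vetting process.
    @dataclass
    class SourceAssessment:
        name: str
        provider_type: str      # "public" or "private"
        benchmark: str          # the pre-vetted resource used for comparison
        update_cadence: str     # when and how the data is refreshed
        meets_minimum: bool     # covers the project's minimum data requirements
        trusted: bool           # the governance verdict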


Be sure to think critically about data governance at this stage: who was responsible for cleansing and shipping this data to the corresponding system? Do we trust them? When and how is the data updated? And, perhaps most importantly, can we afford to do without this data and still complete this project?





Trust is an integral part of the decision-making process. Managing data before it enters the pipeline prevents your system from becoming a liability or developing into a security risk.


Check and double-check your sources at this stage, and you can feel confident about the groundwork of your project.


2. Set the tone with data cleansing


Once you have collected your sources in the database, it is time to evaluate your data sets. You will want to go through your data with a fine-toothed comb in a process called data cleansing. This step allows you to detect outliers and correct any mistakes so that the data you are working with can be manipulated effectively.


In the data cleansing process, you are looking for errors, duplicates, and missing data. You are also addressing formatting and other structural components for consistency. You can perform this process either manually or through automation. And once the initial cleaning has been completed, you can create an algorithm that completes this process for you in the case of future updates to the data set.
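
To make that concrete, here is a minimal sketch of one automated cleansing pass. It assumes a tabular data set loaded with pandas and a hypothetical price column; the rules themselves should come from your own domain knowledge:

    import pandas as pd

    def cleanse(df: pd.DataFrame) -> pd.DataFrame:
        """One cleansing pass: duplicates, types, missing values, outliers."""
        df = df.drop_duplicates()  # remove exact duplicate rows

        # Enforce a consistent numeric type; unparseable values become NaN...
        df["price"] = pd.to_numeric(df["price"], errors="coerce")
        df = df.dropna(subset=["price"])  # ...and are dropped here

        # Treat values more than three standard deviations from the mean as
        # outliers (a deliberately simple rule; pick a test that fits your data).
        z = (df["price"] - df["price"].mean()) / df["price"].std()
        return df[z.abs() <= 3]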


Although data cleaning is typically a one-off event, you will want to use this opportunity to set the tone for data analysis, which will be repeated many times throughout the development and maintenance of the project. Sources may change their structure, so developing a routine to account for new developments ensures that the database will always return the right data. Use the best practices of data cleaning to inform your data analysis approach.
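
One way to build that routine, sketched here for a tabular source with a hypothetical expected schema, is to check the structure of every delivery before it enters the database:

    import pandas as pd

    # Hypothetical expected schema: column name -> dtype the pipeline relies on.
    EXPECTED_SCHEMA = {
        "delivery_date": "datetime64[ns]",
        "hub": "object",
        "price": "float64",
    }

    def schema_problems(df: pd.DataFrame) -> list[str]:
        """Return a list of structural problems instead of silently ingesting them."""
        problems = []
        for column, dtype in EXPECTED_SCHEMA.items():
            if column not in df.columns:
                problems.append(f"missing column: {column}")
            elif str(df[column].dtype) != dtype:
                problems.append(f"{column}: expected {dtype}, got {df[column].dtype}")
        return problems

If the list comes back non-empty, stop the import and investigate: a changed source structure is exactly the kind of development this routine is meant to catch.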





With all elements of polishing the data, the goal is the same: to develop a trustworthy home base for your data so that it can deliver on its intended purpose. These are the raw materials that will begin to tell the story of your business. By developing a data set that is clean and organised, you will be prepared for seamless implementation.


3. Choose a reliable hosting framework


Data collection and aggregation wield a lot of power in defining a project. But without a proper destination, the decisions that data can inform will miss their potential. In order to avoid that, take the time to define the requirements for the system based on your business needs.


To find the ideal endpoint, think through all aspects of the problem first, then investigate viable solutions. At appygas, design thinking is the first step we take when we approach a challenge.



A problem-first approach will clarify the goals and define the main questions you are asking of the data. Having a clearer understanding of the problem will help outline the key requirements that will allow you to select the right system for your needs.


4. Leave time for testing


Whether or not a system works is the primary indicator of project success. Test your work. Create a checklist or spreadsheet that lists every widget and data point, and have each member of your team go through it three times: once to ensure all data points have been covered, and (at least) twice during the validation process.
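
The coverage pass can be automated. Here is a minimal sketch, assuming a hypothetical checklist.csv with a data_point column and a fetch_value() function that looks each data point up in your system:

    import csv

    def run_checklist(path: str, fetch_value) -> dict[str, bool]:
        """Mark each data point in the checklist as covered or failing."""
        results = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                name = row["data_point"]
                try:
                    results[name] = fetch_value(name) is not None
                except Exception:
                    results[name] = False  # a failed lookup is a failed check
        return results

Automating the coverage pass lets your team's manual passes focus on whether the values look right, not just whether they exist.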


While this is not the only chance to check for errors, the go-live of a data-centric project has the most impact on usability. Mistakes at this stage prevent the intended users from seeing the value in the product. Time and resources went towards making the project answer the users’ pain points: you will want all that work to pay off.


This best practice will take longer than you might think. Allot enough time to compensate for bugs, faulty data, and other weaknesses in the system.


5. Prioritise maintenance


If there is one key takeaway to inform your data-driven project, let it be this: quality comes not from how you store the data, but from how you use it to answer core business questions. For this reason, having a maintenance process in place ensures data quality long after the initial deployment is over.


We live in a time of high technological turnover. Technologies evolve quickly, and changes like software updates and bug fixes keep the system running smoothly. As these systems change and new technologies are introduced, regular maintenance allows you to recognise these changes and respond to them.


Reviewing your existing software also helps you secure your system against external threats. Older versions may contain bugs that have since been fixed in newer releases, or security flaws that were discovered but never corrected in the version you are running.


Updates are intended to replace the older versions, and many developers will not patch fixes to older versions, instead offering a new version to replace the old. If you do not update to the new systems, your data—and your organisation—are left vulnerable.






Try adapting the checklist or spreadsheet you made for the testing phase into a maintenance schedule. Include in your documentation an alert system for when developers release updates (a small sketch follows below). Then, you can schedule regular performance reviews to assess the project’s progress and what might be needed to keep it relevant over time. With the right preparations, you ensure a process that adapts to change and responds with agility to vulnerabilities.
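
As one small example of such an alert, sketched under the assumption of a Python stack and a pinned dependency list you maintain yourself, you can flag when installed versions drift from the ones you last validated:

    from importlib.metadata import version

    # The versions validated at the last maintenance review (illustrative values;
    # substitute your own pinned list).
    PINNED = {"pandas": "2.2.3", "requests": "2.32.3"}

    def version_drift() -> dict[str, tuple[str, str]]:
        """Report packages whose installed version differs from the pinned one."""
        drift = {}
        for package, pinned in PINNED.items():
            installed = version(package)  # raises PackageNotFoundError if absent
            if installed != pinned:
                drift[package] = (pinned, installed)
        return drift

Anything the check reports becomes an agenda item for the next performance review.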


Data quality is not about short-term wins: it's about long-term success. Centring your project on quality ensures that the final system is functional, flexible, and durable. With these three key components, you are guaranteeing a solution that delivers not only data quality, but data excellence.


Interested in seeing how the experts do it? Book a discovery workshop with appyStudio to find out how you can achieve a higher level of data quality in your organisation.

