Our client needed a modern, transformative data architecture to reach the next level of analytics, including AI.
Our client, a nationwide construction firm, knew they could get more out of their existing data, especially in this era of AI and machine learning. However, their existing architecture was unwieldy: reports were difficult to create and time-intensive to pull, and data quality was suspect. They needed a modern data solution to ensure they were getting the most value and insight from their data.
The client’s existing data warehouses simply did not provide the quality, timeliness, or ease of use that their modern business needed.
Because of the multitude of data sources and the variety of information in each, many departments claimed their data was the “source of truth” for reporting and analysis.
As often happens when data is spread across many sources, there was very little alignment on definitions, access, or how to resolve data issues as they arose.
The existing data warehouse was inadequate for the company's needs: pulling data was slow and difficult, with requests sometimes timing out entirely, and it provided little ability to drill down into the data for further analysis.
As a consequence of the many data sources, the lack of governance, and the concerns about timeliness, many end users didn't trust reporting to be accurate or to provide much insight into the business.
Given the challenges above and our client’s desire for advanced analytics in the future, we designed a Medallion Architecture, breaking the data into three tiers: bronze, silver, and gold, depending on the data’s use and readiness for analytics. We also knew we needed to establish clear data governance guidelines to keep the same quality issues from arising again.
This new architecture incorporated several technologies, including Azure Data Factory, Databricks, dbt Cloud, and Power BI for data visualization.
All of the difficult-to-source data is pulled into a single Azure data lake, from which raw data can be siphoned off and transformed for analysis.
Using Databricks, we designed and launched a new Medallion Architecture. This tiers data into a bronze layer, where the focus is on data quality and accuracy; a silver layer, where usability and discoverability are enhanced; and a gold layer, where the data is ready for more advanced analytics.
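To make the relationship between the three layers concrete, the sketch below shows a minimal PySpark flow of the kind that runs on Databricks. It is an illustration only: the storage path, catalog, table, and column names (projects_raw, project_id, contract_value, and so on) are our own assumptions, not the client's actual schema.

```python
# Minimal sketch of a medallion flow in PySpark on Databricks.
# All paths, table names, and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw project records from the data lake with no reshaping.
raw = spark.read.json("abfss://landing@datalake.dfs.core.windows.net/projects/")
raw.write.mode("append").saveAsTable("bronze.projects_raw")

# Silver: enforce types, remove duplicates, and standardize keys for discovery.
silver = (
    spark.table("bronze.projects_raw")
    .dropDuplicates(["project_id"])
    .filter(F.col("project_id").isNotNull())
    .withColumn("start_date", F.to_date("start_date"))
)
silver.write.mode("overwrite").saveAsTable("silver.projects")

# Gold: business-ready aggregates that dashboards can query directly.
gold = (
    silver.withColumn("year", F.year("start_date"))
    .groupBy("region", "year")
    .agg(F.sum("contract_value").alias("total_contract_value"))
)
gold.write.mode("overwrite").saveAsTable("gold.project_value_by_region")
```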
To move data smoothly through the transformation process, we created data pipelines using dbt, improving both validation and the drill-down capabilities of downstream reporting.
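In the actual build, that validation lives in dbt models and tests written in SQL. Purely as an illustrative stand-in, with hypothetical table and column names rather than the client's, a comparable quality gate expressed in PySpark might look like this:

```python
# Illustrative quality gate (hypothetical names): stop the run before
# promoting data if basic silver-layer rules are violated.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
silver = spark.table("silver.projects")

null_keys = silver.filter(F.col("project_id").isNull()).count()
negative_values = silver.filter(F.col("contract_value") < 0).count()

if null_keys or negative_values:
    raise ValueError(
        f"Silver validation failed: {null_keys} null keys, "
        f"{negative_values} negative contract values"
    )
```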
The ultimate goal of this project was to deliver usable, insightful reporting that non-technical business analysts and other stakeholders could understand. To do that, we used Power BI to create new dashboards that let users dig into the root causes behind key KPIs.
With the new data architecture in place, our client gained a level of access to their data, and confidence in their insights, that they had never had before.
With Power BI, we streamlined 11 pre-existing reports into two dashboards with improved drill-down and drill-through functionality, saving users time and headaches, while providing greater access to their business-critical data.
One of the key issues with the old data setup was the time required to pull reports, which often timed out and refused to load. All of the new dashboards load within 5 seconds.
By implementing new data governance and security features within our data lake and dashboards, users are assured that the data they are using is accurate and reliable, and no one has access to data they shouldn’t.
Thanks to the new data architecture, data is structured and prepared to be used in AI scenarios such as natural language querying and additional AI-powered analytics.