Consolidated Data Quality for J&T Banka

J&T is one of Central Europe's most well-known banks and the most important private banking and wealth management institution in the region. As such, their clients expect a very high degree of quality from their financial products. Since adopting Accurity, J&T Bank has raised the quality of their reporting to the ECB and other national and international regulators to that same degree!

Challenge

As a major bank, J&T is beholden to regulatory requirements from national banks as well as the European Central Bank (ECB), such as AnaCredit or BCBS 239, the Basel Committee on Banking Supervision's standard number 239 on principles for effective risk data aggregation and risk reporting. Low-quality or inaccurate data reported to these authorities can result in significant fines, so for J&T Bank, ensuring data quality was a priority from day one.

They originally measured their data quality using an internally developed approach: a set of scripts that checked the quality of data in specific data attributes. However, further investigation found that this approach had many flaws at the conceptual design level and was not an effective solution for ensuring regulatory compliance of the bank's reporting.

Moreover, the original data quality approach lacked any form of data quality reporting that would inform data quality specialists and decision makers alike of the overall state of J&T's data quality, especially when compared to regulatory requirements and top management's KPIs. It was also impossible to efficiently track the ownership of data quality rules, measurements, checks, and data attributes. This led to further confusion over who was responsible for the quality of specific data, creating conflict within the company.

It is important to mention that the process of checking data quality through measurements, rules, and checks, and the subsequent analysis of the results, has not changed since then. In terms of tooling, the only change was replacing the original approach based on internally developed scripts with the Accurity data quality management platform. What did change with the deployment of the Accurity platform, however, are the processes of defining and creating data quality rules, checks, and measurements, and much of that change relates to data entry.

All in all, J&T Bank was faced with improving the quality of the data feeding their regulatory reporting while dealing with an overly complicated data quality measurement approach that consumed a great deal of money, people, time, and effort to maintain.

Primary issues included:

  • Inefficient internal data quality measurement approach
  • Low-quality data creeping into regulatory reporting
  • Confusion and conflict over data ownership and responsibility

Project in numbers

J&T Bank suffered from low-quality data creeping into their regulatory reporting, risking regulatory fines in the process. They set out to find a tool that would help them not only detect low-quality data efficiently but also prevent low-quality data from being created in the first place!

867+

data quality rules being run daily

10+

data sources containing regulatory data checked daily

80%

of data quality requirements come ad hoc from the regulatory department and are immediately checked against data quality rules

A few runs

is all it takes to fully polish a new data quality rule and eliminate the data quality issues of a data attribute

Objectives

It was estimated that one of the primary sources of low-quality data was the point of data entry. The processes are designed so that lower-level employees from commercial business units are responsible for entering the data that is later used to feed regulatory reporting. One of the objectives, therefore, was to increase awareness of data quality standards among data entry personnel.

In terms of responsibility for data quality, the stakeholder structure is relatively flat. The entire data quality initiative in the bank falls under the responsibility of a division director, and individual business units, headed by their respective department directors, oversee the data they contribute to the overall regulatory reporting. They are also responsible for defining data quality requirements relating to business metrics and risk management. However, while they are ultimately responsible for the end quality of the data, it is not they who create the data quality rules and measurements. That task falls to the BI department, which does the majority of the data quality management work on its own. One of the goals was therefore to ensure that responsible individuals outside the BI department become aware of the data quality rules and standards demanded by the regulatory department, in order to improve the efficiency of data quality management and avoid the rising costs of low-quality data making its way into regulatory reporting.

Lastly, not all data attributes used for regulatory reporting were covered by the data quality initiative. The stakeholders' goal is to gradually expand the scope of data quality management to cover all necessary regulated data attributes.

Having identified these inefficiencies, the main objectives of implementing higher data quality standards were clear: to create one system of consolidated, standardized, and efficient data quality rules that business units can rely on and easily gain insight from, and that would also support compliance with any kind of regulation.

Solution

J&T Bank decided to retire the internal approach to measuring data quality and instead adopt the Accurity data quality and data observability solution as a better way of ensuring the long-lasting fulfillment of their data quality needs.

Within Accurity, the bank's team was able to account for data attribute ownership by using the aggregate check concept to build hierarchies of existing data quality rules, checks, and measurements relevant to a particular business unit and stakeholder. For new data quality rules, identifying the owner of each data attribute is built into the process of defining the rule.
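To picture the idea, an aggregate check can be thought of as a node in a tree that rolls up the results of the rules, checks, and measurements beneath it for one business unit and its stakeholder. The following is a minimal Python sketch of that concept only; the class names, fields, and figures are illustrative assumptions and do not represent Accurity's actual object model or API.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical model of an aggregate check hierarchy; names and numbers
# are illustrative and do not reflect Accurity's implementation.

@dataclass
class RuleResult:
    rule_name: str
    owner: str              # stakeholder responsible for the underlying data attribute
    passed_rows: int
    failed_rows: int

@dataclass
class AggregateCheck:
    name: str
    owner: str                                          # e.g. a business unit director
    results: List[RuleResult] = field(default_factory=list)
    children: List["AggregateCheck"] = field(default_factory=list)

    def all_results(self) -> List[RuleResult]:
        """Collect every rule result below this node in the hierarchy."""
        out = list(self.results)
        for child in self.children:
            out.extend(child.all_results())
        return out

    def pass_rate(self) -> float:
        """Roll up the pass rate across all rules under this check."""
        results = self.all_results()
        passed = sum(r.passed_rows for r in results)
        total = passed + sum(r.failed_rows for r in results)
        return passed / total if total else 1.0

# Usage: a bank-wide check aggregating one business unit's rule results.
unit = AggregateCheck(
    "Credit risk unit", owner="Credit Risk Director",
    results=[RuleResult("counterparty_id not null", "Credit Risk Director", 9800, 200)],
)
bank = AggregateCheck("Regulatory reporting", owner="Division Director", children=[unit])
print(f"{bank.name}: {bank.pass_rate():.1%}")   # Regulatory reporting: 98.0%
```

Structuring ownership this way means every rule result can be traced upward to the stakeholder accountable for it, which is the effect the aggregate check hierarchies were used to achieve.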

New data quality rules are defined based on ad hoc requirements from the ECB and other regulators, processed by the regulatory department. Over 80% of all data quality rules currently running in J&T Bank were created this way. The other 20% of requirements come from various business and IT stakeholders who want to check data related to specialized metrics.

For stakeholders and business decision makers, Accurity reporting dashboards were used to create a set of data quality reports that inform them about the fulfillment of data-quality-based KPIs and the development of data quality trends over time.

As mentioned earlier, the process of checking data quality has remained the same, and ensuring sufficient quality of the data their department produces has stayed within the business unit leaders' KPIs. Each stakeholder is given a set amount of time to ensure the correction of a data quality issue within a data attribute belonging to them; such issues mostly originate with one of their lower-level employees responsible for data entry.

With regard to the quality issues discovered at data entry, the bank started an enforcement campaign that drew on the insights from individual data quality issues discovered. Data entry professionals were instructed in the quality standards required of the data they enter. J&T's intention was to improve the overall quality of data by fixing the systemic issues leading to low data quality rather than merely treating the symptoms of those issues.

In order to double-check much of the data contained in the attributes marked as feeding regulatory reporting, a number of data integrity checks were established to compare the data in those attributes against the same data held in a different data source.
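Such a data integrity check can be thought of as a reconciliation between two sources: any record whose value differs, or that exists in only one source, is flagged. Below is a small Python sketch of that general technique using pandas; the source names and columns (loan_id, outstanding_amount) are invented for illustration, and the logic is an assumption about how a cross-source check typically works, not a description of Accurity's implementation.

```python
import pandas as pd

# Hypothetical reconciliation of a regulatory attribute across two data sources.
# Table and column names are illustrative only.

def integrity_check(primary: pd.DataFrame, reference: pd.DataFrame,
                    key: str = "loan_id",
                    attribute: str = "outstanding_amount") -> pd.DataFrame:
    """Return rows where the attribute differs between the two sources,
    or where a record exists in only one of them."""
    merged = primary.merge(reference, on=key, how="outer",
                           suffixes=("_primary", "_reference"), indicator=True)
    return merged[
        (merged["_merge"] != "both")
        | (merged[f"{attribute}_primary"] != merged[f"{attribute}_reference"])
    ]

# Usage with toy data standing in for a core banking system and a reporting warehouse.
core = pd.DataFrame({"loan_id": [1, 2, 3], "outstanding_amount": [1000.0, 2500.0, 400.0]})
warehouse = pd.DataFrame({"loan_id": [1, 2, 4], "outstanding_amount": [1000.0, 2600.0, 900.0]})
issues = integrity_check(core, warehouse)
print(issues[["loan_id", "outstanding_amount_primary",
              "outstanding_amount_reference", "_merge"]])
```

The flagged rows are exactly the kind of discrepancies such integrity checks surface for follow-up by the attribute's owner.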

Benefits

According to J&T Bank executives, their overall data quality has improved significantly since deploying Accurity. In their words, it is virtually impossible to compare data quality before and after, because previously they were unable to properly measure anything, especially in terms of data owners.

The benefits of the new initiative have been recognized within J&T not only by IT but also by business users and executives. User-friendly dashboards provide a clear understanding of rule results and data quality trends, and awareness of the importance of data quality has been raised at all levels.

New data quality issues are detected every time a newly created data quality rule is run in Accurity for the first time. Thanks to Accurity's data quality issue tracking, the BI department can easily locate the business unit that owns the data, the stakeholders responsible for it, and the employees behind the incorrect data entry. This has led to a major increase in awareness of data quality standards among lower-level commercial employees. These data entry specialists learn from the detected issues and do not repeat the mistakes that lead to poor data quality.

Thanks to this development, and to Accurity's role as a crucial facilitator of discussion about data quality across all business and technical levels within the bank, J&T's BI department has set out to cover every data attribute in the organization with Accurity data quality rules in order to eliminate low data quality entirely at the data entry level.

Because of the data ownership information stored within Accurity, conflicts over responsibility for data quality issues no longer arise, and data ownership can be unambiguously assigned to the correct stakeholder, significantly reducing the time needed to resolve a data quality issue.

The inclusion of data quality reporting and trend charts in Accurity also gave J&T access to a treasure trove of operational information. With these newly acquired insights, the bank’s BI department now states that lowering the time needed to fix low-quality data is its primary goal for the future.

Thanks to the awareness Accurity helps spread about data quality, we no longer encounter the data quality issues that we fixed in the past. Data quality rules have become enshrined in the data entry process. I firmly believe that if we had rules over all our data, there would be no bad data quality in our reporting whatsoever.

Daniel Grund

Head of BI & DWH

Want to see Accurity for yourself?

Get a free personalized 1-on-1 demo.

GET YOUR DEMO