The Data Quality Framework in the light of the Future Pensions Act (WTP)

Date: May 13, 2024

Data quality is a key factor for the sound and controlled operation of pension funds. It enables pension funds to give participants what they are entitled to.

The transition to the Future Pensions Act (WTP) puts even more emphasis on data quality in pension administration. According to the Future Pensions Decree, pension funds must demonstrate that data quality is secured before the transition to the new pension system can take place. This is important for making a balanced entry decision and for the correct conversion of pension entitlements to capital ('invaren').

The Data Quality Framework – Future Pensions Act (hereafter: the Framework) provides pension funds with tools to make the quality of their data transparent, substantiate it, correct it ('get clean') and safeguard it ('stay clean'). In this article, we discuss the challenges posed by the Framework and the tools to meet them.

The Data Quality Framework

The Framework, published by the Pension Federation, consists of six phases:

  1. Data quality set-up: formulating data quality policy and identifying critical data elements.
  2. Risk inventory and assessment: analysing relevant risk factors, including the data quality management framework and specific participant risks.
  3. Data analysis and partial observations: performing data analysis to identify outliers and anomalies in data.
  4. Reporting and review: preparing a report for the board with findings and action plans.
  5. Specific work by external auditor: engaging an external auditor for verification of the work performed. 
  6. Decision on data quality for entry ('invaren'): decision-making on data quality by the pension fund board.
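Phase 3 (data analysis) can be illustrated with a minimal sketch. The records, field names and checks below are hypothetical, chosen only to show the principle of flagging outliers and anomalies in critical data elements; a real analysis would cover far more elements and rules.

```python
from datetime import date

# Hypothetical participant records; field names are illustrative only.
participants = [
    {"id": 1, "birth_date": date(1960, 5, 1), "start_date": date(1985, 3, 1), "accrued_pension": 14_200.0},
    {"id": 2, "birth_date": date(1990, 7, 12), "start_date": date(1980, 1, 1), "accrued_pension": 9_800.0},
    {"id": 3, "birth_date": date(1975, 2, 20), "start_date": date(2001, 9, 1), "accrued_pension": -500.0},
]

def profile(records):
    """Flag basic anomalies in critical data elements."""
    findings = []
    for r in records:
        # An employment start date before the birth date is logically impossible.
        if r["start_date"] < r["birth_date"]:
            findings.append((r["id"], "start date precedes birth date"))
        # Accrued pension entitlements cannot be negative.
        if r["accrued_pension"] < 0:
            findings.append((r["id"], "negative accrued pension"))
    return findings

print(profile(participants))
# [(2, 'start date precedes birth date'), (3, 'negative accrued pension')]
```

Each finding feeds into the reporting of phase 4, where the board decides on follow-up and corrections.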

Challenges within the Data Quality Framework

Pension funds have already invested a lot in recent years to improve data quality. The good news is that pension funds do not have to start from ‘scratch’. However, the Framework does bring some key challenges.

Challenge: peak load

Implementing the six phases requires substantial capacity from pension funds and pension administration organisations (PUOs). This includes, for instance, collecting the work carried out so far and re-using it within the Framework. In-depth analysis and documentation of data quality is required, with on average more than 60 critical data elements to be assessed. The pension fund also needs to conduct data analysis (from data profiling to partial observations), which is mostly carried out by the PUO. Since PUOs often do this for multiple pension funds, this can lead to peak loads and limited availability of data analysts.
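Partial observations typically mean drawing a sample of participant records for manual file review. A minimal sketch, assuming a simple ID-based population (the function name and sample size are illustrative): seeding the random generator makes the selection repeatable, which helps the auditor reproduce the sample.

```python
import random

def draw_partial_observation(ids, sample_size, seed=42):
    """Draw a reproducible random sample of participant IDs for file review.

    A fixed seed makes the selection auditable and repeatable.
    """
    rng = random.Random(seed)
    return sorted(rng.sample(ids, sample_size))

population = list(range(1, 10_001))  # hypothetical participant IDs
sample = draw_partial_observation(population, 25)
print(len(sample))  # 25
```

Because the sample is deterministic for a given seed, the PUO and the external auditor can independently verify that the same records were reviewed.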

Advice: peak load

To meet these challenges, early preparation is essential. If pension funds start preparing for data quality work early, they avoid peak workloads and project overruns. In addition, close cooperation between implementers, auditors and IT auditors keeps the complexity and risks of the process manageable.

Challenge: IT processes

IT and data go hand in hand. To analyse data, it is advisable to use advanced data analysis techniques and AI (machine learning, data science). Recent research suggests a significant risk that IT projects at pension administration organisations and self-administered funds will overrun: more than 70% of respondents expect that the majority of front-runners will not transition on 1 January 2025, mainly due to implementation and IT-related obstacles. This also affects the turnaround time and completion of the Framework.

Advice: IT processes

Pension funds can address challenges in IT processes by investing in advanced and proven technologies. In addition, thorough planning, collaboration with experienced IT partners, continuous monitoring, and flexibility are required. Going through the Framework should not be treated as a stand-alone exercise, but managed integrally through project and portfolio management.

Challenge: timeframe

Many pension funds have a long history; some funds even have a data history going back more than 70 years. Data was stored very differently back then, and often not all historical information has been preserved. This raises the question: how far back should a pension fund look to get a clear picture of the quality of its data?

All in all, this is a considerable challenge for pension funds, at a time when the current pension administration must keep running and the new pension scheme is being set up in parallel. Pension funds are not starting from scratch, but going through the Data Quality Framework remains a significant undertaking.

Advice: timeframe

Pension funds should substantiate to which year their data is traceable, in other words, up to which point the data can still be compared with the original source. For data with limited traceability, we recommend carrying out plausibility analyses to assess the credibility of the data. If certain data is no longer retrievable, the pension fund should perform a risk analysis of the impact of the missing data, to demonstrate that the risk of the unavailable data falls within the fund's risk appetite.
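A plausibility analysis can be sketched as a recomputation check: compare a recorded value against a rough independent estimate and flag deviations beyond a tolerance. The accrual formula, parameter names and 10% tolerance below are purely illustrative assumptions, not prescribed by the Framework.

```python
def plausibility_check(accrued, years_of_service, accrual_per_year, tolerance=0.10):
    """Compare a recorded accrued pension to a rough recomputation.

    Returns True when the recorded value is within the relative tolerance
    of the estimate. All parameters are illustrative assumptions.
    """
    expected = years_of_service * accrual_per_year
    if expected == 0:
        return accrued == 0
    return abs(accrued - expected) / expected <= tolerance

print(plausibility_check(14_000, 20, 700))  # matches the estimate exactly: True
print(plausibility_check(20_000, 20, 700))  # deviates by roughly 43%: False
```

Records that fail the check are not necessarily wrong; they are candidates for the risk analysis described above.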

We also recommend determining the time horizon based on the available data and previous analyses performed by the pension fund. Take, for example, a pension fund that has moved to a new pension administration organisation (PUO) and has conducted data research in that context. If this research shows that data up to a specific year has already been analysed, consideration may be given to not including this period within the scope of the Framework.


The listed challenges are obviously not an exhaustive list. What is central, however, is that timely preparation, intensive cooperation, and direction are key to successful implementation of the Framework. We would be happy to have a cup of coffee with you to exchange views on this.