Digitalising government is essential for Europe’s future innovation and economic growth, and one of the keys to achieving this is open data – information that public entities gather, create, or fund, made freely accessible for anyone to use.
This includes everything from public budget details to transport schedules. Open data’s benefits are vast – it fuels research, boosts innovation, and can even save lives in wartime, for instance through chatbots that share bomb shelter locations. Its economic value is estimated to reach EUR 194 billion for EU countries and the UK by 2030.
This is why correctly measuring European countries’ progress in open data is so important – and why the European Commission developed the Open Data Maturity (ODM) ranking, which annually measures open data quality, policies, online portals, and impact across 35 European countries.
Alas, it doesn’t work as well as it should, and this needs to be addressed.
A closer look at the report’s overall approach reveals that the ranking hardly reflects countries’ real progress on open data. Rather than guiding countries towards genuine improvement, this flawed system risks misrepresenting their actual progress and misleading citizens about their country’s advancements, which further stalls opportunities for innovation.
Take Slovakia. It’s apparently the biggest climber, leaping from 29th to 10th place in just over a year. One would expect that the country has made significant progress in making public sector information available and stimulating its reuse – one of the ODM assessment’s key elements.
A deeper examination reveals that this isn’t the case. Looking at the ODM’s methodology highlights where it falls short… and how it can be fixed.
More than a few problems
First, the ranking relies on self-assessment by competent national administrations. This can lead respondents to exaggerate their progress, which in turn skews the ranking.
Paradoxically, less transparent administrations can more easily overstate their progress, as it’s harder to fact-check their answers. Recent research exploring the impact of international rankings on public opinion has shown that citizens tend to assess a country’s performance more positively when informed of a high ranking. That could explain why some national administrations may feel pressured to inflate their answers to appear to perform better than other countries.
What’s more, a lack of resources to fill in a rather long questionnaire (129 questions!) can leave otherwise open administrations with low scores even when the country has made significant progress. This happens because answering ‘I don’t know’ or ‘No’ results in an automatic zero.
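To see how this penalises non-response, consider a minimal sketch in Python. The point values and answers below are hypothetical – the real ODM weighting differs – but the mechanism is the same: an unanswered question scores exactly like a genuine ‘No’.

```python
# Illustrative scoring sketch: point values and answers are hypothetical,
# not the actual ODM weights.

def score_answer(answer: str, points: int) -> int:
    # 'No' and 'I don't know' both earn zero, whatever the reason.
    return points if answer == "Yes" else 0

# A country with real progress but no time to research its answers:
answers = ["Yes", "I don't know", "I don't know", "No", "Yes"]
total = sum(score_answer(a, points=4) for a in answers)
print(total)  # 8 of a possible 20 - 'unanswered' scores the same as 'absent'
```

Under such a rule, a thinly staffed but genuinely transparent administration can end up ranked below an opaque one that simply ticked every box.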
This was evident when reviewing the published questionnaires for 2022, where missing information across whole sections for some countries (e.g. Slovakia and Latvia) could have been easily resolved with a simple web search, especially for questions concerning examples of open data reuse cases or studies exploring open data’s impact in different sectors. This brings us to another critical issue – insufficient external research and response validation.
While the ODM method paper claims that questionnaire responses undergo a thorough three-round validation by the research team at data.europa.eu, in reality the validation seems far from robust. For example, countries are asked to provide examples of relevant, fully accessible studies assessing the impact of open data conducted in the last year – yet the references provided sometimes lead to findings that are three to four years old.
A telling example is Austria, where a student’s 2020 Master’s thesis (not even an official study or report from a respected institution) was provided as official evidence for two years in a row. This earned Austria maximum scores it didn’t deserve – a damning indictment, suggesting the country couldn’t find any more authoritative or up-to-date sources to corroborate its claims of excellence in open data.
Another issue with the ODM is that the methodology changes over time, and this affects year-on-year comparability. The most recent edition of the ranking (the ninth) features dimensions such as policy, impact, online portals, and quality, a major departure from the previous approach (up to 2018), which focused solely on two dimensions: open data readiness and portal maturity.
Importantly, it appears that before 2022, countries were not obliged to provide detailed answers: simply ticking the boxes was deemed sufficient. All this makes it very difficult to evaluate EU Member States’ progress over time, and it clearly shows that the ODM ranking has, at the very least, some big improvements to make.
A two-way fix
First, the data provided should be thoroughly cross-checked against third-party sources. For example, conducting surveys or consultations with stakeholders who use open data portals – such as software developers, academic researchers, or data journalists – would provide a better overview of the various open data initiatives and could yield insights into their practical impact.
Such a knowledge exchange would facilitate the development of more effective open data ecosystems, vital for stimulating the economic, social, and political benefits of open data initiatives. This would also streamline the task for national administrations, giving them a better understanding of the current situation without having to do extensive individual research.
Second, external validation must be strengthened to avoid moral hazard in self-assessment responses. Self-assessment can distort rankings, and it has proven to be an issue for several international ones, such as the World Bank’s Ease of Doing Business index, which was extensively criticised (and eventually discontinued) for methodological flaws. This was illustrated by a staggering improvement in Russia’s 2019 ranking despite worsening business conditions on the ground.
Sometimes, simply comparing responses to the previous year’s self-evaluation can improve the monitoring of progress. In other cases, however, a more robust and systematic approach is essential for validating the data provided by national administrations – especially when it comes to checking the quality of each published dataset, and even verifying that a dataset exists at all by following its dedicated URL on the country’s open data portal.
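Such existence checks are easy to automate. Below is a minimal Python sketch, assuming a list of dataset URLs harvested from a national portal; the URLs shown are placeholders, not real portal entries.

```python
# Minimal link-validation sketch: the URLs are hypothetical placeholders.
import requests

dataset_urls = [
    "https://data.example.gov/dataset/public-budget-2023",
    "https://data.example.gov/dataset/transport-schedules",
]

for url in dataset_urls:
    try:
        # A HEAD request is enough to confirm the dataset page still resolves.
        response = requests.head(url, timeout=10, allow_redirects=True)
        status = "OK" if response.status_code == 200 else f"HTTP {response.status_code}"
    except requests.RequestException as exc:
        status = f"unreachable ({exc.__class__.__name__})"
    print(f"{url}: {status}")
```

Run across every dataset listed in a country’s portal, a check like this would catch dead links at a scale that a manual three-round validation cannot realistically match.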
Through the ODM evaluation, European countries can be held accountable for their transparency, fostering trust and credibility. But citizens need a robust and legitimate ranking system to see how well their countries are actually doing. That’s why the European Commission should seriously rethink its ODM ranking – and, as we’ve seen, fixing it really isn’t an impossible task.