The crucial role of standards for geodata
Reflecting on the industry’s progress with Peter Parslow
During Peter Parslow’s term as chair of ISO/TC 211, the committee celebrated 30 years of developing and maintaining standards for the geospatial community. In this conversation with him, he reflects on the key milestones in the past three decades, including ISO 19115 on metadata and the new Data Quality Measures register, as well as the partnerships with the United Nations. He also looks to the future, touching on topics such as the integration of AI with geodata. His advice to the geospatial community? “Join in to drive the advancement of open standards to address future challenges!”
ISO/TC 211 has been developing and maintaining standards for the geospatial community for 30 years now. How pivotal has this been for the sector?
One key standard has been SQL Spatial, which means that a polygon or line representing land can be transferred between different database systems, such as PostgreSQL, without losing accuracy – the location remains consistent. Over time, this standard has become integrated into mainstream IT infrastructure. Originally developed within TC 211 and the Open Geospatial Consortium (OGC), it's now widely embedded in database systems and maintained by the organization responsible for SQL standards. It's a true success story of a standard becoming so fundamental that it operates invisibly: fully adopted but rarely noticed as it seamlessly supports everyday software.
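A concrete way to see this interoperability is the standardized well-known text (WKT) encoding that the SQL Spatial and Simple Features family of standards define for geometries. The toy round-trip below is a minimal sketch in Python using only the standard library – it handles just a single-ring POLYGON, whereas real systems (PostGIS, shapely and so on) implement the full encodings.

```python
# Minimal sketch: the SQL Spatial / Simple Features standards define text
# and binary encodings for geometries, so a parcel polygon written by one
# database can be read back identically by another. This toy parser handles
# only the single-ring POLYGON well-known-text (WKT) form.

def parse_wkt_polygon(wkt: str) -> list[tuple[float, float]]:
    """Parse a single-ring WKT POLYGON into a list of (x, y) vertices."""
    assert wkt.startswith("POLYGON"), "only POLYGON is handled in this sketch"
    ring = wkt[wkt.index("((") + 2 : wkt.index("))")]
    return [tuple(float(v) for v in pt.split()) for pt in ring.split(",")]

def to_wkt_polygon(ring: list[tuple[float, float]]) -> str:
    """Serialize vertices back to canonical WKT."""
    coords = ", ".join(f"{x:g} {y:g}" for x, y in ring)
    return f"POLYGON(({coords}))"

parcel = "POLYGON((0 0, 10 0, 10 5, 0 5, 0 0))"
ring = parse_wkt_polygon(parcel)
# Round-tripping through the standard encoding preserves the geometry exactly.
assert to_wkt_polygon(ring) == parcel
```

Because every conformant system agrees on this encoding, the polygon's vertices survive the transfer exactly – which is the invisible guarantee the standard provides.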
Another area where TC 211 collaborated with OGC was in producing the web service standards. These are part of an earlier generation that enabled people to start viewing maps on the web. Since then, companies – especially larger ones that don’t rely on those standards – have come in and taken that much further. Think of Google Maps, Bing Maps or Apple Maps, all with their own different approaches. The concept of putting maps on the web, however, was popularized thanks to open standards. These organizations are actively engaged with the standards community, and maybe we will herd ourselves back into a common picture at some point.
Things have modernized over the past 20 years, and there are now newer APIs for these services; OGC has been standardizing them, with TC 211 publishing some of them. This allows data publishers to create platforms with their own developers, instead of relying on big companies. Open standards have always been favoured by open-source communities, and there are also strong open-source solutions that rely heavily on these standards.
In the geospatial industry, tools like GeoServer and GeoNetwork are much purer implementations of these standards. They’ve enabled the democratization of geospatial technologies – you no longer need to invest in costly software to access these capabilities. I’m glad to see the Open Source Geospatial Foundation (OSGeo) and others popularizing those approaches to geospatial data.
One major recent achievement in standards development is the introduction of your new Data Quality Measures register. Could you elaborate on its purpose and scope?
Yes, this is quite a significant milestone. Unusually for standards development, and thanks to OGC's support in providing an implementation of the register, we've made it available for public review and comment ahead of time. This makes 2025 a key year for finalizing the functionality people want. The register remains open for feedback and will continue to improve. As for its purpose and scope: until now, standardized data quality measures have only existed as appendices buried in PDF files. By moving these measures into the data register, they are no longer locked away. This shift achieves two things: it makes the standards more accessible on the web, and it opens the door for others to define additional measures. Some of these new measures will be general enough to be accepted into the ISO register, while specialized domain communities can also maintain measures of their own. We've made these measures machine-readable and usable, which is the key aim of the pilot. This leads to a shift from human-only data quality assessments to machine-interpretable reports, allowing systems to recognize and validate data quality automatically. That supports machine interoperability and also aligns with the move towards AI.
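To make the idea of machine-interpretable quality reports tangible, here is an illustrative Python sketch. The field names and the measure entry below are invented for this example and do not reproduce the actual register schema (which follows ISO 19157); the point is that once measures are published as data rather than as PDF appendices, software can look a measure up and validate a quality report automatically.

```python
import json

# Hypothetical, simplified register entry – field names are illustrative
# only, not the real register schema.
measure_register = {
    "measure-47": {
        "name": "rate of missing items",
        "valueType": "real",
        "valueRange": [0.0, 1.0],
    }
}

def validate_report(report: dict) -> bool:
    """Check a data-quality report entry against the register:
    the cited measure must exist and the value must fall in its range."""
    measure = measure_register.get(report["measureId"])
    if measure is None:
        return False
    lo, hi = measure["valueRange"]
    return isinstance(report["value"], float) and lo <= report["value"] <= hi

# A machine-readable quality report arriving as JSON can be checked
# without any human reading a PDF appendix.
report = json.loads('{"measureId": "measure-47", "value": 0.02}')
assert validate_report(report)
```

The same lookup-and-validate step is what allows downstream systems, including AI pipelines, to reason about data quality automatically rather than relying on human review.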
Before talking about artificial intelligence in more detail, let’s first focus on one of your key standards: ISO 19115 on metadata. Which changes can the community expect in the revision later this year?
Every five years, an ISO standard is reviewed to see whether it's still up to date. The committee has decided to revise ISO 19115 and – if approved – the revision will have two parts. The first focuses on mapping ISO 19115 to GeoDCAT, ensuring geospatial data aligns with a widely adopted metadata standard. This mapping is increasingly important as demand for DCAT metadata grows in the EU, Australia, New Zealand and North America. The project is expected to move quickly, with a public review planned for late 2025 and publication the following year, barring major comments. The second part will refine ISO 19115 itself, addressing various fixes. Key updates include clarifying its connection to the W3C Provenance vocabulary and simplifying the rights and licensing sections.
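In practice, such a mapping is a set of correspondences between ISO 19115 metadata elements and (Geo)DCAT properties. The Python sketch below is simplified and illustrative: the element paths are abbreviated and the table covers only a handful of fields, whereas the real mapping is far more extensive and is defined by the standard itself.

```python
# Simplified, illustrative mapping table: a few abbreviated ISO 19115
# metadata element paths translated to their (Geo)DCAT counterparts.
# The real mapping defined by the standard is far more extensive.
ISO19115_TO_DCAT = {
    "citation.title": "dct:title",
    "abstract": "dct:description",
    "descriptiveKeywords.keyword": "dcat:keyword",
    "resourceConstraints.licence": "dct:license",
}

def to_dcat(iso_record: dict) -> dict:
    """Rename ISO 19115 fields to their DCAT property names,
    dropping anything the mapping table doesn't cover."""
    return {ISO19115_TO_DCAT[k]: v for k, v in iso_record.items()
            if k in ISO19115_TO_DCAT}

record = {"citation.title": "Example dataset", "abstract": "A watercourse network"}
dcat = to_dcat(record)
assert dcat["dct:title"] == "Example dataset"
```

A standardized version of this translation is what lets a geospatial catalogue publish the same metadata record to both ISO 19115 consumers and DCAT-based portals without maintaining two descriptions by hand.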
Outreach is a critical component of the committee’s work, because standards are only effective if people are aware of them and adopt them. How do you approach this in practice?
We create awareness through articles like this one, both generally within the geospatial industry and also focused on land management, and by being represented at conferences. We will be participating in an ISPRS conference in a few months. Our engagement with the UN Group of Experts on Geospatial Information Management (UN-GGIM) is also a key part of our outreach. For the last three years, I’ve had the privilege of presenting at these events, where we’ve connected with attendees from a broad variety of countries, allowing us to explore how we can help others learn about geospatial standards. Twice a year, we host our ‘Standards in Action’ workshops, each focusing on a different country. Our materials are available on our website and promoted through LinkedIn and other channels, such as the short insightful videos on our YouTube channel.
Where do current standards fall short in enabling seamless data harmonization across geospatial domains, and how can they be improved?
I’ve observed two key issues in this area. One is that standards are too broad. For example, with ISO 19115, there are multiple implementations, but they don’t interoperate because the implementers select different options. This leads to unmanageable situations, and it happens to some extent with many standards. However, by working with OGC, we’re making standards more modular, with clearer, testable requirements. The second issue involves domains that legitimately need their own standards, such as maritime navigation with IHO standards for hydrography. They do a good job, but these standards need to be interoperable with broader systems. The challenge is finding the right balance between domain-specific standards – especially those with long legacies, like the IHO, which has been around for over a century – and standards that are more widely applicable. The IHO has spent the last 15 years working to bring its standards into the mainstream, but it will likely take another decade to fully transition its data production systems to this new approach. As the saying goes, it takes time to turn a supertanker. This isn’t just a challenge for maritime navigation – it applies to other fields as well.
I believe the developer-led approach to standardization is crucial here. Newer standards, like API standards which we’re fast-tracking to ISO, benefit from having developers directly involved in the process. These standards are much easier to implement than previous ones. In my opinion, if you’re writing a standard to be implemented in software, it’s essential to have developers at the table, not just theoreticians. OGC has provided a strong lead in this area.
Can you share the current status and outcomes of your partnerships with the UN?
We are now an officially recognized partner of the UN Global Geodetic Centre of Excellence (GGCE), based in Bonn, Germany. This acknowledgment underscores the importance of our standards in geodesy and the valuable contributions of our team, as well as reinforcing our collaboration. In mid-2024, the newly appointed director, Nick Carr, made a passionate appeal at the UN General Assembly to support the fragile geodetic infrastructure. They showed quite a scary video about what happens in the modern world if the geodetic infrastructure fails. It's not just about someone hacking your satellite; it's also about the realization that in many parts of the world, much of the system still depends on aging and often neglected ground infrastructure. At the same time, fewer people are pursuing geodesy as a specialty, which is an increasing concern.
Another aspect is the development of the ISO Geodetic Register. We saw a clear need for this register at the UN level. The EPSG register provides coordinate system definitions, but its creators acknowledge that it’s simply a repository without validating accuracy. Our work focuses on providing quality assurance for nationally or internationally significant definitions, collaborating with national geodesists in each country. The ISO Geodetic Register acts as a quality-assured subset, ensuring consistency where reference systems align. EPSG has supported this effort, backing the adoption of these validated entries, while the GGCE has also recognized its importance. This is a critical part of our infrastructure. That’s why we’re encouraged to see the geodetic register proving its value and gaining recognition.
How would you personally describe the connection between the UN’s Sustainable Development Goals (SDGs) and geodata?
Let me share an anecdote. Before I got into geospatial, I took a career break from being a computer programmer and I worked in the Himalayas, in the west of Nepal, as general administrator of a tuberculosis and leprosy control programme. The big issue there is getting out into rural locations and reaching the people – both to diagnose, which particularly in the case of leprosy has an element of taboo, and also to administer treatment to the patients. That meant a lot of logistical challenges – things I now would look at and say those are definitely geospatial problems. How do I get medicines out to the people who need them? Well, I need to know where the people who need them are, and I need to know where the health distribution posts are. Exactly, it’s all about geodata!
Almost every SDG focuses on improving access to services, which requires knowing where people are and where services are delivered. We often emphasize that it’s not enough to just monitor and report; geospatial data must be used to improve planning and actively enhance services. This is basic GIS, the foundation for national progress!
Another example that illustrates this is the big earthquake in Haiti. Several SDGs that directly relate to disaster preparedness, response and resilience-building came together following the earthquake. Satellite imagery enabled volunteers to rapidly map the destruction and guide aid efforts. Though not about formal standards, geospatial data quickly proved vital in resilience and recovery. After many years in the sector, I find it hard to believe anyone could try to handle such challenges without spatial data!
Zooming in on land administration, a domain in which reliable and accessible data is absolutely vital, what progress has been made and where are the key challenges?
This is another initiative in collaboration with the UN-GGIM Land Administration group. Due to the complexity of this multifaceted subject, it has taken our relatively small team several years to develop this suite of standards, ISO 19152. The land registration part is due out reasonably soon. Next, there's the land valuation and spatial planning component. Once that one is operational, we'll begin exploring how to ensure everything works together seamlessly.
The challenge is that it's not a one-size-fits-all approach, but it's not meant to be. For example, some regions already have well-established land registration systems. While we believe the standards could work for them, they likely wouldn't adopt them, since their systems have worked for decades, even centuries. These standards are really intended for regions starting from scratch: post-revolution, or countries that haven't yet registered land. It's a culturally diverse topic, with differences based on government structure and land ownership traditions. In some countries, land ownership traditions can still shift from one week to the next, raising the question of whether to change or not. The goal is a standard that works for both collective and individualistic views of land ownership.
After having served as chair of ISO/TC 211 for three years, how do you look back on the accomplishments during this period?
I had the chance to reflect recently, as we celebrated our 30th anniversary and it was also my final meeting as chair. It has been an incredibly enjoyable experience, leading such a great team. What has been really impressive is how willing people in this group are to put aside their individual interests – whether they’re from the Chinese government, a large American company, or the European Commission – and focus on what’s best for everyone. In the geospatial world, people are genuinely enthusiastic to get things done because they find it intrinsically valuable.
One thing I’m most proud of is how, thanks to the efforts of many people, we’ve become recognized by ISO Central as an innovative committee. A key example is the SMART project, which shifts standards from PDFs to machine-readable formats, a concept we pioneered with data quality measures and XML schemas. Another milestone was when we became the first committee to publish a standard entirely through an online collaboration tool for standard development, which helped drive this transformation forward.
I’m also encouraged to see new and underrepresented nations getting involved. For example, India has made significant strides in the geospatial industry over the past few years, and is even at the forefront of some technical developments in the standards world. Innovative companies that were previously low-profile are now providing excellent services in analysing Earth observation data. The growth is impressive, especially considering they weren’t involved at all a few years ago.
Lastly, I’m glad to remain engaged with TC 211. I’ll be involved in leading revision projects, specifically the ISO 19115 revisions. While I couldn’t do that as chair, I can continue in this capacity as ex-chair, especially since it’s within my area of expertise.
With rapid advancements in artificial intelligence and its integration with geospatial data, what are your expectations for the years ahead?
I have both hopes and fears about this. On the positive side, the technology enables faster and broader processing and makes it easier to find data. On the downside, it doesn’t inherently improve data quality; if AI is fed poor or insufficient data, the systems tend to fill in gaps with data that may seem authoritative but isn’t necessarily accurate. The real risk is when bad data is used, as it leads to unpredictable and unreliable outcomes: a major concern when considering the data quality AI systems work with. It’s not simply about having a machine-readable register, but also about maintaining data integrity. As someone in the standards community, I believe we can’t afford to overlook these risks.
In the geospatial sector, AI already plays an important role behind the scenes, particularly in feature recognition for imagery. But with the increasing volume of data from various sensors, satellites and other sources, the pressure to rely on AI is rising. The only way to handle this growing influx of information is through AI’s faster processing capabilities. But without high-quality input data, this efficiency could lead to even bigger inaccuracies.
With the massive flood of data, you can no longer rely on teams of people sitting in front of photogrammetry monitors digitizing everything. Instead, AI must be trained to handle most of the tasks, while being smart enough to flag uncertainty rather than creating errors. I hope quality checks are going to be good enough to spot those sorts of things.
How can the geospatial community strengthen its role in driving the advancement of open standards to address future challenges?
The short answer is: join in! There are many ways to do so. It starts with clearly defining the challenge and recognizing where a standard would help improve the usability of data. Just articulating this clearly is a great first step.
Being involved often means contributing existing good practices and saying “We have three solid practices, let’s come together, figure out the best one, and make that the standard” – or, in many cases, combining them. For those in geospatial fields who are also developers – doing data manipulation or tweaking software – the OGC innovation programme is a great way to get involved. You don’t need to be directly drafting standards. Instead, you can engage with challenges, contribute ideas and try them out. The OGC and ISO both aim to be open, and you can even be nominated into a project by your professional body. Also, open standards always undergo at least one public inquiry, with all comments considered and addressed. To make engagement easier, we’re running hackathon-type events and moving our issue tracking to GitHub, where many developers already spend their time.
Personal and professional development is also key to driving advancement, of course. I would highly recommend keeping an eye on our ‘Standards in Action’ workshops. We host them twice a year, and they’re a great opportunity to dive into valuable insights. Each 40-minute talk is typically shared as a standalone video, making it easy to absorb the content at your own pace.
Is there anything else you want to share with our readers in the geospatial community?
It has really been an incredible experience being part of this community in a leadership role, and I’m excited to continue contributing, even as I step away from the leadership position. As I mentioned earlier, I can’t imagine addressing any of the SDGs without geospatial data. Indeed, the geospatial community plays a crucial role, even if it’s often behind the scenes, providing a valuable service that drives progress. A collaborative approach is key, whether through formal standards bodies or informal groups that later formalize their work.
About Peter Parslow
Peter Parslow, a recognized authority on data standards in the geographic field, chaired ISO/TC 211 from January 2022 to December 2024. In this capacity, he was instrumental in fostering consensus among the many countries and organizations involved, while also serving as the public face of ISO/TC 211 to international industry and government groups. With over 30 years of experience in software and data design, Parslow was responsible for designing and leading the development of the UKHO’s first XML-based flow-line and the data model for its Hydrographic Database. Having developed a passion for geospatial technologies, he transitioned to Ordnance Survey, the national mapping authority of Great Britain, where he now works as the open standards lead.
