Earth Observation and Machine Learning as the key technologies to track implementation of the Green Deal: 10 main takeaways

Earth Observation technology for monitoring climate change

In recent decades, the world has experienced rapid growth in Earth Observation (EO) technology, which has allowed us to gather a wealth of information about planet Earth’s physical, chemical and biological systems. Be it land, sea or air, EO is the most robust technology to monitor and assess the status of, and changes in, the natural and artificial environment. Its role has become particularly relevant in rapidly assessing situations during extreme weather events or natural disasters and in precisely quantifying greenhouse gas (GHG) emissions, making EO pivotal to consistent, long-term environmental assessment in the face of unpredictable climate change.

On July 19th 2022, the OpenGeoHub foundation, together with Wageningen University, organized a public workshop entitled ‘Innovative governance, environmental observations and digital solutions in support of the European Green Deal’. This is the first of a series of interactive meetings within the Horizon Europe Open-Earth-Monitor project, which aims at bringing together Earth Observation and data experts from business, policy and academia with the community of practice, and at connecting with potential end-users of the future platforms developed by the project. Three recommendations emerged from the workshop:

  1. target cloud services at intermediate users instead of policy- and decision-makers, and avoid over-engineered systems with a high level of abstraction;
  2. increase capacity-building to decrease the existing gap in cloud skills and uptake;
  3. provide a cloud certification mechanism to increase overall trust in cloud-based services.

#1 Tracking impacts of environmental investments is not trivial

For Joanna Ruiter of the Netherlands Space Office (NSO), one of the key problems of the projects funded by the NSO and the Dutch Government is that it is relatively easy to assign funds to organizations, but relatively difficult to track the effects of the funds spent, especially in terms of impact on the ground.

#2 Overlap in pre-processing of EO data is huge (and highly inefficient)

Patrick Griffiths of the European Space Agency, in his talk “EO Platforms and Open Science in Support of Green Deal Ambitions”, mentioned some of the key challenges of the EO field, for example the data-management burden on scientists, who in the EO space spend 80–90% of their time ‘cleaning’ data.

The scientific community could address this burden through simplification and democratization of processes, more collaboration and sharing of data and technologies, and reduced fragmentation and redundancy of platforms; a few projects are already opening up novel solutions, for instance the Euro Data Cube, the openEO Platform and the openEO API (Schramm et al., 2021). In essence, however, enormous budgets are still spent on overlapping data-cleaning tasks and overlapping functionality, which is obviously inefficient.
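Much of the pre-processing overlap mentioned above boils down to the same routine steps, such as cloud masking, being reimplemented project after project. A minimal sketch of one such step in Python/NumPy, assuming the Landsat Collection 2 QA_PIXEL bit layout (bit 3 = cloud, bit 4 = cloud shadow):

```python
import numpy as np

# Landsat Collection 2 QA_PIXEL bit positions (assumed layout).
CLOUD_BIT = 3
SHADOW_BIT = 4

def mask_clouds(band, qa_pixel):
    """Return a float copy of `band` with cloudy/shadowed pixels set to NaN."""
    bad = ((qa_pixel >> CLOUD_BIT) & 1) | ((qa_pixel >> SHADOW_BIT) & 1)
    out = band.astype(float)
    out[bad.astype(bool)] = np.nan
    return out

# Tiny synthetic scene: one cloudy pixel (bit 3 set) and one clear pixel.
band = np.array([[0.25, 0.30]])
qa = np.array([[1 << 3, 0]])
masked = mask_clouds(band, qa)  # cloudy pixel becomes NaN, clear pixel kept
```

Every group writing this kind of masking logic from scratch, rather than reusing a shared building block such as those offered by the openEO ecosystem, is part of the inefficiency Patrick describes.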

#3 European EO platforms suffer from redundancies and fragmentation

Patrick also concluded in his talk that “European EO platforms suffer from redundancies and fragmentation”. Over the last decade, the European Commission has invested significantly in EO and in projects channeling EO technology into practice, including commercial applications, but this has also produced high redundancy and many applications too small to survive in the market. According to Patrick, it would be more efficient to “stop reinventing the wheel” and instead collaboratively build de-centralized blocks that can be easily combined. This also matches the feedback received from the audience:

A federated ecosystem of proven open-source technology is the preferred solution for the future of EO in Europe.

#4 Achieving Land Degradation Neutrality requires data co-design at every stage in decision processes

Barron Orr, Lead Scientist at the UN Convention to Combat Desertification, introduced the data and governance gaps in the context of achieving Land Degradation Neutrality (LDN), SDG target 15.3. 126 countries have pledged to halt (or at least aim at halting) the rapid land degradation caused by damaging land uses, in order to ensure food security and healthy ecosystems.

In general, EO, geospatial data, and derived information play a key role in monitoring the SDG targets, planning, tracking progress, and helping nations and stakeholders make informed decisions, plans, and ongoing adjustments that contribute toward achieving the SDGs. Orr described how data, and especially open data, is critical to achieving LDN since it broadens our understanding of the underlying land potential behind any land-use decision: for this reason data “must be used at every stage in decision processes, crucially at the beginning in the design process, rather than only for monitoring projects already in progress”, concluded the UN scientist.

#5 Using EO data only for decisions is too risky: quality ground data helps improve EO and vice versa

Gert-Jan Nabuurs of Wageningen University and Research discussed the challenges and opportunities in better connecting the European ground-based forest inventories with EO data. With his 27 years of experience gathering ground-based data, Nabuurs recalled how far our global data collection and sharing capabilities have come since the beginning of his career (when he ‘had to call government agencies by phone and request floppy disks in the mail’).

The key takeaway from his talk: using EO data alone for decision-making in forest inventories and planning is too risky! He illustrated this with the example of Ceccherini et al. (2020), where abrupt increases in harvested area detected in Finland and Sweden were due to changes in the Landsat missions and issues in the Global Forest Watch product, not so much to increased harvesting.

Another keynote speaker, Matt Hansen, Co-Director of the Global Land Analysis and Discovery (GLAD) lab, spoke on global land cover and land use monitoring and likewise emphasized that the primary concern for global mapping projects is to have high-quality ground data to ensure unbiased estimators. No map is unbiased and no map is perfect: known and quantified limitations should be published together with the map itself.

#6 FAIR-TRUST-CARE data principles can help bridge the digital divide

Yana Gevorgyan, Secretariat Director, introduced the Group on Earth Observations (GEO), the global network connecting government institutions, academic and research institutions, data providers, businesses, engineers, scientists and experts to create innovative solutions to global challenges based on open EO.

F.A.I.R.: Findable, Accessible, Interoperable, Reusable; The first step in (re)using data is to find them: metadata and data should be easy to find for both humans and computers. Once the user finds the required data, it is necessary to know how it can be accessed, possibly including authentication and authorisation. The data usually need to be integrated with other data. In addition, the data need to interoperate with applications or workflows for analysis, storage, and processing.

T.R.U.S.T.: Transparency, Responsibility, User focus, Sustainability and Technology; the TRUST Principles provide a common framework to facilitate discussion and implementation of best practices in digital preservation by all stakeholders.

C.A.R.E. (the ‘CARE Principles for Indigenous Data Governance’): Collective Benefit, Authority to Control, Responsibility, and Ethics; the CARE principles are people- and purpose-oriented, reflecting the crucial role of data in advancing Indigenous innovation and self-determination. These principles complement the existing FAIR principles, encouraging open and other data movements to consider both people and purpose in their advocacy and pursuits.

#7 Global and local EO products nominally offering the same thing can differ significantly

Gilberto Câmara, former director of Brazil’s National Institute for Space Research (INPE) and former director of the GEO Secretariat, explored the challenges of mapping land use and land cover, highlighting the necessity of self-consistent maps and consistent definitions of terms such as deforestation in order to create products that are both globally trustworthy and locally relevant. Just think of the subtle visual difference between a natural and an artificial landscape, like a savanna and a cattle pasture: mixing up or missing such classes in an EO product can completely change its usability and applicability.

Gilberto illustrated how EO tools can fall short of being useful to policymakers at the local level when various end-users are not empowered in the development process through what he referred to as “bottom-up map production”. Mr Câmara pointed to the R-based sits library as an example of a commercially ready cloud services tool created with accessibility for end-users at its heart.

#8 Mapping land cover and similar general classes will stay important even though we can today also map detailed continuous variables

During the discussion session, Patrick Griffiths and Tom Hengl asked Gilberto whether there is still a need for land cover maps and similar general EO products when we can now also map various continuous variables, e.g. NDVI, FPAR, tree species percentage, crop types, canopy height, etc., which basically represent land cover at quantitative scales and in multivariate space, and thus in much higher detail. Gilberto believes that land cover maps will remain in use because we need simple explanations of common features that people can interpret and relate to quickly: “especially the policymakers — they need something that they can relate with”.
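Gilberto’s argument can be made concrete: a categorical land-cover label is essentially a coarse, human-readable collapse of the underlying continuous variables. A minimal sketch in Python/NumPy, with purely hypothetical thresholds chosen for illustration:

```python
import numpy as np

def to_class(ndvi, tree_cover):
    """Collapse two continuous variables into three coarse land-cover labels.
    Thresholds are hypothetical, for illustration only."""
    labels = np.full(ndvi.shape, "bare", dtype=object)
    labels[ndvi >= 0.2] = "herbaceous"                          # vegetated
    labels[(ndvi >= 0.2) & (tree_cover >= 0.5)] = "forest"      # vegetated + tall cover
    return labels

# Three example pixels: bare soil, grassland, dense forest.
ndvi = np.array([0.05, 0.35, 0.75])
tree_cover = np.array([0.0, 0.10, 0.80])  # fractional tree cover
labels = to_class(ndvi, tree_cover)
```

The continuous inputs carry far more detail, but the discrete labels are what a policymaker can relate to at a glance, which is exactly the trade-off raised in the discussion.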

#9 There is always a creative way to ensure access to basic environmental ground-truth data that can then help extend research

The talk by Gert-Jan Nabuurs looked specifically at the problems of accessing data such as the National Forest Inventory (NFI) data, at least for data-mining purposes; it is currently kept private by national organizations and hence is not available to the majority of the research community. But there are many creative ways to gain access. Participants mentioned strategies such as building trust, for example by signing collaboration agreements or offering joint products that benefit both sides. The alternative is to put pressure on governments to require that all publicly funded data is eventually released, and to make providing FAIR datasets a requirement for publication or even for promotion at work.

#10 EO industry needs applications that their customers use to take action and solve problems… on a daily basis

Going back to the topic from the start of this article: in one of his (now famous) blogs, Joe Morrison (the “controversial industry figure”) questions the sustainability of many modern EO businesses: “lots of new analytics startups are pursuing the dangerously seductive strategy of building a new type of dataset and then closing their eyes and hoping customers will magically show up to start buying it”. Successful businesses should instead “build applications that their customers use to take actions and solve problems. If you use an application every day, you don’t mind paying a subscription for it”. We likewise recognize in our work that EO-based services need to serve, as much as possible, Decision-Ready, relevant and easy-to-use information. As with many modern democratic digital systems, usability and usage (web traffic) will be the final judge of success.

Word cloud of terms most associated with Decision-Ready-Information.

Open-Earth-Monitor project

To support democratic and efficient implementation of the European Green Deal (and to navigate profound geopolitical turbulence and the uncertainties brought by climate change), OpenGeoHub, together with 21 partners, has launched an ambitious new European Commission-funded Horizon Europe project called “Open-Earth-Monitor”, which aims at tackling the bottlenecks in the uptake and effective usage of environmental data, both ground observations and measurements and EO data.

Valentina Delconte