European Climate and Energy Modelling Forum

Thanks for the great participation in the breakout workshop, everyone.

Here are the notes from the workshop (many thanks to @Lucie who helped record the notes during the workshop).

Introduction

@willu47 introduced the 5 objectives of the ECEMF project and 15 consortium partners (see slides).

The workshop had several aims:

  • raise the profile and discuss benefits of model comparison for a range of stakeholders (modellers, national policy makers, etc.)
  • discuss the “legacy” issue of ECEMF - what will happen after the project and how to keep the effort alive

At the Trondheim Energy Transition Week, ECEMF met the EFECT network and discussed the possibility of merging the two efforts.

Why do we need to compare models?

  • How does the transition affect the different groups?
  • Different actors: EU, business, academia, national policy makers, civil society, …
  • Need a diversity of models, but…
  • Better to have a dialogue about the model results, harmonising the assumptions… → one more coherent dataset
  • Need to do a bit more work upfront

Summary (Will Usher)

  • Many stakeholders who gain value from model insights
  • Many different institutions who develop models
  • Many different types of models
  • ECEMF provides one focal point for untangling assumptions, scenarios and results through model comparison
  • ECEMF Net Zero Scenarios - https://future-sight.ecemf.eu/view?id=15

Q&A (all)

How were the views of stakeholders included in the model comparison exercise?: ECEMF developed a stakeholder process - each work package held a series of workshops to co-design research questions focussed on a particular topic.

Are the research questions available?: The research questions will be published in an (open access) paper very soon.

Robbie Morrison: Is the scenario data released under an Open license? [Yes! On Zenodo.]

@RobinGirard: Assumptions are shared as much as possible (but it is time-consuming) → “weakness of the comparison so far” (the focus on policy questions requires comparison of results rather than inputs; ideally both would be pursued)

Types of model comparison (Will Usher)

  • Technical focus - differences between modelling frameworks, formulation, model behaviour
  • Data focus - differences in assumptions, role of modeller decisions, data sources
  • Result focus - differences in model outputs
  • Combinations of the above

Three case studies:

  • Stanford EMF (Will Usher)
  • Informal Nordic forum
  • openMODEX

Maximilian: openMODEX was helpful, but a slow start
Robin Girard: took time to build the protocol, quite helpful in the end (on how to do the comparison)
Will Usher: focus on comparing results provides insights for the policy maker
?: Important to have well-written documentation to understand differences between the models
Robbie Morrison: Data licensing → research issue ; “would be a really useful exercise”

WU: Archiving the scenarios under a “CC-BY 4.0” license on Zenodo; results = direct copy of the database
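
As an aside, here is a minimal sketch (in Python) of pulling such an archived record programmatically via Zenodo's public REST API; the record id is a placeholder, not the actual ECEMF deposit, and the exact response fields may differ by API version.

```python
# Minimal sketch: list the licence and files of a Zenodo record via the public
# REST API. RECORD_ID is a hypothetical placeholder, not the ECEMF deposit.
import requests

RECORD_ID = "1234567"  # hypothetical record id

resp = requests.get(f"https://zenodo.org/api/records/{RECORD_ID}", timeout=30)
resp.raise_for_status()
record = resp.json()

print(record["metadata"]["title"])
print(record["metadata"].get("license"))  # expected to report CC-BY-4.0
for f in record.get("files", []):
    # filename and direct download link (field names may vary by API version)
    print(f.get("key"), f.get("links", {}).get("self"))
```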

Sarah (Reiner Lemoine Institute): there are some model comparisons;

    • Open modelling platform → quite useful ; accessing data is OK, but is it understandable?
    • 3 levels of model complexity (Excel, …) → comparing not-so-comparable models (are the compared models of the same complexity?)

WU: Set of very complex models (with big teams) ; energy demand → quite nice integrated results ; not so much the case for energy supply though ; lots of assumptions in the IAMs

Robbie Morrison: Software Heritage ; produces a DOI for software specifically ; useful metadata for this work, to identify which codebase ; INRIA → “Software Heritage” (https://www.softwareheritage.org)

Robin Girard: Visuals? So many different, complex sectors ; even better than model comparison

WU: We spend a huge amount of time just discussing; those activities need to be funded (so far, so good)

Robin Girard: Better to have researchers from a lot of different backgrounds ; e.g. ADEME published scenarios (covering a lot of different sectors, by asking researchers from each field)

Matteo Giacomo Prina: review article on the topic published a couple of years ago: Comparison methods of energy system frameworks, models and scenario results, https://www.sciencedirect.com/science/article/pii/S1364032122006086

The ECEMF comparison process

Overview (Will Usher)

  1. Protocol
  2. Run the model
  3. Compare

The Model Comparison Protocol is published.

Enabling comparison (Will Usher)

• IAMC results template (vocabulary / ontology) - see the data-format sketch after this list
• Result conversion tools (e.g. osemosys2iamc)
• IIASA Scenario Database
• IIASA Scenario Explorer - visualisation, but modeller focussed

• FutureSight - visualisation, user focussed
• [Indicators and metrics for comparison]
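
To make the IAMC results template bullet concrete, here is a minimal sketch using the open-source pyam package; the models, scenario and numbers are purely illustrative, not actual ECEMF results.

```python
# Minimal sketch of IAMC-format data handled with pyam (package "pyam-iamc").
# Model names, scenario and values are illustrative, not ECEMF results.
import pandas as pd
import pyam

data = pd.DataFrame(
    [
        ["ModelA", "NetZero", "Europe", "Emissions|CO2", "Mt CO2/yr", 3200, 1500, 0],
        ["ModelB", "NetZero", "Europe", "Emissions|CO2", "Mt CO2/yr", 3100, 1800, 120],
    ],
    columns=["model", "scenario", "region", "variable", "unit", 2020, 2035, 2050],
)
df = pyam.IamDataFrame(data)

# Filter to a common variable and region, then compare trajectories across models
co2 = df.filter(variable="Emissions|CO2", region="Europe")
print(co2.timeseries())
```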

Indicators (Will Usher)

  • Indicators are used to compare combined outputs from models
  • WU showed the model fingerprints paper published in Nature Energy and led by Mark Dekker → source code is published under an MIT license
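
As a rough illustration of what an indicator over combined model outputs can look like (not the fingerprint methodology of the Dekker et al. paper), a simple per-model metric might be computed as follows; the values are invented.

```python
# Illustrative cross-model indicator: relative CO2 reduction between 2020 and
# 2050, per model. Values are made up; this is not the fingerprint method.
import pandas as pd

emissions = pd.DataFrame(
    {"model": ["ModelA", "ModelB"], 2020: [3200, 3100], 2050: [0, 120]}
).set_index("model")

reduction = 1 - emissions[2050] / emissions[2020]
print(reduction)  # fraction of 2020 emissions eliminated by 2050, by model
```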

Issues with comparison

Introduction to the FutureSight tool

Live demo

Key concepts:

  1. Dashboard
  2. Control block
  3. Data block
  4. Text block

Goal: allow anyone to interact with the results

All open source (available here)

Christoph Schimeczek: Can I upload the results of my own model?

WU: Not yet, but membership of the forum will be opened to anyone

Robbie Morrison: We had a presentation about metadata; where is the link to the Open Energy Ontology…?

Will Usher: IAMC template instead ; you can download, but that’s all

RM: Is it a PNG? What’s the format? Like OSM?

RG: Nice to be able to access the documentation of the models, just by clicking on their names

?: Are the scenarios documented somewhere? (yes)

?: Tried the plot with mean average temperature, but it seems there’s a bug

RG: How to report an issue?

Visit ECEMF online

• Ask questions and suggest answers at our community forum: https://community.ecemf.eu

• Follow the project progress at our website: https://ecemf.eu

• Build a results dashboard using FutureSight: https://future-sight.ecemf.eu

Legacy plan

  • How most effective? How funded?

  • Tried to consult the EU, but no one wanted to own this idea

  • Lots of people value the insights, but there are limits in how the EU is organised (hard to find funding to support this initiative)

Dec 23 - ECEMF develops final legacy plan
Jan 24 - Project collaboration call to present the future forum
Mar 24 - ECEMP 2024 Organising Committee works start
May 24 - In-depth discussion and decision at the ECEMF project meeting
… - Deadline for report on the legacy structure
… - First Secretariat meeting of the ECEMF forum held
… - ECEMF project ends and the Forum is officially set up, first General Assembly

Financing Schemes

  1. A new (or recurrent) call similar to LC-SC3-CC-7-2020.
  2. A tender or service contract with one or several of the partners of the ECEMF project
  3. Membership financing of the forum
  4. A substantial annual fee for the ECEMP conference, also covering the forum’s activities
  5. No base financing, only an informal network
  • Options 3 and 4 are unlikely to be feasible
  • Option 2 raises issues around competitiveness and fair access.
  • Option 1 is the only option able to provide funding for partners to participate with their staff and models in the forum’s activities

Victor Ecrement: How would the money be spent?

Minimum Estimated financials

  • Website and Dissemination: 2,000 EUR/yr
  • Online Community Forum cost: 1,000 EUR/yr
  • 2 IIASA database instances with support for one model intercomparison every two years: 15,000 EUR + overheads
  • FutureSight visualisation backend and tool including support (no development): 2,000 EUR
  • Secretariat staff (0.5 FTE): 30,000 EUR + overheads
  • ECEMP conference organization:
    o Hybrid: plenary and panel discussions in person, three parallel sessions in person, all other parallel sessions digital. Two-day event. Max 100 attendees (no registration fees).
    o In person + livestream: all sessions in person plus livestreamed or recorded.
    o Costs: venue, catering, organization in Brussels (15,000+ EUR), online facilities (5,000 EUR)
  • Model Comparison Coordination (one comparison every two years):
    o Two stakeholder workshops: 1,300 EUR + researcher time
    o Development of the model comparison protocol (0.25 FTE): 15,000 EUR
    o Coordination of monthly meetings (performed by Secretariat)
    o Support for result conversion, upload and result validation (included in IIASA costs)
  • Core researcher time for compiling the report and coordinating the author team (0.3 FTE): 20,000 EUR

Total: €130,000 per year

RM: Propagating to other areas? In Africa, for instance?

?: Would it be possible to add new data/models? (yes; the focus of the forum is really on Europe, but there is no reason why we couldn’t do that for another region)

?: New models? New versions? … Could ideally create countless projects

WU: Anyone can become a member ; get involved in the monthly meetings ; in the workshops with policy-makers…

Christoph Schimeczek: I feel that many people think the protocol is “cool” ; there may be interest in re-using it ; maybe even without funding, there should be some incentive to actually participate/use it

WU: Could open up memberships already ; open up the next steps…

?: Planning to include some national models? To compare at the national level?

WU: There are some national-level results as well ; and that is very much of interest too