- genre: break‑out group, possibly a write-a-thon or learn-a-thon?
- title: New project - new tools?! How granular should our tools and data processing workflows be to allow for reusability?
- presenter: @fwitte
- background: This topic is not a new one (parts of it were touched on at openmod six years ago, e.g. Survey-design-do-a-thon: How to increase reuse / efficiency of open source modeling?, and Reproducible workflows write-a-thon)! Still, I would like to bring it to the table again, and also make a connection to the data side of things: There are many things researchers need to do with data inputs from all kinds of sources to get to their energy system model instance, be it a European power system or a neighborhood heat supply case study. Many of the tools and workflows used are published, but they sometimes tend to be bound to the specific context they were built for. By making tools and workflows more granular, and by sticking to well-defined and documented input and output data structures (e.g. see Do-a-thon: Draft for 'energy' datapackage standard (?) and Developing a common ontology for energy system analysis), the community might be able to simplify reusing such tools and methods. I would like to discuss and learn whether there are good examples to follow (from within the community or from other fields of research), and what we can do to improve in the future.
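To make the idea of "well-defined and documented input and output data structures" concrete, here is a minimal sketch of a datapackage-style descriptor for a model input, loosely following the Frictionless Data layout. The resource name, file path, and field names are purely illustrative assumptions, not part of any agreed community standard:

```python
import json

# Hypothetical descriptor for a heat-demand time series; the names,
# path, and unit field are illustrative assumptions, not a standard.
descriptor = {
    "name": "neighborhood-heat-demand",
    "resources": [
        {
            "name": "heat_demand",
            "path": "data/heat_demand.csv",
            "schema": {
                "fields": [
                    {"name": "timestamp", "type": "datetime"},
                    {"name": "demand_mw", "type": "number", "unit": "MW"},
                ]
            },
        }
    ],
}

# A downstream tool only needs this descriptor to know which columns
# and units to expect, independent of the project that produced the data.
print(json.dumps(descriptor, indent=2))
```

The point of such a descriptor is that a granular tool can validate its inputs against the schema instead of hard-coding assumptions about one project's CSV layout.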