Hello All,
(this is my first OpenMOD post, so I hope I’m formatting and tagging this correctly)
I’m writing to request your insights on the feasibility of migrating TIMES to JuMP.
Together with @olexandr and @sidk, we have been tasked within the IEA-ETSAP community with assessing the feasibility of migrating the TIMES code from GAMS to Julia/JuMP.
We have done some prototyping and benchmarking, had discussions with expert JuMP developers, and had discussions within the ETSAP community. I’d now love to hear from the OpenMOD community about your experiences migrating an ESOM codebase from GAMS to Julia/JuMP:
1. What performance issues did you experience?
   a. Problem creation time?
   b. Solve time?
   c. Data handling structural change requirements?
2. What Julia/JuMP version were you using?
3. Did you stay with Julia/JuMP? (I believe the GenX folks moved to JuMP and then back to Python?)
4. If you didn’t stay with Julia/JuMP, why not?
5. Do you have strong opinions about other open-source ESOM language contenders? Linopy? Pyomo?
Thank you for your time. If you have insights and want to set up a bilateral Google Meet, I’m available for that too.
We did some work porting our OSeMOSYS derivative GENeSYS-MOD to JuMP in ~2022, when I was visiting SINTEF Energy Research in Norway, who were really interested in co-developing an open version of it (previously it only existed in GAMS).
The performance issue definitely lies with problem creation time and/or data handling. Solve time is not an issue, as that is mostly solver-dependent. What we realized is that we had written quite a few data operations that essentially loop over many sets, which GAMS does in a split second but which take significantly longer in Julia. We have therefore started moving more and more of that processing into the dataset itself, so it is performed once and the model can more or less just read everything in. Building the constraints also still takes somewhat longer than in GAMS; we are still exploring options there.
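For illustration, here is a minimal sketch (with made-up sets and parameters, not GENeSYS-MOD code) of the kind of restructuring this usually implies in JuMP: build the sparse index set once and iterate only over the combinations that actually exist, instead of looping over the full Cartesian product of sets as a literal GAMS transliteration would.

```julia
using JuMP

# Hypothetical sparse data: only ~5% of all (region, tech, year)
# combinations actually carry a capacity value.
regions = ["R$i" for i in 1:50]
techs   = ["T$i" for i in 1:200]
years   = 2020:2050
cap = Dict((r, t, y) => rand()
           for r in regions, t in techs, y in years if rand() < 0.05)
sparse_keys = collect(keys(cap))

model = Model()
@variable(model, x[sparse_keys] >= 0)

# Fast: iterate only over the combinations that exist.
@constraint(model, [k in sparse_keys], x[k] <= cap[k])

# Slow anti-pattern (a literal GAMS-style transliteration): loop over the
# full Cartesian product and test each combination individually.
# for r in regions, t in techs, y in years
#     if haskey(cap, (r, t, y))
#         @constraint(model, x[(r, t, y)] <= cap[(r, t, y)])
#     end
# end
```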
We were using JuMP version 1.1.0 when we began the conversion efforts.
Yes, we co-maintain our GAMS and Julia versions and have bi-weekly meetings with all developers to ensure that things are aligned.
Since we started the efforts in 2022, some time has passed. Back then, we also considered Linopy and Pyomo. Unfortunately, Linopy was still quite early in its development at the time and did not provide all the features we needed to code our model efficiently. We did some testing with a trimmed-down set of constraints but the actual data set, comparing JuMP and Pyomo, and JuMP clearly came out on top, which is why we went that way. There were also already a lot of Julia efforts underway at our department and at SINTEF Energy, so it fit in quite nicely.
But a disclaimer: I was/am not the main developer of the Julia version; that would be @DPinel, so maybe he can weigh in as well.
JuMP developer here. Just wanted to clear up a couple of things, since someone linked me to this discussion.
What we realized is that we had written quite a few data operations that essentially loop over many sets, which GAMS does in a split second but which take significantly longer in Julia
Agreed. This is a very common issue for people transliterating between GAMS and JuMP.
Note that this is not JuMP-specific. It will also happen if you use linopy, Pyomo, PyOptInterface, etc. Modeling languages that are embedded in a programming language parse constraints in a way that is fundamentally different from standalone modeling languages like GAMS.
Since it comes up far too often, I’ll link to the (infamous) IJKLM blog post by GAMS and our rebuttal. The takeaway is that the choice of data structure matters. (If you look at some of the plots, they claim that JuMP is 2x slower than Pyomo. All that proves is that GAMS used different data structures for their Julia code and their Python code. That’s what makes the difference, not anything inherent to either modeling language.)

The flexibility that you get in Julia/Python is both a blessing and a curse. You can write fast code in Julia and Python, but you can also write slow code. You need to put on more of a “software engineering” hat when writing JuMP/Linopy/Pyomo models than you do when writing GAMS models.
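To make the data-structure point concrete, here is a minimal sketch with made-up IJKLM-style data (not the benchmark code itself). The constraints are mathematically identical; only the way the sparse index set is stored and queried changes, and that is what drives model-generation time.

```julia
using JuMP

I = 1:100
# Made-up sparse (i, j, k) triples.
ijk = unique((i, rand(1:20), rand(1:20)) for i in I for _ in 1:50)

model = Model()
@variable(model, x[ijk] >= 0)

# Slow: re-scan the whole flat vector once per i, i.e. O(|I| * |ijk|) work.
# @constraint(model, [i in I], sum(x[t] for t in ijk if t[1] == i) <= 1)

# Fast: group the triples by i once, then each constraint is a cheap lookup.
by_i = Dict(i => Tuple{Int,Int,Int}[] for i in I)
for t in ijk
    push!(by_i[t[1]], t)
end
@constraint(model, [i in I], sum(x[t] for t in by_i[i]) <= 1)
```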
This early paper describing JuMP offers comparisons with AMPL, GAMS, and Pyomo
If people reference the 2017 paper, it needs a bit of context. In the 2018-2022 period we rewrote the entirety of JuMP. We haven’t published any similar performance comparison for the new version of JuMP. I’d say the results are indicative of what can be achieved, but I wouldn’t rely on the exact numbers as fact.
I believe the GenX folks moved to JuMP and then back to Python?
GenX has never been implemented in Python. It has always and continues to use JuMP.
I did port an energy system model from GAMS to JuMP two years ago, but I’m not sure it will help. I was learning the GAMS syntax as I was translating it into JuMP, with which I was already familiar. So it’s more a documented example of GAMS→JuMP for beginners than an expert comparison.
As discussed here, most of the work was on adapting data input instructions. Also, I used Symbols a lot.
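For illustration, a minimal sketch of that Symbol pattern (hypothetical file and column names, one row per technology assumed): GAMS set elements are read in once and converted to Julia Symbols, which are then used as JuMP axis labels and dictionary keys.

```julia
using CSV, DataFrames, JuMP

df = CSV.read("technologies.csv", DataFrame)   # hypothetical input file
TECH  = Symbol.(df.technology)                 # GAMS set elements -> Symbols
capex = Dict(Symbol(row.technology) => row.capex for row in eachrow(df))

model = Model()
@variable(model, new_capacity[TECH] >= 0)
@expression(model, invest_cost, sum(capex[t] * new_capacity[t] for t in TECH))
```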
The motivation was the ability to run this open source model on my computer while I don’t have GAMS.
One other energy system framework written in Julia is AnyMOD.jl from TU Berlin. It was not a translation from GAMS, but there might be some structural clues on the use of JuMP in their code?