Is there anything better than the sensitivity factors (PTDF, LODF, etc.)?

Hi Open modellers,

Lately I’ve been looking into developing tools that make use of the so-called linear factors.
These are sensitivity matrices that embed the DC power flow and allow very fast evaluation of flows and contingencies. These methods are widespread, and tools like PLEXOS use them extensively to model branch outages. (See this for instance.)
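For reference, both matrices can be built from the DC power-flow equations in a few lines of NumPy. The 3-bus network below is made-up illustration data (my own numbers, not from any particular test case); the construction — PTDF from the reduced nodal susceptance matrix, LODF columns derived from PTDF — is a sketch of the standard textbook derivation:

```python
import numpy as np

# Toy 3-bus DC network (assumed data, for illustration only).
# Lines: (from_bus, to_bus, susceptance in p.u.)
lines = [(0, 1, 10.0), (0, 2, 8.0), (1, 2, 5.0)]
n_bus, slack = 3, 0

nl = len(lines)
C = np.zeros((nl, n_bus))   # branch-bus incidence matrix
b = np.zeros(nl)            # branch susceptances
for i, (f, t, s) in enumerate(lines):
    C[i, f], C[i, t], b[i] = 1.0, -1.0, s

Bf = np.diag(b) @ C         # branch flows = Bf @ theta
Bbus = C.T @ Bf             # nodal susceptance matrix

# PTDF: sensitivity of branch flows to nodal injections
# (slack column stays zero; invert the reduced Bbus).
keep = [i for i in range(n_bus) if i != slack]
PTDF = np.zeros((nl, n_bus))
PTDF[:, keep] = Bf[:, keep] @ np.linalg.inv(Bbus[np.ix_(keep, keep)])

# LODF: change of flow on line l per MW of pre-outage flow on outaged line k.
LODF = np.zeros((nl, nl))
for k, (f, t, _) in enumerate(lines):
    ptdf_k = PTDF[:, f] - PTDF[:, t]  # response to a 1 MW f->t transfer
    LODF[:, k] = ptdf_k / (1.0 - ptdf_k[k])
    LODF[k, k] = -1.0                 # the outaged line loses all its flow

print(PTDF)
print(LODF)
```

In this triangle network, taking out line 2 (between buses 1 and 2) leaves a radial path, so its flow reroutes entirely through the other two lines: the corresponding LODF column is exactly -1 and +1, a quick sanity check on the construction.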

The PTDF and LODF matrices do have horrendous glitches that come from the very definition of the sensitivity (see the example below). I'm aware of those glitches, but I'd like to do better.

I've done a literature search and have found nothing better than the linear sensitivity factors for fast contingency analysis. So I would love to know if anyone here has looked into this and found something better.

Contingency sensitivity example:

Good example: Consider two lines A and B. In the base case, A carries 2 MW and B carries 1 MW. After the outage of B, A carries 2.2 MW.
The line outage distribution factor is LODF = (2.2 - 2) / 1 = 0.2. This is reasonable.

Bad example: Consider two lines A and B. In the base case, A carries 2 MW and B carries 0.02 MW. After the outage of B, A carries 2.2 MW.
The line outage distribution factor is LODF = (2.2 - 2) / 0.02 = 10.

This is insane, because for a different distribution of flows, where in the base case A carries 3 MW and B carries 2 MW, the estimated post-outage flow would be:

Contingency flow A = 3 + 10 * 2 = 23 MW.

This implies that the 2 MW that "failed" from B turn into 20 extra MW on A. One might argue that this is somehow actually ok in a linear world. But if you do a simple test with an IEEE grid, and check the results against a Newton-Raphson solution with the outage applied by hand, you'll see that the results do not make any sense. Yet we are using these methods everywhere.
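To make the arithmetic above concrete, here is a minimal sketch (function name and filler numbers are mine) of how a LODF column is applied in contingency screening, using the usual convention that the post-outage flow is the base flow plus LODF times the pre-outage flow on the outaged branch:

```python
import numpy as np

def contingency_flows(f_base, lodf, k):
    """Linear estimate of branch flows after the outage of branch k.

    f_base : base-case branch flows (MW)
    lodf   : LODF matrix, lodf[l, k] = flow change on l per MW on outaged k
    """
    f_post = f_base + lodf[:, k] * f_base[k]
    f_post[k] = 0.0  # the outaged branch carries nothing
    return f_post

# Toy numbers from the example: branch 0 = A, branch 1 = B.
# lodf[0, 1] = 10 is the pathological factor obtained when B carried
# only 0.02 MW in the snapshot used to derive it; lodf[1, 0] = 0.1 is
# an arbitrary filler value for the unused column.
lodf = np.array([[-1.0, 10.0],
                 [ 0.1, -1.0]])
f_base = np.array([3.0, 2.0])

print(contingency_flows(f_base, lodf, k=1))  # A jumps from 3 MW to 23 MW
```

A one-column matrix-vector product per outage is exactly what makes LODF screening so fast, and also why a single pathological factor silently corrupts every operating point it is applied to.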

Best regards,
Santiago
