
parse_file and compute_mc_pf fail on EPRI Ckt24 [BUG] #455

Open
John-Boik opened this issue Feb 9, 2024 · 10 comments
Labels
wontfix This will not be worked on

Comments

@John-Boik
Contributor

John-Boik commented Feb 9, 2024

I'm trying to parse the dss files for EPRI Ckt24 and run compute_mc_pf, to compare the output with OpenDSSDirect.jl. OpenDSSDirect.jl solves this circuit successfully. I can also run parse_file successfully, although PMD emits a long list of messages. But compute_mc_pf fails on the parsed result. I am new to OpenDSS and PMD, so I'm not sure how to address the problem, or even what the problem is, and I'm not sure whether it is a PMD bug. Any assistance would be appreciated.

In order to run parse_file successfully with PMD, I had to make several changes to the dss files downloaded from OpenDSS, including:

  • replace all tabs with spaces, which I did in several dss files
  • change model=4 to model=1 for almost all loads
  • rename some of the dss files to lower case

After those changes, running data_eng = PMD.parse_file("master_ckt24.dss", transformations=[PMD.transform_loops!]) produces a few (I assume) harmless warnings, like:

- [ PowerModelsDistribution | Warning ] : PowerModelsDistribution.DssEnergymeter has no field `phasevolt`, skipping...
- [ PowerModelsDistribution | Warning ] : PowerModelsDistribution.DssMonitor has no field `residual`, skipping...
- [ PowerModelsDistribution | Warning ] : PowerModelsDistribution.DssMonitor has no field `ppolar`, skipping...

Then PMD spits out 3,638 INFO statements, all similar to:

- An inconsistent number of nodes was specified on n292743; |1.2|!=3.
- An inconsistent number of nodes was specified on g2100fi2200_n292320_sec_6; |2|!=2.

These messages occur whether or not I include transformations=[PMD.transform_loops!] in the call.

The first example occurs in the Lines file as:

New Line.05410_108360UG bus1=N292743.1.2.3 bus2=N292740.1.2.3 linecode=UG_1/0_Al_XLPE_35kV phases=3 length=443.8276 units=ft
New Line.05410_93032UG bus1=N292744.1.2.3 bus2=N292743.1.2 linecode=UG_1/0_Al_XLPE_35kV_1_Phase phases=3 length=64.5949 units=ft
!New Line.05410_9433228UG bus1=N292745.3 bus2=N292743.3 linecode=UG_1/0_Al_XLPE_35kV_1_Phase phases=1 length=35.9813 units=ft

The second example occurs in the Loads file as:

New load.620350100 phases=1 kV=0.24 bus1=G2100FI2200_N292320_sec_6.2 xfkVA=1.24203074 pf=0.98 status=variable  model=1 CVRwatts=0.8 CVRvars=3 conn=wye Vminpu=0.7 yearly=LS_PhaseB  ! Phase 2

I could understand if a few nodes were somehow wrongly structured in the dss files, but 3,638 nodes sounds more like a PMD bug, or at least some consistent characteristic of the dss files that is incompatible with PMD.

When I run compute_mc_pf(data_eng), I get error messages:

ERROR: LoadError: BoundsError: attempt to access 2-element Vector{Int64} at index [3]
Stacktrace:
  [1] getindex
    @ ./essentials.jl:13 [inlined]
  [2] #3448
    @ ./none:0 [inlined]
  [3] iterate
    @ ./generator.jl:47 [inlined]
  [4] collect_to!
    @ ./array.jl:840 [inlined]
  [5] collect_to_with_first!
    @ ./array.jl:818 [inlined]
  [6] collect(itr::Base.Generator{Base.OneTo{Int64}, PowerModelsDistribution.var"#3448#3465"{Vector{Int64}, Int64, Vector{Int64}, Int64}})
    @ Base ./array.jl:792
  [7] calc_start_voltage(data_math::Dict{String, Any}; max_iter::Float64, epsilon::Int64, explicit_neutral::Bool)
    @ PowerModelsDistribution ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/initialization.jl:33
  [8] calc_start_voltage
    @ ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/initialization.jl:11 [inlined]
  [9] add_start_voltage!(data_math::Dict{String, Any}; coordinates::Symbol, uniform_v_start::Missing, vr_default::Float64, vi_default::Float64, vm_default::Float64, va_default::Float64, epsilon::Int64, explicit_neutral::Bool)
    @ PowerModelsDistribution ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/initialization.jl:238
 [10] add_start_voltage!

The same error occurs if I try different variations of compute_mc_pf, such as compute_mc_pf(data_eng, explicit_neutral=true), or if I use the functions from the Native Power Flow Solver example:

data_eng["is_kron_reduced"] = true
data_eng["settings"]["sbase_default"] = 1
data_math = PMD.transform_data_model(data_eng; kron_reduce=false, phase_project=false);
sourcebus_voltage_vector_correction!(data_math, explicit_neutral=false);
update_math_model_3wire!(data_math);
res = PMD.compute_mc_pf(data_math; explicit_neutral=false, max_iter=100)

I'm guessing that the errors from compute_mc_pf are related to all the INFO messages from parse_file.

Any help would be appreciated.

@frederikgeth
Collaborator

Hi! I'm not going to be able to help with finding the root cause in the short term, but I may be able to point you to workarounds. The PMD parser for OpenDSS is limited in scope, i.e. it doesn't support all of the OpenDSS spec. Furthermore, OpenDSS's data model is ambiguous, which creates parsing challenges (you don't know whether the neutral in impedance matrices has been Kron reduced or not; see the sketch below). On p. 18 of https://arxiv.org/pdf/2305.04405.pdf you can see which test cases we've been able to run with the native solver. What are you trying to achieve?
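
A minimal sketch of the standard Kron reduction formula, to show why an already-reduced matrix is indistinguishable from a genuinely three-wire one (illustrative only, not PMD's implementation):

# Standard Kron reduction: eliminate the neutral row/column n of a primitive
# impedance matrix Z, folding its effect into the remaining phase conductors.
# A 4x4 four-wire matrix reduces to a 3x3 that looks just like a three-wire model.
function kron_reduce(Z::AbstractMatrix, n::Int=size(Z, 1))
    p = setdiff(1:size(Z, 1), n)
    Z[p, p] .- Z[p, [n]] * Z[[n], p] ./ Z[n, n]
end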

@John-Boik
Contributor Author

John-Boik commented Feb 9, 2024

Thanks @frederikgeth. I am simply trying to run a power flow analysis in PMD using a circuit that colleagues have previously used within OpenDSS. I'm trying to get a feel for the capabilities of PMD, and how practical it might be (now or in the future) in the industrial setting. Your reply raises a few more questions/concerns.

First, comparing PMD and OpenDSS runtimes in Table 4 of the reference you cited, to what do you attribute the large differences in runtime for the two large circuits (61-fold and 165-fold differences)? It sounds as if the OpenDSS and PMD power flow algorithms are similar. Also, it appears that for the two large circuits, PMD has a nearly constant total runtime, even though one circuit is about four times larger than the other. In contrast, the OpenDSS timings show an approximate four-fold difference. Would you expect that for even larger circuits, say 10-fold larger, PMD runtimes would remain similar to those in the paper (~20 seconds), while OpenDSS runtimes would continue to scale linearly? That is, is there likely to be a circuit size where runtimes between OpenDSS and PMD are similar?

Second, are there any guidelines or rules to follow when constructing OpenDSS files for eventual parsing by PMD? If not, then the utility of first creating circuits in the OpenDSS format (which PMD recommends) might be limited. One might spend considerable time creating a large circuit, only to learn later that it can't be parsed by PMD. For this reason, perhaps one should create circuits from scratch in PMD itself, or use some backbone created in the OpenDSS format and then complete it in PMD. It sounds like there are fundamental incompatibilities if the OpenDSS format is ambiguous in certain respects. Is there a typical set of steps to go through when trying to alter DSS files for successful PMD parsing?

Similarly, it would appear that the ability to use both tools together in the same project is limited. A circuit created in OpenDSS format might not parse into PMD, and a circuit created or altered in PMD cannot be exported into OpenDSS format (correct?). Yet, the literature suggests that PMD is not trying to replicate many of the capabilities of OpenDSS, so there might be cause to use both tools together. And there might be legacy code available in OpenDSS format for a given circuit. Can you offer any suggestions (for now or in the future) as to how one might approach the parsing issue and potential tool interoperability?

Are there plans to allow circuit export into the OpenDSS format, and are there plans to develop guidance on constructing a (large) circuit in the OpenDSS format that would parse into PMD? If so, then perhaps many of the concerns I raise might be addressed. Or, is there a better way to be thinking about OpenDSS-PMD incompatibilities?

@John-Boik
Contributor Author

John-Boik commented Feb 10, 2024

For anyone looking at the parsing issues with Ckt24 in the future, I've made some slight progress in narrowing down problems. To run OPF, I am adapting the code in the "Introduction to PowerModelsDistribution" tutorial. As an aside, this same approach worked out of the box for IEEE123, using script 1 of "Run_IEEE123Bus.DSS".

My code to run the problem (after various import statements) is:

filename = joinpath(@__DIR__, "master_ckt24.dss")
data_eng = PMD.parse_file(
    filename,
    transformations=[PMD.transform_loops!]
)

PMD.remove_all_bounds!(data_eng)
PMD.make_lossless!(data_eng)
PMD.apply_voltage_bounds!(data_eng; vm_lb=0.9, vm_ub=1.1)
PMD.apply_voltage_angle_difference_bounds!(data_eng, 1)

data_math = PMD.transform_data_model(  # <-- current error occurs here
    data_eng;
)

ipopt_solver = PMD.optimizer_with_attributes(Ipopt.Optimizer, "tol"=>1e-6, "print_level"=>1)
res2 = PMD.solve_mc_pf(
    data_math,
    PMD.ACPUPowerModel,
    ipopt_solver;
)

As I mentioned in the previous post, parsing the file results in several thousand INFO messages. These appear to come from loads on buses that are connected to nothing else in the dss files. For example, for bus1 in

New load.965009000 phases=1 kV=0.24 bus1=G2101AC9800_N283938_sec_4.3 xfkVA=6.22605363 pf=0.98 status=variable  model=1 CVRwatts=0.8 CVRvars=3 conn=wye Vminpu=0.7 yearly=LS_PhaseC  ! Phase 3

there is only this one place where bus g2101ac9800_n283938_sec_4 is mentioned in the dss files. I'm guessing that OpenDSS might just ignore these loads, but I could be wrong; maybe it connects them to the rest of the circuit in some fashion. It's not clear to me whether OpenDSS and PMD are supposed to ignore such loads.
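
As a rough check of the load-only-bus hypothesis, something like this sketch (hypothetical, untested on Ckt24; it ignores transformer buses=[...] lists) would count how often each bus name is referenced across the dss files:

# Count bus name references across the dss files in the current directory.
# Matches bus= / bus1= / bus2= specs; terminals after the first '.' are dropped.
counts = Dict{String,Int}()
for f in filter(endswith(".dss"), readdir(join=true)), line in eachline(f)
    for m in eachmatch(r"bus\d?=([^\s.]+)"i, line)
        bus = lowercase(m.captures[1])
        counts[bus] = get(counts, bus, 0) + 1
    end
end
println(count(==(1), values(counts)), " buses are referenced exactly once")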

PMD does read in a large number of loads and buses without emitting the INFO message. Here is an example of one that does trigger it:

An inconsistent number of nodes was specified on g2101ac9800_n283938_sec_4; |3|!=2.

Looking further at this example, the issue comes from _get_conductors_ordered at about line 349 of src/io/utils.jl. That code checks whether length(default) != length(ret), and in this example the two lengths differ:

busname= g2101ac9800_n283938_sec_4.3, parts= SubString{String}["g2101ac9800_n283938_sec_4", "3"], ret= [3], default= [1, 2]
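
In other words, the check amounts to something like this (a hypothetical simplification, not the actual _get_conductors_ordered code):

# Simplified view of the check: terminals given on the bus spec ("name.1.2")
# are parsed and compared against the default ordering implied by the element.
function conductors_ordered(busname::String, default::Vector{Int})
    parts = split(busname, '.')
    ret = length(parts) > 1 ? parse.(Int, parts[2:end]) : default
    if length(ret) != length(default)
        @info "An inconsistent number of nodes was specified on $(parts[1]); |$(join(ret, '.'))|!=$(length(default))."
    end
    return ret
end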

Now, regarding the actual error from running transform_data_model, the message is:

ERROR: LoadError: DimensionMismatch: arrays could not be broadcast to a common size; got a dimension with lengths 3 and 2
Stacktrace:
  [1] _bcs1
    @ ./broadcast.jl:529 [inlined]
  [2] _bcs
    @ ./broadcast.jl:523 [inlined]
  [3] broadcast_shape
    @ ./broadcast.jl:517 [inlined]
  [4] combine_axes
    @ ./broadcast.jl:512 [inlined]
  [5] instantiate
    @ ./broadcast.jl:294 [inlined]
  [6] materialize(bc::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Nothing, typeof(==), Tuple{Vector{Int64}, Vector{Int64}}})
    @ Base.Broadcast ./broadcast.jl:873
  [7] _apply_kron_reduction!(data_eng::Dict{String, Any}; kr_phases::Vector{Int64}, kr_neutral::Int64)
    @ PowerModelsDistribution ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/kron.jl:40
  [8] apply_pmd!(func!::typeof(PowerModelsDistribution._apply_kron_reduction!), data::Dict{String, Any}; apply_to_subnetworks::Bool, kwargs::Base.Pairs{Symbol, Any, Tuple{Symbol, Symbol}, NamedTuple{(:kr_phases, :kr_neutral), Tuple{Vector{Int64}, Int64}}})
    @ PowerModelsDistribution ~/.julia/packages/PowerModelsDistribution/rmSeP/src/core/data.jl:58
  [9] apply_pmd!
    @ ~/.julia/packages/PowerModelsDistribution/rmSeP/src/core/data.jl:50 [inlined]
 [10] #apply_kron_reduction!#3429
    @ ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/kron.jl:9 [inlined]
 [11] apply_kron_reduction!(data::Dict{String, Any})
    @ PowerModelsDistribution ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/kron.jl:6
 [12] _map_eng2math(data_eng::Dict{String, Any}; kron_reduce::Bool, phase_project::Bool, eng2math_extensions::Vector{Function}, eng2math_passthrough::Dict{String, Vector{String}}, global_keys::Set{String})
    @ PowerModelsDistribution ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/eng2math.jl:188
 [13] _map_eng2math
    @ ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/eng2math.jl:174 [inlined]
 [14] transform_data_model(data::Dict{String, Any}; kron_reduce::Bool, phase_project::Bool, multinetwork::Bool, global_keys::Set{String}, eng2math_passthrough::Dict{String, Vector{String}}, eng2math_extensions::Vector{Function}, make_pu::Bool, make_pu_extensions::Vector{Function}, correct_network_data::Bool)
    @ PowerModelsDistribution ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/eng2math.jl:152
 [15] transform_data_model
    @ ~/.julia/packages/PowerModelsDistribution/rmSeP/src/data_model/transformations/eng2math.jl:132 [inlined]

Looking deeper at src/data_model/transformations/kron.jl:40, numerous circuit lines run without error. The first error is for:

eng_obj= Dict{String, Any}("length" => 19.68852552, "t_connections" => [1, 2], "f_bus" => "n292744", "source_id" => "line.05410_93032ug", "vad_ub" => [1, 1, 1], "status" => PowerModelsDistribution.ENABLED, "t_bus" => "n292743", "linecode" => "ug_1/0_al_xlpe_35kv_1_phase", "f_connections" => [1, 2, 3], "vad_lb" => [-1, -1, -1])
eng_obj[f_connections]= [1, 2, 3]
eng_obj[t_connections]= [1, 2]

The issue here is that the assert test all(eng_obj["f_connections"] .== eng_obj["t_connections"]) cannot even be evaluated, because the two vectors have different lengths. I think the assert should be changed to give a more useful message: perhaps two asserts, one checking that the vectors have the same length, and a second checking that f_connections equals t_connections, as sketched below.
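
Something like this, for instance (a hypothetical replacement, not a patch I've tested):

# Hypothetical split of the single assert in _apply_kron_reduction! (kron.jl:40),
# so mismatched lengths produce a clear message instead of a DimensionMismatch
# raised deep inside broadcasting.
f_conn = eng_obj["f_connections"]
t_conn = eng_obj["t_connections"]
@assert(length(f_conn) == length(t_conn),
    "line $(eng_obj["source_id"]): f_connections $(f_conn) and t_connections $(t_conn) differ in length")
@assert(all(f_conn .== t_conn),
    "line $(eng_obj["source_id"]): f_connections $(f_conn) do not match t_connections $(t_conn)")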

I'm not sure what to try from here. I don't understand why the two connection vectors have different lengths. I also tried calling the function several ways, e.g., with kron_reduce set to true or false, and tried setting data_eng["is_kron_reduced"] to true or false, but each of these attempts produced a different, equally inscrutable error (inscrutable to me, at least).

If anyone has suggestions as to what to try next, or where the problem might lie, I would appreciate the help.

@frederikgeth
Collaborator

Here's some quick high-level feedback

  • PMD is a collaborative, community-driven, research-focused toolbox, with a strong focus on solving state-of-the-art mathematical optimization problems (unbalanced optimal power flow). Simulation problems (unbalanced power flow) can also be solved, but the project is not state-of-the-art there.
  • PMD's built-in power flow solver follows OpenDSS's algorithm pretty closely; however, as the report discusses, there are still a couple of missing features needed to make it comparable in terms of performance. The most crucial one is the missing load model relaxation, which is needed for numerical robustness and convergence for bigger networks with congestion. The differences in Table 4 of https://arxiv.org/pdf/2305.04405.pdf are due to the native solver going up to 100 iterations and then bailing. In terms of transformer modeling, the Julia implementation is also expected to be slightly slower than OpenDSS due to the decomposition of the transformer into primitives. I expect simulation runtimes to be very similar, though, once the load model relaxation is added.
  • More reliable, structured parsing could be done through OpenDSSDirect: you can read the case into OpenDSSDirect, interrogate the objects in memory, and project them onto the PMD data model in a mostly 1:1 fashion (rough sketch below).
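
A rough sketch of that workflow with OpenDSSDirect.jl (the field mapping here is illustrative, not a complete 1:1 projection):

using OpenDSSDirect

# Compile the circuit, then walk the in-memory line objects and project them
# onto a PMD-style engineering dictionary.
dss("compile master_ckt24.dss")
eng_lines = Dict{String,Any}()
for name in Lines.AllNames()
    Lines.Name(name)  # make this the active line
    eng_lines[name] = Dict{String,Any}(
        "f_bus"    => lowercase(first(split(Lines.Bus1(), '.'))),
        "t_bus"    => lowercase(first(split(Lines.Bus2(), '.'))),
        "length"   => Lines.Length(),
        "n_phases" => Lines.Phases(),
    )
end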

@John-Boik
Contributor Author

Thanks @frederikgeth. That helped.

@hei06j
Contributor

hei06j commented Feb 19, 2024

@John-Boik @frederikgeth I had a closer look at the Ckt24 data, and here is where I am so far. The problem occurs when transforming the engineering data to the mathematical data model:

eng = parse_file(case_file) # transformations=[transform_loops!, remove_unconnected_terminals!, delete_trailing_lines!])
                            # You can do these transformations, but they won't make any difference in this case

math = transform_data_model(eng)

where we get this error

ERROR: BoundsError: attempt to access 1-element Vector{Float64} at index [[1, 2, 3]]

I have found these two issues so far:

  • there are a few linecodes whose rs, xs, ... matrix sizes don't match the corresponding line connections.
  • there is one line whose f_connections and t_connections vectors have mismatched lengths (which you had found as well):
lines_with_rxmatrix_size_smaller_than_connections = [(l,line["linecode"])  for (l,line) in eng["line"] if (haskey(line, "linecode") && size(eng["linecode"]["$(line["linecode"])"]["rs"])[1]<length(line["f_connections"])) ]
problematic_linecode_small = unique(last.(lines_with_rxmatrix_size_smaller_than_connections))

lines_with_rxmatrix_size_larger_than_connections = [(l,line["linecode"]) for (l,line) in eng["line"] if (haskey(line, "linecode") && size(eng["linecode"]["$(line["linecode"])"]["rs"])[1]>length(line["f_connections"])) ]
problematic_linecode_large = unique(last.(lines_with_rxmatrix_size_larger_than_connections))

The linecodes with a larger matrix size are not problematic, so we can ignore them. But the ones with a smaller matrix size need to be fixed:

using LinearAlgebra
function make_diag!(obj)
    obj["rs"] = diagm([1,1,1].*obj["rs"][1,1])
    obj["xs"] = diagm([1,1,1].*obj["xs"][1,1])
    obj["g_to"] = diagm([1,1,1].*obj["g_to"][1,1])
    obj["b_to"] = diagm([1,1,1].*obj["b_to"][1,1])
    obj["g_fr"] = diagm([1,1,1].*obj["g_fr"][1,1])
    obj["b_fr"] = diagm([1,1,1].*obj["b_fr"][1,1])
    obj["cm_ub"] = [1,1,1].*obj["cm_ub"][1]
end


make_diag!(eng["linecode"]["ug_1/0_al_xlpe_35kv_1_phase"])
for line in first.(lines_with_rxmatrix_size_smaller_than_connections)
    if haskey(eng["line"][line], "rs")
        make_diag!(eng["line"][line])
    end
end

The second issue, a specific branch with mismatched from and to connection vectors, is also easily fixed:

line = [l for (l,line) in eng["line"] if line["f_connections"] != line["t_connections"]]
eng["line"][line[1]]
eng["line"][line[1]]["t_connections"] = eng["line"][line[1]]["f_connections"]

With these fixes, the previous error is gone. Running math = transform_data_model(eng) now gives this new error:

ERROR: AssertionError: A delta configuration has to have at least 2 or 3 connections!

This error arises because most of the loads are set to the DELTA configuration but are single phase:

loads_delta_1connection = [d for (d,load) in eng["load"] if (load["configuration"]==DELTA && length(load["connections"])<2)]

This error seems to be rooted in PMD's dss2eng.jl, where the load configuration is set up; I can see that there is a TODO note for better generalization:

conf = nphases==1 && dss_obj["kv"]==0.24 ? DELTA : dss_obj["conn"] # check if load is connected between split-phase terminals of triplex node (nominal line-line voltage=240V), TODO: better generalization

This line seems to contradict the line that throws the error:
@assert(length(connections) in [2, 3], "A delta configuration has to have at least 2 or 3 connections!")
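
To illustrate the contradiction with hypothetical values: a single-phase 240 V load specified with one terminal gets conf = DELTA from the heuristic, and then trips the assert:

# Hypothetical illustration: the dss2eng heuristic marks any single-phase 240 V
# load as DELTA, but a load given a single terminal on its bus spec
# (e.g. bus1=...sec_4.3) has length(connections) == 1, which the assert rejects.
nphases, kv = 1, 0.24
conf = (nphases == 1 && kv == 0.24) ? :DELTA : :WYE  # the dss2eng.jl heuristic
connections = [3]  # a single terminal, as in load.965009000 above
@assert(length(connections) in [2, 3], "A delta configuration has to have at least 2 or 3 connections!")
# -> AssertionError, matching the error reported above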

@PMeira

PMeira commented Feb 22, 2024

Hi, I stumbled on this issue through a GitHub search.

Furthermore, OpenDSS's data model is ambiguous which creates parsing challenges (you don't know if the neutral in impedance matrices has been Kron reduced or not).

(@frederikgeth)

We recently added a new function OpenDSSDirect.Circuit.Save to allow tweaking the previous save circuit output. It could be useful in some scenarios. There are some examples and notes on ODD.py: https://dss-extensions.org/OpenDSSDirect.py/notebooks/Saving.html
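
(For the command-level equivalent from OpenDSSDirect.jl, something like the classic save should work; the dir= option here is an assumption:)

using OpenDSSDirect
dss("compile master_ckt24.dss")
dss("save circuit dir=ckt24_saved")  # classic 'save circuit' command; dir= assumed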

For a better approach, we're developing a JSON Schema to formalize alternative circuit I/O -- this is what the current Circuit.FromJSON/ToJSON functions support. It should allow unambiguous interpretation, initially through JSON but other formats later. There's a lot to document though, and we started a repo at AltDSS-Schema. Would something like this benefit PMD?

As an aside, this same approach worked out of the box for IEEE123,

(@John-Boik)

Probably not a major issue here, but note that that repo is a personal repo and has not been updated in several years. The official repo is hosted on SourceForge, and we (DSS-Extensions) keep a copy with some changes (e.g. adjust paths, add some tests) at https://github.com/dss-extensions/electricdss-tst

A couple test circuits did get updates in this period, including that one: https://github.com/dss-extensions/electricdss-tst/tree/master/Version8/Distrib/IEEETestCases/123Bus

@John-Boik
Contributor Author

Hi @PMeira. Sorry for the slow reply, but thanks much for mentioning the ToJSON functions. The JSON output does indeed look useful for parsing OpenDSS circuits into PowerModelsDistribution. I just want to check two things with you:

  1. The JSON Schema project is ongoing, correct? It looks like the last commits were six months ago.
  2. The Circuit.ToJSON functions in the current Julia/Python versions of OpenDSSDirect already incorporate the schema listed at AltDSS-Schema, correct?

Also, thanks for the info on the IEEE123 circuit.


@PMeira

PMeira commented Jun 19, 2024

The JSON Schema project is ongoing, correct?

@John-Boik
Currently, the repo AltDSS-Schema is for extra/support code, docs and general tracking. I'm updating it for the next release, but it's mostly changes in the Pydantic code, for the time being. The basic code is integrated in the engine -- the JSON schema is derived from the internal schema (which is hosted in the DSS C-API repo); we'll most likely reverse that someday (keep the JSON schema as the source of truth), but no specific plans yet.

The FromJSON/ToJSON functions pass roundtrip tests for well-behaved data (including many of the test circuits). I imagine you saw the tickets on AltDSS-Schema related to the default values -- I'm finishing that, coupled to the next release of the engine. Besides that, the basics for the components are done and stable (as much as OpenDSS is stable). The JSON schema spec for the command system is ongoing (tied to a refactoring of the command system).

It looks like the last commits were six months ago.

Part of it is in the DSS C-API repo and part in the AltDSS-Schema repo. I'll try to make that clearer next time I update the docs. And we (e.g. the team at my university) typically have some private branches and tickets that sooner or later land on the public repos.


stale bot commented Dec 18, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the wontfix This will not be worked on label Dec 18, 2024