The secondary losses computed through `input_multi_data` are wrong. Here are some examples:
```python
import pyfair

model1 = pyfair.FairModel(name="Insider Threat", n_simulations=10)
model1.input_multi_data('Secondary Loss', {
    'Reputational': {
        'Secondary Loss Event Frequency': {'constant': 1},
        'Secondary Loss Event Magnitude': {'constant': 10},
    },
    'Legal': {
        'Secondary Loss Event Frequency': {'constant': 1},
        'Secondary Loss Event Magnitude': {'constant': 10},
    },
})
```
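For clarity, the aggregation one would expect here is each loss type's frequency multiplied by its magnitude, then summed across loss types. A plain-Python sketch of that arithmetic (the dictionary shape below is only illustrative, not a pyfair structure):

```python
# Expected aggregation: frequency * magnitude per loss type, summed across types.
expected_inputs = {
    'Reputational': {'frequency': 1, 'magnitude': 10},
    'Legal': {'frequency': 1, 'magnitude': 10},
}
expected_secondary_loss = sum(
    v['frequency'] * v['magnitude'] for v in expected_inputs.values()
)
print(expected_secondary_loss)  # 20
```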
In this case, the secondary loss should be 20 (10 x 1 + 10 x 1) for all simulations. However, all elements of `model1._model_table["Secondary Loss"]` are equal to 101. If one sets all the frequencies to 0:
```python
model2 = pyfair.FairModel(name="Insider Threat", n_simulations=10)
model2.input_multi_data('Secondary Loss', {
    'Reputational': {
        'Secondary Loss Event Frequency': {'constant': 0},
        'Secondary Loss Event Magnitude': {'constant': 10},
    },
    'Legal': {
        'Secondary Loss Event Frequency': {'constant': 0},
        'Secondary Loss Event Magnitude': {'constant': 10},
    },
})
```
All elements of `model2._model_table["Secondary Loss"]` are equal to 100 instead of 0. Furthermore, an error is raised if one uses more than two loss types:
```python
model3 = pyfair.FairModel(name="Insider Threat", n_simulations=10)
model3.input_multi_data('Secondary Loss', {
    'Reputational': {
        'Secondary Loss Event Frequency': {'constant': 1},
        'Secondary Loss Event Magnitude': {'constant': 10},
    },
    'Legal': {
        'Secondary Loss Event Frequency': {'constant': 1},
        'Secondary Loss Event Magnitude': {'constant': 10},
    },
    'Response': {
        'Secondary Loss Event Frequency': {'constant': 1},
        'Secondary Loss Event Magnitude': {'constant': 10},
    },
})
```
The result is:
```
in FairModel.input_multi_data(self, target, kwargs_dict)
    286 """Input data for multiple items that roll up into an aggregate
    287
    288 As of now, this is only used for Secondary Loss when calculating
    (...)
...
--> 258 df1, df2 = df_dict.values()
    259 combined_df = df1 * df2
    260 # Sum
ValueError: too many values to unpack (expected 2)
```
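From the traceback excerpt, the aggregation appears to unpack exactly two per-loss-type tables (`df1, df2 = df_dict.values()`), which would explain the `ValueError` with three loss types, and then multiplies those two tables element-wise. With the inputs above, that element-wise product followed by a sum would give 1 x 1 + 10 x 10 = 101 in the first example and 0 x 0 + 10 x 10 = 100 in the second, matching the observed values. For illustration, a minimal pandas sketch of the aggregation these examples expect (the column names mirror the node names above but are otherwise assumed; this is not pyfair's internal code):

```python
import pandas as pd

# Illustrative per-loss-type tables, one row per simulation.
df_dict = {
    'Reputational': pd.DataFrame({
        'Secondary Loss Event Frequency': [1] * 10,
        'Secondary Loss Event Magnitude': [10] * 10,
    }),
    'Legal': pd.DataFrame({
        'Secondary Loss Event Frequency': [1] * 10,
        'Secondary Loss Event Magnitude': [10] * 10,
    }),
    'Response': pd.DataFrame({
        'Secondary Loss Event Frequency': [1] * 10,
        'Secondary Loss Event Magnitude': [10] * 10,
    }),
}

# Multiply frequency by magnitude within each loss type, then sum the
# resulting per-simulation losses across all loss types.
secondary_loss = sum(
    df['Secondary Loss Event Frequency'] * df['Secondary Loss Event Magnitude']
    for df in df_dict.values()
)
print(secondary_loss)  # 30 for every simulation with these inputs
```

Computing frequency times magnitude within each loss type before summing keeps the logic independent of how many loss types are supplied.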
arognoni pushed a commit to arognoni/pyfair-secondary-loss that referenced this issue on Sep 11, 2022.