It was discovered that test_main_system::test_when_context_created_then_contains_expected_number_of_plans() was taking > 15s to complete.
This is because it actually creates and attempts to connect to real devices, instead of mocked ones. The connection fails, but the error is caught and no error message is printed because the relevant logger is not enabled, so the test passes.
Ostensibly this test should verify that the plans are created as expected. For this to work, setup_context() needs to support a mock parameter that can be passed to the beamline device factories. It used to, but this was removed in DiamondLightSource/hyperion#1481, although this test never used it.
We should reinstate this override. The desired behaviour implied by #1481 does not happen (at least for ophyd_async) in any case, since ophyd_async's ensure_connected() always passes its parameters to device.connect() explicitly, which means the default implementation in the device will always be overridden in production.
In addition, the wait_for_connection parameter is completely ignored, as blueapi's with_dodal_module() unconditionally invokes utils.connect_devices().
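The "explicit arguments always override device defaults" point above can be illustrated with a minimal standalone sketch. This is not the real ophyd_async API; the class and function names here are purely illustrative of the calling pattern described:

```python
# Hypothetical sketch: a device can declare its own default for `mock` in
# connect(), but a caller that always passes `mock` explicitly (as the issue
# says ensure_connected() does) makes that default unreachable in production.

class Device:
    def connect(self, mock: bool = True, timeout: float = 10.0) -> str:
        # This default (mock=True) only applies when the caller omits the argument.
        return "mock" if mock else "real"

def ensure_connected(device: Device, mock: bool = False, timeout: float = 10.0) -> str:
    # Always forwards mock and timeout explicitly, so the device's own
    # default value for `mock` is never consulted.
    return device.connect(mock=mock, timeout=timeout)

d = Device()
print(d.connect())          # "mock" - the device's own default applies
print(ensure_connected(d))  # "real" - the caller's explicit value wins
```

This is why changing a default inside the device gives no protection once a wrapper in the call chain fills in every keyword argument itself.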
Acceptance Criteria
- setup_context() is rewritten so that it no longer promises behaviour that it does not actually provide
- Unit tests pass without delay
- Talk to @DominicOram to find out what the actual use case was for having mock devices on the beamline - do they still exist?
The use case for having a mock device on the beamline is e.g.:
- A slit set fails in the middle of the night
- The slits are left in the correct position, so there is no scientific reason the beamline cannot run, but they can no longer be moved or read
- We need a quick way of saying in the DAQ layer "replace this device with something that works enough to run"
We have been doing this so far by replacing the default in the device instantiation to be fake_with_ophyd_sim: bool = True
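The workaround described above (flipping the factory default to a simulated device) can be sketched as follows. The Slits class and slits() factory here are illustrative stand-ins, not the real dodal definitions:

```python
# Hypothetical sketch of the current workaround: when real hardware fails but
# the beamline can still run, the factory default is flipped so the DAQ layer
# substitutes a simulated device.

from dataclasses import dataclass

@dataclass
class Slits:
    name: str
    simulated: bool

def slits(fake_with_ophyd_sim: bool = True) -> Slits:
    # Changing this default from False to True is the "quick fix" applied in
    # the device instantiation; every caller that doesn't override it now
    # gets a simulated device.
    return Slits(name="slits", simulated=fake_with_ophyd_sim)
```

The weakness, as noted below, is that any caller passing the flag explicitly silently stomps on this default.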
The issue with the code in #1481 was that supplying setup_context(fake_with_ophyd_sim=False) overrode the default in the device factory itself. One possible solution is for setup_context() to instead take a parameter like override_and_mock_all, which, if true, makes everything a mock and, if false, falls back to the device default. We would then need to change this behaviour in https://github.com/DiamondLightSource/blueapi/blob/5d8524eeb5ad4de9a80a81c28afc33e17f656086/src/blueapi/core/context.py#L108
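The proposed semantics can be sketched as a small standalone example. The override_and_mock_all name comes from the suggestion above; make_device() and the fake_with_ophyd_sim keyword are illustrative assumptions, not the real blueapi implementation:

```python
# Hypothetical sketch of the proposed behaviour: True forces a mock for every
# device; False defers to whatever default each device factory declares.

from typing import Any, Callable

def make_device(
    factory: Callable[..., dict[str, Any]],
    override_and_mock_all: bool = False,
) -> dict[str, Any]:
    if override_and_mock_all:
        # Force the mock regardless of the factory's own default.
        return factory(fake_with_ophyd_sim=True)
    # Fall back to the factory's declared default (which a beamline may have
    # flipped to True for a broken device).
    return factory()

def slits(fake_with_ophyd_sim: bool = False) -> dict[str, Any]:
    # Illustrative factory whose default is "connect to real hardware".
    return {"name": "slits", "mock": fake_with_ophyd_sim}

print(make_device(slits)["mock"])                             # False
print(make_device(slits, override_and_mock_all=True)["mock"])  # True
```

This keeps the two concerns separate: tests can mock everything wholesale, while per-device beamline workarounds survive because False no longer clobbers the factory default.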