(only) add more unit tests #76
Conversation
Force-pushed from 24704d7 to e53a4e0
Signed-off-by: Fabrice Normandin <[email protected]>
Force-pushed from e53a4e0 to 2aba529
tests/cli/test_remote.py (outdated)
@dont_run_for_real
@disable_internet_access
def test_ensure_allocation_without_persist(self, mock_connection: Connection):
    alloc = ["--time=00:01:00"]
    transforms = [some_transform]
    remote = SlurmRemote(
        mock_connection, alloc=alloc, transforms=transforms, persist=False
    )

    def write_stuff(
        command: str,
        asynchronous: bool,
        hide: bool,
        pty: bool,
        out_stream: QueueIO,
    ):
        # Check that the expected salloc command was run, simulate salloc
        # printing the allocated node, then fall back to the mock's
        # configured return value.
        assert command == f"bash -c 'salloc {shjoin(alloc)}'"
        out_stream.write("salloc: Nodes bob-123 are ready for job")
        return unittest.mock.DEFAULT

    mock_connection.run.side_effect = write_stuff
    results, runner = remote.ensure_allocation()

    mock_connection.run.assert_called_once_with(
        f"bash -c 'salloc {shjoin(alloc)}'",
        hide=False,
        asynchronous=True,
        out_stream=unittest.mock.ANY,
        pty=True,
    )
    assert results == {"node_name": "bob-123"}
    # raise NotImplementedError("TODO: Important and potentially complicated test")
Maybe test for real?
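A rough sketch of what such a real test could look like (everything here is an assumption for illustration: the SLURM_CLUSTER_HOST environment variable and the import path are hypothetical, and running this would need actual cluster access):

import os

import pytest
from fabric import Connection

from milatools.cli.remote import SlurmRemote  # assumed import path


@pytest.mark.skipif(
    "SLURM_CLUSTER_HOST" not in os.environ,
    reason="Needs SSH access to a real SLURM cluster.",
)
def test_ensure_allocation_for_real():
    # Hypothetical opt-in integration test, run e.g. with:
    #   SLURM_CLUSTER_HOST=mila pytest tests/cli/test_remote.py
    remote = SlurmRemote(
        Connection(os.environ["SLURM_CLUSTER_HOST"]),
        alloc=["--time=00:01:00"],
        transforms=[],
        persist=False,
    )
    results, runner = remote.ensure_allocation()
    # With a real connection, the node name is parsed from actual salloc output.
    assert "node_name" in results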
Signed-off-by: Fabrice Normandin <[email protected]>
requires_s_flag = pytest.mark.skipif(
    "-s" not in sys.argv,
    reason=(
        "Seems to require reading from stdin? Works with the -s flag, but other "
        "tests might not."
    ),
)


requires_no_s_flag = pytest.mark.skipif(
    "-s" in sys.argv,
    reason="Passing pytest's -s flag makes this test fail.",
)
If the tests fail because of ANSI escape sequences being printed, maybe we could just add the escape sequences to the expected output instead? And force the -s flag in the .toml? https://docs.pytest.org/en/7.1.x/reference/customize.html
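For reference, forcing the flag from the config file would presumably be a one-liner in pyproject.toml (a sketch, assuming the project configures pytest there rather than in a pytest.ini):

[tool.pytest.ini_options]
# Always disable output capturing, equivalent to passing -s on every run.
addopts = "-s"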
I agree, we could just add the -s flag by default to all tests and change the regression files. There's unfortunately an issue with that, which I don't fully understand at the moment.
When adding the -s flag by default and running all the tests, once it gets to test_remote.py, the tests crash with the message below.
However, running ONLY test_remote.py with -s works fine! That's why I currently run a portion of the tests without -s and another portion with -s: the suite gets invoked twice, and the skipif markers above deselect the incompatible tests in each run.
I don't yet fully understand what's going on.
________________________________ test_remote_transform_methods[localhost-with_transforms-args0-initial_transforms0-echo OK] ________________________________
command_to_run = 'echo OK', initial_transforms = [], method = <function Remote.with_transforms at 0x7f66f6edff70>
args = (<function <lambda> at 0x7f66f5d6f4c0>, <function <lambda> at 0x7f66f5d6f550>), host = 'localhost', mock_connection = Connection('localhost')
file_regression = <pytest_regressions.file_regression.FileRegressionFixture object at 0x7f66ee148af0>
capsys = <_pytest.capture.CaptureFixture object at 0x7f66ee148c70>
@requires_s_flag
@pytest.mark.parametrize("command_to_run", ["echo OK"])
@pytest.mark.parametrize("initial_transforms", [[]])
@pytest.mark.parametrize(
("method", "args"),
[
(
Remote.with_transforms,
(
lambda cmd: cmd.replace("OK", "NOT_OK"),
lambda cmd: f"echo 'command before' && {cmd}",
),
),
(
Remote.wrap,
("echo 'echo wrap' && {}",),
),
(
Remote.with_precommand,
("echo 'echo precommand'",),
),
# This needs to be a file to source before running the command.
(Remote.with_profile, (".bashrc",)),
(Remote.with_bash, ()),
],
)
def test_remote_transform_methods(
command_to_run: str,
initial_transforms: list[Callable[[str], str]],
method: Callable,
args: tuple,
host: str,
mock_connection: Connection | Mock,
file_regression: FileRegressionFixture,
capsys: pytest.CaptureFixture,
):
"""Test the methods of `Remote` that modify the commands passed to `run` before it
gets passed to the connection and run on the server."""
mock_connection = mock_connection
r = Remote(
host,
connection=mock_connection,
transforms=initial_transforms,
)
# Call the method on the remote, which should return a new Remote.
modified_remote: Remote = method(r, *args)
assert modified_remote.hostname == r.hostname
assert modified_remote.connection is r.connection
> result = modified_remote.run(command_to_run)
tests/cli/test_remote.py:210:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
milatools/cli/remote.py:134: in run
return self._run(cmd, hide=hide, **kwargs)
milatools/cli/remote.py:117: in _run
return self.connection.run(cmd, **kwargs)
../../.conda/envs/milatools/lib/python3.9/unittest/mock.py:1092: in __call__
return self._mock_call(*args, **kwargs)
../../.conda/envs/milatools/lib/python3.9/unittest/mock.py:1096: in _mock_call
return self._execute_mock_call(*args, **kwargs)
../../.conda/envs/milatools/lib/python3.9/unittest/mock.py:1166: in _execute_mock_call
return self._mock_wraps(*args, **kwargs)
../../.conda/envs/milatools/lib/python3.9/site-packages/decorator.py:232: in fun
return caller(func, *(extras + args), **kw)
../../.conda/envs/milatools/lib/python3.9/site-packages/fabric/connection.py:23: in opens
return method(self, *args, **kwargs)
../../.conda/envs/milatools/lib/python3.9/site-packages/fabric/connection.py:763: in run
return self._run(self._remote_runner(), command, **kwargs)
../../.conda/envs/milatools/lib/python3.9/site-packages/invoke/context.py:113: in _run
return runner.run(command, **kwargs)
../../.conda/envs/milatools/lib/python3.9/site-packages/fabric/runners.py:83: in run
return super().run(command, **kwargs)
../../.conda/envs/milatools/lib/python3.9/site-packages/invoke/runners.py:395: in run
return self._run_body(command, **kwargs)
../../.conda/envs/milatools/lib/python3.9/site-packages/invoke/runners.py:451: in _run_body
return self.make_promise() if self._asynchronous else self._finish()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <fabric.runners.Remote object at 0x7f66ee14e460>
def _finish(self) -> "Result":
# Wait for subprocess to run, forwarding signals as we get them.
try:
while True:
try:
self.wait()
break # done waiting!
# Don't locally stop on ^C, only forward it:
# - if remote end really stops, we'll naturally stop after
# - if remote end does not stop (eg REPL, editor) we don't want
# to stop prematurely
except KeyboardInterrupt as e:
self.send_interrupt(e)
# TODO: honor other signals sent to our own process and
# transmit them to the subprocess before handling 'normally'.
# Make sure we tie off our worker threads, even if something exploded.
# Any exceptions that raised during self.wait() above will appear after
# this block.
finally:
# Inform stdin-mirroring worker to stop its eternal looping
self.program_finished.set()
# Join threads, storing inner exceptions, & set a timeout if
# necessary. (Segregate WatcherErrors as they are "anticipated
# errors" that want to show up at the end during creation of
# Failure objects.)
watcher_errors = []
thread_exceptions = []
for target, thread in self.threads.items():
thread.join(self._thread_join_timeout(target))
exception = thread.exception()
if exception is not None:
real = exception.value
if isinstance(real, WatcherError):
watcher_errors.append(real)
else:
thread_exceptions.append(exception)
# If any exceptions appeared inside the threads, raise them now as an
# aggregate exception object.
# NOTE: this is kept outside the 'finally' so that main-thread
# exceptions are raised before worker-thread exceptions; they're more
# likely to be Big Serious Problems.
if thread_exceptions:
> raise ThreadException(thread_exceptions)
E invoke.exceptions.ThreadException:
E Saw 1 exceptions within threads (ValueError):
E
E
E Thread args: {'kwargs': {'echo': None,
E 'input_': <_io.TextIOWrapper name='<stdin>' mode='r' encoding='utf-8'>,
E 'output': <_io.TextIOWrapper encoding='UTF-8'>},
E 'target': <bound method Runner.handle_stdin of <fabric.runners.Remote object at 0x7f66ee14e460>>}
E
E Traceback (most recent call last):
E
E File "/home/fabrice/.conda/envs/milatools/lib/python3.9/site-packages/invoke/util.py", line 211, in run
E super().run()
E
E File "/home/fabrice/.conda/envs/milatools/lib/python3.9/threading.py", line 917, in run
E self._target(*self._args, **self._kwargs)
E
E File "/home/fabrice/.conda/envs/milatools/lib/python3.9/site-packages/invoke/runners.py", line 869, in handle_stdin
E with character_buffered(input_):
E
E File "/home/fabrice/.conda/envs/milatools/lib/python3.9/contextlib.py", line 119, in __enter__
E return next(self.gen)
E
E File "/home/fabrice/.conda/envs/milatools/lib/python3.9/site-packages/invoke/terminals.py", line 189, in character_buffered
E or not isatty(stream)
E
E File "/home/fabrice/.conda/envs/milatools/lib/python3.9/site-packages/invoke/util.py", line 121, in isatty
E return stream.isatty()
E
E ValueError: I/O operation on closed file
../../.conda/envs/milatools/lib/python3.9/site-packages/invoke/runners.py:503: ThreadException
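For what it's worth, the traceback points at invoke's stdin-mirroring thread (handle_stdin) calling isatty() on a stdin that pytest's capture machinery has already closed. One avenue that might be worth trying, purely as an assumption on my part, is disabling stdin mirroring for commands that don't need input; invoke's run() accepts in_stream=False for exactly that:

# Hypothetical tweak to Remote._run in milatools/cli/remote.py (untested):
# with in_stream=False, invoke never spawns the stdin-mirroring thread that
# trips over pytest's closed/captured stdin.
def _run(self, cmd: str, **kwargs):
    return self.connection.run(cmd, in_stream=False, **kwargs)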
In the end, even if it's hacky, the tests run, so I'm OK with it. Just in case you're willing to check whether this could fix the issue, it seems a cleaner workaround would be:
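(The suggested snippet itself is not preserved here; judging from the reply below, it was an autouse fixture. It may have looked something like this, where the fixture name and the use of os.devnull are guesses rather than the reviewer's actual code:)

import os
import sys

import pytest


@pytest.fixture(autouse=True)
def keep_stdin_open(monkeypatch):
    # Hypothetical reconstruction: give invoke's stdin-mirroring thread a
    # stdin that stays open (and is not a TTY) for the whole test.
    with open(os.devnull) as fake_stdin:
        monkeypatch.setattr(sys, "stdin", fake_stdin)
        yield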
Thanks for the suggestion! I tried running the tests with this fixture added and autoused, but it didn't fix the error :(
Oh well, but thanks for trying :)
Adds more unit/integration tests, without changing anything about how the package works.
This is based on just the testing-related changes from #61.