Unit-test / exercise test framework #23

Open
petergjoel opened this issue Apr 1, 2022 · 2 comments

@petergjoel (Member)

Extend the framework with the ability to replay concrete execution sequences, so as to provoke errors in implementations and allow users to evaluate their own solutions.

petergjoel added the enhancement (New feature or request) label on Apr 1, 2022
@Mast3rwaf1z (Collaborator)

I'm interpreting this issue as replaying a specific execution order.

Would having a few CSV files, formatted roughly like this:

1, 0, "something"
0, 2, "something"

under something like tests/csv{index}.csv, and executing in that order, be sufficient?
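Not part of the original thread, but a minimal sketch of how such a file could be loaded into an ordered list of replay steps, assuming the columns mean source device, destination device, message (names and layout are assumptions, not the framework's actual format):

```python
import csv
from pathlib import Path

def load_execution_sequence(path: Path):
    """Read rows like `1, 0, "something"` and return an ordered list of
    (source, destination, message) replay steps. Column meaning is assumed."""
    steps = []
    with path.open(newline="") as f:
        for row in csv.reader(f, skipinitialspace=True):
            if not row:
                continue
            src, dst, message = int(row[0]), int(row[1]), row[2].strip('"')
            steps.append((src, dst, message))
    return steps

# Replay in order, e.g.:
# for src, dst, message in load_execution_sequence(Path("tests/csv0.csv")):
#     ...
```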

@Mast3rwaf1z (Collaborator) commented Jul 13, 2022

I'm having trouble implementing the above example because the async emulator and the sync emulator execute completely differently. I can't have a CSV like this:

'send', 1, 0
'send', 2, 0
'receive', 1, 0

and have it work for the sync emulator, since its first round is always send-only. So either every CSV's execution sequence ends up looking exactly like what the sync emulator would send, or I have to make one CSV for sync and one for async for each exercise.

Additionally, I had an idea for the self-evaluation which I am quite pleased with. I have made a small addition in the EmulatorStub class: when the test is being run, it loads a test module corresponding to that exercise, and the code in that module evaluates whether the solution is correct. One of the course holders would have to implement the testing module for each exercise, but I have provided an example implementation for the demo exercise.

The testing modules should be added under emulators/tests/modules/exercise<number>.py, while the CSV files should be added under emulators/tests/exercise<number>.csv.
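As an illustration only (the hook name and arguments that EmulatorStub actually uses are not shown in this thread), a per-exercise test module could be as simple as:

```python
# emulators/tests/modules/exercise0.py -- hypothetical example module;
# the function name and its argument are assumptions, not the real hook.

def evaluate(devices) -> bool:
    """Self-evaluation for a broadcast-style exercise: every device should
    have delivered the same set of messages once execution has finished."""
    delivered = [frozenset(getattr(d, "delivered", ())) for d in devices]
    ok = len(set(delivered)) <= 1
    if not ok:
        print("Self-evaluation failed: devices disagree on delivered messages")
    return ok
```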

The execution is rather random in some of the exercises, so it is proving quite tricky to make the execution follow a specific sequence. I have tried making the devices wait and check occasionally whether it is their turn to send or receive, but they always end up waiting endlessly.

I have considered making a separate sending execution sequence and a separate receiving execution sequence, but the issue arises when a device tries to send while it is not its turn to send. Is it sufficient to keep the execution sequences for the sending and the receiving devices separate in this way? The execution will follow a similar pattern but is not guaranteed to always be the same; quite the opposite, in fact.
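One possible way to avoid the endless waiting described above, sketched here with hypothetical names rather than taken from the existing emulators, is to let a replay controller grant turns through a condition variable with a timeout, so a device gives up instead of blocking forever:

```python
import threading

class TurnGate:
    """Replay helper (hypothetical): devices block until the controller
    says it is their turn, but never longer than `timeout` seconds."""

    def __init__(self):
        self._cond = threading.Condition()
        self._current = None

    def wait_for_turn(self, device_id, timeout: float = 5.0) -> bool:
        with self._cond:
            # Returns False on timeout so the caller can bail out
            # instead of waiting endlessly.
            return self._cond.wait_for(lambda: self._current == device_id,
                                       timeout=timeout)

    def grant_turn(self, device_id):
        with self._cond:
            self._current = device_id
            self._cond.notify_all()
```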

I have also made a few changes to exercise_runner, as I was getting tired of copy-pasting the command from the README, so all the options are now optional and default to the demo if neither the lecture nor the algorithm argument is provided.

For this I have also added a new --test flag, which enables the unit-test modules for the exercise being run.
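A rough sketch of what such a CLI could look like with argparse; the argument names and the demo fallback are assumptions, not the actual exercise_runner code:

```python
import argparse

parser = argparse.ArgumentParser(description="Run a distributed-systems exercise")
parser.add_argument("--lecture", type=int, default=None,
                    help="lecture number (optional)")
parser.add_argument("--algorithm", default=None,
                    help="algorithm/exercise name (optional)")
parser.add_argument("--test", action="store_true",
                    help="load the unit-test module for the selected exercise")
args = parser.parse_args()

# Fall back to the demo exercise when neither argument is given.
if args.lecture is None and args.algorithm is None:
    args.lecture, args.algorithm = 0, "demo"
```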
