
Adding support for reading .mat and .xml format annotations given by the UA-DETRAC challenge. #53

Merged: 6 commits, Oct 31, 2019

Conversation

muaz-urwa
Contributor

UA-DETRAC is an important multiple object tracking benchmark that focuses on road traffic scenarios. It is now a part of the "AI City Challenge".

The challenge requires you to produce results in the "MOT Challenge" format, but the ground truth annotations are only available in .XML and .MAT formats, whose structure is very different from the MOT format. I had to spend several hours parsing these files and converting them into MOT format so that I could use motmetrics.

I have made the following additions to support the UA-DETRAC benchmark:

  • added new formats and the relevant file loaders to io.py
  • updated the requirements file
  • created a new app specifically for evaluating py-motmetrics on the UA-DETRAC challenge

Please review and merge if you think it is useful. Let me know if you need any modifications.

Regards
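The conversion described above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual loader (which lives in io.py); the XML element and attribute names (`frame`/`num`, `target`/`id`, `box` with `left`/`top`/`width`/`height`) reflect the usual UA-DETRAC annotation layout and may differ in detail from the real files.

```python
# Hedged sketch: flatten UA-DETRAC-style XML annotations into MOT-style rows
# (frame, id, bb_left, bb_top, bb_width, bb_height, conf, x, y, z).
import xml.etree.ElementTree as ET

def detrac_xml_to_mot_rows(xml_text):
    """Parse a UA-DETRAC-style annotation XML string into MOT-style tuples."""
    root = ET.fromstring(xml_text)
    rows = []
    for frame in root.iter('frame'):
        frame_num = int(frame.attrib['num'])
        for target in frame.iter('target'):
            box = target.find('box')
            rows.append((
                frame_num,
                int(target.attrib['id']),
                float(box.attrib['left']),
                float(box.attrib['top']),
                float(box.attrib['width']),
                float(box.attrib['height']),
                1.0, -1, -1, -1,  # MOT placeholders: conf, x, y, z
            ))
    return rows

# Tiny hand-written sample mimicking the UA-DETRAC layout.
sample = """
<sequence name="MVI_20011">
  <frame num="1">
    <target_list>
      <target id="7">
        <box left="10.0" top="20.0" width="30.0" height="40.0"/>
      </target>
    </target_list>
  </frame>
</sequence>
"""
rows = detrac_xml_to_mot_rows(sample)
# rows -> [(1, 7, 10.0, 20.0, 30.0, 40.0, 1.0, -1, -1, -1)]
```

Once flattened this way, the rows can be written out as a MOT-format text file and fed to motmetrics like any other ground truth.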

@cheind
Owner

cheind commented Oct 30, 2019

This sounds useful, thank you! Would you mind changing the merge destination to the develop branch?

@muaz-urwa muaz-urwa changed the base branch from master to develop October 30, 2019 12:34
@muaz-urwa
Contributor Author

Sure, done.

@cheind
Owner

cheind commented Oct 30, 2019

Thanks! Also, is there a unit test you could add to tests/ that would prove correctness of execution for future versions of motmetrics? E.g., for MOT16 I added a test case based on a small sample input file (.xml, .mat in your case) with expected results.

@muaz-urwa
Contributor Author

Ah sure, I have added a similar unit test for the DETRAC data loaders.

@cheind cheind merged commit 3dc9ea1 into cheind:develop Oct 31, 2019
@cheind
Owner

cheind commented Oct 31, 2019

Thanks, merged. Will be released with the next version.

@fguney

fguney commented Feb 18, 2020

Are there any plans to integrate the aggregated tracking metric used in DETRAC, which averages over detection confidence thresholds, i.e. 0:0.1:1? Without it, results are not comparable to the official results on the benchmark.

@cheind
Owner

cheind commented Feb 18, 2020

linking this to #84

@goldentimecoolk

Hi, have you finished the evaluation in the DETRAC format? I'm not sure where to control the detection thresholds; I guess it should be the confidence threshold in NMS during detection post-processing. Can you help me? Any other feedback is also welcome. Thank you.
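For context, the threshold sweep being discussed can be sketched as below. This is a hedged illustration of the averaging scheme as I understand the DETRAC protocol, not motmetrics code: detections are filtered at each confidence threshold in 0, 0.1, ..., 1.0, a metric is computed on the survivors, and the scores are averaged. `metric_at` is a hypothetical stand-in for a real evaluation call.

```python
# Hedged sketch of DETRAC-style averaging over detection confidence
# thresholds (0:0.1:1). Not the official evaluation code.
def average_over_thresholds(detections, metric_at, thresholds=None):
    """Average metric_at(kept_detections) over confidence thresholds.

    detections: list of dicts with a 'conf' key.
    metric_at:  callable evaluating the kept detections (stand-in here).
    """
    if thresholds is None:
        thresholds = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0
    scores = []
    for t in thresholds:
        kept = [d for d in detections if d['conf'] >= t]
        scores.append(metric_at(kept))
    return sum(scores) / len(scores)
```

With this scheme the threshold is applied at evaluation time on the detector's confidence scores, rather than inside NMS during post-processing.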
