MLModelScope enables easy evaluation of both the performance and accuracy of models across frameworks and systems. This repo contains commands that help summarize and visualize the experiment results.
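The repository's actual command names and options are not documented in this page, so the sketch below only illustrates the kind of summarization such a command might perform: given per-run latencies for one model/framework/system combination, it reports min, mean, p95, and max. The function name `summarize`, the data layout, and the sample numbers are all hypothetical, not part of this repo's code.

```go
// Illustrative sketch only (not this repo's API): summarize a set of
// per-run model latencies the way an evaluation summary command might.
package main

import (
	"fmt"
	"sort"
	"time"
)

// summarize computes min, mean, p95, and max over a non-empty set of latencies.
func summarize(latencies []time.Duration) (minLat, meanLat, p95Lat, maxLat time.Duration) {
	// Work on a sorted copy so the caller's slice is left untouched.
	sorted := append([]time.Duration(nil), latencies...)
	sort.Slice(sorted, func(i, j int) bool { return sorted[i] < sorted[j] })

	var total time.Duration
	for _, d := range sorted {
		total += d
	}

	minLat = sorted[0]
	maxLat = sorted[len(sorted)-1]
	meanLat = total / time.Duration(len(sorted))
	p95Lat = sorted[(len(sorted)*95)/100]
	return
}

func main() {
	// Hypothetical measurements for one model/framework/system combination.
	runs := []time.Duration{
		12 * time.Millisecond, 11 * time.Millisecond, 13 * time.Millisecond,
		12 * time.Millisecond, 25 * time.Millisecond, 12 * time.Millisecond,
	}

	minLat, meanLat, p95Lat, maxLat := summarize(runs)
	fmt.Printf("runs=%d min=%v mean=%v p95=%v max=%v\n", len(runs), minLat, meanLat, p95Lat, maxLat)
}
```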
About
c3sr/evaluation (forked from rai-project/evaluation): evaluation tools for model performance / accuracy for MLModelScope.
Languages
- Mathematica 98.2%
- Go 1.7%
- Shell 0.1%