Commit 02c0626

fix name
JoannaLXY authored and innerlee committed Jul 11, 2020
1 parent 0fb191e commit 02c0626
Showing 11 changed files with 38 additions and 38 deletions.
6 changes: 3 additions & 3 deletions .github/CONTRIBUTING.md
@@ -1,4 +1,4 @@
-# Contributing to mmaction
+# Contributing to mmaction2

All kinds of contributions are welcome, including but not limited to the following.

@@ -7,14 +7,14 @@

## Workflow

-1. fork and pull the latest mmaction
+1. fork and pull the latest mmaction2
2. checkout a new branch (do not use master branch for PRs)
3. commit your changes
4. create a PR
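
The four workflow steps above map to a handful of git commands. A minimal sketch from the editor (the fork URL and branch name are placeholders):

```shell
# Fork on GitHub first, then clone your fork (URL is a placeholder)
git clone https://github.com/<your-username>/mmaction2.git
cd mmaction2
git remote add upstream https://github.com/open-mmlab/mmaction2.git

# Pull the latest upstream changes before starting new work
git fetch upstream
git rebase upstream/master

# Commit on a new branch (not master), then push and open a PR
git checkout -b my-feature
git commit -am "add my feature"
git push origin my-feature
```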

Note
- If you plan to add some new features that involve large changes, it is encouraged to open an issue for discussion first.
-- If you are the author of some papers and would like to include your method to mmaction,
+- If you are the author of some papers and would like to include your method to mmaction2,
  please contact Kai Chen ([email protected]). We will greatly appreciate your contribution.

## Code style
16 changes: 8 additions & 8 deletions README.md
@@ -5,17 +5,17 @@
## Introduction

<div align="left">
-<a href='https://mmaction.readthedocs.io/en/latest/?badge=latest'>
-<img src='https://readthedocs.org/projects/mmaction/badge/?version=latest' alt='Documentation Status' />
+<a href='https://mmaction2.readthedocs.io/en/latest/?badge=latest'>
+<img src='https://readthedocs.org/projects/mmaction2/badge/?version=latest' alt='Documentation Status' />
</a>
-<a href="https://github.com/open-mmlab/mmaction/blob/master/LICENSE">
-<img src="https://img.shields.io/github/license/open-mmlab/mmaction.svg">
+<a href="https://github.com/open-mmlab/mmaction2/blob/master/LICENSE">
+<img src="https://img.shields.io/github/license/open-mmlab/mmaction2.svg">
</a>
</div>

The master branch works with **PyTorch 1.3+**.

-MMAction is an open-source toolbox for action understanding based on PyTorch.
+MMAction2 is an open-source toolbox for action understanding based on PyTorch.
It is a part of the [OpenMMLab project](https://github.com/open-mmlab) developed by [Multimedia Laboratory, CUHK](http://mmlab.ie.cuhk.edu.hk/).

<div align="center">
@@ -35,7 +35,7 @@

- **Support for multiple action understanding frameworks**

-MMAction implements popular frameworks for action understanding:
+MMAction2 implements popular frameworks for action understanding:

- For action recognition, various algorithms are implemented, including TSN, TSM, R(2+1)D, I3D, SlowOnly, SlowFast.

@@ -79,10 +79,10 @@ There are also tutorials for [finetuning models](docs/tutorials/finetune.md), …

## Contributing

-We appreciate all contributions to improve MMAction. Please refer to [CONTRIBUTING.md](.github/CONTRIBUTING.md) for the contributing guideline.
+We appreciate all contributions to improve MMAction2. Please refer to [CONTRIBUTING.md](.github/CONTRIBUTING.md) for the contributing guideline.

## Acknowledgement

-MMAction is an open source project that is contributed by researchers and engineers from various colleges and companies.
+MMAction2 is an open source project that is contributed by researchers and engineers from various colleges and companies.
We appreciate all the contributors who implement their methods or add new features, as well as users who give valuable feedback.
We wish that the toolbox and benchmark could serve the growing research community by providing a flexible toolkit to reimplement existing methods and develop their own new models.
2 changes: 1 addition & 1 deletion docker/Dockerfile
@@ -14,7 +14,7 @@ RUN apt-get update && apt-get install -y git ninja-build libglib2.0-0 libsm6 lib…

# Install mmaction
RUN conda clean --all
-RUN git clone https://github.com/open-mmlab/mmaction.git /mmaction
+RUN git clone https://github.com/open-mmlab/mmaction2.git /mmaction
WORKDIR /mmaction
ENV FORCE_CUDA="1"
RUN pip install cython --no-cache-dir
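
To build and run an image from this Dockerfile, something like the following works; a minimal sketch, where the image tag and host data path are placeholders (install.md below shows the project's own `docker run` line):

```shell
# Build the image from the repository root (tag is a placeholder)
docker build -t mmaction2 -f docker/Dockerfile .

# Run with GPU access and a host data directory mounted into the container
docker run --gpus all --shm-size=8g -it -v /path/to/data:/mmaction/data mmaction2
```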
6 changes: 3 additions & 3 deletions docs/benchmark.md
@@ -4,7 +4,7 @@ We compare our results with some popular frameworks and official releases in ter…

## Comparison Rules

-Here we compare our MMAction repo with other video understanding toolboxes in the same data and model settings
+Here we compare our MMAction2 repo with other video understanding toolboxes in the same data and model settings
by the training time per iteration. Here, we use
- commit id [7f3490d](https://github.com/open-mmlab/mmaction/tree/7f3490d3db6a67fe7b87bfef238b757403b670e3)(1/5/2020) of MMAction V0.1
- commit id [8d53d6f](https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd)(5/5/2020) of Temporal-Shift-Module
@@ -21,7 +21,7 @@ The training speed is measured in s/iter. The lower, the better.

## Recognizers

-| Model | MMAction (s/iter) | MMAction V0.1 (s/iter) | Temporal-Shift-Module (s/iter) | PySlowFast (s/iter) |
+| Model | MMAction2 (s/iter) | MMAction V0.1 (s/iter) | Temporal-Shift-Module (s/iter) | PySlowFast (s/iter) |
| :--- | :---------------: | :--------------------: | :----------------------------: | :-----------------: |
| TSN ([tsn_r50_1x1x3_100e_kinetics400_rgb](/configs/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb.py)) | **0.29** | 0.36 | 0.45 | x |
| I3D ([i3d_r50_32x2x1_100e_kinetics400_rgb](/configs/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb.py)) | **0.45** | 0.58 | x | x |
@@ -35,7 +35,7 @@

## Localizers

-| Model | MMAction (s/iter) | BSN(boundary sensitive network) (s/iter) |BMN(boundary matching network) (s/iter)|
+| Model | MMAction2 (s/iter) | BSN(boundary sensitive network) (s/iter) |BMN(boundary matching network) (s/iter)|
| :--- | :---------------: | :-------------------------------------: | :-------------------------------------: |
| BSN ([TEM + PEM + PGM](/configs/localization/bsn)) | **0.074(TEM)+0.040(PEM)** | 0.101(TEM)+0.040(PEM) | x |
| BMN ([bmn_400x100_2x8_9e_activitynet_feature](/configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py)) | **3.27** | x | 3.30 |
2 changes: 1 addition & 1 deletion docs/data_preparation.md
@@ -2,7 +2,7 @@

## Notes on Video Data Format

-MMAction supports two types of data format: raw frames and video. The former is widely used in previous projects such as [TSN](https://github.com/yjxiong/temporal-segment-networks).
+MMAction2 supports two types of data format: raw frames and video. The former is widely used in previous projects such as [TSN](https://github.com/yjxiong/temporal-segment-networks).
This is fast when SSD is available but fails to scale to the fast-growing datasets.
(For example, the newest edition of [Kinetics](https://deepmind.com/research/open-source/open-source-datasets/kinetics/) has 650K videos and the total frames will take up several TBs.)
The latter saves much space but has to do the computation-intensive video decoding at execution time
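
For a concrete picture of the raw-frame format, frames can be extracted with plain ffmpeg; a minimal sketch with placeholder paths (MMAction2's bundled data-preparation scripts may use different settings):

```shell
# Extract every frame of one video into a per-video directory of JPEGs
mkdir -p rawframes/video_001
ffmpeg -i videos/video_001.mp4 -q:v 2 rawframes/video_001/img_%05d.jpg
```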
2 changes: 1 addition & 1 deletion docs/getting_started.md
@@ -1,6 +1,6 @@
# Getting Started

-This page provides basic tutorials about the usage of MMAction.
+This page provides basic tutorials about the usage of MMAction2.
For installation instructions, please see [install.md](install.md).

## Datasets
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -1,4 +1,4 @@
-Welcome to MMAction's documentation!
+Welcome to MMAction2's documentation!
====================================

.. toctree::
22 changes: 11 additions & 11 deletions docs/install.md
@@ -23,7 +23,7 @@ CFLAGS="${CFLAGS} -mavx2" pip install --upgrade --no-cache-dir --force-reinstall …
conda install -y jpeg libtiff
```

-### Install mmaction
+### Install mmaction2

a. Create a conda virtual environment and activate it.

@@ -57,21 +57,21 @@ conda install pytorch=1.3.1 cudatoolkit=9.2 torchvision=0.4.2 -c pytorch

If you build PyTorch from source instead of installing the prebuilt package, you can use more CUDA versions such as 9.0.

-c. Clone the mmaction repository
+c. Clone the mmaction2 repository

```shell
-git clone https://github.com/open-mmlab/mmaction.git
-cd mmaction
+git clone https://github.com/open-mmlab/mmaction2.git
+cd mmaction2
```

-d. Install build requirements and then install mmaction
+d. Install build requirements and then install mmaction2

```shell
pip install -r requirements/build.txt
pip install -v -e . # or "python setup.py develop"
```

-If you build mmaction on macOS, replace the last command with
+If you build mmaction2 on macOS, replace the last command with

```
CC=clang CXX=clang++ CFLAGS='-stdlib=libc++' pip install -e .
```

@@ -83,7 +83,7 @@
Note:
1. The git commit id will be written to the version number with step d, e.g. 0.6.0+2e7045c. The version will also be saved in trained models.
It is recommended that you run step d each time you pull some updates from GitHub. If C++/CUDA code is modified, then this step is compulsory.

-2. Following the above instructions, mmaction is installed in `dev` mode; any local modifications made to the code will take effect without the need to reinstall it (unless you submit some commits and want to update the version number).
+2. Following the above instructions, mmaction2 is installed in `dev` mode; any local modifications made to the code will take effect without the need to reinstall it (unless you submit some commits and want to update the version number).

3. If you would like to use `opencv-python-headless` instead of `opencv-python`,
you can install it before installing MMCV.
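
A quick sanity check after step d confirms the editable install and the commit-suffixed version number from note 1; a minimal sketch, assuming the Python package is still imported as `mmaction`:

```shell
# Should print a commit-suffixed version such as 0.6.0+02c0626
python -c "import mmaction; print(mmaction.__version__)"
```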
@@ -117,7 +117,7 @@ docker run --gpus all --shm-size=8g -it -v {DATA_DIR}:/mmaction/data mmaction

### A from-scratch setup script

-Here is a full script for setting up mmaction with conda and linking the dataset path (supposing that your Kinetics-400 dataset path is $KINETICS400_ROOT).
+Here is a full script for setting up mmaction2 with conda and linking the dataset path (supposing that your Kinetics-400 dataset path is $KINETICS400_ROOT).

```shell
conda create -n open-mmlab python=3.7 -y
@@ -133,11 +133,11 @@
mkdir data
ln -s $KINETICS400_ROOT data
```

-### Using multiple MMAction versions
+### Using multiple MMAction2 versions

-The train and test scripts already modify the `PYTHONPATH` to ensure that the script uses the MMAction in the current directory.
+The train and test scripts already modify the `PYTHONPATH` to ensure that the script uses the MMAction2 in the current directory.

-To use the default MMAction installed in the environment rather than the one you are working with, you can remove the following line in those scripts.
+To use the default MMAction2 installed in the environment rather than the one you are working with, you can remove the following line in those scripts.

```shell
PYTHONPATH="$(dirname $0)/..":$PYTHONPATH
```
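
For illustration, with that line in place each checkout imports its own copy, and you can verify which copy a shell would pick up; a minimal sketch with placeholder paths, assuming `tools/dist_train.sh` follows the usual `<config> <num-gpus>` convention:

```shell
# The training scripts prepend their own repository root, so this checkout's
# code is imported regardless of what is installed in the environment
cd /path/to/mmaction2-experiment
./tools/dist_train.sh configs/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb.py 8

# Check which copy of mmaction a given shell would import
PYTHONPATH=/path/to/mmaction2-experiment:$PYTHONPATH \
    python -c "import mmaction; print(mmaction.__file__)"
```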
14 changes: 7 additions & 7 deletions docs/merge_docs.sh
@@ -6,7 +6,7 @@ cat ../configs/recognition/*/*.md > recognition_models.md
cat ./tutorials/finetune.md ./tutorials/new_dataset.md ./tutorials/data_pipeline.md ./tutorials/new_modules.md > tutorials.md

sed -i 's/](\/docs\//](/g' ../tools/data/*/*.md
-sed -i 's=](/=](https://github.com/open-mmlab/mmaction/tree/master/=g' ../tools/data/*/*.md
+sed -i 's=](/=](https://github.com/open-mmlab/mmaction2/tree/master/=g' ../tools/data/*/*.md
cat ../tools/data/*/*.md > prepare_data.md

sed -i 's/.md](\/tools\/data\/activitynet\/preparing_activitynet.md/](#activitynet/g' data_preparation.md
@@ -41,12 +41,12 @@ sed -i 's/](new_dataset.md)/](#tutorial-2-adding-new-dataset)/g' tutorials.md

sed -i 's/](\/docs\//](/g' recognition_models.md # remove /docs/ for link used in doc site
sed -i 's/](\/docs\//](/g' localization_models.md
-sed -i 's=](/=](https://github.com/open-mmlab/mmaction/tree/master/=g' recognition_models.md
-sed -i 's=](/=](https://github.com/open-mmlab/mmaction/tree/master/=g' localization_models.md
-sed -i 's=](/=](https://github.com/open-mmlab/mmaction/tree/master/=g' benchmark.md
-sed -i 's=](/=](https://github.com/open-mmlab/mmaction/tree/master/=g' getting_started.md
-sed -i 's=](/=](https://github.com/open-mmlab/mmaction/tree/master/=g' install.md
-sed -i 's=](/=](https://github.com/open-mmlab/mmaction/tree/master/=g' tutorials.md
+sed -i 's=](/=](https://github.com/open-mmlab/mmaction2/tree/master/=g' recognition_models.md
+sed -i 's=](/=](https://github.com/open-mmlab/mmaction2/tree/master/=g' localization_models.md
+sed -i 's=](/=](https://github.com/open-mmlab/mmaction2/tree/master/=g' benchmark.md
+sed -i 's=](/=](https://github.com/open-mmlab/mmaction2/tree/master/=g' getting_started.md
+sed -i 's=](/=](https://github.com/open-mmlab/mmaction2/tree/master/=g' install.md
+sed -i 's=](/=](https://github.com/open-mmlab/mmaction2/tree/master/=g' tutorials.md
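
A note on the `s=…=…=g` form used here: `sed` accepts any character after `s` as the substitution delimiter, so using `=` avoids escaping the many slashes in the URLs. A minimal standalone sketch:

```shell
# With '=' as the delimiter, the slashes in the URL need no escaping
echo '[config](/configs/tsn.py)' | \
    sed 's=](/=](https://github.com/open-mmlab/mmaction2/tree/master/=g'
# prints: [config](https://github.com/open-mmlab/mmaction2/tree/master/configs/tsn.py)
```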


cat localization_models.md recognition_models.md > modelzoo.md
2 changes: 1 addition & 1 deletion docs/tutorials/finetune.md
@@ -44,7 +44,7 @@ What we need is `load_from`, which will be discussed later.

## Modify Dataset

-MMAction supports UCF101, Kinetics-400, Moments in Time, Multi-Moments in Time, THUMOS14,
+MMAction2 supports UCF101, Kinetics-400, Moments in Time, Multi-Moments in Time, THUMOS14,
Something-Something V1&V2, ActivityNet Dataset.
The users may need to adapt one of the above datasets to fit their special datasets.
In our case, UCF101 is already supported by various dataset types, like `RawframeDataset`,
2 changes: 1 addition & 1 deletion docs/tutorials/new_dataset.md
@@ -197,7 +197,7 @@ dataset_A_train = dict(

## Customize Dataset by Mixing Dataset

-MMAction also supports mixing datasets for training. Currently it supports repeating a dataset.
+MMAction2 also supports mixing datasets for training. Currently it supports repeating a dataset.

### Repeat dataset

