# BMN
## Introduction
```
@inproceedings{lin2019bmn,
  title={Bmn: Boundary-matching network for temporal action proposal generation},
  author={Lin, Tianwei and Liu, Xiao and Li, Xin and Ding, Errui and Wen, Shilei},
  booktitle={Proceedings of the IEEE International Conference on Computer Vision},
  pages={3889--3898},
  year={2019}
}
```

## Model Zoo

### ActivityNet feature

|config |feature | gpus | pretrain | AR@100| AUC | gpu_mem(M) | iter time(s) | ckpt | log| json|
|:--|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:-:|
|[bmn_400x100_9e_2x8_activitynet_feature](/configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py) |cuhk_mean_100 |2| None |75.28|67.22|5420|3.27|[ckpt](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature_20200619-42a3b111.pth)| [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature.log)| [json](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature.log.json)|
| |mmaction_video |2| None |75.43|67.22|5420|3.27|[ckpt](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809-c9fd14d2.pth)| [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809.log) | [json](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809.json) |
| |mmaction_clip |2| None |75.35|67.38|5420|3.27|[ckpt](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809-10d803ce.pth)| [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809.log) | [json](https://openmmlab.oss-accelerate.aliyuncs.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809.json) |
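The AR@100 and AUC columns above are the standard ActivityNet proposal metrics: recall of ground-truth segments given at most 100 proposals per video, averaged over a range of temporal IoU (tIoU) thresholds, and the area under the resulting recall curve. As a minimal sketch (an illustrative helper, not MMAction's evaluation code), recall at a single tIoU threshold can be computed as:

```python
def temporal_iou(proposal, gt):
    """tIoU between two [start, end] segments (in seconds or frames)."""
    inter = max(0.0, min(proposal[1], gt[1]) - max(proposal[0], gt[0]))
    union = max(proposal[1], gt[1]) - min(proposal[0], gt[0])
    return inter / union if union > 0 else 0.0

def recall_at_n(proposals, gts, n=100, tiou_thr=0.5):
    """Fraction of ground-truth segments matched by any of the top-n proposals.

    `proposals` is assumed to be sorted by confidence score, highest first.
    """
    top = proposals[:n]
    hit = sum(
        any(temporal_iou(p, gt) >= tiou_thr for p in top)
        for gt in gts
    )
    return hit / len(gts) if gts else 0.0
```

AR@100 averages `recall_at_n(..., n=100)` over tIoU thresholds (typically 0.5 to 0.95 in steps of 0.05); AUC integrates average recall over the average number of proposals per video.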

Notes:
1. The **gpus** indicates the number of gpus we used to get the checkpoint.
According to the [Linear Scaling Rule](https://arxiv.org/abs/1706.02677), you may set the learning rate proportional to the batch size if you use a different number of GPUs or videos per GPU,
e.g., lr=0.01 for 4 GPUs x 2 videos/gpu and lr=0.08 for 16 GPUs x 4 videos/gpu.
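The scaling rule above is a simple proportion on the total batch size; a minimal sketch (the numbers match the example, the helper itself is illustrative):

```python
def scale_lr(base_lr, base_batch_size, gpus, videos_per_gpu):
    """Scale the learning rate linearly with the total batch size."""
    return base_lr * (gpus * videos_per_gpu) / base_batch_size

# Base setting: lr=0.01 at 4 GPUs * 2 videos/gpu (total batch size 8).
# Moving to 16 GPUs * 4 videos/gpu (total batch size 64) gives lr=0.08.
new_lr = scale_lr(0.01, 8, 16, 4)
```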
2. For the feature column, cuhk_mean_100 denotes the widely used CUHK ActivityNet feature extracted by [anet2016-cuhk](https://github.com/yjxiong/anet2016-cuhk); mmaction_video and mmaction_clip denote features extracted by MMAction, with a video-level or clip-level ActivityNet finetuned model respectively.
For more details on data preparation, you can refer to the ActivityNet feature section in [Data Preparation](/docs/data_preparation.md).

## Train
You can use the following command to train a model.
```shell
python tools/train.py ${CONFIG_FILE} [optional arguments]
```

Example: train BMN model on ActivityNet features dataset.
```shell
python tools/train.py configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py
```
For more details and optional arguments, you can refer to the **Training setting** part in [getting_started](/docs/getting_started.md#training-setting).

## Test
You can use the following command to test a model.
```shell
python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
```

Example: test BMN on ActivityNet feature dataset.
```shell
# Note: if evaluation is enabled, make sure the annotation file for the test data contains ground truth.
python tools/test.py configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth --eval AR@AN --out results.json
```
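The file written by `--out results.json` can then be inspected offline. Assuming the common proposal-output layout (a `results` dict mapping video ids to scored segments; verify the exact schema against your MMAction version), a quick look at the highest-scoring proposal for one video might read:

```python
import json

def top_proposal(path, video_id):
    """Return the highest-scoring [start, end] segment for one video.

    Assumes entries shaped like {"score": float, "segment": [start, end]}.
    """
    with open(path) as f:
        data = json.load(f)
    proposals = data['results'][video_id]
    best = max(proposals, key=lambda p: p['score'])
    return best['segment']
```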
For more details and optional arguments, you can refer to the **Test a dataset** part in [getting_started](/docs/getting_started.md#test-a-dataset).