mmpose/configs/body_3d_keypoint/motionbert/h36m/motionbert_h36m.md


MotionBERT (2022)

```bibtex
@misc{Zhu_Ma_Liu_Liu_Wu_Wang_2022,
  title={Learning Human Motion Representations: A Unified Perspective},
  author={Zhu, Wentao and Ma, Xiaoxuan and Liu, Zhaoyang and Liu, Libin and Wu, Wayne and Wang, Yizhou},
  year={2022},
  month={Oct},
  language={en-US}
}
```
Human3.6M (TPAMI'2014)

```bibtex
@article{h36m_pami,
  author = {Ionescu, Catalin and Papava, Dragos and Olaru, Vlad and Sminchisescu, Cristian},
  title = {Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  publisher = {IEEE Computer Society},
  volume = {36},
  number = {7},
  pages = {1325-1339},
  month = {jul},
  year = {2014}
}
```

Results on Human3.6M dataset with ground truth 2D detections

| Arch                  | MPJPE (mm) | Average MPJPE (mm) | P-MPJPE (mm) | ckpt |
| :-------------------- | :--------: | :----------------: | :----------: | :--: |
| MotionBERT*           |    34.5    |        34.6        |     27.1     | ckpt |
| MotionBERT-finetuned* |    26.9    |        26.8        |     21.0     | ckpt |

Results on Human3.6M dataset converted from the official repo<sup>1</sup> with ground truth 2D detections

| Arch                  | MPJPE (mm) | Average MPJPE (mm) | P-MPJPE (mm) | ckpt | log |
| :-------------------- | :--------: | :----------------: | :----------: | :--: | :-: |
| MotionBERT*           |    39.8    |        39.2        |     33.4     | ckpt |  /  |
| MotionBERT-finetuned* |    37.7    |        37.2        |     32.2     | ckpt |  /  |

<sup>1</sup> By default, we test models with the Human3.6M dataset processed by MMPose. The official repo's dataset includes more data and applies a different pre-processing technique. To achieve the same results as the official repo, please download the test annotation file, train annotation file, and factors, place them under `$MMPOSE/data/h36m/annotation_body3d/fps50`, and test with the configs we provide.
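As a rough illustration of that setup, the sketch below checks that the official-repo annotations are in place and then evaluates a converted model through MMPose's standard `tools/test.py` entry point. The annotation file names, config path, and checkpoint path are placeholders assumed for illustration, not the exact files referenced above.

```python
# Minimal sketch (run from the MMPose repository root): verify that the
# official-repo annotations are in place, then evaluate a converted model
# with MMPose's standard test script. File, config, and checkpoint names
# below are illustrative placeholders, not the exact files referenced above.
import subprocess
from pathlib import Path

ann_dir = Path("data/h36m/annotation_body3d/fps50")
required = ["h36m_test.npz", "h36m_train.npz", "h36m_factors.npy"]  # assumed names

missing = [f for f in required if not (ann_dir / f).exists()]
if missing:
    raise FileNotFoundError(f"Place the official annotations in {ann_dir}; missing: {missing}")

config = "configs/body_3d_keypoint/motionbert/h36m/motionbert_h36m.py"  # placeholder
checkpoint = "checkpoints/motionbert_h36m.pth"                          # placeholder

# Standard MMPose evaluation command: python tools/test.py <config> <checkpoint>
subprocess.run(["python", "tools/test.py", config, checkpoint], check=True)
```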

Models with * are converted from the official repo. The config files of these models are provided for validation only. We do not guarantee the training accuracy of these configs, and you are welcome to contribute your reproduction results.