BML-MoVi


MoVi: A Large Multipurpose Motion and Video Dataset

MoVi is the first human motion dataset to contain synchronized pose, body mesh, and video recordings. The MoVi database can be used for human pose estimation and tracking, human motion prediction and synthesis, action recognition, and gait analysis.

Publication

Here is a link to the preprint manuscript.

Referencing the MoVi Dataset

@misc{ghorbani2020movi,
    title={MoVi: A Large Multipurpose Motion and Video Dataset},
    author={Saeed Ghorbani and Kimia Mahdaviani and Anne Thaler and Konrad Kording and Douglas James Cook and Gunnar Blohm and Nikolaus F. Troje},
    year={2020},
    eprint={2003.01888},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}

Authors

Saeed Ghorbani (York University, Canada), Kimia Mahdaviani (Queen’s University, Canada), Anne Thaler (York University, Canada), Konrad Kording (University of Pennsylvania, USA), Douglas James Cook (Queen’s University, Canada), Gunnar Blohm (Queen’s University, Canada) and Nikolaus F. Troje (York University, Canada)

Abstract

Human movements are both an area of intense study and the basis of many applications such as character animation. For many applications it is crucial to identify movements from videos or analyze datasets of movements. Here we introduce a new human Motion and Video dataset, MoVi, which we make available publicly. It contains 60 female and 30 male actors performing a collection of 20 predefined everyday actions and sports movements, and one self-chosen movement. In five capture rounds, the same actors and movements were recorded using different hardware systems, including an optical motion capture system, video cameras, and inertial measurement units (IMU). For some of the capture rounds the actors were recorded wearing natural clothing; for the other rounds they wore minimal clothing. In total, our dataset contains 9 hours of motion capture data, 17 hours of video data from 4 different points of view (including one hand-held camera), and 6.6 hours of IMU data. In this paper, we describe how the dataset was collected and post-processed, and we present state-of-the-art estimates of skeletal motions and full-body shape deformations associated with skeletal motion. We discuss examples of potential studies that this dataset could enable.

Download

We have various formats of data in the BML-MoVi Database. A document detailing access to the data can be found here.

The video and MoCap data are hosted on the York University Scholar’s Portal, found at this link. The same link has been embedded below the license for easier access.

The code containing the model and tutorials is hosted on GitHub and can be found here.
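The GitHub tutorials cover working with the data in detail. As a minimal sketch of what loading a MoCap sequence might look like, the example below assumes the AMASS-style `.npz` convention (per-frame SMPL pose parameters under a `poses` key); the file path shown is hypothetical and the actual file names depend on the release you download.

```python
import numpy as np

def summarize_sequence(seq):
    """Return (n_frames, n_pose_params) for an AMASS-style sequence mapping.

    AMASS .npz archives store per-frame pose parameters under 'poses'
    as a (frames x pose_dim) array; this helper just reports its shape.
    """
    poses = np.asarray(seq["poses"])
    return poses.shape

# Hypothetical path -- actual file names depend on the MoVi/AMASS release:
# seq = np.load("MoVi/Subject_1/Subject_1_poses.npz")
# n_frames, n_params = summarize_sequence(seq)
```

Per-frame quantities such as duration can then be derived from the frame count and the capture frame rate reported in the dataset documentation.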

The license for accessing and using the data from the MoVi Dataset can be found below and downloaded here.

Data License

For non-commercial and scientific research purposes

Please read carefully the following terms and conditions and any accompanying documentation before you download and/or use the BMLmovi Data (the “Data”). By downloading and/or using the Data, you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Data. Any infringement of the terms of this agreement will automatically terminate your rights under this License.

Ownership

The Data was collected between July and December 2018 at the BioMotion Lab. Funding was provided by NSERC and CFREF VISTA. The Data is owned by and is proprietary material of Prof. Dr. Nikolaus Troje, Director of the BioMotion Lab, which is currently located at York University, Toronto, Canada.

License Grant

Nikolaus Troje grants you (Licensee) a non-exclusive, non-transferable, free of charge right:

  • To obtain and install the Data on computers owned, leased or otherwise controlled by you and/or your organization;
  • To use the Data for the sole purpose of performing non-commercial scientific research, non-commercial education, or non-commercial artistic projects;
  • To modify, adapt, translate or create derivative works based upon the Data.

Any other use, in particular any use for commercial purposes, is prohibited. This includes, without limitation, incorporation in a commercial product, use in a commercial service, or production of other artefacts for commercial purposes including, for example, 3D models, pictures, movies, or video games. The Data may not be reproduced, modified and/or made available in any form to any third party without prior written permission from Nikolaus Troje.

The Data may not be used for pornographic purposes or to generate pornographic material whether commercial or not. This license also prohibits the use of the Data to train methods/algorithms/neural networks/etc. for commercial use of any kind. By downloading the Data, you agree not to reverse engineer it.

No Distribution

The Data and the License herein granted shall not be copied, shared, distributed, re-sold, offered for re-sale, transferred or sub-licensed in whole or in part, except that you may make one copy for archive purposes only.

Human Subjects Data

The Data was captured using a population of students and staff from Queen’s University, Kingston, Canada. All participants gave their informed, written consent for the scientific analysis and publication of their video Data, IMU Data, and 3D motion capture Data.

Disclaimer of Representations and Warranties

You expressly acknowledge and agree that the Data results from basic research, is provided “AS IS”, may contain errors, and that any use of the Data is at your sole risk. Nikolaus Troje and the BioMotion Lab make no representations or warranties of any kind concerning the Data, neither expressed nor implied, and the absence of any legal or actual defects, whether discoverable or not. Specifically, and not to limit the foregoing, Nikolaus Troje and the BioMotion Lab make no representations or warranties (i) regarding the merchantability or fitness for a particular purpose of the Data, (ii) that the use of the Data will not infringe any patents, copyrights or other intellectual property rights of a third party, and (iii) that the use of the Data will not cause any damage of any kind to you or a third party.

Limitation of Liability

Under no circumstances shall Nikolaus Troje or the BioMotion Lab be liable for any incidental, special, indirect or consequential damages arising out of or relating to this license, including but not limited to, any lost profits, business interruption, loss of programs or other Data, or all other commercial damages or losses, even if advised of the possibility thereof.

No Maintenance Services

You understand and agree that Nikolaus Troje and the BioMotion Lab are under no obligation to provide either maintenance services, update services, notices of latent defects, or corrections of defects with regard to the Data. The BioMotion Lab nevertheless reserves the right to update, modify, or discontinue the Data at any time.

Defects of the Data must be reported in writing to the BioMotion Lab with a comprehensible description of the error symptoms (Email: [email protected]). The notification of the defect should enable the reproduction of the error. The Licensee is encouraged to communicate any use, results, modifications, or publications.

Publications using BMLmovi Data

You acknowledge that the Data is a valuable scientific resource and agree to appropriately cite the most recent paper describing the BMLmovi database in any publication making use of the Data. (Note: Citing the dataset URL instead of the publication(s) would not be compliant with this license agreement.) Note that part of the BMLmovi Data is included in the Archive of Motion Capture as Surface Shapes (AMASS; https://amass.is.tue.mpg.de/) and was processed using methods introduced by AMASS. Using this part of the dataset falls under the AMASS license agreement and requires you to appropriately cite the AMASS publication by N. Mahmood, N. Ghorbani, N. F. Troje, G. Pons-Moll, and M. J. Black (2019).

Acknowledgements

We wish to thank Nima Ghorbani for post-processing our motion capture data so it could be added to the AMASS dataset, and all other authors of AMASS for their approval to add the processed data to our dataset.

We further wish to thank Viswaijt Kumar for his help with post-processing the data, setting up the data repository and designing and managing the website.

This research was funded by an NSERC Discovery Grant and contributions from CFREF VISTA to NFT.

Contact

Please send an email to [email protected] with any questions or concerns.