extra_gated_heading: Access FreeMan on Hugging Face
extra_gated_description: >-
This is a form to enable access to FreeMan on Hugging Face once your request
has been approved.
Please visit the [FreeMan Project Page](https://wangjiongw.github.io/freeman/)
and **complete the [prerequisite
procedures](https://wangjiongw.github.io/freeman/download.html) before
confirming your request here**.
Requests will be processed within 1-2 days.
extra_gated_prompt: >-
**The Hugging Face account email address or Hugging Face ID you provide on the
information backup form MUST match the one filled in during the previous step,
or your request will not be approved.** **For those in mainland China, you can
also apply for and download the dataset from
[OpenDataLab](https://openxlab.org.cn/datasets/wangjiongwow/FreeMan/tree/main)
for a more stable connection.**
**The FreeMan Usage Agreement can be found on [our
website](https://wangjiongw.github.io/freeman/download.html).**
extra_gated_fields:
Name: text
Institution: text
Email: text
I agree not to further copy, publish or distribute any portion of this dataset to any third party for any purpose: checkbox
I have already reviewed the Usage Agreement on the FreeMan website: checkbox
I understand FreeMan is for non-commercial research purposes ONLY: checkbox
extra_gated_button_content: Submit
license: cc-by-nc-nd-4.0
language:
- en
tags:
- pose estimation
- computer vision
- 3d human
FreeMan: Towards 3D Human Pose Estimation In the Wild
🌏 Project Page • 🙋♂️ Download Procedure • 📄 Paper • ▶️ YouTube • 🖥️ Code
This is the official release of the FreeMan dataset. To access the dataset, please complete the previous steps at HERE before submitting your request.
[❗️❗️❗️] MAKE SURE you finish the required steps HERE before applying for dataset access here. Otherwise, your access request will NOT be approved.
For those in mainland China, you can also apply for and download the dataset from OpenDataLab for a more stable connection.
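Once your request has been approved, the archives can also be fetched programmatically. Below is a minimal sketch using `huggingface_hub`; the repository ID shown is an assumption (use the ID of this dataset repository as it appears on Hugging Face), and the `allow_patterns` filter is only an example.

```python
# Minimal download sketch, assuming access has been granted and you are
# authenticated (e.g. via `huggingface-cli login`).
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="wangjiongw/FreeMan",            # assumed repo ID; use the actual ID of this dataset
    repo_type="dataset",
    allow_patterns=["subj01.zip", "*.txt"],  # example: one subject archive plus the session lists
)
print("Files downloaded to:", local_dir)
```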
Notice
[2023.09] 30 FPS data for 3D human pose estimation have been uploaded. Other data are still being processed.
[2023.09] Paper released on arXiv. We are uploading the datasets; please stay tuned.
Overview
Currently uploaded data:
- 8 views
- 40 subjects
- 11M frames
- 10 types of scenarios
- 27 locations
File Info
The whole dataset is about 800 GB, and all RGB data are stored as videos. RGB data are zipped by subject, while all annotation files are zipped together. Besides the zip files, plain-text files list the sessions in each subset (see the sketch after the list below):
- session_list.txt: all session names included in FreeMan;
- session_list_mono.txt: sample names and corresponding view indices used in the monocular experiments;
- ignore_list.txt: sessions temporarily excluded from the experiments in the FreeMan paper (https://arxiv.org/abs/2309.05073);
- train/valid/test.txt: session names in the train/validation/test subsets, respectively; splits are separated by subject.
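As a quick illustration of how these lists can be combined, the sketch below builds the training session set from train.txt while dropping anything in ignore_list.txt. The assumption that each file holds one session name per line is ours, not stated above.

```python
# Sketch: assemble the training session list from the plain-text subset files.
# Assumes one session name per line; adjust if the actual layout differs.
from pathlib import Path

def read_sessions(path):
    """Return the non-empty, stripped lines of a session list file."""
    return [line.strip() for line in Path(path).read_text().splitlines() if line.strip()]

ignored = set(read_sessions("ignore_list.txt"))
train_sessions = [s for s in read_sessions("train.txt") if s not in ignored]
print(f"{len(train_sessions)} training sessions after filtering")
```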
To verify the downloaded files, the sha256 checksums of the video archives are listed below (a verification sketch follows the table):
File Name | sha256sum | File Name | sha256sum | File Name | sha256sum | File Name | sha256sum |
---|---|---|---|---|---|---|---|
subj01.zip | a02d8e47d36235ae760f553b85b7df21cb711fecea73e7542df49e5c2d1441e6 | subj02.zip | 10d9b9af46ade9069401832ad0f1b06edd12a9e39f2ea9d06a55f1371bb9a28f | subj02.01 | 94ac1a0af242c33bb0d8777bea094fe0542ace0077f8836ff3944b0e41d3e45b | subj02.02 | 67174ca443c7c35fd1604d69ae30a46fece96af83d83d7428db1ce6483782739 |
subj02.03 | 05534d4e352454a5ffe0df802da53057953df274879d3067e023f2667773e4f6 | subj03.zip | 6825c59ecaec5114ead9f8d5f88d3d8ec2949cb9db102fb6beee6c338d2ae174 | subj04.zip | 48dc77e2c6fe741b8b271ee302ba5883eabec5ed960dac66181b58e38a36282d | subj05.zip | ab8033cc55e5e4f578490a225e60e986e67f162b79da44e68afcab805f396f2a |
subj06.zip | a204970bc9ebd142a97276ed6bb095cb3a9e621090a8fad57fb8bbe99b23d733 | subj07.zip | 292cf18231592c730a811488e9896756ee137b737c1e207428efda9f73a75d2d | subj08.zip | fad741aa4400c813a8a71303ad855c7d0e130613db3c72222b14f010ee8c74d4 | subj09.zip | 46bcf599207c7fcaac84102e40a60c29720e916431f13b932273d33199bab922 |
subj10.zip | 320f9c0cb169261bbcdc60163525d5b0b35fb17110dc488ad71491047b35d582 | subj11.zip | 72e3f99b5bfb918c269bfc53515c1d556e3ee8eed8d5aa82d619971eb74e76af | subj12.zip | 5e7c8dea74b59f14f65b08a51c1688f920ac24f943a1a6102fdb519b5df4702a | subj13.zip | 4a76b93a26fad023b315f46ff107ed619adb920cb2fb5d03e8de7f33bc68d0f6 |
subj14.zip | d78998c715bd54a65c9da7b247212c1a1833405c125bb5492b5d5fca9ec0e17f | subj15.zip | 070dfdd2d420c4392f487ae9558406cee970fe43f802f26ca28a4871f8b46d0b | subj16.zip | 9933c2e3f8b114b96e4b5f9938d22ca77686b6952489406ee07705ac4601caa2 | subj17.zip | 2e80f886faf3a5b3679da8f8f7bde0034008415128c404ea9445368f1d0cbe17 |
subj18.zip | 0f060e6c2b1699e9780f60b826946d69a77ae6c668cf6d02d3063b172ba3e3e0 | subj19.zip | 12bb121a1f4bd57a741ab651375af9bed105a9226390d225d8b08512d910f5d6 | subj20.zip | c07f8e261b4df3913399589405cbefd6a70807c6400373c92a4635390d4ee644 | subj21.zip | 26bc74962012698402836d2c327872c240b638a2bb54ae3b85af12b6a93269ee |
subj22.zip | 0b69237ff95205ff51c95bdb34cdaabf0304ecf856d4ce1f07d9f76c16e42cd1 | subj22.01 | f66ac6370835164ee66e7db1cf9c60609218f3c6d19dfe2b07f39bffa12635ae | subj23.zip | 2ad8a29c949e0208c7338efd955a3f6b9322c61a691ceac3be89d356fb02eb1c | subj24.zip | 1902fd34d9d9717c5be95af36162095aa2db35d6327e6d7d828b3c7ffe97587c |
subj25.zip | 01209473ca4a50e72e20849150dd489b62b6153fd2be798b699ceb106af40d1d | subj26.zip | 6cf1b76860a68bf8452e7e7719b2bb18bcb727691fb14e37da3864170973bfba | subj27.zip | a6e17a99b4597e26bb38ab86eb1c6f31534a643fb5ede55c888d34eda87fa546 | subj28.zip | f4ba067b02e2fd15d06f3e9d0eb750d2054d199b18d53258a77158153bef5c20 |
subj29.zip | 2cfec4175ed17b2763af4d21b6a924fd31cc4176109e53beaa5141ff63715db0 | subj30.zip | c57c28de39dd81f579df00758717e1e9f3a73df338c61b51667be450beea8220 | subj31.zip | ee83b60bc25e5e3fe63f56437e8a9c023d9e7deebea987e20516d568280f01c0 | subj32.zip | 7e217b54696f9214585072def09f43a11a3f0fa2181d9478106055c69f2a2a48 |
subj33.zip | 3f214e38daf8d416d2c5a5f5c32e6ea093a6c5cbc4c9eaa47e4a7e23c733a001 | subj34.zip | 39634d0f0e2db1bce9b9254a7dbc1425d5c4f83b38b64158e6ad8d411dfe4c65 | subj35.zip | 75c41ae4ade932774a3e8d58684c2b7fb9d62895dc87df9e1112d4d54894dae5 | subj36.zip | fc71637a6c46ad8de362204277c0bd68058b7a1e88c05c1bd713d80a85ea74ed |
subj37.zip | 75347b1c8cd5e4afaa923376bfe5109e890c5593388de2145cfd2fc081d9ba83 | subj38.zip | e99d2de0a05c798fa91112ea69468bd0cbeb00aaad1c5d330bacee0ede30f540 | subj39.zip | 669feb7b7effee0cc6d532151c509c89dcd5e85a259ad56cf9df321e526f56fa | subj40.zip | c3f632c5e52331209af17c44d4ee6a16511d625096a19b4b043bc811489b0fa3 |
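For example, an archive can be checked against the table like this; the expected value is copied from the subj01.zip row, and the file path is a placeholder for wherever you saved the download.

```python
# Sketch: verify a downloaded archive against its published sha256 checksum.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in 1 MiB chunks so large archives need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Expected value taken from the subj01.zip row of the table above.
expected = "a02d8e47d36235ae760f553b85b7df21cb711fecea73e7542df49e5c2d1441e6"
assert sha256_of("subj01.zip") == expected, "Checksum mismatch; please re-download."
```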
Citation
If you find FreeMan helpful and use it in your project, please cite our paper.
@article{wang2023freeman,
title={FreeMan: Towards Benchmarking 3D Human Pose Estimation in the Wild},
author={Wang, Jiong and Yang, Fengyu and Gou, Wenbo and Li, Bingliang and Yan, Danqi and Zeng, Ailing and Gao, Yijun and Wang, Junle and Zhang, Ruimao},
journal={arXiv preprint arXiv:2309.05073},
year={2023}
}
Contact
If you have any questions, please feel free to email us at [email protected], [email protected], or leave your questions in the chat/issues.