TY - GEN
T1 - 360° video viewing dataset in head-mounted virtual reality
AU - Lo, Wen-Chih
AU - Fan, Ching-Ling
AU - Lee, Jean
AU - Huang, Chun-Ying
AU - Chen, Kuan-Ta
AU - Hsu, Cheng-Hsin
PY - 2017/6/20
Y1 - 2017/6/20
N2 - 360° videos and Head-Mounted Displays (HMDs) are getting increasingly popular. However, streaming 360° videos to HMDs is challenging. This is because only the video content in viewers' Field-of-Views (FoVs) is rendered, and thus sending complete 360° videos wastes resources, including network bandwidth, storage space, and processing power. Optimizing 360° video streaming to HMDs is, however, highly data and viewer dependent, and thus dictates real datasets. However, to the best of our knowledge, such datasets are not available in the literature. In this paper, we present our datasets of both content data (such as image saliency maps and motion maps derived from 360° videos) and sensor data (such as viewer head positions and orientations derived from HMD sensors). We put extra efforts into aligning the content and sensor data using the timestamps in the raw log files. The resulting datasets can be used by researchers, engineers, and hobbyists to either optimize existing 360° video streaming applications (like rate-distortion optimization) or build novel applications (like crowd-driven camera movements). We believe that our dataset will stimulate more research activities along this exciting new research direction.
AB - 360° videos and Head-Mounted Displays (HMDs) are getting increasingly popular. However, streaming 360° videos to HMDs is challenging. This is because only the video content in viewers' Field-of-Views (FoVs) is rendered, and thus sending complete 360° videos wastes resources, including network bandwidth, storage space, and processing power. Optimizing 360° video streaming to HMDs is, however, highly data and viewer dependent, and thus dictates real datasets. However, to the best of our knowledge, such datasets are not available in the literature. In this paper, we present our datasets of both content data (such as image saliency maps and motion maps derived from 360° videos) and sensor data (such as viewer head positions and orientations derived from HMD sensors). We put extra efforts into aligning the content and sensor data using the timestamps in the raw log files. The resulting datasets can be used by researchers, engineers, and hobbyists to either optimize existing 360° video streaming applications (like rate-distortion optimization) or build novel applications (like crowd-driven camera movements). We believe that our dataset will stimulate more research activities along this exciting new research direction.
KW - 360° dataset
KW - 360° video
KW - HMD
KW - Head tracking dataset
KW - Virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85025583871&partnerID=8YFLogxK
U2 - 10.1145/3083187.3083219
DO - 10.1145/3083187.3083219
M3 - Conference contribution
AN - SCOPUS:85025583871
T3 - Proceedings of the 8th ACM Multimedia Systems Conference, MMSys 2017
SP - 211
EP - 216
BT - Proceedings of the 8th ACM Multimedia Systems Conference, MMSys 2017
PB - Association for Computing Machinery, Inc
Y2 - 20 June 2017 through 23 June 2017
ER -