LaRED: A large RGB-D extensible hand gesture dataset

Yuan Sheng Hsiao, Jordi Sanchez-Riera, Tekoing Lim, Kai Lung Hua, Wen-Huang Cheng

Research output: Contribution to conference › Paper › peer-review

18 Scopus citations


We present LaRED, a Large RGB-D Extensible hand gesture Dataset, recorded with Intel's newly developed short-range depth camera. This dataset is unique and differs from existing ones in several respects. First, the large volume of data recorded: 243,000 tuples, where each tuple is composed of a color image, a depth image, and a mask of the hand region. Second, the number of classes provided: a total of 81 classes (27 gestures in 3 different rotations). Third, the extensibility of the dataset: the software used to record and inspect the dataset is also available, allowing future users to increase both the amount of data and the number of gestures. Finally, in this paper, some experiments are presented to characterize the dataset and establish a baseline as a starting point for developing more complex recognition algorithms. The LaRED dataset is publicly available at:
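The class structure described above (27 gestures, each in 3 rotations, for 81 classes) can be sketched as follows. This is a minimal illustration, not the dataset's actual API: the `LaredSample` type, file names, and the gesture-major label ordering are all assumptions for demonstration — the real file layout and label convention come from the recording and inspection software released with the dataset.

```python
from dataclasses import dataclass

NUM_GESTURES = 27   # base hand poses in LaRED
NUM_ROTATIONS = 3   # each gesture is recorded in 3 rotations
NUM_CLASSES = NUM_GESTURES * NUM_ROTATIONS  # 81 classes in total


@dataclass
class LaredSample:
    """One dataset tuple: color image, depth image, and hand-region mask.

    Here the image fields are just placeholder file names; in practice
    they would be paths or decoded image arrays.
    """
    color: str
    depth: str
    mask: str
    gesture: int    # 0 .. 26
    rotation: int   # 0 .. 2


def class_index(sample: LaredSample) -> int:
    """Map (gesture, rotation) to a flat class label in 0..80.

    Gesture-major ordering is a hypothetical convention chosen for
    this sketch, not one documented by the dataset.
    """
    assert 0 <= sample.gesture < NUM_GESTURES
    assert 0 <= sample.rotation < NUM_ROTATIONS
    return sample.gesture * NUM_ROTATIONS + sample.rotation


sample = LaredSample("g05_r1.png", "g05_r1_depth.png", "g05_r1_mask.png",
                     gesture=5, rotation=1)
print(class_index(sample))  # → 16
```

A flat label like this is what a baseline classifier over all 81 classes would predict; splitting the label back into gesture and rotation is just `divmod(label, NUM_ROTATIONS)`.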

Original language: English
Number of pages: 6
State: Published - 1 Jan 2014
Event: 5th ACM Multimedia Systems Conference, MMSys 2014 - Singapore, Singapore
Duration: 19 Mar 2014 - 21 Mar 2014


Conference: 5th ACM Multimedia Systems Conference, MMSys 2014


Keywords:
  • Dataset
  • Depth data
  • Hand gestures
  • RGB images

