Direct view manipulation for drone photography

Yi Ling Chen, Wei Tse Lee, Li-Wei Chan, Rong Hao Liang, Bing Yu Chen

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding), peer-reviewed



For a long time, photographers have held and moved their cameras while simultaneously considering how to frame a good shot. With the emergence of drones, people have begun letting flying carriers hold their cameras in order to take more compelling pictures. However, the viewpoints of the photographer and the device become decoupled, and every single movement must be explicitly instructed via a remote controller. Even with first-person-view video streaming, users still have to be highly skilled to pilot the drone fluently without being distracted from photo composition. Inspired by the concept of viewfinder editing [Baek et al. 2013], we propose a more intuitive interface for controlling the flying camera (i.e., the drone) through direct view manipulation embodied in multi-touch gestures, which allows users to directly alter and rearrange the visual elements in the picture prior to image capture. In our proof-of-concept implementation, the viewfinder of the flying camera is mapped to the screen of a mobile device, and the physical camera movements are encoded as common photo-manipulation operations, such as translation and scaling, performed with multi-touch gestures.
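The mapping the abstract describes, where dragging and pinching the on-screen image is translated into physical camera motion, can be sketched roughly as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: the function name, the scaling constants, and the choice of logarithmic pinch-to-dolly mapping are all illustrative assumptions.

```python
# Hypothetical sketch of direct view manipulation: convert a screen-space
# multi-touch gesture (drag + pinch) into a physical camera translation.
# All names and constants below are illustrative assumptions.
import math
from dataclasses import dataclass

@dataclass
class DroneMove:
    right_m: float    # lateral translation (metres), positive = right
    up_m: float       # vertical translation (metres), positive = up
    forward_m: float  # dolly translation (metres), positive = toward scene

def gesture_to_move(drag_dx_px: float, drag_dy_px: float, pinch_scale: float,
                    metres_per_px: float = 0.01,
                    metres_per_log_scale: float = 2.0) -> DroneMove:
    """Map a viewfinder gesture to a camera move.

    Dragging the image content to the right should shift the scene right in
    the frame, which requires the camera to translate LEFT (hence the sign
    flip). Screen y grows downward, so dragging content down moves the
    camera up. Pinching out (scale > 1) enlarges subjects, i.e. the camera
    flies forward; log scaling makes pinch-in and pinch-out symmetric.
    """
    return DroneMove(
        right_m=-drag_dx_px * metres_per_px,
        up_m=drag_dy_px * metres_per_px,
        forward_m=math.log(pinch_scale) * metres_per_log_scale,
    )

# Example: drag the view 100 px to the right with no zoom change.
move = gesture_to_move(drag_dx_px=100, drag_dy_px=0, pinch_scale=1.0)
print(move)  # camera translates 1 m left, no vertical or forward motion
```

The sign inversion on the drag axes is the essential point: the user manipulates the *picture*, and the system derives the opposite camera motion that realizes it.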

Original language: English
Title of host publication: SIGGRAPH Asia 2015 Posters, SA 2015
Publisher: Association for Computing Machinery, Inc.
ISBN (Electronic): 9781450339261
State: Published - 2 Nov 2015
Event: SIGGRAPH Asia, SA 2015 - Kobe, Japan
Duration: 2 Nov 2015 - 6 Nov 2015

Publication series: SIGGRAPH Asia 2015 Posters, SA 2015

Conference: SIGGRAPH Asia, SA 2015
