Design of a self-portrait application with sensor-assisted guiding for smart devices

Chi Chung Lo, Sz Pin Huang, Yi Ren, Yu-Chee Tseng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Taking a self-portrait on a smart device, such as a smartphone or a digital camera, is a lot of fun and can be easy when we know how. However, we rarely get a satisfying snapshot on the first try. Furthermore, the snapshots are usually stored on the camera, so the user cannot check them immediately from a distance while standing away from the camera. In this paper, we propose a self-portrait application that enables a smart device to prevent faces from being cut out of the camera frame by giving suggestions to users until they are in a suitable position in the frame. At the same time, we enable the smart device to share the photos automatically with a remote device via machine-to-machine (M2M) techniques. The remote device can also control the camera to retake photos. The proposed application has been implemented on an Android smartphone.
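The guiding behavior described in the abstract, suggesting movements until the face sits safely inside the frame, can be illustrated with a simple bounding-box rule. The sketch below is our own reconstruction, not the paper's implementation; the function name, `margin` parameter, and hint strings are illustrative assumptions.

```python
def guidance_hint(face_box, frame_size, margin=0.1):
    """Suggest movements so the detected face stays fully in frame.

    face_box:   (x, y, w, h) of the detected face, in preview pixels.
    frame_size: (frame_w, frame_h) of the camera preview.
    margin:     fraction of the frame kept as a safety border, so the
                face is never flush against an edge (illustrative choice).
    Returns a list of hints such as ["move right", "move down"],
    or an empty list when the face is in a suitable position.
    """
    x, y, w, h = face_box
    fw, fh = frame_size
    mx, my = fw * margin, fh * margin
    hints = []
    # Hints are in frame coordinates: if the face is too close to the
    # left edge, it should move right (toward the frame centre), etc.
    if x < mx:
        hints.append("move right")
    elif x + w > fw - mx:
        hints.append("move left")
    if y < my:
        hints.append("move down")
    elif y + h > fh - my:
        hints.append("move up")
    return hints
```

In the actual application this check would run on each preview frame, with the hints rendered or spoken to the user until the list is empty and the shot can be taken.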

Original language: English
Title of host publication: 2013 1st International Black Sea Conference on Communications and Networking, BlackSeaCom 2013
Pages: 99-101
Number of pages: 3
DOIs
State: Published - 16 Dec 2013
Event: 2013 1st International Black Sea Conference on Communications and Networking, BlackSeaCom 2013 - Batumi, Georgia
Duration: 3 Jul 2013 – 5 Jul 2013

Publication series

Name: 2013 1st International Black Sea Conference on Communications and Networking, BlackSeaCom 2013

Conference

Conference: 2013 1st International Black Sea Conference on Communications and Networking, BlackSeaCom 2013
Country: Georgia
City: Batumi
Period: 3/07/13 – 5/07/13

Keywords

  • Camera
  • Face Detection
  • Inertial Sensor
  • Smart Devices
