Visual analysis of deep neural networks for device-free wireless localization

Shing Jiuan Liu*, Ronald Y. Chang, Feng-Tsun Chien

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Device-free indoor localization is a key enabling technology for many Internet of Things (IoT) applications. Deep neural network (DNN)-based location estimators achieve high-precision localization performance by automatically learning discriminative features from noisy wireless signals without much human intervention. However, the inner workings of DNNs are not transparent and not adequately understood, especially in wireless localization applications. In this paper, we conduct visual analyses of DNN-based location estimators trained with Wi-Fi channel state information (CSI) fingerprints in a real-world experiment. Using visualization techniques, we address questions such as 1) how well the DNN has learned and been trained, and 2) what critical features the DNN has learned to distinguish between different classes. The results provide plausible explanations and allow for a better understanding of the mechanism of DNN-based wireless indoor localization.
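The abstract describes training a DNN location classifier on Wi-Fi CSI fingerprints and then visually analyzing the features it learns. As a rough illustration only, the sketch below trains a small fully connected classifier on placeholder CSI amplitude vectors and projects its penultimate-layer activations to 2-D with t-SNE; the network architecture, data shapes, and the choice of t-SNE are assumptions made for this sketch and are not taken from the paper.

# Hypothetical sketch: a small DNN location classifier on CSI fingerprints,
# with a t-SNE view of its learned hidden features. Architecture, data
# shapes, and the t-SNE choice are illustrative assumptions only.
import numpy as np
import torch
import torch.nn as nn
from sklearn.manifold import TSNE

NUM_SUBCARRIERS = 30      # assumed CSI feature length per fingerprint
NUM_LOCATIONS = 8         # assumed number of location classes

# Placeholder CSI amplitude fingerprints (replace with measured data).
rng = np.random.default_rng(0)
X = rng.normal(size=(800, NUM_SUBCARRIERS)).astype(np.float32)
y = rng.integers(0, NUM_LOCATIONS, size=800)

model = nn.Sequential(
    nn.Linear(NUM_SUBCARRIERS, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),          # penultimate (feature) layer
    nn.Linear(32, NUM_LOCATIONS),          # class scores per location
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

xb, yb = torch.from_numpy(X), torch.from_numpy(y).long()
for _ in range(200):                       # short training loop
    opt.zero_grad()
    loss_fn(model(xb), yb).backward()
    opt.step()

# Project the 32-d hidden features to 2-D to inspect class separability.
with torch.no_grad():
    feats = model[:4](xb).numpy()          # activations after the second ReLU
emb = TSNE(n_components=2, perplexity=30).fit_transform(feats)
print(emb.shape)                           # (800, 2), one point per fingerprint

Well-separated clusters in the 2-D embedding would suggest that the network has learned features that distinguish the location classes, which is the kind of question the paper's visual analysis targets.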

Original language: English
Title of host publication: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781728109626
DOIs
State: Published - Dec 2019
Event: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Waikoloa, United States
Duration: 9 Dec 2019 – 13 Dec 2019

Publication series

Name: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings

Conference

Conference: 2019 IEEE Global Communications Conference, GLOBECOM 2019
Country: United States
City: Waikoloa
Period: 9/12/19 – 13/12/19

