Visual analysis of deep neural networks for device-free wireless localization

Shing Jiuan Liu*, Ronald Y. Chang, Feng-Tsun Chien

*Corresponding author for this work

Research output: Conference contribution, peer-reviewed

Abstract

Device-free indoor localization is a key enabling technology for many Internet of Things (IoT) applications. Deep neural network (DNN)-based location estimators achieve high-precision localization performance by automatically learning discriminative features from noisy wireless signals without much human intervention. However, the inner workings of DNNs are not transparent and not adequately understood, especially in wireless localization applications. In this paper, we conduct visual analyses of DNN-based location estimators trained with Wi-Fi channel state information (CSI) fingerprints in a real-world experiment. Via visualization techniques, we address such questions as 1) how well the DNN has learned and been trained, and 2) what critical features the DNN has learned to distinguish different classes. The results provide plausible explanations and allow for a better understanding of the mechanism of DNN-based wireless indoor localization.
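The record does not give the paper's exact network architecture or visualization method, so the following is only a minimal illustrative sketch of the general approach the abstract describes: a fully connected location classifier trained on CSI fingerprints, with a 2-D embedding (here t-SNE, an assumed choice) of its hidden-layer activations to inspect how well the learned features separate location classes. All names, dimensions, and hyperparameters (CSILocalizer, NUM_SUBCARRIERS, etc.) are hypothetical and not taken from the paper.

    # Minimal sketch (not the authors' code): DNN location classifier on CSI
    # fingerprints + t-SNE visualization of penultimate-layer activations.
    # Dimensions and hyperparameters below are illustrative assumptions.
    import torch
    import torch.nn as nn
    from sklearn.manifold import TSNE

    NUM_SUBCARRIERS = 90      # assumed fingerprint length (e.g., 3 antennas x 30 subcarriers)
    NUM_LOCATIONS = 8         # assumed number of location classes
    SAMPLES_PER_CLASS = 100

    class CSILocalizer(nn.Module):
        """DNN location estimator: CSI fingerprint -> location class scores."""
        def __init__(self, in_dim, num_classes, hidden=128):
            super().__init__()
            self.features = nn.Sequential(
                nn.Linear(in_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
            )
            self.classifier = nn.Linear(hidden, num_classes)

        def forward(self, x, return_features=False):
            h = self.features(x)          # penultimate-layer activations
            return h if return_features else self.classifier(h)

    # Synthetic stand-in for measured CSI fingerprints: each location gets its
    # own mean amplitude profile plus noise (real data would come from Wi-Fi NICs).
    torch.manual_seed(0)
    means = torch.randn(NUM_LOCATIONS, NUM_SUBCARRIERS)
    X = (means.repeat_interleave(SAMPLES_PER_CLASS, dim=0)
         + 0.3 * torch.randn(NUM_LOCATIONS * SAMPLES_PER_CLASS, NUM_SUBCARRIERS))
    y = torch.arange(NUM_LOCATIONS).repeat_interleave(SAMPLES_PER_CLASS)

    model = CSILocalizer(NUM_SUBCARRIERS, NUM_LOCATIONS)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Standard supervised training loop over fingerprint/label pairs.
    for epoch in range(50):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()

    # Visualization step: embed the learned hidden representations in 2-D.
    # Well-separated clusters (one per location) suggest discriminative features;
    # overlapping clusters point to confusable locations.
    with torch.no_grad():
        feats = model(X, return_features=True).numpy()
    embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(feats)
    print("t-SNE embedding shape:", embedding.shape)  # (num_samples, 2), ready to scatter-plot per class

Exposing the hidden activations through a return_features flag is one simple way to support this kind of post-hoc inspection; saliency or activation-map methods would probe the same question of which input features the network relies on.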

Original language: English
Title of host publication: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (electronic): 9781728109626
DOIs
Publication status: Published - December 2019
Event: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Waikoloa, United States
Duration: 9 December 2019 - 13 December 2019

Publication series

Name: 2019 IEEE Global Communications Conference, GLOBECOM 2019 - Proceedings

Conference

Conference: 2019 IEEE Global Communications Conference, GLOBECOM 2019
Country: United States
City: Waikoloa
Period: 9/12/19 - 13/12/19
