Deep Learning based Delay and Bandwidth Efficient Data Transmission in IoT


Kok İ., Corak B. H., Yavanoğlu U., Özdemir S.

IEEE International Conference on Big Data (Big Data), Los Angeles, United States Of America, 9 - 12 December 2019, pp. 2327-2333

  • Publication Type: Conference Paper / Full Text
  • DOI Number: 10.1109/bigdata47090.2019.9005680
  • City: Los Angeles
  • Country: United States Of America
  • Page Numbers: pp. 2327-2333
  • Ankara University Affiliated: No

Abstract

Internet of Things (IoT) applications generate a tremendous amount of data that is not only extremely large but also missing, noisy, and uncertain due to the intrinsic characteristics of IoT. These phenomena pose a number of challenges for managing the IoT network and ensuring the trustworthiness of data analytics. In particular, transferring all IoT data to the cloud for analysis may be costly, inefficient, and in some cases infeasible. Therefore, migrating sensor data processing and analysis closer to the edge devices plays a vital role in reducing the amount of data sent to the cloud, the IoT service delay, and the network latency. In this paper, we first aim to enable deep learning models on resource-constrained IoT devices. We then design and implement a real IoT testbed consisting of resource-constrained devices. We also provide a solution to the missing-sensor-data problem in IoT from the edge, fog, and cloud computing perspectives. Finally, we compare all computing approaches in terms of network load, latency, and delay. Experimental results show that deep learning based edge and fog computing approaches can greatly reduce network delay and bandwidth requirements.
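
The edge/fog idea summarized above (run a compact deep learning model near the sensors so that missing readings are imputed locally and only a small, completed record is forwarded upstream) can be illustrated with a minimal sketch. The network architecture, sensor count, and file names below are illustrative assumptions rather than the configuration reported in the paper, and TensorFlow / TensorFlow Lite stands in for whatever framework the authors actually deployed on their testbed.

```python
# Minimal sketch (assumptions, not the paper's exact setup): train a small
# regression network that predicts one sensor's reading from its co-located
# neighbours, then shrink it so it can run on a constrained edge/fog node.
import numpy as np
import tensorflow as tf

NUM_SENSORS = 4                      # assumed: 4 co-located sensors, one may drop out
rng = np.random.default_rng(0)

# Synthetic stand-in for historical readings (temperature, humidity, etc.).
X = rng.normal(size=(10_000, NUM_SENSORS - 1)).astype(np.float32)
y = (X.sum(axis=1) + rng.normal(scale=0.1, size=10_000)).astype(np.float32)

# Deliberately small network so the model fits on a resource-constrained device.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_SENSORS - 1,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Convert to TensorFlow Lite (with weight quantization) so inference can run
# on the edge/fog node instead of shipping raw readings to the cloud.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
open("imputer.tflite", "wb").write(converter.convert())

# On the edge device: impute the missing reading locally and forward only
# the small, completed record upstream.
interpreter = tf.lite.Interpreter(model_path="imputer.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
reading = np.array([[0.2, -0.5, 1.1]], dtype=np.float32)   # 3 live sensors
interpreter.set_tensor(inp["index"], reading)
interpreter.invoke()
imputed = interpreter.get_tensor(out["index"])[0, 0]        # estimated missing value
```

Under these assumptions, the quantized model and the imputation both live on the edge or fog node, so only the reconstructed record, rather than the raw sensor stream, has to cross the network; this local processing is the general mechanism behind the bandwidth and delay savings the abstract reports.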