My PhD research investigated how the volume of data generated by IoT devices, or wireless sensors (the term used in my research), could be intelligently compressed before being sent to a reporting station, or sink.
Thus, the aim of the research was to develop a model that could dynamically detect the characteristics of the environment around a set of sensors (that is, the scenario or application), and then determine and apply the best technique for the sensors to coordinate among themselves on how best to compress the data for transmission.
An example would be using a smartphone to collect data from surrounding smartphones and forward it to a server. Questions then arise about topology. Is it better for each device to pass its data to the next device in a chain, hop by hop, until the last device transmits to the server? Or is it better for all devices to send to a single central device, which then transmits to the server on their behalf? These two arrangements differ markedly in the energy consumed by each device, the latency of getting the information to the server, and the bandwidth used to send it. My research involved enabling the mobile devices to autonomously determine the best course of action, that is, to select one of these options, based on an objective such as minimising energy consumption, latency, or bandwidth. Other metrics could also be considered, such as throughput, network lifetime, and accuracy.
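To make the trade-off concrete, here is a toy back-of-the-envelope model, not taken from the research itself: it counts transmissions under a deliberately simplified radio model (every reading costs the same energy per hop, every hop adds the same delay; `e_tx` and `t_hop` are made-up parameters) for the two topologies described above.

```python
def chain_cost(n, e_tx=1.0, t_hop=10.0):
    """Chain: device 1 sends to device 2, and so on until device n
    reaches the server. Without compression, device i forwards its own
    reading plus everything it received, so it transmits i readings;
    the total is 1 + 2 + ... + n reading-transmissions."""
    transmissions = n * (n + 1) // 2
    energy = transmissions * e_tx   # total energy across all devices
    latency = n * t_hop             # n sequential hops to the server
    return energy, latency

def star_cost(n, e_tx=1.0, t_hop=10.0):
    """Star: n - 1 devices each send one reading to a central hub,
    which then uploads all n readings to the server."""
    transmissions = (n - 1) + n     # peer sends, then the hub's upload
    energy = transmissions * e_tx
    latency = 2 * t_hop             # one hop to the hub, one to the server
    return energy, latency

# The chain's energy grows quadratically with n, while the star's grows
# linearly; the star also keeps latency constant at two hops.
for n in (5, 10, 20):
    print(n, chain_cost(n), star_cost(n))
```

Even this crude sketch shows why no single answer is always right: the constants (and hence the crossover point) depend on the scenario, which is exactly what the model in this research was meant to detect.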
The research involved the usual scientific steps, such as selecting the research questions and hypothesis, data collection and processing, machine learning, and model development. I do not discuss these steps here, as they do not serve the purpose of this write-up, which is to provide a layman's discussion of what the research was about.
Next, I outline the architecture that the research produced. This includes the design of a model for the required data, the collection of data, the generation of rules, machine learning, and model evaluation. If more information about this research is required, kindly get in touch to discuss it further.
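As a flavour of the machine-learning step, the following minimal sketch trains a classifier that maps scenario characteristics to the coordination strategy expected to perform best. The feature names, example values, and labels are entirely hypothetical, invented for illustration; they are not the features or data used in the research.

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical scenario features:
# [number_of_devices, data_rate_kbps, mobile (0 = static, 1 = mobile)]
X = [
    [5,   10, 0],
    [50,  10, 0],
    [5,  100, 1],
    [50, 100, 1],
]
# Illustrative labels: the strategy judged best for each scenario
y = ["chain", "star", "star", "star"]

# A decision tree doubles as a rule generator: its learned splits can be
# read off as human-inspectable rules for strategy selection.
model = DecisionTreeClassifier(random_state=0).fit(X, y)

# Predict the strategy for a new, unseen scenario
print(model.predict([[10, 50, 0]]))
```

A tree-based model is shown here only because its splits are easy to interpret as rules, which matches the "generation of rules" step above; any classifier could fill the same role.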
Some of the tools used during the research included:
- NS3 Simulator (https://www.nsnam.org/) – for simulating network scenarios; requires development in C++
- Machine learning – Python with scikit-learn, pandas, matplotlib, etc.
- Ubuntu OS deployed in a VirtualBox virtual machine
I shall update this post to provide more details on the research. Please subscribe to keep informed.