Motivation

Wireless sensors are everywhere. The Internet of Things is exploding, and everything from small battery-powered temperature sensors to full-featured webcams is grabbing an IP address and hopping on the Internet. Is there really a need to explore new architectures for sensor platforms? We believe so.

Technology shifts have enabled new sensor platforms, new usage modes, and a rapidly expanding set of big-data-driven applications for analyzing and modeling complex systems. Some of the key technology improvements that motivate us to rethink traditional Internet of Things and wireless sensor architectures include:

My Fitbit, a micro-sized fitness tracker with a battery-powered 3D accelerometer and low-power Bluetooth wireless networking.

1) Massive Computational Power: Years ago, scientists used “data loggers,” devices that simply recorded data from sensors for later download over an RS232 serial link. Beginning around 2000, computer scientists began building small micro-sensor “motes” equipped with simple Atmel microprocessors, powered by small batteries, and linked via ad-hoc wireless mesh networks. These systems essentially replaced the data logger with a wireless link. While these nano-sensors are useful for providing tiny data streams, improvements in low-power computation have made it possible to build a new class of intelligent and attentive sensors with extraordinarily powerful multicore processors in small, low-power packages. This opens up exciting new architectural possibilities and challenges. In this new class, sensors become powerful, always-attentive computational nodes in a large-scale, dynamic distributed system.

2) Inexpensive, High-resolution Sensors: For many years, sensor deployments simply recorded as much data as could practically be stored in the logger. Today, high-definition cameras, hyperspectral imagers, LIDAR, and micro radar are examples of advanced sensors that can easily generate more data than is practical to store locally or uplink to a server; a hyperspectral camera alone can generate tens of gigabytes of data a day. Intelligent sensors must leverage their computational power to enable in-situ processing and feature extraction. In-situ processing can also improve the privacy of sensor systems. For example, a high-definition camera in an intelligent sensor node might count pedestrian traffic, gauge street congestion, or detect a person in distress without ever saving or sending images outside the node (see the sketch after this list).

3) Cloud Computing: In the past, building a distributed sensor platform required significant investments in server-side computing resources and software. Data from sensors has been plentiful, but the scalability of the data repository has been a major shortcoming: elastically growing the storage or data-serving components while providing public interfaces for scientific applications to extract and use the data was difficult, if not impossible. Today, ubiquitous cloud services have made it possible to scale resources to meet demand. The cloud model also simplifies software development, allowing developers to instantly deploy their own cloud for the experimental sensors they are building or testing.
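To make the privacy point in item 2 concrete, here is a minimal, illustrative sketch of in-situ feature extraction: the node analyzes a camera frame locally and publishes only a pedestrian count, never the image. It assumes OpenCV is available on the node; the publish() helper is hypothetical and stands in for whatever uplink (MQTT, HTTP, etc.) a real platform would use.

```python
# Illustrative sketch only: count pedestrians on the node and publish the
# count, never the images. Assumes OpenCV (cv2) is installed on the node;
# publish() is a hypothetical stand-in for the platform's uplink.
import cv2

def publish(topic, value):
    # Placeholder for a real uplink (MQTT, HTTP, ...); here it just logs.
    print(f"{topic}: {value}")

# OpenCV's stock HOG-based people detector, run entirely on the node.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

camera = cv2.VideoCapture(0)   # local camera; frames never leave the node
ok, frame = camera.read()
if ok:
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    publish("env.pedestrian_count", len(boxes))
camera.release()
```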

These three technology shifts enable new designs and architectures. However, they represent only half of the story; the data from sensor networks is also being used in new ways:

Nick and Gavin set up a hyperspectral camera at the Green Earth Institute organic farm in Naperville, 2013.

1) Near Real-time Analysis: Simulations of weather, traffic, plant growth, wildlife migration, and air pollution must all begin with initial conditions, the inputs to the models. While there may be little urgency to feed “live” data into a climate model predicting changes decades in the future, many simulations are more accurate and provide better predictive capability when they run with the most up-to-date data. Intelligent, attentive sensor networks need architectures that can quickly aggregate, compress, verify, and stream data into computer simulations.

2) Actuation and Control: In many applications, there is a natural feedback loop between computation and analysis and the sensor platform. Imagers can be moved and refocused, air-particulate sensors have adjustable pumps and filters, and microfluidic sensors have MEMS controllers. A new breed of attentive sensor platforms can use local in-situ computation as well as remote cloud computing resources to continuously analyze data and adjust and control the sensor platform (a simple control-loop sketch follows this list).

3) Privacy and Security: Unfortunately, data breaches and cybersecurity vulnerabilities are becoming commonplace. Scientists building sensor networks must carefully consider the privacy implications of the data they collect and the security protocols that protect it. New architectures for sensor networks can improve data integrity, privacy, and security with clear boundaries, data-management protocols, and best-in-class security practices.

4) Fault-tolerant, Autonomous Operation: While cheap, walnut-sized sensors might be expected to fail often, intelligent attentive systems that actively control a range of sensors and support complex in-situ data analysis should be extremely robust and resistant to failure. Like a NASA deep-space probe, these sensor platforms must be built with a layered software architecture supporting multiple safe modes, consistency checks, and active fault avoidance.

5) Modular Open Source: Many of today’s embedded systems are built either on proprietary operating systems and software stacks, or on a difficult-to-maintain, one-off stack cobbled together for a particular deployment, with no support for extensibility or adding new modules. To support a diverse set of sensor platforms, a new modular hardware and software architecture is needed.
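As a companion to item 2 above, here is a minimal, illustrative control-loop sketch: a node holding the sampling flow of an air-particulate pump at a setpoint by adjusting the pump’s duty cycle. The flow-meter and pump functions are hypothetical stand-ins (simulated here) for whatever hardware drivers a real deployment would expose; the gain and setpoint values are placeholders.

```python
# Illustrative sketch only: a simple proportional feedback loop of the kind
# described in item 2 (actuation and control). The sensor and actuator
# functions below are hypothetical stand-ins, simulated so the sketch runs.
import random
import time

TARGET_FLOW = 1.0   # liters per minute the sampling protocol calls for (placeholder)
GAIN = 0.2          # proportional gain; would be tuned per deployment

def read_flow_lpm():
    # Stand-in for the real flow-meter driver; returns a simulated reading.
    return random.uniform(0.7, 1.3)

def set_pump_rate(duty):
    # Stand-in for the real pump driver; here we just log the new duty cycle.
    print(f"pump duty cycle -> {duty:.2f}")

duty = 0.5
for _ in range(5):                                    # a real node would loop indefinitely
    error = TARGET_FLOW - read_flow_lpm()
    duty = max(0.0, min(1.0, duty + GAIN * error))    # proportional adjustment, clamped
    set_pump_rate(duty)
    time.sleep(1)
```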