When contemplating a new Internet of Things (IoT) design, embedded developers have many options to consider when selecting a means of communication for distances of up to approximately 100 meters. They might choose IEEE 802.15.1 Bluetooth in its various forms, such as Bluetooth Low Energy (BLE) and Bluetooth Classic. There are also the wireless protocols based on IEEE 802.15.4, such as ZigBee, WirelessHART, and Thread. Other possibilities include Z-Wave and 802.11 Wi-Fi. Weighing the merits of each can occupy a lot of time as you investigate their capabilities; the key criteria include range, mesh networking capability, data throughput, IP support, and, for a battery-powered device, the typical power consumption profile.
To complicate matters further, the specifications of each protocol continue to evolve in step with application requirements and available technology. For example, Bluetooth’s stated range is due to increase by a factor of four during 2016. For a Class 2 radio, that would mean a shift from a range of 10 meters to 40 meters. Naturally, each implementation varies, and since the Bluetooth Special Interest Group (SIG) only states a minimum range, it might be possible with a well-designed radio and good antenna placement for a BLE (v4.x) design to reach 100 m and a Classic Bluetooth (v2.0) design up to 1 km. The Bluetooth SIG will also double the data rate for BLE, going from a gross rate of 1 Mbps to 2 Mbps, while also lowering its latency. For industrial applications, latencies in the range of 10 ms are needed if a system is to react in a timely manner to an anomaly, as fail-safe capability is critical for industrial IoT systems and devices.
Another benefit of adding wireless communications is the ability to provide location, time, and other contextual data. Moving forward, the primary data that an IoT application requires may not be enough on its own. The value extracted from an IoT application (for example, managing a fleet of trucks) can be increased significantly with this additional data. Many Bluetooth radios already have a temperature sensor built in, so that is a good starting point.
For location alerts, iBeacons, introduced by Apple in 2013, have become popular but are not terribly accurate, having difficulty resolving ranges of under 1 m (see Figure 1). Within Bluetooth, the proximity profile (PXP) uses the received signal strength indicator (RSSI) to determine range. However, environmental factors such as the absorption of building materials and interference from other devices can vary the RSSI enough to make the estimated position rather inaccurate – something that gets worse as distance increases.
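To see why RSSI-based ranging drifts, consider the log-distance path-loss model commonly used for this estimate. The sketch below is illustrative: the 1 m calibration value and path-loss exponents are typical assumed figures, not values from any particular radio.

```python
import math

def estimate_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Estimate range from RSSI using the log-distance path-loss model.

    tx_power_dbm: expected RSSI at 1 m (a typical beacon calibration value).
    path_loss_exp: 2.0 in free space, higher in cluttered indoor spaces.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

# The same reading maps to very different distances as the environment changes:
print(estimate_distance(-75, path_loss_exp=2.0))  # free space, ~6.3 m
print(estimate_distance(-75, path_loss_exp=3.0))  # indoors, ~3.4 m
```

Because the exponent depends on walls, furniture, and interference, a single RSSI value is ambiguous, and the ambiguity grows with distance.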
[Figure 1 | Bluetooth iBeacons rely upon the relative strength of RF signals to determine position]
For truly accurate positioning down to the centimeter, calculations based on angle of arrival (AoA) and angle of departure (AoD) are well studied and have proven effective (see Figure 2).
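The geometry behind AoA positioning can be sketched simply: two anchors at known positions each report the angle at which a tag's signal arrives, and the tag lies at the intersection of the two bearing lines. This is an idealized, noise-free illustration; anchor positions and angles are hypothetical.

```python
import math

def locate(a1, theta1, a2, theta2):
    """Intersect bearing lines from two anchors (basic AoA triangulation).

    a1, a2: (x, y) anchor positions; theta1, theta2: angles of arrival
    in radians, measured from the positive x-axis.
    """
    x1, y1 = a1
    x2, y2 = a2
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve a1 + t*d1 = a2 + s*d2 for the parameter t along line 1.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t * d1[0], y1 + t * d1[1])

# A tag at (2, 2): the anchor at the origin sees it at 45 degrees,
# the anchor at (4, 0) sees it at 135 degrees.
print(locate((0, 0), math.radians(45), (4, 0), math.radians(135)))
```

In practice, arrays of antennas and phase-difference measurements supply the angles, and redundant anchors average out measurement noise.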
[Figure 2 | A GNSS-enabled gateway can augment short-range wireless communication]
Another option is to use time of flight (ToF) information over Wi-Fi. This measures how long a data packet takes to travel from transmitter to receiver. Tests have already found this to be accurate to < 30 cm. For asset tracking applications that use a Bluetooth link with a temperature sensor (for example, in a blood bank), this could prove an ideal match.
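The arithmetic behind ToF ranging is straightforward: distance is the speed of light multiplied by the flight time, which is why sub-30 cm accuracy demands roughly nanosecond timing resolution. A minimal sketch, using a round-trip measurement with an assumed (illustrative) processing delay:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s, processing_delay_s=0.0):
    """One-way distance from a round-trip time-of-flight measurement."""
    return C * (round_trip_s - processing_delay_s) / 2.0

# A 2 ns round trip corresponds to roughly 0.3 m one way, so each
# nanosecond of timing error shifts the estimate by about 15 cm:
print(tof_distance(2e-9))  # ~0.3 m
```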
GPS-enabled gateways can also provide very accurate timestamps, which could be combined with the previous example (see Figure 3). Even a smartphone could provide the timestamp data. Such inexpensive devices can also provide the link to the cloud through one of their multiple wireless interfaces.
[Figure 3 | GNSS-based time stamping can be used to provide accuracy in the nanosecond region]
Signal fingerprinting techniques are another way to attach positional information to data. This approach maps the RF signatures observed at known locations from Wi-Fi access points or cellular towers in order to provide a more accurate location.
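The matching step of fingerprinting can be sketched as a nearest-neighbor lookup in signal space, assuming a pre-surveyed database of RSSI readings per location. All identifiers, positions, and signal values below are illustrative.

```python
def locate_by_fingerprint(observed, fingerprint_db):
    """Match observed RSSI readings against a site-survey database.

    observed: {access_point_id: rssi_dbm} measured by the device.
    fingerprint_db: {(x, y): {access_point_id: rssi_dbm}} from a survey.
    Returns the surveyed position whose fingerprint is closest in
    Euclidean distance over the shared access points.
    """
    def signal_distance(fp):
        shared = observed.keys() & fp.keys()
        return sum((observed[ap] - fp[ap]) ** 2 for ap in shared) ** 0.5

    return min(fingerprint_db, key=lambda pos: signal_distance(fingerprint_db[pos]))

db = {
    (0, 0): {"ap1": -40, "ap2": -70},
    (5, 0): {"ap1": -70, "ap2": -40},
}
print(locate_by_fingerprint({"ap1": -42, "ap2": -68}, db))  # (0, 0)
```

Accuracy depends on the density of the survey grid and on how stable the RF environment remains after the survey.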
Some emerging applications, such as vehicle-to-infrastructure (V2I), vehicle-to-vehicle (V2V), and vehicle-to-everything (V2X), also demand a wireless link with the lowest possible communications latency. For these applications, a communication protocol based on IEEE 802.11p in the 5.9-GHz band with a 10-MHz channel is the likely choice. Typical signal delays are anticipated to be below 50 ms.
Though the options for short-range indoor connectivity are many and varied, that hasn’t always been the case for cellular communications, particularly for low data rate, ultra-low-power, battery-powered IoT applications. Low power wide area network (LPWAN) protocols such as LoRa and Sigfox have emerged over the past couple of years, but the establishment of the Narrowband IoT (NB-IoT) standard by the 3GPP body for use in licensed GSM and LTE spectrum is gathering momentum.
Aimed at both indoor and outdoor applications that require a combination of very long battery life, penetration underground, and low cost, NB-IoT is worthy of consideration (see Figure 4).
[Figure 4 | Narrowband IoT offers many advantages]
Traditional GSM and LTE networks can typically fulfill only 95-99 percent of outdoor coverage needs. Cellular-based IoT applications have high coverage requirements, especially for critical applications and those in hard-to-reach locations; the target for indoor coverage is 99.5 percent or higher. The 3GPP standards group is looking to address this issue by enhancing power spectral density (PSD) using narrowband technologies. A 16-fold retransmission mechanism, together with standalone and guard-band spectrum planning modes, increases the coverage gain of NB-IoT by 20 dB compared with GSM networks.
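The 20 dB figure combines several mechanisms, and a rough, back-of-envelope sketch shows where most of it comes from: ideal coherent combining of 16 repetitions contributes about 12 dB, and concentrating a carrier's power into a narrower bandwidth raises PSD further. The bandwidth figures below (a 200 kHz GSM carrier versus a 15 kHz NB-IoT tone) are illustrative of the standard's single-tone mode, and real-world gains fall short of these ideal numbers.

```python
import math

def repetition_gain_db(n):
    """Ideal coverage gain from coherently combining n repetitions."""
    return 10.0 * math.log10(n)

def psd_boost_db(wide_khz, narrow_khz):
    """Gain from concentrating the same power into a narrower bandwidth."""
    return 10.0 * math.log10(wide_khz / narrow_khz)

print(round(repetition_gain_db(16), 1))  # 12.0 dB from 16 repetitions
print(round(psd_boost_db(200, 15), 1))   # 11.2 dB from narrowbanding
```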
Operating at significantly lower power and data rates compared to a full cellular service, NB-IoT provides the necessary levels of robustness and reliability that IoT demands. It is well suited to applications such as reading gas and water meters by regularly transmitting small amounts of data. Other potential areas for NB-IoT include street lighting control for smart cities, building automation, people tracking, and agricultural monitoring.
So we have now covered the where and the when of connectivity – but is your IoT design secure? Ensuring that an IoT application operates safely and reliably is crucial. Even the most secure, robust system has vulnerabilities that a hacker, given enough time, could compromise – an unacceptable situation for any high-reliability system. Adding more contextual information, such as an item’s time and position, can help reveal the level or nature of an attack. A means of identifying potential gaps in security needs to be found, and those gaps closed as quickly as possible.
In the case of an IoT sensor, a chain of trust must be established from the sensor to the microcontroller and wireless module, and all the way through to the end application (see Figure 5).
[Figure 5 | Securing your IoT design – highlight all potential attack points and leave nothing open]
As a way of understanding the aspects of an IoT design that should be carefully reviewed, developers will find the concept of the five pillars of a secure design useful. They comprise device firmware and secure boot, communications to the service, interface security, enforcement of API control, and robustness techniques that can handle attempts to spoof or jam the device.
Secure boot ensures that a device is executing the intended firmware by authenticating each stage before booting the next. Also, while over-the-air updates are useful for updating many widely deployed IoT devices at once, they create an attack surface, so all firmware must be validated before being installed. A good implementation keeps a backup of a previously authenticated image to allow backtracking if there is a problem.
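The staged authentication can be modeled as a chain of verify-then-execute steps. The sketch below is a simplified stand-in: a real secure boot uses asymmetric signatures with a root key fused into silicon, whereas this illustration uses an HMAC with a hypothetical device key so the flow can be demonstrated in a few lines.

```python
import hashlib
import hmac

# Hypothetical device-unique root of trust; in hardware this would live
# in ROM or one-time-programmable fuses, never in mutable storage.
ROOT_KEY = b"device-unique-root-key"

def sign(image: bytes) -> bytes:
    """Stand-in for the signing step performed at firmware build time."""
    return hmac.new(ROOT_KEY, image, hashlib.sha256).digest()

def boot(stages):
    """Authenticate each stage before 'executing' it; halt on any mismatch."""
    for name, image, tag in stages:
        if not hmac.compare_digest(sign(image), tag):
            print(f"{name}: verification failed, falling back to backup image")
            return False
        print(f"{name}: verified, booting next stage")
    return True

bootloader = b"bootloader-v2"
app = b"application-v7"
stages = [("bootloader", bootloader, sign(bootloader)),
          ("application", app, sign(app))]
print(boot(stages))  # True

# A tampered image fails authentication and the chain stops there:
stages[1] = ("application", b"application-TAMPERED", sign(app))
print(boot(stages))  # False
```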
Within the transport layer, the data needs to be encrypted so that the device can authenticate itself with a server and communicate data without the possibility of a man-in-the-middle attack. Use of a per-session secure key management routine will ensure this part of the communications process does not introduce a vulnerability.
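One common way to realize per-session keys is a key derivation function such as HKDF (RFC 5869): each session mixes a long-term secret with fresh random material, so capturing one session key never exposes past or future traffic. A minimal sketch with a hypothetical long-term secret; production designs would use a vetted TLS stack rather than hand-rolled derivation.

```python
import hashlib
import hmac
import os

def hkdf(master_secret, salt, info, length=32):
    """Minimal HKDF-SHA256 (RFC 5869): extract, then expand."""
    prk = hmac.new(salt, master_secret, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

master = b"long-term device secret"  # hypothetical provisioned secret

# A fresh random salt per connection yields a fresh session key:
k1 = hkdf(master, os.urandom(16), b"session")
k2 = hkdf(master, os.urandom(16), b"session")
print(k1 != k2)  # True
```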
To permit many third parties to communicate with the device, APIs are typically used. The challenge is that they might introduce vulnerabilities. Many APIs are well-documented, open source libraries, so a hacker will look for parts of the API that might give them access to device functions. During development, engineers might leave some API functions open or use undocumented features, so they need to be diligent about closing these when creating production versions of the code.
The final part in securing an IoT device is to ensure that it is robust enough to maintain security of operation when somebody is trying to jam or spoof the communications link. This is particularly the case for weak GNSS signals.
With some thought and the guidance of the points discussed above, developers can ensure their IoT design is implemented in the most efficient and secure way possible.