Growing the Internet of Things, part 3: Interoperability

December 12, 2016 OpenSystems Media

What will it take to really grow the Internet of Things (IoT)? The answer is complex and multifaceted, and we have to consider the following areas:

  • Cost
  • Ease of use
  • Interoperability
  • Future proofing
  • Security

In part 1 we discussed the cost of IoT devices and the factors, such as economies of scale and technology integration, that are driving down price. In part 2, we addressed the ease-of-use challenges facing the IoT and highlighted, using the example of connected light bulbs, a few considerations that device developers must address to accommodate the possibly limited technical savvy of lay users.

Now, one of the biggest questions in the IoT space: Interoperability.

Interoperability means different things to different people. For someone working on a standard, it means a device has passed certification testing. Most standards prescribe a set of tests that, once passed, allow the product to display the appropriate logo. For a device manufacturer, interoperability means their devices work with each other, but not necessarily with other manufacturers’ devices.

For a consumer, it means similar devices should work together and allow the end user to control them, pair them, or define functions between them in an expected way. Consumers are smart enough not to try to pair a Bluetooth headset with a Bluetooth keyboard – they know that does not work. But if you sell them a light switch, they expect it to work with all of their lights, not just a subset.

Interoperability has many layers, and this is part of what makes it a critical design consideration. Standards-based certification testing is a basic building block for device interoperability that sets a minimum bar for product testing. However, certification testing is generally designed to validate over-the-air (OTA) packet formats and behaviors, and most of this type of testing does not cover larger networks, performance characteristics, or negative testing. This testing is also normally done on devices and not consumer interfaces, such as a smartphone application used to interact with devices. While certification testing is important, it does not solve all consumer interoperability problems.

Many companies that provide broader ecosystems, spanning their own products and those of other manufacturers, set up their own integration and interoperability testing to further validate device performance. In some cases, device manufacturers buy other vendors’ devices and build an internal testing lab to minimize consumer-facing problems.

Interoperability must first be addressed at the connectivity level: devices have to be on common networks and able to connect to each other, just as everyone must have a phone connection before anyone can place a call. Connectivity for a device is selected based on performance, power consumption, and what other devices it is expected to connect with. Based on these factors, a manufacturer may choose Wi-Fi, Bluetooth, 802.15.4, or even a proprietary radio. If these all carry IP traffic, the devices can still connect to one another through routers.
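The trade-offs above can be sketched as a simple selection rule. This is an illustrative heuristic only: the function name, thresholds, and peer categories below are invented for the sketch and are not vendor guidance or part of any standard.

```python
def pick_connectivity(peak_bandwidth_kbps: int, battery_powered: bool, peers: set) -> str:
    """Illustrative-only heuristic for choosing a device radio.

    Thresholds and categories are invented for this sketch; a real
    design review weighs range, cost, certification, and ecosystem fit.
    """
    if "smartphone" in peers and battery_powered:
        return "Bluetooth"   # direct phone pairing at low power
    if peak_bandwidth_kbps > 1000:
        return "Wi-Fi"       # high throughput; mains power assumed
    if battery_powered:
        return "802.15.4"    # low-power radio used by mesh protocols
    return "Wi-Fi"           # default when power is not constrained

print(pick_connectivity(5, True, {"gateway"}))     # -> 802.15.4
print(pick_connectivity(4000, False, {"router"}))  # -> Wi-Fi
```

The point of the sketch is that each radio wins under different constraints, which is exactly why a heterogeneous network of devices ends up spanning several link layers.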

The growing convergence on IP connectivity across more devices will resolve the connectivity issue, but this alone is not enough to ensure interoperability. For a manufacturer, the connectivity choice is fairly natural, but the choice of application layer can be very confusing. Devices today speak different application protocols such as Apple HomeKit, Google’s Weave, ZigBee, Z-Wave, KNX, and many proprietary options. This means that even if devices can connect to each other, they do not speak a common language and so cannot interact.
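The gap between shared connectivity and a shared language can be shown with two hypothetical payloads for the same command. Both messages could travel over the same IP network, yet a device built for one schema cannot act on the other. The field names below are invented for illustration and do not reflect any actual protocol’s encoding.

```python
import json

# Two hypothetical application-layer encodings of "turn the light on".
# Field names are invented; real protocols also differ in binary framing,
# security handshakes, and device models.
ecosystem_a = json.dumps({"cmd": "set", "attr": "onoff", "value": 1})
ecosystem_b = json.dumps({"action": "power", "state": "on"})

def device_b_handler(packet: str):
    """A device that only understands ecosystem B's schema."""
    msg = json.loads(packet)
    if msg.get("action") == "power":
        return f"light {msg['state']}"
    return None  # packet arrived over IP, but was not understood

print(device_b_handler(ecosystem_b))  # -> light on
print(device_b_handler(ecosystem_a))  # -> None: connected, no common language
```

Both packets reach the device; only one produces an action. Bridging this gap today requires per-protocol translation in a hub or cloud service, which is the fragmentation the article argues the market must resolve.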

A manufacturer does not want to maintain multiple product families or software images just to handle differences in application layer protocol. The next step, then, is for the market to resolve this divergence and converge on two or three common application-layer standards in place of the vast number of choices that exist today. Device makers and ecosystem providers are working now to identify which application layers will win the most market acceptance, are simple to implement and use, and can span the widest range of devices to give consumers a complete set of choices.

Interoperability beyond the connectivity and application layers then will encompass the consumer interaction, cloud, and mobile interfaces. These will change rapidly as the number of devices grows, and the market will adopt what works best for end users. These interfaces and applications typically update on a much faster cycle than the embedded devices they are connecting with.

Interoperability requires close focus from standards organizations as well as from leading companies in the IoT market. Without that attention, the market will remain fragmented into separate ecosystems, limiting its growth.

Next up: Future Proofing.

Skip Ashton is Vice President of Software at Silicon Labs.

Silicon Labs

www.silabs.com

@siliconlabs

LinkedIn: www.linkedin.com/company-beta/165971

Facebook: www.facebook.com/siliconlabs

YouTube: www.youtube.com/user/ViralSilabs

 
