Standards and security-aware design for connected devices

By Brandon Lewis

Editor-in-Chief

Embedded Computing Design

February 22, 2017

In Internet of Things (IoT) devices, security is often viewed as the purview of a specialized person or team in the development process, but the responsibility of designing safe devices belongs to all engineers, from mechanical to electronic to software.

In the second of a series of monthly interviews on cyber security with Andrew Girson, Co-Founder and CEO of embedded training and consulting firm Barr Group, we discuss how to understand architectural security requirements, the merits (and limits) of security standards and certifications like Common Criteria, and how companies need to change their security approach to prevent cyber attacks on consumer IoT devices.

From a development perspective, what is the most overlooked aspect of security in connected consumer devices?

GIRSON: Perhaps the most overlooked aspect of security design is at a higher level, specifically at the system or architectural level. It is best to frame a question like this in terms of three concepts: strategic security requirements, design disciplines, and design layers. The concept of a secure Internet of Things (IoT) device has to mean something to both the designers and the end users (requirements). Security is likely to span everything from the software executing on the electronics, to the physical wired and wireless connections to the outside world, to the device's mechanical interfaces and packaging (disciplines). And all of it needs to be built on a layered foundation of security (layers).

Consider, for example, the concept of intrusion detection requirements, which are commonplace for IoT devices and the cloud-based software systems to which they connect. The first step is threat modeling, which helps identify threats and viable attacks, including electrical, mechanical, and software vulnerabilities. System architects must then refine what this means for a device in terms of requirements, because if you ask a mechanical engineer, an electrical engineer, and a software engineer what it means to detect intrusion, you’ll get three different answers. The system architects must define intrusion detection for a specific product and consider mechanical, electronic, and software design aspects, because security improves as the weak links are eliminated. That does not mean that if you are designing intrusion detection for a given IoT device, you must achieve equivalent levels of security in the different design disciplines. It’s all about requirements and architecture based on the type of device and how it will be used (both intended and unintended uses).
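To make the process concrete, a threat model's output can be captured as structured requirements that the architects and discipline engineers share. The following is a minimal, hypothetical sketch (the threat names, disciplines, and mitigations are illustrative, not from any specific product), showing how the "weak link" idea can be checked mechanically: any threat with no assigned mitigation is a gap.

```python
from dataclasses import dataclass, field

@dataclass
class Threat:
    name: str
    discipline: str                 # "mechanical", "electrical", or "software"
    attack: str                     # the viable attack this threat enables
    mitigations: list = field(default_factory=list)

# Hypothetical threat-model output for one IoT device's intrusion detection
threats = [
    Threat("case opening", "mechanical", "probe internal buses",
           ["tamper switch", "epoxy potting"]),
    Threat("JTAG access", "electrical", "dump firmware",
           ["debug port disabled", "fuse lockout"]),
    Threat("firmware swap", "software", "run unsigned code",
           ["secure boot", "signed updates"]),
]

def weakest_links(threats):
    """Return threats that still lack any mitigation -- the weak links."""
    return [t for t in threats if not t.mitigations]
```

A review then reduces to driving `weakest_links()` to an empty list across all three disciplines, rather than arguing over what "intrusion detection" means to each engineer.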

With a good understanding of architectural security requirements, the engineers of different disciplines can then create a layered security foundation based on building blocks within their discipline. In our Q&A from January, we discussed the need for a foundational “root of trust.” That discussion was about the use of semiconductors, bootloaders, and operating systems as building blocks of a secure software framework that could be extended out to the middleware and application level. There are aspects of electronics involved in this root of trust and additional software and mechanical considerations (e.g., proper design of PCBs, use of specific materials and fasteners, digital signatures, etc.) depending on the device.
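The digital-signature building block mentioned above can be sketched in a few lines. Real secure-boot chains use asymmetric signatures (e.g., ECDSA) with a public key anchored in ROM or eFuses; in this simplified, illustrative sketch an HMAC-SHA-256 tag stands in for the signature, and all names (`ROT_KEY`, `sign_image`, `verify_and_load`) are assumptions for illustration only.

```python
import hashlib
import hmac

# Illustrative root-of-trust key; in practice, a public key anchored in ROM/eFuses
ROT_KEY = b"device-provisioning-key"

def sign_image(image: bytes, key: bytes = ROT_KEY) -> bytes:
    """Build-time step: append an authentication tag over the firmware image."""
    return image + hmac.new(key, image, hashlib.sha256).digest()

def verify_and_load(blob: bytes, key: bytes = ROT_KEY) -> bytes:
    """Boot-time step: refuse to hand off control unless the tag checks out."""
    image, tag = blob[:-32], blob[-32:]
    expected = hmac.new(key, image, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("firmware signature check failed; halting boot")
    return image  # the next boot stage would jump into this verified image
```

Each boot stage verifies the next before transferring control, which is how the root of trust extends from the bootloader up through the operating system, middleware, and application layers.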

Bottom line, security now must be considered a defining strategic element of IoT architectural design. The system architects must define the overall security requirements of a device or system, and it is up to the implementing engineers to incorporate the electronic/software/mechanical building blocks. With these activities completed properly, a foundation exists for the engineers at the app level to leverage and implement the required security and avoid weak links.

Would current security standards or certifications such as Common Criteria, CSPN, or those from the ISO/IEC help resolve some of these issues, or is a different, new approach required?

GIRSON: Technology standards can define specific implementation and interoperability details, but they can also outline design/development/test protocols and processes. Common Criteria (ISO/IEC 15408), CSPN, ISA/IEC 62443, and cryptographic standards are important for specifying interoperability and interpretation of security requirements and expectations for government, corporate, and industrial automation cybersecurity systems. However, as it relates to many types of IoT devices, there are still a tremendous number of unknowns and uncertainties for security implementations.

IoT security is an incredibly dynamic and evolving field. The range of IoT device types creates a wide variety of mechanical, electronic, and software threat vectors. Many of the OEMs with whom we discuss IoT security do not have a good understanding of the available standards and are not even certain how best to secure their own devices. The weak link philosophy of hacking implies that a hacker only has to find one problem, but an OEM must protect against many problems. As IoT evolves and becomes more mainstream, standards will play a larger role in creating economies of scale, objective product evaluations, and interoperability as it relates to security – but we are not there yet.

What steps can the technology industry take to improve the security of smart home and wearable devices within today’s current cost structures and development practices? Would any sort of industry regulation be effective in protecting consumer devices from cyber attack, given the global threat vector?

GIRSON: The philosophy of many IoT device OEMs appears to be that security is a discipline to be learned and practiced by a single guru or a small group within their development organization. This approach tends to reinforce the flawed notion that secure design costs can be measured via a minimal set of line items in a budget or a project schedule. But, just as best practices for designing safer and more reliable devices should be learned by all engineers (electronic, software, mechanical, and others), all engineers also should be taught and should implement a basic set of best practices for incorporating security into their individual design efforts.

Such an approach probably would be viewed as breaking the budget for many OEMs, but the long-term value in such a self-regulating approach would be significant. Alas, we still have a long way to go as an industry in recognizing the true costs of security breaches and of security-aware design. That said, government-imposed regulations are going to be a real challenge for IoT, with the large disparity in device types, costs, and complexities – not to mention the large installed base of deployed devices with limited security and limited upgradeability and the variety of industry and de facto standards and communications protocols that define interoperability within spaces such as home automation. At a fundamental level, as more companies recognize the importance of a security-aware design culture, our industry will slowly move towards a higher level of security in general. It will take time, but the good news is that awareness of the problem is growing.

Andrew Girson is Co-Founder and CEO of Barr Group.

Barr Group

barrgroup.com

@barrgroup

LinkedIn: www.linkedin.com/company-beta/10435404

Facebook: www.facebook.com/barrgroup

Brandon Lewis, Technology Editor

Brandon is responsible for guiding content strategy, editorial direction, and community engagement across the Embedded Computing Design ecosystem. A 10-year veteran of the electronics media industry, he enjoys covering topics ranging from development kits to cybersecurity and tech business models. Brandon received a BA in English Literature from Arizona State University, where he graduated cum laude. He can be reached at [email protected].
