Virtual labs offer cost-effective IoT device testing

By Marc Brown

August 15, 2017

Building a realistic physical testing lab environment is difficult, and even when complete, it becomes the main bottleneck in system testing. Virtual labs remove this bottleneck.

Even the smallest IoT device lives in a complex environment, which may not be fully understood at the time of development. In fact, I’ve already seen the security problems associated with devices being connected to the Internet for the first time. In a previous blog, I discussed the benefits of service orientation for design, development, and testing. Here, I'll take service-based testing and service virtualization to the next step: virtual labs.

Building a realistic physical testing lab environment is difficult, and even when complete, it becomes the main bottleneck in system testing. Virtual labs remove this bottleneck while providing new benefits to service-based IoT device testing.

A recent study found that 80 percent of IoT apps aren’t being tested for security flaws. A second survey found that 56 percent of embedded device developers don’t review source code for security vulnerabilities and 37 percent don’t have a written coding standard. These aren’t encouraging statistics, and it’s clear that IoT device makers need to take quality, safety, and security more seriously.

Test automation is an important step to ensure that testing is done more rigorously, consistently, and thoroughly. Testing, especially for security vulnerabilities, is often seen as too costly and complex, and is therefore rushed or overlooked altogether. But it’s an expensive mistake to let your customers (and attackers) test your IoT device security for you.

A real test lab requires the closest physical manifestation of the environment an IoT device is intended to work in, but even the most sophisticated lab is difficult to scale to a realistic environment. A virtual lab solves this problem because it has evolved past the need for hard-to-find (or non-existent) hardware dependencies. Sophisticated service virtualization can be combined with other key test automation tools. For example:

  • Service virtualization simulates all of the dependencies needed by the device under test (DUT) to perform full system testing, including all connections and protocols used by the device, with realistic responses to its communication. For instance, virtualization can simulate the enterprise back-end server to which an IoT device reports periodic sensor readings; similarly, virtualization can control the IoT device in a realistic manner. (A minimal, tool-agnostic sketch of this idea follows the list.)
  • Service and API testing drives the DUT to ensure that the services and APIs it provides perform flawlessly. These tests can be driven from the automation platform to run performance and security tests as needed.
  • Runtime monitoring detects errors in real-time on the DUT and captures important trace information. For example, memory leaks, which can remain undetected in a finished product, can be caught and resolved early and cheaply.
  • Test lab management and analytics provide the overarching control of the virtual lab(s). Once virtualized, the entire lab setup can be replicated as needed and test runs can be automated and repeated. Analytics provide the necessary summary of activities and outcomes.
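To make the service-virtualization idea concrete, here is a minimal, tool-agnostic sketch (not Parasoft's API) of a virtualized back-end dependency: a stub "cloud" endpoint that accepts the readings a gateway DUT relays upstream, answers with a canned, predictable response, and records what it received so later tests can check the round trip. The port, path, and payload shape are illustrative assumptions.

```python
# Minimal sketch of a virtualized back-end dependency (not Parasoft's API):
# a stub "cloud" endpoint that accepts the sensor readings relayed by the
# device under test, returns a canned response, and records what it received.
# Port, path, and payload shape are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

RECEIVED = []  # readings relayed by the DUT, kept for later round-trip checks


class VirtualCloudHandler(BaseHTTPRequestHandler):
    def _send_json(self, obj):
        body = json.dumps(obj).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # The DUT posts telemetry here; reply with a predictable acknowledgment.
        if self.path != "/telemetry":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        reading = json.loads(self.rfile.read(length) or b"{}")
        RECEIVED.append(reading)
        self._send_json({"status": "accepted", "id": len(RECEIVED)})

    def do_GET(self):
        # Lets the test harness read back what the DUT relayed (round-trip check).
        if self.path != "/telemetry":
            self.send_error(404)
            return
        self._send_json(RECEIVED)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), VirtualCloudHandler).serve_forever()
```

In practice, a service virtualization tool generates and manages stubs like this for every dependency and protocol the DUT touches, rather than relying on hand-written code.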

Figure 1 depicts a typical edge computing IoT ecosystem in which embedded IoT devices are deployed. Sensors and control devices communicate information to the Edge, which is a series of appliances or applications that can receive information and use logic to communicate back to a device or up to the cloud. The cloud then has higher-level logic that allows it to act upon that information. The cloud is a set of services—microservices, connections to databases, additional logic, or third-party services—that form a complex web of functional building blocks.

[Figure 1 | A typical IoT ecosystem in which embedded devices would be deployed. The “services” appear to the right.]

When it’s time to test in the IoT ecosystem, testing is required at many layers. For example, to test new functionality introduced in the gateway, you need to validate that the gateway can receive information from sensors and communicate it onward according to the business logic you’ve built.

To validate this complexity, Parasoft Virtualize (which simulates required dependencies) and Parasoft SOAtest (which drives tests) can simulate those inputs. These tools simulate realistic calls from the devices over the network (whether standard protocols like REST/HTTP or popular IoT protocols like CoAP, XMPP, or MQTT) and test that the DUT (the gateway in this example) is communicating with the cloud services appropriately, with SOAtest validating the responses that come back (Figure 2).

[Figure 2 | Shown is an example of how a virtual lab environment can be created for edge DUTs.]
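As a conceptual sketch of driving the DUT (again, not the SOAtest or Virtualize API), the simulated sensor traffic described above might look like the following, using the open-source paho-mqtt client. The broker address, topic, and payload are assumptions about the gateway's ingest interface.

```python
# Conceptual sketch only (not the SOAtest/Virtualize API): drive the gateway
# DUT with simulated sensor traffic over MQTT using the open-source paho-mqtt
# client (v1.x constructor). The broker address, topic, and payload are
# assumptions about the DUT's ingest interface.
import json
import time

import paho.mqtt.client as mqtt

GATEWAY_BROKER = "gateway.local"            # hypothetical DUT address
INGEST_TOPIC = "sensors/temp-01/reading"    # hypothetical ingest topic

client = mqtt.Client(client_id="virtual-sensor-temp-01")
client.connect(GATEWAY_BROKER, 1883)
client.loop_start()

# Publish a realistic periodic reading, just as a physical sensor would.
reading = {"sensor": "temp-01", "celsius": 21.4, "ts": int(time.time())}
info = client.publish(INGEST_TOPIC, json.dumps(reading), qos=1)
info.wait_for_publish()  # block until the broker acknowledges the QoS 1 publish

client.loop_stop()
client.disconnect()
```

The same pattern applies to CoAP or REST traffic; only the client library and message framing change.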

If there are external ways to communicate information into that gateway, those calls can also be simulated. Virtualize stabilizes the test environment by providing predictable responses to requests, leveraging test data from SOAtest to fully exercise the gateway and its services.

Finally, the top-level services might be communicating back to the Edge, and to other sensors and external actors, and it might be important to know that your inputs are making their way through the environment and back to the back-end systems. Virtualize simulates the receiving of those calls down at the Edge (at the IoT devices) and then relays that information back to SOAtest to confirm that the call made the round trip and behaved as expected inside the IoT ecosystem.
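Continuing the sketch, the round trip can be confirmed by polling the virtual back end from the earlier example to verify that the gateway relayed the simulated reading upstream. The URL and field names match the assumptions made above.

```python
# Round-trip check sketch: poll the virtual back end from the earlier stub to
# confirm that the gateway DUT relayed the simulated reading upstream. The URL
# and field names match the assumptions made above.
import json
import time
from urllib.request import urlopen

VIRTUAL_CLOUD = "http://localhost:8080/telemetry"


def wait_for_relay(sensor_id, timeout_s=10.0):
    """Return True once the virtual back end has seen a reading from sensor_id."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        with urlopen(VIRTUAL_CLOUD) as resp:
            readings = json.loads(resp.read())
        if any(r.get("sensor") == sensor_id for r in readings):
            return True
        time.sleep(0.5)
    return False


assert wait_for_relay("temp-01"), "gateway never relayed the simulated reading"
```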

Normal test environments are expensive, probably more than most development managers forecast. A study by voke Research found that the average investment in a pre-production lab was $12 million. The average time to provision the lab was 18 days, with another 12 to 14 days spent on configuration. Even after that cost and time, these labs act as a bottleneck for testing because access is limited. Further, the day-to-day operational costs of physical labs are significant, and in most cases duplicating a physical lab to increase test throughput is cost prohibitive.

In an earlier post, The ABCs of Service Virtualization, the benefits of service virtualization boil down to improved access to test dependencies and better control of the behavior of virtualized dependencies, which reduces costs and increases test speed. In a similar fashion, let’s break down the benefits of the virtual IoT test lab:

  • Improved quality through better and more complete testing. Service-based testing ensures that key use cases are exercised and perfected. Automated performance tests ensure stability and reliability under heavy load. In addition, runtime monitoring ensures that hard-to-find bugs are detected and traced.
  • Improved security with automated penetration tests that simulate malformed data (see the sketch after this list). Load testing can simulate denial-of-service attacks, and runtime monitoring can detect security vulnerabilities. Test repeatability ensures that each iteration, patch, or release is tested in exactly the same manner. In addition, test development and manipulation (i.e., improving existing tests and creating new ones) are simplified.
  • Reduced testing time, risk, and cost by removing the expensive dependencies required for complete system testing. Automation provides repeatability and consistency that manual testing can’t, while delivering better and more complete testing. Virtual labs also cut the provisioning time needed for physical lab setups, reducing total test time.
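As a rough illustration of the malformed-data idea mentioned above (not a replacement for a dedicated penetration-testing tool), a test harness can replay deliberately broken payloads against the gateway's ingest interface and verify that the DUT rejects them cleanly rather than crashing. The endpoint URL and the expected 4xx behavior are assumptions.

```python
# Sketch of the malformed-data idea against a hypothetical REST ingest endpoint
# on the gateway DUT. The endpoint URL and the expectation that bad input is
# rejected with a 4xx status (rather than a crash or 5xx) are assumptions.
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

INGEST_URL = "http://gateway.local:8080/ingest"  # hypothetical DUT endpoint

MALFORMED_PAYLOADS = [
    b"",                                      # empty body
    b"{",                                     # truncated JSON
    b'{"celsius": "NaN"}',                    # wrong type for a numeric field
    b'{"sensor": "' + b"A" * 65536 + b'"}',   # oversized field value
]

for payload in MALFORMED_PAYLOADS:
    req = Request(INGEST_URL, data=payload,
                  headers={"Content-Type": "application/json"})
    try:
        with urlopen(req, timeout=5) as resp:
            status = resp.status
    except HTTPError as err:
        status = err.code                     # DUT rejected the request
    except URLError:
        raise AssertionError("DUT unreachable after payload %r" % payload[:40])
    assert 400 <= status < 500, "expected a 4xx rejection, got %s" % status
```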

Marc Brown is the CMO at Parasoft.
