Maintaining software with Continuous Integration

April 30, 2015 OpenSystems Media

At the prestigious Radisson Blu Edwardian Hampshire in Leicester Square, I was privileged to be invited to the Vector Software round table discussion and presentation, delivered by COO Bill McCaffrey and EMEA Director Niroshan Rajadurai, on how they are overcoming the challenges of ensuring software quality in a Continuous Integration (CI) delivery environment.

For those unfamiliar with the term, Continuous Integration is the philosophy of enabling connected devices to update “in situ”, in the field – so they are not held to ransom functionally by whatever firmware version was current when the unit left the factory. Historically, hardware limitations were often the primary hurdle preventing continuous integration; with today’s hardware abundant in performance and functionality, it is actually the software element that more strongly dictates this.

In the past, updating firmware in the field was either cumbersome and manual, or not viable at all. Back then, system designers had to think ahead and anticipate as many future functional needs as they could. At the same time, software (especially safety-critical software) naturally had extensive testing requirements, with complex software consuming many man-months in testing, which delayed time to market and stifled innovation.

Additionally, the process of testing itself was inherently an afterthought of the core design process. How extensive that testing was, was dictated by available budgets or by typically weak software safety standards. Test back then was considered the remit of the less ambitious engineer, a buck passed because it had to be, a necessary inconvenience. My favourite quote lifted from the presentation was “Test was historically a graveyard, now it’s an inspiration!”

As the embedded and electronics industry moved forward, regulatory testing requirements increased exponentially, particularly in regulated industries such as aerospace, automotive, medical, and rail. To cope with these changes, developers threw more man-hours at the task, quickly increasing development costs and stretching budgets. Meanwhile, the emerging “DevOps” mantra places huge pressure on delivery timescales while tolerating no compromise on software quality – developers are being squeezed in all directions.

The solution? No single measure could by itself resolve the direction in which regulatory requirements were pushing software development – justifiably so, as under scrutiny a plethora of examples emerged of safety-critical equipment running on inherently untested software code. What was needed was both a change of methodology, incorporating test-driven development (TDD) so that test was no longer an inconvenient afterthought, and harnessing the increasing power of software to effectively test itself.
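To illustrate the TDD shift in the smallest possible terms, here is a minimal sketch (in Python, purely for brevity – embedded teams would do this in C with a unit-test framework): the test exists before, and drives, the production code. The `clamp` function and its test are invented for illustration.

```python
# Illustrative TDD sketch: the test is written first and fails until
# the production code beneath it satisfies the stated behaviour.

def test_clamp_limits_value_to_range():
    assert clamp(5, low=0, high=10) == 5    # in-range value passes through
    assert clamp(-3, low=0, high=10) == 0   # below range is raised to low
    assert clamp(42, low=0, high=10) == 10  # above range is cut to high

# Production code, written only after (and because) the test exists:
def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(value, high))

if __name__ == "__main__":
    test_clamp_limits_value_to_range()
    print("all tests pass")
```

The point is the ordering, not the code itself: the requirement is captured as an executable test up front, so testing can never again be the afterthought described above.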

That’s where VectorCAST comes in. Their historic success was driven by those increasing regulatory requirements, and the lack of thorough testing they exposed forced companies to take heed – avoiding litigation until then, it transpired, had been more luck than judgement! Their software can work through complex code running to tens of thousands of lines far more thoroughly, and far faster, than an entire team of software developers.

In an ever more competitive marketplace, time to market and reduced development cost are of course key – but never at the expense of product reliability, and thus brand reputation. By demonstrating software compliance with VectorCAST, developers can satisfy even the most stringent regulatory bodies that their software fully conforms to safety-critical code requirements.

Fast forward to today: Continuous Integration requirements, driven by the rapid expansion of connected IoT-type devices in every industry, make this capability more important than ever. With regular updates pushed out to devices, we’re no longer talking about a single software release needing such testing thoroughness – re-testing effectively becomes a daily requirement! Clearly, starting from square one after every minor software change isn’t viable…

The VectorCAST software suite welcomes Continuous Integration. Utilising a Change Based Testing (CBT) environment, it can monitor the effect of even the tiniest software coding change throughout the entire application. This enables developers both to use virtualisation to discover the impact of their personal changes on the core application, and to test each day’s committed software updates before pushing them out to connected devices, without risking failure en masse.

This is all well and good for new software, but how do developers address the increasing regulatory requirements on legacy software applications? Especially now that “It hasn’t failed, so it doesn’t need more testing” no longer washes, particularly in safety-critical areas.

VectorCAST enables legacy code to be “base-lined”. Base-lining effectively freezes the software at a “proven by time” point, yet enables new functionality built on top of the software core to employ the significant advantages of software-based static and dynamic analysis – without the massive expense of starting entirely from scratch.
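The idea behind base-lining resembles what is often called characterisation (or “golden master”) testing, sketched below. This is an illustrative analogy, not VectorCAST’s mechanism, and `legacy_checksum` is an invented stand-in for field-proven code: capture the deployed version’s behaviour as a baseline, then assert every subsequent build still reproduces it.

```python
# Illustrative base-lining sketch: record the outputs of a trusted legacy
# routine, then verify later builds against that recorded baseline.

def legacy_checksum(data):
    """Stand-in for untested legacy code, 'proven by time' in the field."""
    total = 0
    for byte in data:
        total = (total + byte) & 0xFF
    return total

# Step 1: capture baseline outputs from the deployed, trusted version.
SAMPLE_INPUTS = [bytes([i, i + 1]) for i in range(0, 200, 20)]
BASELINE = {inp: legacy_checksum(inp) for inp in SAMPLE_INPUTS}

# Step 2: on each new build, confirm behaviour is unchanged before
# layering new, fully tested functionality on top of the frozen core.
def verify_baseline():
    return all(legacy_checksum(inp) == out for inp, out in BASELINE.items())

print(verify_baseline())
```

The baseline does not prove the legacy code correct; it proves it unchanged, which is precisely the “proven by time” guarantee the article describes, while new code above the frozen core gets full static and dynamic analysis.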

It is a scary thought to consider how many legacy safety-critical applications exist out there with fundamentally unproven software code behind them – I take some comfort that, in the future, there will be no place to hide for such unproven code.

Rory Dear, European Editor/Technical Contributor
