It’s well understood that electronic systems in automobiles are not only becoming more complex, but also broader in scope. Integrating over-the-air connectivity, such as cellular, Bluetooth, and Wi-Fi, enhances the user experience and can extend control-by-wire systems. This is in addition to the several USB ports already found in the auto environment. As more components rely on software to perform functions, it’s not uncommon for a modern vehicle to have 10 million lines of code (MLOC) to 100 MLOC – more than many modern jet fighters!
With so much software behind so many access points, the attack surface is not only growing, it’s gaining attention from hackers as the latest exploitable software frontier. Consider, for instance, the number of buffer overflows in 100 MLOC. Each one is an unrealized opportunity to be maliciously exploited. The problem is that manufacturers don’t typically build cars with security in mind. Most security issues are quality issues, and according to the Software Engineering Institute, buffer overflows are one of the most common (see Predicting Software Assurance Using Quality and Reliability Measures).
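A buffer overflow of the kind the SEI flags is easy to write and easy to miss. A minimal C sketch, with a hypothetical message-ID copy routine invented purely for illustration, shows both the defect and the guarded version:

```c
#include <string.h>

#define ID_LEN 8 /* hypothetical fixed-size message-ID buffer */

/* Vulnerable: strcpy() keeps writing past id[] whenever src is
 * ID_LEN characters or longer -- a classic write overflow. */
void copy_id_unsafe(char id[ID_LEN], const char *src) {
    strcpy(id, src); /* overflows when strlen(src) >= ID_LEN */
}

/* Safer: validate the length before copying, reject oversized input. */
int copy_id_checked(char id[ID_LEN], const char *src) {
    if (strlen(src) >= ID_LEN) {
        return -1; /* input would not fit, including the terminator */
    }
    strcpy(id, src); /* safe: length already verified */
    return 0;
}
```

The unsafe version compiles cleanly and passes most happy-path tests, which is exactly why such defects survive into deployed code; a static analysis tool flags the unguarded `strcpy()` regardless of test coverage.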
Many organizations view building security into the software from the start of the project as an unnecessary policy. But given the cost and risk associated with a security breach, this is dangerous thinking. And considering the abundance of code analysis solutions, there’s simply no reason not to implement a security defect prevention strategy. In fact, it’s possible to detect security defects in the code by checking it against as few as three static analysis rules. We’ll get to those in a minute, but first let’s look at a typical automotive software security scenario.
Suppose that after conducting a set of traditional penetration tests on your software, you deploy the code but still get hacked. The hack makes headlines, so you’re compelled to expend considerable resources finding and fixing the security defect and pushing an update. Then you’re hacked again because you didn’t find the root cause of the defect, and you go through the expensive process of playing software security defect whack-a-mole.
How do you ensure that you’ve found all potential vulnerabilities, rather than playing whack-a-mole with monthly automotive antivirus updates? Normally, you’d perform a root-cause analysis on the code. This involves looking at the penetration attempts that succeeded and, rather than analyzing the specific line of code, analyzing the kind of code that was penetrated.
But this can be expensive in terms of time and/or resources. What you would most likely find is one or more of three patterns: values that aren’t checked before use, read overflows, or write overflows. You can embark on a months-long project to validate that your code has these issues, or you can run a static analysis tool that implements the MISRA standards and checks for just these three patterns to start with:
- Check the values passed to library functions
- Don’t allow read overflow
- Don’t allow write overflow
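The three patterns above can be sketched in C. The function names and the fixed-size tables below are hypothetical, but each guard shows the check that a static analysis tool would report as missing:

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define N 4 /* hypothetical table size, for illustration only */

/* 1. Check values passed to library functions:
 *    count * sizeof(int) can wrap around, so validate before malloc(). */
int *alloc_samples(size_t count) {
    if (count == 0 || count > SIZE_MAX / sizeof(int)) {
        return NULL; /* value checked before reaching the library call */
    }
    return malloc(count * sizeof(int));
}

/* 2. Don't allow read overflow:
 *    reading table[idx] without a bounds check walks off the array. */
int read_entry(const int table[N], size_t idx, int *out) {
    if (idx >= N) {
        return -1; /* reject out-of-range index */
    }
    *out = table[idx];
    return 0;
}

/* 3. Don't allow write overflow:
 *    writing past the end of dst corrupts adjacent memory. */
int write_label(char dst[N], const char *src) {
    if (strlen(src) >= N) {
        return -1; /* src would not fit, including the terminator */
    }
    strcpy(dst, src); /* safe: length already checked */
    return 0;
}
```

Each fix is a single conditional, which is the point: the defects are cheap to prevent at the keyboard and expensive to find after deployment.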
Other organizations concerned about automotive software security agree. In their paper on secure coding practices for embedded development, Jan Holle and Priyamvadha Vembar identify these patterns as significant vulnerabilities:
“A good example of this is an array index overflow. Such a flaw could produce undefined behavior in some cases and hence violate the specifications; hence it is a defect. But, if left unplugged, could also lead to a stack smashing attack. Similarly Buffer Overflows: they may remain undetected and violate specifications in some corner cases, but, under right circumstances, they could become a vulnerability that enables code injection. We could also call such defects generic-defects – a problem that can occur in almost any program written in a given language. Generic defects represent a security problem in any case, almost in any context.”
Of course, you’ll eventually need to check against all MISRA rules, but these three patterns are likely the cause of most defects that impact automotive software security. Use a code analysis tool that enforces compliance with the following MISRA rules to check for these patterns:
- MISRA C:2012 Rule 12.4: “Evaluation of constant expressions should not lead to unsigned integer wrap-around.”
- MISRA C:2012 Rule 17.5: “The function argument corresponding to a parameter declared to have an array type shall have an appropriate number of elements.”
- MISRA C++:2008 Rule 18-0-5: “The unbounded functions of library shall not be used.”
- MISRA C:2004 Rule 20.3: “The validity of values passed to library functions shall be checked.”
- MISRA C:2004 Rule 21.1: “Ensure that array indices are within the bounds of the array size before using them to index the array.”
To learn more about this topic, check out the webinar, “Your 7-point plan for securing automotive software” on January 28 at 2 p.m. EST.