Open Source IoT Code Is Not The Entire Answer


Summary: Whether or not to open source embedded software is the wrong question. The right question is how we can ensure independent checks and balances on software safety and security. Independent certification agencies have been doing this for decades. So why not use them?

In the wake of the recent Volkswagen diesel software revelations, there has been a call from some that automotive software, and even all Internet of Things software, should be open source. The idea is that if the software is released publicly, then someone will notice if there is a security problem, a safety problem, or skulduggery of some sort. While open source can make sense, it is neither an economically realistic nor a necessary step to apply across the board.

The Pro list for open source is pretty straightforward: if you publish the code, someone will come and read it and find all the problems.

The Con list is, however, more reflective of how things really work. You have to assume that someone with enough technical skill will actually spend the time to look, and will actually find the problem. That doesn't always happen. The relatively simple Heartbleed bug was there for all to see in OpenSSL, and it stayed there for a couple of years even though OpenSSL is a widely used, crucial piece of open source Internet infrastructure software. Presumably a lot more people care about OpenSSL than about your toaster oven's software.
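To make "relatively simple" concrete, here is a minimal C sketch of the class of bug involved. This is a hypothetical illustration, not the actual OpenSSL code, and the message layout is assumed for the example: the handler trusts a length field supplied by the requester instead of the size of the data actually received, so the reply can leak adjacent memory.

```c
/* Hypothetical sketch of a Heartbleed-style bug -- NOT the actual OpenSSL code.
 * Assumed request layout for illustration:
 *   [1-byte type][2-byte claimed payload length][payload bytes]
 * The handler copies claimed_len bytes into the reply, trusting the
 * attacker-supplied field rather than req_len. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

size_t build_heartbeat_reply(const uint8_t *req, size_t req_len,
                             uint8_t *reply, size_t reply_max)
{
    if (req_len < 3U) {
        return 0U;                /* too short to even hold the header */
    }

    (void)reply_max;              /* unused, because the bounds check is missing --
                                     that is exactly the bug */

    uint16_t claimed_len = (uint16_t)(((uint16_t)req[1] << 8) | req[2]);

    /* BUG: missing the check that would make this safe, e.g.:
     *   if ((claimed_len > req_len - 3U) || (claimed_len > reply_max)) return 0U;
     * Without it, memcpy() reads past the end of req[] and the reply leaks
     * whatever happens to sit next to it in memory. */
    memcpy(reply, &req[3], claimed_len);

    return claimed_len;
}
```

A one-line bounds check fixes it, yet a bug of this shape sat in plain sight in widely reviewed code for years, which is the point: publication alone does not guarantee inspection.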

Some of the opponents of open sourcing IoT software invoke the security bogeyman. They say that if you publish the source you'll be vulnerable to attacks. Well sure, it might make it easier to find a way to attack, but it doesn't make you "vulnerable." If your code is already full of vulnerabilities, publishing the source code just might make it a little easier for someone to find them. Did you notice that the automotive security exploits published recently did not rely on source code? I can believe that exploits could, at least sometimes, be published more quickly for open source code, but I don't see this as a compelling argument for keeping code secret and unreviewed.

A more fundamental point is that software is often the biggest competitive advantage in making products that would otherwise be commodities. Asking companies to reveal their most important trade secrets (their software) so that a hypothetical person with the time and skills might just happen to find a problem sounds like a hard sell to me, especially since there is a well-established alternative: having an external, independent certification agency look things over in private.

Safety critical systems have had standards and independent review systems in place for decades. Aviation uses DO-178C and other standards, and has a set of independent reviewers called Designated Engineering Representatives (DERs) who provide design reviews during the development cycle. Rail systems follow EN 50126/50128/50129 and typically involve oversight from acquisition consultants. The chemical process industry generally follows IEC 61508, and has long used independent certification organizations to check its work (in my experience the reviews are typically done by Exida or TÜV). The consumer appliance industry has long had Underwriters Laboratories (UL) certification, and is moving to a more comprehensive software safety standard approach based on IEC 60730, including external independent certification. There are also more recent domain-specific security standards that can be applied. (It is worth noting that ensuring safety and security requires a lot more than just source code, but that's a topic for another day.)

Cars have long had the option to use the MISRA software safety guidelines, and more recently the ISO 26262 safety standard. Historically, some companies have had external agencies certify automotive components to those standards. But at least some car companies have not taken advantage of this external audit opportunity, and thus there has been no independent check and balance on their software until their problems show up in the news. Software safety and security audits are not required to sell cars in the US. (There is some vehicle-level testing according to FMVSS requirements, but it covers vehicle behaviors, not the actual source code.)
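For readers who have not seen such guidelines, here is a rough illustration of the kind of C constructs a guideline checker flags, next to a more compliant rewrite. This is my own hypothetical fragment, not text from the MISRA guidelines, and rule numbers are intentionally omitted.

```c
/* Hypothetical illustration of the style issues coding guidelines such as
 * MISRA C target; not quoted from the standard itself. */
#include <stdint.h>

/* Guideline-violating style: plain int, a magic number (1000), and an
 * unintentional switch fall-through that silently changes the result. */
int classify_rpm_bad(int rpm)
{
    int band = 0;
    switch (rpm / 1000) {
        case 0:
            band = 1;      /* missing break: falls through into the next case */
        case 1:
            band = 2;
            break;
        default:
            band = 3;
            break;
    }
    return band;
}

/* Guideline-friendly style: fixed-width unsigned type, named constants,
 * an enum for the result, no fall-through, and a single return point. */
#define RPM_IDLE_LIMIT   (1000U)
#define RPM_CRUISE_LIMIT (2000U)

typedef enum { BAND_IDLE, BAND_CRUISE, BAND_HIGH } rpm_band_t;

rpm_band_t classify_rpm_good(uint32_t rpm)
{
    rpm_band_t band;

    if (rpm < RPM_IDLE_LIMIT) {
        band = BAND_IDLE;
    } else if (rpm < RPM_CRUISE_LIMIT) {
        band = BAND_CRUISE;
    } else {
        band = BAND_HIGH;
    }

    return band;
}
```

None of this guarantees safety by itself, but it gives an independent reviewer something concrete and checkable to audit against.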

For the Internet of Things it will be interesting to see how things play out. As I understand it, the EU is already requiring IEC 60730 compliance, which means external safety checks for safety critical IoT applications. We could see that mandate spread to more IoT products sold in Europe if there are high-profile problems. And perhaps we'll see a push on automotive software regulation too.

So, there is a well-established alternative to open source in the form of external certifying organizations issuing compliance certificates based on international safety and security standards. Rather than get distracted by an open source debate, what we should be doing is asking: "What is the most effective way to ensure adequate software safety and dependability without putting companies out of business?" Sometimes that might be open source, especially for underlying infrastructure. But other times, probably most times, independent review by a trusted certification party will be up to the task. The real question is what it will take to get companies to produce verifiably adequate software.

Having checks and balances works. We should use them.

(For the record, I made some of my source code public domain before "open source" was even a buzzword, and have released other source code under an older version of GPL (Ballista robustness testing) and Creative Commons BY 4.0 (CRC Hamming Distance length calculation). Some code I copyright and release. And some I keep as a trade secret. My interest here is in the public being able to use safe and secure embedded software. We should focus on that, and not let things get sidetracked into another iteration of the open source vs. proprietary software debate.)
