Featured — February 27, 2011 at 5:12 pm

Software: The Broken Door of Cyberspace Security

By Fred D. Taylor, Jr.* —

“Software is most of the problem. We have to write software which has many fewer errors and which is more secure.”

— Dr. Ed Amoroso, head of AT&T Network Security, quoted in Cyber War.

The Internet has become integrated into the everyday life of millions of people around the world.  It is the undercarriage of international banking, commerce and defense.  The development of advanced software has improved office productivity and management, and has enhanced command, control, communications, computers and intelligence (C4I) capabilities.

Software is the door to the Internet, and the door is broken, allowing thieves, malcontents and the curious the opportunity to steal, deny or degrade the information and capabilities we hold most dear.  The extensive reliance on software has created new and expanding opportunities.  Along with these opportunities come new vulnerabilities that put the global infrastructure and our national security at risk.  The ubiquitous nature of the Internet, serviced by common protocols and processes, allows anyone with the knowledge to create software to engage in worldwide activities.  Yet for most software developers there is no incentive to produce software that is more secure.

The software industry is vibrant and healthy.  In the rush to add functionality in a fast-changing market, less emphasis is placed on quality software that is secure and error-free.  Companies and users accept that their software will have flaws.  Why?  In any other industry it would be unacceptable to produce a faulty product and shirk responsibility.  Instead of taking responsibility for defects in their software, software producers have been able to transfer that responsibility to the user.  Consumers are thus obligated to purchase security software to address software shortfalls, which has fueled a growing business sector for security software.  In 2010, worldwide security software revenue was expected to reach $16.5 billion.  This pales in comparison to the enterprise software market, which will reach $246.6 billion in 2011, according to a 2010 Gartner software market report.  Software development is a growing business, but the investment is not in secure software.  If motivated, the software industry could apply greater effort to producing better quality software, but to date that motivation is still lacking.

Given this backdrop, what should we do to address the problem?

  1. The government must take an active role in defining software quality standards.  Consider instituting something similar to the lemon laws for automobiles, which were enacted to protect consumers from faulty products by forcing the automobile industry to monitor and improve quality.  A lemon law applied to the software industry would restrict the sale of any software that does not meet security standards, and software companies would be liable for damage or losses resulting from flaws in their software.  The concept could also be applied to imported software, requiring review before it enters the marketplace; software that does not meet standards would be denied access to the U.S. market.
  2. Motivate the software industry, through government incentives and regulation, to invest in better software design and development.  The software industry should partner with government, academia, and the science and technology community to develop new software coding practices that are more secure, easier to evaluate and more stringently tested.  For example, research into advanced artificial intelligence software development tools can help further this goal.
  3. The consumer must no longer accept flawed software.  The government should take responsibility for reviewing and evaluating software for quality and security compliance.  With expanded scope and authority, existing organizations such as the U.S. Department of Homeland Security or the Department of Commerce could serve in this capacity.

Cyberspace security is a vital national security interest, and the United States should take an active role in improving the quality of the software that undergirds the Internet.  The majority of cyberspace security issues can be traced back to software.  Better quality software will have a marked effect on improving cyberspace security.  In turn, cybercrime will be reduced, intellectual property will be more secure, and critical infrastructure will be better protected.  Software will never be perfect, but if we resign ourselves to accepting inferior products, it will not improve.  A concerted effort by private industry, government, and the consumer will generate more secure software.  It is time to fix the broken door to the Internet.

*Fred D. Taylor, Jr. is a Lt. Colonel in the United States Air Force and a National Security Fellow at the Harvard Kennedy School. The views expressed in this article are those of the author and do not reflect the official policy or position of the United States Air Force, Department of Defense, or the U.S. Government.

Image courtesy of the U.S. Department of Homeland Security


  1. Excellent article which highlights a real and growing concern. A cursory look at the significant number of Security Updates to Microsoft Windows XP/Internet Explorer reveals a very good example of “the broken door”.

  2. Excellent article! The challenge, however, is a tall one (and, as with all interdependent solutions, may not be as simple as we would hope). I agree wholeheartedly with the core idea: without some kind of concerted consumer push-back and demand for accountability to a reasonable degree, there is not enough incentive to produce software that is, at least, MORE secure (or secure at all). Still, part of the uniqueness of this issue does, I think, give the “manufacturer” a modicum of limited liability.

    Imagine auto manufacturers being held to similar security standards (e.g. producing cars with *continually evolving* security systems that keep pace with the ever-increasing sophistication of car thieves). At some point mass-production of the product becomes impossible and/or the product becomes exorbitantly (if not prohibitively) expensive.

    With software sectors (i.e. software security developers) DEVOTED to addressing this continuously evolving hacker sophistication, might not this (in many ways) address the problem more effectively and efficiently than expecting the software companies themselves (whose expertise lies in the utilitarianism of their unique product rather than, necessarily, software security) to be security-evolved in and of themselves?

    Imagine the automobile security industry and products like LoJack…it’s potentially more efficient (and more cost-effective) to produce solutions that are capable of securing ALL automobiles with aftermarket (or integrated) solutions geared directly at the ever-changing criminal capabilities, rather than expecting every car to come with such sophistication “built-in”.

    Also, giving the consumer the choice of “bolting on” a product (and *which* product), also offers customers who are not at necessarily high risk (isolated networks, environments with no forward-facing connection to hacker-rich clouds, etc. or in the automobile example, those who live in gated communities or extremely safe areas) with less-costly alternatives to otherwise super-priced, all-inclusive software products.

    Sticking with the automobile example, we DO expect a certain level of security built-in (we’d never accept a car with no locks, or even extraordinarily poor-performing locks). So certainly there should be minimal standards for fixing these broken doors. I just think there may be more “good” to the software security industry than simply cleaning up poorly written software.

    As with all things, however, there is always the middle ground, where such solutions are discovered!

  3. Interesting proposition and I certainly agree that the state of software assurance is dreadful. This challenge was solved in the manufacturing sector through zero-defect initiatives. It doesn’t seem like there is a good parallel there, though, with intellectual property production. Leaving aside the obvious defects that a software developer should know about and avoid, how are potential security defects to be identified? Testing can only carry one as far as the imagination of the test scenario writer. But no team of testers, no matter how resourceful, could ever be expected to anticipate every last potential attack possibility.
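    A tiny, hypothetical sketch illustrates this limit of the test writer's imagination: the routine below (the function and its tests are invented for illustration) passes every case its author thought to check, yet a single unanticipated input still breaks it.

    ```python
    def parse_record(line):
        """Parse a 'name:count' record (hypothetical example routine)."""
        # Assumes the input contains exactly one ':' -- a latent flaw
        name, count = line.split(":")
        return name, int(count)

    # The test author verifies every input they imagined...
    assert parse_record("alice:3") == ("alice", 3)
    assert parse_record("bob:0") == ("bob", 0)

    # ...but an attacker-supplied input nobody imagined still crashes it:
    # parse_record("a:b:c")  # raises ValueError (too many values to unpack)
    ```

    The tests all pass, so the flaw ships; only an input outside the author's imagination reveals it.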

    The problem of software assurance illustrates quite well where the notion of software development as an “engineering” activity fails. Engineers can’t violate the laws of physics, so the potential outcomes of their designs are more or less predictable. Software design only has to (try to) follow the rules of logic which, when they are applied to a sufficiently complex set of statements, can produce quite unpredictable outcomes.

    Because software design is more art than science, it is actually more akin to medicine than engineering. Errors and flaws arise from the interplay between the practitioner’s choices and decisions as applied to a subject with many pre-existing conditions, not all of which are known.

    “Bolt-on” security is clearly not the answer. Figuring out how to make computers do what we want done, safely, despite the fact that there are software flaws and malware is where I think we need to explore.

  4. Great article and concur with comments. The only area I differ with is the suggestion in the 3rd point on how we should address the problem. I don’t think government is the solution just like I don’t think it should be for health care, social security, the US mail or AMTRAK. I think the private sector could police itself much more effectively. Think of how many outlets there are for movie reviews. Turn that idea into several private entities who battle for objectivity and trustworthiness as they delve into programming code and stamp their own “safety rating” on software. Much like the Better Business Bureau, software companies would seek this stamp of approval for a small fee. We all win, and no more government growth required.
