Internet of Things Security – The Case for Systemic Resilience

Embedded Bits, Bytes & Sensors by Alain Louchez and Gilad Rosner

When discussing the Internet of Things, a central and recurring topic is its massive size. While the world population is about 7.3 billion, the number of potentially connectable "things" is incommensurably larger. As an illustrative point, in a 2013 solicitation regarding Networking Technology and Systems (NeTS: JUNO) Japan-US Network Opportunity: R&D for "Beyond Trillions of Objects", the National Science Foundation (NSF) underlined specific challenges created by "trillions of network-connected objects [that] are expected to emerge in the global network around 2020." In the same year, Cisco, no stranger to hardware projections, offered a more conservative estimate of 50 billion connected objects.

While there is a plethora of definitions to choose from, when everything is said and done, the IoT remains about the immersion of almost anything and everything into the communications space thanks to a convergence of scientific, technological and societal advances and trends. As a result, the Internet of Things will transform the dimensions of the economy and society on a scale not experienced before: nothing will be forever fixed; inert will become active; delayed, instantaneous; offline, online; and static, dynamic. In sum, the IoT will give rise to a pulsating world.

This anticipated intelligent and pervasive pulsation is not without danger. The sheer number of IoT touchpoints provides a wide range of entry (read "hacking") options; an enormous attack surface. Therefore, it is not surprising that security and privacy are essential topics on the agendas of IoT conferences and other fora around the globe.

However, as research concentrates on preventing successful attacks, the inevitability of breaches must be integrated in an overall IoT security strategy in much the same way that environmental hazards such as severe weather and naturally occurring climate events must be integral to economic development policy.

Security Breach as a Normal Accident

The Internet of Things and "Big Data" are intertwined concepts and lessons learned from studying one benefit the other. A recent article in the Journal of Business Ethics on "Big Data: A Normal Accident Waiting to Happen" focuses on a new form of system accident called data accidents: "a form of system accident caused by unpredictable interactions within the system." In the case of Big Data, the authors argue that it "is central in creating forms of organization, and organizational systems, in which normal accidents are likely to occur," and view information leaks as examples of such accidents.

The concept of "normal accidents" goes back to sociologist Charles Perrow who, inspired by the nuclear accident at Three Mile Island, wrote a seminal book in 1984, "Normal Accidents: Living with High-Risk Technologies," in which he advanced the theory that accidents are inevitable in certain types of high-risk systems, making these accidents unavoidable and 'normal'.

A reflection on normal accidents is germane to a discussion on the Internet of Things since it has been described in many places as both a system (see the call for papers of a 2016 IEEE conference, which states that "the Internet of Things is a system of sensors and computers that communicate with themselves and your mobile devices"), and a system of systems (see this article in the Proceedings of the 2014 European Conference on Software Architecture Workshops "On the Development of Systems-of-Systems based on the Internet of Things: A Systematic Mapping").

Given the complexity and pervasiveness of the IoT, we can view it as a "high-risk" system and expect that it will induce security accidents that are unpredictable but inevitable. We must recognize that insecurity is a fundamental, built-in characteristic of IoT. Furthermore, the market for IoT devices includes a broad spectrum of manufacturers – from kitchen table tinkerers to traditional manufacturing companies. The problem with this spectrum is that it introduces a host of new players, many of whom are competent at bringing their devices to life, but may be lacking the skills to make them secure.

As Ashkan Soltani, the Federal Trade Commission's former Chief Technologist, wrote in 2015: "Growth and diversity in IoT hardware also means that many devices introduced in the IoT market will be manufactured by new entrants that have very little prior experience in software development and security."

Security Breach as a Black Swan

Nassim Nicholas Taleb's readers are familiar with the concept of Black Swan - "an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility" while carrying an extreme impact. Taleb adds that "in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable" (The Black Swan – Prologue).

This means that even tight security (however we define it) might inherently and quite paradoxically be a source of insecurity. The fact that an event is highly unlikely does not mean it cannot happen. "Isn't it strange," Taleb ponders, "to see an event happening precisely because it was not supposed to happen?"

Reflecting on the terrorist attacks of September 11, 2001, he observes that, "had the risk been reasonably conceivable on September 10, it would have not happened." Given their potentially huge number, how can we "reasonably conceive" the risks, threats and vulnerabilities associated with IoT devices? An unexpected (by definition) Black Swan is bound to fly in the face of even the tightest protective wall.

Security Breach as a Statistical Hazard

According to a 2012 Chatham House report, which looked at the consequences of what is known in risk management as high-impact low-probability (HILP) events (a.k.a. low-probability high-consequence events), "these events can manifest themselves not only as 'black swans' – which by nature are impossible to predict – but also as known hazards such as floods, hurricanes or earthquakes, which, owing to the low likelihood of occurrence or the high cost of mitigating action, remain un- or under-prepared for. There are also crises such as pandemics which typically unfold over weeks, months or a few years, for which the scope or timing remains unknown even with preparations."

A local incident can rapidly morph into a cascade of failures through a wave of interconnected links and make those HILP events catastrophic on a large scale. The "Flash Crash" of May 6, 2010 in the U.S. financial markets provides a telling example of how a seemingly minor event with a low likelihood of occurrence can trigger a full-fledged tsunami (John Miller gives a vivid description and analysis in his recent book on complex systems, "A Crude Look at the Whole"). A firm tried to hedge its equity positions against adverse changes in the U.S. equity markets, an innocuous and routine transaction; in the process, however, a faulty algorithm (keyed to volume of trades rather than the security's price) triggered gyrations so wild that trading had to be paused. The sheer size of the market volume acted as a powerful lever, accentuating the positive feedback.

The bigger the population, the bigger the chance that an adverse event will materialize. For instance, suppose we live in a perfect world in the "six sigma" sense, a business management approach conceived by Motorola in the 1980s to limit defects to 3.4 per million opportunities; there is still room for (however limited) imperfection. At IoT scale, this "imperfection" alone can bring about consequences with devastating impact.
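To make the arithmetic concrete, here is a back-of-the-envelope calculation combining the Six Sigma defect rate cited above with Cisco's 50 billion device estimate; even "near-perfect" quality leaves an enormous absolute number of flaws in the field:

```python
# Back-of-the-envelope: defects at Six Sigma quality across the IoT population.
DEFECT_RATE = 3.4 / 1_000_000    # Six Sigma: 3.4 defects per million opportunities
DEVICES = 50_000_000_000         # Cisco's 2013 estimate of connected objects

expected_defects = DEFECT_RATE * DEVICES
print(f"Expected defective devices: {expected_defects:,.0f}")
# Even at "near-perfect" quality, 170,000 flawed devices remain in the field.
```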

The IoT universe (with many billions of denizens) gives a wide range of opportunities for the would-be attacker, and, as a result, breaches should be viewed as integral to the IoT space. It is not if, but when and where IoT security systems will be defeated.

In a nutshell, even if we considered a security breach extraordinarily improbable because of the strength of the security systems, bad things would still happen; it is a statistical hazard that cannot be reduced away. Emeritus Professor David Hand at Imperial College, London, calls this effect the "law of truly large numbers" (different from the well-known law of large numbers) which says that "with a large enough number of opportunities, any outrageous thing is likely to happen" (The Improbability Principle – Why Coincidences, Miracles and Rare Events Happen Every Day – Epilogue).
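The "law of truly large numbers" can be illustrated with elementary probability: if an event has per-trial probability p, the chance it occurs at least once across n independent trials is 1 - (1 - p)^n, which approaches certainty as n grows. A small sketch (the figures are illustrative, not from the source):

```python
import math

def prob_at_least_one(p: float, n: int) -> float:
    """Probability that an event with per-trial probability p occurs at least
    once across n independent trials: 1 - (1 - p)**n, computed stably for
    tiny p via log1p/expm1."""
    return -math.expm1(n * math.log1p(-p))

# An illustrative one-in-a-billion breach probability per device...
p = 1e-9
# ...across a billion deployed devices:
n = 1_000_000_000
print(f"P(at least one breach) = {prob_at_least_one(p, n):.3f}")  # ~0.632
```

The individually "outrageous" event becomes more likely than not once the number of opportunities is large enough, which is exactly the IoT situation.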

What Can We Do?

Starting from the perspective of the inevitable security breaches of IoT systems – and therefore, the very high likelihood of personal data being stolen or inappropriately accessed – product managers, system architects and even C-level executives must build mitigation strategies into their system designs from the outset. Essential first steps are Security Impact Analyses (SIAs) and Privacy Impact Assessments (PIAs).

SIAs are conducted to identify how system changes can affect a system's security state, in order to develop the additional design requirements necessary to minimize any adverse impact. PIAs, rather than considering the impact on the system or the organization, consider breaches from the perspective of the user. That is, for example, where the loss of 10,000 customer records may be seen as an acceptable risk to a business's brand and operations, a PIA outlines the harms to users and possible breach responses.

In its 2014 Opinion on the Internet of Things, the Article 29 Working Party, the EU's independent data protection advisory body, made several important recommendations regarding the security and privacy of IoT architectures:

  • Raw data should be deleted as soon as the necessary data has been extracted. Developers who do not need raw data should be prevented from ever seeing it. The transport of raw data from a connected device should be minimized as much as possible.
  • Devices should disable their own wireless interfaces when not in use, or use random identifiers (such as randomized MAC addresses) to prevent location tracking via persistent IDs.
  • Device manufacturers should follow a Security by Design process (as advanced by Cavoukian and Dixon) and dedicate some components to the key cryptography primitives.
  • The principle of data minimization should be employed: only collect the data needed for a specific service or application. Extra data should not be collected 'in case it becomes useful in the future.'
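The second recommendation above, randomized identifiers, can be sketched in a few lines: instead of broadcasting its burned-in hardware address, a device generates a fresh, locally administered MAC address per session, so a passive tracker never sees a persistent ID. This is an illustrative sketch (the function name is ours, not from the Opinion):

```python
import secrets

def random_mac() -> str:
    """Generate a random, locally administered, unicast MAC address.
    Setting bit 1 of the first octet marks the address 'locally
    administered'; clearing bit 0 keeps it unicast."""
    octets = bytearray(secrets.token_bytes(6))
    octets[0] = (octets[0] | 0b0000_0010) & 0b1111_1110
    return ":".join(f"{b:02x}" for b in octets)

# A tracker logging probe requests sees a different identifier each session:
print(random_mac())  # e.g. 'a6:3f:...' -- never the device's persistent address
```

This is the same approach modern mobile operating systems use for Wi-Fi scanning, applied to constrained IoT devices.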

Moreover, lessons can be learned from the field of Identity Management (IDM), which is concerned with authentication, access and authorization. From IDM we get the concepts of unlinkability and unobservability. Unlinkability is the intentional severing of data events from their source – breaking the 'links' of one's online activity. IoT system architectures should be built with options for people to prevent information collected in one context from leaking into another.

Unlinkability in the IoT world means that, e.g., your wearable does not need to communicate what it knows to your car insurance company. Or that a wearable shared between two people does not allow one to see the other's data unless they are expressly authorized. Security and data architectures must be built to allow either configuration.
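One way to realize this kind of unlinkability is context-scoped authorization: each data consumer is granted access to specific contexts only, so data collected in one context cannot flow to a party authorized for another. A minimal sketch, with entirely illustrative names (this is not a real API):

```python
from dataclasses import dataclass, field

@dataclass
class WearableDataStore:
    """Toy store enforcing per-context grants: no grant, no link."""
    grants: dict = field(default_factory=dict)    # consumer -> set of contexts
    records: dict = field(default_factory=dict)   # context -> list of readings

    def grant(self, consumer: str, context: str) -> None:
        self.grants.setdefault(consumer, set()).add(context)

    def write(self, context: str, reading: dict) -> None:
        self.records.setdefault(context, []).append(reading)

    def read(self, consumer: str, context: str) -> list:
        if context not in self.grants.get(consumer, set()):
            raise PermissionError(f"{consumer} is not authorized for '{context}'")
        return self.records.get(context, [])

store = WearableDataStore()
store.write("fitness", {"steps": 9500})
store.grant("owner", "fitness")
store.read("owner", "fitness")          # allowed: explicit grant exists
# store.read("car_insurer", "fitness")  # raises PermissionError: no grant
```

The design choice is that linkage is opt-in: absent an explicit grant, the default answer is refusal, matching the "options for people to prevent information ... from leaking" described above.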

Unobservability is the idea that a host system is 'blind' to the activities that take place within it. From the perspective of IoT security, this means a strong bias towards encrypted systems that by default only allow data payloads to be accessed by correctly authenticated parties.
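The essence of unobservability can be shown with a deliberately toy cipher (a one-time pad built on Python's `secrets`; real systems would use vetted cryptography such as authenticated encryption, not this): the device encrypts before transmission, so the host relays and stores only opaque bytes it cannot read.

```python
import secrets

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    """XOR each byte of data with the corresponding pad byte (one-time pad)."""
    return bytes(a ^ b for a, b in zip(data, pad))

# Toy illustration only -- NOT production cryptography.
payload = b'{"heart_rate": 72}'
key = secrets.token_bytes(len(payload))   # shared only by device and user

ciphertext = xor_bytes(payload, key)      # what leaves the device
host_view = ciphertext                    # all the host stores: opaque bytes
recovered = xor_bytes(host_view, key)     # only the authorized key holder
assert recovered == payload               # can recover the payload
```

A host architected this way is 'blind' by construction: a breach of the host's storage yields ciphertext, not personal data.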

Conclusion: A Call to Action

When it comes to statistics and probabilities, confusion often sets in when the improbable gets mistaken for the impossible. A highly improbable event does not make its existence vanish – rare events can and will happen. IoT security breaches should not be considered on grounds of probability alone; they should also be understood as a fundamental characteristic of complex systems.

Current research on IoT security and privacy integrates the complexity of the Internet of Things (see for example: Editorial, Ad Hoc Networks, Volume 32, Pages 1-114 (September 2015), "Internet of Things security and privacy: design methods and optimization": "Indeed, the Internet of Things is a complex system in which people interact with the technological ecosystem based on smart objects through complex processes. The interactions of these four IoT components: persons, intelligent objects, technological ecosystem, and processes highlight a systemic and cognitive dimension to the security of IoT.")

However, a comprehensive IoT architecture should not only assume that IoT devices, systems, networks, etc. will be attacked, but also that their security measures will be defeated. As a result, detailed and effective contingency and mitigation plans, thoroughly maintained, tested, rehearsed and repeatedly exercised, must find their place in the IoT security toolbox and within default organizational practice. So, too, must organizations be prepared to mitigate the risks of harm to the individual data subjects who suffer when personal data is breached.

As the Cyber-Physical Systems (CPS) Public Working Group established by the National Institute of Standards and Technology (NIST) clearly emphasized in its September 2015 Framework for CPS draft, "resilience" should be viewed as a top-level Trustworthiness management property of an IoT system, including "the ability to withstand and recover from deliberate attacks, accidents, or naturally occurring threats or incidents."

It is therefore concerning, if not alarming, that as recently as February 2016 it was found that in Germany, a country that takes data security and privacy extremely seriously, "nearly 80% of organizations are not prepared for a cybersecurity incident." More than a warning, shouldn't we read this as a call to action?

About the Author

Alain Louchez is the Managing Director of CDAIT ("Center for the Development and Application of Internet of Things Technologies") at the Georgia Institute of Technology ("Georgia Tech", Atlanta, GA, USA). CDAIT is a global, non-profit, partner-funded center that fosters interdisciplinary research and education while driving general awareness about the Internet of Things. CDAIT bridges sponsors with Georgia Tech faculty and researchers as well as industry members with similar interests. In December 2015, the International Telecommunication Union (ITU), a specialized agency of the United Nations, and Georgia Tech announced an agreement (MoU) to be implemented by ITU's SG 20 and CDAIT whose goal is to monitor global Internet of Things (IoT) activities.

Dr. Gilad L. Rosner is the founder of the IoT Privacy Forum, a Visiting Researcher at the Horizon Digital Economy Research Institute, and a Member of the UK Cabinet Office's Privacy and Consumer Advocacy Group. The IoT Privacy Forum is a non-profit organization serving as a crossroads for industry, regulators, academics, government and privacy advocates to discuss the privacy challenges of the Internet of Things, which focuses on the billions of connected devices that speak autonomously and semi-autonomously to each other and to host systems, and the privacy issues that result from their ever increasing collection of personal data.

“This article solely expresses the views and opinions of the authors and does not necessarily represent those of any company, organization or institution to which they may be affiliated.”
