As your family sits down after dinner and a long day of work, one of the children starts up a conversation with her new connected play doll, while the other begins to watch a movie on the new smart television. The smart thermostat is keeping the living area at a steady 22°C, while diverting energy from the rooms that aren't being used at the moment.
Father is making use of the home computer's voice control features, while mother is installing new smart light bulbs that can change color on command or based on variations in the home environment. In the background, the smart refrigerator is transmitting an order for the next-day delivery of groceries.
This setting tells a great story about the consumer Internet of Things (IoT) in that there are exciting new capabilities and conveniences. It also begins to make clear the soon-to-be hyper-connected nature of our homes and environments. If we start to examine these new smart products, we can begin to see the concern surrounding privacy within the IoT.
The privacy challenges with the IoT are enormous, given the vast quantities of data collected, distributed, stored and, ahem, sold every day. Pundits will argue that privacy is already dead. They point to consumers' willingness to click eagerly through so-called end user privacy agreements, compromising their privacy with barely a notion of what they have just agreed to. The pundits are not far off, as privacy concerns are something of a moving target given the fickle nature of consumer sentiment.
Our ability to grasp and find ways of preserving privacy with the IoT represents a monumental challenge. The increased volume and types of data able to be collected and distilled through technical and business analytical systems can produce frighteningly detailed and accurate profiles of end users. Even if the end user carefully reads and agrees to the end user privacy agreement, they are unlikely to imagine the downstream, multiplicative, compromising effect of accepting two, three, or four of them, to say nothing of 30 or 40 privacy agreements.
While an improved targeted advertising experience may have been the superficial rationale for agreeing to privacy agreements, it is no exaggeration to say that advertisers are not the only entities procuring this data. Governments, organized crime syndicates, potential stalkers, and others can either directly or indirectly access the information to perform sophisticated analytical queries that ascertain patterns about end users. Combined with other public data sources, such data mining is a powerful and dangerous tool. Privacy laws have not kept up with the data science that thwarts them.
Privacy protection is a challenge for every organization and industry. Communications within a privacy-conscious and privacy-protecting organization are vital for ensuring that customer interests are addressed. Later in this chapter, we identify corporate departments and individual qualifications needed to address privacy policies and privacy engineering.
Some privacy challenges are unique to the IoT, but not all. One of the primary differences between IoT and traditional IT privacy is the pervasive capture and sharing of sensor-based data, whether medical, home energy, transportation-related, and so on. This data may, or may not, be authorized. Systems must be designed to make determinations as to whether that authorization exists for the storage and sharing of data that is collected.
Take, for example, video captured by cameras deployed throughout a smart city. These cameras may be set up to support local law enforcement efforts to reduce crime; however, they capture images and video of everyone in their field of view. These people caught on camera have not given their consent to be recorded.
As such, policies must exist that:
- Notify people coming into view that they are being recorded
- Determine what can be done with the video captured (for example, do people need to be blurred in images that are published?)
The amount of data actively or passively generated by (or for) a single individual is already large. By 2020, the amount of data generated by each of us will increase dramatically. If we consider that our wearable devices, our vehicles, our homes, and even our televisions are constantly collecting and transmitting data, it becomes obvious that trying to restrict the types and amounts of data shared with others is challenging to say the least.
Now, if we consider the life cycle of data, we must be aware of where data is collected, where it is sent, and how. The purposes for collecting data are diverse. Smart TV vendors, for example, use a type of Automated Content Recognition (ACR) technology to collect data on your viewing habits and sell it to advertising companies. These ad companies use the data for strategic marketing and even political influence. Some smart machine vendors will lease equipment to an organization and collect data on the usage of that equipment for billing purposes. The usage data may include time of day, duty cycle (usage patterns), number and type of operations performed, and who was operating the machine. The data will likely be transmitted through a customer organization's firewall to some internet-based service application that ingests and processes the information. Organizations in this position should consider researching exactly what data is transmitted in addition to the usage information, and ascertain whether any of the information is shared with third parties.
For more information on how to disable ACR features on your smart TV, see the Consumer Reports article at https://www.consumerreports.org/privacy/how-to-turn-off-smart-tv-snooping-features/.
Data associated with wearables is frequently sent to applications in the cloud for storage and analysis. Such data is already being used to support corporate wellness and similar programs, the implication being that someone other than the device manufacturer or user is collecting and storing the data. In the future, this data may also be passed on to health-care providers. Will the health-care providers pass that data on to insurance companies as well? Are there regulations in the works that restrict the ability of insurance companies to make use of data that has not been explicitly shared by the originator?
Smart home data can be collected by many different devices and sent to many different places. A smart meter, for example, may transmit data to a gateway that then relays it to the utility company for billing purposes. Emergent smart grid features such as demand response will enable the smart meter to collect and forward information from the home's individual appliances that consume electricity from the power grid. In the absence of any privacy protections, an eavesdropper could theoretically begin to piece together a puzzle that shows when certain appliances are used within a home, and whether homeowners are home or not. The merging of electronic data corresponding to physical-world states and events is a serious concern related to privacy in the IoT.
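To make the eavesdropping risk concrete, the following minimal Python sketch shows how step changes in raw whole-home meter readings could be matched against known appliance wattage signatures. Everything here is illustrative: the signature values, tolerance, and function names are hypothetical and not taken from any real load-disaggregation product.

```python
# Hypothetical illustration of why unprotected meter data is sensitive:
# step changes in whole-home power draw can reveal individual appliances.
SIGNATURES = {1500: "kettle", 800: "microwave", 150: "refrigerator compressor"}

def infer_appliances(readings: list[int], tolerance: int = 50) -> list[str]:
    """Match positive/negative power-draw steps to appliance on/off events."""
    events = []
    for prev, cur in zip(readings, readings[1:]):
        delta = cur - prev
        for watts, name in SIGNATURES.items():
            if abs(delta - watts) <= tolerance:      # step up -> appliance on
                events.append(f"{name} ON")
            elif abs(delta + watts) <= tolerance:    # step down -> appliance off
                events.append(f"{name} OFF")
    return events

# Readings sampled every few seconds from an unencrypted meter feed
print(infer_appliances([200, 1700, 1700, 200]))  # kettle ON, kettle OFF
```

Even this crude matching hints at occupancy and daily routines, which is why raw meter traffic deserves confidentiality protection.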
A striking report by Open Effect (https://openeffect.ca/reports/Every_Step_You_Fake.pdf) documented the metadata that is collected by today's consumer wearable devices. In one of the cases they explored, the researchers analyzed the Bluetooth discovery features of different manufacturers' wearable products. The researchers attempted to determine whether the vendors had implemented the new privacy features that were designed into the Bluetooth 4.2 specification.
They found that only one of the manufacturers (Apple) had implemented them, leaving open the possibility of the exploitation of the static Media Access Control (MAC) address for persistent tracking of a person wearing one of the products. Without the new privacy feature, the MAC addresses never change, creating an opportunity for adversarial tracking of the devices people are wearing. Frequent updates to a device's MAC address limit an adversary's ability to track a device in space and time as its owner goes about their day.
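The Bluetooth 4.2 privacy feature referenced above can be illustrated with a simplified sketch of resolvable private addresses (RPAs), which rotate over time yet remain resolvable by trusted peers holding the device's Identity Resolving Key (IRK). This is a sketch only: the actual specification's ah() hash uses AES-128, for which SHA-256 stands in here, and real address-formatting details are omitted.

```python
import hashlib
import os

def new_resolvable_private_address(irk: bytes) -> bytes:
    """Generate a 48-bit BLE-style resolvable private address.
    Simplified: the real spec uses AES-128 for the ah() hash;
    SHA-256 stands in for it in this illustration."""
    prand = bytearray(os.urandom(3))
    prand[0] = (prand[0] & 0x3F) | 0x40          # top two bits 0b01 mark an RPA
    h = hashlib.sha256(irk + bytes(prand)).digest()[:3]  # 24-bit hash part
    return bytes(prand) + h

def resolves(irk: bytes, addr: bytes) -> bool:
    """A trusted peer checks whether an address was derived from its IRK."""
    prand, h = addr[:3], addr[3:]
    return hashlib.sha256(irk + prand).digest()[:3] == h

irk = os.urandom(16)
addr1 = new_resolvable_private_address(irk)
addr2 = new_resolvable_private_address(irk)
# A peer holding the IRK resolves both addresses to the same device;
# a tracker without the IRK sees only two unrelated random addresses.
```

The key point is that the address an eavesdropper observes changes frequently, while legitimate bonded peers can still recognize the device.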
Another worthy example of the need to rethink privacy for the IoT comes from the connected vehicle market. Just as with the wearables discussed previously, the ability to track someone's vehicle persistently is a cause for concern.
A problem arises, however, when we look at the need to digitally sign all messages transmitted by a connected vehicle. Adding digital signatures to messages such as Basic Safety Messages (BSMs) or infrastructure-generated messages (for example, traffic signal controller Signal Phase and Timing (SPaT) messages) is essential to ensure public safety and the performance of our surface transportation systems. Messages must be integrity-protected and verified to originate from trusted sources. In some cases, they must also be confidentiality-protected. But privacy? That's needed, too. The transportation industry is developing privacy solutions for connected vehicles:
For example, when a connected vehicle transmits a message, there is concern that using the same credentials to sign messages over a period of time could expose the vehicle and owner to persistent tracking. To combat this, security engineers have specified that vehicles will be provisioned with certificates that:
- Have short life spans
- Are provisioned in batches to allow a pool of credentials to be used for signing operations
In the connected vehicle environment, vehicles will be provisioned with a large pool of constantly rotated pseudonym certificates to sign messages transmitted by On-Board Equipment (OBE) devices within the vehicle. This pool of certificates may only be valid for a week, at which point another batch will take effect for the next time period. This reduces the ability to track the location of a vehicle throughout a day, week, or any larger time period, based on the certificates it has attached to its own transmissions.
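The rotation scheme can be sketched as follows. This is a toy illustration, not the actual SCMS protocol: certificate contents are reduced to an ID and a validity week, and in practice the batches are provisioned to the vehicle by the PKI rather than generated locally.

```python
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class PseudonymCert:
    cert_id: int
    valid_week: int   # week number during which this certificate is valid

def provision_pool(week: int, size: int = 20) -> list[PseudonymCert]:
    # Hypothetical stand-in for PKI provisioning of a weekly batch.
    return [PseudonymCert(random.getrandbits(32), week) for _ in range(size)]

def pick_signing_cert(pool: list[PseudonymCert], week: int) -> PseudonymCert:
    """Rotate randomly among the certificates valid this week, so no single
    credential is reused long enough to enable persistent tracking."""
    valid = [c for c in pool if c.valid_week == week]
    if not valid:
        raise ValueError("pool expired; request a new batch from the PKI")
    return random.choice(valid)
```

Each outgoing message would be signed with a certificate chosen this way, and the whole pool is replaced when its validity week ends.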
Ironically, however, a growing number of transportation departments are beginning to take advantage of widespread vehicle and mobile devices by deploying Bluetooth probes along congested freeways and arterial roadways. Some traffic agencies use the probes to measure the time it takes for a passing Bluetooth device (indicated by its MAC address) to traverse a given distance between roadside-mounted probes. This provides data needed for adaptive traffic system control (for example, dynamic or staged signal-timing patterns). Unless traffic agencies are careful and wipe any short- or long-term collection of Bluetooth MAC addresses, correlative data analytics can potentially be used to discern an individual vehicle's (or its owner's) movement in a region. Conversely, increased use of rotating Bluetooth MAC addresses may render future Bluetooth probe systems useless to traffic management agencies.
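One mitigation available to traffic agencies is to match devices between probe stations using a keyed, regularly rotated hash of the MAC address instead of the raw address, so segment travel times can still be computed without retaining trackable identifiers. A minimal sketch, with hypothetical function names:

```python
import hashlib
import hmac
import os

DAILY_SALT = os.urandom(16)  # rotated daily so IDs can't be linked across days

def probe_id(mac: str) -> str:
    """Keyed hash of a MAC address: two probes can match the same device
    today, without storing the raw MAC or linking sightings across days."""
    return hmac.new(DAILY_SALT, mac.encode(), hashlib.sha256).hexdigest()[:16]

def travel_times(upstream: dict, downstream: dict) -> list[float]:
    """upstream/downstream map probe_id -> sighting timestamp (seconds)."""
    return [downstream[i] - t for i, t in upstream.items() if i in downstream]

up = {probe_id("AA:BB:CC:11:22:33"): 100.0,
      probe_id("AA:BB:CC:44:55:66"): 105.0}
down = {probe_id("AA:BB:CC:11:22:33"): 190.0}
# One device matched between the two stations: a 90-second segment time.
```

Because the salt is discarded when it rotates, yesterday's pseudonymous IDs cannot be recomputed, limiting the long-term tracking risk described above.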
Continuing with the connected vehicle example, we can also see that infrastructure operators should not be able to map provisioned certificates to the vehicles either. This requires changes to the traditional PKI security design, historically engineered to provide certificates that specifically identify and authenticate individuals and organizations (for example, for identity and access management) through X.509 distinguished name, organization, domain, and other attribute types. In the connected vehicle area, the PKI that will provision credentials to vehicles in the United States is known as the Security Credential Management System (SCMS) and is currently being constructed for various connected vehicle pilot deployments around the country. The SCMS has built-in privacy protections ranging from the design of the pseudonym IEEE 1609.2 certificate to internal organizational separations aimed at thwarting insider PKI attacks on drivers' privacy.
One example of SCMS privacy protections is the introduction of a gateway component known as the Location Obscurer Proxy (LOP). The LOP is a proxy gateway that vehicle OBEs can connect to instead of connecting directly to a Registration Authority (RA). This process, properly implemented with request-shuffling logic, should help thwart an insider at the SCMS attempting to locate the network or geographic source of the requests (https://www.wpi.edu/Images/CMS/Cybersecurity/Andre_V2X_WPI.PDF).
The potential for a dystopian society in which everything that anyone does is monitored is often invoked as a possible future aided by the IoT. When we bring things like drones (also known as small unmanned aircraft systems, or sUAS) into the conversation, those concerns gain validity. Drones with remarkably high-resolution cameras and a variety of other pervasive sensors raise obvious privacy concerns. Clearly, much work remains to be done to give drone operators unambiguous guidance on what data can be collected and how, and on how that data must be treated, so that they are not exposed to lawsuits.
To address these new surveillance methods, new legislation related to the collection of imagery and other data by these platforms may be needed to provide rules and penalties in instances where those rules are broken. For example, even if a drone is not directly flying over a private or otherwise controlled property, its camera may view at slant-range angles into private property due to its high vantage point and zoom capabilities.
Laws may need to be established that require immediate or as-soon-as-practical geospatial scrubbing and filtering of raw imagery according to defined, private-property-aligned geofences. Pixel-based geo-referencing of images is already within today's capabilities and is used in a variety of image post-processing functions related to drone-based photogrammetry, production of orthomosaics, three-dimensional models, and other geospatial products. Broad pixel-based georeferencing within video frames may not be far off.
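A simplified sketch of such per-pixel scrubbing follows, assuming a hypothetical pixel_to_geo camera model that maps pixel coordinates to longitude/latitude, and geofences given as lon/lat polygons. Real photogrammetric georeferencing is far more involved; this only illustrates the filtering step.

```python
def point_in_polygon(x: float, y: float, poly: list[tuple[float, float]]) -> bool:
    """Ray-casting test: is the point (x, y) inside the polygon?"""
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        (xi, yi), (xj, yj) = poly[i], poly[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def scrub(frame, pixel_to_geo, fences, fill=0):
    """Blank every pixel whose georeferenced position falls inside a
    private-property geofence. `pixel_to_geo` is a hypothetical camera
    model mapping (row, col) to (lon, lat)."""
    for r, row in enumerate(frame):
        for c in range(len(row)):
            lon, lat = pixel_to_geo(r, c)
            if any(point_in_polygon(lon, lat, f) for f in fences):
                row[c] = fill
    return frame
```

In a real system, the fill step would more likely be a blur or down-resolution pass applied before the imagery can be stored or published.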
Such functionality will require consent-based rules to be established so that no drone operator can preserve or post imagery in public online forums containing any private property regions beyond a specific per-pixel resolution. Without such technical and policy controls, there is little other than strong penalties or lawsuits to prevent Peeping Toms from peering into backyards and posting their results on YouTube. Operators need specificity in rules so that companies can build compliance solutions.
New technologies that allow law-abiding collectors of information to respect the wishes of citizens who want their privacy protected are needed in our sensor-rich IoT.
We hope you found this article insightful. If you'd like to learn more about IoT security, refer to Practical Internet of Things Security – Second Edition. Following a hands-on approach, the book takes you on a journey that begins with understanding the IoT and how it can be applied in various industries, goes on to describe the security challenges associated with the IoT, and then provides a set of guidelines for architecting and deploying a secure IoT in your enterprise.