Privacy issues proliferate as new technologies develop.
This chapter highlights two technology trends that are changing the landscape of privacy concerns:
Big Data
Big Data is a term used to describe the nearly ubiquitous collection of data about individuals from multitudinous sources, coupled with the low costs to store such data and the new data mining techniques used to draw connections and make predictions based on this collected information.
Internet of Things (IoT)
IoT represents a new development in the ways that individuals interact with computing devices.
In the early 1980s, individuals interacted with desktop computers. Later, people adopted laptops and eventually smartphones.
The next evolution, which is rapidly unfolding, combines sensors almost anywhere with connection to the Internet.
Background
The number of sensors connected to the Internet is now counted in the tens of billions.
From a privacy perspective, cell phones provide intimate details of a person’s life:
location information
lists of friends
personal preferences as diverse as shopping habits and medical conditions
Moore’s Law
In 1965, Moore published a now-iconic article in which he observed that the number of transistors that would fit onto an integrated circuit doubled each year.
With Big Data, a new law has emerged: the amount of data doubles each year.
Computing power available today allows sensors to go everywhere and be networked together.
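The compounding effect of annual doubling is easy to underestimate; a minimal sketch (the figures are purely illustrative, not measurements from the chapter):

```python
# Illustrative only: exponential growth from annual doubling,
# as described by Moore's observation and the analogous "data doubles
# each year" trend for Big Data.
def doubled(initial, years):
    """Return the quantity after the given number of annual doublings."""
    return initial * 2 ** years

# One unit of data, doubled annually for a decade, grows 1,024-fold.
print(doubled(1, 10))  # -> 1024
```

A decade of doubling yields three orders of magnitude of growth, which is why sensors and storage that were impractical in the 1980s are commonplace today.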
From a practical standpoint, the basic definition of IoT is that inexpensive, precise sensors can send an unprecedented amount of data into a network.
Asimov’s Laws of Robotics
Big Data is the fuel that runs algorithms and analytics, which will enable artificial intelligence (AI) systems connected to the cloud.
Asimov’s three “laws,” to the extent they apply in practice, will apply to the practices in IoT, Big Data and the cloud.
The three laws are as follows:
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
The Electronic Privacy Information Center (EPIC) suggested two additions to Asimov’s iconic laws to address transparency in algorithms and AI.
“Robots should always reveal the basis of their decisions”
“Robots must always reveal their identities”
Big Data’s extensive collection of data related to consumers’ activities, both online and offline, and the subsequent analysis of that data is fraught with risks—financial risk, business risk, security risk and privacy risk.
A Case Study in Big Data
To understand the concerns that arise with Big Data, consider a case study of a major financial services firm that decides to create an “All Customer Funds” (ACF) database.
Risks
The bank will not be pleased if the Big Data product turns into a Big Data breach: a centralized database creates the possibility of larger breaches than were previously possible.
The organization must thus create and implement a comprehensive security plan for its major databases, including firewalls (to keep intruders away) and intrusion detection (to find intruders that get in).
The organization should study the “threat models” of who wants to attack the organization and why they want to do so, and raise the level of security preparedness to match the level of the threats.
One tip for senior managers in implementing the comprehensive information plan is the “friends and family test”: would the managers feel comfortable if data on themselves and their family and friends were in the database, subject to possible breach?
Fair Information Privacy Practices, Data Minimization and Anonymization
Relevant FIPPs
Collection Limitation: obtain data by lawful and fair means and (where possible) with the knowledge and consent of the data subject
Purpose Specification: use the data as initially intended or for other purposes that are not incompatible with that purpose (meaning there are limitations on secondary uses of the data)
Use Limitation: do not use or disclose data beyond those purposes except with consent or by authority of law
Data De-identification, Pseudonymization and Anonymization
Big Data is often defined as having a great volume, variety, and velocity of data.
As such, Big Data poses a risk to the anonymization or de-identification of data.
The Future of Privacy Forum has developed a useful chart that illustrates the multiple ways data can vary from fully identified (a person’s name) to fully statistical or aggregate.
Pseudonymous data: Information from which the direct identifiers have been eliminated.
Indirect identifiers remain intact.
De-identified data: Direct and known indirect identifiers have been removed.
Anonymous data: Direct and indirect identifiers have been removed or technically manipulated to prevent re-identification.
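As a sketch of the pseudonymous category above, direct identifiers can be replaced with a keyed hash so that records remain linkable across a data set without exposing names. The field names and key below are hypothetical, and real deployments need key management and a careful inventory of which fields are direct identifiers:

```python
import hashlib
import hmac

# Hypothetical key for illustration; in practice this must be generated
# and stored securely, since anyone holding it can link pseudonyms.
SECRET_KEY = b"example-pseudonymization-key"

def pseudonymize(record, direct_identifiers=("name", "email")):
    """Replace direct identifiers with a keyed hash.

    Indirect identifiers (e.g., ZIP code, age) remain intact, which is
    exactly why pseudonymous data is not anonymous data.
    """
    out = dict(record)
    for field in direct_identifiers:
        if field in out:
            digest = hmac.new(SECRET_KEY, str(out[field]).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]  # shortened pseudonym
    return out

record = {"name": "Jane Doe", "zip": "02139", "age": 34}
pseudo = pseudonymize(record)
# pseudo["name"] is now a stable pseudonym; "zip" and "age" are unchanged.
```

Because the indirect identifiers survive, the same record can often be re-identified by combining them with outside data sets, which is the re-identification risk the chapter attributes to Big Data.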
Blurring. This technique reduces the precision of disclosed data to reduce the certainty of individual identification.
Masking. This technique masks the original values in a data set with the goal of data privacy protection.
Differential Privacy. This technique uses a mathematical approach to ensure that the risk to an individual’s privacy is not substantially increased as a result of being part of the database.
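A minimal sketch of differential privacy for a counting query, using the Laplace mechanism (a standard construction; the function names are illustrative). A counting query has sensitivity 1, so adding Laplace noise with scale 1/epsilon bounds how much any one individual's presence can shift the published result:

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon=1.0):
    """Differentially private count: the true count plus Laplace(1/epsilon)
    noise. Smaller epsilon means more noise and stronger privacy."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

An analyst querying how many database records match a condition receives a noisy answer whose distribution barely changes whether or not any single individual is in the database, which is the formal guarantee the text describes.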
The FTC identified three broad categories of products offered by data brokers at that time:
Marketing (such as appending data to customer information that a marketing company already has)
Risk mitigation (such as information that may reduce the risk of fraud)
Location of individuals (such as identifying an individual from partial information).
FTC Reports on Big Data
In recent years, the FTC has issued reports that examine issues that arise as a result of the use of Big Data.
Internet of Things
In 2016, estimates for the number of IoT devices in use topped 15 billion worldwide, with spending on these devices approaching $1 trillion globally.
By 2020, the number of wearable device shipments is estimated to be more than 200 million.
Predictions are that 90 percent of new cars will be connected by 2020, with estimates that a quarter of a billion connected vehicles will be on the road by that time.
Privacy frameworks and IoT
Look at the data to determine where it falls on a spectrum that plots personal data at one end and impersonal data at the other.
It is worth noting that much of IoT—such as temperature, traffic statistics, and sensors around industrial production—often does not implicate PII.
Most IoT devices share two characteristics that are important for privacy and cybersecurity discussions:
The devices interact with software running elsewhere (often in the cloud) and function autonomously.
When coupled with data analysis, the devices may take proactive steps and make decisions about or suggest next steps for users.
Wearables
Electronic devices that are worn on the body and collect data in real time are known as wearables.
They range from headwear used by soldiers on the battlefield to wrist devices that check heart rate during exercise.
Research focused on users’ privacy concerns regarding wearables has identified several challenges related to wearables data:
“Right to forget.”
Impact of location disclosure.
Concern that screens will be read.
Video and audio recording where those involved were unaware.
Lack of control of the data.
Automatic syncing with social media.
Facial recognition.
Connected Cars
Connected cars collect and transmit data about the vehicle, the driver’s driving habits, and the driver’s preferences.
One example would be a vehicle that wirelessly alerts the dealership when tires need to be rotated.
Smart Homes
Smart homes have multiple devices that are connected to the Internet to enhance the home environment experience.
These devices are typically user controlled, a task often accomplished via a smartphone or another small electronic device.
Smart thermostats.
IoT devices, including thermostats, are vulnerable to attack by hackers and criminals in part because users often do not reset factory set passwords.
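The factory-password weakness noted above can be illustrated with a minimal sketch that flags devices still using default credentials. The password list here is hypothetical; a real audit would use vendor-specific default lists:

```python
# Hypothetical factory-set defaults; real audits rely on per-vendor lists
# of the credentials devices ship with.
FACTORY_DEFAULTS = {"admin", "password", "1234", "default"}

def uses_factory_password(password):
    """Flag credentials that match common factory-set defaults, which
    attackers routinely try first against exposed IoT devices."""
    return password.lower() in FACTORY_DEFAULTS

print(uses_factory_password("admin"))    # -> True
print(uses_factory_password("x7#kQ9!"))  # -> False
```

Attackers scanning the Internet for IoT devices simply try such default lists, which is why leaving the factory password in place effectively leaves the door open.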
Smart TVs.
As with many IoT devices, the Internet connection with a smart TV has often not been effectively secured.
Communication systems.
Homes with multiple devices often use wireless technology to allow devices to communicate.
Security systems.
Security systems that are connected to the Internet provide the user with the ability to remotely lock doors or open the garage, but they also provide hackers with a means to access the house.
Smart Cities
The term smart cities primarily refers to municipalities and other government entities using sensors to monitor functions and improve government services.
Changing seams.
From the seams between legacy and new infrastructure to those between urban and rural systems, these boundaries are moving or disappearing as systems are networked and upgraded.
Inconsistent adoption.
Factors such as user preferences, resource availability and scale of technological system will lead to inconsistent adoption.
These inevitable inconsistencies will introduce security challenges for industry, government and people living with the technology.
Increased automation.
Although automation can reduce certain risks, removal of human interaction from many aspects of cyber-physical infrastructure has the potential to introduce new security challenges.
BITAG Report
In 2016, the Broadband Internet Technical Advisory Group (BITAG) issued a set of recommendations for IoT privacy and security practices.
FTC Report on IoT
In 2014, the FTC undertook its first enforcement action involving an Internet-connected device against a company that provided consumers with Internet-connected cameras for use inside the home.
The discussion of Moore’s Law highlighted the continuous and exponential increase that has existed for decades in computing capabilities.
The terms Big Data and Internet of Things were unknown a decade ago.