October 4, 2016

Consumer Products in the Age of Big Data

Article in Consumer Products: Adapting to Innovation Report

Big data is becoming a defining element of consumer products today—now, even your dog's collar is streaming data. The collection and use of big data presents many promising opportunities for both consumer products companies and consumers, but it also raises a number of complex legal issues. This article provides an overview of some of the societal benefits and potential harms of big data analytics in the consumer products space, the privacy-related legal issues that are emerging and the US federal government's efforts to develop a regulatory framework to address them.

How Big Is "Big Data"?

The data streams from internet-enabled consumer products are growing rapidly as these products become increasingly common. For instance, millions of consumers have purchased wearable fitness sensors such as the Fitbit, which tracks how far the wearer walks and how many calories he or she burns each day.1 Sensors for pets, such as GPS-enabled monitors, cameras and health trackers, are also gaining traction.2 Google's Nest thermostat tracks homeowners' activity, automatically adjusting the temperature when they wake up in the morning or leave for work.3 Internet-enabled ovens allow consumers to turn them on, monitor temperature and set alerts with a mobile app.4 A variety of stand-alone home sensors can detect leaking pipes, opened doors, broken freezers and plants in need of watering.5 These and many other internet-enabled consumer products constitute a segment of the Internet of Things (IoT),6 which has been hailed as the third wave in the development of the internet.7 And this is merely the beginning of the IoT trend. Some industry analysts predict that by 2020, 20.8 billion devices will connect to the internet, compared to 6.4 billion devices today.8

The proliferation of IoT devices is fueling the already explosive growth of big data. IBM currently estimates that 2.3 trillion gigabytes of data are created each day worldwide.9 IDC estimates that the volume of data generated annually will double every year through 2020.10

Consumer products companies are increasingly taking advantage of big data analytics to gain new insights into consumer preferences and behaviors. A recent study by IBM revealed that "58 percent of consumer products companies surveyed report the use of information (including big data) and analytics is creating a competitive advantage for their organizations." But most are still in the planning and pilot stages of development.11 Large consumer products companies receive 40 to 50 terabytes of external data from independent research organizations and distribution networks on a weekly basis.12 In addition, consumer products companies are supplementing their existing data sources with unstructured data collected from social media, blog commentaries and other sources. These sources can provide essential, real-time feedback on consumers' reactions to everything from product launches to in-store promotions.13

A Double-Edged Sword

Consumers can benefit when consumer product companies and their retailers employ big data analytics. These companies can track how consumers use their products and use the information to make improvements. It is useful to know, for example, which features of an exercise machine consumers use, how long they use them, and which features they never use at all. There is insight to be gained from knowing where in an e-book a consumer stops reading. Further, consumers can receive more personalized and helpful product recommendations from online retailers and e-commerce sites that track and assess their purchase patterns and browsing history. And big data analytics can lead to lower consumer prices. For example, by using analytics to predict demand more accurately, manufacturers can produce and distribute their products more efficiently and lower their costs.14 In competitive markets, these savings are typically passed on to consumers.

However, the emergence of the IoT and the advent of big data analytics have also raised the specter of a large-scale and pervasive loss of personal privacy. Consumer products can collect very personal and sensitive information. For instance, electricity sensors can reveal what time a person gets home at night, how much TV he watches and which electronic devices are being used inside the home.15 Data from "smart" refrigerators can imply diseases like alcoholism, or suggest religious affiliation based on the presence of kosher foods or holiday eating patterns.16 Voice-activated devices may overhear and record private conversations.

Many consumers are very concerned about a loss of privacy, not just for its own sake but because personal information can be used to their disadvantage. For example, some companies can price discriminate based on customer preferences gleaned from their data. Personal data can reveal which of their customers are likely to be willing to pay more for a particular product, and those customers may be shown higher prices online compared to customers perceived to be more frugal. Some of the largest online retailers have experimented with such differential pricing.17 One home improvement retailer shows consumers who shop from their smartphones products that are roughly $100 more expensive than the products shown to those shopping from their desktops.18 Some travel sites have been known to quote significantly higher hotel room prices if the reservation is made using a Mac rather than a PC (some studies show Mac users are willing to spend more than PC users).19 It is very difficult for a consumer to know whether she is being offered the lowest price or even what constellation of data has flagged her as likely willing to pay an inflated price.20

Big data analytics can also raise the troubling prospect of discrimination based on race, gender and other protected characteristics. A recent study from Carnegie Mellon University found that on leading online employment sites, ads for high-paying jobs were displayed about six times more frequently when software detected a man than a woman.21 Discrimination can also occur when analytics reveal patterns correlated with a constitutionally protected characteristic. Big data analytics can show that employees with shorter commutes stay with their employer longer than those with longer commutes.22 But factoring commuting distances into hiring decisions may result in racial discrimination since the racial composition of neighborhoods can vary widely across an urban area.23 Not only can this undermine an employer's anti-discrimination efforts, but, as the Federal Trade Commission (FTC) has cautioned, a practice having a disparate impact on a protected class can violate federal equal opportunity laws.24

The Shortcomings of Notice and Consent

"Notice and consent" has been the traditional answer to many of the problems raised by the collection, analysis, use and disclosure of personal data. Longstanding privacy laws, such as the Gramm-Leach-Bliley Act,25 take this approach and represent the current state of the law. However, many consumer advocates increasingly view privacy notices and consumer consent as inadequate. Privacy notices are often long and complicated, rife with legalese or unclear about the limits imposed on data sharing and use. Moreover, some empirical research suggests that even when consumers do read privacy policies, many do not understand them sufficiently to give meaningful consent. For instance, one study found that people correctly answer only 30 percent of questions regarding the privacy of their online transactions.26 Another study found that 75 percent falsely believe that "[w]hen a website has a privacy policy, it means the site will not share my information with other websites and companies."27

From this perspective, the solution may not be merely a matter of more accessible, complete and meaningful disclosure. The very nature of big data analytics makes it difficult for companies to predict how they may end up using the data. Ever more sophisticated analytical techniques can enable the "repurposing" of data.28 Data from disparate sources can be combined in new ways to reveal unexpected information about the consumer.29 The unknowable nature of future analytical approaches and downstream use of the data therefore may prevent the consumer from providing truly informed consent.

Furthermore, consumers may not be able to fall back on guarantees of digital anonymity. Some recent studies suggest that large, anonymized datasets can be "re-identified" under certain conditions. In one such study, researchers working with anonymized location data from 1.5 million mobile phones could uniquely identify 95 percent of individuals from just four spatio-temporal data points.30 The researchers were able to "reconstruct individuals' movements across space and time" and, because most people's pattern of movement is highly idiosyncratic, the researchers could link their movements back to outside data that could be used to identify each person.31 Another study by researchers at Columbia University and Google found that "time-stamped locations in just two social media apps are enough to link accounts held by the same person and identify him or her."32 Other research has shown that individuals can be re-identified in large anonymous datasets of shopping, subscription, taxi fare and credit card transaction data.33 Thus, not only may anonymization offer relatively weak protection of consumer privacy, but that protection appears to be eroding.
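The intuition behind these re-identification results can be illustrated with a short sketch. The script below is a hypothetical toy model, not a reproduction of the cited studies: it builds synthetic "anonymized" location traces as sets of (location cell, hour) observations and measures how often a handful of points drawn from one user's trace matches that user alone. Even in this crude model, a few points quickly become a unique fingerprint.

```python
import random

# Toy model (illustrative only): synthetic users with "anonymized"
# location traces, each trace a set of (location cell, hour) points.
random.seed(1)

NUM_USERS = 500
POINTS_PER_USER = 40
CELLS = 200   # coarse location cells
HOURS = 24

traces = {
    user: {(random.randrange(CELLS), random.randrange(HOURS))
           for _ in range(POINTS_PER_USER)}
    for user in range(NUM_USERS)
}

def matching_users(points):
    """Return the users whose trace contains every one of the given points."""
    return [u for u, trace in traces.items() if points <= trace]

def uniqueness_rate(k, trials=200):
    """Fraction of sampled users pinned down uniquely by k of their own points."""
    unique = 0
    for _ in range(trials):
        user = random.randrange(NUM_USERS)
        points = set(random.sample(sorted(traces[user]), k))
        if matching_users(points) == [user]:
            unique += 1
    return unique / trials

for k in (1, 2, 4):
    print(f"{k} point(s): {uniqueness_rate(k):.0%} of users uniquely identified")
```

Because each individual point is shared by several users but combinations of points rarely are, the uniqueness rate climbs steeply with k; this is why "anonymized" location data offers weaker protection than intuition suggests.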

At the same time, many in the consumer products industry view growing consumer demand for personal data privacy and security as a market opportunity.34 Two-thirds of consumer product company executives surveyed by Deloitte Touche Tohmatsu think that consumers are more likely to purchase brands that are perceived to be protecting their personal information.35 Consumer surveys confirm this intuition.36 Conversely, misuse and abuse of consumer data can result in lost profitability as a result of fines and declining consumer trust.37 This suggests that marketplace dynamics might be sufficient to deliver an economically optimal level of data privacy and security. Apple Inc.'s recent public refusal to cooperate with the FBI to crack a terrorist suspect's iPhone is seen by some as an unusually vivid example of the importance of maintaining a brand's reputation for prioritizing consumer data privacy.38 As greater numbers of consumer product companies respond to the consumer privacy movement, there may be little need for stricter data privacy and security regulation. Such responsiveness may also be driven by a recognition that failing to act could invite increased regulatory intervention.39

Expanding Government Regulation


The FTC has assumed the role of the main watchdog for data privacy violations in the United States.41 Its primary legal authority is conferred by Section 5 of the FTC Act, which bars "unfair or deceptive acts or practices in or affecting commerce."42 The FTC has cautioned that a company is likely engaged in a deceptive practice in violation of Section 5 if it does not honor material promises it makes to consumers. Material promises include, for example, pledging not to share data with third parties, promising to provide consumers with choices about data sharing, or assuring consumers that their personal information will be secure.43 In addition, the failure to disclose information material to privacy concerns may be an unfair or deceptive practice under Section 5. For instance, in 2008 the FTC brought suit against a credit card marketing company that failed to disclose to consumers that their credit lines would be reduced if they used their cards for cash advances or for certain transactions, including marriage counseling, bars and nightclubs, pawn shops, and massage parlors.44 Pursuant to a court settlement with the FTC, the credit card marketing company agreed to reverse more than $114 million in fees charged to eligible consumers. The company also agreed to a civil penalty of $2.4 million to settle a parallel administrative action brought by the FDIC.45

In 2013, the FTC brought its first enforcement action against a marketer of an internet-enabled consumer product. The FTC alleged that the marketer of internet-connected video cameras failed to employ reasonable security in the design and testing of its software and transmitted user login credentials in plain text over the internet. As a result, a hacker was able to access and post links to live feeds, which included videos of infants sleeping in their cribs, young children playing and adults engaging in everyday activities.46 The FTC's settlement of the case required that the marketer cease misrepresenting the security of its cameras, establish a security program to address any risks that could result in unauthorized access of the devices, notify customers about security issues and provide customers with free technical support for two years.47

In addition to the FTC Act, the Commission has also enforced a number of sector-specific laws. The Fair Credit Reporting Act (FCRA) regulates consumer reporting agencies, such as credit bureaus and employment background screening companies, that prepare and sell reports used to make employment, credit, housing and other consumer-related eligibility decisions. The FTC has begun to apply FCRA to companies that offer novel approaches to informing eligibility decisions, in particular the use of big data analytics across a broader spectrum of data than traditional consumer reporting agencies typically use. For instance, in 2012, the FTC alleged that the online data broker Spokeo was subject to and failed to comply with FCRA because Spokeo compiled information from online and offline data sources and marketed the profiles to human resources departments to use in hiring decisions.48 Spokeo paid $800,000 in civil penalties under a consent decree to resolve the matter.49

The Children's Online Privacy Protection Act (COPPA) requires operators of commercial websites and online services directed to children under 13 to provide notice to parents and obtain parental consent before collecting children's personal information.50 In the FTC's view, COPPA could apply to the IoT.51 The FTC has brought more than a dozen actions for violations of COPPA since the act took effect in 2000.52 COPPA regulations were amended in 2013 to define personal information to include "persistent identifiers," such as cookies or customer numbers that enable retailers to track consumers across websites.53 Last year, the FTC brought its first enforcement actions based solely on the collection of persistent identifiers without parental consent. The FTC's complaints allege that the mobile app developers LAI Systems and Retro Dreamer, whose apps were directed at children, allowed third-party ad networks to collect and use persistent identifiers from their apps to target ads to children.54 LAI Systems and Retro Dreamer agreed to pay a combined total of $360,000 in civil penalties to settle the FTC's charges.55 These enforcement actions signal that the FTC intends to enforce COPPA aggressively.

Others have also called for an extension of COPPA to data generated by consumer products. In 2015, the parents of minor children filed a class-action suit against the makers of the high-tech "Hello Barbie" doll, which is designed to engage in conversation with children, record the conversation and store the associated data in a cloud database. To satisfy COPPA requirements, the makers obtain consent from the parent of the child who owns the doll. However, since children often play together, the doll also picks up and records the conversations of other children, whose parents did not provide the necessary consent.56 The case is currently pending.

In addition to its enforcement of the existing laws, the FTC is actively involved in discussions for the reform of data privacy legislation. In a report published last year on the IoT, the FTC concluded that IoT-specific legislation is premature given the evolving nature of the technology.57 Instead, the FTC continues to favor broad-based privacy legislation that is both flexible and technology-neutral.58 It envisions a statute that would give the FTC the authority to require certain basic privacy protections such as consumer disclosures and consumer choice that the FTC does not have authority to impose under the current FTC Act.59 The FTC also favors data security and breach notification legislation at the federal level.60 Lastly, the FTC has ongoing concerns about the lack of transparency surrounding data practices and the lack of meaningful consumer control over their personal data.61 The FTC has recommended that data brokers provide consumers with access to their data and has proposed a centralized website where data brokers would identify themselves, describe how they collect and use consumer data, and detail consumers' rights to access their data.62 Whether or not Congress adopts these specific recommendations, it is clear that the FTC intends to continue to expand its efforts to protect data privacy generally and regulate the growing use of data generated by consumer products and the IoT.

Originally appeared in the Fall 2016 Consumer Products: Adapting to Innovation Report.

  1. Fitbit, WIKIPEDIA (June 12, 2016, 12:42 PM).

  2. The global pet wearable market was valued at USD 837.6 million in 2014. Pet Wearable Market Analysis By Technology, GRAND VIEW RESEARCH (Feb. 2016).

  3. Nest Labs, WIKIPEDIA (June 15, 2016, 2:07 PM).

  4. Keith Barry, Smart Meets Elegant on GE's New Connected Oven, REVIEWED (Oct. 24, 2013).

  5. Twine, SUPERMECHANICAL (last visited June 16, 2016).

  6. The term Internet of Things refers to "the ability of everyday objects to connect to the Internet and to send and receive data." FED. TRADE COMM'N, INTERNET OF THINGS: PRIVACY AND SECURITY IN A CONNECTED WORLD i (2015) [hereinafter FTC, INTERNET OF THINGS].


  8. Gartner Says 6.4 Billion Connected "Things" Will Be in Use in 2016, Up 30 Percent From 2015, GARTNER (Nov. 10, 2015).

  9. The Four V's of Big Data, IBM; see also Ben Walker, Everyday Big Data Statistics, VCLOUDNEWS (Apr. 5, 2015).


  11. Id. at 3.


  13. Id. at 3.

  14. See id. at 3-4.

  15. Scott R. Peppet, Sensor Privacy as One Realistic & Reasonable Means to Begin Regulating Big Data, in BIG DATA AND PRIVACY: MAKING ENDS MEET 98, 99 (2013).

  16. Steve Johnson, Internet of Things Will Transform Life, But Experts Fear for Privacy and Personal Data, MERCURY NEWS (Nov. 1, 2014).

  17. Elizabeth Dwoskin, Why You Can't Trust You're Getting the Best Deal Online, WALL ST. J. (Oct. 23, 2014).

  18. Id.

  19. Dana Mattioli, On Orbitz, Mac Users Steered to Pricier Hotels, WALL ST. J. (Aug. 23, 2012).

  20. See Kara Brandeisky, How to Beat Online Price Discrimination, MONEY (Oct. 23, 2014).

  21. Byron Spice, Questioning the Fairness of Targeting Ads Online, CARNEGIE MELLON UNIV. (July 7, 2015).


  23. Id.

  24. Id. at 33.

  25. Pub. L. No. 106-102, 113 Stat. 1338 (1999).

  26. Joseph Turow et al., Americans Reject Tailored Advertising and Three Activities that Enable It, at 20–21 (Sept. 29, 2009) (unpublished manuscript).


  28. Doug Bonderud, Data Repurposing: The Underpinning of Predictive Analytics?, PREDICTIVE ANALYTICS TIMES (Nov. 26, 2013).

  29. See, e.g., Michal Kosinski et al., Private Traits and Attributes Are Predictable From Digital Records of Human Behavior, 110 PROCEEDINGS OF THE NAT'L ACAD. OF SCIS. 5802 (2013) (finding that the combination of information from Facebook Likes, demographic profiles, and the results of psychometric tests allows researchers to accurately predict a male user's sexual orientation 88% of the time, a user's political party affiliation 85% of the time, and whether the user used alcohol, drugs, or cigarettes approximately 70% of the time).

  30. Yves-Alexandre de Montjoye et al., Unique in the Crowd: The Privacy Bounds of Human Mobility, 3 SCI. REP. 1376 (2013); see also John Wihbey, Unique in the Crowd: The Privacy Bounds of Human Mobility, JOURNALIST'S RESOURCE (June 10, 2013), for a review of this report.

  31. Montjoye, supra note 30, at 4.

  32. Kim Martineau, Location Data on Two Apps Enough to Identify Someone, Says Study, COLUM. UNIV. (Apr. 13, 2016).

  33. Id.


  35. Id. at 11.

  36. Eighty percent of surveyed consumers report being more likely to purchase from consumer product companies that they believe protect their personal information. Id. at 6.

  37. Jeff John Roberts, The Competitive Advantages of Data Privacy, BLOOMBERG (July 29, 2013).

  38. See, e.g., Rahul Telang, FBI versus Apple: Analyzing Data Privacy, CARNEGIE MELLON UNIV. (Feb. 26, 2016).

  39. Roberts, supra note 37.

  40. This article focuses on US government regulations to protect data privacy. However, it should be noted that regulations for data privacy and protection are undergoing reform around the world. See, e.g., European Commission Launches EU-U.S. Privacy Shield: Stronger Protection for Transatlantic Data Flows, EUROPEAN COMM'N (July 12, 2016).

  41. The FTC is not the only watchdog for data privacy violations. Financial services and products are heavily regulated by the Consumer Financial Protection Bureau (CFPB). In March 2016, the CFPB brought its first data security enforcement action against an online payment processing company, alleging that the company had misrepresented its data security practices and contradicted its privacy policies. See Consent Order, In the Matter of Dwolla, Inc., File No. 2016-CFPB-0007 (Mar. 2, 2016); see also Paul Besozzi et al., CFPB – The Newest Cop on the Data Privacy Beat, LAW360 (Mar. 16, 2016) ("More than ever before, companies must understand that multiple agencies watch over data protection practices—including the FCC, FTC, U.S. Securities and Exchange Commission, state attorneys general and, now, the CFPB.").

  42. 15 U.S.C. § 45(a)(1).

  43. FTC, BIG DATA TOOL, supra note 25, at 21-22.

  44. Complaint, FTC v. CompuCredit Corp., No. 1:08-cv-1976-BBM-RGV (N.D. Ga. June 10, 2008).

  45. Subprime Credit Card Marketer to Provide At Least $114 Million in Consumer Redress to Settle FTC Charges of Deceptive Conduct, FTC (Dec. 19, 2008).

  46. Complaint, In the Matter of Trendnet, Inc., FTC File No. 1223090.

  47. Marketer of Internet-Connected Home Security Video Cameras Settles FTC Charges It Failed to Protect Consumers' Privacy, FTC (Sept. 4, 2013).

  48. FTC, BIG DATA TOOL, supra note 25, at 13-14.

  49. Consent Decree & Order for Civil Penalties, Injunction & Other Relief, United States v. Spokeo Inc., No. 2:12-cv-5001-MMM-SH (C.D. Cal. June 19, 2012).

  50. 15 U.S.C. §§ 6501-6502 (1998).

  51. FTC, INTERNET OF THINGS, supra note 6, at 53.

  52. Allison Grande, FTC Expands Child Data Protections in Mobile App Action, LAW360 (Jan. 6, 2016).

  53. 16 C.F.R. § 312 (2013).

  54. Complaint, United States v. Lai Sys., LLC, No. 2:15-cv-9691 (C.D. Cal. Dec. 17, 2015); Complaint, United States v. Retro Dreamer, No. 5:15-cv-2569 (C.D. Cal. Dec. 17, 2015). The FTC did not charge the third-party ad networks with COPPA violations because LAI Systems and Retro Dreamer did not tell the ad networks that their apps were directed at children. Third parties are responsible for COPPA violations only if they have "actual knowledge" that they are collecting personal information from children. Grande, supra note 52.

  55. Stipulated Order for Permanent Injunction & Civil Penalty Judgment, United States v. Lai Sys., LLC, No. 2:15-cv-9691 (C.D. Cal. Dec. 17, 2015); Stipulated Order for Permanent Injunction & Civil Penalty Judgment, United States v. Retro Dreamer, No. 5:15-cv-2569 (C.D. Cal. Dec. 17, 2015).

  56. Class Action Complaint, Archer-Hayes v. Toytalk, Inc., No. 2:16-cv-02111-JAK-PLA (C.D. Cal. Dec. 7, 2015).

  57. FTC, INTERNET OF THINGS, supra note 6, at 48-49.

  58. Id. at 55-56.

  59. Id. at 56-57.

  60. Id. at 55.

  61. Id. at 55-58.



Bonnie Phan
Senior Associate