Give It to Them Straight: How Entrepreneurs Can Save Consumers’ Online Privacy

When a pair of headphones I ordered arrived at my door, I didn’t expect to read through layers of privacy policy regarding location tracking and disclosures of this information to third parties. I only wanted to listen to music. The headphones came with an application to download onto my phone, and by default, the application would track my location and listening habits.1 Privacy policies are now integrated into the lives of everyday consumers, and more recently, COVID-19 pushed everyone indoors and online, where Personally Identifiable Information (PII) is generated and collected constantly.2 Data collection through user surveillance has long been the norm for smartphone applications and web platforms, but now even appliances and other “old” forms of technology are being transformed into internet-enabled devices. The Internet of Things (IoT) and wearables have recently exploded onto the scene and are collecting PII of their own. My headphones are just one example; another is the smart refrigerator, which can collect data on its owner’s food purchases, suggest recipes, and monitor eating habits.3 One can imagine how a time traveler from the not-so-distant 1990s might react when faced with the level of modern consumer surveillance. Everything from excitement and awe to 1984-style dread would be on the table. Rather than attempting to answer the biggest questions in online privacy, this essay identifies and elaborates on opportunities for entrepreneurs seeking to ethically address these systemic privacy issues within the current legal online privacy framework.

The Current Legal Landscape

Currently, privacy policy regulation in the United States exists as a patchwork of contract law principles,4 state law requirements,5 and sectoral federal regulations based on industry segment.6 Courts have thus far largely avoided applying tort principles to online privacy violations.7 Though calls for privacy protection began in the late nineteenth century,8 the first major piece of privacy legislation came in 1974 with the Privacy Act. That Act regulates information collection by the federal government and its agencies and does not extend to private organizations.9 Around the same time as the Privacy Act of 1974, the U.S. Department of Health, Education, and Welfare produced a privacy report that came to be known as the HEW Report.10 This report, along with other documents such as the OECD Guidelines,11 laid out the first steps toward the “notice and choice” structure that exists today. Similarly, the Fair Information Practice Principles (FIPPs) are guidelines based on the recommendations of the HEW Report and the OECD Guidelines; though not enforceable in themselves, they have been incorporated into some regulations.12

Until relatively recently, however, there were no statutory requirements that private organizations post and display a privacy policy regarding the use of user-generated and surveilled data. California enacted a privacy policy requirement for websites in 2003,13 and the California Attorney General has indicated that CalOPPA extends to mobile apps.14 This law and its interpretation have increased disclosure to consumers.15 The legal theory behind California’s law follows, in part, the notice and choice framework: it assumes that users will read the policies, thereby have adequate notice of the data surveillance practices, and then make an informed choice.16 (Although California’s state laws are a significant source of regulation of privacy policies, the FTC is indirectly empowered under § 5 of the FTC Act to enforce the terms of a website’s privacy policy.17) Boiled down, the notice and choice structure gives applications and web platforms substantial latitude to surveil consumers and use that data however they please, so long as the data use (1) is legal under traditional contract law principles, (2) is spelled out in the privacy policy, and (3) does not fall within one of the industries regulated by federal law.18 The rationale behind this system is to allow companies the freedom to innovate and continue to develop new products and services.19 Many policy advocates encourage self-regulation by the web platforms themselves to protect the innovative environment;20 however, others believe self-regulation has been largely unsuccessful.21 For example, some argue that there is no real notice or choice with respect to what online platforms are doing with user surveillance data.22 In addition, calls for more legislation from the states and the federal government are growing23 as part of a broader “techlash,”24 yet there has been little legislative response.25

Another common criticism of the notice and choice paradigm is that online users rarely understand what privacy policies mean and therefore have no real notice and no adequate recourse when security breaches occur.26 Additionally, many web platforms and edge service providers profit from the hidden nature of the data surveillance industry.27 Many collect alarmingly wide ranges of data,28 often unrelated to the function of the application.29 This data is then sold to data brokers, which run a hidden network of PII transactions.30 Consumers are rarely aware of this market, and many are unlikely to consider the far-reaching effects of sharing their PII when purchasing a fancy new refrigerator or simply using a search engine. This results in unexpected consequences, such as being targeted by predatory lending institutions, among other discriminatory harms.31 Even highly personal and private data of the kind otherwise protected by the Health Insurance Portability and Accountability Act (HIPAA) can escape the grip of data-usage regulation simply because the online platforms collecting it are not classified as hospitals or other institutions under the purview of HIPAA.32 The hidden nature of the data broker industry suggests that online users do not truly assent to policy terms, undermining the justification for the current privacy paradigm. Additionally, many have argued that this information asymmetry is precisely what fuels the data broker industry’s profitability.33 Yet consumers are willing to pay for privacy, and leaving them in the dark leaves money on the table.34

Opportunities for Entrepreneurs

The startup community recognized the importance of privacy policies early in the game.35 Many online entities make their money by collecting and selling user data; however, online business success is frequently determined by a web platform’s “superior ability to understand and meet customer needs,” not by how much data it has collected.36 Implementing more robust online privacy systems could play a significant role in meeting those needs. This can be forced through legislation, or firms can compete to win consumer trust. Some firms, such as Apple, have developed creative ways to communicate what data their applications collect and for what purpose.37 Unfortunately, not all online firms are so forward with their intentions, and over time this may prove to be to their detriment. As more people stay at home to work and use the internet as their source of interaction with the outside world, resentment against undesirable data privacy practices and sinking levels of trust will likely become a common business issue, adding to the ongoing techlash. Indeed, there are already concerns regarding data surveillance by social media platforms in a number of industries.38

A central component of improving the online consumer experience is increasing the level of consumer trust in the particular application, website, or online transaction.39 Increasing consumer trust, however, is not as easy as simply providing a paid subscription option; giving only those who can afford privacy the option to exercise it carries its own issues and does not directly address the trust component.40 Fortunately, this is not the only option available to online commercial enterprises. Apple’s recently introduced privacy “nutrition labels” are one model, and one more in accordance with how people actually learn. Providing individuals with a wall of words in legalese, unsurprisingly, does not facilitate consumer trust particularly well.41 Therefore, in the absence of any legal requirement to increase actual notice to consumers regarding the use of their data by an application or web platform, companies seeking a competitive advantage over established platforms such as Facebook, Google, or Amazon can offer services aimed at increasing consumer trust. Startups such as Jumbo are now realizing the venture capital potential of exposing the levels of data exploitation from which established firms profit.42 This signals an appetite among venture capitalists for applications and platforms that enhance online trust. As the techlash continues to plague major tech companies, startups and entrepreneurs are presented with an opportunity to win the war without even fighting.43 Consumers are willing to pay for more privacy controls at a time when both Congress and state legislatures are struggling to address the issue through meaningful legislation and established internet giants continue with business as usual.

Conclusion

The techlash trend stems from general frustration and resentment toward technology companies, and privacy policies are one of the trickiest areas of this phenomenon to reform. While legislators and policy experts argue over the path forward, entrepreneurs should take note of the attention venture capitalists are giving to privacy startups such as Jumbo and DuckDuckGo. In an area where self-regulation was intended to generate more innovation, it is somewhat surprising that there has not been more innovation in the consumer privacy experience, though a closer look at the profitability of the data broker industry suggests this is more by design than coincidence.

Still, there is money to be made by giving consumers what they want: more privacy controls in their virtual lives as those lives increasingly merge with their real ones. Given that firms were expected to self-regulate, it is natural to expect that a privacy “market” would eventually take form. The next round of industry leaders will be those who learn from the pitfalls of the tech “emperors.”44 Though this evolution seems natural, entrepreneurs face many potential issues in seeking to profit from privacy-enhancing services. One such issue may be neutralization through acquisition by the established tech giants; another, shifting regulatory tides. Yet another may be the difficulty of designing a profitable venture while avoiding the troublesome paid-for-privacy and freemium models, which rely heavily on targeted advertising and sales of data to third parties. These issues have not stopped the founders and investors behind Jumbo and DuckDuckGo, however: the former uses paid subscriptions, and the latter does not collect or sell surveillance data.45 Another potential venture could offer privacy management services similar to Jumbo’s for IoT devices. Developing novel ways to effectively communicate the contents of privacy policies to consumers is another open opportunity. Future online and IoT firms may find themselves at a great advantage if they can build business models that enhance consumer privacy and trust.


1. I turned off this feature.

2. Irving Wladawsky-Berger, Why the ‘Techlash’ Is a Threat to Growth and Progress, The Wall St. J. (June 6, 2020, 1:30 PM) (“COVID-19 will likely exacerbate this issue as firms accelerate their embrace of IT and automation”), https://blogs.wsj.com/cio/2020/06/06/why-the-techlash-is-a-threat-to-growth-and-progress/?guid=BL-CIOB-14913&mod=hp_minor_pos5&dsk=y.

3. Samsung, It’s All on Your Fridge, https://www.samsung.com/us/explore/family-hub-refrigerator/overview/ (last visited August 13, 2020).

4. See, e.g., In re Google, Inc. Privacy Policy Litig., 58 F. Supp. 3d 968, 985 (2014).

5. See generally Nat'l Conf. of St. Legislatures, State Laws Related To Internet Privacy (2020) [hereinafter NCSL] (discussing the various state laws regarding online privacy), https://www.ncsl.org/research/telecommunications-and-information-technology/state-laws-related-to-internet-privacy.aspx.

6. See, e.g., John A. Rothchild, Against Notice and Choice: The Manifest Failure of The Proceduralist Paradigm to Protect Privacy Online (Or Anywhere Else), 66 Clev. St. L. Rev. 559, 582 (2018).

7. Theodore Rostow, What Happens When an Acquaintance Buys Your Data?: A New Privacy Harm in the Age of Data Brokers, 34 Yale J. on Reg. 667, 679 (2017). See also Thomas Haley, Data Protection in Disarray, Wash. L. Rev. (forthcoming) (discussing the standing doctrine as a barrier for plaintiffs litigating online privacy violations in court).

8. See generally Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193 (1890).

9. 5 U.S.C. § 552a.

10. U.S. Dep't of Health, Educ., and Welfare, Records, Computers, and the Rights of Citizens (1973) [hereinafter HEW Report], https://www.justice.gov/opcl/docs/rec-com-rights.pdf.

11. OECD, OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (2013) [hereinafter OECD Guidelines], http://www.oecd.org/internet/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.

12. See Rothchild supra note 6, at 593-608 (discussing the ways in which state and federal laws have incorporated the principles from FIPPs into positive law).

13. Cal. Bus. & Prof. Code §§ 22575-22579 (collectively known as the California Online Privacy Protection Act (CalOPPA)).

14. Rothchild supra note 6, at 599-600; Allison Grande, Calif. AG Won’t Back Down After Delta Privacy Suit Setback, Law360 (May 13, 2013), https://www.law360.com/articles/440998/calif-ag-won-t-back-down-after-delta-privacy-suit-setback.

15. Rothchild supra note 6, at 598-600.

16. Id. at 600 (“[CalOPPA] implements the principle of notice, but not that of choice”).

17. See Gregory James Evans, Regulating Data Practices: How State Laws Can Shore Up the FTC’s Authority to Regulate Data Breaches, Privacy, and More, 67 Admin. L. Rev. 187, 192 (2015). See also Chris Jay Hoofnagle, FTC Regulation of Cybersecurity and Surveillance, The Cambridge Handbook of Surveillance L. 708 (2017).

18. One major exception is surveillance data collected on children, which is made unlawful by a number of state laws. NCSL supra note 5.

19. See, e.g., Fed. Trade Comm’n, FTC Staff Report: Self-Regulatory Principles for Online Behavioral Advertising 4 (2009) (“[T]he [FTC] has consistently sought to avoid stifling innovation so that responsible business practices could develop and flourish”).

20. See, e.g., Adam Thierer, The Internet of Things and Wearable Technology: Addressing Privacy and Security Concerns without Derailing Innovation, 21 Rich. J.L. & Tech. 6, 124-127 (2015) (discussing IoT privacy issues and arguing that innovation will be stifled by top-down industry regulations).

21. See generally Rebecca Lipman, Online Privacy and the Invisible Market for Our Data, 120 Penn. St. L. Rev. 777 (2016).

22. See Rothchild supra note 6, at 608-613. See also Scott R. Peppet, Regulating the Internet of Things: First Steps Toward Managing Discrimination, Privacy, Security, and Consent, 93 Tex. L. Rev. 85, 140-146 (2014) (arguing that there is likely no consent by consumers who purchase wearable devices).

23. Robert D. Atkinson, Doug Brake, Daniel Castro, Colin Cunliff, Joe Kennedy, Michael McLaughlin, Alan McQuinn, & Joshua New, A Policymaker’s Guide to the “Techlash”—What It Is and Why It’s a Threat to Growth and Progress, Information Technology & Innovation Foundation (Oct. 28, 2019) (“a wide range of activists rely on and stoke these conditions to help advance long-held policy agendas”), https://itif.org/publications/2019/10/28/policymakers-guide-techlash.

24. “Techlash” is used here to refer to the general negative response to technologies that are disrupting many areas of life, spurred by events such as Russian election hacking, the Cambridge Analytica scandal, and job losses due to automation. Atkinson et al. supra note 23 describe the techlash as “a general animus and fear, not just of large technology companies, but of innovations grounded in IT.” See also Wladawsky-Berger supra note 2.

25. See Atkinson et al. supra note 23 (discussing the various areas in which Congress should pass legislation regarding technology companies, including privacy policy). See also Lauren Bass, The Concealed Cost of Convenience: Protecting Personal Data Privacy in The Age of Alexa, 30 Fordham Intell. Prop., Media & Entm't. L. J. 261, 285 (2019) (discussing the lack of federal legislation for online privacy and the issues of artificial intelligence involved with user-generated surveillance data).

26. Stacy-Ann Elvy, Commodifying Consumer Data in the Era of the Internet of Things, 59 B.C. L. Rev. 423, 522 (2018) (arguing that flaws in privacy policies, Article 9 of the UCC, and the Bankruptcy Code regarding the notice and choice paradigm mean “consumer interests may be more adequately protected when restrictions are imposed on the collection, transfer, and assignment of certain types of data under commercial frameworks and in other monetization settings.”). See also Andrew W. Bagley & Justin S. Brown, Limited Consumer Privacy Protections Against the Layers of Big Data, 31 Santa Clara Computer & High Tech. L. J. 483, 495-498 (2015); Bass supra note 25, at 292 (“because privacy policies are one-size-fits-all, and in practice cannot be tailored, altered, or user-customized, the once-empowering concept of autonomous choice has been essentially reduced to ‘choosing’ between de facto acceptance of the stated terms or complete forfeiture of the use of the desired app, website, or software”).

27. Vivian Adame, Consumers’ Obsession Becoming Retailers’ Possession: The Way that Retailers are Benefiting from Consumers’ Presence on Social Media, 53 San Diego L. Rev. 653, 659-661 (2016).

28. Wolfie Christl, Corporate Surveillance in Everyday Life: How Companies Collect, Combine, Analyze, Trade, and Use Personal Data on Billions 43 (2017) (“[Acxiom, a data broker,] manages 15,000 customer databases and 2.5 billion customer relationships for 7,000 clients, including for 47 of the Fortune 100 companies . . . [and] claims to provide access to up to 5,000 data elements on 700 million people worldwide from ‘thousands of sources’ in many countries, including data about consumers in the US, the UK, and Germany”) (internal footnotes omitted). See also Latif Nasser, Connected: Surveillance, Netflix (2020) (interview with Judith Duportail discussing the amount of information, over 800 pages’ worth, Tinder collected from her personal use of the application), https://www.netflix.com/title/81031737.

29. Christl supra note 28, at 10.

30. Id. at 14.

31. Claire Cain Miller, When Algorithms Discriminate, N.Y. Times (July 9, 2015) (discussing concerns including perpetuating racial and gender stereotypes in addition to predatory loans), https://www.nytimes.com/2015/07/10/upshot/when-algorithms-discriminate.html.

32. Lipman supra note 21, at 788 (“[t]he medical information your FitBit or Apple Watch collects is not covered by HIPAA because HIPAA only covers certain entities like hospitals or health insurance companies, not user-generated health information”).

33. Matthew Crain, The Limits of Transparency: Data Brokers and Commodification, City U. of N.Y. Academic Works 6 (2017), https://academicworks.cuny.edu/cgi/viewcontent.cgi?article=1177&context=qc_pubs.

34. Michel Schreiner & Thomas Hess, Why Are Consumers Willing to Pay for Privacy? An Application of the Privacy-freemium Model to Media Companies 12 (2015) (“[t]hus, from a practical point of view, offering additional privacy control features, particularly in the form of a privacy-freemium model, might be a way for platform operators to monetize privacy protection as a new revenue model—provided that their users perceive the payable premium version as useful and trustworthy”).

35. See, e.g., WeWork, Why your startup needs a Terms of Use and Privacy Policy (2014), https://www.wework.com/ideas/city-guides/need-know-creating-terms-use-privacy-policy-startup.

36. Anja Lambrecht & Catherine E. Tucker, Can Big Data Protect a Firm from Competition? 15 (2015).

37. Michael Kan, How Much Data Are Your Apps Collecting? Apple 'Privacy Labels' Will Tell You, PC Mag (June 22, 2020), https://www.pcmag.com/news/how-much-data-are-your-apps-collecting-apple-privacy-labels-will-tell-you.

38. See generally Ben Larkin & Stephen McKelvey, Of Smart Phones and Facebook: Social Media’s Changing Legal Landscape Provides Cautionary Tales of “Pinterest” for Sports Organizations, 25 J. Legal Aspects of Sport 123 (2015), http://journals.iupui.edu/index.php/jlas/article/view/22211/21358. See also Joseph Steinberg, How Social Media Impacts Data Security, Bus. and Tech (June 22, 2020), https://www.futureofbusinessandtech.com/digital-security/how-social-media-impacts-data-security.

39. Schreiner & Hess supra note 34, at 12 (“[s]ince we did not find a significant influence of perceived internet privacy risk, our results suggest that even consumers who are afraid of privacy violations do not assess subscribing to a platform’s premium version (more) favorably if they do not perceive it as offering added value and as trustworthy”).

40. See, e.g., Stacy-Ann Elvy, Paying For Privacy and the Personal Data Economy, 117 Colum. L. Rev. 1369, 1378 (2017) (“these models may permit companies to continue to monetize consumer data to the detriment of consumers and engage in predatory and discriminatory behavior while hiding behind the veneer of consumer empowerment and control”).

41. Rod Sides, To Boost Customer Trust, Make Privacy a Top Priority, The Wall St. J. (January 29, 2020) (“The vast majority (73%) of consumers are more likely to be open to or neutral about sharing data if they are satisfied with privacy policies explaining how data is used”), https://deloitte.wsj.com/cmo/2020/01/08/to-boost-customer-trust-make-privacy-a-top-priority/.

42. Annie Musgrove, French-Founded Jumbo Privacy Snags $8M Series A to Help Consumers Protect Data, Crunchbase News (June 24, 2020), https://news.crunchbase.com/news/french-founded-jumbo-privacy-snags-8m-series-a-to-help-consumers-protect-data/.

43. Sun Tzu, The Art of War 77 (Samuel B. Griffith trans., Oxford U. Press 1963).

44. David Ingram, In Heated Hearing, Lawmakers Allege Tech Industry 'Emperors' Hold Too Much Power, NBC News (July 29, 2020) (“Subcommittee Chairman David Cicilline, D-R.I., said the CEOs had become 'emperors' on the internet”), https://www.nbcnews.com/tech/tech-news/4-tech-industry-titans-defend-size-their-companies-congress-n1235190.

45. Gennaro Cuofano, How Does DuckDuckGo Make Money? DuckDuckGo Business Model Explained, FourWeekMBA, https://fourweekmba.com/duckduckgo-business-model/ (last visited August 13, 2020).

Matthew Cook

J.D. Candidate, expected 2022, University of Virginia School of Law
