By Kathleen Wills, Esq.*
Kathleen Wills is a graduate of Antonin Scalia Law School and former C-IP2 RA.
Artificial Intelligence and Big Data
While many of us have come to rely on biometric data when we open our phones with Apple’s “Face ID,” speak to Amazon’s Alexa, or scan our fingerprints for access, it is important to understand some of the legal implications of the big data feeding artificial intelligence (AI) algorithms. “Big Data” refers to the processing of large-scale and complex data,[1] while “biometric data” refers to the physical characteristics of humans that can be extracted for recognition.[2] AI and biometrics work together in interactions like those exemplified above, since AI is a data-driven technology and personal data has become propertised.[3] The type and sensitivity of the personal data used by AI depend on the application, and not all applications trace details back to a specific person.[4] The already-active field of Big Data analysis of biometrics in AI continues to grow, promising both challenges and opportunities for consumers, governments, and companies.
A. How AI Uses Big Data
AI works with Big Data to accomplish several different outcomes. For example, AI can use Big Data to recognize, categorize, and find relationships within the data.[5] AI can also work with Big Data to adapt to patterns and identify opportunities so that the data can be understood and put into context. For organizations looking to improve efficiency and effectiveness, AI can leverage Big Data to predict the impact of various decisions. In fact, AI can work with algorithms to suggest actions before they are deployed, assess risk, and provide real-time feedback from Big Data pools. When AI combines Big Data with biometrics, it can perform various types of human recognition for applications in every industry.[6] In other words, the more data AI can process, the more it can learn. Thus, the two rely on each other to keep pushing the bounds of technological innovation and machine learning.
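To make the “recognize, categorize, and find relationships” point concrete, here is a minimal, hypothetical sketch of the kind of unsupervised grouping an AI pipeline might run over feature vectors derived from large-scale data. It is plain Python with a deterministic toy two-cluster k-means, not any particular vendor’s system:

```python
def squared_dist(p, q):
    """Squared Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def two_means(points, iters=10):
    """Toy 2-cluster k-means: a minimal example of 'finding groups' in data.

    Illustrative only; real systems use far larger data, more clusters,
    and careful initialization.
    """
    centers = [points[0], points[-1]]  # deterministic init for the sketch
    clusters = [[], []]
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[], []]
        for p in points:
            i = 0 if squared_dist(p, centers[0]) <= squared_dist(p, centers[1]) else 1
            clusters[i].append(p)
        # Move each center to the mean of its assigned points.
        centers = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Two obviously separated groups of 2-D "feature vectors".
data = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]
centers, clusters = two_means(data)
```

The algorithm never sees labels; the structure emerges from the data itself, which is why the scale and sensitivity of the underlying data pool matter so much.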
B. How AI Relates to Privacy Laws
Since AI involves analyzing and understanding Big Data, often the type involving biometrics, or personal information, there are privacy considerations and interests to protect. Further, since businesses want access to consumer data in order to optimize the market, governments are placing limits on the use and retention of such data. For some sectors, the boundary between privacy and AI becomes an ethical one. One can immediately imagine the importance of keeping biometric health data private, calling to mind the purpose of HIPAA, the Health Insurance Portability and Accountability Act,[7] even though AI can help doctors better understand patterns in their patients’ health, diagnoses, and even surgeries.
I. United States Privacy Law
A. Federal Privacy Law
Despite growing concerns about the privacy and security of the data used in AI, there is currently no comprehensive federal privacy law in the United States. In 2020, Senators Jeff Merkley and Bernie Sanders proposed the National Biometric Information Privacy Act, which was not passed into law; it contained provisions such as requiring consent from individuals before collecting their information, providing a private right of action for violations, and imposing an obligation to safeguard identifying information.[8] The act would also have required private entities to draft public policies and implement mechanisms for destroying information, limit collection of information to valid business reasons, inform individuals that their information is stored, and obtain written releases before disclosure.
B. State Privacy Laws
A few states have passed their own privacy laws or amended existing laws to include protections for biometric data, including Illinois, Texas, California, Washington, New York, Arkansas, Louisiana, Oregon, and Colorado. Other states have pending bills or have tried, so far unsuccessfully, to pass biometric protection regulation.
The first, and most comprehensive, biometric regulation was enacted in 2008: the Illinois Biometric Information Privacy Act (BIPA), which governs the collection and storage of biometric information.[9] The law applies to all industries and private entities but exempts the State and local government agencies.[10] BIPA requires entities to inform individuals in writing that their information is being collected and stored and why, and it restricts selling, leasing, trading, or otherwise profiting from such information. There is a right of action for “any person aggrieved by a violation” in state circuit court, or as a supplemental claim in federal district court, that can yield $1,000 per negligent violation and $5,000 per intentional or reckless violation, as well as attorneys’ fees and equitable relief. In 2018-2019, over 200 lawsuits were reported under BIPA, usually class actions against employers.[11]
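The liquidated-damages figures above compound quickly in a class action, which helps explain the volume of BIPA litigation. A hypothetical back-of-the-envelope sketch (illustrative only, not legal advice; it ignores attorneys’ fees, equitable relief, and how courts actually count “violations”):

```python
def bipa_exposure(class_size, negligent_each=0, intentional_each=0):
    """Hypothetical exposure estimate from BIPA's liquidated damages:
    $1,000 per negligent violation and $5,000 per intentional or
    reckless violation, multiplied across a class.
    """
    per_person = 1_000 * negligent_each + 5_000 * intentional_each
    return class_size * per_person

# A 1,000-employee class with one negligent violation each already
# implies seven-figure exposure:
print(bipa_exposure(1_000, negligent_each=1))  # prints 1000000
```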
Texas’s regulation, Chapter 503: Biometric Identifiers, varies greatly from Illinois’s act.[12] Under this chapter, a person may not capture another’s biometric identifier for a commercial purpose unless they inform the individual and receive consent; once consent is obtained, the identifier cannot be sold, leased, or disclosed to anyone else unless the individual consents to that financial transaction or the disclosure is permitted by a federal or state statute. The chapter suggests a timeframe for destroying identifiers, sets a maximum civil penalty of $25,000 per violation, and is enforced by the state attorney general. Washington’s legislation, Chapter 19.375: Biometric Identifiers, is similar to Texas’s regulation in that the attorney general enforces it; however, Washington carves out a security-purposes exception from the notice and consent procedures usually required before collecting, capturing, or enrolling identifiers.[13]
California enacted the California Consumer Privacy Act of 2018 (CCPA), which provides a broader definition of “biometric data” and gives consumers the right to know which information is collected and how it is used, to delete that information, and to opt out of the sale of that information.[14] The law applies even to entities without a physical presence in the state, so long as they (a) have gross annual revenue of over $25 million, (b) buy, receive, or sell the personal information of 50,000 or more California residents, households, or devices, or (c) derive 50% or more of their annual revenue from selling California residents’ personal information.[15] The CCPA was amended and expanded by the California Privacy Rights Act (CPRA), which becomes effective January 1, 2023.[16] One expansion under the CPRA is a new category of “sensitive personal information,” which encompasses government identifiers; financial information; geolocation; race; ethnicity; religious or philosophical beliefs; genetic, biometric, and health information; sexual orientation; nonpublic communications such as email and text messages; and union membership. The CPRA also adds new consumer privacy rights, including the right to restrict the use of sensitive information, and creates a new enforcement authority. Thus, the CPRA brings California’s privacy law closer to the European Union’s General Data Protection Regulation.[17]
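The CCPA’s three alternative applicability thresholds can be read as a simple disjunction. The sketch below is illustrative only and not legal advice: the statute’s actual tests turn on defined terms (“business,” “personal information,” “sale”) that are omitted here, and the function name and parameters are hypothetical:

```python
def ccpa_likely_applies(gross_annual_revenue_usd,
                        ca_records_handled,
                        share_of_revenue_from_selling_ca_data):
    """Rough sketch of the CCPA thresholds described above.

    Meeting ANY one of the three prongs is enough, even for an
    entity with no physical presence in California.
    """
    return (
        gross_annual_revenue_usd > 25_000_000            # prong (a)
        or ca_records_handled >= 50_000                  # prong (b)
        or share_of_revenue_from_selling_ca_data >= 0.50 # prong (c)
    )

# A firm with only $10M in revenue but 60,000 California consumer
# records would still meet prong (b):
print(ccpa_likely_applies(10_000_000, 60_000, 0.0))  # prints True
```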
New York amended its existing data breach notification law to bring biometric information within the definition of “private information.”[18] Like California’s law, the SHIELD Act applies to all companies holding residents’ data; in addition, the SHIELD Act outlines procedures companies should implement for administrative, technical, and physical safeguards. New York also passed limited biometric legislation for employers, though it provides no private right of action.[19] Similarly, Arkansas amended its Personal Information Protection Act so that “personal information” now includes biometric data. Louisiana amended its Database Security Breach Notification Law to do the same and added data security and destruction requirements for entities.[20] Finally, Oregon amended its Consumer Information Protection Act to include protections for biometric data alongside consumer privacy and data rights.
Most recently, on July 8, 2021, Colorado enacted the Colorado Privacy Act (CPA) when the Governor signed the bill into law.[21] The state Attorney General explains that the law “creates personal data privacy rights” and applies to any person, commercial entity, or governmental entity that maintains personal identifying information. Like consumers in California, consumers in Colorado can opt out of certain practices under the Act, though not all: residents cannot opt out of the Act’s limits on unnecessary and irrelevant collection of information, and controllers must receive a resident’s consent before processing sensitive personal information. As for remedies, the CPA provides a 60-day cure period to fix noncompliance with the Act, after which controllers face civil penalties; consumers do not have a private right of action under this law.
II. International Privacy Law
Other countries have pioneered data privacy regulations, as exemplified by the European Union’s (EU’s) General Data Protection Regulation (GDPR).[22] Since 2018, this regulation has been enforced against companies operating within any EU member state in order to protect “natural persons with regard to the processing of personal data and rules relating to the free movement of personal data.” The GDPR “protects fundamental rights and freedoms of natural persons,” particularly their personal data. The regulation is quite comprehensive, with chapters on the rights of data subjects, transfers, and remedies, and even provisions for particular processing situations such as freedom of expression and information. There are several carve-outs, or “exceptions,” to the regulation, such as where a citizen gives consent for a specific purpose or the data are necessary for preventive or occupational medicine. Citizens also have “the right to be forgotten,” may withdraw consent at any time, and can lodge a complaint for violations or seek a judicial remedy, compensation, or administrative fines.
Since the GDPR protects the data of EU citizens and residents wherever it is processed, it has an extraterritorial effect. In January 2021, the European Data Protection Board (EDPB) and the European Data Protection Supervisor jointly adopted written opinions on new standard contractual clauses under the GDPR. One set of clauses will govern the transfer of personal data by processors to third countries outside of the EU.[23] The transfer of personal data to a third country or international organization may take place only if certain conditions are met, namely compliance with the safeguards of European data protection law. However, enforcement of the GDPR is taking time, and Ireland’s data protection commissioner, Helen Dixon, has explained that enforcement goes beyond issuing fines. Notably, because Apple, Facebook, Google, LinkedIn, and Twitter are based in Ireland, that country takes the lead in investigating those companies.[24]
The GDPR has influenced other countries’ privacy laws. For example, Canada has a federal privacy law, the Personal Information Protection and Electronic Documents Act, and provincial laws protecting personal information in the private sector, which were heavily influenced by the GDPR.[25] Argentina has begun the legislative process of updating its national data protection regime, passing a resolution to that effect in January 2019.[26] Further, Brazil’s General Data Protection Law replicates portions of the GDPR and includes extraterritoriality provisions, but it also allows for additional flexibility. The GDPR has likewise affected regulatory enforcement in Israel, which the European Commission has recognized as an adequate jurisdiction for processing personal information. While the list of countries affected by, or taking notes from, the GDPR is quite extensive, it is important to recognize this as a global challenge and opportunity: protecting the privacy of consumers when handling biometrics and Big Data and using them in AI.
III. Why the Legal Considerations for AI Matter
AI and the use of Big Data and biometric information in everyday life affect a multitude of individuals and entities. AI can use a consumer’s personal, and often highly sensitive, information, and misappropriation of or violations involving that information are enforced against business entities. Governments around the globe are working to determine which regulations, if any, to pass to protect AI and data, and what the scope of such rules should be. In the U.S., some states task the Attorney General with enforcing state privacy laws, while others provide individuals with a private right of action. Interestingly, given the role AI plays in innovation and technology, venture capital (VC) firms might also play a role as the law develops, since they can work with policymakers and lobbyists to assess potential market failures, risks, and the benefits of protecting AI and data.[27]
In addition to the individuals, governments, entities, and industries affected by AI and Big Data biometric analysis, there are broader legal implications. While this article discusses, at a high level, the international and national privacy law considerations raised by AI, other constitutional and consumer protection laws are implicated as well. AI and other uses of Big Data and biometric information have quickly become ingrained in our everyday lives since IBM created the first smartphone in 1992. As laws around the world continue to be discussed, drafted, killed, adopted, or amended, it is essential to understand AI and the data it uses.
* The information in this article does not, nor is it intended to, constitute legal advice, and has been made available for general information purposes only.
[1] Shafagat Mahmudova, Big Data Challenges in Biometric Technology, 5 J. Education and Management Engineering 15-23 (2016).
[2] Ryan N. Phelan, Data Privacy Law and Intellectual Property Considerations for Biometric-Based AI Innovations, Security Magazine (June 12, 2020).
[3] Gianclaudio Malgieri, Property and (Intellectual) Ownership of Consumers’ Information: A New Taxonomy for Personal Data, 4 Privacy in Germany 133 ff (April 20, 2016).
[4] Jan Grijpink, Privacy Law: Biometrics and privacy, 17 Computer Law & Security Review 154-160 (May 2001).
[5] Jim Sinur and Ed Peters, AI & Big Data; Better Together, Forbes, https://www.forbes.com/sites/cognitiveworld/2019/09/30/ai-big-data-better-together/?sh=5c8ed5f360b3 (Sept. 30, 2019).
[6] Joshua Yeung, What is Big Data and What Artificial Intelligence Can Do?, Towards Data Science, https://towardsdatascience.com/what-is-big-data-and-what-artificial-intelligence-can-do-d3f1d14b84ce (Jan. 29, 2020).
[7] David A. Teich, Artificial Intelligence and Data Privacy – Turning a Risk into a Benefit, Forbes, https://www.forbes.com/sites/davidteich/2020/08/10/artificial-intelligence-and-data-privacy–turning-a-risk-into-a-benefit/?sh=5c4959626a95 (Aug. 10, 2020).
[8] Joseph J. Lazzarotti, National Biometric Information Privacy Act, Proposed by Sens. Jeff Merkley and Bernie Sanders, National Law Review, https://www.natlawreview.com/article/national-biometric-information-privacy-act-proposed-sens-jeff-merkley-and-bernie (Aug. 5, 2020).
[9] Natalie A. Prescott, The Anatomy of Biometric Laws: What U.S. Companies Need to Know in 2020, National Law Review (Jan. 15, 2020).
[10] Biometric Information Privacy Act, 740 ILCS 14 (2008).
[11] Supra note 9.
[12] Tex. Bus. & Com. Code § 503.001 (2009).
[13] Wash. Rev. Code Ann. § 19.375.020 (2017).
[14] California Consumer Privacy Act (CCPA), State of California Department of Justice, https://oag.ca.gov/privacy/ccpa (last accessed May 22, 2021).
[15] Rosenthal et al., Analyzing the CCPA’s Impact on the Biometric Privacy Landscape, https://www.law.com/legaltechnews/2020/10/14/analyzing-the-ccpas-impact-on-the-biometric-privacy-landscape/ (Oct. 14, 2020).
[16] Brandon P. Reilly and Scott T. Lashway, Client Alert: The California Privacy Rights Act has Passed, Manatt, https://www.manatt.com/insights/newsletters/client-alert/the-california-privacy-rights-act-has-passed (Nov. 11, 2020).
[17] Peter Banyai et al., California Consumer Privacy Act 2.0 – What You Need to Know, JDSupra, https://www.jdsupra.com/legalnews/california-consumer-privacy-act-2-0-93257/ (Nov. 27, 2020).
[18] Samantha Ettari, New York SHIELD Act: What New Data Security Requirements Mean for Your Business, JDSupra (June 1, 2020).
[19] Supra note 9, referring to N.Y. Lab. Law §201-a.
[20] Kristine Argentine & Paul Yovanic, The Growing Number of Biometric Privacy Laws and the Post-COVID Consumer Class Action Risks for Businesses, JDSupra, https://www.jdsupra.com/legalnews/the-growing-number-of-biometric-privacy-2648/#:~:text=In%202019%2C%20Arkansas%20also%20jumped,of%20an%20individual’s%20biological%20characteristics.%E2%80%9D (June 9, 2020).
[21] The Colorado Privacy Act: Explained, Beckage, https://www.beckage.com/privacy-law/the-colorado-privacy-act-explained/ (last accessed July 13, 2021); see also Phil Weiser: Colorado Attorney General, Colorado’s Consumer Data Protection Laws: FAQ’s for Business and Government Agencies, https://coag.gov/resources/data-protection-laws/ (last accessed July 13, 2021).
[22] General Data Protection Regulation (GDPR), https://gdpr-info.eu/ (last accessed May 22, 2021).
[23] Update on European Data Protection Law, National Law Review, https://www.natlawreview.com/article/update-european-data-protection-law (Feb. 24, 2021).
[24] Adam Satariano, Europe’s Privacy Law Hasn’t Shown Its Teeth, Frustrating Advocates, New York Times, https://www.nytimes.com/2020/04/27/technology/GDPR-privacy-law-europe.html (April 28, 2020).
[25] Eduardo Soares et al., Regulation of Artificial Intelligence: The Americas and the Caribbean, Library of Congress Legal Reports, https://www.loc.gov/law/help/artificial-intelligence/americas.php (Jan. 2019).
[26] Ius Laboris, The Impact of the GDPR Outside the EU, Lexology, https://www.lexology.com/library/detail.aspx?g=872b3db5-45d3-4ba3-bda4-3166a075d02f (Sept. 17, 2019).
[27] Jacob Edler et al., The Intersection of Intellectual Property Rights and Innovation Policy Making – A Literature Review, WIPO (July 2015).