The rapid development of technology has brought with it changes in social interaction, workplace productivity and even the legal landscape. Facial recognition software is the next hot topic. Like all new and emerging technology, it brings with it serious ramifications, especially if misused.
One only has to look at a recent trend: FaceApp, which took the internet by storm. FaceApp, a mobile application (iOS/Android) created by Russian company Wireless Labs, used artificial intelligence (AI) to create frighteningly realistic transformations of photographs of faces.
Want to look older? Younger? Change gender? This application can do that, and while it may be fun to see what you could look like in 20 to 30 years, the fact is that this kind of information can be decidedly detrimental to one’s privacy.
FaceApp foreshadowing Facebook’s facial recognition technology and lawsuit
The real issue here is privacy and how the company protects users' data. Reservations about using this kind of service are to be expected in the aftermath of the Cambridge Analytica scandal, in which millions of people had their personal data misused after a seemingly fun personality quiz harvested information from quiz-takers and their friends. People became wary of how their data was accessed, used or sold to someone else.
Consider the story of Ever, a photo storage app that used its users' pictures to train facial recognition software, which was then sold to law enforcement. Or IBM, which used Flickr photos to train facial recognition systems without permission. Or PopSugar's application, which made pictures uploaded by its users publicly available at an unsecured web address.
Then, there are also legal ramifications for this kind of application.
It is not just about the photos. The application also collects users' browsing history and location data. Even though the privacy statement says "we will not rent or sell your information to third parties outside FaceApp," the information is shared with "third-party advertising partners."
FaceApp is not alone in its behavior. Social media giant Facebook is also facing trouble over its facial recognition technology: a class-action suit has been filed in federal court alleging that the feature violates users' privacy.
The Facebook facial recognition lawsuit
The U.S. Court of Appeals for the Ninth Circuit affirmed a lower court's ruling certifying a class action alleging that Facebook's "Tag Suggestions," a facial recognition program, violated the Illinois Biometric Information Privacy Act (BIPA).
Facebook argued that the three Illinois plaintiffs lacked standing to sue because they had suffered no concrete harm.
BIPA, passed in 2008, protects against the unlawful collection and storage of biometric information, and it is the only such law that allows private individuals to sue for damages for a violation of the Act. Illinois was the first state to pass such a law; Texas and Washington have since passed similar laws.
In summary, the Ninth Circuit affirmed that the plaintiffs alleged a concrete injury and that the class was properly certified (Patel v. Facebook, Inc., No. 18-15982 (9th Cir. Aug. 8, 2019)).
According to Court of Appeals Judge S. Ikuta: “The plaintiffs allege that a violation of these requirements allows Facebook to create and use a face template and to retain … for all time…. the privacy right protected by BIPA is the right not to be subject to the collection and use of such biometric data, (and) Facebook’s alleged violation of statutory requirements would necessarily violate the plaintiffs’ substantive privacy interests.”
The plaintiffs’ attorney of record, Shawn Williams, added: “The Ninth Circuit’s opinion further confirms availability of legal redress for the growing privacy intrusions by large corporations surreptitiously amassing mountains of personal information from consumers.”
Facebook in hot water once again
The Ninth Circuit’s ruling landed just as Facebook was facing growing pressure over other alleged privacy violations. Consider the staggering $5 billion fine the Federal Trade Commission (FTC) levied against Facebook for consumer privacy violations, or the security problems in the Messenger Kids app that have added to the company’s ever-growing list of alleged privacy breaches.
The alleged technical glitch allowed a child’s friend to create a group chat and invite contacts that the friend’s parents had approved but the first child’s parents had not. It potentially opened the door for an adult posing as a child to join the chat.
And now, the biometric case filed against Facebook
The lawsuit filed against Facebook for alleged violation of BIPA began its journey through the courts in 2015 and took direct aim under the strictest law in the country protecting biometric data, such as iris scans and fingerprints. The Illinois law requires companies to obtain a “written” release before collecting an individual’s biometric data and to destroy that data after a designated period of time. The case was slated to go to trial in July 2018, but the Ninth Circuit granted Facebook’s request to appeal a 2018 ruling granting class certification.
Facebook argued that the class was certified without any showing of harm beyond a bare statutory violation, and some industry pundits suggested the certification set a “dangerous precedent.” On the other side, Electronic Privacy Information Center president Marc Rotenberg stated: “…collection of biometric information presents profound risks to privacy, safety, and security.”
As for the judges on the Ninth Circuit panel, they relied on the “common law roots to the right of privacy” and on the Fourth Amendment, which the Supreme Court has extended to technological advances such as cell-site location tracking and GPS monitoring.
Judge Ikuta wrote: “Once a face template of an individual is created, Facebook can use it to identify that individual in any of the other hundreds of millions of photos uploaded to Facebook each day, as well as determine when the individual was present at a specific location.” She added: “We conclude that the development of a face template using facial-recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests.”
Facebook argued that the lower court did not consider that the nine Facebook servers storing the face templates are not located in Illinois, suggesting potential variances among class members. The Ninth Circuit held that where the alleged violation took place is a “threshold question” still to be decided.
Ultimately, the case returns to a lower court, and given that BIPA’s statutory damages run from $1,000 to $5,000 per violation, Facebook may face a staggering damage award or a significant settlement. This comes on top of the $5 billion Federal Trade Commission (FTC) fine Facebook already paid for violating a 2012 consent order intended to prevent the company from misrepresenting its data collection, sharing and security practices to consumers.
Facebook’s statements over the past several years regarding its facial recognition tools and use of data have allegedly been misleading, and both the U.S. Securities and Exchange Commission (SEC) and the FTC have fined the company over practices involving the mining and sharing of data that was not Facebook’s to share.
The overarching concern in this case, and in others of a similar nature, is that biometric information and data are so sensitive that the public welfare needs to be protected by a statute such as BIPA and that the statute needs to be rigorously enforced.