Taylor Swift’s Facial Recognition Is Creepy, but Would New US Privacy Laws Prevent It?

Taylor Swift used covert facial recognition technology to identify known stalkers at her concerts. Photo: AFP

Rolling Stone recently reported that Taylor Swift had used covert facial recognition technology to identify known stalkers at her concerts. At the same time, London’s Metropolitan Police continued its controversial trials of such technology on the general public. While U.K. privacy organizations claimed the London trials were illegal under laws such as the European Union’s General Data Protection Regulation (GDPR), there was no hint that Taylor Swift had been doing anything illegal under U.S. law.

In an insightful analysis, Jay Stanley of the American Civil Liberties Union pointed out the fundamental problems that facial recognition raises in such situations: what will be done with the people recognized? Will they end up on an inscrutable secret blacklist? How does the technology’s imperfect accuracy affect this? Does the public have a right to be informed in any way? Will awareness of such technologies restrain people’s behavior in other, broader situations?

Stanley’s analysis presented a number of desiderata. He stated that there should be checks and balances against the abuse of such surveillance. The public should be given notice of where, how, and why it takes place, and should be able to base their decision to buy a ticket for a Taylor Swift concert on that knowledge. They should also be told about the policies for retention and sharing of the images. The accuracy of the technology, and the false identifications it can produce, should be taken into account, as the rough calculation below illustrates. And there should be transparency and accountability around the watchlist.
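To see why accuracy matters so much at concert scale, here is a minimal back-of-the-envelope sketch in Python. The crowd size, number of watchlisted people present, and match rates are hypothetical illustrative assumptions, not figures from the Swift deployment or any real system.

```python
# Rough illustration of the base-rate problem in crowd-scale face matching.
# All numbers below are hypothetical assumptions, not data from any real deployment.

crowd_size = 60_000          # assumed attendance at a stadium concert
stalkers_present = 2         # assumed watchlisted individuals actually in the crowd
true_positive_rate = 0.90    # assumed chance a watchlisted face is correctly flagged
false_positive_rate = 0.001  # assumed chance an innocent face is wrongly flagged (0.1%)

expected_true_alerts = stalkers_present * true_positive_rate
expected_false_alerts = (crowd_size - stalkers_present) * false_positive_rate

total_alerts = expected_true_alerts + expected_false_alerts
share_false = expected_false_alerts / total_alerts

print(f"Expected true alerts:  {expected_true_alerts:.1f}")
print(f"Expected false alerts: {expected_false_alerts:.1f}")
print(f"Share of alerts that are false: {share_false:.0%}")
# Even with a 0.1% false positive rate, roughly 60 innocent people are flagged
# for every ~2 genuine matches, so almost every alert points at the wrong person.
```

Under these assumed numbers, the overwhelming majority of alerts would concern innocent concertgoers, which is precisely why Stanley’s call for accuracy safeguards and watchlist accountability matters.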

EU’s Privacy Laws

In the E.U., most if not all of those desiderata are legal obligations that arise through the GDPR and its precursor laws. Transparency and accountability are broadly required for all processing of personal data by private organizations, including through the publication of privacy notices. People have the right to view and correct their data, and in many circumstances to have it deleted.

Facebook CEO Mark Zuckerberg at the European Parliament. Photo: AFP

All data must be collected proportionately and for a defined purpose, and when it is not, or is no longer, relevant, it must be deleted. For biometric data, which facial recognition relies on, the rules are stricter still.

Specific scenarios involving large-scale surveillance – arguably including the Taylor Swift scheme – require an explicit data protection impact assessment, judging their proportionality and identifying unintended privacy consequences, before they can be rolled out.

Piecemeal US Legislation

The U.S. has traditionally solved its perceived privacy problems in the private sector through piecemeal legislation, covering individual issues such as credit scoring, video rental, health, or children’s data as they captured the public’s and politicians’ attention.

Now, with many privacy issues playing out on a global scale, including through U.S. companies being held to E.U. laws for much of their processing, there is some movement towards comprehensive privacy legislation.

The California privacy law that comes into force in 2020 will be the nation’s toughest data privacy law, and it gives businesses an extra incentive to push Congress to create a new U.S.-wide privacy law.

Draft Nationwide Privacy Law

It is in this light that the non-profit organization Center for Democracy and Technology (CDT) has introduced a draft U.S.-wide privacy law for private companies’ data processing.

The draft is remarkable in how seemingly arbitrarily it narrows its scope. Excluding governmental organizations makes sense given the existence of the Privacy Act, but data relating to employment is also excluded. The right to correction is limited to health information and information used in financial decisions – so a false positive erroneously added to Taylor Swift’s, or anyone else’s, blacklist could not be corrected through this law.

Taylor Swift would have to be more honest about the facial recognition, though. The draft law requires privacy notices setting out the how, why, and what of data collection – and it explicitly forbids the use of false pretenses to induce disclosure of personal information. Taylor Swift’s method of covertly recording people who watched clips at a kiosk her security team had installed at one of her concerts would fall into that category.

Promisingly, the draft law pays explicit attention to biometrics, and forbids biometric tracking when it is “not needed to provide or add to the functionality … requested.” At first glance, this would exclude the Taylor Swift scheme altogether.

Justifying Extensive Data Collection

A modern development in justifying extensive data collection is the prevention of crime and fraud. In the U.K., an identification scheme was introduced in the National Health Service to combat the barely existing but politically convenient notion of “health tourism.” In the Netherlands, a proposal to give insurance companies access to health data for the prevention of fraud was only narrowly rejected.

In claiming compliance with the E.U.’s GDPR, various web services cite fraud detection as a reason to collect identifying information from their visitors. Similarly, the CDT draft law would actually allow Taylor Swift to continue collecting facial biometrics, since she is allegedly doing so to “prevent illegal activity.”

Given the California law’s heavy emphasis on the resale of personal data, and the CDT draft’s focus on “deceptive” data processing, perhaps we are not yet looking at genuine attempts at broad data privacy legislation in the United States. Rather, continuing the history of U.S. privacy legislation, these laws represent a backlash against one of the main data privacy stories of 2018: that people have lost sight of how their information is sold on and used. The California law even explicitly mentions the Cambridge Analytica scandal in its introduction.

As such, these laws certainly force more openness about what Taylor Swift, Cambridge Analytica, and many others will do with personal data in the future, but they do not yet significantly constrain what those actors are allowed to do, and they are too modest to establish a framework of people’s rights over their personal data.
