Privacy

Who’s Looking At You

As facial recognition software has become more reliable, it has entered the grey area between personal security and personal privacy.

Clearview AI has begun to trial a system that scrapes facial images from the Internet and, combining them with location and situational data, offers a commercial search function for faces. The system allows an image to be uploaded and reports back where and when matched individuals have been located. The service is targeted at law enforcement, with the aim of revealing the activities and accomplices of existing suspects. Even if the legality of a match is not upheld in court, it can provide clues as to where an individual may have been at a specific time, which investigators can follow up and prove by other means.
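A service of this kind can be pictured as a nearest-neighbour search over face embeddings: each scraped image is reduced to a numeric vector, and a query face is compared against the index. The sketch below is purely illustrative and is not Clearview's actual implementation; the embedding vectors, URLs, locations, and similarity threshold are all made up for the example.

```python
# Illustrative sketch only: matching a query face embedding against an
# index of scraped images. All data here is hypothetical.
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors, in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical index: face embedding -> (source URL, location, date)
index = [
    ([0.9, 0.1, 0.3], ("example.com/event.jpg", "Las Vegas", "2019-06-01")),
    ([0.2, 0.8, 0.5], ("example.org/blog.jpg", "Sydney", "2020-01-15")),
]

def search(query, threshold=0.95):
    """Return metadata for every indexed face similar to the query."""
    return [meta for emb, meta in index
            if cosine_similarity(query, emb) >= threshold]

# A query embedding close to the first indexed face matches only it.
matches = search([0.88, 0.12, 0.31])
```

In a real deployment the embeddings would come from a trained face-recognition network and the index would hold billions of entries behind an approximate nearest-neighbour structure, but the report-back step is the same: the query returns where and when similar faces were seen.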

Clearview cites an example in its promotional video: in 2019, images of a sexual abuser of a 7-year-old girl were reported through a web service provider to Las Vegas police. A match was found to a bystander standing in the background of a picture from a bodybuilding event. The individual was tracked down and pleaded guilty in February 2020, receiving 35 years in prison. Few would argue that the rescue of the victim does not offset this violation of the perpetrator's rights. There is not always such a clear line between risk and benefit.

The moral problem with this model is that the images in Clearview's database are sourced from publicly available Internet sites: social media accounts, blogs, corporate websites; anywhere with a decent source image. The engine considers neither the copyright of the image creator nor the privacy of the subject. With some hosting services, uploading an image may surrender some of these rights to the host. On Facebook, for example, the uploader retains ownership of an image, but Facebook gains certain rights to use it, including the right to transfer that use to other parties. This is a commercial transaction: the uploader yields some privacy rights in exchange for a defined service. In many cases of image hosting there is no such agreement. It is not in the interest of ISP hosts to sell files, as that would deter people from using their services. Unlike the big social media platforms and search engines, most web hosts are unlikely to have the software to identify the content of the images they host.

Clearview was investigated jointly by the UK Information Commissioner's Office and the Office of the Australian Information Commissioner from July 2020. On 3rd November 2021, the Australian agency announced that Clearview was unlawfully using images of Australians and ordered it to stop collecting facial images of Australians and to delete existing images from its database. The Australian government's view is that the Australians who posted the images did not expect, or specifically allow, them to be used for surveillance purposes. Clearview has since announced its intention to appeal. Clearview's position is that the images were available to anyone: no passwords or security procedures protected them. It also argues that, as the harvested images are stored in the USA rather than Australia, Australian privacy law does not apply to them.

This is a case of existing legislation not keeping up with technology. As Internet access speeds and camera resolutions (especially on mobile devices) increase, it becomes increasingly easy to post high-quality images on the web. These provide sufficiently detailed data for recognition software to identify unique faces. The concept has been in the realm of Hollywood fiction for some time, with computers on crime dramas scanning through facial images to find a match for a photo or video still. In 2002, Minority Report predicted facial recognition offering personalised adverts to passers-by. The concept has since leaked into the real world. UK borders use facial recognition booths to match holders' faces to their passport images; the process is slow and often fails. Facial recognition cameras at airports, railway stations and other public venues offer some limited privacy, in that an individual could choose not to use that service or visit the venue. In the case of scraping images from the web, the target has no means to opt out of the system.

The emergence of COVID has led to widespread use of face masks. These, together with hoodies, hats, glasses and so on, will substantially reduce the effectiveness of face recognition. As long as there is the possibility of this recognition being used, there is the inference that someone wearing a mask could be trying to hide their identity. Using a mask to disguise one's identity is not a new concept: consider the cowboys pulling their bandannas over their lower faces in countless Western films; not all cowboys were bad guys.

We still have control over what images we post online, and hence how we might be tracked in the future. The technology used by Clearview and similar services can only improve, and the laws regulating its use will eventually catch up with it. How we treat images of other people when we post them on the Internet will become increasingly important, as it is the privacy of others that we are exposing.
