The British Security Industry Association (BSIA) has released a first-of-its-kind ethical and legal use guide for AFR (automated facial recognition), which sets out how, for which purposes, and under what conditions end-users can deploy the technology.
BSIA, the trade association for the professional security industry in the UK, stresses that the aim of the guide is to ensure that AFR is used in such a manner that it does not cause harm or discriminate against any person, in either a public or private setting.
“The use of AI is an exponentially growing part of daily life and we must ensure that all stakeholders are aware of the ethical and legal considerations of using these solutions. If not, this beneficial technology could be misused, leading to loss of trust and increased scepticism of the technology,” says Dave Wilkinson, Director of Technical Services at the BSIA and leader of the AFR working group.
“This collaborative piece of work among industry experts has produced a guide with advice and recommendations on ethical and legal AFR usage, which will appeal to anyone in or out of the physical security industry.
“We want to make sure the general public know that this ethical and legal guidance is out there for companies to follow. Compliance with the law is paramount when using this technology, and this guide will provide companies with the basis to demonstrate their commitment to complying with the ethical realities, consequences and impacts of using an AI/AFR solution.”
In the absence of a global ethical framework governing the use of AFR, the guide has been framed according to the recommendations of the OECD (Organisation for Economic Co-operation and Development), which calls for the technology to be used ethically, transparently, in accordance with the rule of law, and with respect for human rights, diversity, and democratic values.
According to the guide, AFR must not be used to identify individuals without their prior consent; AFR training data must be obtained lawfully; the database of images against which AFR matches faces must be legally controlled, as set out in the Data Protection Act 2018; and the use of AFR should be proportionate to its purpose.
It also calls for users of AFR to ensure that personal data is made available for subject access requests, and that all data collected is necessary, proportionate, stored transparently, and retained for no longer than necessary. Users of AFR must also maintain data protection policies, conduct Data Protection Impact Assessments (DPIAs), and nominate individuals or groups to take responsibility for the ethical and legal compliance and operation of the system.
Considering that data privacy will take centre stage in data collection through AFR technology, users of the technology must also identify data controllers, identify measures to reduce risks, assess necessity and proportionality, describe the processing activity, utilise privacy masking, and confirm completion of the DPIA and record its outcomes.
The guide also stresses that users of AFR solutions should take special care over the storage and retention of sensitive data of private citizens. They must consider how long the data is to be retained and how often it needs to be reviewed, ensure that cyber security protections are in place to protect the data, and define the purpose of databases in line with ethical and legal requirements.