When it comes to recruitment, Artificial Intelligence (AI) tools can be used for everything from scanning CVs to identify candidates worth considering further, to sourcing relevant candidates in the first place. And the use of tools like Humanly and Workable to help businesses with the recruitment process is on the increase.
However, just as with any other supplier that processes personal data, you have to consider the data protection risks. This typically means carrying out due diligence, ensuring a data processing contract is in place (as required by Article 28 of UK GDPR) and potentially carrying out a Data Protection Impact Assessment (DPIA). Plus, depending on how the AI is used and how it works, you will need to consider the implications for candidates' rights regarding automated decision making and profiling.
To add to your AI recruitment due diligence, there are now some other considerations:
• The ICO’s “AI tools in recruitment outcomes report” (https://ico.org.uk/media/about-the-ico/documents/4031620/ai-in-recruitment-outcomes-report.pdf), which sets out the findings of a series of consensual audits carried out by the ICO, and
• The UK government’s “Responsible AI in Recruitment” guide (https://www.gov.uk/government/publications/responsible-ai-in-recruitment-guide/responsible-ai-in-recruitment), which has been drafted with input from the ICO
Both documents will give any discerning recruiter a good overview of the data protection and AI considerations when using AI as part of the recruitment process. But, in a nutshell, here’s a list of what needs to be considered:
- Processing by AI must be fair, and that fairness should be monitored, covering issues such as bias, accuracy, data minimisation and indeed whether AI is necessary at all. This includes considering how the use of AI is communicated to candidates entering the recruitment process (e.g. via a privacy policy)
- Any processing by AI of special category data should be monitored for bias and discriminatory outputs
- Recruiters and AI providers should be clear who is responsible for what under GDPR and set out whether they are Controllers, Joint Controllers or Processors
- The need for a DPIA to be carried out
- Recruiters must ensure:
• candidates are made aware of the processing by AI, through the provision of privacy information (e.g. via a privacy policy) covering what personal data is processed, the logic involved and how any personal data is used for training
• they only use the minimum amount of information necessary when processing via AI, and only for that specific purpose
• they set out explicit instructions to AI providers on the processing of candidates' data
• AI providers are complying with their GDPR obligations
- AI providers should:
• provide technical information to recruiters so that they understand how the AI works
• assess the minimum amount of information needed to develop and train the AI, the purpose for processing, and how long the personal information will be used
• follow the explicit instructions provided by recruiters
Providing cost-effective, simple-to-understand and practical GDPR and ePrivacy advice and guidance via my one-stop-shop helpline. I ❤️ GDPR