Fairness and Intersectional Non-Discrimination in Human Recommendation (FINDHR)

Algorithmic hiring is the use of tools based on Artificial Intelligence (AI) to find and select job candidates. Like other applications of AI, it is vulnerable to perpetuating discrimination. Considering technological, legal, and ethical aspects, the EU-funded FINDHR project aimed to facilitate the prevention, detection, and management of discrimination in algorithmic hiring and closely related areas involving human recommendation.

New to FINDHR? The project’s main results are summarized in the Toolkits.

Specifically for developers, we released the following materials:

If you are a job seeker, we recommend you read:

Researchers and communicators may also be interested in the following:

FINDHR was a member of the AI Fairness cluster, which includes the European projects AEQUITAS, BIAS, and MAMMOTH, all funded by the Horizon Europe programme. This cluster plays a key role in the European Commission's strategy to promote trustworthy AI, with a focus on developing and implementing AI technologies that are human-centric, sustainable, secure, inclusive, and reliable.


