Fairness metrics

Fairness metrics are mathematical measures used to determine whether a model is making unbiased predictions and treating all groups fairly. Microsoft Fairlearn provides several fairness metrics, including demographic parity (also called statistical parity), equal opportunity, equalized odds, and predictive parity. These measures are critical in promoting fairness in AI systems and in ensuring that all groups are treated equitably by AI models. Let’s look at these metrics in more detail:

  • Demographic parity aims to ensure that the predictions made by a model are independent of membership in a sensitive group. In other words, demographic parity is achieved when the probability of a given prediction does not depend on sensitive group membership. In the binary classification scenario, demographic parity corresponds to equal selection rates across groups. For example, in the context of a resume-screening model, equal selection would mean that the proportion of applicants selected for a job interview is the same across demographic groups, as the sketch following this list illustrates.
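
The following is a minimal sketch, not taken from the book, of how demographic parity could be checked with Fairlearn's metrics API. The toy labels, predictions, and sensitive-attribute values (group_a, group_b) are illustrative assumptions; only selection_rate, MetricFrame, and demographic_parity_difference come from the Fairlearn library.

import numpy as np
from fairlearn.metrics import (
    MetricFrame,
    selection_rate,
    demographic_parity_difference,
)

# Hypothetical outcomes from a resume-screening model (1 = selected for interview)
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0])

# Hypothetical sensitive attribute recorded for each applicant
sensitive = np.array(["group_a", "group_a", "group_a", "group_b",
                      "group_b", "group_b", "group_a", "group_b"])

# Selection rate per group: demographic parity holds when these are (nearly) equal
rates = MetricFrame(
    metrics=selection_rate,
    y_true=y_true,
    y_pred=y_pred,
    sensitive_features=sensitive,
)
print(rates.by_group)

# Gap between the highest and lowest group selection rates (0.0 means exact parity)
print(demographic_parity_difference(y_true, y_pred, sensitive_features=sensitive))

In practice, the selection-rate gap is rarely exactly zero; teams typically agree on a tolerance (for example, a maximum acceptable difference) and investigate the model and its training data when the gap exceeds it.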