AI Simplified: SHAP Values in Machine Learning

February 6, 2020
· 1 min read

What are SHAP values and how are they being used to explain machine learning predictions?

Meet Mark Romanowsky, a Data Scientist at DataRobot. Mark explains the important role that SHAP values play in machine learning using two real-world examples: how a group of friends can fairly and efficiently split the cost of an Uber ride to their separate homes, and how SHAP values can help identify college students at risk of not graduating.

In everyday life, Shapley values are a way to fairly split a cost or payout among a group of participants who may not have equal influence on the outcome. In machine learning models, SHAP values are a way to fairly assign impact to features that may not have equal influence on the predictions.
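The Uber example can be made concrete with a small sketch. The fares and rider names below are hypothetical (they are not from the video): three friends share one cab, a cab ride costs whatever the farthest rider's solo fare would be, and each rider's fair share is their Shapley value, computed directly from the textbook weighted-average-of-marginal-contributions formula.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, cost):
    """Exact Shapley values for a cost function defined on coalitions (sets of players)."""
    n = len(players)
    phi = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        # Average p's marginal cost over every coalition S that excludes p,
        # weighted by |S|! * (n - |S| - 1)! / n!
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (cost(set(S) | {p}) - cost(set(S)))
        phi[p] = total
    return phi

# Hypothetical solo fares: A lives closest ($6), then B ($12), then C ($42).
solo_fare = {"A": 6, "B": 12, "C": 42}

def ride_cost(group):
    # One shared cab costs the fare of the farthest rider in the group.
    return max((solo_fare[p] for p in group), default=0)

print(shapley_values(["A", "B", "C"], ride_cost))
# A pays $2, B pays $5, C pays $35; the shares sum to the full $42 fare.
```

SHAP applies the same idea to a model: the "cost" becomes the model's prediction, the "players" become the input features, and each feature's SHAP value is its fair share of the gap between the prediction and the average prediction.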

Learn more in his AI Simplified video.



About the author
Ashley Smith