AI Simplified: SHAP Values in Machine Learning

February 6, 2020
by Ashley Smith · 1 min read

What are SHAP values and how are they being used to explain machine learning predictions?

Meet Mark Romanowsky, a Data Scientist at DataRobot. Mark explains the important role that SHAP values play in machine learning through real-world examples: how a group of friends can fairly and efficiently split the fare for an Uber ride to their separate homes, and how the same idea can help identify college students at risk of not graduating.

In everyday life, Shapley values are a way to fairly split a cost or payout among a group of participants who may not have equal influence on the outcome. In machine learning models, SHAP values are a way to fairly assign impact to features that may not have equal influence on the predictions.
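The ride-sharing idea above can be sketched in a few lines of Python. This is a minimal illustration, not DataRobot's implementation: the rider names and fares are hypothetical, and the cost of serving any group of riders is assumed to be the fare to the farthest home in that group. The exact Shapley value averages each rider's marginal cost over every possible drop-off order.

```python
from itertools import permutations

def shapley_values(players, cost):
    """Exact Shapley values: average each player's marginal cost
    over every arrival order of the group."""
    values = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = cost(coalition)
            coalition = coalition | {p}
            # Marginal cost this player adds when joining the coalition
            values[p] += cost(coalition) - before
    return {p: v / len(orders) for p, v in values.items()}

# Hypothetical ride: the fare to reach each rider's home; the farthest
# stop determines the total cost of any shared trip.
fares = {"Ann": 6, "Bo": 12, "Cy": 42}
cost = lambda group: max((fares[p] for p in group), default=0)

shares = shapley_values(list(fares), cost)
print(shares)  # fair split: Ann pays 2, Bo pays 5, Cy pays 35
```

Note that the shares sum exactly to the full 42 fare (the "efficiency" property), and riders closer to the start pay far less than an equal three-way split would charge them. SHAP applies the same accounting to a model's features, with the prediction playing the role of the fare.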

Learn more in his AI Simplified video:


Ready for more AI Simplified? Check out these other videos:

About the author
Ashley Smith