AI Simplified: Model Interpretability
What good is a machine learning model if you don’t understand what it does, how it works, or how to put it to use? Bill Surrette, Data Scientist at DataRobot, explains the importance of model interpretability in this AI Simplified video.
“Any time we build a model to represent a complex, real-world situation, we as data scientists want to make sure we understand what the model is doing, but we also need to be able to explain the model to our business partners and stakeholders.” — Bill Surrette, Data Scientist
Bill describes the role and history of the linear model, walks through a specific example of predicting readmissions at a hospital, and shows how to evaluate models differently so that the best one is chosen for your specific business needs.
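Linear models are a classic starting point for interpretability because their coefficients are the explanation: each weight says how much the prediction moves when a feature changes by one unit. The sketch below fits a one-feature linear model with ordinary least squares on toy, made-up numbers (hypothetical data, not from the video), relating prior inpatient visits to readmission rate.

```python
# Minimal sketch of linear-model interpretability (toy data, invented
# for illustration): prior inpatient visits vs. readmission rate.
xs = [0, 1, 2, 3, 4, 5]
ys = [0.10, 0.14, 0.19, 0.22, 0.27, 0.31]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least squares for a single feature: closed-form slope/intercept.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# The coefficients ARE the explanation: each additional prior visit
# raises the predicted readmission rate by `slope`.
print(f"intercept = {intercept:.3f}, slope = {slope:.3f}")
# → intercept = 0.100, slope = 0.042
```

Here the model itself is the explanation you hand to a business partner: a baseline readmission rate of about 10%, plus roughly 4 percentage points per prior visit. More complex models trade away that transparency for accuracy, which is exactly the tension the video explores.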
Watch the full video below:
Learn more from the AI Simplified series:
- AI Simplified: Machine Learning Problem Types
- AI Simplified: Sports Analytics
- AI Simplified: Data Requirements