Explaining Deployments Dashboard in DataRobot
This post was originally part of the DataRobot Community.
The Deployments tab provides you with a dashboard list of all your deployments; this includes those you deployed yourself, as well as deployments shared with you by others. By deployment we mean a model that DataRobot is tracking so you can effectively monitor and manage its performance.
A deployment in the DataRobot MLOps environment represents one of three kinds of underlying models: DataRobot, Custom Inference, or External.
The first is a model built and deployed from within DataRobot AutoML or AutoTS. Specifically, these are models built after you upload your data and click the Start button. You request predictions from these models through the DataRobot API.
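To make the API scoring path concrete, here is a minimal sketch of how a client might assemble a scoring request for a deployment. The host name, deployment ID, API key, and feature names are all hypothetical placeholders; the URL pattern follows the general shape of DataRobot's prediction endpoint, but check your own deployment's integration snippet for the exact form.

```python
import json

def build_prediction_request(host, deployment_id, api_key, rows):
    """Assemble the URL, headers, and body of a scoring request.

    host, deployment_id, and api_key are placeholders for your own
    prediction server; rows is a list of feature dicts to score.
    """
    url = f"https://{host}/predApi/v1.0/deployments/{deployment_id}/predictions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps(rows)
    return url, headers, body

# Example: two rows of features for a hypothetical churn model.
url, headers, body = build_prediction_request(
    "example.orm.datarobot.com",  # hypothetical prediction server
    "abc123",                     # hypothetical deployment ID
    "YOUR_API_KEY",
    [{"tenure": 12, "plan": "basic"}, {"tenure": 3, "plan": "premium"}],
)
# The request itself would then be sent with any HTTP client, e.g.:
# requests.post(url, headers=headers, data=body)
```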
The second is a custom inference model that you build outside of DataRobot and then upload into it. As with DataRobot models, you request predictions from these custom models through the DataRobot API.
The third is an external model hosted in your own environment that communicates with DataRobot remotely. In this case, you install the DataRobot MLOps agent software, which acts as a bridge between your application and DataRobot. You request predictions from your model as you normally would, then pass the prediction output to the agent, which reports your prediction data back to DataRobot so that the deployment can capture that information.
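The agent path boils down to writing monitoring records that pair features with predictions. The sketch below shows the general shape of such a record; the field names are illustrative assumptions, not the agent's official schema, which is defined by your agent and spooler configuration.

```python
import time
import uuid

def make_monitoring_record(deployment_id, model_id, features, predictions):
    """Build a record of the kind an application hands to the MLOps agent.

    Field names here are illustrative; consult your agent configuration
    for the actual schema. The association IDs let DataRobot join actual
    outcomes to these predictions later.
    """
    return {
        "deploymentId": deployment_id,
        "modelId": model_id,
        "timestamp": time.time(),
        "associationIds": [str(uuid.uuid4()) for _ in predictions],
        "features": features,
        "predictions": predictions,
    }

record = make_monitoring_record(
    "abc123", "model-v1",          # hypothetical deployment and model IDs
    [{"tenure": 12}], [0.87],
)
# An application would then append such records to the agent's spool
# (a file, queue, or other channel, depending on configuration).
```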
In all three cases, your deployment captures the predictions that the underlying model makes, along with the actual outcomes of those predictions once they are collected and uploaded. And in all three cases, the Deployment user interface provides you with a view into 1) how the nature of your input data changes, 2) how the distribution of the model's predictions changes, and 3) how accuracy changes over time.
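Tracking accuracy depends on uploading actual outcomes and matching them to earlier predictions by an association ID. Below is a sketch of how such an actuals payload might be shaped; the field names follow the general pattern of DataRobot's actuals upload but should be treated as illustrative rather than the exact API schema.

```python
def build_actuals_payload(pairs):
    """Pair each prediction's association ID with its observed outcome.

    `pairs` maps association ID -> actual value. Field names are
    illustrative assumptions; check the actuals upload reference for
    your DataRobot version before sending.
    """
    return {
        "data": [
            {"associationId": aid, "actualValue": value}
            for aid, value in pairs.items()
        ]
    }

# Example: two orders whose true outcomes (1 = churned, 0 = retained)
# are now known and can be joined to their earlier predictions.
payload = build_actuals_payload({"order-001": 1, "order-002": 0})
```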
On the main Deployments page, across the top of the inventory, a summary of the usage and status of all active deployments is displayed, with color-coded health indicators.
Beneath the summary is an individual report for each deployment. There are two unique deployment lenses that modify the information displayed in the dashboard:
- The Prediction Health lens summarizes prediction usage and model status for all active deployments.
- The Governance lens reports the operational and social aspects of all active deployments.
To change deployment lenses, click the active lens in the top right corner and select a lens from the dropdown or click the left or right arrow.
Starting with the Prediction Health lens: next to the name of each deployment you see color-coded icons representing the level of health. This refers to the number of errors for the Service Health column, and to the degree of data shift or accuracy degradation for the Data Drift and Accuracy columns.
A few metrics on prediction traffic are also displayed.
Let’s now switch to the Governance lens. Importance indicates the model’s traffic volume, financial impact, and other measures of value. The build environment indicates the environment in which the model was built. Then there’s information about the owner and age of the model, and a Humility monitor, which reports when the model makes uncertain predictions. Lastly, there’s a menu of options available for managing the model.
To view all of this information in detail, simply click on the deployment you want to view.
Search the DataRobot Public Platform Documentation for Deployment inventory and Model Registry.