Explaining Model Packages and the Model Registry in DataRobot
This post was originally part of the DataRobot Community.
Creating a deployment begins with uploading a package into the Model Registry. The Model Registry is the central hub for all your model packages. A package contains a file or set of files with information about your model; its contents vary depending on the type of model being deployed. Once the package is in the Model Registry, you can create a deployment.
The Model Registry provides a consistent deployment, replacement, and management experience for all of your ML models. If the model was built in the DataRobot AutoML platform, its model package is automatically added to the Model Registry package list when the deployment is created from the Leaderboard; otherwise, you add packages to the list manually by uploading them.
There are three kinds of model packages in the Model Registry:
- models built within DataRobot,
- your own model built outside of DataRobot that you upload into MLOps, and
- your own external model hosted outside of DataRobot.
A key difference is that for DataRobot models and custom models, prediction requests are received and processed within MLOps through an API call, while external models generate predictions in an outside environment, after which the predictions are transferred back to MLOps for tracking.
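To make the API-call path concrete, here is a minimal sketch of a scoring request to a deployed model's Prediction API. The host, deployment ID, token, and feature names are placeholders, and the request is only prepared, not sent; the URL pattern and headers follow DataRobot's Prediction API conventions, but check the documentation for your installation before using them.

```python
import json
import requests

# Placeholders -- substitute your own prediction server and credentials.
PRED_SERVER = "https://example.datarobot.com"
DEPLOYMENT_ID = "YOUR_DEPLOYMENT_ID"
API_TOKEN = "YOUR_API_TOKEN"

# Records to score, as a list of row dictionaries (feature names are illustrative).
rows = [{"feature_a": 1.2, "feature_b": "red"}]

# Build the request against the deployment's prediction endpoint.
request = requests.Request(
    method="POST",
    url=f"{PRED_SERVER}/predApi/v1.0/deployments/{DEPLOYMENT_ID}/predictions",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_TOKEN}",
    },
    data=json.dumps(rows),
).prepare()  # prepared but not sent; sending requires a live deployment

print(request.url)
```

Because the deployment receives and processes the request inside MLOps, every call made this way is automatically tracked for service health and accuracy monitoring.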
The Model Registry provides the ability to register all of your models in one place to give you a consistent experience, regardless of the origin and location of a deployment. In all three cases, you use MLOps to track the predictions the model makes and assess the model accuracy just the same.
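For the external-model case, the predictions computed in the outside environment must be bundled up and reported back to MLOps. The sketch below shows one way such a tracking payload might be assembled; the function name and field names are assumptions for illustration, not the exact schema used by DataRobot's monitoring tooling.

```python
import json
from datetime import datetime, timezone

def build_tracking_payload(deployment_id, predictions, association_ids):
    """Bundle predictions made outside DataRobot for reporting to MLOps.

    association_ids let MLOps later join predictions to actual outcomes
    so that accuracy can be assessed. Field names here are illustrative.
    """
    return {
        "deploymentId": deployment_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "rows": [
            {"associationId": aid, "prediction": pred}
            for aid, pred in zip(association_ids, predictions)
        ],
    }

# Example: two predictions made by an externally hosted model.
payload = build_tracking_payload("deployment-123", [0.82, 0.17], ["row-1", "row-2"])
print(json.dumps(payload, indent=2))
```

Once reported, these predictions appear in the same monitoring views as predictions served by DataRobot itself, which is what makes the experience consistent across all three package types.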
Search DataRobot public documentation for Create Model Packages and Model Registry.