Explaining Scalable Batch Predictions in DataRobot

April 20, 2020 · by Linda Haviland · 1 min read

This post was originally part of the DataRobot Community. Visit now to browse discussions and ask questions about DataRobot, AI Platform, data science, and more.

The new Batch Prediction API lets you efficiently score large datasets stored in the cloud against models deployed in DataRobot, using your dedicated prediction servers. You don’t need additional software or new hardware; everything is managed by DataRobot.

In this learning session, Christian Joergensen, a Senior Software Engineer at DataRobot, will walk you through how to use the new Batch Prediction API to:

  • Efficiently score large datasets on S3
  • Efficiently score large datasets on Snowflake
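
For reference, here is a minimal sketch of what scoring a dataset on S3 with the Batch Prediction API can look like using the DataRobot Python client. The deployment ID, bucket paths, API token, and credential ID below are placeholders, and the exact intake/output options may vary by client version.

    import datarobot as dr

    # Connect to DataRobot (endpoint and token are placeholders).
    dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

    # Score a CSV stored on S3 against an existing deployment and write the
    # predictions back to S3. S3 access is granted via a stored credential ID.
    job = dr.BatchPredictionJob.score(
        deployment="YOUR_DEPLOYMENT_ID",
        intake_settings={
            "type": "s3",
            "url": "s3://my-bucket/input/scoring_data.csv",  # placeholder path
            "credential_id": "YOUR_S3_CREDENTIAL_ID",
        },
        output_settings={
            "type": "s3",
            "url": "s3://my-bucket/output/predictions.csv",  # placeholder path
            "credential_id": "YOUR_S3_CREDENTIAL_ID",
        },
    )

    # Block until DataRobot has finished scoring the dataset.
    job.wait_for_completion()

Scoring a Snowflake table follows the same pattern, with the intake and output settings pointing at a database connection rather than S3; the session covers both workflows in detail.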

Hosts

  • Christian Joergensen (DataRobot, Senior Software Engineer)
  • Jack Jablonski (DataRobot, AI Success Manager)

Now what?

After watching the learning session, you should check out these resources for more information.

  • Pathfinder: Explore the Marketplace of AI Use Cases
About the author
Linda Haviland

Community Manager
