Delivering More Together with DataRobot and Snowflake Integrations

June 9, 2022
by Peter Prettenhofer · 3 min read

Snowflake Summit 2022 (June 13-16) draws ever closer, and I believe it’s going to be a great event. A couple of sessions I’m excited about include the keynote, “The Engine & Platform Innovations Running the Data Cloud,” and learning how the frostbyte team conducts “Rapid Prototyping of Industry Solutions.” Another real treat for attendees will be the conversation with elite rock climber Alex Honnold.

I’m also excited to spread the word about some of the latest enhancements and integrations between the DataRobot AI Platform and Snowflake’s Data Cloud. These include scoring code, prediction explanations, telemetry feedback, and automated feature discovery. I’ll explain each briefly, along with why it is good news for our joint customers.

  • Scoring code. Users can now run DataRobot scoring code directly inside Snowflake. (If you aren’t familiar with scoring code, you can learn about it on our Wiki page.) By eliminating the need to extract and load data, this new capability significantly decreases the time required to score large datasets on comparable infrastructure. Instead of extracting data from Snowflake, scoring it against DataRobot prediction servers, and loading the results back into the Snowflake database, you deploy and execute the DataRobot scoring code inside Snowflake, taking full advantage of the speed and scalability of the Snowflake Data Cloud. (A minimal sketch of this pattern follows this list.)
  • Prediction explanations. DataRobot not only makes predictions from its models, it also explains how those predictions were made, which helps organizations meet regulatory requirements and helps users understand the models. This value-add feature is now available for models run within Snowflake, and it scales horizontally: because the models run inside Snowflake, they can score terabytes of data or more, up to whatever Snowflake itself supports. Keeping the data, models, predictions, and explanations together translates into higher reliability for the user, and it helps ensure that the single source of truth you’re creating on your Snowflake investment extends beyond just your data to your AI as well.
  • Telemetry feedback. DataRobot feeds your telemetry data back into the MLOps system and warns you of data drift that can affect the accuracy of your models. For example, a feature may have a known valid range of values; if incoming data falls outside that range, it could mean a faulty device or another mechanical error on the data collection side. DataRobot surfaces these warnings so that you can evaluate whether the data source needs investigation and keep your models accurate. (A toy version of this kind of range check appears after this list.)
  • Automated feature discovery. Automated feature discovery (AFD) is a capability I’m really excited about and would like to see used more. With it, users can automatically prepare relational data, running complex joins and aggregations to extract predictive features. If your relational sources live within Snowflake, DataRobot can now push some of those operations down into Snowflake to accelerate feature discovery. We plan to expand the partnership further by enhancing these push-down capabilities so that, eventually, most feature engineering runs within Snowflake and takes advantage of the elastic scale of the Snowflake Data Cloud. (A sketch of the kind of join-and-aggregate work that gets pushed down also follows this list.)
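
To make the scoring-code pattern concrete, here is a minimal sketch of deploying an exported scoring-code JAR as a Snowflake Java UDF and scoring a table in place. The stage, JAR path, handler class, and table and column names are all placeholder assumptions, not the actual DataRobot artifacts; see the DataRobot and Snowflake documentation for the exact workflow.

```python
# Minimal sketch: run exported scoring code inside Snowflake as a Java UDF.
# The stage, JAR, handler class, and table/column names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ANALYTICS_WH", database="DEMO_DB", schema="PUBLIC",
)
cur = conn.cursor()

# Upload the scoring-code JAR exported from DataRobot to an internal stage.
cur.execute("CREATE STAGE IF NOT EXISTS model_stage")
cur.execute("PUT file:///tmp/scoring_code.jar @model_stage AUTO_COMPRESS=FALSE")

# Wrap the JAR in a Java UDF (the handler class here is hypothetical).
cur.execute("""
    CREATE OR REPLACE FUNCTION score_transaction(amount DOUBLE, channel VARCHAR)
    RETURNS DOUBLE
    LANGUAGE JAVA
    IMPORTS = ('@model_stage/scoring_code.jar')
    HANDLER = 'com.example.ScoringHandler.score'
""")

# Score in place: no data is extracted from Snowflake.
cur.execute("""
    SELECT id, score_transaction(amount, channel) AS prediction
    FROM raw_transactions
""")
print(cur.fetchmany(5))
```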
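
The telemetry point can be illustrated with a toy range check. This is not the DataRobot MLOps API, just a sketch of the kind of out-of-range detection that drift monitoring automates, using a made-up feature and valid range.

```python
# Toy illustration (not the DataRobot MLOps API) of a valid-range check:
# flag incoming values outside the range seen during training so a faulty
# device or upstream collection error is caught before it degrades the model.
from dataclasses import dataclass

@dataclass
class RangeCheck:
    feature: str
    low: float
    high: float

    def violations(self, values):
        return [v for v in values if not (self.low <= v <= self.high)]

# Hypothetical valid range derived from the training data.
check = RangeCheck(feature="temperature_c", low=-40.0, high=85.0)

incoming = [21.5, 22.0, 999.0, 20.8]  # 999.0 suggests a faulty sensor
bad = check.violations(incoming)
if bad:
    print(f"Warning: {len(bad)} out-of-range value(s) for {check.feature}: {bad}")
```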
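
And to give a feel for what push-down means for automated feature discovery, here is a small sketch of join-and-aggregate feature engineering expressed as SQL that runs inside Snowflake rather than client-side. The tables, columns, and 90-day window are illustrative assumptions, not output of the actual feature discovery engine.

```python
# Sketch of join-and-aggregate feature engineering pushed down into Snowflake.
# Tables, columns, and the 90-day window are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="my_user", password="...")
cur = conn.cursor()

# Build per-customer aggregates from a secondary transactions table inside
# Snowflake, instead of extracting both tables and joining them client-side.
cur.execute("""
    SELECT c.customer_id,
           COUNT(t.txn_id) AS txn_count_90d,
           AVG(t.amount)   AS avg_amount_90d,
           MAX(t.amount)   AS max_amount_90d
    FROM customers c
    LEFT JOIN transactions t
      ON t.customer_id = c.customer_id
     AND t.txn_date >= DATEADD(day, -90, CURRENT_DATE)
    GROUP BY c.customer_id
""")
features = cur.fetchall()
```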

There are many more features and benefits of the DataRobot and Snowflake partnership than can be detailed in one blog post. For example, DataRobot provides a cloud-agnostic environment, giving customers maximum flexibility in choosing how and where to run these tools. DataRobot’s code-first experience also lets advanced users build their own code that works within DataRobot or can be used for ad hoc analysis within Snowflake or other cloud data sources.

  • If you’re at Snowflake Summit, stop by the DataRobot booth to see our integrations in action, or join one of our sessions to learn how our customers are using DataRobot and Snowflake to scale and accelerate their AI initiatives. Get the real experience in our hands-on labs, where you’ll use DataRobot and Snowflake together to prepare data, build and train models, deploy and monitor them, write predictions back to Snowflake, and analyze the results in Snowflake.
  • Hear how customers achieve AI at scale using DataRobot and Snowflake by attending Lisa Aguilar’s Fireside Chat session on Wednesday, June 15, where she will talk with customers about how they have made AI core to their business strategy.

Learn more about and register for the Snowflake Summit here, and stop by DataRobot booth 620! I wish all attendees a great conference!

About the author
Peter Prettenhofer

VP of Engineering at DataRobot

Peter Prettenhofer is VP of Engineering at DataRobot. He studied computer science at Graz University of Technology, Austria, and Bauhaus University Weimar, Germany, focusing on machine learning and natural language processing. He is a contributor to scikit-learn, where he co-authored a number of modules, including Gradient Boosted Regression Trees, Stochastic Gradient Descent, and Decision Trees.
