AI and Insurance
Strategic Risk Selection
- Identify the most profitable prospects
- Accelerate conversion rates
- Improve quote accuracy
- Increase renewals while reducing “churn”
- Embed “best practices”
Precision Pricing and Reserving
- Access leading-edge machine learning algorithms
- Deploy pricing models without reprogramming
- Increase accuracy of loss costs
- Develop rates five to fifteen times faster
- Develop losses individually for each claim
- Build reserves accurately from the “bottom up”
Optimized Claims Management
- Identify claims for straight-through or manual processing
- Flag potentially fraudulent claims
- Identify subrogation opportunities
- Predict claim severities and large loss potentials
- Improve adjuster performance with outcome-based assignments
Insurance Case Studies
There is no function in insurance that will be unaffected by the adoption of artificial intelligence and machine learning. Besides automating and informing traditional processes, AI and machine learning create new capabilities that empower insurers to optimize every function in the insurance value chain.
Dynamic Pricing Precision
Using DataRobot for pricing, a large UK motor insurance carrier substituted a gradient boosting model for a generalized linear model (GLM) in one line of business. As a result, the carrier reduced its losses, improved its combined ratio, increased its retention rates, and reduced its acquisition costs. These improvements resulted in $8 million in savings. DataRobot’s ability to execute linear and nonlinear algorithms simultaneously helps deliver precise, risk-specific pricing that reduces vulnerability to adverse selection.
Using AI in actuarial science — for renewal price management — can also help you keep customers. Insurers lose money when good customers don’t renew their policies. Using DataRobot, a large European insurer incorporated the risk of “churn” into its renewal pricing, leading to reductions in cancellations and non-renewals, an improved loss ratio, and a 24% reduction in variable costs. In all, the company estimates the value at €12.5 million a year.
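Incorporating churn risk into renewal pricing amounts to choosing the price that maximizes expected profit: the probability the customer renews times the margin at that price. A minimal sketch, using a hypothetical logistic retention curve (in practice the churn model would be learned from data, and the sensitivity parameter here is purely illustrative):

```python
import math

def retention_prob(price, current_premium, sensitivity=3.0):
    """Hypothetical logistic retention curve: renewal likelihood falls as
    the quoted price rises above the customer's current premium.
    The sensitivity value is illustrative, not an estimated parameter."""
    return 1.0 / (1.0 + math.exp(sensitivity * (price / current_premium - 1.0)))

def best_renewal_price(current_premium, expected_loss, candidates):
    """Pick the candidate price maximizing expected profit:
    P(renew) * (price - expected loss)."""
    return max(
        candidates,
        key=lambda p: retention_prob(p, current_premium) * (p - expected_loss),
    )
```

For a customer paying 500 with an expected loss of 400, the optimizer trades a higher margin against a lower renewal probability rather than simply quoting the highest price.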
Mitigating Litigation Risk
A large commercial U.S. property and casualty carrier used DataRobot to develop a machine learning model predicting the likelihood that a workers’ compensation claim would lead to litigation. With these insights available, claims scored with a high probability of litigation are referred to senior claims staff for early and attractive settlement offers. The company estimates that it has avoided 10% of the litigation it would otherwise have experienced, leading to a 25% decrease in the cost of at-risk claims and an estimated value to the company of more than $5 million per year.
Detecting Evolving Fraud
Modern machine learning is far more effective than static rules in detecting ever-evolving methods of fraud. In one case, a large European property and casualty insurer implemented overnight batch runs of auto claims against a model developed using DataRobot. Claims scoring high for probability of fraud are now assigned to a specialized claims fraud investigation team. The company estimates that it has increased the accuracy of its fraud detection by 30%, yielding more than $10 million in value.
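The overnight batch workflow described above reduces to thresholding model scores and routing the results. A minimal sketch; the threshold, data shapes, and team names are illustrative, and in practice the cutoff would be tuned to investigation-team capacity:

```python
def route_claims(scored_claims, fraud_threshold=0.8):
    """Split batch-scored claims: high fraud scores go to the special
    investigation unit (SIU), the rest continue normal handling.
    scored_claims is a list of (claim_id, fraud_score) pairs."""
    siu, normal = [], []
    for claim_id, score in scored_claims:
        (siu if score >= fraud_threshold else normal).append(claim_id)
    return siu, normal
```

Routing on the model score rather than on hand-written rules is what lets the process adapt as the model is retrained on new fraud patterns.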
Capitalizing on Subrogation
Subrogation opportunities are like finding money, but only if you can identify them and act quickly. A continental European motor insurer worked with DataRobot to identify claims with a high probability of subrogation recovery. Claims handlers now receive automated lists of claims with subrogation potential. The company has doubled its subrogation rate from 1.4% to 2.8% of claims and expects annual recoveries of €4-8 million.
DataRobot Helps Insurers With:
Transparent, explainable predictions
DataRobot’s AI Cloud platform is designed for users to understand and explain predictions to customers, executives, and regulators. Factors with predictive value are clearly identified and explained, and “prediction explanation” codes tell users why an applicant received a certain price, score, or recommendation.
Easy platform integration
DataRobot provides the capability for straight-through deployment of analytical models, avoiding the need to reprogram. Users have several options for accessing our AI Cloud platform:
- Applications can interact with DataRobot models directly through the REST API
- DataRobot can export models as Java .jar files
- DataRobot can generate Java or Python scoring source code
- DataRobot can generate a scoring application with its own web-based user interface
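The REST option can be sketched as a minimal client. The endpoint URL, deployment ID, and token below are placeholders, and the exact payload shape depends on your deployment; treat this as an illustration of the integration pattern, not the definitive API contract:

```python
import json
import urllib.request

# Placeholders -- substitute your deployment's endpoint and credentials.
API_URL = "https://example.datarobot.com/deployments/DEPLOYMENT_ID/predictions"
API_TOKEN = "YOUR_API_TOKEN"

def build_payload(records):
    """Serialize a list of record dicts into a JSON request body."""
    return json.dumps(records).encode("utf-8")

def score_records(records):
    """POST records to the deployed model and return its predictions."""
    req = urllib.request.Request(
        API_URL,
        data=build_payload(records),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the model sits behind an HTTP endpoint, policy administration or claims systems can call it without any reprogramming of the model itself.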
Companies using DataRobot often find they can start developing real models by the second day of training. That’s because DataRobot allows “citizen data scientists” — business analysts, actuaries, IT staff, product managers, and underwriting and claims specialists — to help create predictive models without needing formal data science credentials.
Using machine learning in insurance liberates insurers from excessive reliance on overburdened data scientists who can usually respond to only a small fraction of the opportunities and challenges created by predictive analytics.
Managing the lifecycle
The accuracy of models can “drift” with new underwriting and loss experience. DataRobot helps maintain model accuracy by automatically notifying users how far current results diverge from modeled predictions and what factors may be causing the divergence.
This capability is critical: it alerts users to growing drift before its impact shows up in financial statements and costs you your competitive edge, or your job.
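One standard way to quantify how far current data has diverged from the data a model was trained on is the Population Stability Index (PSI). The metric itself is conventional; this particular implementation, with equal-width bins and an epsilon floor, is an illustrative sketch:

```python
import bisect
import math

def psi(expected, actual, bins=10, eps=1e-6):
    """Population Stability Index: compares the distribution of a feature
    (or model score) in a baseline sample against a current sample.
    Larger values indicate more drift; 0 means identical binned shapes."""
    lo, hi = min(expected), max(expected)
    # Equal-width bin edges over the baseline's range.
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def fractions(sample):
        counts = [0] * bins
        for x in sample:
            counts[bisect.bisect_right(edges, x)] += 1
        # Floor at eps so the log below never sees a zero.
        return [max(c / len(sample), eps) for c in counts]

    return sum((a - e) * math.log(a / e)
               for e, a in zip(fractions(expected), fractions(actual)))
```

Running this periodically on each input feature and on the model's output scores gives an early, quantitative drift signal of the kind described above.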
Machine learning can also help insurers build individual loss development (ILD) models — overcoming the limitations of traditional GLMs and analyzing and managing the shifting impact of different loss variables over the life of a claim.
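Reserving “from the bottom up” with an individual-level model reduces to summing predicted ultimates over open claims. A minimal sketch, with a deliberately toy development-factor predictor standing in for a learned ILD model (the factors and field names are illustrative):

```python
def bottom_up_reserve(claims, predict_ultimate):
    """Total reserve built claim by claim: the sum over open claims of
    (predicted ultimate loss minus paid to date)."""
    return sum(predict_ultimate(c) - c["paid_to_date"]
               for c in claims if c["status"] == "open")

def toy_predict_ultimate(claim):
    """Stand-in for a learned model: young claims are expected to develop
    more than mature ones. These factors are illustrative only."""
    factor = 2.0 if claim["age_months"] < 12 else 1.1
    return claim["paid_to_date"] * factor
```

A machine learning model replaces the toy predictor here, letting each claim's reserve reflect its own characteristics rather than an aggregate development triangle.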
Adaptable to “real world” data
DataRobot works with data “as is,” with all the gaps and limitations commonly found in it. There are many insurance applications for machine learning and enterprise AI, and no extensive data preparation is required for you to start creating AI models. DataRobot’s output will help you prioritize your efforts to expand and refine your data.