
1.2. Automation

Insurance pricing teams have a lot of tolerance for doing manual and mundane tasks.

It is possible to automate the majority of pricing tasks. In some cases it may be preferable to keep a manual validation step, but any data processing, reporting or other mundane work should be automated.

As a first pass, tasks can be reduced to scripts - the end goal should be to make these scripts as robust as possible and put them into production.

Data pipelines

Any pipelines for modelling or analysis can be automated - data processing is typically automated up until the point it reaches pricing teams, where things generally become manual.

Rather than infrequent batch loads, incrementally processing and inserting data on a daily basis means pricing teams always have data ready to go.
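As a rough sketch, a daily incremental load might look something like the following - the table names, connection strings and scheduler are placeholders, not a prescription:

```python
import datetime as dt

import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connections - replace with your own source and pricing databases.
source = create_engine("postgresql://user:pass@source-db/warehouse")
target = create_engine("postgresql://user:pass@pricing-db/pricing")


def load_daily_increment(run_date: dt.date) -> None:
    """Pull one day's new and updated rows and append them to the pricing copy."""
    query = text(
        "SELECT * FROM policies "
        "WHERE updated_at >= :start AND updated_at < :end"
    )
    increment = pd.read_sql(
        query,
        source,
        params={"start": run_date, "end": run_date + dt.timedelta(days=1)},
    )
    # Append only the new day's rows rather than rebuilding the whole table.
    increment.to_sql("policies", target, schema="pricing", if_exists="append", index=False)


if __name__ == "__main__":
    # Run from a scheduler (cron, Airflow, etc.) for the previous day.
    load_daily_increment(dt.date.today() - dt.timedelta(days=1))
```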

Look to automate any data that is regularly produced for analysis, reporting, modelling or optimisation.

Reporting

Reporting is often the task handed to interns and graduates. As with data pipelines, the data for reports can be produced automatically, and the output displayed using either dashboarding software such as Power BI or a Python library like Plotly Dash.
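As an illustration, a minimal Plotly Dash report could look like this - the `monthly_summary.csv` extract and its columns are hypothetical stand-ins for whatever the pipeline produces:

```python
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

# Hypothetical extract produced by the automated data pipeline.
summary = pd.read_csv("monthly_summary.csv")  # assumed columns: month, loss_ratio

app = Dash(__name__)
app.layout = html.Div(
    [
        html.H1("Monthly pricing report"),
        dcc.Graph(figure=px.line(summary, x="month", y="loss_ratio", title="Loss ratio")),
    ]
)

if __name__ == "__main__":
    app.run(debug=True)
```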

If the output is to be emailed, libraries such as SendGrid can be used.
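A minimal sketch using the SendGrid Python client - the addresses are placeholders and the API key is assumed to be held in an environment variable:

```python
import os

from sendgrid import SendGridAPIClient
from sendgrid.helpers.mail import Mail

# Placeholder sender and recipient addresses.
message = Mail(
    from_email="pricing-reports@example.com",
    to_emails="pricing-team@example.com",
    subject="Daily pricing report",
    html_content="<p>The latest report is available on the dashboard.</p>",
)

sg = SendGridAPIClient(os.environ["SENDGRID_API_KEY"])
response = sg.send(message)
print(response.status_code)  # 202 means the email was accepted for delivery
```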

Modelling

As above, modelling data can be part of the data pipelines.

For the rest of the modelling lifecycle, it is possible to automate every stage - in the wider data science field, this is often referred to as MLOps.

Training - the main benefit of GBMs over GLMs is that they are much easier to set up for automatic training and re-training, though this is still perfectly possible with GLMs.
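For example, a scheduled retraining job with LightGBM (one GBM library among several) might be as simple as the sketch below - the column names and Poisson frequency setup are assumptions for illustration:

```python
import lightgbm as lgb
import numpy as np
import pandas as pd


def retrain(data: pd.DataFrame) -> lgb.Booster:
    """Refit the frequency model on the latest extract; run on a schedule."""
    features = data.drop(columns=["claim_count", "exposure"])  # assumed column names
    train_set = lgb.Dataset(
        features,
        label=data["claim_count"],
        init_score=np.log(data["exposure"]),  # exposure offset for a Poisson frequency model
    )
    params = {"objective": "poisson", "learning_rate": 0.05, "num_leaves": 31}
    model = lgb.train(params, train_set, num_boost_round=500)
    model.save_model("frequency_model.txt")  # versioning / model registry omitted for brevity
    return model
```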

Validation - model performance on out-of-sample data can be estimated and compared to existing models, and plots and tables can be generated and saved as PDFs.
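A rough sketch of what an automated validation step could look like, assuming a Poisson frequency model and scikit-learn for the metric - the current/challenger comparison and PDF output are illustrative:

```python
import matplotlib.pyplot as plt
from sklearn.metrics import mean_poisson_deviance


def validate(current, challenger, X_test, y_test, report_path="validation_report.pdf"):
    """Compare the current and challenger models on held-out data and save a plot as a PDF."""
    current_pred = current.predict(X_test)
    challenger_pred = challenger.predict(X_test)

    print("current deviance:   ", mean_poisson_deviance(y_test, current_pred))
    print("challenger deviance:", mean_poisson_deviance(y_test, challenger_pred))

    fig, ax = plt.subplots()
    ax.scatter(current_pred, challenger_pred, s=5, alpha=0.3)
    ax.set_xlabel("current model prediction")
    ax.set_ylabel("challenger model prediction")
    fig.savefig(report_path)  # collect several such pages into the validation pack
```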

Deployment - for full automation, models should be served as an endpoint: during rating a call is sent to the API and a prediction is sent back, so updating the model is simply a matter of updating the endpoint. There may be some configuration changes in your rating engine if the model now requires different features, but this could also be updated automatically if the full end-to-end pipeline is within code.
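As one possible setup, a FastAPI endpoint serving a saved LightGBM model might look like this - the feature set, model file and version label are hypothetical:

```python
import lightgbm as lgb
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = lgb.Booster(model_file="frequency_model.txt")  # hypothetical saved model


class Quote(BaseModel):
    # Assumed feature set - mirror whatever the rating engine sends.
    driver_age: int
    vehicle_group: int
    vehicle_value: float


@app.post("/predict")
def predict(quote: Quote) -> dict:
    """Return a prediction for a single quote; the rating engine calls this during rating."""
    features = np.array([[quote.driver_age, quote.vehicle_group, quote.vehicle_value]])
    prediction = float(model.predict(features)[0])
    return {"model_version": "frequency_model_v1", "prediction": prediction}
```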

Monitoring - the easiest way to monitor predictions is to include them in your response during live rating and then pass them back to your database. This makes comparing against actuals much easier once they exist, and can form part of an automated data pipeline and reporting process.
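A minimal sketch of writing predictions back to a database as each quote is rated - the table name and connection details are placeholders:

```python
import datetime as dt

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@pricing-db/pricing")  # hypothetical connection


def log_prediction(quote_id: str, model_version: str, prediction: float) -> None:
    """Write each live prediction back to the database for later comparison with actuals."""
    row = pd.DataFrame(
        [
            {
                "quote_id": quote_id,
                "model_version": model_version,
                "prediction": prediction,
                "predicted_at": dt.datetime.utcnow(),
            }
        ]
    )
    # In practice these writes would usually be batched rather than row-by-row.
    row.to_sql("model_predictions", engine, schema="pricing", if_exists="append", index=False)
```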

Optimisation

Optimisation is a technique for automating the decision making process for specific values.

Pricing is ultimately an optimisation process: pricing teams have multiple targets and limitations, and setting these up as an objective to maximise whilst satisfying constraints can automate much of the decision making.

This means rather than performing analysis to decide how to adjust prices, the analysis is on how to improve the optimisation framework.

Optimisation doesn't need to be particularly complex to deploy: if your pricing is made up of relativities in tables, it can be set up so that the relativities are adjusted by an optimisation algorithm.
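As an illustration, a handful of relativities could be optimised with scipy - the demand model, elasticity and constraint here are simplified assumptions purely to show the shape of the setup:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inputs: one entry per segment with current premium and expected cost.
base_premium = np.array([400.0, 550.0, 700.0])
expected_cost = np.array([300.0, 420.0, 560.0])
elasticity = -1.5  # assumed price elasticity of demand


def conversion(relativities):
    # Toy demand model: conversion falls as the relativity rises above 1.
    return 0.1 * relativities ** elasticity


def negative_profit(relativities):
    premium = base_premium * relativities
    volume = conversion(relativities)
    return -np.sum(volume * (premium - expected_cost))


# Keep total expected volume at or above the current level (all relativities at 1).
volume_floor = {
    "type": "ineq",
    "fun": lambda r: np.sum(conversion(r)) - np.sum(conversion(np.ones(3))),
}

result = minimize(
    negative_profit,
    x0=np.ones(3),              # start from the current relativities
    bounds=[(0.8, 1.2)] * 3,    # limit how far each segment can move
    constraints=[volume_floor],
    method="SLSQP",
)
print(result.x)  # optimised relativities to load back into the rating tables
```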

Testing

Automated testing is a large part of building software.

  • Unit tests - testing individual components and functions of your code.

  • Integration tests - testing that separate components work together correctly, for example that a data pipeline runs end to end and produces the expected output.

These tests should run at least before putting changes into production, but can also be run at various stages during the development process.
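A sketch of what both kinds of test could look like with pytest - the `apply_relativities` function and the `pricing.rating` module are hypothetical stand-ins for your own rating code:

```python
# test_rating.py - run with `pytest` before deploying pricing changes.
import pandas as pd
import pytest

from pricing.rating import apply_relativities  # hypothetical module under test


def test_apply_relativities_matches_manual_calculation():
    """Unit test: a single known quote should get the expected premium."""
    quote = pd.DataFrame([{"base_premium": 400.0, "driver_age_reli": 1.1, "vehicle_reli": 0.9}])
    result = apply_relativities(quote)
    assert result.loc[0, "premium"] == pytest.approx(400.0 * 1.1 * 0.9)


def test_rating_runs_end_to_end():
    """Integration test: the full rating step runs on a small sample without errors."""
    sample = pd.DataFrame([{"base_premium": 400.0, "driver_age_reli": 1.0, "vehicle_reli": 1.0}])
    output = apply_relativities(sample)
    assert not output["premium"].isna().any()
```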

Deployment

Deployment itself can also be automated. For example, this website updates automatically - every time the main repository is updated, a script runs that installs the packages, builds the site and deploys it as a static app.