
6 steps to data quality competency for underwriters

The concept of data quality in an insurance environment is a tricky one. The cost of acquiring good data is often weighed against the speed of underwriting and quoting, and against model correctness. The impact can be huge: incomplete data forces defaults in risk models and pricing, or adds mathematical uncertainty. If not corrected, risk profiles can be wrong, with knock-on effects on pricing and portfolio shape. And correcting data requires substantial personnel resources for cleansing and enhancement.

So, to avoid costly errors, let’s talk about the six steps to data quality competency in underwriting. Done correctly, these steps form a process that is intelligent and adaptive to changing business needs.

 

Profile – Effectively profile and discover data from multiple sources

We’ll start at the beginning. First, you need to understand your data. Where is it from, and in what shape does it arrive – both from internal and external sources? This will identify problem areas, such as external submission data from brokers and MGAs, which is often incomplete. Combine this with internal and service bureau data to get a full picture of the risk. Once data is profiled, you’ll get a very good sense of where your troubles are. Then continue profiling as you bring other sources online, using the same standards of measurement. This exercise will also help you remediate brokers that are not meeting the standard.
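To make this concrete, here is a minimal profiling sketch in Python. It assumes broker/MGA and internal extracts have been combined into one flat file; the file name, the source column and the field names are illustrative assumptions, not a prescribed schema.

```python
# A minimal profiling sketch, assuming broker/MGA and internal extracts have
# been combined into one file; the file name, source column and field names
# are illustrative assumptions.
import pandas as pd

submissions = pd.read_csv("submissions.csv")

# Completeness by source: share of non-null values for each critical field
critical_fields = ["insured_name", "postcode", "sum_insured", "construction_class"]
completeness = (
    submissions.groupby("source")[critical_fields]
    .apply(lambda g: g.notna().mean())
    .round(3)
)
print(completeness)  # one row per broker/MGA or internal source, one column per field
```

A report like this makes it easy to see which sources fall short, which is exactly the evidence you need when going back to a broker about incomplete submissions.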

Measure – Establish data quality metrics and targets

As an underwriter, you will need to determine the quality bar for the data you use. Usually this means flagging the data fields most critical to meeting underwriting guidelines. Establish where you are today, define the desired state, and decide how you will measure the quality of the data against it. And by the way, actuarial and risk teams will likely do the same thing on the same or similar data; over time, these efforts come together as a team.
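As a sketch of what such targets might look like once written down, the thresholds and measured values below are purely illustrative assumptions, not real underwriting metrics.

```python
# A sketch of quality targets for critical fields; the thresholds and the
# measured values are illustrative assumptions, not real underwriting metrics.
targets  = {"postcode": 0.98, "sum_insured": 0.95, "construction_class": 0.90}
measured = {"postcode": 0.91, "sum_insured": 0.97, "construction_class": 0.74}

for field, target in targets.items():
    status = "meets target" if measured[field] >= target else "below target"
    print(f"{field}: {measured[field]:.0%} measured vs {target:.0%} target ({status})")
```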

Design – Quickly build comprehensive data quality rules

This is the meaty part of the cycle, and fun to boot. Look first at your desired future state and your critical underwriting fields. For each one, determine the rules by which you normally fix errant data. How do you validate, cleanse and remediate discrepancies? This may involve fuzzy logic or supporting data lookups, and it can easily be captured. Do this, write it down, and catalogue it so it can be codified in your data quality tool. As you go along, you will compile a growing library of data quality rules for broad use.
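Here is one way such rules might be captured as a small, reusable catalogue. The field names, reference list and matching cutoff are illustrative assumptions, and the fuzzy matching uses Python’s standard library rather than any particular data quality tool.

```python
# A sketch of cataloguing validation and remediation rules as reusable
# functions; the field names, reference list and cutoff are assumptions.
import difflib

VALID_CONSTRUCTION = ["masonry", "frame", "steel", "concrete"]

def clean_postcode(value: str) -> str:
    """Normalise spacing and case; flag the value as missing if nothing usable remains."""
    value = (value or "").strip().upper()
    return " ".join(value.split()) or "MISSING"

def remediate_construction(value: str) -> str:
    """Fuzzy-match a free-text construction description to the reference list."""
    matches = difflib.get_close_matches((value or "").strip().lower(),
                                        VALID_CONSTRUCTION, n=1, cutoff=0.6)
    return matches[0] if matches else "unknown"

# The catalogue maps each critical field to the rule that repairs it
RULES = {"postcode": clean_postcode, "construction_class": remediate_construction}

print(remediate_construction("Masonary"))  # typo in the submission -> "masonry"
```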

Deploy – Provide native data quality services across the enterprise

Once these rules are compiled and tested, they can be deployed as shared services across the organisation. Your institutional knowledge of your underwriting criteria can then be reused to cleanse existing data, new data and everything going forward.
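A minimal sketch of that reuse, assuming the rule catalogue from the design step is a simple mapping of field names to functions; the postcode rule and the sample data are illustrative assumptions.

```python
# A sketch of reusing a rule catalogue on both historic and newly arriving
# data; the postcode rule and the sample frame are illustrative assumptions.
import pandas as pd

def clean_postcode(value) -> str:
    if pd.isna(value):
        return "MISSING"
    return " ".join(str(value).strip().upper().split()) or "MISSING"

RULES = {"postcode": clean_postcode}

def apply_rules(frame: pd.DataFrame) -> pd.DataFrame:
    cleaned = frame.copy()
    for field, rule in RULES.items():
        if field in cleaned.columns:
            cleaned[field] = cleaned[field].map(rule)
    return cleaned

# The same function cleans the back book and each new submission batch
existing = pd.DataFrame({"postcode": [" ec1a  1bb ", None]})
print(apply_rules(existing))
```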

 

Review – Assess performance against goals

Remember those data quality goals you set when you started? Check how you’re doing. After a few weeks and months, you should be able to profile the data, run reports and see that the needle has moved. You can now identify new issues to tackle and adjust the rules that aren’t working. Over time, you will also want to track the business results: higher quote flow, better productivity and more competitive premium pricing.
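One way to keep score is to compare each period’s measurements against the baseline and the target. The figures below are illustrative assumptions, not real portfolio numbers.

```python
# A sketch of tracking quality metrics over time; all numbers are
# illustrative assumptions.
baseline = {"postcode": 0.78, "sum_insured": 0.91}
targets  = {"postcode": 0.98, "sum_insured": 0.95}
latest   = {"postcode": 0.93, "sum_insured": 0.96}

for field in targets:
    moved = latest[field] - baseline[field]
    gap = max(targets[field] - latest[field], 0)
    print(f"{field}: +{moved:.0%} since baseline, {gap:.0%} to target")
```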

Monitor – Proactively address critical issues

Now monitor constantly. As you bring new MGAs online, receive new underwriting guidelines or launch into new lines of business, you will repeat this cycle. You will also apply the same rule set as portfolios are acquired; it becomes a good way to sanity-check acquired business against your quality standards.
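As a sketch, an ongoing check might score each incoming batch – a new MGA feed or an acquired portfolio – against the same targets before it enters the book. The batch name, scores and targets below are assumptions.

```python
# A sketch of an ongoing monitoring check; batch names, scores and targets
# are illustrative assumptions.
def check_batch(batch_name, scores, targets):
    """Return an alert for each field that falls below its quality target."""
    return [
        f"{batch_name}: {field} at {scores.get(field, 0.0):.0%}, target {target:.0%}"
        for field, target in targets.items()
        if scores.get(field, 0.0) < target
    ]

alerts = check_batch(
    "acquired_portfolio_q3",
    scores={"postcode": 0.82, "sum_insured": 0.97},
    targets={"postcode": 0.98, "sum_insured": 0.95},
)
print("\n".join(alerts) or "all checks passed")
```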

 

In case it wasn’t apparent, your data quality plan is now largely automated. With few manual exceptions, you should no longer be remediating data the way you did in the past. Each of these steps carries obvious business value. In the end, it all adds up to better risk and catastrophe modelling, more accurate risk pricing, cleaner data for everyone in the organisation, and more time spent on the core business of underwriting. Imagine increasing your quote volume simply by not needing to muck around in data. Imagine improving your quote-to-bind ratio through better pricing. This is the real magic that lies within data quality.
