APIs in specialty insurance


The specialty insurance industry has its fair share of three-letter acronyms and confusing terminology (such as binders and lineslips¹), and so it would seem the perfect place to introduce an API (Application Programming Interface) - whatever that is!

History of APIs

The concept of an API actually dates back to the 1940s, when computer programmers needed to catalogue what different software programmes did. These ‘interfaces’ could then be used to design how programmes could be combined to produce more elaborate software. Initially, they were devised to build applications, and so the term Application Programming Interface was coined. With the dawn of the internet, the use of the term has broadened to describe the data interface for passing information between remote systems: web APIs. The rise of web APIs was driven by the commercial benefits of extending the reach of products and services to third parties, pioneered in the early 2000s by Salesforce, eBay and Amazon. APIs are now ubiquitous in the information you receive via your computer or smartphone: from social media to the control of your home thermostat.

Potential benefits

Given that the power of APIs lies in transferring data between technology systems and platforms, they have the potential to make a huge impact across specialty insurance in areas such as:

  • Efficiency:
    • reduce/negate the manual effort in rekeying data between systems
    • enable automation of certain workflows
  • E&O: reduce/remove risk of discrepancies between systems.
  • Renewals:  reduce the manual effort during key renewal periods.  
  • Portfolio management: extracting data enables better Management Information (MI).
  • Data and analytics: as well as policy administration systems, data can be passed to/from analytics tools e.g. exposure management.


Banking versus specialty insurance

Whilst some progress has been made, the market remains largely un-integrated². Comparison is often made with what has been achieved in open banking. Launched in 2017 by the Competition and Markets Authority (CMA), open banking enables consumers and businesses to share their transaction data securely with third parties. This allows businesses to innovate and provide new services, enabling consumers to make better decisions and saving them time and money. By January 2023, the six largest banking providers in the UK had fully implemented the standards set to deliver open banking³.

Specialty insurance is quite different: rather than a simple financial transaction, it involves a broad range of data points that reside within a contract negotiated between multiple parties. The complexity increases with the number of sections within the contract and the associated information used to evaluate the risk, e.g. a schedule of values for the insured assets. Further complexity is introduced through scenarios like verticalisation, line conditions and the addition of subjectivities. Calculations, factoring in variables such as deductions, brokerage and taxes, then need to be performed to determine the allocation of premium to the recipients. This is typically done once the contract is bound, with operational teams across the participants also dealing with the receipt of monies in bulk, across a number of risks. In Lloyd’s, more than 50% of gross written premium remained outstanding at the end of 2022⁴. Specialty insurance certainly presents more complex challenges than those of banking.
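To make the premium calculation concrete, here is an illustrative sketch in Python. The field names, percentages and rounding convention are invented for illustration only (real market calculations also handle multiple sections, currencies and settlement terms); it simply shows a gross premium being split across underwriters by signed line, net of brokerage, with tax computed on the gross amount:

```python
# Illustrative premium allocation: field names, percentages and the
# tax basis are hypothetical, not market-standard rules.

def allocate_premium(gross_premium, signed_lines, brokerage_pct, tax_pct):
    """Split a gross premium across underwriters by signed line,
    net of brokerage, with tax calculated on the gross amount."""
    net_of_brokerage = gross_premium * (1 - brokerage_pct / 100)
    return {
        "brokerage": round(gross_premium * brokerage_pct / 100, 2),
        "tax": round(gross_premium * tax_pct / 100, 2),
        "underwriter_shares": {
            underwriter: round(net_of_brokerage * line_pct / 100, 2)
            for underwriter, line_pct in signed_lines.items()
        },
    }

# Example: a 100,000 premium, two underwriters signed at 60%/40%,
# 20% brokerage and 12% premium tax (all figures invented).
result = allocate_premium(100_000, {"Syndicate A": 60, "Syndicate B": 40}, 20, 12)
```

Even a toy version like this shows why the data must be agreed between the parties before the calculation runs: every output depends on structured inputs (signed lines, brokerage, tax rates) being consistent across systems.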

So, it would be fair to question: will specialty insurance ever become integrated through APIs, with data seamlessly flowing through the marketplace? 

Evolution versus revolution

Within the London Market, the Blueprint 2 programme has been established to bring about the digital flow of data for placement and claims, by establishing data standards, API standards and a gateway through which to transmit this data. The intention of the Core Data Record (CDR) is to provide the critical transactional data which needs to be collected by the point of bind to drive downstream processes: premium validation and settlement, claims matching at first notification of loss, tax validation and reporting. This is a large dataset with potentially over 200 fields⁵, and so will require a huge amount of investment and collaboration; as the Lloyd’s website states, it is an ‘ambitious strategy’⁶. Whilst achieving the Blueprint 2 vision will be transformative and should remain an ultimate goal for the London Market, a more incremental route would enable specific benefits to be targeted and realised. This can be done by using much smaller datasets – an evolutionary rather than revolutionary approach.

Innovation platform

In order for the market to evolve, three key capabilities need to be in place:

  1. A placing platform: central platform where the data within the risk is negotiated and agreed by all parties.
Manual re-keying from upstream systems and into downstream systems can only be avoided once consistency is achieved in the centre.
  2. A mechanism to capture the structured data explicitly, as part of the placement workflow e.g. Class of Business, Inception Date, Expiry Date, Limits etc.
    Explicit data entry directly into the contract is a superior approach to AI-driven data extraction.  This is because the extraction and/or human checks may introduce errors, and human intervention also prevents seamless automation.
  3. A comprehensive set of APIs, with a mechanism to listen to events on the placing platform and trigger these APIs:
    1. to access structured data within the risk (both read and write) and pass information between systems, and
    2. to trigger workflows, so that processes can be automated

This provides brokers and underwriters with a means not only to build out integrations with their own systems, but also to collaborate to automate existing processes and/or build innovative new ones.  The provision of a non-production environment is also key, so that new solutions can be explored and tested.
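As a sketch of the third capability, the Python snippet below shows the shape of an event-driven integration: a handler reacts to a (hypothetical) "risk bound" event, reads the structured risk data back through a read API, and writes it into a downstream policy administration system. Every event name, field and function here is invented; a real integration would follow the placing platform's published API specification:

```python
# Hypothetical event-driven integration sketch. Event names, payload
# fields and client interfaces are illustrative, not a real platform API.

def on_risk_bound(event, fetch_risk, pas_client):
    """When a 'risk bound' event arrives, read the structured risk
    data and create a policy record in a downstream PAS."""
    risk = fetch_risk(event["risk_id"])   # read API: structured risk data
    pas_client.create_policy(             # write into the downstream system
        risk_id=risk["risk_id"],
        class_of_business=risk["class_of_business"],
        inception=risk["inception_date"],
        expiry=risk["expiry_date"],
        limit=risk["limit"],
    )
    return "policy created"

# Minimal stand-ins to show the flow without a network call.
def fake_fetch_risk(risk_id):
    return {"risk_id": risk_id, "class_of_business": "Property",
            "inception_date": "2024-01-01", "expiry_date": "2024-12-31",
            "limit": 5_000_000}

class FakePAS:
    def __init__(self):
        self.policies = []
    def create_policy(self, **fields):
        self.policies.append(fields)

pas = FakePAS()
status = on_risk_bound({"type": "risk.bound", "risk_id": "R-123"},
                       fake_fetch_risk, pas)
```

The point of the sketch is the pattern, not the detail: events on the placing platform trigger API calls, and the structured data flows onward without anyone re-keying it.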



Making a start

There are a number of ways to build up to a comprehensive set of integrations, with the route taken being specific to a particular broker, underwriter or collaboration effort. 

A simple, initial implementation for brokers and underwriters might be to extract written and signed line information and feed this into a data warehouse to assist with portfolio analysis.

A next step for underwriters could be to examine a few key elements of the contract to determine whether the risk is out of scope and automate a decline, or flag it for referral with a reason to guide the underwriter when they come to review the risk.  Examples of data used to drive such rules are risk location and/or limits, as well as generic checks on items such as inception date.  Whilst this is an underwriter implementation, it will provide brokers with a quicker response, and so ultimately lead to a better service for customers.
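Such screening rules can be very small. The sketch below shows the idea in Python; the territories, thresholds and field names are invented for illustration and are not any underwriter's actual appetite:

```python
# Illustrative out-of-scope screening: thresholds and field names are
# hypothetical, not any underwriter's real rules.
from datetime import date

EXCLUDED_TERRITORIES = {"Country X"}   # hypothetical appetite exclusion
MAX_LIMIT = 10_000_000                 # hypothetical delegated-authority cap

def screen_risk(risk):
    """Return ('decline' | 'refer' | 'proceed', reason) from structured data."""
    if risk["territory"] in EXCLUDED_TERRITORIES:
        return "decline", "territory outside appetite"
    if risk["limit"] > MAX_LIMIT:
        return "refer", "limit exceeds delegated authority"
    if risk["inception_date"] < date.today():
        return "refer", "inception date is in the past"
    return "proceed", "within scope"

# Example: a risk in an excluded territory is declined automatically,
# with a reason the broker receives straight away.
decision, reason = screen_risk({"territory": "Country X",
                                "limit": 1_000_000,
                                "inception_date": date(2999, 1, 1)})
```

Because the decision comes with a reason, the broker gets an immediate, explainable response rather than waiting for a manual review.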

By working together brokers and underwriters can build more elaborate automations, such as appetite matching.  Appetite matching is where an underwriter builds rules against the structured data within the contract to then automate guidance to brokers in an unsolicited manner.  Brokers can control from whom to receive this unsolicited feedback and also implement the underwriting rules for their own in-house facilities and MGAs, to help maximise the usage of these placement vehicles. 
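A minimal sketch of appetite matching is shown below. The carrier names, classes and limits are invented; in practice each underwriter would maintain their own rule set against the platform's structured fields, and brokers would choose whose guidance to receive:

```python
# Hypothetical appetite rules keyed by class of business and limit;
# carrier names and appetites are invented for illustration.
APPETITES = {
    "Carrier A": {"classes": {"Property", "Marine"}, "max_limit": 25_000_000},
    "Carrier B": {"classes": {"Cyber"}, "max_limit": 5_000_000},
}

def match_appetite(risk, appetites=APPETITES):
    """Return carriers whose published appetite the risk falls within."""
    return sorted(
        name for name, rules in appetites.items()
        if risk["class_of_business"] in rules["classes"]
        and risk["limit"] <= rules["max_limit"]
    )

# Example: a Property risk within limit matches only Carrier A's appetite.
matches = match_appetite({"class_of_business": "Property",
                          "limit": 10_000_000})
```

The same matching logic could be pointed at a broker's in-house facilities and MGAs, helping to route risks to those placement vehicles automatically.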

Automation of follow lines⁷ can also be achieved with a relatively small amount of data, compared to that needed for a full CDR record.  This will speed up the placement process on risks with automations in place and free up underwriter time to focus on more complex risks.

As data sets grow, integrations can be built to PAS systems⁸, to minimise/avoid having to re-key information. The addition of insurable interests opens up automated rating and the option to transfer information to data and analytics platforms.

Whitespace provides this mechanism and is working with customers to build all of the above⁹, with a number of examples already in production.

Once a contract contains enough structured data to define financial transactions, and there is a mechanism to agree them, APIs accessing online payment systems can collect the premium from the customer, and automatically distribute it to the broker and underwriter(s) on a risk-by-risk basis.


Evolving API integrations over time


The use of APIs gives specialty insurance the opportunity to operate far more effectively and provide more value to customers.  Solutions can be built in an iterative manner and use data sets that evolve from minimal to complete. As discipline and confidence builds around the use of structured data, it is not inconceivable that APIs will be used to automatically trigger payments, as has been done in banking.

Here at Whitespace and through our partners Vega IT, we are ready to support our customers to start their integration journeys, to help drive the specialty insurance market into a digital and data-led future.


Stay in touch