APIs – how to mitigate interconnectivity risk

Nick Mair, co-founder of London Market data assurance platform DQPro, asks the question: what does best practice look like for APIs?

Following a very rapid proliferation over the last few years, application programming interfaces (APIs) are now a critical part of all modern smartphone, online and cloud SaaS applications, covering customer-facing and internal functions across industries globally. Quite simply, we can’t do without them.

Back in 2020, Lloyd’s announced the launch of a new API to facilitate smoother electronic placement of risks, a meaningful step in the API expansion that has since included the development of an API roadmap to support Lloyd’s future plans around Blueprint 2.

The benefits are clear – the frictionless flow of data between systems and businesses, connecting multiple applications and bringing new sources of data into the market to improve electronic placement, risk understanding, operational efficiency and ultimately profitability. 

It’s a huge leap in collaboration and data sharing that empowers a straight-through experience, all driven quietly by powerful APIs working in the background to enable different multi-stakeholder systems to “talk” to each other. 

However, before we skip off into the sunny uplands of seamless interconnectivity, it would be remiss not to pause for thought and ask ourselves: are there security risks posed by APIs? What does API best practice look like?

The answer to the first question is of course yes: like any technology, APIs pose significant security risks if they are not managed appropriately and implemented with reputable, proven partners, and (re)insurers must be aware of this. These risks include excessive data exposure, security misconfiguration and insufficient monitoring – all of which can leave (re)insurers’ systems vulnerable to cyber-attack or data loss.
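To make the first of those risks concrete, excessive data exposure typically happens when an API returns a full internal record rather than only the fields an external consumer needs. The sketch below illustrates the principle with an explicit field whitelist; the record shape and field names are hypothetical, not drawn from any real market system.

```python
# Illustrative sketch only: the record shape and field names are
# hypothetical, not from any real London Market system.

FULL_POLICY_RECORD = {
    "policy_id": "POL-001",
    "insured_name": "Example Co",
    "premium": 125000.0,
    "broker_commission": 0.15,       # internal: should not leave the system
    "underwriter_notes": "flagged",  # internal: should not leave the system
}

# Returning the full record over an API exposes internal fields
# (excessive data exposure). An explicit whitelist avoids this.
PUBLIC_FIELDS = {"policy_id", "insured_name", "premium"}

def public_view(record: dict) -> dict:
    """Return only the fields an external consumer is entitled to see."""
    return {k: v for k, v in record.items() if k in PUBLIC_FIELDS}
```

The design point is that the safe set of fields is stated explicitly, so adding a new internal field to the record cannot silently widen what the API exposes.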

How to mitigate API risk

So what does best practice look like when it comes to API security? First, it’s important to work with API partners who are trusted and well-established in the market. An API’s function should be discussed across the technology teams responsible for the respective systems before it is implemented, in particular with reference to sharing only necessary information, access controls, and authentication controls such as encrypted requests and responses. Ultimately, organisations deploying APIs should request visibility and assurances over their API inventory and the behaviour of the APIs themselves.
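One common form the authentication controls mentioned above can take is request signing with a shared secret, so the receiving system can confirm that a message really came from its partner and was not altered in transit. This is a minimal sketch of that idea using an HMAC over the request body; the signing scheme and key handling are assumptions for illustration, not any particular platform’s API.

```python
import hmac
import hashlib

# Minimal sketch of a shared-secret HMAC check on an incoming request
# body. The scheme and names are illustrative assumptions.

SHARED_SECRET = b"example-secret"  # in practice: loaded from a secrets manager

def sign(body: bytes) -> str:
    """Compute the hex HMAC-SHA256 signature of a request body."""
    return hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    """Check a received body against its claimed signature."""
    # compare_digest avoids leaking timing information during comparison
    return hmac.compare_digest(sign(body), signature)
```

A tampered body fails verification, which is exactly the assurance the respective technology teams would want to agree on before an integration goes live.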

Working with partners who have carefully considered and mitigated the security risk of APIs is critical. But above and beyond this, it’s also about validating the data being shared between systems – poor quality data within a single in-house system is bad enough, but the impact of poor quality data being transferred across the market is significant.
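In practice, validating data at the system boundary can be as simple as checking that a record carries the agreed fields with the agreed types before it is transmitted. The sketch below shows that pattern; the field names are hypothetical placeholders, not a market data standard.

```python
# Minimal sketch of validating a record before it crosses a system
# boundary. The required fields here are hypothetical examples.

REQUIRED_FIELDS = {"umr": str, "inception_date": str, "premium": float}

def validate(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"wrong type for field: {field}")
    return errors
```

Rejecting a bad record at the sending system is far cheaper than reconciling it after it has propagated across several counterparties’ systems.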

This is why data standards are also vitally important to the market as it continues its digital evolution. Discussions we have held with managing agents, brokers, insurers and MGAs indicate strong demand for data monitoring capabilities that are compatible with third-party systems (e-placing platforms, underwriting workbenches, policy systems and other common applications) at any stage of the placing process, including comprehensive data checks at the pre-bind stage.

Sharing high quality data

Getting this right leads to more accurate underwriting data at source, reduced back office cost, and more profitable underwriting, with an enhanced ability to catch data errors before they impact the bottom line. 

Done properly, integrating APIs that allow (re)insurer systems to ‘talk’ to each other and share data delivers significant value: system users gain flexibility in how and where they integrate their data services, and policyholders benefit from a more joined-up service thanks to a truly flexible digital ecosystem.

Global enterprises are more dependent on APIs than ever before, which is why it’s important to address any potential API security gaps and recognise the need for comprehensive data validation. After all, there is no digital ecosystem and no nimble innovation without APIs.