Over my last twenty years selling complex IT systems into the regulatory space, I have learned that one of the weightiest decision criteria for any organisation is how futureproof the system being purchased is. Will it keep up with regulatory changes? Can it build new algorithms to detect shifts in criminal behaviour? Who wants to spend millions of dollars and months, possibly years, implementing an initiative that becomes outmoded soon thereafter? Typically, a firm expects a system's lifespan, with updates and modifications, to be around a decade.
However, what is rarely considered is how accurate the data I have spent months integrating into the new solution will be in 12 months, or even 5 years. A phrase I have heard repeatedly about data integration is "garbage in, garbage out". Pretty obvious, really.
We spend months identifying the right sources and (sometimes) checking the quality; then, once the required results are refined, the data is never looked at again. The reason we check data quality prior to integration is that we know there could be gaps, or the data could be out of date. If we pay little attention to our data once it is integrated, the same issues recur, and the millions spent on our bright and shiny new system, the one we spent so long confirming was futureproof, are wasted.
How often data changes is evidenced in a report by MelissaData, according to which 14.19 percent of the US adult population moves each year. Add in changes to corporate ownership structures, deaths, and regulatory changes such as Adverse Data and PEP status, and it is conservative to assume that 20% of all data changes yearly. If we assume an organisation catches 50% of these changes each year, then within 5 years 50% of the data a system operates on could be out of date. Obviously, this significantly reduces its effectiveness.
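The arithmetic behind that estimate can be sketched in a few lines. The figures below are the article's own assumptions (20% yearly change, 50% caught); the linear estimate matches the 50%-in-5-years claim, while a compounded version, where a record can only go stale once, gives a slightly lower figure.

```python
# Assumptions taken from the article above:
yearly_change = 0.20   # share of records that change each year
catch_rate = 0.50      # share of those changes the organisation catches

uncaught_per_year = yearly_change * (1 - catch_rate)  # 10% per year

# Simple linear accumulation over 5 years, as in the article:
stale_linear = 5 * uncaught_per_year
print(f"Linear estimate after 5 years: {stale_linear:.0%}")        # 50%

# Compounded estimate (each record can only go stale once):
stale_compounded = 1 - (1 - uncaught_per_year) ** 5
print(f"Compounded estimate after 5 years: {stale_compounded:.0%}")  # 41%
```

Either way, a large fraction of the data the system depends on silently decays within its expected lifespan.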
To illustrate the point: leading financial institutions are currently spending millions on data remediation projects, undertaken because their data is both out of date and no longer relevant.
Any solution or initiative is only as good as the data put into it. ICX4 understands that getting the most from any system means ensuring all internal data is up to date; therefore, all data used within key systems must be futureproofed. Over several years of working closely with a large multi-service financial institution to build a single client view, we had an idea: what if we could automatically match a company's internal customer data to the most up-to-date information provided by external sources, whether regulatory, ID&V, Land Registry, corporate registry, credit agency, or market data? This ensures the data held is as relevant as possible.
We also recognised that up-to-date, relevant data not only instantly improves the effectiveness of current systems, but also helps avoid future remediation projects, significantly reduces the need for periodic KYC reviews, reduces the effort of meeting regulatory requirements and, through the clarity it provides, improves decision making throughout the organisation. To address this very clear issue, ICX4 built a solution called businessDNA. Currently, this solution can match and update an organisation's internal data against over 200 external data sources in Data Real Time*.
*Data Real Time means within a few minutes of new information being published by the external list providers.
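The match-and-update idea described above can be sketched in miniature. This is a hypothetical illustration only: the record structure, field names, and `apply_external_update` function are assumptions for the example, not businessDNA's actual API. The core idea is that an incoming update from an external source is compared field by field against the internal record, and only genuine changes are applied and logged.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CustomerRecord:
    """A simplified internal customer record (illustrative only)."""
    customer_id: str
    fields: dict                      # e.g. {"address": "...", "pep_status": False}
    last_updated: datetime = field(default_factory=datetime.now)

def apply_external_update(record: CustomerRecord, source: str, update: dict) -> list:
    """Merge newer values from an external source; return the fields that changed."""
    changed = []
    for key, value in update.items():
        if record.fields.get(key) != value:
            record.fields[key] = value
            changed.append(key)
    if changed:
        record.last_updated = datetime.now()
    return changed

# Example: a corporate registry publishes a new address for customer C-001.
record = CustomerRecord("C-001", {"address": "1 Old St", "pep_status": False})
changed = apply_external_update(record, "corporate_registry", {"address": "2 New Ave"})
print(changed)  # ['address']
```

A real pipeline would add entity resolution (matching records without a shared key), per-source trust rules, and an audit trail, but the principle is the same: internal data stays aligned with whatever the external sources last published.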