Internet folklore decrees that 90% of the world’s data has been created in the last two years, and that this newly created delta orbits us at the speed of light. Even employing the ‘believe half of what you see and none of what you hear’ premise, doubling the creation time and halving the speed, it still equates to a previously unfathomable volume of data encircling the globe at a hitherto unfeasible rate.
So, why is there so much delta, and why is it travelling so fast? Because it’s digital, and who waits for anything? Data is information, information is knowledge, and, well, we all know what that is.
Data has always been a precious commodity, and now it is growing ever closer to becoming not only a currency but the currency. The concept of data as currency is not new: it is now over a decade since Clive Humby, speaking at a senior marketers’ summit, made the famous statement ‘Data is the new oil’. Mr Humby went on to say:
‘Data is just like crude. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analysed for it to have value.’
Given the fortune Mr Humby amassed devising the Tesco Clubcard, he clearly appreciated the value of data, but could even he, hand on heart, claim to have foreseen the size of the bomb set to explode? For not only has the Digital Age empowered us to record and analyse delta on a scale previously unimaginable, it has made the creation of it relatively simple. Is it any wonder there is now so much of it?
How much data is there? Google alone handles 3.5 billion searches a day and 1.2 trillion a year. Total internet traffic is predicted to surpass 1.4 zettabytes in 2017. It is possible these figures will have been exceeded by the time of writing, and highly probable by the time of reading. After all, over half the world’s population contributes.
Despite the profound influence Big Data currently exercises over our lives, the utilisation of this phenomenon is still in its infancy.
ICX4 deal in the currency of Big Data and specialise in salvaging treasure sunk in legacy systems and siloed data. We ensure no great white sharks lurk unseen in the galleons of the analogue age…
Over my last twenty years selling complex IT systems into the regulatory space, I have realised you can guarantee that one of the weightiest decision criteria an organisation will have is how futureproof the system being purchased is. Will it keep up with regulatory changes? Can it build new algorithms to detect changes in criminal behaviour? Who wants to spend millions of dollars and months, possibly years, implementing an initiative that becomes outmoded soon thereafter? Typically, a firm would expect a system’s lifespan, with updates and modifications, to last a decade.
However, what never seems to be considered is how accurate the data I’ve spent months integrating into the new solution will be in 12 months, or even 5 years. A phrase I have heard repeatedly in relation to data integration is ‘garbage in, garbage out’. Pretty obvious, really.
We spend months identifying the right sources and checking the quality (sometimes); then, once the required results are refined, the data is never looked at again. The reason we check the quality of data prior to integration is that we know there could be gaps, or it could be out of date. If we pay little attention to our data once it is integrated, the same issues recur, and the millions spent on our bright and shiny new system, the one we spent so long checking was futureproof, are wasted.
How often data changes is evidenced in a report by MelissaData, according to which 14.19 per cent of the US adult population moves each year. Add in changes to corporate ownership structures, deaths, and regulatory changes such as adverse media and PEP status, and it is conservative to assume that 20% of all data changes yearly. If an organisation catches 50% of this each year, that still means that within 5 years, 50% of the data a system is operating off could be out of date. Obviously, this significantly reduces its effectiveness.
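To make that arithmetic concrete, here is a minimal sketch assuming the 20% change rate and 50% catch rate above. Compounding the residual 10% yearly decay gives roughly 41% stale records after five years; the simpler linear view (5 × 10%) gives the 50% headline figure:

```python
# Sketch: how data staleness accumulates, assuming a 20% yearly
# change rate and a 50% catch rate (the figures quoted above).
yearly_change = 0.20                           # records that change each year
catch_rate = 0.50                              # changes the organisation catches
net_decay = yearly_change * (1 - catch_rate)   # 10% of records go stale per year

fresh = 1.0                                    # proportion still accurate
for year in range(1, 6):
    fresh *= (1 - net_decay)                   # compounding: stale stays stale
    print(f"Year {year}: {1 - fresh:.0%} of records out of date")

# Year 5 prints ~41% under compounding; the linear view gives 50%.
```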
To illustrate my point, leading financial institutions are currently spending millions on data remediation projects, undertaken because their data has fallen out of date and out of relevance.
Any solution or initiative is only as good as the data inputted. ICX4 understand that getting the most from any system means ensuring that all internal data is up to date; therefore, all data used within key systems must be futureproofed. Over several years working closely with a large multi-service financial institution to build a single client view, we had an idea: what if we could automatically match a company’s internal customer data to the most up-to-date information provided by external sources, whether regulatory, ID&V, Land Registry, corporate registry, credit agency, or market data? This ensures the data held is as relevant as possible.
We also recognised that up-to-date, relevant data not only makes an instant improvement to the effectiveness of current systems but helps avoid the need for future remediation projects, significantly reduces the requirement for periodic KYC reviews, reduces the effort of meeting regulatory requirements and, through the clarity provided, improves decision making throughout the corporation. To address what was a very clear issue, ICX4 built a solution called businessDNA. Currently, this solution can match and update an organisation’s internal data against over 200 external data sources in Data Real Time*.
*Data Real Time means within a few minutes of new information being published by the external list providers.
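To illustrate the idea only (the matching logic inside businessDNA is, of course, far richer than this), here is a minimal sketch using hypothetical field names and a stubbed external feed standing in for live registry and list-provider APIs:

```python
# Illustrative sketch only: hypothetical field names and a stubbed
# external feed stand in for real registry / list-provider APIs.
from dataclasses import dataclass

@dataclass
class ClientRecord:
    company_number: str        # registry identifier used as the match key
    name: str
    registered_address: str

# Stub external source; in practice this would be a live API call.
EXTERNAL_FEED = {
    "01234567": ClientRecord("01234567", "Example Holdings Ltd", "1 New Street"),
}

def refresh(internal: list[ClientRecord]) -> list[tuple[str, str]]:
    """Return (company_number, field) pairs where internal data
    has drifted from the latest external record."""
    drift = []
    for rec in internal:
        ext = EXTERNAL_FEED.get(rec.company_number)
        if ext is None:
            continue                           # no external match found
        for field in ("name", "registered_address"):
            if getattr(rec, field) != getattr(ext, field):
                drift.append((rec.company_number, field))
    return drift

# Usage: a stale internal record is flagged for update.
stale = [ClientRecord("01234567", "Example Holdings Ltd", "9 Old Lane")]
print(refresh(stale))   # [('01234567', 'registered_address')]
```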
Quite naturally, this explosion of data has led to a situation where organisations have data housed in countless databases that:
I) Do not speak to each other
II) Often house data on the same clients, and in many cases different data on the same clients (see the sketch after this list)
III) Are so old they have no schema
IV) Have cross-dependencies nobody fully understands
V) Create performance bottlenecks
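Point II in particular is easy to demonstrate. A minimal sketch, with two hypothetical silos invented purely for illustration, shows how conflicting records on the same client surface the moment the silos are compared side by side:

```python
# Minimal sketch: two hypothetical silos holding different data
# on the same client, and a check that surfaces the conflicts.
crm = {"C-1001": {"name": "Acme Ltd", "address": "12 High St"}}
kyc = {"C-1001": {"name": "ACME Limited", "address": "12 High Street"}}

def conflicts(a: dict, b: dict) -> dict:
    """Return, per shared client, the fields on which two silos disagree."""
    out = {}
    for client in a.keys() & b.keys():
        diff = {f: (a[client][f], b[client][f])
                for f in a[client].keys() & b[client].keys()
                if a[client][f] != b[client][f]}
        if diff:
            out[client] = diff
    return out

print(conflicts(crm, kyc))
# e.g. {'C-1001': {'name': ('Acme Ltd', 'ACME Limited'),
#                  'address': ('12 High St', '12 High Street')}}
```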
It’s blatantly obvious to any business that understanding your organisation’s data, where it sits, how it flows and what is contained in every database, can open up a host of benefits: improved operational efficiency, a single customer view, reduced regulatory exposure, and cost, time and people savings on any new IT or business initiative.
It is also very clear that each benefit would itself produce a strong return on investment. So why have most organisations yet to address the issue?
In our opinion there are a host of reasons, all of which are understandable and in many cases make sound business sense: competing priorities; initiatives that cross business lines, leaving no clear sponsor; no idea where to start; and older systems that have no data schema, built by people who have since left the business or even retired, which increases the risk of any data project.
The purpose of this blog is to highlight an alternative approach to addressing the problem: delivering meaningful gains to key strategic projects in line with business demands, whilst all the time moving towards the enterprise single data model.
We held countless meetings with Tier 1 financial institutions, trying to persuade them that the achievements ICX4 has had at two of the world’s top multi-faceted financial institutions could be replicated across their business. These met with little success beyond some nice conversations that would initiate debate but eventually lead nowhere, for all the reasons highlighted in the introduction. We needed to find a less complicated approach to what is too often perceived as a mammoth task.
Our conclusion was simple: take our global business model and focus it on one problem at a time, minimising the risk of a complete architecture redesign while adding real value by addressing the next significant business challenge, then the next, then the next, populating the Master Data Model as we go.
Yes, this does mean the business is effectively agreeing that our enterprise data model is the right approach; however, the risk of a complete leap of faith into the unknown is drastically reduced, and business value is achieved almost immediately.
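As a minimal sketch of the incremental idea, using a hypothetical MasterModel class invented for illustration: each project tackled contributes its mapped attributes to the shared model, so coverage grows bite by bite rather than through a big-bang redesign.

```python
# Hypothetical sketch: a master data model populated incrementally,
# one business problem ("bite") at a time.
class MasterModel:
    def __init__(self):
        self.entities: dict[str, dict] = {}    # golden records by client id

    def ingest(self, source_name: str, records: dict[str, dict]) -> None:
        """Merge one source system's records into the master model,
        keeping attributes already mastered by earlier projects."""
        for client_id, attrs in records.items():
            golden = self.entities.setdefault(client_id, {})
            for field, value in attrs.items():
                golden.setdefault(field, value)  # first mastered value wins

model = MasterModel()
# Bite 1: a KYC remediation project contributes identity attributes.
model.ingest("kyc", {"C-1001": {"name": "Acme Ltd", "pep": False}})
# Bite 2: a credit-risk project adds its attributes to the same model.
model.ingest("credit", {"C-1001": {"rating": "BBB", "name": "ACME Limited"}})
print(model.entities["C-1001"])
# {'name': 'Acme Ltd', 'pep': False, 'rating': 'BBB'}
```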
So how do you eat the enterprise data elephant? One bite at a time.