By Liqueo Director, Warren Truss.
Outsourcing is a term that resonates with most people working in the asset and wealth management industry today, and we are unlikely to see a reduction in outsourcing initiatives any time soon. Outsourcing non-core functions enables asset and wealth management companies to focus on their core competencies and strategic initiatives. The aspiration is that leveraging the tangible benefits outsourcing provides, such as Cost Efficiency, Scalability, Time Savings and Technology & Innovation advancements, could, over time, lead to better overall company performance.
While outsourcing offers numerous benefits, it also comes with certain risks. These include loss of control over outsourced processes, potential data quality concerns, and the need for effective service management processes. Given these risks and benefits, asset managers need to weigh their outsourcing decisions carefully against their specific business drivers, without forgetting their fiduciary responsibilities and regulatory obligations.
Good data quality is essential to the health and success of any business that considers itself a data-driven organisation. Poor data quality can have a range of negative impacts on a business, including financial losses, reputational damage, diminished shareholder confidence, poor staff retention and regulatory breaches.
A 2016 IBM study estimated that bad data costs the U.S. economy a staggering $3.1 trillion a year. Similarly, a 2020 Gartner survey found that organisations put the average cost of poor data quality at $12.8 million a year.
With the emergence of big data technologies, along with new investment trends such as ESG investing, the need to consume more data to stay competitive has grown exponentially since these studies were undertaken. The cost of bad data is likely to have risen commensurately.
An organisation's data, continually improved over time, can become its most valuable asset. And yet, this is often the most overlooked area when an organisation embarks on its outsourcing journey. Organisations are often misguided, placing a disproportionate amount of focus on People, Process and Technology. In doing so, they ignore what could be their most valuable asset or superpower, deeply embedded in the DNA of the organisation: data itself.
Data is complex and multi-faceted. Its lineage is often difficult to navigate, especially within an outsourcing arrangement. However, with data being at the core of every decision made, organisations have a choice: accept that they are indeed a data-driven organisation and embrace the challenge - or be left behind.
At Liqueo we see an evolving trend across a number of large-scale outsourcing programmes (>$1 trillion AUM): outsourcing organisations falling short when it comes to the attestation of data quality. This is largely due to the adoption of a "delegated responsibility" approach during implementation. Data quality is no longer pursued by the outsourcing organisation; instead, it is delegated to the outsourcing provider, based on the organisation's interpretation of the service it has signed up for.
Over time, this approach often leads to a negative perception of data quality and a lack of control, ownership, and accountability. This, in turn, starts to erode confidence within the programme - specifically within data-sensitive areas such as Portfolio Management, Dealing, Risk & Compliance and Performance Measurement (i.e., the core functions being retained). The 'blame game' ensues, fundamentally undermining constructive ways of working - which in turn leads to delays and a lack of confidence in the initiative itself.
At Liqueo, experience has taught us that the outsourcing organisation (not provider) should retain ownership and duty of care when it comes to data quality. This will ensure short- and medium-term success as well as longevity in adopting good data quality operations, post transition.
Certainly, once the outsourcing organisation and the outsourcing provider have mutually agreed to transfer operational ownership, the provider is delivering a service. It is essential to note, however, that the outsourcing organisation retains all the associated risks - a perspective that aligns with the FCA's stance on operational resilience in the context of outsourcing.
Below are some of the common challenges and trends that we've encountered at Liqueo across outsourcing programmes, both operational and technical:
Delegated responsibility for the validation and attestation of data quality (‘it’s their problem not ours’).
Data quality validated on the basis of functional testing outcomes, rather than by verifying all the data points within a data domain against business usage.
Absence of an independent data quality team or data quality test phase, with no single source of truth through a data quality lens.
Insufficient SME knowledge of system-specific data models/constructs among those involved in data quality testing and the triage of differences. This leads to false identification of issues or an inability to identify patterns in root causes.
Insufficient documentation in place to navigate technical and business data lineage, significantly slowing the data quality validation process.
Insufficient tooling in place to validate data fitness (e.g. Production <> Golden Source <> Migrated Data) resulting in single use solutions being built.
Inappropriate technology and processes for managing ad-hoc data migration requests, resulting in significant noise between data quality validation tests.
Audit requirements to support a retrospective forensic data review not being considered, leaving no 'point in time' reference to the state in which the data was signed off at the point of transfer of operational ownership.
Multiple golden sources of data being managed across a federated model, leading to data duplication and/or conflicts.
Environment release and change control managed independently rather than coordinated across organisations, leading to erroneous testing outcomes.
Ways-of-working practices and generic issue/defect templates in defect management tools (e.g. Jira) that limit the ability to forensically drill into the concentration of issues at the data domain and attribute level.
If you want to be on the path to success, adopt a data implementation approach in which the outsourcing organisation takes full ownership of, and accountability for, the attestation of data quality. Create an independent data quality team with the right skillset, empowered to work in partnership with the Enterprise Data Team (the ultimate data owner) and to make data quality decisions. Adopt a forensic approach to data accuracy and completeness at the attribute level, across the full-scope data universe. Utilise tooling with a broad usage footprint: move away from single-use solutions and adopt a reusable approach in which the tooling is carried forward into BAU or into later phases of the programme.
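To make the idea of attribute-level forensic reconciliation concrete, here is a minimal sketch in Python. It is illustrative only: the record structures, field names such as portfolio_id, and the reconcile helper are hypothetical assumptions, not a description of any specific tooling. The sketch compares a golden source against migrated data, keyed by record identifier, and counts mismatches per attribute - the kind of concentration view that generic defect templates make hard to obtain.

```python
from collections import Counter

def reconcile(golden, migrated, key="portfolio_id"):
    """Compare two record sets attribute by attribute, keyed on `key`.

    Returns (mismatches, missing): `mismatches` counts differing values
    per attribute; `missing` lists keys absent from the migrated set.
    """
    migrated_by_key = {rec[key]: rec for rec in migrated}
    mismatches = Counter()
    missing = []
    for rec in golden:
        target = migrated_by_key.get(rec[key])
        if target is None:
            missing.append(rec[key])
            continue
        for attr, value in rec.items():
            if attr != key and target.get(attr) != value:
                mismatches[attr] += 1
    return mismatches, missing

# Hypothetical golden-source and migrated records for illustration.
golden = [
    {"portfolio_id": "P1", "currency": "GBP", "market_value": 100.0},
    {"portfolio_id": "P2", "currency": "USD", "market_value": 250.0},
]
migrated = [
    {"portfolio_id": "P1", "currency": "GBP", "market_value": 100.5},
    {"portfolio_id": "P2", "currency": "USD", "market_value": 250.0},
]

mismatches, missing = reconcile(golden, migrated)
print(dict(mismatches))  # {'market_value': 1}
print(missing)           # []
```

Summarising differences per attribute, rather than per test case, is what lets a data quality team see where issues concentrate within a data domain and prioritise triage accordingly.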
If you are already on an outsourcing journey or about to embark on one and have concerns about the success trajectory of your data programme, contact us. We would love to speak to you about the various ways in which our Data Practice can help.