By Senior Data Consultants Judith Kirkwood-law and Kieran Drea.
In today’s rapidly evolving Asset Management landscape, effective data management is critical to any successful Target Operating Model (TOM) transition. Factors such as scalability, the move to Cloud, and data-intensive analytics and insights from Machine Learning and other Artificial Intelligence (AI) have led to greater utilisation and consumption of data, with a shift towards a “Data-First” mindset. Effectively managing and leveraging data for decision-making can deliver better insights, cost savings, effective compliance with regulations, and enhanced asset performance. But who do you need on the team to unlock this data power? This article delves into the key roles and responsibilities you need to consider to make a success of your Data TOM.
Having a defined and clearly communicated data strategy is critical to bring people along on the TOM transformation journey. Typically, this is the responsibility of the Chief Data Officer (CDO), though it may fall to roles within the COO or CTO function. This leadership role comes with the responsibility to oversee data strategy, data governance, data quality and compliance with data-related regulations. It is also vital for this vision-setting individual to collaborate with others at executive level to ensure buy-in for the data strategy. Furthermore, the data strategy must support and drive the organisation’s strategic goals, keeping data initiatives aligned with business objectives.
Articulating this vision is not just the responsibility of the Executive Leadership, but of all Data Users across an organisation. From portfolio managers to quantitative analysts, Data Users should be able to define their business objectives and aspirations with respect to data, and feel empowered to contribute to and align with that vision. If success criteria for a data strategy are not well defined, then the CDO will be “shooting in the dark” trying to understand “what good looks like” for business users. Conversely, the data professionals within the business should seek to educate their peers on the art of what’s possible, championing Data Operating Model change as platforms continue to evolve and innovate.
Asset & Wealth Management organisations are heavily regulated. They must establish and enforce data governance policies, standards, and procedures related to asset data. They must also ensure data privacy, security, and compliance with relevant regulations (e.g., MiFID II, GDPR, HIPAA).
The Data Governance Team will typically own and establish data governance policies, standards, and procedures. They will enforce data privacy, security, and compliance and be responsible for monitoring and auditing data management practices for adherence. Data governance is unique to each organisation and should consider the operating model's nuances. It is essential that governance is not seen as a “tick box” exercise in bureaucracy but is leveraged to bring benefits for data users as well as compliance with regulation.
Data Owners assume responsibility for specific datasets or domains and may also be the Product Owner for aspects of the data platform. They are the individuals with deep knowledge and expertise in their respective fields. Assigning these roles guarantees the right Subject Matter Experts are making decisions that ensure data quality, integrity, and compliance with the relevant policies, while defining data access controls and permissions. Without clear Data Owners defined in your TOM, chaos can reign. Delivery teams cannot confirm who is making decisions about priorities, definitions, or data classifications. Nor can they confirm who is signing off data quality testing or who is defining the processes required to make use of the data in question.
A key aspect of a TOM is the technology and data platform. Roles and responsibilities must be defined so that the platform design and architecture align with the data strategy.
This requires being able to take a higher-level view and set the reference architecture, principles and standards for data flows, integration strategies and data modelling. A Data Architect role should have responsibility for these tasks. Without guiding and controlling these aspects of the platform, it becomes harder to make decisions, leading to siloed solutions that do not adhere to an overall strategy, thus increasing costs.
Infrastructure and Technology functions will typically provide and manage technology infrastructure for data migration, storage, processing, testing, and access. Whoever is in this role must ensure scalability, reliability, and high availability of data platforms, and will need to collaborate with data teams to meet infrastructure requirements. Data security, typically governed by the Chief Information Security Officer (CISO), is required to protect data from unauthorised access, breaches, and cyber threats. To do this, the security team will implement encryption, access controls, and data security measures, and conduct security audits and risk assessments. It is advisable to engage with these teams early in your TOM journey to avoid encountering blockers and issues later in the process.
When embracing data evolution, asset managers frequently grapple with the decision of whether to outsource or keep Data Operations in-house. Additionally, the decision to centralise or decentralise an organisation's data operations depends on the organisation's maturity and size. This complex subject deserves focused attention, requiring organisations to assess their data maturity and set out their own journey. However, it is advisable to have a best-practice TOM that utilises a domain-specific DataOps methodology comprising Data Engineers, Data Analysts and Data Stewards. Entrusting business users alone with the seamless flow of data in and out of an organisation can lead to disruptions in data services. We also recognise that there may be other roles as part of your DataOps function, such as application developers, business analysts and technical analysts, depending on the organisation’s specific needs.
Data Engineers are responsible for collecting, integrating, modelling, and enriching data by developing and maintaining data pipelines for data ingestion and transformation. They can implement data storage solutions and database management as well as optimising data processing and ETL processes. This important role provides the backbone of the data platform, enabling data analysts, data scientists and business users to do their work.
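To make the ingestion-and-transformation responsibility concrete, the sketch below shows a minimal pipeline step of the kind a Data Engineer might own: extract raw records, cast types, and enrich each position with a derived market value. The data, field names and functions are hypothetical, illustrative only, and not a reference to any specific platform.

```python
# Illustrative only: a minimal extract/transform step of the kind a Data
# Engineer might build. Fields and values are hypothetical.
import csv
import io

RAW_POSITIONS = """isin,quantity,price
US0378331005,100,189.50
GB00B03MLX29,250,24.10
"""

def ingest(raw: str) -> list[dict]:
    """Parse raw CSV into records (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> list[dict]:
    """Cast types and enrich each record with a derived market value."""
    out = []
    for r in records:
        qty, price = int(r["quantity"]), float(r["price"])
        out.append({"isin": r["isin"], "quantity": qty,
                    "price": price, "market_value": qty * price})
    return out

positions = transform(ingest(RAW_POSITIONS))
```

In a real platform the same extract/transform shape would typically run inside an orchestrated, monitored pipeline rather than as inline functions.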
Also of key importance are Data Stewards. They are responsible for upholding data policies and standards within data domains. In some cases, they are Subject Matter Experts within specific data domains. They ensure the quality, integrity, and compliance of data. They also validate and enhance data, enabling accurate decision-making and risk mitigation. If their role is ill-defined, data quality may deteriorate, leading to inaccurate decisions, compliance violations, and increased operational costs. Data could become inconsistent, compromising business objectives. Without clear Data Steward responsibilities promoting collaboration and efficiency across the organisation, data silos may form. Well-defined data stewardship is therefore essential to maintaining data reliability.
Overall, a DataOps function is pivotal to ensuring the smooth flow of data, from ingestion and integration to data quality validation and distribution in a multi-sourced environment. The responsibility here is to monitor data pipelines, data ingestion, and data transformation processes. A well-defined DataOps function ensures data accuracy and availability, allowing for informed investment decisions and risk management. Without well-defined DataOps responsibilities, data inconsistencies, errors and delays can occur, leading to misguided decisions, operational inefficiencies, and increased risks. Inadequate DataOps can disrupt the investment process, compromise data integrity, and hinder performance analysis. This can impact an organisation's ability to use up-to-date data for trading and portfolio management.
An effective data operating model facilitates efficiency through a variety of strategies. Firstly, the implementation of data automation tools and processes is instrumental to success. Automation reduces manual reconciliation and data entry activity, which minimises errors while accelerating data processing. It can also help by regularly performing data quality checks and validation to identify and correct inaccuracies promptly. Furthermore, the establishment of consistent data standards and formats through good architecture and governance ensures data consistency. It also prevents confusion among the different teams using and analysing the data.
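As an illustration of the automated data quality checks described above, a sketch is shown below. The rules (ISIN length, positive price, recognised currency) and record shapes are hypothetical examples, not a prescribed rule set.

```python
# Illustrative only: simple automated data quality checks of the kind a data
# operating model might run on ingested records. Rules are hypothetical.
def quality_issues(record: dict) -> list[str]:
    """Return a list of rule violations for one record (empty list = clean)."""
    issues = []
    if not record.get("isin") or len(record["isin"]) != 12:
        issues.append("invalid ISIN")
    if record.get("price") is None or record["price"] <= 0:
        issues.append("non-positive or missing price")
    if record.get("currency") not in {"USD", "GBP", "EUR"}:
        issues.append("unexpected currency")
    return issues

clean = {"isin": "US0378331005", "price": 189.5, "currency": "USD"}
dirty = {"isin": "US03783", "price": -1, "currency": "XXX"}
```

Running such checks at the point of ingestion, rather than downstream, is what allows inaccuracies to be corrected promptly as the text describes.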
Cloud-based data management solutions offer scalability, accessibility, and cost-efficiency, while data retention policies determine when data can be archived or deleted, thus reducing storage costs. Regular data audits and assessments are also imperative - to identify areas for improvement and optimise data management processes.
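A data retention policy of the kind mentioned above can be sketched as a simple classification of records by age. The thresholds here (archive after one year, delete after roughly seven) are hypothetical placeholders; real thresholds would come from the organisation's regulatory obligations.

```python
# Illustrative only: partitioning records into keep/archive/delete buckets
# by age. Thresholds are hypothetical, not regulatory guidance.
from datetime import date

def retention_action(record_date: date, today: date,
                     archive_after_days: int = 365,
                     delete_after_days: int = 2555) -> str:
    """Classify a record as 'keep', 'archive', or 'delete' based on its age."""
    age_days = (today - record_date).days
    if age_days > delete_after_days:
        return "delete"
    if age_days > archive_after_days:
        return "archive"
    return "keep"
```

Applying such a rule on a schedule is one concrete way archiving reduces active storage costs.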
The Data Vendor Management function plays an essential role in the efficiency of the Data TOM, because the external information it provides is integral to making informed investment choices. Data Vendor Management is responsible for the acquisition, negotiation, licensing, and careful management of market data from various vendors. The accuracy and timeliness of this market data is essential to evaluating asset values, anticipating market trends, and fine-tuning investment portfolios. Additionally, the role regularly monitors market data expenses and evaluates data subscriptions to ensure cost-effectiveness. In the absence of a well-defined Data Vendor Management role, organisations can encounter issues such as data duplication, data inaccuracies, unreliable pricing, and decision-making delays. This can result in poor investment decisions, missed opportunities, and lost savings, owing to the lack of ongoing vendor management and scrutiny of pricing and subscription costs. Ineffectual market data vendor management can impede decision-makers’ capacity to respond to market shifts, affecting portfolio performance and client trust.
The human element can amplify these more technical efficiencies by facilitating seamless collaboration among data analysts, data scientists, and portfolio managers. This optimises the strategic leverage of data towards improved investment outcomes. Cross-training different teams in keeping up to date with the latest tools and techniques used by data teams and portfolio managers, further promotes efficiency and knowledge-sharing.
Finally, three crucial roles within the TOM collaborate to glean valuable insights from vast amounts of financial data. These roles, comprising Data Analysts, Data Scientists, and Front Office Quantitative Analysts, all bring their analytical expertise, providing meaningful guidance for portfolio managers and decision-makers. Their combined efforts harmonise diverse skills and knowledge, enabling them to convert disparate data into actionable intelligence.
Data Analysts play a pivotal role in extracting valuable insights from intricate financial data. They meticulously examine data sources, analysing, interpreting, and reconciling security information. Their expertise lies in generating informative reports, dashboards, and visualisations to extract actionable insights. They identify trends, patterns, anomalies, and areas for data improvement, and adeptly communicate these data-driven discoveries to stakeholders. A well-defined Data Analyst role empowers informed decision-making, supports portfolio management, and enhances risk mitigation.
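One simple form the anomaly detection mentioned above can take is a z-score rule: flag any value that sits unusually far from the mean of its series. The sketch below is illustrative only; the sample data and the two-standard-deviation threshold are hypothetical choices, not a recommended method.

```python
# Illustrative only: flagging outliers in a series with a simple z-score
# rule, the kind of check a Data Analyst might run on daily figures.
from statistics import mean, stdev

def flag_anomalies(values: list[float], threshold: float = 2.0) -> list[int]:
    """Return indices of values more than `threshold` std devs from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) > threshold * sigma]

daily_figures = [0.10, 0.20, 0.15, 0.12, 5.00, 0.18]  # one obvious outlier
```

In practice an analyst would tune the threshold to the series and pair such rules with visual inspection before escalating an anomaly.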
Data Science is a key enabler in asset management. Data Scientists harness advanced analytics, Machine Learning, and statistical techniques to unearth deeper insights from complex financial data. Collaborating with Data Analysts and Domain Experts, they develop predictive models that enable informed investment decisions, risk mitigation, and improved returns.
Front Office Quantitative Analysts apply their quantitative expertise in financial modelling, risk assessment, and investment strategies. They collaborate with Data Scientists to design complex mathematical models and algorithms to boost portfolio performance and minimise risk.
In dynamic financial markets, the insights of proficient Data Analysts, Data Scientists, and Quantitative Analysts are essential for achieving optimal investment outcomes. Without clear role definitions, however, their tasks may misalign, leading to inaccuracies and inefficiencies. These issues can result in flawed investment decisions and financial setbacks, potentially causing missed market opportunities, suboptimal risk management, overlooked investment prospects and flawed predictions – leading to client dissatisfaction. It is crucial for these analysts to collaborate effectively, ensuring that their individual projects align with the firm's overall objectives and avoid redundancy.
Data Roles & Responsibilities
To summarise, effective data management is a vital component of a modern organisation’s operating model. It seeks to optimise asset performance, reduce costs, and ensure regulatory compliance. Clear roles and responsibilities, as outlined above, are essential for operational success. The level of maturity and skill within these roles can also have a significant impact on your ability to derive value from data. So, when defining your TOM you should consider evaluating what is required and if it is necessary to upskill or recruit.
Defining and assessing these roles and responsibilities, encouraging collaboration, and developing a data-driven culture, establishes a robust foundation for data-driven decision-making and sustainable Asset Management practices.
If your organisation is embarking on a TOM transformation journey, the Liqueo Data & Analytics Practice can help you to align your data capabilities with your overall vision for the future. We draw on advisory and delivery expertise in Data Strategy, Data Governance, DataOps, Data Migration & Integration, Data Quality and Data Analytics & Insights. Please contact us.