Author: Fabrizio Cestari, Liqueo Senior Consultant
Large Asset Management firms are constantly looking at how to increase profits, access new markets and reduce costs. These goals are central to their decisions when choosing the core systems used to pursue them. On the technical side, it is necessary to consider the existing or potential connections and data exchange between different systems: how many there are, how to replicate or replace them, and so on. When made correctly, these choices drive customer success and improve efficiency.
In a change process, it is necessary to explore and analyse all the different angles and constraints before reaching a final decision.
Assessment
One of the most important decisions companies need to take concerns the systems they choose for their daily processes. These can cover Front Office, Front Office to Middle Office, or Front to Back, from a single vendor or several. There is no one choice that suits everyone, but by tailoring the systems it is possible to meet a wide range of requirements.
At Liqueo, we understand a client’s needs, identify the best approach, and propose a solution based on principles of:
- Time
- Budget
- Solution design
- Build and follow up
Regardless of the options and the modules that an Asset Manager decides to adopt, one of the biggest questions in a new project implementation is whether the workflow definition should align to the current process. Most companies aim for a “like-for-like” implementation, which tends to replicate exactly what came before. But a new system often does not follow the same processes and procedures as its predecessor.
This is usually due to:
- New data language
- Design architecture
- Different workflows
- Ways of treating data
- Interaction with external systems
Design phase
It’s crucial that a project has an effective Design Phase and that the collection and treatment of data is well understood. The pressure to speed up the project can cause big issues during the implementation, which become visible only after launch. These issues include:
- Wrong asset classes
- Missing data
- Missing or wrong workflows
- Internal structure not well represented (organisation, responsibilities and so on)
- Interaction problems across systems (integration)
There is no single way to avoid issues during the implementation, but some good practices can reduce divergence and mistakes. Tight timescales can become the biggest enemy of a project’s success. Throughout the implementation, companies and software vendors should meet regularly to clearly define what will be released, when, and using what approach.
How do we do this properly?
As previously stated, the Design Phase is essential and can be considered a key success factor. But how do we do it properly?
Here are some useful suggestions:
- List all the internal procedures affected directly/indirectly by the new system
- List all the systems that should be included in the integration
- Define the workflows in an interactive map that reflects all the connections (internal and external)
- Establish what kind of data is needed, how to collect it, and identify any gaps (see the data-gap sketch after this list)
- Involve key internal people in the process definition
- Write all the tests to be carried out in each phase of the project and account for re-testing if something fails
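To make the data-gap point concrete, here is a minimal sketch of how the fields each asset class requires can be compared against what the incoming feeds actually supply. It is purely illustrative: the asset classes and field names are hypothetical placeholders, not a real vendor schema.

```python
# Illustrative data-gap check: compare the fields each asset class requires
# with the fields the upstream feeds are known to provide.
# All names below are hypothetical placeholders.

REQUIRED_FIELDS = {
    "equity": {"isin", "currency", "exchange", "sector"},
    "bond": {"isin", "currency", "maturity_date", "coupon", "issuer"},
    "ir_swap": {"trade_currency", "notional", "fixed_rate", "floating_index", "maturity_date"},
}

AVAILABLE_FIELDS = {
    "equity": {"isin", "currency", "exchange"},
    "bond": {"isin", "currency", "maturity_date", "coupon", "issuer"},
    "ir_swap": {"trade_currency", "notional", "fixed_rate"},
}

def find_gaps(required, available):
    """Return required fields that no feed provides, keyed by asset class."""
    gaps = {}
    for asset_class, fields in required.items():
        missing = fields - available.get(asset_class, set())
        if missing:
            gaps[asset_class] = sorted(missing)
    return gaps

if __name__ == "__main__":
    for asset_class, missing in find_gaps(REQUIRED_FIELDS, AVAILABLE_FIELDS).items():
        print(f"{asset_class}: missing {', '.join(missing)}")
```

Run against the placeholder data above, the sketch would flag the missing equity and swap fields, giving the project team a concrete gap list to resolve before the build starts.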
Test phase
The Test Phase is as important as the Design Phase. With proper test coverage, 90% of potential issues can be identified before launch. Carrying out extensive testing also helps key internal people familiarise themselves with the system. It shows them how the existing operating model will change under the new system, and provides the opportunity to allay misgivings or address issues raised by those most familiar with the existing operating model.
Data is at the heart of the implementation. The link between financial providers, the data they process, and the system often goes unrecognised but in reality is absolutely crucial. Missing or incorrect data could compromise the entire workflow and create issues across the organisation or between internal and external systems. Once again, in this stage of the configuration, key people should test and verify that the data is correct and fully covered by the new system.
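As a simple illustration of this kind of verification, the sketch below runs a lightweight data-quality pass over hypothetical derivative trade records of the sort a new OMS/EMS might require. The field names and rules are assumptions made for the example, not an actual vendor schema.

```python
# Illustrative Test Phase check: validate that hypothetical derivative records
# carry the data a new OMS/EMS might require before go-live.
from datetime import date

sample_trades = [
    {"trade_id": "T1", "instrument": "ir_swap", "notional": 10_000_000,
     "counterparty": "Bank A", "maturity_date": date(2030, 6, 30)},
    {"trade_id": "T2", "instrument": "ir_swap", "notional": None,
     "counterparty": "", "maturity_date": date(2019, 1, 1)},
]

def validate(trade):
    """Return a list of data issues found on a single trade record."""
    issues = []
    if not trade.get("counterparty"):
        issues.append("missing counterparty")
    if trade.get("notional") is None or trade["notional"] <= 0:
        issues.append("notional missing or non-positive")
    if trade.get("maturity_date") and trade["maturity_date"] < date.today():
        issues.append("maturity date in the past")
    return issues

for trade in sample_trades:
    problems = validate(trade)
    if problems:
        print(f"{trade['trade_id']}: {'; '.join(problems)}")
```

Checks like these are no substitute for key people reviewing the data, but they make it easy to rerun the same verification after every configuration change.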
A case study demonstrates this clearly: an Italian asset manager decided to carry over some Derivatives that were used in the old system. Unfortunately, the new OMS and EMS required additional data to allow fund managers and traders to carry out their work correctly. Because nobody had considered how this new data fed into the workflow, the implementation process was compromised and the launch date was delayed by more than a year.
The final stage
The last stage of the process that should be considered, and one that directly impacts project success, is the Target Operating Model (TOM). Sometimes the TOM ends up as a synthesis of the old and the new system.
The best approach that Liqueo would suggest is to start with a new TOM from scratch and, once designed, work with the client to assess which parts of the old TOM should be carried over, and which can be replaced or even removed. Read our blog on how to simplify and streamline during your Front Office implementation for more insights.
Get in touch with us today via info@liqueo.com
