MLOps for Reducing the Cycle Time for Development and Delivery of Business-Critical Intelligence

About the Customer & Challenges Faced:

Client – Multi-Asset & Macro (MA&M) is developing strategies that can be used in existing products, decision tools that help manage existing portfolio risks, and new products and solutions that combine both. The desired outcome is to optimize existing products and develop additional strategies that seek to exploit non-traditional risk premia, such as value, momentum, or carry-based strategies. This work is currently undertaken by various team members across MA&M. However, the work is generally siloed, manually intensive, and time-consuming; it is performed in an unstable environment, is not scalable in its current form, and cannot utilize the multiple large data sets required. Additionally, there is no resource available to store this data as a time series, which blocks further avenues of analysis. The situation demanded an integrated platform for data scientists, equipped with accelerators and tools to experiment, discover, share, and deliver insights. The platform brings together best-of-breed tools into one integrated and intuitive environment.

Solution and Approach:

Based on the above problem statements, we collaborated with the client to build an advanced analytics solution for managing the back-testing strategy. The engagement demonstrated business process automation on the Quantitative Modelling & Analytics Platform (QMAP) and achieved connected and controlled data analytics, Python model management, and deployment on the platform.

  • Import data from multiple sources – Refinitiv Worldscope, Worldscope PiT API, Bloomberg, S&P, Refinitiv DDL API, Causal Lens API, ExAnte, etc.
  • Store data in xpresso data versioning repository
  • Perform EDA, visualization and transformation
  • Push the transformed data to the xpresso data versioning repository
  • Implement the moving average strategy on the data (a model developed by the client's QMAP team)
  • Set up and run multiple experiments on the back-testing pipeline by configuring the dynamic parameters to generate new signal values
  • Store the results of each experiment
  • Choose the pipeline and parameters corresponding to the best experiment result and promote them to production
  • Finalize the back-testing pipeline's champion run
  • Deploy inference service on the production environment for making predictions using the champion run of the back-testing pipeline
  • Integrate the results with Macrobond to visualize and analyze the results
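The client's actual moving-average model and the QMAP/xpresso APIs are not shown here; purely as an illustration of the steps above, the core back-testing loop can be sketched in plain Python. All function names, parameters, and the PnL metric below are hypothetical simplifications, not the client's implementation:

```python
from itertools import product

def moving_average(prices, window):
    """Trailing simple moving average; None until enough history accumulates."""
    return [None if i + 1 < window
            else sum(prices[i + 1 - window:i + 1]) / window
            for i in range(len(prices))]

def crossover_signals(prices, fast, slow):
    """+1 (long) when the fast MA is above the slow MA, -1 otherwise, 0 during warm-up."""
    fast_ma = moving_average(prices, fast)
    slow_ma = moving_average(prices, slow)
    return [0 if f is None or s is None else (1 if f > s else -1)
            for f, s in zip(fast_ma, slow_ma)]

def backtest(prices, fast, slow):
    """Cumulative P&L from acting on the previous period's signal."""
    signals = crossover_signals(prices, fast, slow)
    return sum(signals[t - 1] * (prices[t] - prices[t - 1])
               for t in range(1, len(prices)))

def run_experiments(prices, fast_windows, slow_windows):
    """Grid-search the dynamic parameters, storing every experiment result
    and returning the best ('champion') configuration."""
    results = []
    for fast, slow in product(fast_windows, slow_windows):
        if fast >= slow:
            continue  # the fast window must be shorter than the slow one
        results.append({"fast": fast, "slow": slow,
                        "pnl": backtest(prices, fast, slow)})
    champion = max(results, key=lambda r: r["pnl"])
    return results, champion
```

In the actual engagement, each experiment's configuration and results are versioned on the platform, and the champion run is promoted to the production inference service rather than simply returned from a function.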


  • By using the platform, one can leverage high-end data connectivity and efficient data versioning, perform exploratory data analysis, and generate inferences through an intuitive, industry-standard process.
  • The unique, containerized, platform-centric approach can be used to provision the required infrastructure and deploy rapidly to multiple high-availability environments while aligning with best-in-class DevSecOps practices.
  • The platform also brings in-depth QA/QC testing and logging frameworks, synchronous and asynchronous monitoring, and performance-tracking capabilities.
  • It also provides SSO (single sign-on) across the various built-in tools and subsystems, making platform access seamless throughout.
  • In a nutshell, all of the above features under one roof make for an unbeatable MLOps framework.

Have Any Questions?

Need more information about the platform?