Investment managers seek to make the most of their portfolio and risk analytics data to support investment and trading decisions. However, harnessing the best of their own data as well as vendor data, affordably and without straining the operations, IT and quant teams, is a growing challenge.
Part of the challenge is that there is no viable one-size-fits-all approach to data management across the industry, nor even across the business needs within a single firm. Different business needs and use cases demand data at differing levels of granularity, time sensitivity, structure, validation, enrichment and inter-linkage. No single piece of technology can take in, store, link, curate and make all this data available in every way, place and time needed; it takes the right combination of technologies and approaches.
Risk and analytics is a data-related competency that is rarely fully formed within a firm, which is why firms often seek risk and analytics services from specialist service providers. But despite turning to outside help, firms frequently struggle to draw on their own internal data to support their investment strategies or generate alpha. And even where internal data is fit for purpose, spinning up risk models quickly enough to react to the market is often an additional hurdle. Full portfolio risk analytics and optimization models, factor investing models, principal component analysis and efficient frontier analysis remain largely unmet needs for portfolio managers.
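To make that concrete, the kind of analysis referred to above becomes a few lines of code once clean return data is available. The snippet below is a minimal, generic sketch (not GoldenSource code) that extracts principal-component risk factors from a matrix of daily returns; the simulated data and variable names are placeholders.

```python
import numpy as np

# Minimal PCA sketch: the returns matrix is simulated here as a
# stand-in for clean, validated time series from a vendor or
# internal source.
rng = np.random.default_rng(42)
returns = rng.normal(0.0, 0.01, size=(250, 20))  # 250 days x 20 assets

# Covariance of demeaned returns, then eigendecomposition.
demeaned = returns - returns.mean(axis=0)
cov = np.cov(demeaned, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# eigh returns ascending order; reverse so the first factor
# explains the most variance.
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

explained = eigenvalues / eigenvalues.sum()
print("Variance explained by first 3 factors:", explained[:3].round(3))

# Project returns onto the leading factors to get factor time series.
factor_returns = demeaned @ eigenvectors[:, :3]
```

The hard part, as the rest of this piece argues, is not these few lines but getting trustworthy, linked data into them quickly.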
It comes down to the fact that when creating and implementing credible investment strategies and analysis, investment managers need confidence that their data management capabilities make the right data available and usable. For some firms this means trying to keep up to date with warehouses, lakes, lakehouses, operational data stores, vendor/cloud data feeds, APIs and the latest tools for modeling and analytics.
Is there a way a firm can harness its own data and data from multiple vendors without straining its resources? In the same vein as using a specialist risk and analytics service, using a specialist data management provider unlocks access to best-of-breed data technologies and a proven data model, with the provider ensuring the right tools and services are available for the right job. This can mean straightforward use of instrument identifiers to map data, fast cloud data pipelines, linkage of related data, optimal data handling and curation in a data lake, smooth reporting from a warehouse, and integrated access to quant libraries for modeling and analytics.
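As a simple illustration of identifier-based mapping, the sketch below joins a hypothetical vendor price feed keyed on ISINs onto a firm's own security master so that downstream analytics can work from a single internal identifier. The column names and sample values are illustrative, not a real feed layout.

```python
import pandas as pd

# Hypothetical vendor feed keyed on ISIN.
vendor_a = pd.DataFrame({
    "isin": ["US0378331005", "US5949181045"],
    "price": [189.95, 415.10],
})

# Firm's security master, linking ISINs to internal identifiers.
security_master = pd.DataFrame({
    "isin": ["US0378331005", "US5949181045"],
    "ticker": ["AAPL", "MSFT"],
    "internal_id": [1001, 1002],
})

# Map vendor prices onto the security master so downstream
# analytics can use one internal identifier across all sources.
mapped = vendor_a.merge(security_master, on="isin", how="left")
print(mapped[["internal_id", "ticker", "price"]])
```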
And what if a specialist provider could also add the management of time series data, validated and clean, with which a firm can confidently test and run models or perform portfolio risk analytics? All of this is already available to buy-side firms as a no-code capability, delivered by data management experts. For example, using an operational data store (ODS) in its Time-Series Master framework, GoldenSource can take in data, perform any necessary processing and then feed it into Snowflake or another cloud data warehouse/lake service, conforming to a customized data model. Data thus flows end to end into that customized data model and on through modeling and analytics programs that can switch between the various data sources without coding, freeing up teams within the firm to get the best business results.
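The internals of that pipeline are proprietary, but the general pattern of validating staged time series and landing them in Snowflake can be sketched with the standard Snowflake Python connector. The connection parameters, table name and validation rules below are placeholders, not GoldenSource's actual configuration.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Placeholder staging data standing in for validated ODS output.
prices = pd.DataFrame({
    "INTERNAL_ID": [1001, 1001, 1002],
    "AS_OF_DATE": pd.to_datetime(["2024-01-02", "2024-01-03",
                                  "2024-01-02"]),
    "CLOSE_PRICE": [189.95, 190.40, 415.10],
})

# Illustrative validation before loading: no nulls, no non-positive prices.
assert prices["CLOSE_PRICE"].gt(0).all()
assert not prices.isna().any().any()

# Land the validated frame in a Snowflake table (credentials are
# placeholders; in practice they come from a secrets manager).
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ANALYTICS_WH", database="MARKET_DATA", schema="CURATED",
)
success, _, nrows, _ = write_pandas(conn, prices, "PRICES_TS",
                                    auto_create_table=True)
print(f"Loaded {nrows} rows: {success}")
```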
This seamless switch between data sources is powered by our Quant WorkBench, which uses a new scripting language, QLIScript, to take the data from the different layers in our data systems and pipe it anywhere it is needed for analytics. Quant WorkBench is agnostic to where data sits. It can read data directly from our Time-Series Master tables or from a data lake, and users can pick and choose which vendor’s data they want to onboard. The infrastructure of the underlying GoldenSource solution links all the selected data together.
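QLIScript itself is proprietary and its syntax is not reproduced here, but the underlying idea of analytics that are agnostic to where data sits can be illustrated generically in Python. The source interface, class names and volatility function below are hypothetical stand-ins for that pattern.

```python
from abc import ABC, abstractmethod
import pandas as pd

class TimeSeriesSource(ABC):
    """Abstract source so analytics never care where data physically sits."""
    @abstractmethod
    def load(self, instrument_id: int) -> pd.DataFrame: ...

class WarehouseSource(TimeSeriesSource):
    """Reads from a relational time-series extract (sketched as CSV)."""
    def __init__(self, path: str):
        self.path = path
    def load(self, instrument_id: int) -> pd.DataFrame:
        df = pd.read_csv(self.path, parse_dates=["as_of_date"])
        return df[df["internal_id"] == instrument_id]

class LakeSource(TimeSeriesSource):
    """Reads the same series from parquet files in a data lake."""
    def __init__(self, path: str):
        self.path = path
    def load(self, instrument_id: int) -> pd.DataFrame:
        df = pd.read_parquet(self.path)
        return df[df["internal_id"] == instrument_id]

def annualized_vol(source: TimeSeriesSource, instrument_id: int) -> float:
    """Analytics written once; the data source is swappable."""
    prices = source.load(instrument_id).sort_values("as_of_date")
    daily = prices["close_price"].pct_change().dropna()
    return float(daily.std() * (252 ** 0.5))

# Switching sources requires no change to the analytics:
# vol = annualized_vol(WarehouseSource("prices.csv"), 1001)
# vol = annualized_vol(LakeSource("s3://lake/prices.parquet"), 1001)
```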
The functions of our ODS, Time-Series Master and Quant WorkBench are flexible, so users can also pair them with data lakehouses or delta lakes, which can hold linked and reconciled data in a manner more advanced than an in-house data warehouse is likely to achieve.
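Reconciliation of linked data in such a layer could, for instance, take the shape below: two vendor snapshots of the same instruments are joined on the internal identifier and differences beyond a tolerance are flagged. The frames and the 10 basis point threshold are illustrative only.

```python
import pandas as pd

# Two vendor price snapshots for the same instruments (illustrative).
vendor_a = pd.DataFrame({"internal_id": [1001, 1002],
                         "close_price": [189.95, 415.10]})
vendor_b = pd.DataFrame({"internal_id": [1001, 1002],
                         "close_price": [189.97, 414.05]})

# Link the two sources on the firm's internal identifier and flag
# any price difference beyond a basis-point tolerance.
linked = vendor_a.merge(vendor_b, on="internal_id",
                        suffixes=("_a", "_b"))
linked["diff_bps"] = ((linked["close_price_a"] - linked["close_price_b"])
                      .abs() / linked["close_price_b"] * 10_000)
breaks = linked[linked["diff_bps"] > 10]  # flag breaks over 10 bps
print(breaks)
```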
Collectively, our continuing investment and innovation in data capabilities represent a path to data management and modeling that is efficient and affordable for firms seeking to put their data to work in support of investment and trading decisions.