Author: Mark Abdollahian and Joerg Ruetschi
Complex interactions of political, economic, business and social dimensions on investment and risk-taking decisions are at the core of what financial institutions do, and the toughest yet most valuable decisions are often driven by exactly these interactions. With the right decision fortunes can be made; with the wrong one they can be lost, as historical examples such as the fall of Long-Term Capital Management, the default of Lehman Brothers or, most recently, the demise of the cryptocurrency exchange FTX illustrate. In today’s fast-moving environment, when the pace and tempo of events exceed what any single human can cognitively process, new technologies, data and models can help meet this challenge. While the scale of available data has increased exponentially over the last 20 years (IBM 2020), the information overload of deciding what to do with that data, and the complexity that comes with it, overwhelms not only individuals but entire organisations. More resources, skills and capacity are therefore required to meet the needs of today’s analytical and decision processes. This increases costs and makes organisations vulnerable to new uncertainties in their decision making.
As a group of successful quantitative hedge funds has demonstrated, augmented intelligence promises to meet these challenges of increasing speed, complexity and uncertainty. In this article, we set out what augmented intelligence means, define the underlying technology drivers of today’s information arbitrage opportunities and elaborate on modelling approaches based on Nobel prize-winning complexity science. We conclude by specifying three main use cases that augment decision making at financial institutions.
Open-source platforms for intelligence augmentation
Whether we realize it or not, every day we rely on crowd-sourced geographic information system (GIS) data streams and dynamic stochastic optimization algorithms to navigate in our cars. Now, advances in artificial intelligence (AI) and data science, combined with computational power, are being applied to navigating decision outcomes on the human terrain. Open-source platforms for decision intelligence augmentation (OSINT) bring these technology drivers together: they combine big data collection with artificial intelligence and high-performance computing through a specific modelling approach.
The core components that drive predictive and machine-learning (ML) analytics start with the available data pools. These range from individual, disparate data sets to merged big data ‘pools’ and ‘lakes’. Big data is a collective name for immense datasets of varying quality, often characterised by the four V’s of data: volume, velocity, variety and veracity. It covers the relevance, complexity and depth of data that come from internal and external sources. ML is a key field of study within AI that allows large data to be processed with mathematical accuracy and objectivity. The use of algorithms leads to less biased results, substantial efficiency gains and new, deeper insights. Algorithms are mathematical procedures that interpret data in specific ways and create predictable output based on identified patterns.
AI approaches are based on the same processes as foundational human cognition, looking at patterns of data input and then adjusting the classification of output through both single- and double-loop learning. Pattern recognition can happen only if the algorithm has access to a representative subset of the data from which it can make predictions about the dataset as a whole. Creating a computer platform that can learn therefore requires large datasets; once trained, however, AI learning in a computer can happen millions of times faster than is humanly possible.
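To make this concrete, here is a minimal, purely illustrative sketch of supervised pattern recognition in Python with scikit-learn: a model is fitted on a labelled subset of synthetic data and then evaluated on held-out observations it has never seen. The data, features and model choice are our own assumptions, not a specific OSINT implementation.

```python
# Minimal sketch of supervised pattern recognition: learn from a labelled
# subset of the data, then predict on observations held back from training.
# The synthetic data and logistic regression model are illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 10))                  # 1,000 observations, 10 indicators
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # synthetic binary "event" label

# Hold out 30% of the data so the learned patterns are tested out of sample.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("out-of-sample accuracy:", accuracy_score(y_test, model.predict(X_test)))
```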
Internal datasets are often proprietary and non-public, whereas external data are usually publicly available. Structured data, such as exchange trades, assets, commodities and currencies, are highly organised and can be codified, sorted and searched in files. Most financial data, such as accounting statements or transaction records, are also structured. Unstructured data, on the other hand, have no predefined form of organisation. They can be found in emails, reports, news, social media, or any other form of written and oral communication. Unstructured data can be captured and processed by natural language processing (NLP), a branch of ML. NLP facilitates the interaction between computers and human natural languages. It applies principles of linguistic structure and breaks sentences down into their elemental pieces to identify the semantic relationships among them. In essence, it captures the raw data elements and processes them to make sense of their meaning, often in the form of quantitative metrics and scores. For a specific event such as a financial announcement or corporate transaction, the ML algorithms detect the polarity of statements within a range of positive, neutral and negative sentiment, as well as detailed emotion contexts. The analysis incorporates the relative importance of statements made on the topic by key influencers such as C-level executives and equity analysts.
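As a simple illustration of polarity scoring, the sketch below uses NLTK's VADER sentiment analyser to score a single statement. This is one of many possible NLP toolkits; the statement and choice of library are assumptions for demonstration, not the tooling described in the article.

```python
# Illustrative polarity scoring with NLTK's VADER lexicon; the statement and
# the choice of toolkit are assumptions for demonstration, not a prescribed stack.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)   # one-off lexicon download
analyzer = SentimentIntensityAnalyzer()

statement = "The CEO expects margins to improve despite a challenging quarter."
scores = analyzer.polarity_scores(statement)
# 'compound' lies in [-1, 1], spanning negative, neutral and positive sentiment.
print(scores)
```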
Given advances in NLP and cloud computing, automatic unstructured data collection now occurs in near real time. Such data collection captures the event risks and opportunities of ‘who does what to whom, how, where and when’ that can impact investments. Big data architectures capture the key variables, indicators and signals surrounding events across all scales of human connectivity and interaction, from individuals to groups and their interconnections, all occurring in changing market contexts. Event taxonomies map and track specific events of interest, giving decision makers comprehensive situational awareness as the pace and tempo of events exceed what any single human being can cognitively process. But given all this data, what can we do with it and how do we make sense of its meaning?
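Such an event taxonomy can be pictured as a simple structured record. The sketch below shows a hypothetical event following the ‘who does what to whom, how, where and when’ pattern; the field names and example values are illustrative assumptions, not a prescribed OSINT schema.

```python
# Hypothetical event record for the "who does what to whom, how, where and
# when" taxonomy; field names and example values are illustrative only.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Event:
    actor: str           # who
    action: str          # does what
    target: str          # to whom
    channel: str         # how (filing, press release, social media, ...)
    location: str        # where
    timestamp: datetime  # when
    sentiment: float     # polarity score attached by the NLP pipeline

event = Event("Acme Corp", "announces acquisition of", "Beta Ltd",
              "press release", "London", datetime(2023, 3, 1), 0.6)
print(event)
```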
Complexity Science
To avoid being overwhelmed by ‘paralysis by analysis,’ we often rely on intuition, advice, information and anticipation of what will happen next. While we as individuals have our own unique, deep experience, that experience is gained through the contextual interactions of our world, underpinned by science. In 2021, the Nobel prize in Physics was awarded to Syukuro Manabe, Klaus Hasselmann and Giorgio Parisi for their work on understanding and predicting complex physical systems, such as climate change, from the smallest atomic level to the largest planetary scales. Such complex adaptive systems (CAS) science is holistic, integrating different theories, data and evidence with their individual premises and dynamic behaviours. Its goal is to discover dynamics, conditions and constraints that can be generalised into overarching principles across integrated systems.
Just as a macroeconomist might have a different view of markets than an investment banker, or a psychologist might differ on leadership from a political analyst, CAS attempts to integrate disparate perspectives, theories and science for fuller, more sound explanations and predictions. It focuses on connectivity and interdependencies across several scales and nested subsystems. In the context of financial decision making, asset managers assess macro market conditions and trends, overall company performance and key personnel, as well as sectoral trends. Understanding the macro environment in the context of individual, micro-level company performance and managers, all interconnected in B2B and B2C meso networks of sectoral performance, is crucial. Often such complex connectivity results in new, emergent behaviours, such as the recent meme-stock retail trading driven by social media. This behaviour arises from the multi-faceted relationships, interactions and connectivity of elements within a system and between a system and its environment. Connectivity and interdependence mean that a decision and action by any individual, group or entity may impact related individuals and/or systems. The response of regulators to the challenges of the Global Financial Crisis of 2008-12, for instance, fundamentally reshaped the global financial system and the business models of its main players. Effects are often nonlinear and contingent, as we saw with meme stocks, and might vary with the state of each related individual and system at any particular time. Through this approach, specific insights are gained for the decision-making process, often referred to as anticipatory intelligence. Combining these insights with a specific modelling approach leads to advanced analytics that provide decision makers with actionable descriptive, predictive and prescriptive decision intelligence augmentation.
Main use cases for financial decision making
Just as we rely on weather forecasting and navigation systems in our daily lives, OSINT can assist the decision-making process at financial institutions in three specific areas.
The first use case is the identification and monitoring of an investment hypothesis. At the beginning of the investment process, a specific theme is identified and a hypothesis established: what are the key political, economic, financial, business and social event risks and opportunities that can impact the investment? Each hypothesis is shaped by events that can be tracked and monitored by OSINT. Each event is defined by key words that map, track and forecast the connectivity and interdependencies between the political, economic, business and social dimensions that drive the investment and its underlying hypothesis. OSINT tracks thousands of different events on a global scale with detailed emotional sentiment for individuals, groups and entities. It provides decision makers with a comprehensive tool for the analysis and management of the investment across the decision process.
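A minimal sketch of such keyword-based hypothesis tracking, assuming an event stream that has already been sentiment-scored; the key words, events and scores below are invented for illustration.

```python
# Sketch of keyword-based hypothesis tracking: filter a scored event stream by
# the key words that define the hypothesis and aggregate their sentiment.
# Key words, events and scores are invented for illustration.
hypothesis_keywords = {"semiconductor", "export controls", "supply chain"}

events = [
    {"text": "New export controls hit the semiconductor supply chain", "sentiment": -0.7},
    {"text": "Retail sales rise strongly in the second quarter", "sentiment": 0.3},
]

matched = [e for e in events
           if any(k in e["text"].lower() for k in hypothesis_keywords)]
signal = sum(e["sentiment"] for e in matched) / len(matched) if matched else 0.0
print(f"{len(matched)} matching event(s), average sentiment {signal:+.2f}")
```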
The second use case focuses on early-warning indicators as part of an enterprise-wide risk management system. OSINT decodes the risk factors, with their dimensions and interdependencies, to anticipate, mitigate and shape desired outcomes. Here the event methodology is shaped by the risk sensitivities and their key words, and it can further incorporate specific regulatory and compliance requirements as well as particular business sensitivities.
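As a rough illustration of how such an early-warning indicator might be wired up, the sketch below raises a flag when a rolling average of event sentiment falls below a threshold; the window length and threshold are assumptions, not values from an actual risk framework.

```python
# Illustrative early-warning indicator: alert when the rolling average of
# event sentiment drops below a threshold. Window and threshold are assumed.
from collections import deque

WINDOW, THRESHOLD = 5, -0.4
recent = deque(maxlen=WINDOW)

def update(sentiment: float) -> bool:
    """Record a new event score; return True once a full window breaches the threshold."""
    recent.append(sentiment)
    return len(recent) == WINDOW and sum(recent) / WINDOW < THRESHOLD

for score in [-0.2, -0.5, -0.6, -0.5, -0.7, -0.1]:
    if update(score):
        print("early warning triggered after score", score)
```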
The third use case builds on the emerging environmental, social and corporate governance (ESG) standards. ESG has become widely accepted as a reporting and compliance framework. There is, however, a broader dimension that focuses on the impact, performance and credibility of the declared sustainability agenda to which the board and management are committed. Failing to meet commitments poses not only a reputational risk for the company; when a company deviates from, or violates, its communicated agenda and standards, the directors and officers may also come under fire, leading to D&O liability at board and management level. The financial magnitude of reputational damage is often difficult to quantify, which makes it challenging to manage with traditional risk management instruments. ESG frameworks and standards, such as those of the World Economic Forum (WEF) or the Sustainability Accounting Standards Board (SASB), set out the event methodology for OSINT. It compares the statements made by management about their commitment to the ESG agenda with the external perception in conventional and social media, which provides a powerful tool for risk management and monitoring.
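To illustrate this ‘stated commitment versus external perception’ comparison, the sketch below computes a per-topic gap between sentiment scores derived from management statements and from external coverage; the topics, scores and scoring convention are our own assumptions.

```python
# Sketch of the ESG say-versus-perceive comparison: the gap between how
# positively management talks about a topic and how it is perceived externally.
# Topics and scores are illustrative assumptions.
def perception_gap(internal_scores: dict, external_scores: dict) -> dict:
    """Positive gap = the company talks more positively than it is perceived."""
    return {topic: internal_scores[topic] - external_scores.get(topic, 0.0)
            for topic in internal_scores}

internal = {"emissions": 0.8, "board diversity": 0.6}   # management statements
external = {"emissions": 0.1, "board diversity": 0.5}   # media and social media
print(perception_gap(internal, external))
```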
Globalisation and technology innovation have made today’s world more interdependent than ever, while the information available for decision making has increased dramatically. Financial institutions face a complex operating environment, and their key decision makers face an unprecedented challenge in making the right choices for their businesses. OSINT platforms bring together data science, artificial intelligence and computational power, while applying modern tools of complexity science, to find the right solution and choice for a specific situation. They augment decision intelligence and decision making, and are a key tool for operating successfully in today’s business environment.
Dr. Mark Abdollahian is Chief Executive Officer at ACERTAS. He focuses on designing and delivering decision intelligence with data-driven decision making.
Dr. Joerg Ruetschi is a Senior Advisor at ACERTAS and author of the book “Transforming Financial Institutions”.