By Katherine Paulson

 

This week’s blog is an introduction to some of the basic terminology used in Artificial Intelligence (AI). As someone new to AI but a veteran of the financial services industry, I made an effort to learn the basics quickly when I arrived here at AlphaTrAI. Once I understood these terms, a new world of asset management, and the exciting possibilities that come with it, opened up to me.

 

Although it seems like AI terminology only recently started appearing online and in social media, it is not new; it has been around for decades. In fact, the term “Artificial Intelligence” was coined in 1956 at Dartmouth College. AI’s incredible growth over the last decade is due to these key developments:

 

  • Computational Power
  • Access to Algorithms and Libraries
  • New Research
  • Access to Data

 

The growth in AI investing started in 2011 and continued its upward trajectory, with a big acceleration from 2015 through mid-2019, when VCs invested $60B in AI-related startups.

The Basics

 

When thinking about AI terminology, let’s begin at the top: Artificial Intelligence, Machine Learning (ML) and Deep Learning (DL). Artificial Intelligence encompasses the entire scope, Machine Learning is a subset of AI, and Deep Learning is a subset of Machine Learning, as illustrated here:

In the simplest terms:

 

Artificial Intelligence = The computer is doing something human-like

Machine Learning = The computer is learning

Deep Learning = The computer is learning in a specific way

 

Two types of Machine Learning

 

Now, let’s move on to Supervised Learning vs. Unsupervised Learning. The main thing to remember about Supervised Learning is that it is all about labeled data. In Supervised Learning, imagine a manager who oversees all of the data and tells the computer exactly what that data is. The computer is fed examples, told which are correct and which are incorrect, and shown how they differ, with the end goal of training it to identify important features. Features are the attributes you know to be important for making a prediction. Then, when new data the model has never seen before is introduced, the computer uses the model it created (think of a model as a shortcut to a pattern the computer found) to produce an output with a prediction of what that new data is. An example output might be expressed as “65% confident”.
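
To make this concrete, here is a minimal sketch of supervised learning in Python using scikit-learn. The features, labels and data below are invented purely for illustration; the point is simply that labeled examples train a model, which then assigns a confidence to new, unseen data.

```python
# Minimal supervised-learning sketch (invented features, labels and data).
from sklearn.ensemble import RandomForestClassifier

# Labeled examples: each row is a set of features, each label is the "manager's" answer.
X_train = [[5.0, 1], [4.8, 1], [0.3, 0], [0.4, 0]]  # e.g. [weight_kg, has_paws]
y_train = ["dog", "dog", "hamster", "hamster"]

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# New, never-before-seen data: the model outputs a prediction and a confidence.
X_new = [[4.5, 1]]
print(model.predict(X_new))        # e.g. ['dog']
print(model.predict_proba(X_new))  # e.g. [[0.95, 0.05]] -> "95% confident" it is a dog
```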

 

In Unsupervised Learning, you guessed it, it is the opposite: there are no labels, no manager and no direction. The computer or system is provided a data set and organizes the data in its own way. For example, a popular family of algorithms in Unsupervised Learning is “clustering”. Clustering is the task of dividing data points into groups such that points in the same group are more similar to one another than they are to points in other groups. Clustering algorithms are popular for customer segmentation.
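
As an equally minimal sketch of the unsupervised side, here is k-means clustering in Python on invented customer data. No labels are given; the algorithm groups the customers on its own.

```python
# Minimal unsupervised-learning sketch: k-means clustering on invented customer data.
import numpy as np
from sklearn.cluster import KMeans

# Unlabeled data: each row is a customer described by [annual_spend, visits_per_month].
customers = np.array([
    [200, 1], [250, 2], [220, 1],        # occasional shoppers
    [5000, 12], [5200, 15], [4800, 11],  # frequent, high-spend shoppers
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print(kmeans.labels_)  # e.g. [0 0 0 1 1 1] -> two customer segments, found without labels
```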

 

Going back to the basic definitions of AI, ML and DL, remember that AI can be described simply as a computer doing anything human-like, ranging from autonomous cars to a very sophisticated set of if-then statements that recreate human behavior. Within AI is ML, which is most of what we read and hear about. So, to differentiate ML from DL, we need to identify what sets DL apart.

 

What is Deep Learning?

 

Deep Learning is defined by its structure, which mimics the human brain: the Neural Network. Neural Networks are set up to model how humans process information, and they are generally more computationally intensive than traditional ML. An Artificial Neural Network (ANN) can be as simple as one input layer, one hidden layer and one output layer. Deep neural networks, by contrast, have more than one hidden layer; that is what sets them apart. Each hidden layer acts as a specialized function. These multiple layers, or depth, allow increasingly abstract features to be extracted by the upper layers, which improves the network’s ability to classify something, an image for example.

To describe this in a simple way, let’s use an analogy.  A person with very poor vision with a prescription of -15 looks at an image and sees a complete blur.

Then, someone hands them a pair of glasses that improves their vision to -7.  Now, they start to see some clusters of pixels in the image.

Next, someone hands them a pair of glasses that improves their vision to -3 and they can determine that there are ears, body and paws in the image.

Finally, they are handed another pair of glasses that allow them to have 20/20 vision and now they see clearly the clusters form an image of their favorite pet.

Deep Learning = Multiple hidden layers that “build” on each other to “extract” higher-level features like “paws”.
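
To make the structure concrete, here is a minimal sketch in Python using PyTorch of a network with several hidden layers. The layer sizes, and the comments mapping layers to the glasses analogy, are illustrative assumptions, not a prescribed architecture.

```python
# Minimal deep-neural-network sketch: several hidden layers stacked between an
# input layer and an output layer (layer sizes are arbitrary, for illustration only).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 128), nn.ReLU(),  # hidden layer 1: low-level features ("clusters of pixels")
    nn.Linear(128, 64), nn.ReLU(),   # hidden layer 2: mid-level features ("ears", "paws")
    nn.Linear(64, 32), nn.ReLU(),    # hidden layer 3: higher-level features ("a pet")
    nn.Linear(32, 10),               # output layer: scores for 10 possible classes
)

x = torch.randn(1, 784)  # one fake flattened 28x28 image
print(model(x).shape)    # torch.Size([1, 10])
```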

 

What is incredible to think about with DL is the ability to apply this layered learning to abstract problems. The business applications are boundless.

 

I hope this brief overview gives you a better understanding of AI’s basic terminology and the next time it comes up in conversation, you won’t miss a beat.

By Andreas Roell

 

The Merriam-Webster dictionary defines “Authentic” as both “worthy of acceptance or belief as conforming to or based on fact” and “conforming to an original so as to reproduce essential features”.  I caught myself using the phrase “Authentic AI” in one of my discussions and it made me think about this concept that I have been using for years.  What does it truly mean to me and others (yes – I am not the only one using it)? 

 

Contrary to the standard definition of the word “Authentic” above, the authenticity of AI is really not a matter of “fake” versus “real”. Rather, it is about which essential features of AI need to be acknowledged in a particular use case, and what those “authentic” features are.

 

It does not necessarily mean that algorithms must be custom-built or written from scratch, provided they are applicable to the problem at hand. It simply means, in layman’s terms, that the models fit a specific use case.

 

In order to meet these requirements, it is essential to take a full-immersion approach to a specific application or sector. In our case at AlphaTrAI, truly deploying authentic AI required us to deeply understand, simulate and develop the scenarios applicable to the highly dynamic and ever-unique conditions of the financial markets. Looking at one of my previous posts on the ever-changing rules of the financial markets (which you can find here), it should become very clear that off-the-shelf AI models for prediction and detection may serve as a base for authentic execution in financial markets, but cannot be the whole solution.

 

In my mind, these are the key building blocks of Authentic AI for financial markets:

 

  • Models that successfully function for prediction and detection based on very small data sets. This goes back to my previous argument that markets cannot be predicted successfully in the long term if the models rely on patterns drawn from long data periods. Markets are fast and ever-changing.
  • Connected to the previous point is operating with models that are more “general” in nature when it comes to identifying and comprehending market conditions. Relying solely on models driven by predefined signals not only limits their chances of survival (at AlphaTrAI, we operate under the belief that each signal is like a star in the universe: it changes in brightness, or effectiveness, and eventually evaporates) but also makes them very ineffective in fat-tail market scenarios. Models also need the ability to detect market rotations in order to identify the underlying dynamics of the market. So the answer to this very “tactical” form of algorithmic execution is to either operate on, or at minimum provide “air cover” with, a model framework that can recognize market regimes. Such a general concept of algorithms can be compared to the EQ element of a human’s decision-making process, which, as we all learn as we get older, enables us to make more rational situational decisions.
  • The ability for the algorithmic models to solve the entire problem rather than only a portion of it. For financial markets, this means that an integrated set of models should not only decide what to trade, but also be able to execute the trades successfully. A common shortcoming of backtests appears when the trained models do not account for slippage and execution of high-volume positions, making real trading results vastly different from the simulations (a minimal illustration of slippage-aware backtesting follows this list). Another example is integrating trading across a multitude of asset types or long-short positions.
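
To illustrate the slippage point from the last bullet, here is a minimal, hypothetical sketch in Python. It compares a frictionless backtest with one that charges a simple per-trade slippage cost; the price series, signal and slippage rate are all invented and are not AlphaTrAI’s models.

```python
# Hypothetical sketch: the same signal backtested with and without slippage costs.
import numpy as np

rng = np.random.default_rng(0)
prices = 100 * np.cumprod(1 + rng.normal(0, 0.01, 252))   # invented daily prices
signal = np.sign(rng.normal(size=252))                    # invented +1/-1 daily positions

returns = np.diff(prices) / prices[:-1]
gross = signal[:-1] * returns                             # frictionless backtest

slippage_rate = 0.0005                                    # assumed 5 bps cost per unit traded
trades = np.abs(np.diff(signal, prepend=signal[0]))[:-1]  # 0 when holding, 2 when flipping sides
net = gross - slippage_rate * trades                      # slippage-aware backtest

print("gross cumulative return:", np.prod(1 + gross) - 1)
print("net cumulative return:  ", np.prod(1 + net) - 1)
```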

 

Having been an operator in the financial markets for a while, I have noticed an acceleration of “AI-based” financial offerings. As a result, I see a significant amount of interest in AI strategies from both retail and institutional investors. Where I am taking a leap, and what my many years in AI have shown me, is that only an “authentic” approach to AI can work effectively in financial markets. My findings on the current state of the majority of AI executions are that they are heavily dominated by human-engineered rule-based systems, decision making that is not fully automated, and models that rely heavily on a very narrow (often single) set of signals.

 

So I encourage you to consider the authenticity of AI before investing in an AI-based investment product. Maybe even use some of my points as part of your due diligence list.

 

Happy AI investing.

By Andreas Roell

 

2020 has truly been an unprecedented year in all of our lives, from the pandemic, to the historic election, to the way most of us just celebrated a holiday season unlike any other. As always, unprecedented times come with significant insights and learnings. This is especially true for the financial markets, which will clearly go down as the most volatile, highest-magnitude, fastest-moving, most news-driven and most unpredictable market in history.

 

Many of the systematic, quantitative and algorithmic asset managers had to completely restructure how they have operated for years. See Renaissance Technologies as an example here.

 

Before I jump into some of the root causes of these troubles, I want to give you some fascinating data points. First, I asked individuals on our science team to provide me with a visual analysis of 2020 versus some previous years. We used a Pearson correlation coefficient prism to display analogous and diverging time periods. The graph clearly shows how 2020 is analogous to the recession of 2008.
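
For readers curious about the underlying calculation, here is a minimal sketch in Python of a Pearson correlation between two years’ daily return series. The numbers are randomly generated stand-ins, not the analysis our science team produced.

```python
# Minimal sketch: Pearson correlation between two years' daily return series.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
returns_a = rng.normal(0, 0.02, 252)                     # stand-in daily returns, year A
returns_b = 0.6 * returns_a + rng.normal(0, 0.015, 252)  # correlated stand-in series, year B

r, p_value = pearsonr(returns_a, returns_b)
print(f"Pearson correlation: {r:.2f} (p-value {p_value:.3f})")
```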

 

A deeper look at 2020 shows that the majority of the irregularities occurred in the April/May time frame, although the remainder of the year also showed strong dissimilarities.

 

Sifting through market data (through November 2020), I tabulated additional figures that bring clarity to this fascinating year.

The graph below shows the magnitude of daily point changes in the Dow’s history:

 

Or below: The Top 20 Days of the S&P since 1967 with the Largest Intraday Point Swings.

Surprise, surprise, they are all days in 2020.

 

Or the largest single-day changes of the S&P relative to the previous day, where 70% of the top 40 occurred in 2020:

 

Largest Daily Point Gains:

Largest Daily Point Losses:
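
Tabulations like these are straightforward to reproduce from daily open-high-low-close data. Here is a minimal, hypothetical sketch in Python; the column names and the data file are assumptions, and the real figures depend on the data source you use.

```python
# Hypothetical sketch: ranking the largest intraday swings and daily point changes
# from a daily OHLC DataFrame (column names 'High', 'Low', 'Close' are assumed).
import pandas as pd

def largest_moves(ohlc: pd.DataFrame, top_n: int = 20) -> None:
    swing = ohlc["High"] - ohlc["Low"]   # intraday point swing
    change = ohlc["Close"].diff()        # point change versus the previous close
    print("Largest intraday swings:\n", swing.nlargest(top_n))
    print("Largest daily gains:\n", change.nlargest(top_n))
    print("Largest daily losses:\n", change.nsmallest(top_n))

# Usage with a hypothetical data file:
# largest_moves(pd.read_csv("dow_daily.csv", index_col="Date", parse_dates=True))
```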

 

Or the incredible rally on the election results, which saw a 7.25% jump in the first week after the election. Here is a comparison I found to other post-election market behavior:

Source: https://www.winton.com/longer-view/market-impact-of-us-presidential-elections (By the way this is a very cool history of stock markets and presidential elections)

 

CNN reported: “Post-election chaos would of course rattle markets, which famously hate uncertainty. The smoother-than-feared election set off a celebration on Wall Street, with the S&P 500 notching its biggest post-election rally since 1932.”

Hopefully, I have provided some convincing data to make you feel good about having gone through probably the most unusual financial year ever. Earlier, I promised I would come back to the issues that quant, systematic and algorithmic funds face during a year like 2020.

 

The root problem is that typical trade execution and decision-making processes rely on historical data. According to Bloomberg, Renaissance clearly stated the problem in their September letter to clients: “It is not surprising that our funds, which depend on models that are trained on historical data, should perform abnormally (either for the better or for the worse) in a year that is anything but normal by historical standards.”

 

So for me, the question I would ask myself if I were at Renaissance or running a typical quant strategy is this: What would it mean if 2020 is just the signal of a brand-new market regime? One that is erratic because retail investors now provide over 20% of trade flow, the highest share ever according to multiple sources. Or one in which the speed of automated decision making continues to increase to unprecedented levels.

 

What if markets continue to chart new territory, and the historical data sets that these traditional algorithmic or quantitative strategies rely on are no longer relevant?

 

It could mean that we are beginning a new chapter of hedge fund closures, one that specifically targets the quantitative, systematic and algorithmic shops that are unable to reinvent their approach. At a minimum, they must address decision making with increasingly sparse data sets or reconfigure existing logic toward a higher level of market-signal detection.

 

Without giving away our recipe, of course, the key is to advance the authenticity of models to the point where they are self-learning and much more agile. If the markets of 2020 have proven anything, it is that building dynamic models that evolve and make decisions in extreme conditions is the future.

By Andreas Roell

 

When Google’s DeepMind beat the Go world champion in 2017, the AI world celebrated this accomplishment as a major milestone in the technical advancements of the industry.  Rightfully so, because defeating the number one Go player at the time was considered beyond the reach of even the most sophisticated computer programs.  The ancient board game, invented over 2,500 years ago, is famously complex with more possible configurations than atoms in the observable universe.

 

To explain briefly, Google’s AlphaGo used a combination of deep learning and neural networks in a manner so advanced for its time that the algorithms continually reinforced themselves, improving by playing millions of games against variations of themselves. This self-play trained a “policy” network to predict the next moves and a “value” network to evaluate proposed positions. AlphaGo then searched over the possible moves and permutations to select the one its networks deemed most likely to succeed.
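
As a rough illustration of the two-network idea, here is a minimal sketch in Python using PyTorch of a network with a shared body, a policy head that scores candidate moves, and a value head that evaluates a position. This is not AlphaGo’s actual architecture; the board representation and layer sizes are arbitrary.

```python
# Minimal policy/value network sketch (illustrative only, not AlphaGo's architecture).
import torch
import torch.nn as nn

class PolicyValueNet(nn.Module):
    def __init__(self, board_cells: int = 19 * 19):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(board_cells, 256), nn.ReLU())
        self.policy_head = nn.Linear(256, board_cells)  # a score for each candidate move
        self.value_head = nn.Linear(256, 1)             # how good the position looks

    def forward(self, board: torch.Tensor):
        h = self.body(board)
        move_probs = torch.softmax(self.policy_head(h), dim=-1)
        value = torch.tanh(self.value_head(h))          # -1 (losing) .. +1 (winning)
        return move_probs, value

net = PolicyValueNet()
probs, value = net(torch.randn(1, 19 * 19))  # an invented board state
print(probs.argmax().item(), value.item())   # most promising move and position value
```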

 

When studying AlphaGo’s moves during its games against world champion Ke Jie, it is astonishing to see that the moves it executed were considered highly unusual, or had never been seen before in those situations, even by the most experienced players. If you have not seen the documentary about AlphaGo, I highly recommend it. It is available on YouTube here.

 

Successful algorithmic approaches to the financial markets are often compared to AlphaGo’s approach to the game of Go. The analogy seems logical from an algorithmic perspective when you consider the extreme number of possible permutations in both. However, matching Go’s level of permutations is not where the real scientific challenge of the financial markets lies.

 

In addition, many scientists assume that training algorithms on the historical behavior of the markets will give them what they need to anticipate, predict and decide on the next trading moves. That is why it is so common to hear that feeding more historical data into models will make their execution even better.

 

However, after analyzing the financial markets at a much deeper level, it becomes clear that the analogies between the game of Go and the financial markets are overly simplistic and superficial. In reality, relying on a similar algorithmic approach creates a significant risk of underperformance and volatility.

 

The fact is, the financial markets are significantly more complex than the game of Go. To put it very simply, our Chief Scientist explained to me that the composition of “opponents”, or to put it more nicely, “players”, changes on a day-to-day and minute-by-minute basis. Different participants execute their beliefs about the best next move. They all have different strategies, needs, systems, capabilities, knowledge and even emotions.

 

To put this complexity in Go terms, you would have to re-analyze your opponent’s personality, without ever having seen them before, prior to every move. In addition, there is a high level of outside influence on the markets; for example, breaking news stories have had a major impact on trading conditions in this unprecedented year of 2020.

 

To make matters worse, news may break in the middle of a trading day, producing significant directional turns throughout the day. Driving this point home: 18 of the 20 largest intraday directional market swings in history occurred in 2020. In 16 of those 18 trading days, the intraday swing can be correlated with significant breaking news, such as the vaccine trial breakthrough.

 

This brings me back to my comparison with the game of Go.  The financial markets would be the equivalent of not just dealing with a changing opponent for every move, but an opponent changing before a move is made.  In addition, the opponent also has exclusive permission to change his move after you have made yours. Tricky, right?

 

Leading an algorithmic-focused asset management firm, I have learned that the key to all of this is that, to be successful, models must be built with the following key components:

 

  • The ability to detect market rotations in order to identify the underlying dynamics of the market (a minimal regime-detection sketch follows this list)
  • The ability to train models on very small data sets. The reason for this goes back to my earlier argument that every trade situation is unique and has never been seen before
  • The ability to take both long and short positions in order to participate in any market direction
  • The ability to predict and manage market slippage when dealing with intraday trades
  • The ability to function effectively across various types of securities to take advantage of de-correlated conditions, typically between two asset types
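
As a loose illustration of the first bullet, and not our production approach, here is a minimal sketch in Python that labels market regimes from rolling volatility. The window, threshold and return series are all invented assumptions.

```python
# Minimal, illustrative regime-detection sketch: label each day "calm" or "turbulent"
# based on rolling volatility (window, threshold and data are invented assumptions).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
returns = pd.Series(np.concatenate([
    rng.normal(0, 0.005, 200),  # stand-in calm period
    rng.normal(0, 0.03, 60),    # stand-in turbulent period
]))

rolling_vol = returns.rolling(window=20).std()
regime = np.where(rolling_vol > 0.015, "turbulent", "calm")

print(pd.Series(regime).value_counts())  # how many days fall into each regime
```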

 

I hope this provides a high-level perspective on how one would build an algorithmic trading framework, or helps a potential investor in an algorithmic-focused offering conduct more effective due diligence.

By Todd Spillane

 

As we enter the next decade of the 21st century, there are a number of things that I see coming at us with increasing speed. These are the things I am thinking about, coming at us fast and furious, as they relate to the asset management industry and investments as a whole.

 

First, the global economy, even in the face of the pandemic, will continue to spread its wings into the far reaches of the world. Correlations between equities will continue to increase to the point that there is little dispersion between US equities and the rest of the world, and asset managers will need to look for other ways to add alpha for investors.

 

Investment strategies have evolved over time: from fundamental value strategies, to fundamental growth strategies, to quantitative value, balanced or growth strategies. Managers have also explored other “non-correlated” assets, including commodities, but at the end of the day they all become highly correlated. Like most things, these strategies get overdone when too many managers all do the same thing (can you say FANG?). So, what is next for the asset management industry?

 

Looking beyond the horizon, I see a number of things in the asset management industry that will be very different in the near future, and some a bit further out.

 

  • There will be more consolidation under the theory that bigger is better. While this is probably true, it is only because of the crushing weight of the expenses of running a big asset management firm, particularly in the mutual fund space.
  • ETFs will continue to erode traditional mutual fund market share due to the significant cost differential, the transparency and the ability for shareholders to exit at any point during the day.
  • Investors will continue to look for the holy grail of exceptional performance at a low cost. While this may be utopia, as we all know, performance wins at the end of the day, and consistency of performance is the third leg of the holy grail trinity.
  • Investors, particularly wealthy investors, can access hedge funds and private equity for potentially better investment opportunities; as those opportunities deliver positive results, significant assets will flow their way.
  • Investors driven to find outsized alpha may find it in funds that manage assets in a very different way, including funds that are quantitatively managed using artificial intelligence (“AI”) and machine learning (“ML”).

 

So, what does this mean from a compliance perspective?

 

Compliance will become more challenging for firms, owing to limited resources and firm consolidation. As a result, compliance professionals must keep current not only with all the changes in the markets, but also with the changes in the investment universe. To that point, we all thought we knew the various investment strategies and how portfolio management teams think. Now, we have a whole new set of business and portfolio management issues to understand, and we must think about how our regulatory world meshes with this new age of technology.

 

In order to stay current with investment strategies, compliance professionals need not only to keep up with their reading but also to use the investment team as a resource to understand the investment process, for disclosure purposes as well as for monitoring and testing.

 

Additionally, a great resource for compliance professionals is the various vendors and service providers. Use them to get as much information as you can; they love to explain their products, and if a product covers an emerging area, they are patient and willing to explain how their service works, as well as the broader market.

 

The challenge of being a compliance professional in this day and age is that the regulatory landscape is changing quickly, and the business landscape is changing even faster. In order to keep up, compliance professionals need their firms to be committed to spending the right amount on technology and on staff. The key is getting the right balance to manage the regulatory risks that each and every firm faces in this technological age.

By Hudson Cooper

 

In the left panel, we plot the yearly maximum United States unemployment rate since 1948, updated on a quarterly basis.

 

The right panel shows the unconditional probability of exceeding a given unemployment rate over the course of a year. The points marked “Observations” are the empirical probabilities of these exceedances and are given by Weibull plotting positions.

 

On a yearly basis, a Generalized Extreme Value (GEV) distribution is fit to the observations, and its predicted probabilities are represented with a solid curve. A confidence band reflecting model uncertainty due to variability in the data is represented as a blue shaded region.
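
For readers who want to experiment with the method, here is a minimal sketch in Python of the same kind of analysis: Weibull plotting positions for observed block maxima and a GEV fit used to estimate exceedance probabilities. The yearly maxima below are stand-in numbers and this is not the code behind the figures; the real series is FRED’s UNRATE.

```python
# Minimal sketch: fit a GEV to yearly maxima and compare its exceedance probabilities
# with Weibull plotting positions (stand-in data, not the series behind the figures).
import numpy as np
from scipy.stats import genextreme

# Stand-in yearly maximum unemployment rates (percent).
yearly_max = np.array([6.1, 5.9, 7.9, 10.8, 7.8, 6.3, 9.0, 10.0, 4.4, 14.7])

# Weibull plotting positions: empirical probability of exceeding the i-th smallest maximum.
sorted_max = np.sort(yearly_max)
n = len(sorted_max)
empirical_exceedance = 1 - np.arange(1, n + 1) / (n + 1)

# Fit a GEV distribution to the block maxima and compute its exceedance probabilities.
shape, loc, scale = genextreme.fit(yearly_max)
model_exceedance = genextreme.sf(sorted_max, shape, loc, scale)

for x, emp, mod in zip(sorted_max, empirical_exceedance, model_exceedance):
    print(f"P(yearly max > {x:5.1f}%):  empirical {emp:.2f}   GEV {mod:.2f}")

# Extrapolate to an "unprecedented" level never seen in the stand-in data.
print("P(yearly max > 20%) under the fitted GEV:", genextreme.sf(20.0, shape, loc, scale))
```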

 

The GEV distribution is an important model because it not only allows us to extrapolate the historical frequency of extreme events to “unprecedented” future events, but also allows us to realistically capture our (asymmetric!) uncertainty about these estimates.

 

The animation shows how the GEV models and their uncertainties change as they are given access to more historical data. We see that as more extreme events (unusually high unemployment rates) occur, the model updates its beliefs accordingly, attributing higher probabilities to these events and slightly reducing the uncertainty in this region.

 

While historical empirical probabilities are prone to underestimating the future empirical probability of extreme events, the GEV model describes these occurrences more faithfully. When the GEV model fails, it also tends to underestimate these probabilities, but these failures are well-represented by the asymmetric confidence bands.

 

Source: U.S. Bureau of Labor Statistics, Unemployment Rate [UNRATE], retrieved from FRED, Federal Reserve Bank of St. Louis; https://fred.stlouisfed.org/series/UNRATE

By Andreas Roell

 

By now, most of us have likely become aware of JP Morgan’s prediction that approximately $300b of capital in equity markets is at risk of outflow. As a quick reminder, this forecast is based on the traditional 60/40 approach of mutual funds, which need to rebalance sometime toward the end of the year as a result of the growth of the stock markets.
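
To make the rebalancing mechanic concrete, here is a minimal, hypothetical sketch in Python. The fund size and return figures are invented and are not JP Morgan’s assumptions; the point is only that when equities outgrow bonds, restoring the 60/40 split forces equity sales.

```python
# Hypothetical 60/40 rebalancing sketch (invented numbers, not JP Morgan's assumptions).
equities = 60.0  # $bn, starting at a 60% allocation
bonds = 40.0     # $bn, starting at a 40% allocation

equities *= 1.30  # suppose equities rally 30% while bonds stay roughly flat

total = equities + bonds
target_equities = 0.60 * total                    # the 60/40 rule's target equity allocation
forced_equity_sales = equities - target_equities  # equities that must be sold to rebalance

print(f"equity weight before rebalance: {equities / total:.1%}")
print(f"equities that must be sold:     ${forced_equity_sales:.1f}bn")
```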

 

While this forecast captured headlines, I felt it was important to absorb the analysis in a way that considers various perspectives. Having led an algorithmic-focused asset management firm for over four years now, I have learned that human traders carry an emotional bias into their analysis of the market. It is something that algorithmic teams like AlphaTrAI pride themselves on being independent of.

 

So when thinking about the emotional impact of this story, it will likely cause some extreme reactions: shifting out of equities, going short on them, or simply sitting out via cash allocations. As a fundamental reminder, stock markets pride themselves on pricing the future in real time. This would mean we should be sitting comfortably, as human intelligence and emotional triggers should soften this capital outflow before it even happens. Couple that with mutual funds’ estimated share of equities being only about 19% (source: $7t from Business Insider divided by the stock market value found here).

 

Just to put this into perspective, on March 16, 2020 the Dow Jones had roughly its largest single-day loss in market capitalization, about $875b, representing a 13% drop. That is roughly 190% higher than the $300b forecast here.

 

Why do I even want to discuss this topic here? As mentioned earlier, we still find ourselves in a financial trading world dominated by humans. On top of that, the share of “hobby traders”, aka retail traders, now accounts for almost 25% of trades, up from 10% in 2019.

 

With that, our financial markets are highly exposed to the human limitations of decision making, which I classify as limited data-processing and analysis capabilities. In my work, I have been exposed to a machine’s ability to create a more holistic perspective on specific market events. That perspective was completely missing when I read the news stories about JP Morgan’s forecast. Maybe it was our world’s need for a headline-grabbing story, or maybe it was just our human limitation in analyzing a situation holistically, in its relational context. It made me want to share what I see in how algorithms adapt when market rotations occur. They don’t treat a situation as an all-or-nothing decision. They evaluate and dynamically decide based on the magnitude of an event. They classify the risk and diversify into multiple solutions. It is this composure that I believe will help humans cope with events like this.