The Stunning Potential of Models Like Gemini-1.5 and SORA: Finance and Beyond

  • Author: Luis Sanchez
  • Published On: February 17, 2024

I’m fascinated by the rapid advances in AI, particularly with models like Gemini-1.5 and SORA. As an early pioneer in algorithmic trading and exotic structured finance, I'm especially keen to explore Gemini’s potential for trading applications and financial analysis, and SORA's potential to disrupt some aspects of the film industry.

Early in my career, I spent a good amount of time on the buy side, developing models to quantify the effectiveness of technical trading systems. More exciting than that, though, arbitraging exotic options on equity derivatives, or simply trading them, was a lot of fun back in the day.

In 2008, after Lehman’s debacle, I returned to my roots: coding models for various actionable-intelligence purposes, and I even managed to raise capital from a prominent hedge fund for some of my early attempts at applying computational linguistics and AI to finance, search, and advertising. From then until 2010, I collected alternative datasets and analyzed them with ML for successful one-off trades. However, in 2018, after a dry spell from 2015 to 2018 (too many people chasing alpha with the same alt data, and the low-hanging fruit gone since 2013), I started experimenting with unbalanced multi-classification models for alpha generation in the trading of highly liquid equities, with models running on A100 GPUs in Google Cloud. The stock market is an extremely difficult problem to tackle, involving non-stationary time series, but I am happy with the progress I am making thus far in some small areas of the problem.
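To make the "unbalanced multi-classification" idea concrete, here is a minimal sketch using scikit-learn's class weighting. Everything in it is invented for illustration (synthetic features, toy labels, an 80/10/10 class split), not a description of my production models:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical setup: synthetic features standing in for engineered market data.
# Labels are imbalanced, as direction labels usually are: most bars are "flat",
# and only a minority are "up" or "down".
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))                            # 8 features per bar
y = rng.choice([0, 1, 2], size=1000, p=[0.8, 0.1, 0.1])   # 0=flat, 1=up, 2=down

# class_weight="balanced" reweights the loss inversely to class frequency,
# so the rare "up"/"down" classes are not drowned out by the "flat" majority.
clf = LogisticRegression(class_weight="balanced", max_iter=1000)
clf.fit(X, y)

proba = clf.predict_proba(X)   # per-class probabilities for each bar
print(proba.shape)             # (1000, 3)
```

In practice the probabilities, not the hard labels, would feed position sizing, and the evaluation would have to be walk-forward rather than in-sample.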

More recently, beginning around 2021, I started applying LLMs and reinforcement learning to some of my datasets to discover new sources of alpha. Early results are promising: an ML-driven self-discovery of profitable trading strategies designed to beat a benchmark and place trades autonomously (as an example, my personal portfolio is up 6.09% YTD vs. 5.45% YTD for the S&P 500). However, the public LLMs I am using hit context-length constraints that limit their reasoning capacity when creating the dense vectors my proprietary classification models use to analyze the markets and recommend positions. I believe that significantly more data and domain knowledge are required to discover the “holy grail” signals hiding in the noise, plus adequate money management. Perhaps a variation of what I am attempting is the secret sauce that Jim Simons and Peter Brown at RenTec have been using for decades? But what is out there that can analyze a very large number of tokens, so the system does not hit the constraints I am experiencing?

Enter Gemini-1.5 and its astounding 10-million-token context potential. Finally, an LLM architecture that can ingest entire textbooks, white papers, multiple SEC sector filings simultaneously, firehoses of news, alternative data feeds, and more to deeply grasp concepts before generating trading signals or answering strategy questions. With greater than 99.7% recall, this is simply mind-blowing. The bad news is that it is not open source and, for now, has only a very limited user base.
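The dense-vector idea above can be sketched as a toy text-to-features pipeline. Here a hashing vectorizer stands in for an LLM embedding endpoint, and the documents and labels are invented for illustration; the real pipeline would embed filings and news with a large-context model:

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import LogisticRegression

# Toy corpus: short snippets mimicking earnings commentary, with made-up labels.
docs = [
    "Guidance raised on strong demand; margins expanding.",
    "Revenue miss; management cuts full-year outlook.",
    "Results in line with consensus; no change to guidance.",
] * 50
labels = [1, 2, 0] * 50   # 1=bullish, 2=bearish, 0=neutral (toy scheme)

# HashingVectorizer maps each document to a fixed-width feature vector;
# an LLM embedding would play this role in the real system.
embed = HashingVectorizer(n_features=256, alternate_sign=False)
X = embed.transform(docs)

# Downstream classifier consumes the vectors and recommends a stance.
clf = LogisticRegression(max_iter=1000).fit(X, labels)
pred = clf.predict(embed.transform(["Outlook cut sharply on weak demand."]))
```

The point of a 10-million-token context is that the "embed" step could see entire filings and whole news streams at once, instead of the short snippets forced by today's smaller windows.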

And this happens on the same day I got approval from Anthropic to be one of the beta testers of their closed, large-context model for integration with code for commercial use. The space is moving very fast, and a great many traditional-finance quants are lagging behind.

I believe that the smart design of ML models, coupled with the Markov processes implicit in RNNs and in LLMs with large context windows, plus a sprinkle of RL and generative AI, is the key to the next “Medallion Fund”, albeit with the constraint that the fund's AUM would need a cap, probably smaller than Medallion's. With a combination of human domain expertise and an AI acting like an orchestra conductor, the next generation of hedge funds is just about to start. If this interests you, please stay tuned for my upcoming paper with Columbia University professor and D.E. Shaw alumnus Kosrow Dehnad, titled “LLM Lexicon: Parsing the Language of Financial Markets”.

Of course, real-world testing is needed before committing clients' capital in a money-management operation. But by augmenting (not replacing) the intuition of human traders and risk managers with AI, I envision incremental breakthroughs, some of which are being developed by small R&D teams like Koz’s and mine.

Another super interesting development is OpenAI's SORA model. This and other generative models will unlock new creative possibilities for commercial film production and for financing its distribution. As someone involved in some aspects of film financing and ILS back in my days at Deutsche Bank, Lehman Brothers, and SAGA Capital, I had the opportunity to present some of my ML-driven models for uncorrelated asset classes to the New York Society of Securities Analysts back in 2010. SORA in particular is one of those tech developments that, a few years (or maybe a few months) down the road, has the potential to disrupt some important aspects of filmmaking.

Possible breakthroughs envisioned:

  • Alpha discovery from linking obscure, diffuse datasets across text, numeric, code, audio, and video modalities.
  • Automated signal decomposition, simulation, and combinatorial optimization at scale.
  • Natural-language generation of model documentation, explaining complex findings to humans with mathematical and market rigor.
  • Generative AI for stress-testing scenarios and for creating monetizable IP assets.

The future of AI holds tremendous promise and peril across sectors like finance and media. I welcome your perspectives - please share any concerns or opportunities you see as well!