COMMERCIAL AND INDUSTRIAL MATHEMATICS

THE FIELDS INSTITUTE FOR RESEARCH IN MATHEMATICAL SCIENCES

2015-2016
Fields Quantitative Finance Seminar

held at the Fields Institute, 222 College St., Toronto

The Quantitative Finance Seminar has been a centerpiece of the Commercial/Industrial program at the Fields Institute since 1995. Its mandate is to arrange talks on current research in quantitative finance that will be of interest to those who work on the border of industry and academia. Wide participation has been the norm with representation from mathematics, statistics, computer science, economics, econometrics, finance and operations research. Topics have included derivatives valuation, credit risk, insurance and portfolio optimization. Talks occur on the last Wednesday of every month throughout the academic year and start at 5 pm. Each seminar is organized around a single theme with two 45-minute talks and a half-hour reception. There is no cost to attend these seminars and everyone is welcome.

To be informed of speakers and titles for upcoming seminars and financial mathematics activities, please subscribe to the Fields mailing list.

Upcoming Talks 2015-2016
Talks streamed live at FieldsLive

February 24, 2016

Room 230

Talk 1: 5:00pm
Talk 2: 6:15pm

Phelim Boyle (Wilfrid Laurier University)
Long-only portfolios and the Perron-Frobenius theorem

The first principal component of stock returns is often identified with a market factor. Empirical portfolios based on this principal component sometimes contain short positions. These portfolios are based on the dominant eigenvector of the correlation matrix. We analyze empirically how stock return correlations affect the signs of the dominant eigenvector. If all the correlations are positive, the dominant eigenvector is positive and the portfolio has positive weights; this follows from the Perron-Frobenius theorem. In practice, some of the correlations are negative, and in this case the dominant eigenvector may be positive or it may contain negative values. We analyze the characteristics of the correlation matrix that lead to negative weights in the dominant eigenvector and show that this is driven by a few stocks, whose characteristics we document. We also explore the relationship more generally and obtain a few analytic results.
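
As a quick numerical illustration of the Perron-Frobenius point above, the following Python sketch computes the dominant eigenvector of two small correlation matrices; the matrices are made up for illustration, not data from the talk.

    import numpy as np

    def dominant_eigvec(corr):
        # Eigenvector of the largest eigenvalue; eigenvectors are only
        # defined up to sign, so make the largest-magnitude entry positive.
        vals, vecs = np.linalg.eigh(corr)          # eigh: corr is symmetric
        v = vecs[:, np.argmax(vals)]
        return v if v[np.argmax(np.abs(v))] > 0 else -v

    # All correlations positive: Perron-Frobenius guarantees a strictly
    # positive dominant eigenvector, i.e. a long-only portfolio.
    pos = np.array([[1.0, 0.3, 0.5],
                    [0.3, 1.0, 0.2],
                    [0.5, 0.2, 1.0]])
    print(dominant_eigvec(pos))     # all weights positive

    # One strongly negative correlation: the guarantee no longer applies,
    # and here the dominant eigenvector contains a negative weight.
    mixed = np.array([[ 1.0, 0.3, -0.6],
                      [ 0.3, 1.0,  0.2],
                      [-0.6, 0.2,  1.0]])
    print(dominant_eigvec(mixed))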

Jim Gatheral (Baruch College, CUNY)
Rough Volatility

Starting from the observation that increments of the log-realized-volatility possess a remarkably simple scaling property, we show that log-volatility behaves essentially as a fractional Brownian motion with Hurst exponent H of order 0.1, at any reasonable time scale. The resulting Rough Fractional Stochastic Volatility (RFSV) model is remarkably consistent with financial time series data. We then show how the RFSV model can be used to price claims on both the underlying and integrated volatility. We analyze in detail a simple case of this model, the rBergomi model. In particular, we find that the rBergomi model fits the SPX volatility surface markedly better than conventional Markovian stochastic volatility models, and with fewer parameters. Finally, we show that actual SPX variance swap curves seem to be consistent with model forecasts, with particularly dramatic examples from the weekend of the collapse of Lehman Brothers and the Flash Crash.

This is joint work with Andrew Lesniewski, Christian Bayer, Peter Friz, Thibault Jaisson, and Mathieu Rosenbaum.
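
A minimal sketch of the scaling estimate behind the talk: if log-volatility increments satisfy E|log v(t+D) - log v(t)|^q ~ D^(qH), then the slope of log m(q, D) against log D estimates qH. The input series below is a synthetic placeholder (an ordinary random walk, so the estimate comes out near 0.5), not the realized-volatility data of the talk, on which Gatheral et al. find H of order 0.1.

    import numpy as np

    rng = np.random.default_rng(0)
    log_vol = np.cumsum(0.3 * rng.standard_normal(5000))   # placeholder series

    def moment(x, lag, q):
        # Empirical q-th absolute moment of increments at the given lag.
        return np.mean(np.abs(x[lag:] - x[:-lag]) ** q)

    q = 2.0
    lags = np.arange(1, 50)
    m = [moment(log_vol, lag, q) for lag in lags]
    slope = np.polyfit(np.log(lags), np.log(m), 1)[0]      # ~ q * H
    print("estimated H:", slope / q)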

February 3, 2016

Stewart Library

Talk 1: 5:00pm
Talk 2: 6:15pm

Co-Pierre Georg (University of Cape Town and Deutsche Bundesbank)
Contagion in Coupled Financial Networks

We develop a model of the financial system in which financial intermediaries comprise business units specialized in trading different types of assets. Assets are intermediated from sellers to buyers via exogenously fixed trading networks. The novelty of our model is that we allow intra-institutional spill-overs: the failure of one business unit exerts an externality on other business units of the same bank and couples the trading networks for different assets. We study the resilience of such a system to exogenous random shocks. When there is only one type of asset, the transition from a regime in which all banks intermediate to a regime in which intermediation breaks down is continuous in the size of the exogenous shock. When there are multiple types of assets, however, this breakdown of intermediation not only occurs at smaller shock sizes, it is abrupt. The abrupt breakdown of intermediation is weaker when trading networks are correlated. If, however, an uncorrelated trading network is coupled with multiple coupled and correlated trading networks, the abrupt breakdown of intermediation occurs for even smaller shock sizes.
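
The following toy cascade conveys the flavour of the single-asset case described above; the random network, shock rule, and failure threshold are illustrative assumptions, not the paper's specification.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100
    # Random undirected trading network: each pair linked with probability 0.05.
    adj = (rng.random((n, n)) < 0.05).astype(int)
    adj = adj | adj.T
    np.fill_diagonal(adj, 0)

    failed = rng.random(n) < 0.10      # exogenous shock hits ~10% of banks

    # Propagate: a bank stops intermediating once two or more of its
    # counterparties have failed (an assumed threshold rule).
    changed = True
    while changed:
        failed_counterparties = adj @ failed.astype(int)
        newly = (~failed) & (failed_counterparties >= 2)
        failed = failed | newly
        changed = bool(newly.any())

    print("fraction no longer intermediating:", failed.mean())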

Sebastian Jaimungal (University of Toronto)
Statistical Arbitrage using Order Book Signals

Statistical arbitrage trading strategies allow agents to generate profits by taking advantage of (typically short-lived) predictability in the direction of prices or other state variables. In this talk, we will introduce two classes of such strategies that incorporate very different kinds of information from the limit order book (LOB).

In the first part of the talk, we develop a trading strategy that employs limit and market orders in a structurally dependent multi-asset economy, e.g., options, futures and stocks. To model the structural dependence, the midprice processes follow a multivariate reflected Brownian motion on the closure of a no-arbitrage region which is dictated by the assets' bid-ask spreads. We pose and solve a stochastic optimal control problem for an investor who takes positions in these assets and we will explore the key features of the resulting strategies and their simulated profit and loss.
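
As a one-dimensional sketch of one ingredient of this model, the following simulates a Brownian motion reflected to stay inside an interval, standing in for the no-arbitrage band implied by bid-ask spreads; the band edges and volatility are illustrative, and the talk's model is multivariate.

    import numpy as np

    rng = np.random.default_rng(2)
    lo, hi = -0.05, 0.05               # stand-in for the no-arbitrage band
    sigma, n = 0.002, 10_000

    x = np.empty(n)
    x[0] = 0.0
    for t in range(1, n):
        y = x[t - 1] + sigma * rng.standard_normal()
        # Reflect at the boundaries (loop in case a step overshoots twice).
        while y < lo or y > hi:
            y = 2 * lo - y if y < lo else 2 * hi - y
        x[t] = y

    print("path stays in band:", lo <= x.min() and x.max() <= hi)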

In the second part of the talk, we use high-frequency data from the Nasdaq exchange to build a measure of volume order imbalance in the LOB. We show that our measure is a good predictor of the sign of the next market order (MO), i.e., buy or sell, and also helps to predict price changes immediately after the arrival of an MO. Based on these empirical findings, we introduce and calibrate a Markov chain modulated pure jump model of price, spread, limit order (LO) and MO arrivals, and order imbalance. As an application of the model, we pose and solve a stochastic control problem for an agent who maximizes terminal wealth, subject to inventory penalties, by executing roundtrip trades using LOs. We demonstrate the efficacy of the model and the optimal control problem by calibrating the model and testing its performance on out-of-sample data. We show that introducing our volume imbalance measure into the optimization problem considerably boosts the profits of the strategy.
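
A common form of such a measure, shown here as a sketch, is the best-quote volume imbalance rho = (V_bid - V_ask) / (V_bid + V_ask), which lies in [-1, 1]; the numbers below are made up, and the talk's exact definition may differ (e.g. depth-weighted).

    import numpy as np

    bid_vol = np.array([500.0, 800.0, 300.0, 900.0])   # resting volume at best bid
    ask_vol = np.array([700.0, 200.0, 300.0, 100.0])   # resting volume at best ask

    imbalance = (bid_vol - ask_vol) / (bid_vol + ask_vol)
    print(imbalance)   # values near +1 suggest buy pressure, near -1 sell pressure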

[ This talk is based on joint work with Álvaro Cartea, Ryan Donnelly and Jason Ricci: Enhancing Trading Strategies using Order Book Signals (http://ssrn.com/abstract=2668277) and Trading Strategies within the Edges of No-Arbitrage (http://ssrn.com/abstract=2664567) ]

November 25, 2015

Room 230

Talk 1: 5:00pm
Talk 2: 6:15pm

Dan Rosen (S&P Capital IQ)
Re-Thinking Scenarios: Stress Testing of Multi-Asset Portfolios by Integrating Economic Scenarios with Advanced Simulation Analytics

Scenarios are the language of Risk. While scenario analysis and stress testing have been an explicit part of risk management methodologies and systems for over two decades, the typical scenario and stress testing tools haven’t evolved much and are still generally quite static and largely subjective. In this talk, we present a simple and powerful approach to create meaningful stress scenarios for risk management and investment analysis of multi-asset portfolios, which effectively combines economic forecasts and “expert” views with portfolio simulation methods.

Expert scenarios are typically described in terms of a small number of key economic variables or factors. However, when applied to a portfolio, they are incomplete: they generally do not describe what occurs to all relevant market risk factors that affect the portfolio. We need to understand how these market risk factors behave, conditional on the outcome of the economic factors. The key insight of our approach is that the conditional expectation, and more generally the full conditional distribution of all the factors, and of the portfolio P&L, can be estimated directly from a pre-computed simulation using least squares regression. We refer to this approach as Least Squares Stress Testing (LSST). LSST is a simulation-based conditional scenario generation method that offers many advantages over more traditional analytical methods. Simulation techniques are simple, flexible, and provide very transparent results, which are auditable and easy to explain. LSST can be applied to both market and credit risk stress testing with a large number of risk factors, which can follow completely general stochastic processes, with fat tails, non-parametric and general codependence structures, autocorrelation, etc. LSST further produces explicit risk factor P&L contributions. From a methodology perspective, we also discuss some of the assumptions of the LSST approach, statistical tests to check when these assumptions fail, and remedies that can be applied.
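
A minimal sketch of the LSST idea under simplifying assumptions: given a pre-computed joint simulation of an economic factor and portfolio P&L, the conditional expected P&L at a stressed factor level is read off a least squares regression; the data-generating process below is a stand-in, not a real portfolio.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    factor = rng.standard_normal(n)    # pre-simulated economic factor
    pnl = -3.0 * factor + 0.5 * factor**2 + rng.standard_normal(n)

    # The least squares step: regress P&L on a polynomial basis in the factor.
    X = np.column_stack([np.ones(n), factor, factor**2])
    beta, *_ = np.linalg.lstsq(X, pnl, rcond=None)

    stress = -3.0                      # e.g. a severe-downturn scenario level
    cond_pnl = beta @ np.array([1.0, stress, stress**2])
    print("estimated E[P&L | factor = -3]:", cond_pnl)   # close to 13.5 here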

Finally, we illustrate the application of the methodology through the analysis of the performance of a real-life portfolio under scenarios from a recent economic research report as well as regulatory scenarios.

(joint work with David Saunders, University of Waterloo)

Ron Dembo (Zerofootprint)
Know your environment

There is a big gap between the way banks calculate and report on regulator-driven capital requirements and the way in which they actually manage capital. Regulations have become so onerous that some banks spend billions simply to manage reporting and compliance. Fundamentally, the way banks are regulated is counter-productive. We need a simpler approach that will bring regulatory and management capital closer. Data technology has now reached a tipping point: it is possible for a bank's overall risk-adjusted returns to be calculated and aggregated in real time, and we can analyze the entire bank on our desktop in almost real time. It would clearly be more productive, and arguably better for banks, to spend money on their data infrastructure and processing rather than on massive simulations of more and more questionable "sophisticated" models. This talk is about how this can be achieved.

October 28, 2015

Room 230

Talk 1: 5:00pm
Talk 2: 6:15pm

Alexander Lipton (Bank of America and Oxford University)
Modern monetary circuit theory, stability of the interconnected banking network, and balance sheet optimization for individual banks

A modern version of Monetary Circuit Theory, with a particular emphasis on stochastic underpinning mechanisms, is developed. Existing theories of money creation are compared and contrasted. It is explained how money is created by the banking system as a whole and by individual banks. The role of central banks as system stabilizers and liquidity providers is elucidated. It is shown that in the process of money creation banks become naturally interconnected. A novel Extended Structural Default Model describing the stability of the interconnected banking network is proposed. The purpose of banks’ capital and liquidity is explained. A multi-period constrained optimization problem for banks’ balance sheets is formulated and solved in a simple case. Both theoretical and practical aspects are covered.
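
As a toy double-entry illustration of the money-creation point (account names and figures are hypothetical): when a bank extends a loan, it books the loan as an asset and credits the borrower's deposit as a liability, so deposits, i.e. broad money, grow without any prior deposit being needed.

    # Toy balance sheet; account names and figures are illustrative only.
    bank = {"assets": {"reserves": 100.0, "loans": 0.0},
            "liabilities": {"deposits": 100.0}}

    def make_loan(bank, amount):
        # Double entry: a new loan asset and a matching deposit liability.
        bank["assets"]["loans"] += amount
        bank["liabilities"]["deposits"] += amount

    make_loan(bank, 50.0)
    print("deposits (broad money):", bank["liabilities"]["deposits"])   # 150.0
    assert sum(bank["assets"].values()) == sum(bank["liabilities"].values())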

John Hull (Joseph L. Rotman School of Management, University of Toronto)
Optimal Delta Hedging

The “practitioner Black-Scholes delta” for hedging equity options is a delta calculated from the Black-Scholes-Merton model with the volatility parameter set equal to the implied volatility. As has been pointed out by a number of researchers, this delta does not minimize the variance of a trader’s position. This is because there is a negative correlation between equity price movements and implied volatility movements. The minimum variance delta takes account of both the impact of price changes and the impact of the expected change in implied volatility conditional on a price change. In this paper, we use ten years of data on options on stock indices and individual stocks to investigate the relationship between the Black-Scholes delta and the minimum variance delta. Our approach is different from earlier research in that it is empirically based. It does not require a stochastic volatility model to be specified.

Paper: Optimal Delta Hedging for Equity Options (joint work with Alan White).
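
A sketch of the minimum variance delta idea under a simple assumed model: delta_MV = delta_BS + vega_BS * dE[d_sigma]/dS, where the sensitivity of expected implied-volatility changes to price changes is estimated by regression; the synthetic data and the linear specification below are illustrative, not the paper's fitted model.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 2000
    dS = rng.standard_normal(n)                    # daily price changes
    # Synthetic leverage effect: implied vol tends to fall when price rises.
    d_sigma = -0.0004 * dS + 0.0001 * rng.standard_normal(n)

    # Estimate dE[d_sigma]/dS by least squares regression.
    slope = np.polyfit(dS, d_sigma, 1)[0]

    delta_bs, vega_bs = 0.55, 25.0                 # illustrative BS Greeks
    delta_mv = delta_bs + vega_bs * slope
    print("BS delta:", delta_bs, "MV delta:", round(delta_mv, 4))
    # The negative correlation pushes the MV delta below the BS delta.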
