Thematic Program on Quantitative Finance:
Foundations and Applications
Visitor Seminars in the Quantitative Finance Program

PAST SEMINARS
Tuesday February 2, 2010

1:30 pm - 2:15 pm: Hao Xing, PhD (Michigan)
2:15 pm - 3:00 pm: Klaas Schulze, PhD (Bonn)
Tuesday February 9, 2010

1:30 pm - 2:15 pm
Dr. Xianhua Peng, Fields Ontario Postdoctoral Fellow (Fields/York)
What Is a Good External Risk Measure: Bridging the Gaps between Robustness, Subadditivity, and Insurance Risk Measures
The recently revised Basel II risk measure for market risk, which uses Value-at-Risk (VaR) with scenario analysis, does not belong to any existing theoretical framework of risk measures proposed in the academic literature. We propose new data-based risk measures, called natural risk statistics, by postulating a new set of axioms that require subadditivity only for comonotonic random losses. Natural risk statistics include (i) the tail conditional median, which is more robust than the tail conditional expectation suggested by coherent risk measures, and (ii) VaR with scenario analysis, in particular the current and recently revised Basel II risk measures, as special cases. Hence, natural risk statistics provide an axiomatic justification for the Basel II risk measures. In addition, we emphasize that risk measures used for external regulation should be robust with respect to model misspecification and small changes in the data. We show that coherent risk measures are generally not robust with respect to the data, and insurance risk measures are generally not robust enough with respect to model misspecification.
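To make the robustness point concrete, here is a minimal Python sketch (my own illustration, not material from the talk; the Student-t loss sample, the 95% level, and the single injected outlier are assumptions) contrasting the tail conditional expectation with the tail conditional median:

# Contrast the tail conditional expectation (mean beyond VaR) with the tail
# conditional median (median beyond VaR): one gross outlier or data error moves
# the mean-based measure far more than the median-based one.
import numpy as np

def tail_conditional_expectation(losses, alpha=0.95):
    var = np.quantile(losses, alpha)          # empirical VaR at level alpha
    return losses[losses >= var].mean()

def tail_conditional_median(losses, alpha=0.95):
    var = np.quantile(losses, alpha)
    return np.median(losses[losses >= var])

rng = np.random.default_rng(0)
losses = rng.standard_t(df=3, size=10_000)    # heavy-tailed loss sample
contaminated = np.append(losses, 1e3)         # one gross outlier / data error

for name, sample in (("clean", losses), ("contaminated", contaminated)):
    print(name, tail_conditional_expectation(sample), tail_conditional_median(sample))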
2:15 pm - 3:00 pm
We introduce a completely new way of performing Monte Carlo simulation, using the Wiener-Hopf factorization, for functionals of a Lévy process involving the joint law of the process and its maximum at a fixed time. The method is relevant to the numerical pricing of barrier options. This is based on joint work with Juan Carlos Pardo and Kees van Schaik.
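A minimal sketch of the flavor of such a Wiener-Hopf simulation, written only for Brownian motion with drift, where both Wiener-Hopf factors are explicit exponential laws (the parameter values, the Erlang approximation of the fixed time, and the barrier level are my own assumptions; this is not the Lévy-process generality of the talk):

# Wiener-Hopf-type Monte Carlo for the pair (X_t, sup_{s<=t} X_s), sketched for
# X_t = mu*t + sigma*W_t: over an independent Exp(q) horizon the running supremum
# is Exp(phi_plus)-distributed and, independently, the drop from the supremum to
# the endpoint is Exp(phi_minus)-distributed; the fixed time t is approximated by
# an Erlang time, i.e. a sum of n_steps independent Exp(q) pieces with q = n_steps/t.
import numpy as np
from math import erf, exp, sqrt

mu, sigma, t, n_steps, n_paths = 0.1, 0.3, 1.0, 200, 100_000
q = n_steps / t
root = sqrt(mu**2 + 2.0 * q * sigma**2)
phi_plus = (-mu + root) / sigma**2          # rate of the supremum over an Exp(q) time
phi_minus = (mu + root) / sigma**2          # rate of the drop back from the supremum

rng = np.random.default_rng(1)
value = np.zeros(n_paths)
running_max = np.zeros(n_paths)
for _ in range(n_steps):
    up = rng.exponential(1.0 / phi_plus, n_paths)
    down = rng.exponential(1.0 / phi_minus, n_paths)
    running_max = np.maximum(running_max, value + up)
    value += up - down

b = 0.5
estimate = (running_max <= b).mean()        # P(sup_{s<=t} X_s <= b)

# Reflection-principle benchmark for drifted Brownian motion.
Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
exact = Phi((b - mu * t) / (sigma * sqrt(t))) - exp(2.0 * mu * b / sigma**2) * Phi((-b - mu * t) / (sigma * sqrt(t)))
print(estimate, exact)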
Tuesday February 16, 2010
1:30 pm
Dr. Vladimir Surkov (Fields/University of Western Ontario)
Efficient Fourier Transform-Based Pricing of Interest Rate Derivatives
In this article, we introduce a new and efficient numerical method, called Fourier Space Time-stepping (irFST), for the valuation of general interest rate contingent claims. While a wide array of models for the instantaneous short rate exist in the literature, we focus on the Vasicek and Hull-White models, and their multi-factor and jump extensions. Even though closed-form solutions exist (for models without jumps) for zero-coupon bond and swap options, the motivation for the irFST method is to efficiently price exotic, path-dependent, and early-exercise options. Recently, Fourier transform-based methods have begun to gain traction in the area of pricing interest rate derivatives. The irFST method is based on the mean-reverting Fourier Space Time-stepping algorithm of Jaimungal and Surkov (2008), but is tailored to the interest rate setting. We also discuss an extension of the irFST method to efficiently compute option Greeks (sensitivities to changes in state variables or model parameters). Finally, we demonstrate the precision and flexibility of our numerical method through various examples.
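To indicate what a Fourier Space Time-stepping step looks like in practice, here is a minimal sketch for a plain European call under geometric Brownian motion, not the mean-reverting interest rate setting of the talk (grid sizes, parameters, and the model choice are illustrative assumptions):

# One backward step of Fourier Space Time-stepping (FST) in the log-price x:
# transform the terminal payoff, multiply by exp((Psi(w) - r) * dt), where Psi
# is the characteristic exponent of the log-price increments, and transform back.
import numpy as np
from math import erf, exp, log, sqrt

S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0

N, L = 2**12, 7.5                              # grid points, log-price half-width
x = np.linspace(-L, L, N, endpoint=False)      # log(S/S0) grid
dx = x[1] - x[0]
w = 2.0 * np.pi * np.fft.fftfreq(N, d=dx)      # angular frequencies

payoff = np.maximum(S0 * np.exp(x) - K, 0.0)   # call payoff on the grid

# Characteristic exponent of x_t = (r - sigma^2/2) t + sigma W_t.
psi = 1j * (r - 0.5 * sigma**2) * w - 0.5 * sigma**2 * w**2

value = np.fft.ifft(np.fft.fft(payoff) * np.exp((psi - r) * T)).real
fst_price = value[np.argmin(np.abs(x))]        # value at x = 0, i.e. S = S0

# Black-Scholes benchmark.
Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
d2 = d1 - sigma * sqrt(T)
bs_price = S0 * Phi(d1) - K * exp(-r * T) * Phi(d2)
print(fst_price, bs_price)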
Tuesday February 23, 2010
1:30 pm
Dr. Vladimir Vinogradov (Ohio University)
Stochastic Models for Movements of Equities which Employ the Power-Variance Family
The first group of models pertains to a class of Lévy processes generated starting from the power-variance family of probability laws. For such models, we establish the exact asymptotics of the probabilities of large deviations, the distribution of the first passage time, and the rational price of the European call option. A subsequent application of the techniques of stochastic exponentials leads to an additional class of Lévy processes. The ordinary exponentials of its members constitute the geometric Lévy processes which we utilize for describing chaotic movements of equities. Namely, we consider a self-financing portfolio comprised of one bond and k equities, assuming that the returns on all k equities belong to the latter class. We demonstrate that for a particular choice of constant portfolio weights, the combined movement of the k equities is governed by a geometric Lévy process which belongs to the same class. The Merton-type allocation of constant weights, which we implement, coincides with that of fund managers. Although simpler, in the discontinuous case this approach is less profitable than portfolio weight selection that maximizes the expected logarithmic utility. In a special case, we establish a converse of Merton's mutual fund theorem. We derive Pythagorean-type theorems for Sharpe portfolio performance measures, emphasizing their relation to Merton-type weights and the additivity of the shape parameter.
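For reference (an addition of mine, not part of the abstract): the power-variance family, also known as the Tweedie family of exponential-dispersion models, is characterized by a variance that is a power function of the mean,

\operatorname{Var}(Y) = \phi\,\mu^{p}, \qquad \mu = \mathbb{E}[Y],\ \phi > 0,

with p = 0, 1, 2, 3 recovering the Gaussian, Poisson, gamma, and inverse Gaussian laws, respectively.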
Tuesday March 2, 2010
1:30 pm
Kyoung-Kuk Kim, PhD (Columbia)
We study a class of generalized Riccati differential equations associated with affine diffusion processes. These diffusions arise in financial econometrics as candidate models for asset price dynamics. The generalized Riccati equations determine the Fourier transform of the diffusions' transition law. We investigate stable regions of the dynamical systems and analyze their blow-up times. We discuss the implications of applying these results to affine diffusions and, in particular, to option pricing theory.
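As a concrete one-dimensional instance (my addition, not taken from the abstract), for the square-root diffusion dX_t = \kappa(\theta - X_t)\,dt + \sigma\sqrt{X_t}\,dW_t the transform \mathbb{E}[e^{u X_T} \mid X_t = x] = \exp(\varphi(\tau) + \psi(\tau)\,x), with \tau = T - t, is governed by the generalized Riccati system

\psi'(\tau) = -\kappa\,\psi(\tau) + \tfrac{\sigma^{2}}{2}\,\psi(\tau)^{2}, \qquad \varphi'(\tau) = \kappa\theta\,\psi(\tau), \qquad \psi(0) = u,\ \varphi(0) = 0,

and the blow-up time of \psi for real u > 0 determines which exponential moments of X_T exist, which is the kind of stability and blow-up question studied in the talk.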
Tuesday March 9, 2010
1:30 pm
Rudra Jena (Centre de Mathématiques Appliquées, Ecole Polytechnique)
Given a set of assumptions on the real-world dynamics of the underlying, the European options on this underlying are not efficiently priced in
Monday March 15, 2010
Prof. Tom Salisbury (York University)
Insurance and equity guarantees
I'll survey some recent work on variable annuities, and the kind of equity guarantees insurance companies have been building into retirement savings products. We'll look at hedging issues, from the point of view of the issuer. The challenge here is that the guarantees involve long-dated embedded options, and blend mortality risk with equity risk. We'll also consider some of the issues clients have in optimally managing portfolios that include these products. My work in this area is joint with Huaxiong Huang and Moshe Milevsky, both of York University.
Monday March 15, 2010
Dr. Fouad Marri (Fields Institute/York University)
Pricing compound Poisson processes with the Farlie-Gumbel-Morgenstern dependence structure
In this paper we consider an extension of the classical compound Poisson risk model. Historically, it has been assumed that the loss amounts and claim inter-arrival times are independent. In this contribution, a dependence structure between the claim amount and the interclaim time is introduced through a Farlie-Gumbel-Morgenstern copula. In this framework, the moment generating function of the aforementioned dependent processes is derived and studied. The moments of the compound Poisson risk model are also derived. Various implications of the dependence are discussed and exemplified numerically. This is joint work with Edward Furman (York University).
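For readers unfamiliar with it (an addition of mine, not part of the abstract), the Farlie-Gumbel-Morgenstern copula used here is the one-parameter family

C_\theta(u, v) = u\,v\,\bigl[1 + \theta\,(1-u)(1-v)\bigr], \qquad u, v \in [0,1],\ \theta \in [-1, 1],

which permits only moderate dependence (Spearman's rho equals \theta/3) but keeps quantities such as the moment generating function analytically tractable.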
Tuesday April 6, 2010
1:30 pm - 2:15 pm
Alexey Kuznetsov (York University)
We discuss the problem of computing the marginal and joint distributions of such functionals of a Lévy process as the first passage time, overshoot,
Tuesday April 13, 2010
1:30 pm
Arash Fahim
We introduce a probabilistic numerical scheme for fully nonlinear PDEs, and show that it can be introduced naturally as a combination of Monte Carlo and finite difference schemes without appealing to the theory of backward stochastic differential equations. We mention the convergence of the discrete-time approximation and derive a bound on the discretization error in terms of the time step. An explicit implementable scheme requires approximating the conditional expectation operators involved in the discretization. This induces a further Monte Carlo error. We also mention the result which proves the convergence of the latter approximation scheme and derives an upper bound on the approximation error. Numerical experiments are performed for the approximation of the solution of the mean curvature flow equation in dimensions two and three, and for two- and five-dimensional (plus time) fully nonlinear Hamilton-Jacobi-Bellman equations arising in the theory of portfolio optimization in financial mathematics. We also provide a generalization to nonlocal fully nonlinear PDEs (integro-differential PDEs).
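To indicate how Monte Carlo and finite differences can combine in such a scheme (a hedged sketch of the general idea for a one-dimensional equation \partial_t v + F(x, \partial_x v, \partial_{xx} v) = 0, with a Gaussian reference step of my own choosing; the talk's scheme may differ in detail), one steps backward in time via

v(t_i, x) \approx \mathbb{E}\bigl[v(t_{i+1}, \hat X)\bigr] + \Delta t\,\Bigl(F(x, \hat D_1, \hat D_2) - \tfrac{1}{2}\sigma^{2}\hat D_2\Bigr), \qquad \hat X = x + \sigma\,\Delta W,

where the spatial derivatives are estimated by Monte Carlo using integration-by-parts weights,

\hat D_1 = \mathbb{E}\Bigl[v(t_{i+1}, \hat X)\,\tfrac{\Delta W}{\sigma\,\Delta t}\Bigr], \qquad \hat D_2 = \mathbb{E}\Bigl[v(t_{i+1}, \hat X)\,\tfrac{\Delta W^{2} - \Delta t}{\sigma^{2}\,\Delta t^{2}}\Bigr],

so that the conditional expectations carry the linear (heat-kernel) part and only the remaining nonlinearity is treated explicitly.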
Tuesday April 20, 2010
1:30 pm - 2:15 pm
Xianhua Peng (Fields/York University)
The recent financial turmoil has witnessed the powerful impact of the default clustering effect (i.e., one default event tends to trigger more default events in the future and cross-sectionally), especially on the market of collateralized debt obligations (CDOs). We propose a model based on cumulative default intensities that can incorporate the default clustering effect. Furthermore, the model is tractable enough to provide a direct link between single-name credit securities, such as credit default swaps (CDS), and multi-name credit securities, such as CDOs. The results of calibration to recent market data, from the period when Bear Stearns, Lehman Brothers, etc. collapsed and default correlation among firms was substantially elevated, show that the model is promising.
Tuesday April 27, 2010
1:30 pm - 2:15 pm
Dr. Anke Wiese (Heriot-Watt University)
We study solutions to nonlinear stochastic differential systems driven by a multi-dimensional Wiener process. A useful algorithm for strongly simulating such stochastic systems is the Castell-Gaines method, which is based on the exponential Lie series. When the diffusion vector fields commute, it has been proved that at low orders this method is more accurate in the mean-square error than corresponding stochastic Taylor methods. However, it has also been shown that when the diffusion vector fields do not commute, this is not true for strong order one methods. Here we prove that when the diffusion vector fields do not commute, the exponential Lie series is usurped by the sinh-log series. In other words, the mean-square error associated with a numerical method based on the sinh-log series is always smaller than the corresponding stochastic Taylor error, in fact to all orders. Our proof utilizes the underlying Hopf algebra structure of these series, and a two-alphabet associative algebra of shuffle and concatenation operations. We illustrate the benefits of the proposed series in numerical studies. Joint work with S.J. Malham, Heriot-Watt University.
Tuesday May 4, 2010
1:30 pm - 2:15 pm
Eddie Ng
The field of time-series analysis has made important contributions to a wide spectrum of applications such as tide-level studies in hydrology, natural resource prospecting in geostatistics, speech recognition, weather forecasting, financial trading, and economic forecasting and analysis. Nevertheless, the analysis of the non-Gaussian and non-stationary features of time series remains challenging for current state-of-the-art models. This work proposes an innovative framework which leverages the theory of copulas, combined with a probabilistic framework from the machine learning community, to produce a versatile tool for multiple time-series analysis. I have coined this new model Kernel-based Copula Processes (KCPs). Under the proposed framework, various idiosyncrasies can be modeled parsimoniously via a kernel function for each individual time series, and long-range dependency can be captured by a copula function. The copula function separates the marginal behavior and serial dependency structures, thus allowing them to be modeled separately and with much greater flexibility. Moreover, the codependent structure of a large number of time series with potentially vastly different characteristics can be captured in a compact and elegant fashion through the notion of a binding copula. This feature allows a highly heterogeneous model to be built, breaking free from the homogeneity limitation of most conventional models. The KCPs have demonstrated superior predictive power when used to forecast a multitude of data sets from the meteorological and financial areas. Finally, the versatility of the KCP model is exemplified by its successful application, unaltered, to non-trivial classification problems.
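To illustrate the general point that a copula separates marginal behavior from the dependence structure (a generic sketch of the standard rank/probability-integral-transform step, not an implementation of KCPs; the two simulated series and the Gaussian-copula view are my own assumptions):

# Two series with very different marginals that share the same underlying
# dependence. Ranking each series maps it to an (approximately) uniform marginal;
# a Gaussian quantile transform then exposes the dependence on a common scale,
# independently of the marginals.
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(0)
n = 5_000
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=n)

x = np.exp(z[:, 0])                  # lognormal marginal
y = np.sign(z[:, 1]) * z[:, 1] ** 2  # asymmetric, heavier-tailed marginal

u = rankdata(x) / (n + 1)            # probability integral transform via ranks
v = rankdata(y) / (n + 1)
gx, gy = norm.ppf(u), norm.ppf(v)

print(np.corrcoef(x, y)[0, 1])       # linear correlation, distorted by the marginals
print(np.corrcoef(gx, gy)[0, 1])     # recovers ~0.7, the copula-level dependence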
Tuesday May 11, 2010
1:30 pm - 2:15 pm
Prof. Luis Seco (University of Toronto)
The events of 2007 and 2008 show that correlation regimes have a far-reaching impact in finance; I will present some mathematical considerations around this phenomenon and discuss some of its applications.
Tuesday June 1, 2010
1:30 pm - 2:15 pm
Pavel Gapeev (London School of Economics)
Stochastic differential equations driven by a class of Lévy processes are considered and the question of finding closed-form solutions is studied. Based on smooth invertible transformations of the current states of the underlying processes, a reducibility criterion is presented for such stochastic differential equations to ones which are solvable using ordinary differential equations. A method is proposed for constructing Lévy-driven analogues of the initial continuous diffusions, which is based on this reducibility criterion. The action of this method is illustrated on the construction of some well-known diffusions, and related applications to finance are discussed.

Jean-Francois Renaud (University of Waterloo)
A discretization scheme for nonnegative diffusion processes is proposed and the convergence of the corresponding sequence of approximating processes is proved using the martingale problem framework. Motivations for this scheme come typically from finance, especially for path-dependent option pricing. The scheme is simple: one only needs to find a nonnegative distribution whose mean and variance satisfy a simple condition to apply it. This is joint work with Chantal Labbé and Bruno Rémillard (HEC Montréal).
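A sketch of the flavor of such a scheme (my own illustration with a square-root diffusion and a gamma step; the specific condition and construction in the talk may differ): at each step the next state is drawn from a nonnegative distribution whose mean and variance match the one-step Euler moments, so the approximation can never go negative.

# Positivity-preserving weak step for dX = kappa*(theta - X) dt + sigma*sqrt(X) dW:
# sample the next state from a gamma law matched to the Euler conditional mean
# and variance (an illustrative moment-matching choice, not the talk's scheme).
import numpy as np

kappa, theta, sigma = 1.5, 0.04, 0.3
x0, T, n_steps, n_paths = 0.02, 1.0, 100, 50_000
h = T / n_steps

rng = np.random.default_rng(7)
x = np.full(n_paths, x0)
for _ in range(n_steps):
    mean = x + kappa * (theta - x) * h         # Euler conditional mean (positive here)
    var = np.maximum(sigma**2 * x * h, 1e-20)  # Euler conditional variance, floored
    shape = mean**2 / var                       # gamma(shape, scale) has
    scale = var / mean                          # mean = shape*scale, var = shape*scale^2
    x = rng.gamma(shape, scale)

print(x.mean(), theta + (x0 - theta) * np.exp(-kappa * T))  # simulated vs exact mean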
Wednesday June 9, 2010
1:30 pm - 2:15 pm
Lung Kwan Tsui (University of Pittsburgh)
In this note we continue the study of the stress event model, a simple and intuitive dynamic model for credit risky portfolios, proposed by

Alex Kreinin (Algorithmics)
The Mill's ratio, R(t), is defined as the ratio of the probability P(X > t) to the density of the standard normal random variable X, computed at t. This function plays an important role in statistics and stochastic modeling. In this talk we discuss analytical and combinatorial properties of this function. Our approach is based on the complete monotonicity of the function R(t).
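As a quick numerical companion (my addition; the check against the classical two-sided bound is illustrative and not claimed in the abstract), the Mill's ratio and the standard bounds t/(t^2+1) <= R(t) <= 1/t for t > 0 can be computed directly:

# Mill's ratio R(t) = P(X > t) / phi(t) for a standard normal X, together with
# the classical bounds t/(t^2+1) <= R(t) <= 1/t, valid for t > 0.
from math import erfc, exp, pi, sqrt

def mills_ratio(t: float) -> float:
    tail = 0.5 * erfc(t / sqrt(2.0))               # P(X > t)
    density = exp(-0.5 * t * t) / sqrt(2.0 * pi)   # phi(t)
    return tail / density

for t in (0.5, 1.0, 2.0, 4.0, 8.0):
    r = mills_ratio(t)
    print(t, t / (t * t + 1.0), r, 1.0 / t)        # lower bound, R(t), upper bound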
Tuesday June 15, 2010
1:30 pm - 2:15 pm and 2:15 pm - 3:00 pm
Joe Campolieti (Wilfrid Laurier University)
We present some recent developments in the construction, classification and application of new families of solvable diffusion models with affine drift and nonlinear volatility functions. The solvable diffusions admit closed-form analytical expressions for various transition densities, first hitting time distributions, distributions of various extrema of the processes, and other quantities that are fundamental to financial derivatives pricing. Our approach is based on so-called diffusion canonical transformations, which exploit the use of measure changes (i.e. Doob transforms) in combination with elementary (Itô) transformations. The mathematical framework produces a large class of analytically tractable multi-parameter nonlinear local volatility diffusion models that are mapped onto various simpler underlying diffusions. In particular, in this talk we present a spectral theory for transformed diffusions and derive closed-form spectral expansions for first-hitting time densities and transition probability densities for three new main families of one-dimensional diffusions with (and without) imposed killing. The rapidly convergent spectral expansions lead to various applications in asset pricing.

Hao Xing (Boston University)
Portfolio turnpike is an intuitive property of the portfolio choice problem. It states that if the preferences of two agents are similar at large wealth levels, then their investment strategies are similar as the horizon increases. This problem dates back to 1968 and the property has been proven in different market settings, but all existing results assume completeness of the market. In this talk, we will discuss whether this property holds in an incomplete market. We show that when the investment strategy of one agent is myopic, the turnpike property holds in a general incomplete market with semimartingale dynamics. When the optimal strategy is not myopic, we study a specific market model whose asset prices are driven by a common factor. In this market model, we will discuss the relationship between the turnpike property, the h-transform, and ergodic theory. This is joint work with Paolo Guasoni, Kostas Kardaras, and Scott Robertson.