High-frequency statistics in Finance.

Jean Jacod (UPMC-Paris 6)

The aim of this course is to provide some basic facts about, and an overview of, statistics of processes which are observed at discrete times on a finite time interval. The domain of applications is primarily the study of observed stock prices.

After introducing the problem, we will explain which “parameters” of the model for the stock price or log-price can be identified when the process is observed at discrete times and the observation frequency increases, eventually going to infinity. The main parameters of this kind are the volatility, the presence or absence of jumps, and the degree of jump activity when jumps are present. We will then explain how these quantities can be estimated in a variety of settings (regular or irregular observation times, exact or noisy observations). If time permits, we will also mention some open problems.

The New Post-crisis Landscape of Derivatives and Fixed Income Activity under Regulatory Constraints on Credit risk, Liquidity risk, and Counterparty risk.

Nicole El Karoui, LPMA-UPMC, Paris

Introduction

The motivation for this course is to update the academic community on the deep, regulation-induced transformation of the interest-rate and credit-derivatives world after the 2008 financial crisis. Liquidity risk, credit risk, and counterparty risk have grown in recent years, perhaps beyond market risk, given the identified lack of transparency in the OTC market.

These risks can be mitigated by the way trade and post-trade functions are structured. At the trading level, risks can be reduced by improving operational efficiency, e.g. by ensuring electronic trade execution, affirmation, and confirmation. This would have the effect of making OTC trade execution more similar to the way transactions are handled on-exchange.

One way is to impose collateral and margin requirements. In bilateral clearing, the two counterparties most often have collateral agreements in place that provide for regular monitoring of how the value of the contract evolves, so as to manage their respective credit exposures to each other. In central counterparty (CCP) clearing, the CCP acts as a counterparty to each side of a transaction. This makes collateral management simpler, as it is the CCP that collects and manages collateral.

Special attention is dedicated to reducing credit risk, notably in the Credit Default Swap (CDS) market, since CDS are particularly vulnerable in many respects. The risk they cover, credit risk, is not immediately observable but requires specific information about the borrower, which typically only banks have had. Assessing the risk remains difficult, and the difficulty is amplified by the fact that the potential obligations that come with these contracts are extreme.

It is of crucial importance in a derivatives business, at an aggregated level, to (i) measure counterparty exposure, (ii) compute capital requirements, and (iii) hedge counterparty risk. Measuring counterparty exposure is important for setting limits on the amount of business a firm is prepared to do with a given counterparty; hedging it offers a way of mitigating and transferring risk; and from a regulatory perspective there is significant pressure on financial institutions to be able to produce accurate risk measures for computing capital. In addition, computing counterparty exposure can also give insights into prices of complex products in potential future scenarios. The risk-control function, which attracted relatively limited attention in the past, is now becoming a central activity of all major financial institutions, requiring significant resources from all parties.

The aim of the course is to provide a bridge between old and new practices involving counterparty risk in the fixed-income and credit-derivatives markets, first at the level of the bilateral contract, and second at the aggregated level. In particular, we try to give a rigorous formulation of the different problems.

Outline

First talk

The first part is dedicated to the basic foundations of interest-rate derivatives in a perfect market, making a clear distinction between the notions of funding, risk-free rate, and bond, as well as the notions of forward curve and discounting curve. From these we deduce the standard HJM framework for interest-rate dynamics and the notion of forward-neutral probability measure. In this setting, we describe the standard contracts, such as forward and futures contracts, swaps, and the associated derivatives.

The second part is a (non-standard) introduction to the world of default derivatives, where the basic contract is the CDS, presented without specific mathematical tools. Default spreads and other similar quantities appear naturally. A general framework is then introduced, with examples of affine models. These tools are necessary to model liquidity risk in the interbank market and the multiple discounting curves. Different examples are developed.

Second talk

Pricing with collateral: some typical non-linear backward stochastic differential equations for pricing; right-way/wrong-way risk.

Hedging and managing counterparty risk; aggregation and risk mitigation; stress testing.

Bibliography

Cesari, G., Aquilina, J., Charpillon, N., Filipovic, Z., Lee, G., & Manda, I. (2009). Modelling, pricing, and hedging counterparty credit exposure: A technical guide. Springer Science & Business Media.

Grbac, Z., & Runggaldier, W. J. (2015). Interest Rate Modeling: Post-Crisis Challenges and Approaches. Springer.

Henrard, M. (2013). Multi-curves framework with stochastic spread: A coherent approach to STIR futures and their options. OpenGamma Quantitative Research, (11).

Riding the Bubble with Convex Incentives.

Fernando Zapatero* (USC, Los Angeles, CA, US) and Juan Sotes-Paladino (University of Melbourne).

Several empirical studies contradict the efficient markets contention that sophisticated investors like hedge funds should underweight overvalued assets in their portfolios. We rationalize this evidence within a dynamic model that accounts for hedge fund convex incentive fees. In response to these incentives, risk-averse hedge fund managers with superior information can aggressively overweight an overvalued asset with positive risk premium to beat a risk-less benchmark, even when they expect overpricing to fall in the short term. To secure outperformance, managers tilt their portfolios towards the risk-less benchmark and hold too much of a negative risk premium asset. This distortion can increase with managers’ information advantage over other market participants. The optimal investment strategy of managers exacerbates equilibrium mispricing (both over- and undervaluation) with respect to the case of no convex incentives.

5:00–5:30pm:

Heterogeneous Archimedean copula and t-copula in credit portfolio modeling

Ludger Overbeck*, University of Giessen

Besides their advantage in modelling tail dependency, the main drawback of standard non-Gaussian copulas is the homogeneity of the tail dependency. Several approaches to overcome this have meanwhile been developed: hierarchical copulas, the grouped t-copula, and the heterogeneous t-copula as recently described by Luo and Shevchenko. We will show results from a concrete implementation of a factor model using the latter approach in a two-step estimation procedure. In particular, the effects on capital allocation will be highlighted. In the second part, we will also present how this can be extended to a wide class of Archimedean copulas, in order to capture heterogeneous tail dependencies and therefore tail-sensitive capital allocation in credit portfolio models. The first part is joint work in progress with Carsten Binnenhei, Melanie Frick and Benedikt Mankel (Deka Bank).

5:30pm–6:00pm

Option-Implied Objective Measures of Market Risk

Matthias Leiss*, ETHZ; Heinrich Nax, ETHZ

Foster and Hart (2009) introduce an objective measure of the riskiness of an asset that implies a bound on how much of one’s wealth is ‘safe’ to invest in the asset while (a.s.) guaranteeing no bankruptcy in the long run. In this study, we translate the Foster-Hart measure from static and abstract gambles to dynamic and applied finance, using nonparametric estimation of risk-neutral densities from S&P 500 call and put option prices covering 2003 to 2013. This exercise results in an option-implied market view of objective riskiness. The dynamics of the resulting ‘option-implied Foster-Hart bound’ are analyzed and assessed in light of well-known risk measures including value at risk, expected shortfall, and risk-neutral volatility. The new measure is shown to be a significant predictor of ahead-return downturns. Furthermore, it is able to grasp more characteristics of the risk-neutral probability distributions than other measures, while also exhibiting predictive consistency. The robustness of the risk-neutral density estimation method is analyzed via Monte Carlo methods.
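For a static gamble with finitely many outcomes, the underlying Foster-Hart riskiness can be computed directly: it is the unique R above the maximal loss solving E[log(1 + g/R)] = 0. A minimal sketch (the function name and the illustrative gamble are ours, not from the paper):

```python
from math import log
from scipy.optimize import brentq

def foster_hart_riskiness(outcomes, probs):
    """Foster-Hart riskiness R of a discrete gamble: the unique R > max loss
    solving E[log(1 + g/R)] = 0, for gambles with positive expectation
    and a possible loss."""
    max_loss = -min(outcomes)          # largest possible loss, as a positive number
    f = lambda R: sum(p * log(1.0 + g / R) for g, p in zip(outcomes, probs))
    # f is very negative just above the maximal loss and positive for large R,
    # so a bracketing root-finder applies
    return brentq(f, max_loss * (1 + 1e-9), 1e12)

# Example: win 120 or lose 100 with equal probability (expectation +10);
# here (1 + 120/R)(1 - 100/R) = 1 gives R = 600 analytically
R = foster_hart_riskiness([120.0, -100.0], [0.5, 0.5])
```

Wealth below R makes accepting the gamble unsafe in the Foster-Hart sense, which is the bound the abstract transports to option-implied densities.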

6:00pm–6:30pm

Counterparty Risk and Funding: Immersion and Beyond

Shiqi Song*, Université d’Evry ; Stéphane Crépey, University of Evry

In Crépey, S. (2015), “Bilateral counterparty risk under funding constraints. Part II: CVA”, Mathematical Finance 25(1), 1-50, a basic reduced-form counterparty risk modeling approach was introduced under a rather standard immersion hypothesis between a reference filtration and the filtration progressively enlarged by the default times of the two parties, also involving the continuity of some of the data at default time. This basic approach is too restrictive for application to credit derivatives, which are characterized by strong wrong-way risk, i.e. adverse dependence between the exposure and the credit riskiness of the counterparties, and gap risk, i.e. slippage between the portfolio and its collateral during the so-called cure period that separates default from liquidation.

This paper shows how a suitable extension of the basic approach can be devised so that it can be applied in dynamic copula models of counterparty risk on credit derivatives.

More generally, this extended approach is applicable in any marked default time intensity setup satisfying a suitable integrability condition. The integrability condition expresses that no mass is lost in a related measure change.

5:00pm–5:30pm:

Cross-Dependent Volatility

Julien Guyon (Bloomberg L.P.)

Local volatilities in multi-asset models typically have no cross-asset dependency. In this talk, we propose a general framework for pricing and hedging derivatives in cross-dependent volatility (CDV) models, i.e., multi-asset models in which the volatility of each asset is a function of not only its current or past levels, but also those of the other assets. For instance, CDV models can capture that stock volatilities are driven by an index level, or recent index returns. We explain how to build all the CDV models that are calibrated to all the asset smiles, solving in particular the longstanding smiles calibration problem for the “cross-aware” multidimensional local volatility model. CDV models are rich enough to be simultaneously calibrated to other instruments, such as basket smiles, and we show that the model can fit a basket smile either by means of a correlation skew, like in the classical “cross-blind” multi-asset local volatility model, or using only the cross-dependency of volatilities itself, in a correlation-skew-free model, thus proving that steep basket skews are not necessarily a sign of correlation skew. We can even calibrate CDV models to basket smiles using correlation skews that are opposite to the ones generated by the classical cross-blind models, e.g., calibrate to large negative index skews while requiring that stocks are less correlated when the market is down. All the calibration procedures use the particle method; the calibration of the implied “local in basket” CDV uses a novel fixed point-compound particle method. Numerical results in the case of the FX smile triangle problem illustrate our results and the capabilities of CDV models.

Keywords: Option pricing, multi-asset models, cross-dependent volatility, correlation skew, smile calibration, basket options, particle method.

5:30pm–6:00pm:

Rough Volatility: From Microstructural Foundations to Smile

Mathieu Rosenbaum (Universite Pierre-et-Marie-Curie)

It has recently been shown that rough volatility models reproduce very well the statistical properties of low-frequency financial data. In such models, the volatility process is driven by a fractional Brownian motion with Hurst parameter of order 0.1. The goal of this talk is first to explain how such fractional dynamics can be obtained from the behaviour of market participants at microstructural scales. Using limit theorems for Hawkes processes, we show that rough volatility naturally arises in the presence of high-frequency trading combined with metaorder splitting. We will then demonstrate that this result enables us to derive an efficient method to compute the smile in rough volatility models. This is joint work with Omar El Euch, Masaaki Fukasawa, Jim Gatheral and Thibault Jaisson.

6:00pm–6:30pm:

Hedging of covered options with linear price impact and gamma constraint

Bruno Bouchard (Universite Paris-Dauphine)

Within a financial model with linear price impact, we study the problem of hedging a covered European option under gamma constraint. Using stochastic target and partial differential equation smoothing techniques, we prove that the super-replication price is the viscosity solution of a fully non-linear parabolic equation. As a by-product, we show how ε-optimal strategies can be constructed. A numerical resolution scheme is proposed.

Poster Session

Short Term American Path Dependent Option Pricing in the USDCOP Market: Central Bank’s Volatility Control Option Case

Santiago Stozitzky*, Bancolombia

A stochastic approach to pricing financial instruments for the Caribbean markets

Stephen Barnes*, University of the West Indies; Conall Kelly, University of the West Indies; Alexandra Rodkina, University of the West Indies

Estimating and Forecasting the Term Structure of Interest Rates:US and Colombia Analysis

Cristhian Rodriguez*, Urosario

Modeling the Uruguayan Sovereign Debt

Andrés Sosa*, Centro de Matemática, UdelaR

Nonlinear options pricing and Feynman Kac’s theorem

John Moreno*, U. Externado de Colombia

Extreme returns in the mining gold stock and in gold prices

Gonzalo Ubal*, Universidad de Talca

Modeling and Forecasting of the Relationship between Airline Stocks and Oil Market

Erik Muñoz*, Universidad de Talca

Neural networks in sovereign rating: application to Colombia, period between 1838-1900

Mauricio Avellaneda Hortua*, Universidad Externado de Colombia

El modelo Lee-Carter para estimar y pronosticar mortalidad: Una aplicación para Colombia

Carlos Ochoa*, Universidad Nacional

Numerical Approximation of Vega under the Heston Stochastic Volatility Model, Using the Pathwise-Euler Method

Ana Maria Serrato Polania*, Universidad Externado

Synthetic portfolio for event studies: Estimating the effects of volatility call auctions

Diego Agudelo, Universidad EAFIT; Carlos Castro Universidad del Rosario; Sergio Preciado*, Universidad del Rosario

5:30pm–6:00pm:

Calibration in Option Pricing with Forward and Backward Reduced Models

Jose Silva*, University of Wuppertal; E. Jan ter Maten, University of Wuppertal; Michael Guenther, University of Wuppertal

This work presents the calibration of a stochastic volatility model, the Heston model, using Model Order Reduction. Calibration in financial markets usually goes along the following lines. After defining which model or models best suit the behaviour of each market, e.g. FX markets, stock markets, etc., the information regarding currently priced instruments in the market is gathered. Using simple, quick models, one obtains an estimate of the parameter values at which the market is currently trading. These estimates are subsequently used to price more complex or exotic products.

A very common calibration process involves a least-squares minimization problem in which each cost-function evaluation requires solving one partial differential equation (PDE) for each set of parameters available on the market. This can quickly become prohibitively expensive to solve numerically. For that reason, two parallel strategies are presented in this work, which should considerably reduce the cost of such calibrations.

Obtaining a Dupire-type equation for both approaches, we proceed to calibrate option prices to market data by least-squares minimization. We present results showing the computational efficiency, comparing it with that of a parametric reduced-order model. We use Alternating-Direction Implicit schemes to numerically solve the partial differential equations in both approaches.

6:00pm–6:30pm:

Prediction of Federal Funds Target Rate: A Dynamic Logistic Bayesian Model Average Approach

Hernán Alzate*, Bancolombia S.A.; Andrés Ramírez-Hassan, EAFIT University

In this paper we examine which macroeconomic and financial variables have the most predictive power for the target repo rate decisions made by the Federal Reserve. We conduct the analysis for the FOMC decisions during the period June 1998-April 2015 using dynamic logistic models with dynamic Bayesian Model Averaging, which allows predictions to be performed in real time with great flexibility. The computational burden of the algorithm is reduced by adapting a Markov Chain Monte Carlo Model Composition (MC3). We find that the outcomes of the FOMC meetings during the sample period are predicted well: the Logistic DMA-Up and Dynamic Logit-Up models present hit ratios of 87.2% and 88.7%, while the hit ratios for the Logistic DMA-Down and Dynamic Logit-Down models are 79.8% and 68.0%, respectively.

6:30pm–7:00pm:

Stochastic Portfolio Theory: A Machine Learning Perspective

Alexander Vervuurt*, University of Oxford; Yves-Laurent Kom Samo, University of Oxford

We propose a novel application of Gaussian processes to financial asset allocation. Our approach is deeply rooted in Stochastic Portfolio Theory (SPT), a stochastic analysis framework recently introduced by Robert E. Fernholz that aims at flexibly analyzing the performance of certain investment strategies in stock markets relative to benchmark indices. In particular, SPT has exhibited some investment strategies based on company sizes that, under realistic assumptions, outperform benchmark indices with probability 1 over certain time horizons. Galvanized by this result, we consider the inverse problem that consists of learning (from historical data) an optimal investment strategy based on any given set of trading characteristics, and using a user-specified optimality criterion that may go beyond the outperformance of a benchmark index. Although the inverse problem is of the utmost interest to investment management practitioners, it can hardly be tackled using the SPT framework. We show that our machine learning approach learns investment strategies that considerably outperform existing SPT strategies in the US stock market.
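The size-based SPT strategies alluded to above include Fernholz's diversity-weighted portfolio, which raises market weights to a power p in (0, 1) and renormalizes, thereby overweighting small caps. A minimal sketch with hypothetical market caps (the numbers are illustrative, not from the paper):

```python
import numpy as np

def diversity_weighted(caps, p=0.5):
    """Diversity-weighted portfolio from Stochastic Portfolio Theory:
    tilt away from the market portfolio by raising market weights to p."""
    mu = np.asarray(caps, dtype=float)
    mu = mu / mu.sum()              # market (capitalization) weights
    pi = mu**p / (mu**p).sum()      # diversity-weighted portfolio weights
    return mu, pi

caps = [500.0, 300.0, 150.0, 50.0]      # hypothetical market caps
mu, pi = diversity_weighted(caps, p=0.5)
# pi sums to 1 and shifts weight from the largest stock toward the smallest
```

The learning approach in the abstract generalizes this fixed functional form: instead of a hand-picked map from caps to weights, the map itself is learned from data under a user-specified optimality criterion.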

5:30pm–6:00pm

Some remarks on functionally generated portfolios

Johannes Ruf*, UCL; Ioannis Karatzas, Columbia

In the first part of the talk I will review Bob Fernholz’ theory of functionally generated portfolios. In the second part I will discuss questions related to the existence of short-term arbitrage opportunities. This is joint work with Ioannis Karatzas.

6:00pm-6:30pm

Martingale Optimal Transport and Beyond

Marcel Nutz*, Columbia

We study the Monge–Kantorovich transport between two probability measures, where the transport plans are subject to a probabilistic constraint. For instance, in the martingale optimal transport problem, the transports are laws of martingales. Interesting new couplings emerge as optimizers in such problems.

Constrained transport arises in the context of robust hedging in mathematical finance via linear programming duality. We formulate a complete duality theory for general performance functions, including the existence of optimal hedges. This duality leads to an analytic monotonicity principle which describes the geometry of optimal transports. Joint work with Mathias Beiglböck, Florian Stebegg and Nizar Touzi.
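A toy finite-state instance of constrained transport can be written directly as the linear program the duality refers to: the coupling must match both marginals and satisfy the martingale constraint. A sketch (the two-point/three-point marginals and the |x - y| cost are our illustrative choices):

```python
import numpy as np
from scipy.optimize import linprog

# Toy discrete martingale optimal transport: marginals in convex order
x = np.array([-0.5, 0.5])              # today's support, each with prob 1/2
y = np.array([-1.0, 0.0, 1.0])         # tomorrow's support
mu = np.array([0.5, 0.5])
nu = np.array([0.25, 0.5, 0.25])

cost = np.abs(x[:, None] - y[None, :]).ravel()   # minimize E|X - Y|

A_eq, b_eq = [], []
for i in range(2):                     # row marginals: sum_j pi_ij = mu_i
    row = np.zeros(6); row[3 * i:3 * i + 3] = 1
    A_eq.append(row); b_eq.append(mu[i])
for j in range(3):                     # column marginals: sum_i pi_ij = nu_j
    col = np.zeros(6); col[j::3] = 1
    A_eq.append(col); b_eq.append(nu[j])
for i in range(2):                     # martingale: sum_j pi_ij (y_j - x_i) = 0
    row = np.zeros(6); row[3 * i:3 * i + 3] = y - x[i]
    A_eq.append(row); b_eq.append(0.0)

res = linprog(cost, A_eq=np.array(A_eq), b_eq=np.array(b_eq))
coupling = res.x.reshape(2, 3)         # an optimal martingale transport plan
```

Here the optimum is attained by the dilation that splits each of today's points into its two neighbours, with optimal cost 0.5; the dual of this LP is exactly the robust (super-)hedging problem mentioned in the abstract.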

6:30pm–7:00pm

Dynamic Programming Approach to Principal-Agent Problems

Dylan Possamaï*, Université Paris Dauphine; Nizar Touzi, Ecole Polytechnique; Jaksa Cvitanic, Caltech

We consider a general formulation of the Principal-Agent problem with a lump-sum payment on a finite horizon. Our approach is the following: we first find the contract that is optimal among those for which the agent’s value process allows a dynamic programming representation and for which the agent’s optimal effort is straightforward to find. We then show that, under technical conditions, the optimization over the restricted family of contracts represents no loss of generality. Moreover, the principal’s problem can then be analyzed by the standard tools of control theory. Our proofs rely on the Backward Stochastic Differential Equations approach to non-Markovian stochastic control, and more specifically, on the recent extensions to the second order case.

Path-Dependent Volatility

Julien Guyon

Bloomberg LP. New York, US

So far, path-dependent volatility models have drawn little attention from both practitioners and academics compared to local volatility and stochastic volatility models. This is unfair: in this talk we show that they combine benefits from both. Like the local volatility model, they are complete and can fit exactly the market smile; like stochastic volatility models, they can produce rich implied volatility dynamics. Not only that: given their huge flexibility, they can actually generate a much broader range of spot-vol dynamics, thus possibly preventing large mispricings, and they can also capture prominent historical patterns of volatility. We give many examples to showcase their capabilities.

10:30am–11:00am:

A systematic view on price based trading strategies

A. Christian Da Silva*, Dunn Capital; Fernando Ferreira, USP; Ju-Yi Yen, University of Cincinnati

We study trading strategies that use historical price data to predict future asset performance. Important and well-known examples are short-term or long-term reversal [Bondt1989], momentum, and trend-following [JT1993, BP]. A distinction between these strategies is the extent of historical data used to predict the future. In particular, short-term reversal appears for portfolios that are built using historical data of up to one month. Trend-following and momentum are generally implemented using a few months to one year of data, and long-term reversal uses a few years of historical data.

We study such strategies by assuming that the asset log-returns $x_{i}$ are Gaussian random variables with drift $\mu$, variance $V$ and autocorrelation $\rho$ [FSY]. We further assume that we are able to trade proportionally to a simple moving average $m(T)$ with $T$ terms, and we calculate the exact expression for the average performance and its variance [FSY].

Empirically, we identify the different regimes (from short-term reversal to long-term reversal and beyond) by presenting the Sharpe ratio of our strategy, applied to 120 years of the DJIA, as a function of $T$.
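The setup above can be simulated directly; a sketch assuming AR(1) Gaussian log-returns as our simplest reading of the autocorrelation assumption (all parameter values are illustrative, not the authors'):

```python
import numpy as np

def ma_strategy_pnl(n, T, mu, V, rho, seed=0):
    """Simulate AR(1) Gaussian log-returns with mean mu, stationary variance V
    and lag-1 autocorrelation rho; trade proportionally to the T-term simple
    moving average of past returns (a toy version of the abstract's model)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = mu + np.sqrt(V) * rng.standard_normal()
    eps = np.sqrt(V * (1 - rho**2)) * rng.standard_normal(n - 1)
    for t in range(1, n):
        x[t] = mu + rho * (x[t - 1] - mu) + eps[t - 1]
    m = np.convolve(x, np.ones(T) / T, mode="valid")[:-1]  # MA of the last T returns
    pnl = m * x[T:]                                        # position times next return
    return pnl

pnl = ma_strategy_pnl(n=200_000, T=5, mu=0.0, V=1e-4, rho=0.3)
sharpe = pnl.mean() / pnl.std() * np.sqrt(252)
# with positive autocorrelation, the trend-following P&L has positive mean
```

Sweeping $T$ in this toy model reproduces qualitatively the regime picture in the abstract: short windows exploit short-lag autocorrelation, while for negative $\rho$ the sign of the profitable position flips (reversal).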

11:00am–11:30am (Spanish):

An Application of Extreme Value Theory for Measuring Financial Risk in the Uruguayan Pension Fund

Guillermo Magnou*, SURA

Traditional methods for financial risk measurement adopt normal distributions as a pattern for the behavior of financial returns. Assessing the probability of rare and extreme events is an important issue in the risk management of financial portfolios. In this paper, we use the Peaks Over Threshold (POT) model of Extreme Value Theory (EVT) and the Generalized Pareto Distribution (GPD), which can give a more accurate description of the tail distribution of financial losses. The EVT and POT techniques provide well-established statistical models for the computation of extreme risk measures such as the return level, Value at Risk, and Expected Shortfall. We apply this technique to a series of daily losses of AFAP SURA over an 18-year period (1997-2015); AFAP SURA is the second largest pension fund in Uruguay, with more than 310,000 clients and assets under management of over USD 2 billion.

Our major conclusion is that the POT model can be useful for assessing the size of extreme events. VaR approaches based on the assumption of a normal distribution definitely overestimate low percentiles (due to the high variance estimate) and underestimate high percentiles (due to heavy tails). The absence of extreme values under the normality assumption leads to underestimation of the Expected Shortfall for high percentiles. Instead, the extreme value approach based on the POT model seems coherent with respect to the actual losses observed and is easy to implement.
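The POT estimators have standard closed forms once a GPD is fitted to the threshold exceedances; a sketch on synthetic heavy-tailed data (the threshold choice and parameters are illustrative, not the paper's data):

```python
import numpy as np
from scipy.stats import genpareto

def pot_var_es(losses, u, q=0.99):
    """Peaks-over-threshold estimates of VaR and ES at level q: fit a GPD to
    the exceedances over threshold u, then apply the standard POT formulas."""
    losses = np.asarray(losses)
    exc = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0)        # shape xi, scale beta
    n, n_u = len(losses), len(exc)
    var = u + beta / xi * ((n / n_u * (1 - q)) ** (-xi) - 1)
    es = (var + beta - xi * u) / (1 - xi)           # valid for xi < 1
    return var, es

# Illustration on synthetic Pareto-like losses (tail index 3, so xi ~ 1/3)
rng = np.random.default_rng(1)
losses = rng.pareto(3.0, size=10_000)
u = np.quantile(losses, 0.95)                       # threshold at the 95% quantile
var99, es99 = pot_var_es(losses, u, q=0.99)
```

A normal-distribution VaR fitted to the same sample would sit well below `var99` at high levels, which is the underestimation effect the abstract describes.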

11:30am–12:00pm (Spanish):

Empirical Approach to the Heston Model Parameters on the Exchange Rate USD / COP

Carlos Grajales*, Universidad de Antioquia; Santiago Medina, Universidad Nacional de Colombia

This work proposes an empirical calibration of the Heston stochastic volatility model for the USD/COP exchange rate. The parameters are estimated with an algorithm that simulates exchange-rate trajectories under the Heston model and matches the probability distribution of the simulated paths with the probability distribution that comes from the real exchange rate. The calibration is achieved by using both the two-sample KS test and the Nelder-Mead simplex direct search. In the end, the results show that although it is possible to reach multiple optimal parameter values depending on the initial parameter vector, one of these can be chosen according to financial market information. The suggested open problems that come with the underlying dynamics presented are related to derivatives valuation and risk measures.
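The simulate-and-match idea can be sketched by combining an Euler scheme for Heston, the two-sample KS statistic, and Nelder-Mead (the full-truncation scheme, the penalty handling, and all parameter values are our assumptions, not the authors'):

```python
import numpy as np
from scipy.stats import ks_2samp
from scipy.optimize import minimize

def heston_terminal(params, s0=1.0, T=1.0, n_steps=50, n_paths=2000, seed=7):
    """Euler (full-truncation) simulation of Heston terminal log-prices.
    params = (kappa, theta, sigma, rho, v0); the seed is fixed so that the
    objective below is deterministic, which Nelder-Mead needs."""
    kappa, theta, sigma, rho, v0 = params
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.full(n_paths, np.log(s0))
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                     # full truncation
        x += -0.5 * vp * dt + np.sqrt(vp * dt) * z1
        v += kappa * (theta - vp) * dt + sigma * np.sqrt(vp * dt) * z2
    return x

# Stand-in for market data: paths from known "true" parameters
target = heston_terminal((2.0, 0.04, 0.3, -0.7, 0.04), seed=1)

def objective(p):
    kappa, theta, sigma, rho, v0 = p
    if min(kappa, theta, sigma, v0) <= 0 or not -1.0 < rho < 1.0:
        return 1.0                                  # penalize invalid parameters
    return ks_2samp(heston_terminal(p), target).statistic

x0 = np.array([1.0, 0.02, 0.2, -0.5, 0.02])
res = minimize(objective, x0, method="Nelder-Mead",
               options={"maxiter": 200, "xatol": 1e-3, "fatol": 1e-4})
```

Restarting from different `x0` values exposes the multiple local optima the abstract mentions, among which market information can select.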

8:00am–8:30am:

Arbitrage-Free XVA (Invited Session talk –Stéphane Crépey)

Agostino Capponi*, Columbia University; Stephan Sturm, WPI; Maxim Bichuch, Johns Hopkins University

We develop a framework for computing the total valuation adjustment (XVA) of a European claim accounting for funding costs, counterparty credit risk, and collateralization. Based on no-arbitrage arguments, we derive backward stochastic differential equations (BSDEs) associated with the replicating portfolios of long and short positions in the claim. This leads to the definition of buyer’s and seller’s XVA, which in turn identify a no-arbitrage interval. In the case that borrowing and lending rates coincide, we provide a fully explicit expression for the uniquely determined XVA, expressed as a percentage of the price of the traded claim, and for the corresponding replication strategies. In the general case of asymmetric funding, repo and collateral rates, we study the semilinear partial differential equation (PDE) characterizing buyer’s and seller’s XVA and show the existence of a unique classical solution to it. To illustrate our results, we conduct a numerical study demonstrating how funding costs, repo rates, and counterparty risk contribute to determining the total valuation adjustment.

8:30am–9:00am:

Option pricing under time-varying risk-aversion with applications to risk forecasting

Ruediger Kiesel*, University Duisburg-Essen; Florentin Rahe, University Ulm

We present a new option-pricing model, which explicitly captures the difference in the persistence of volatility under historical and risk-neutral probabilities. The model also allows us to capture the empirical properties of pricing kernels, such as time variation and the typical S-shape. We apply our model for two purposes. First, we analyze the risk preferences of market participants invested in S&P 500 index options during 2001–2009. We find that risk aversion strongly increases during stressed market conditions and relaxes during normal market conditions. Second, we extract forward-looking information from S&P 500 index options and perform out-of-sample Value-at-Risk (VaR) forecasts during the period of the subprime mortgage crisis. We compare the VaR forecasting performance of our model with four alternative VaR models and find that 2-Factor Stochastic Volatility models have the best forecasting performance.

9:00am–9:30am:

Log-Skew-Normal Mixture Model For Option Valuation

Viswanathan Arunachalam*, Universidad Nacional de Colombia; José Jiménez, Universidad Nacional de Colombia

There is good empirical evidence that financial series, whether stocks or indices, currencies or interest rates, do not follow the log-normal random walk underlying the Black-Scholes model, which is the basis for most of the theory of option valuation. In this article, we present an alternative approach for calculating the price of call and put options when the stock return distribution follows a log-skew-normal mixture distribution. We obtain explicit expressions for the price of European options and formulas for the Greeks. We also analyze the at-the-money implied volatility, derive an expression for the implied volatilities at and around the money, and discuss their asymptotic behavior. We present some numerical results for the calibration to real market option data.
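As a numerical illustration of the distributional assumption only (not the authors' closed-form pricing formulas), a European call can be priced by Monte Carlo when the log-return is drawn from a skew-normal mixture. The component parameters below are arbitrary, and a real calibration would additionally need to enforce the risk-neutral drift condition on the mixture.

```python
import numpy as np
from scipy.stats import skewnorm

def call_price_lsn_mixture(s0, K, r, T, weights, shapes, locs, scales,
                           n_paths=200_000, seed=0):
    """Monte Carlo European call price when the log-return over [0, T] follows
    a mixture of skew-normal laws, so S_T is a log-skew-normal mixture."""
    rng = np.random.default_rng(seed)
    comp = rng.choice(len(weights), size=n_paths, p=weights)   # mixture label
    ret = skewnorm.rvs(a=np.take(shapes, comp), loc=np.take(locs, comp),
                       scale=np.take(scales, comp), random_state=rng)
    st = s0 * np.exp(ret)                                      # terminal price
    return np.exp(-r * T) * np.maximum(st - K, 0.0).mean()

# two illustrative components: a right-skewed calm regime, a left-skewed stressed one
price = call_price_lsn_mixture(100.0, 100.0, 0.01, 0.5,
                               weights=[0.6, 0.4], shapes=[3.0, -3.0],
                               locs=[0.0, 0.0], scales=[0.1, 0.2])
```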

9:30am–10:00am:

On American swaptions under the linear-rational framework

Yerkin Kitapbayev*, Boston University; Damir Filipovic, EPFL

In this paper we study the American version of swaptions under the linear-rational term structure model (Filipovic, Larsson and Trolle (2014)). This framework enables us to simplify the pricing problem significantly and to formulate the corresponding optimal stopping problem for a diffusion process. The latter reduces to a free-boundary problem which we tackle by local time-space calculus (Peskir (2005)). We characterize the optimal stopping boundary as the unique solution to a nonlinear integral equation, and using this we obtain the arbitrage-free price of the American swaption and the optimal exercise strategies in terms of swap rates for both fixed-rate payer and receiver swaptions.

8:00am–8:30am:

PDE models for pricing fixed rate mortgages and their insurance and coinsurance

Carmen Calvo-Garrido, Carlos Vázquez*

In the pricing of fixed-rate mortgages with prepayment and default options, we introduce jump-diffusion models for the house price evolution. These models take into account sudden changes in the price (jumps) during bubbles and crisis situations in real estate markets. After posing the models as partial integro-differential equation (PIDE) problems for the contract, the insurance and the fraction of the total loss not covered by the insurance (coinsurance), we propose appropriate numerical methods to solve them. Among these methods are semi-Lagrangian schemes for time discretization combined with finite elements, the ALAS algorithm for the inequality constraints and quadrature formulas for the nonlocal terms.

8:30am–9:00am

Randomised Heston models

Antoine Jacquier*

Inspired by recent works on the behaviour of the forward implied volatility smile, we introduce a new class of stochastic volatility models. The dynamics are the same as in the classical Heston model, but the starting point of the variance process is randomly distributed. We show how to choose the initial distribution (i) to fit the short end of the smile, traditionally mis-calibrated in classical stochastic volatility models, and (ii) to estimate past realisations of the volatility time series. This is joint work with Fangwei Shi (Imperial College London).
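A minimal Monte Carlo sketch of the idea: price a short-dated call under classical Heston and under a version whose initial variance is drawn per path from some distribution. The Gamma law with matched mean used below is purely a placeholder; the paper constructs the initial distribution specifically to fit the short end of the smile.

```python
import numpy as np

def heston_call_mc(kappa, theta, xi, rho, v0, s0, K, T, n_steps, n_paths, rng):
    """Monte Carlo call price under Heston (zero rates), Euler full truncation.
    v0 may be a scalar (classical model) or an array of per-path start values."""
    h = T / n_steps
    v = np.broadcast_to(np.asarray(v0, dtype=float), (n_paths,)).copy()
    x = np.full(n_paths, np.log(s0))
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                 # keep the variance usable
        x += -0.5 * vp * h + np.sqrt(vp * h) * z1
        v += kappa * (theta - vp) * h + xi * np.sqrt(vp * h) * z2
    return np.maximum(np.exp(x) - K, 0.0).mean()

rng = np.random.default_rng(1)
# classical Heston: deterministic initial variance
p_fixed = heston_call_mc(2.0, 0.04, 0.5, -0.7, 0.04,
                         100.0, 100.0, 0.1, 50, 20_000, rng)
# randomised start: v0 drawn per path from a Gamma law with the same mean 0.04
v0_draws = rng.gamma(shape=2.0, scale=0.02, size=20_000)
p_rand = heston_call_mc(2.0, 0.04, 0.5, -0.7, v0_draws,
                        100.0, 100.0, 0.1, 50, 20_000, rng)
```

Comparing the two prices across strikes (rather than at the money only) is what exhibits the steeper short-maturity smile that motivates the randomisation.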

9:00am–9:30am

Pricing Bermudan options under local Lévy models with default

Anastasia Borovykh, Andrea Pascucci* and Cornelis W. Oosterlee

We consider a defaultable asset whose risk-neutral pricing dynamics are described by an exponential Lévy-type martingale. This class of models allows for local volatility, local default intensity and a locally dependent Lévy measure. We present a pricing method for Bermudan options based on an analytical approximation of the characteristic function combined with the COS method. We derive the adjoint expansion of the characteristic function using a Taylor expansion of the coefficients. Due to the special form of the obtained characteristic function, the price can be computed using a Fast Fourier Transform-based algorithm, resulting in a fast and accurate calculation.

9:30am–10:00am

Backtesting Lambda Value at Risk

Jacopo Corbetta*, Ecole des Ponts – ParisTech; Ilaria Peri, University of Greenwich

A new risk measure, the lambda value at risk ($\Lambda VaR$), has recently been proposed from a theoretical point of view as an immediate generalization of the value at risk ($VaR$). The $\Lambda VaR$ appears attractive for its potential ability to solve several problems of the $VaR$. In this paper we propose three nonparametric backtesting methodologies for the $\Lambda VaR$ which exploit different features. Two of these tests directly assess the correctness of the level of coverage predicted by the model; one of them is bilateral and provides an asymptotic result. A third test assesses the accuracy of the $\Lambda VaR$ that depends on the choice of the P&L distribution; however, this test requires the storage of more information. Finally, we perform a backtesting exercise and compare our results with those of Hitaj and Peri (2015).
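For context, the classical building block that such coverage backtests generalize is the unconditional coverage test of Kupiec (1995) for plain VaR. A minimal sketch follows; it is not one of the three $\Lambda VaR$ tests proposed in the paper, where the coverage level itself depends on the realized loss.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_coverage_test(pnl, quotes, alpha):
    """Kupiec (1995) unconditional coverage LR test: a 'hit' occurs on a day
    where the loss exceeds the quoted risk measure; under correct coverage the
    hit frequency matches alpha and the LR statistic is asymptotically chi2(1)."""
    hits = (np.asarray(pnl) < -np.asarray(quotes)).astype(int)
    n, x = len(hits), int(hits.sum())
    if x in (0, n):                       # degenerate MLE: closed-form statistic
        lr = -2.0 * n * np.log(1.0 - alpha if x == 0 else alpha)
    else:
        p_hat = x / n
        lr = 2.0 * (x * np.log(p_hat / alpha)
                    + (n - x) * np.log((1.0 - p_hat) / (1.0 - alpha)))
    return lr, chi2.sf(lr, df=1)          # statistic and asymptotic p-value

rng = np.random.default_rng(2)
pnl = rng.standard_normal(1000)           # toy P&L with unit variance
lr_stat, p_value = kupiec_coverage_test(pnl, np.full(1000, 1.645), alpha=0.05)
```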

10:30am–11:00am:

On the behavior of the price impact in the Kyle-Back model.

José Corcuera*, University of Barcelona

In this paper we study the equilibrium arising in the Kyle-Back model when we allow the depth parameter to be random. This richer model can explain different phenomena observed in financial markets that had so far remained unexplained. We also unify different extensions of the Kyle-Back model in this framework.

11:00am–11:30am:

Optimal market dealing under constraints.

Etienne Chevalier*, Université d’Evry; Mhamed Gaigi, ENIT; Vathana Ly Vath, ENSIIE; Mohammed Mnif, ENIT.

We consider a market dealer acting as a liquidity provider by continuously setting bid and ask prices for an illiquid asset in a quote-driven market. The market dealer may benefit from the bid-ask spread but has the obligation to permanently quote both prices while satisfying some liquidity and inventory constraints. The objective is to maximize the expected utility from terminal liquidation value over a finite horizon, subject to the above constraints. We characterize the value function as the unique viscosity solution to the associated HJB equation and further enrich our study with numerical results. The contributions of our study concern both the modelling aspects and the dynamic structure of the control strategies. Important features and constraints characterizing market making problems are no longer ignored. Indeed, along with the obligation to continuously quote bid and ask prices, we do not allow the market maker to stop quoting them when the stock inventory reaches its lower or upper bound. Furthermore, we no longer assume the existence of a reference price.

11:30am–12:00pm: (Spanish)

Who knows better in an Emerging Market? Performance of Institutions, Foreigners and Individuals.

Diego Agudelo*, Universidad EAFIT; James Byder, Universidad EAFIT; Paula Yepes, Universidad EAFIT

We find that local investors do better than foreigners in terms of trading execution. However, foreign investors obtain better returns than local individuals in both the short and the long term. Local institutions are the best group on both dimensions. Our results reconcile apparent contradictions in the international finance literature on who invests better in an emerging market. These contradictions disappear with a more careful formulation of the research question at hand. The traditional Locals-versus-Foreigners or Institutions-versus-Individuals framing is too simplistic because it does not distinguish between the different dimensions of performance. Our study makes use of two unique databases of Colombian stocks and acts as an out-of-sample test of previous findings. Moreover, we provide evidence that the better performance of institutions and foreigners is driven by information advantages.

12:00pm–12:30pm: (Spanish)

An empirical analysis of unspanned risk for the U.S. yield curve.

Karoll Gomez*, Universidad Nacional de Colombia.

In this paper, I formally test for the unspanning properties of liquidity premium risk in the context of a joint Gaussian affine term structure model for zero-coupon U.S. Treasury and TIPS bonds. In the model, the liquidity factor is considered as an additional factor that does not span the yield curve but improves the forecast of bond risk premia. I present empirical evidence suggesting that the liquidity premium indeed helps to forecast U.S. bond risk premia, but is not linearly spanned by the information in the joint yield curve. In addition, I show that the liquidity factor does not affect the dynamics of bonds under the pricing measure, but does affect them under the historical measure. Furthermore, the variation in the TIPS liquidity premium predicts the future evolution of the traditional yield curve factors.

10:30am–11:00am:

Semimartingale properties of the lower Snell envelope in optimal stopping under model uncertainty

Erick Treviño*, Universidad de Guanajuato

Optimal stopping under model uncertainty is a topic of recent research. The classical approach to characterizing the solution of optimal stopping is based on the Snell envelope, which can be seen as the value process as time runs. The analogous concept under model uncertainty is the so-called lower Snell envelope, and in this paper we investigate its structural properties. We give conditions under which it is a semimartingale with respect to one of the underlying probability measures and show how to identify the finite variation process by a limiting procedure. An example illustrates that without our conditions, the semimartingale property does not hold in general.

11:00am–11:30am:

On de Finetti’s problem under a time of ruin constraint

Mauricio Junca*, Universidad de Los Andes

We consider the classical de Finetti problem when the reserves are assumed to follow a spectrally negative Lévy process, subject to a constraint on the time of ruin. We introduce the dual problem and show that the complementary slackness conditions are satisfied, so there is no duality gap. Therefore the optimal value function can be obtained as the pointwise infimum of auxiliary value functions indexed by Lagrange multipliers. We also present a series of numerical examples.

Joint work with Camilo Hernández

11:30am–12m:

Utility maximization in a multi-dimensional semi-martingale setting with nonlinear wealth dynamics

Rafael Serrano*, Universidad del Rosario.

We explore martingale and convex duality techniques to study optimal investment strategies that maximize expected risk-averse utility from consumption and terminal wealth in a multi-dimensional semimartingale market model with absolutely continuous characteristics and non-linear wealth dynamics. This allows us to take account of market frictions such as different borrowing and lending interest rates or short positions with cash collateral and negative rebate rates. Our main result is a sufficient condition for the existence of optimal policies and their explicit characterization in the case of CRRA utility functions. We present numerical examples and some preliminary results for the case in which the investor’s final wealth is liable to deferred capital gains taxes or subject to further downside or expected-loss constraints.

12:00pm–12:30pm:

Optimizing the exercise boundary for the holder of an American option over a parametric family.

José Vidal Alcalá*, Centro de Investigación en Matemáticas, CIMAT

In the setting of American option pricing, we introduce an efficient stochastic optimization algorithm to find the optimal exercise boundary among a parametric family. We use the calculus of variations to write down a probabilistic representation of the payout sensitivity with respect to the exercise boundary parameter. This representation is used in a Monte Carlo estimator after the development of an accurate SDE discretization scheme for stopped diffusions. As an intermediate result, we are able to approximate deltas at the boundary for barrier options. Numerical simulations and an analysis of the algorithms are presented.

Self-exciting processes in Finance and Insurance for credit risk and longevity risk modelling in heterogeneous portfolios.

Nicole El Karoui, LPMA-UPMC.

Recent regulatory evolution in credit risk management suggests considering the credit risk of an aggregated portfolio as generated by a family of intercorrelated firms with default propagation. Redemption risk in life insurance is very sensitive to contagion effects driven by the level of external variables such as inflation and interest rates, but also by the behavior of the other insured. For longevity purposes in an actuarial and demographic context, the individual point of view (individual-based models, IBM) allows one to take into account specific individual characteristics (socio-economic status, education level, marital status…), as well as the age of individuals (or events).

The aim is to take population heterogeneity in characteristics or age into account, impacting the rate of evolution in a way that is easier to model than from the global point of view. Contagion effects, well known in seismology and in neuroscience with spike-and-wave patterns, but also in high-frequency trading, must also be included.

The simplest model of a contagion process is the Hawkes process, of which we give a new interpretation in terms of IBMs, allowing us to develop more complex Markovian models better suited to modelling redemption risk. For the age pyramid of a heterogeneous human population, we propose an extension of traditional birth and death processes. Using as source of randomness a $\sigma$-finite Poisson measure with characteristics augmented by a thinning parameter, the population process is described as the strong solution of a stochastic differential equation, based on complex (birth, move and death) rate processes depending on age, characteristics, past population and environmental factors. This strong representation permits easy comparisons.
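A Hawkes process with exponential kernel can be simulated by Ogata's thinning algorithm, which is a concrete instance of the thinning idea mentioned above; a minimal sketch with illustrative parameters (stability requires alpha < beta):

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, rng):
    """Ogata thinning for a Hawkes process with intensity
    lam(t) = mu + sum over past events t_i of alpha * exp(-beta * (t - t_i)).
    Requires alpha < beta for a subcritical (non-explosive) process."""
    t, events = 0.0, []
    excite = 0.0                          # self-excited part of the intensity
    while True:
        lam_bar = mu + excite             # valid upper bound: excitation only decays
        w = rng.exponential(1.0 / lam_bar)
        excite *= np.exp(-beta * w)       # decay excitation to the candidate time
        t += w
        if t > T:
            return np.array(events)
        if rng.uniform() * lam_bar <= mu + excite:   # accept w.p. lam(t)/lam_bar
            events.append(t)
            excite += alpha               # each event kicks the intensity up by alpha

rng = np.random.default_rng(3)
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, T=100.0, rng=rng)
```

Clusters of closely spaced events in the output are the contagion signature: each event raises the intensity by `alpha`, which then relaxes at rate `beta`.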

As an example, we reproduce by simulation the cohort effect, well known in the UK, where the cohort of people born between 1927 and 1940 showed a stronger improvement in life expectancy than neighbouring cohorts.

Estimation of volatility in the presence of high-activity jumps and noise. Jean Jacod (UPMC-Paris 6)

We consider an Itô semimartingale which is observed along a discrete (regular or not) time grid, within a fixed time interval. The observations are contaminated by noise, and the semimartingale has jumps with a degree of activity bigger than 1. Our aim is to revisit the estimation of the integrated volatility in such a setting: we use a mixture of the pre-averaging method (to eliminate noise) and of the empirical characteristic function method, which has been shown to be efficient (after proper de-biasing) even when the jump activity is bigger than 1, in contrast with most other methods.

This talk is a presentation of a joint work with Viktor Todorov, from Northwestern University.
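To fix ideas, here is a minimal sketch of the pre-averaging ingredient alone, applied to a continuous (jump-free) simulated path with i.i.d. noise. The weight function g(x) = min(x, 1 - x), the window of size about theta * sqrt(n), and the de-biasing constants follow the standard pre-averaging construction; this is not the combined characteristic-function estimator of the talk.

```python
import numpy as np

def preaveraged_iv(y, theta=0.5):
    """Pre-averaged realized variance for noisy log-prices y_0, ..., y_n,
    with weight g(x) = min(x, 1 - x) and window k ~ theta * sqrt(n);
    the second term removes the bias caused by the microstructure noise."""
    dy = np.diff(y)
    n = len(dy)
    k = int(np.ceil(theta * np.sqrt(n)))
    j = np.arange(1, k)
    g = np.minimum(j / k, 1.0 - j / k)
    psi2 = np.sum(g**2) / k                            # discrete proxy for int g^2
    psi1 = k * np.sum(np.diff(np.r_[0.0, g, 0.0])**2)  # discrete proxy for int g'^2
    ybar = np.convolve(dy, g[::-1], mode="valid")      # weighted local averages
    noise_var = np.sum(dy**2) / (2.0 * n)              # rough noise variance estimate
    return np.sum(ybar**2) / (k * psi2) - psi1 * n * noise_var / (k**2 * psi2)

# simulated check: Brownian path with sigma = 0.2 (so IV = 0.04) plus i.i.d. noise
rng = np.random.default_rng(4)
n = 23_400
x = np.concatenate([[0.0],
                    np.cumsum(0.2 * np.sqrt(1.0 / n) * rng.standard_normal(n))])
y = x + 5e-4 * rng.standard_normal(n + 1)
iv_hat = preaveraged_iv(y)
```

The local averages smooth the noise away at the cost of a slower n^(1/4) rate; with jumps present, this plain version would be biased, which is exactly where the empirical characteristic function step of the talk comes in.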

4:30pm–5:00pm

Liquidity risk and optimal dividend/investment strategies. (Contributed talk)

Vathana Ly Vath*, ENSIIE; Etienne Chevalier, University of Evry; Mhamed Gaigi, ENIT

In this paper, we consider the problem of determining an optimal control on the dividend and investment policy of a firm operating under an uncertain environment and risk constraints. In classical models in corporate finance, it is generally assumed that a firm’s assets are either infinitely liquid or infinitely illiquid. This is particularly the case in the study of optimal dividend and investment policies. In Jeanblanc-Shiryaev (95), Asmussen-Taksar (97), and Choulli-Taksar-Zhou (03), it is assumed that the firm’s assets may be separated into two types: highly liquid assets which may be assimilated to cash reserves, i.e. cash and equivalents, and infinitely illiquid assets, i.e. productive assets that may not be sold. As such, when the cash reserve gets near the bankruptcy point, the firm manager may not be able to inject any cash by selling parts of the non-liquid assets. Some extensions of the above studies have been investigated, see for instance Decamps-Villeneuve (05) and LyVath-Pham-Villeneuve (08), where investments are allowed. However, the core assumption of two different types of assets, highly liquid and infinitely illiquid, still remains.

In our paper, we no longer simplify the optimal dividend and investment problem by assuming that the firm’s assets are either infinitely illiquid or liquid. For the same reason as highlighted in financial market problems, it is necessary to take the liquidity constraints into account. More precisely, investment (acquiring productive assets) and disinvestment should be possible, but not necessarily at fair value. The firm may face some liquidity costs when buying or selling assets. While taking liquidity constraints and costs into account has become the norm in recent financial market problems, it is still not the case in corporate finance, to the best of our knowledge. In our paper, we consider that the company’s assets are separated into two categories: cash and equivalents, and risky assets which are subject to liquidity costs. The risky assets are assimilated to productive assets which may be increased or decreased when the firm decides to invest or disinvest. The objective of the firm manager is to find the optimal dividend and investment strategy maximizing the shareholders’ value. Mathematically, we formulate this problem as a combined multidimensional singular and multi-regime switching problem. The studies most relevant to our problem include Guo-Tomecek (08), LyVath-Pham-Villeneuve (08), and Chevalier-LyVath-Scotti (13). By incorporating uncertainty into the illiquid assets’ value, we no longer deal with a uni-dimensional control problem but with a bi-dimensional singular and multi-regime switching problem. In such a setting, it is clear that it is no longer possible to obtain explicit or quasi-explicit optimal strategies. Consequently, to determine the four regions comprising the continuation, dividend and investment/disinvestment regions, numerical resolution is required.

5:00pm–5:30pm

Financial Models with Defaultable Numéraires

Sergio Pulido Nino*, ENSIIE / Université d’Évry; Travis Fisher; Johannes Ruf

Financial models are studied where each asset may potentially lose value relative to any other. To this end, the paradigm of a pre-determined numéraire is abandoned in favour of a symmetrical point of view where all assets have equal priority. This approach yields novel versions of the Fundamental Theorems of Asset Pricing, which clarify and extend non-classical pricing formulas used in the financial community. Furthermore, conditioning on non-devaluation, each asset can serve as proper numéraire and a classical no-arbitrage condition can be formulated. It is shown when and how these local conditions can be aggregated to a global no-arbitrage condition.

5:30pm–6:00pm

The Sustainable Black-Scholes Equation

Stéphane Crépey*, University of Evry; Yannick Armenti, University of Evry and LCH.Clearnet; Chao Zhou, National University of Singapore

In incomplete markets, a basic Black-Scholes perspective has to be complemented by the valuation of market imperfections. Otherwise this results in Black-Scholes Ponzi schemes, such as the ones at the core of the last global financial crisis, where ever more derivatives are issued to remunerate the capital required by the already opened positions. In this paper we consider the sustainable Black-Scholes equations that arise for a portfolio of options if one adds to their trade-additive Black-Scholes price, on top of a nonlinear liquidity funding cost, the cost of remunerating at a hurdle rate the residual risk left by imperfect hedging. In addition, we assess the impact of model uncertainty in this setup.