Jun 15, Wed
Second International Congress on Actuarial Science and Quantitative Finance
Jun 15 – Jun 18 all-day
Jun 17, Fri
Plenary Talk: Path-Dependent Volatility. Julien Guyon. Bloomberg LP. New York, US @ Auditorio Paraninfo. Claustro San Agustín. Universidad de Cartagena
Jun 17 @ 8:00 am – 9:00 am

Path-Dependent Volatility
Julien Guyon
Bloomberg LP. New York, US

So far, path-dependent volatility models have drawn little attention from both practitioners and academics compared to local volatility and stochastic volatility models. This is unfair: in this talk we show that they combine benefits from both. Like the local volatility model, they are complete and can fit exactly the market smile; like stochastic volatility models, they can produce rich implied volatility dynamics. Not only that: given their huge flexibility, they can actually generate a much broader range of spot-vol dynamics, thus possibly preventing large mispricings, and they can also capture prominent historical patterns of volatility. We give many examples to showcase their capabilities.

Plenary talk: Aggressive Backtesting of Stochastic Loss Reserve Models – Where It Leads Us. Glenn Meyers, ISO Innovative Analytics, New York USA. @ Auditorio Paraninfo. Claustro San Agustín. Universidad de Cartagena.
Jun 17 @ 9:00 am – 10:00 am

Aggressive Backtesting of Stochastic Loss Reserve Models – Where It Leads Us
Glenn Meyers
ISO Innovative Analytics, New York, USA

In 2012 a database of loss triangles was posted on the Casualty Actuarial Society website. This database includes several hundred loss triangles compiled from the Schedule P exhibits that American insurers report to the National Association of Insurance Commissioners. The database also includes the subsequent outcomes that were reported after each original loss triangle was filed.
Since the database was compiled, the speaker has used it to test the predictions of two currently popular stochastic loss reserve models and found some shortcomings of these models. The talk will discuss new models, fit with Bayesian MCMC algorithms, that address these shortcomings. The talk will then go on to show how these models can be used to address the current problems of dependencies between lines of insurance and cost-of-capital risk margins.

Keywords: Bayesian MCMC, Stochastic Loss Reserving, Dependencies, Risk Margins, Schedule P

Coffee Break
Jun 17 @ 10:00 am – 10:30 am
Contributed talks 5: Practitioner–Finance** (English, Spanish) @ Auditorio Paraninfo. Claustro San Agustín. Universidad de Cartagena.
Jun 17 @ 10:30 am – 12:00 pm

10:30am–11:00am:
A systematic view on price based trading strategies
A. Christian Da Silva*, Dunn Capital; Fernando Ferreira, USP; Ju-Yi Yen, University of Cincinnati
We study trading strategies that use historical price data to predict future asset performance. Important and well-known examples are short-term and long-term reversal \cite{Bondt1989}, momentum and trend-following \cite{JT1993,BP}. A distinction between these strategies is the length of the historical window used to predict the future. In particular, short-term reversal appears for portfolios built using historical data up to one month, trend-following and momentum are generally implemented using a few months to one year, and long-term reversal uses a few years of historical data.
We study such strategies by assuming that the asset log-returns $x_{i}$ are Gaussian random variables with drift $\mu$, variance $V$ and autocorrelation $\rho$ \cite{FSY}. We further assume that we can trade proportionally to a simple moving average $m(T)$ with $T$ terms, and we calculate the exact expression for the average performance of the strategy and its variance \cite{FSY}.
Empirically, we identify the different regimes (from short-term reversal to long-term reversal and beyond) by presenting the Sharpe ratio of our strategy, applied to 120 years of the DJIA, as a function of $T$.
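(Illustrative sketch, not the authors' code: the setting above can be mimicked by simulating Gaussian log-returns with lag-one autocorrelation, trading proportionally to a T-term simple moving average, and reading off the Sharpe ratio as a function of T. All parameter values and function names below are assumptions chosen for illustration.)

    import numpy as np

    def simulate_returns(n, mu=2e-4, v=1e-4, rho=0.05, seed=0):
        # AR(1) Gaussian log-returns with drift mu, stationary variance v, autocorrelation rho
        rng = np.random.default_rng(seed)
        x = np.empty(n)
        x[0] = mu + rng.normal(0.0, np.sqrt(v))
        for i in range(1, n):
            x[i] = mu + rho * (x[i - 1] - mu) + rng.normal(0.0, np.sqrt(v * (1 - rho**2)))
        return x

    def strategy_sharpe(x, T):
        # position at time t is proportional to the T-term simple moving average of past returns
        m = np.convolve(x, np.ones(T) / T, mode="valid")[:-1]
        pnl = m * x[T:]                      # trade after the window, earn the next return
        return pnl.mean() / pnl.std() * np.sqrt(252)

    x = simulate_returns(252 * 120)          # roughly 120 years of daily data
    for T in (5, 21, 63, 252, 1260):
        print(T, round(strategy_sharpe(x, T), 2))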

11:00am–11:30am (Spanish):
An Application of Extreme Value Theory for Measuring Financial Risk in the Uruguayan Pension Fund
Guillermo Magnou*, SURA
Traditional methods for financial risk measurement adopt the normal distribution as a model of financial return behavior. Assessing the probability of rare and extreme events is an important issue in the risk management of financial portfolios. In this paper, we use the Peaks Over Threshold (POT) model of Extreme Value Theory (EVT) and the Generalized Pareto Distribution (GPD), which can give a more accurate description of the tail distribution of financial losses. The EVT and POT techniques provide well-established statistical models for the computation of extreme risk measures such as the return level, Value at Risk and Expected Shortfall. We apply these techniques to a series of daily losses of AFAP SURA over an 18-year period (1997-2015); AFAP SURA is the second largest pension fund in Uruguay, with more than 310,000 clients and assets under management over USD 2 billion.
Our major conclusion is that the POT model can be useful for assessing the size of extreme events. VaR approaches based on the assumption of a normal distribution overestimate low percentiles (due to the high variance estimation) and underestimate high percentiles (due to heavy tails). The absence of extreme values under the normality assumption also leads to underestimation of the Expected Shortfall at high percentiles. In contrast, the extreme-value approach based on the POT model seems coherent with respect to the actual losses observed and is easy to implement.
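(A minimal sketch of the POT workflow described above, assuming scipy is available; the threshold, confidence level and the heavy-tailed stand-in sample are illustrative, not the AFAP SURA series.)

    import numpy as np
    from scipy import stats

    def pot_var_es(losses, threshold, p=0.99):
        # fit a Generalized Pareto Distribution to exceedances over the threshold
        losses = np.asarray(losses)
        exceed = losses[losses > threshold] - threshold
        n, n_u = len(losses), len(exceed)
        xi, _, beta = stats.genpareto.fit(exceed, floc=0.0)
        # standard POT formulas for Value at Risk and Expected Shortfall (valid for xi < 1)
        var = threshold + beta / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)
        es = (var + beta - xi * threshold) / (1 - xi)
        return var, es

    rng = np.random.default_rng(1)
    sample = rng.standard_t(df=4, size=5000)      # heavy-tailed stand-in for daily losses
    u = np.quantile(sample, 0.95)
    print(pot_var_es(sample, threshold=u, p=0.99))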

11:30am–12:00pm (Spanish):
Empirical Approach to the Heston Model Parameters on the Exchange Rate USD / COP
Carlos Grajales*, Universidad de Antioquia; Santiago Medina, Universidad Nacional de Colombia
This work proposes an empirical calibration of the Heston stochastic volatility model for the exchange rate USD/COP. The parameters are estimated by an algorithm that simulates trajectories of the exchange rate under the Heston model and matches the probability distribution of the simulated paths with the probability distribution of the real exchange rate. The calibration uses both the two-sample KS test and Nelder-Mead simplex direct search. The results show that, although multiple optimal parameter values can be obtained depending on the initial parameter vector, one of these can be chosen according to financial market information. Open problems suggested by the underlying dynamics presented relate to the valuation of derivatives and risk measures.
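(A hedged sketch of the calibration idea: Euler-simulate daily Heston log-returns, measure the two-sample Kolmogorov-Smirnov distance to the observed returns, and minimize it with Nelder-Mead. The placeholder data, step sizes and starting parameters are assumptions, not the USD/COP series or the authors' exact algorithm.)

    import numpy as np
    from scipy import stats, optimize

    def heston_daily_log_returns(params, n_days, dt=1 / 252, seed=2):
        # full-truncation Euler path of the Heston model; returns the daily log-returns
        kappa, theta, sigma, rho, v0, mu = params
        rho = float(np.clip(rho, -0.99, 0.99))
        rng = np.random.default_rng(seed)       # fixed seed keeps the objective deterministic
        v = v0
        out = np.empty(n_days)
        for i in range(n_days):
            z1, z2 = rng.standard_normal(2)
            z2 = rho * z1 + np.sqrt(1 - rho**2) * z2
            vp = max(v, 0.0)
            out[i] = (mu - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1
            v = v + kappa * (theta - vp) * dt + sigma * np.sqrt(vp * dt) * z2
        return out

    def ks_objective(params, observed):
        sim = heston_daily_log_returns(params, n_days=len(observed))
        return stats.ks_2samp(observed, sim).statistic

    observed = np.random.default_rng(3).normal(0.0, 0.01, size=1000)   # placeholder returns
    x0 = [2.0, 0.02, 0.3, -0.3, 0.02, 0.05]     # kappa, theta, sigma, rho, v0, mu (annualized)
    res = optimize.minimize(ks_objective, x0, args=(observed,), method="Nelder-Mead",
                            options={"maxiter": 500})
    print(res.x, res.fun)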

Contributed talks 6: Practitioner–Actuarial Science** & Academic–Actuarial Science–Education** (English, Spanish) @ Aula Máxima de Derecho. Claustro de San Agustín.
Jun 17 @ 10:30 am – 12:00 pm

10:30am–11:00am:
Value-at-Risk Estimation of Aggregated Risks Using Marginal Laws and Some Dependence Information
ANDRES CUBEROS ACEVEDO*, SCOR; Esterina Masiello, Univ Lyon, Institut Camille Jordan; Veronique Maume-Deschamps, Univ Lyon, Institut Camille Jordan
Estimating the Value-at-Risk of aggregated variables (mainly sums or weighted sums) is crucial in risk management for many application fields such as finance, insurance and the environment. This question has been widely treated, but new efficient methods are always welcome, especially if they apply in (relatively) high dimension. We propose an estimation procedure based on a checkerboard approximation of the empirical copula. It allows one to obtain good estimates from a (quite) small sample of the multivariate law together with full knowledge of the marginal laws. This situation is realistic for many applications. Estimates may be improved by including in the checkerboard approximation some additional information (on the law of a sub-vector or on extreme probabilities). Our approach is illustrated by numerical examples.
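(A small sketch of the checkerboard idea under stated assumptions: build a piecewise-uniform approximation of the empirical copula on an m^d grid from a small sample, resample from it, push the uniforms through the known marginal quantile functions, and read off the VaR of the sum. Grid size, marginals and sample are all illustrative.)

    import numpy as np
    from scipy import stats

    def checkerboard_var_of_sum(sample, marginal_ppfs, m=8, n_sim=100_000, p=0.99, seed=4):
        rng = np.random.default_rng(seed)
        n, d = sample.shape
        u = stats.rankdata(sample, axis=0) / (n + 1)          # pseudo-observations
        cells = np.minimum((u * m).astype(int), m - 1)        # checkerboard cell of each observation
        idx = rng.integers(0, n, size=n_sim)                  # resample cells with empirical weights
        v = (cells[idx] + rng.random((n_sim, d))) / m         # uniform draw inside each cell
        x = np.column_stack([ppf(v[:, j]) for j, ppf in enumerate(marginal_ppfs)])
        return np.quantile(x.sum(axis=1), p)

    rng = np.random.default_rng(5)
    small_sample = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=200)
    marginals = [stats.lognorm(0.5).ppf, stats.gamma(2.0).ppf]    # marginals assumed fully known
    print(checkerboard_var_of_sum(small_sample, marginals))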

11:00am–11:30am:
Basel and Solvency: Brazilian Experience
ELIZABETH BORELLI*, PUCSP; FABIANA SILVA, PUCSP; VIVIAN CANOAS, PUCSP
This article discusses the instruments related to the economic stability and solvency of financial institutions such as banks, insurers and reinsurers, with regard to the guaranteed assets and rights of investors and insureds, as well as their financial impact on the Brazilian market. The analysis assumes that the capital requirement is the most important and powerful protection tool for financial institutions against the risk of insolvency: accords like Basel and Solvency are intended to build reserves capable of supporting fluctuations in the risks inherent to the activities these institutions perform, acting as mechanisms for absorbing unexpected losses. In this context, the history of these agreements worldwide is briefly presented, followed by an analysis of their implementation and impact in Brazil, where regulators have also created procedures to suit the market by adapting the legislation for banks, insurers and reinsurers and introducing new capital requirements. It is concluded that this process of evolution and regulation must be constantly monitored so that new conditions can be incorporated into the models to ensure the solvency of the market. It was found that profitability values and ratios for banks were reduced due to the higher capital requirements and the consequent reduction in financial leverage ratios after the implementation of the Basel I and Basel II frameworks in Brazil. In the insurance market, the higher capital requirements penalized smaller companies, which can reduce competitiveness. However, despite the somewhat unfavorable financial impact, it is important to consider the positive effect of greater stability for the system.

11:30am–12:00pm (Spanish):
On four documents by Julio Garavito on actuarial mathematics and insurance
Fabio Ortiz*, U. Externado-U de los Andes
We present four documents by the Colombian engineer Julio Garavito (1865-1920) in which he dealt with topics in actuarial mathematics and insurance.
Julio Garavito was a professor at Universidad Nacional de Colombia from around 1895 to 1920. His interest in several fields of pure and applied mathematics and his self-cultivated scientific curiosity made this engineer one of the few professors of his time who published articles on several subjects.
One of these interests was actuarial mathematics and insurance. We comment on four articles in this field, three of which are unpublished:
Calculation of premiums and reserves of life insurance, prepared for the Sociedad Nacional de Seguros (1902-1903): in this document he uses the text of E. Dormoy (Dormoy (1878)) to explain the theory behind the calculation of the different premiums of a policy, and he explains the use of a mortality table, for which purpose he uses tables from the United States and England. Although the document was made for the calculations of the insurance company and is unpublished, a description of it and some comments appear in Ortiz (2014).
The second document, Notes on the insurance companies, is an unpublished exposition on actuarial mathematics in which he explains the calculation of premiums and reserves on one or two heads. According to the notation used, the exposition is based on Dormoy (1878).
The third one is Seguro Agricola, published in 1931 although it was presented to the Agricultural Congress in 1911. In this exposition he explains the advantages of promoting agricultural insurance and policies for the reinforcement of this economic activity in Colombia.
The fourth document is Compania Cooperativa de Constructores. In this document he deals with a cooperative association in which a number of members can obtain the ownership of a house in such a way that the cooperative holds the mortgages and the houses are allotted to a group of members by lottery. The problem is to calculate the number of members who must take loans for the houses to be allotted over a number of periods so that the cooperative association can carry all the mortgages.

SOCIEDAD LATINOAMERICANA DE ACTUARÍA Y FINANZAS @ Auditorio Paraninfo. Claustro San Agustín. Universidad de Cartagena.
Jun 17 @ 12:00 pm – 12:30 pm

SOCIEDAD LATINOAMERICANA DE FINANZAS CUANTITATIVAS Y CIENCIAS ACTUARIALES
ICASQF Organizing Committee

You are invited to the presentation of the Sociedad Latinoamericana de Finanzas Cuantitativas y Ciencias Actuariales. A brief description of the objectives, mission and scope of the society will be given. The invitation is open to all students, professionals, researchers and educators interested in the academic strengthening of quantitative modelling in finance and actuarial science in Latin America.

Tour in Chiva
Jun 17 @ 2:00 pm – 6:00 pm

Traditional “Chiva” bus tour: Bocagrande, Cartagena Bay, Monument of the Old Shoes, “La Popa” Monastery, San Felipe Castle (entrance included), old vaults. Open to all registered participants (no extra fee); confirmation is compulsory.

The meeting point is at 2:30pm at the old vaults “Las Bovedas Artesanales”. There are limited places available, so please return the form included in the registration package to the registration desk as soon as possible. Forms will only be received until 5:00 pm on the first day of the conference (Wednesday June 15, 2016).

Congress Dinner
Jun 17 @ 8:00 pm – 11:00 pm

Open to all registered participants (dinner fee US$25, COP$75,000), confirmation is compulsory (fill out form on the 1st day of the conference).

The congress dinner will be held in the Restaurant-Bar “El Baluarte San Francisco Javier”, located on the walls of the historic city center in front of the naval museum and diagonal to hotel Charleston Santa Teresa (see the map). The congress dinner is not included in the conference fee. The cost of the congress dinner is COP$75,000 or US$27. The dinner fee includes: bar entrance, coconut lemonade, soft drink or water, a main dish (grilled tenderloin or grilled vegetables) with a side dish, chocolate cake, live music and fantastic views. Alcoholic drinks are not included.

There are limited places available. Forms will only be received until 5:00 pm on the first day of the conference (Wednesday June 15, 2016).

Jun 18, Sat
Contributed Talk 8: Academic–Finance–Options, Futures, Stochastic Volatility, Real Options, Energy Finance, Hybrids** @ Aula Máxima de Derecho. Claustro de San Agustín.
Jun 18 @ 8:00 am – 10:00 am

8:00am–8:30am:
Arbitrage-Free XVA (Invited Session talk –Stéphane Crépey)
Agostino Capponi*, Columbia University; Stephan Sturm, WPI; Maxim Bichuch, Johns Hopkins University
We develop a framework for computing the total valuation adjustment (XVA) of a European claim accounting for funding costs, counterparty credit risk, and collateralization. Based on no-arbitrage arguments, we derive backward stochastic differential equations (BSDEs) associated with the replicating portfolios of long and short positions in the claim. This leads to the definition of buyer’s and seller’s XVA, which in turn identify a no-arbitrage interval. In the case that borrowing and lending rates coincide, we provide a fully explicit expression for the uniquely determined XVA, expressed as a percentage of the price of the traded claim, and for the corresponding replication strategies. In the general case of asymmetric funding, repo and collateral rates, we study the semilinear partial differential equation (PDE) characterizing buyer’s and seller’s XVA and show the existence of a unique classical solution to it. To illustrate our results, we conduct a numerical study demonstrating how funding costs, repo rates, and counterparty risk contribute to determine the total valuation adjustment.

8:30am–9:00am:
Option pricing under time-varying risk-aversion with applications to risk forecasting
Ruediger Kiesel*, University Duisburg-Essen; Florentin Rahe, University Ulm
We present a new option-pricing model, which explicitly captures the difference in the persistence of volatility under historical and risk-neutral probabilities. The model also allows us to capture the empirical properties of pricing kernels, such as time-variation and the typical S-shape. We apply our model for two purposes. First, we analyze the risk preferences of market participants invested in S&P 500 index options during 2001–2009. We find that risk-aversion strongly increases during stressed market conditions and relaxes during normal market conditions. Second, we extract forward-looking information from S&P 500 index options and perform out-of-sample Value-at-Risk (VaR) forecasts during the period of the subprime mortgage crisis. We compare the VaR forecasting performance of our model with four alternative VaR models and find that 2-factor stochastic volatility models have the best forecasting performance.

9:00am–9:30am:
Log-Skew-Normal Mixture Model For Option Valuation
VISWANATHAN ARUNACHALAM*, UNIV. NACIONAL DE COLOMBIA; Jose Jiménez, Univ Nacional de Colombia
There is good empirical evidence that financial series, whether stocks or indices, currencies or interest rates, do not follow the log-normal random walk underlying the Black-Scholes model, which is the basis for most of the theory of option valuation. In this article, we present an alternative approach for calculating the price of call and put options when the stock return distribution follows a Log-Skew-Normal mixture distribution. We obtain explicit expressions for the price of European options and formulas for the Greeks. We also analyze the at-the-money implied volatility, derive an expression for the implied volatilities at and around the money, and discuss their asymptotic behavior. We present some numerical results for the calibration to real market option data.
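(A simplified stand-in, not the paper's formula: with a plain lognormal mixture, and without the skew-normal component considered in the talk, the European call price is just the weighted average of Black-Scholes prices of the mixture components. Weights and volatilities below are arbitrary.)

    import numpy as np
    from scipy.stats import norm

    def bs_call(s, k, r, sigma, t):
        d1 = (np.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
        d2 = d1 - sigma * np.sqrt(t)
        return s * norm.cdf(d1) - k * np.exp(-r * t) * norm.cdf(d2)

    def mixture_call(s, k, r, t, weights, sigmas):
        # mixture-of-lognormals risk-neutral density -> weighted sum of component prices
        return sum(w * bs_call(s, k, r, sig, t) for w, sig in zip(weights, sigmas))

    print(mixture_call(s=100, k=100, r=0.02, t=0.5, weights=[0.7, 0.3], sigmas=[0.15, 0.35]))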

9:30am–10:00am:
On American swaptions under the linear-rational framework
Yerkin Kitapbayev*, Boston University; Damir Filipovic, EPFL
In this paper we study the American version of swaptions under the linear-rational term structure model (Filipovic, Larsson and Trolle (2014)). This framework enables us to simplify the pricing problem significantly and to formulate the corresponding optimal stopping problem for a diffusion process. The latter problem reduces to a free-boundary problem which we tackle by local time-space calculus (Peskir (2005)). We characterize the optimal stopping boundary as the unique solution to a nonlinear integral equation, and using this we obtain the arbitrage-free price of the American swaption and the optimal exercise strategies in terms of swap rates for both fixed-rate payer and receiver swaptions.

Contributed Talks 7: Academic–Actuarial Science–Life, Pensions, and Other** @ Auditorio Piso 2. Casa Museo Arte y Cultura la Presentación.
Jun 18 @ 8:00 am – 9:30 am

8:00am–8:30am:
On Reinsurance by Capital Injections in a Brownian perturbed Risk Model
Zied Ben Salah*, American University in Cairo; Jose Garrido, Concordia University
In this paper we consider a risk model where the deficits after ruin can be covered by reinsurance contracts with different levels of retention. To allow the insurance company to survive after ruin, the reinsurer has to inject additional capital. Assuming that these injections are obtained through a reinsurance agreement, the problem is to determine the reinsurance premium using the discounted expected value of capital injections. Inspired by results of Z. Ben Salah (2014), we show that an explicit formula for the reinsurance premium exists in a setting involving accumulated claims modeled by a subordinator and Brownian perturbation. We illustrate this result by specific examples when there is no Brownian perturbation.

8:30am–9:00am:
The retrospective loss random variable and its relevance in actuarial science
Emiliano Valdez*, University of Connecticut
In this paper, we define a retrospective loss random variable and mathematically demonstrate that its expectation is the retrospective reserve, which in turn equals the prospective reserve. By defining an associated random variable for the retrospective reserve, similar to the prospective loss random variable for the prospective reserve, we can explore various properties of the retrospective loss random variable. We demonstrate that the retrospective loss random variable provides us with valuable historical information on how actual experience varies from reserving assumptions and whether it is significant enough to adjust the prospective reserves for the business. The paper concludes with a model of a block of in-force policies with actual experience different from reserving assumptions, and a rigorous and consistent methodology on how prospective reserves could be adjusted based on the realized retrospective loss random variable. This is joint work with J. Vadiveloo, G. Niu and G. Gan.

9:00am–9:30am:
Optimal decumulation into annuity after retirement: a stochastic control approach.
Nicolas Langrené*, CSIRO; Thomas Sneddon, CSIRO; Geoffrey Lee, CSIRO; Zili Zhu, CSIRO
Around the world, public and private pension systems continue to shift gradually from traditional defined benefit plans to defined contribution plans. Under this new system, individuals contribute to their own individual retirement account, and must manage it throughout retirement. In particular, contrary to defined benefit plans, individuals face longevity risk, i.e. the risk that their retirement savings are not enough to cover their consumption if they were to live longer than expected.
A natural product to hedge this longevity risk would be an inflation-protected life annuity, which, in exchange for an immediate lump sum payment, delivers a steady stream of payments to the purchaser as long as he lives. However, most retirees do not voluntarily annuitize any of their savings. Prospect theory provides some insights on this puzzle: framed as an investment product, an annuity is seen as a massive, unappealing bet on one’s life expectancy. In particular, the perceived risk of losing one’s entire principal if dying shortly after the annuity purchase outweighs the more distant risk of running out of money if living longer than expected.
Between these two opposites that are no annuitization and immediate full annuitization, we propose an intermediate strategy: progressive decumulation into annuity. Entering gradually into annuity keeps longevity risk at bay, while mitigating shortcomings of immediate full annuitization: inflexibility, risk of illiquidity, and risk of quick loss of the whole principal. Moreover, if the dynamic shift into annuity takes heed of the market conditions, the retiree can keep benefiting from the equity premium.
Mathematically, finding the best gradual shift into annuity can be expressed as a simple stochastic control problem. We solve it numerically using an extension of the least-squares Monte Carlo algorithm, as it allows for great flexibility on the dynamics of the risk factors and on the mortality model. More precisely, in order to obtain realistic and consistent simulations of the risk factors over the whole retirement period, we extend the classical Wilkie investment model into a more general cascade structure containing all the relevant variables in the economy.
Several objective functions were tested: expected utility, probability of ruin, and more sophisticated combinations. In each case, allowing for dynamic purchase of annuity improves substantially the financial situation of the retiree. This is illustrated on the dynamic evolution of the distribution of the net wealth of the retiree. Moreover, the algorithm provides the explicit policy to follow over time to obtain these observed improvements in practice.
Our results can either be used by retirees to make informed decisions on annuity purchase over time, or by insurance companies to package this dynamic strategy into a new kind of annuity package featuring delayed, gradual initial payment and equity-linked, inflation-protected gradual payout streams.

Invited Session–Andrea Pascucci * @ Auditorio Paraninfo. Claustro San Agustín. Universidad de Cartagena.
Jun 18 @ 8:00 am – 10:00 am

8:00am–8:30am:
PDE models for pricing fixed rate mortgages and their insurance and coinsurance
Carmen Calvo-Garrido, Carlos Vázquez*
In the pricing of fixed rate mortgages with prepayment and default options, we introduce jump-diffusion models for the house price evolution. These models take into account sudden changes in the price (jumps) during bubbles and crisis situations in real estate markets. After posing the models, based on partial integro-differential equation (PIDE) problems, for the contract, the insurance and the fraction of the total loss not covered by the insurance (coinsurance), we propose appropriate numerical methods to solve them. Among these methods are semi-Lagrangian schemes for time discretization combined with finite elements, the ALAS algorithm for inequality constraints and quadrature formulas for the nonlocal terms.

8:30am–9:00am
Randomised Heston models
Antoine Jacquier*
Inspired by recent works on the behaviour of the forward implied volatility smile, we introduce a new class of stochastic volatility models. The dynamics are the same as in the classical Heston model, but the starting point of the variance process is randomly distributed. We show how to choose the initial distribution (i) to fit the short end of the smile, traditionally mis-calibrated in classical stochastic volatility models, and (ii) to estimate past realisations of the volatility time series. This is a joint work with Fangwei Shi (Imperial College London).

9:00am–9:30am
Pricing Bermudan options under local Levy models with default
Anastasia Borovykh, Andrea Pascucci* and Cornelis W. Oosterlee
We consider a defaultable asset whose risk-neutral pricing dynamics are described by an exponential Levy-type martingale. This class of models allows for local volatility, local default intensity and a locally dependent Levy measure. We present a pricing method for Bermudan options based on an analytical approximation of the characteristic function combined with the COS method. We derive the adjoint expansion of the characteristic function using a Taylor expansion of the coefficients. Due to the special form of the obtained characteristic function, the price can be computed using a Fast Fourier Transform-based algorithm, resulting in a fast and accurate calculation.

9:30am–10:00am
Backtesting Lambda Value at Risk
Jacopo Corbetta*, Ecole des Ponts – ParisTech; Ilaria Peri, University of Greenwich
A new risk measure, the lambda value at risk ($\Lambda VaR$), has recently been proposed from a theoretical point of view as an immediate generalization of the value at risk ($VaR$). The $\Lambda VaR$ appears to be attractive for its potential ability to solve several problems of the $VaR$. In this paper we propose three nonparametric backtesting methodologies for the $\Lambda VaR$ which exploit different features. Two of these tests directly assess the correctness of the level of coverage predicted by the model; one of them is bilateral and provides an asymptotic result. A third test assesses the accuracy of the $\Lambda VaR$ that depends on the choice of the P&L distribution; however, this test requires the storage of more information. Finally, we perform a backtesting exercise and compare our results with those from Hitaj and Peri (2015).
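(For context only: a classical unconditional-coverage backtest for ordinary VaR, in the spirit of checking the level of coverage; the three $\Lambda VaR$ tests proposed in the paper are different and are not reproduced here. The violation counts below are made up.)

    import numpy as np
    from scipy.stats import chi2

    def unconditional_coverage_test(n_violations, n_obs, alpha=0.01):
        # likelihood-ratio test of H0: violation probability equals alpha
        x, n = n_violations, n_obs
        pi_hat = x / n
        lr = -2 * (x * np.log(alpha) + (n - x) * np.log(1 - alpha)
                   - x * np.log(pi_hat) - (n - x) * np.log(1 - pi_hat))
        return lr, 1 - chi2.cdf(lr, df=1)

    print(unconditional_coverage_test(n_violations=16, n_obs=1000, alpha=0.01))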

Coffee Break
Jun 18 @ 10:00 am – 10:30 am
Contributed Talks 10: Academic–Finance–Economics, Market Microstructures** (English & Spanish) @ Aula Máxima de Derecho. Claustro de San Agustín.
Jun 18 @ 10:30 am – 12:30 pm

10:30am–11:00am:
On the behavior of the price impact in the Kyle-Back model.
José Corcuera*, University of Barcelona
In this paper we study the equilibrium arising in the Kyle-Back model when we allow the depth parameter to be random. This richer model can explain different phenomena observed in financial markets that have so far remained unexplained. We also unify different extensions of the Kyle-Back model in this framework.

11:00am–11:30am:
Optimal market dealing under constraints.
Etienne Chevalier*, Université d’Evry; Mhamed Gaigi; Vathana Ly Vath, ENSIIE; Mohammed Mnif, ENIT.
We consider a market dealer acting as a liquidity provider by continuously setting bid and ask prices for an illiquid asset in a quote-driven market. The market dealer may benefit from the bid-ask spread but has the obligation to permanently quote both prices while satisfying some liquidity and inventory constraints. The objective is to maximize the expected utility from terminal liquidation value over a finite horizon and subject to the above constraints. We characterize the value function as the unique viscosity solution to the associated HJB equation and further enrich our study with numerical results. The contributions of our study concern both the modelling aspects and the dynamic structure of the control strategies. Important features and constraints characterizing market making problems are no longer ignored. Indeed, along with the obligation to continuously quote bid and ask prices, we do not allow the market maker to stop quoting them when the stock inventory reaches its lower or higher bound. Furthermore, we no longer assume the existence of a reference price.

11:30am–12:00pm (Spanish):
Who knows better in an Emerging Market? Performance of Institutions, Foreigners and Individuals.
Diego Agudelo*, Universidad EAFIT; James Byder, Universidad EAFIT; Paula Yepes, Universidad EAFIT
We find that local investors do better than foreigners in terms of trading execution. However, foreign investors obtain better returns than local individuals in both the short and the long term. Local institutions are the best group on both dimensions. Our results reconcile apparent contradictions in the international finance literature on who invests better in an emerging market: these contradictions disappear with a more careful formulation of the research question at hand. The traditional locals-versus-foreigners or institutions-versus-individuals comparison is too simplistic because it does not distinguish between the different dimensions of performance. Our study makes use of two unique databases of Colombian stocks and acts as an out-of-sample test of previous findings. Moreover, we provide evidence that the better performance of institutions and foreigners is driven by information advantages.

12:00pm–12:30pm (Spanish):
An empirical analysis of unspanned risk for the U.S. yield curve.
Karoll Gomez*, Universidad Nacional de Colombia.
In this paper, I formally test for the unspanning properties of liquidity premium risk in the context of a joint Gaussian affine term structure model for zero-coupon U.S. Treasury and TIPS bonds. In the model, the liquidity factor is considered as an additional factor that does not span the yield curve, but improves the forecast of bond risk premia. I present empirical evidence suggesting that the liquidity premium indeed helps to forecast U.S. bond risk premia, but is not linearly spanned by the information in the joint yield curve. In addition, I show that the liquidity factor does not affect the dynamics of bonds under the pricing measure, but does affect them under the historical measure. Furthermore, the variation in the TIPS liquidity premium predicts the future evolution of the traditional yield curve factors.

Contributed Talks 9: Academic–Actuarial Science–Life** @ Auditorio Piso 2. Casa Museo Arte y Cultura la Presentación.
Jun 18 @ 10:30 am – 12:30 pm

10:30am–11:00am:
Policy Characteristics and Stakeholder Returns in Participating Life Insurance: Which Contracts Can Lead to a Win-Win?
Charbel Mirza*, University of Lausanne; Joël Wagner, University of Lausanne
Participating life insurance contracts and pension plans often include a return guarantee and participation in the surplus of the institution’s result. The final account value in such contracts depends on the investment policy driven by solvency requirements as well as the level of market returns, the guarantee and the participation rates. Using a contingent claim model for such contracts, we assume a competitive market with minimum solvency requirements similar to Solvency II. We consider solvency requirements on the maturity and one-year time horizons, as well as contracts with single and periodic premium payments. Through numerical analyses we link the expected returns for equity holders and policyholders in various situations. Using the return on equity and policyholder internal rate of return along with utility measures, we assess which contract settings optimize the return-compromise for both stakeholders in a low interest rate environment. Our results extend the academic literature by building on the work by Schmeiser and Wagner (2015, The Journal of Risk and Insurance, 82(3):659–686) and are relevant for practitioners given the current financial market environment and difficulties in insurance-linked savings plans with guarantees.

11:00am–11:30am
Identifiability in Mortality Models
Andrew Hunt*, N/A
Lack of identifiability, where different sets of parameters give identical fits to the data, is a common feature of many standard mortality models. Traditionally, these issues have been “solved” by imposing arbitrary constraints on the parameters, so as to select a unique set when fitting the model. However, this solution can cause further problems and may mean that our projections from such models depend on the arbitrary choice of constraints we adopt. We investigate this matter in a number of common age/period/cohort mortality models, discussing how identifiability issues arise, how to find them in more complicated models, their potential impact on our projected mortality rates, and how this issue can be resolved fully.

11:30am–12:00pm
Predicting human mortality: quantitative evaluation of four stochastic models
Anastasia Novokreshchenova*, University of Turin; Luca Regis, IMT Institute for Advanced Studies Lucca.
In this paper we compare different mortality models. We consider one discrete-time model, proposed by Lee and Carter (1992), and three continuous-time models: the Wills and Sherris (2011) model, the Feller process and the Ornstein-Uhlenbeck (OU) process. The first two models estimate the whole surface of mortality simultaneously, while in the last two cases each generation is modelled and calibrated separately. We calibrate the models to UK female population data. To evaluate the goodness of fit we use two measures: the relative error between the forecasted and the actual mortality rates, and the percentage of actual mortality rates which fall within a confidence interval. We find that all the models show relatively similar absolute total error. In terms of the confidence intervals, the results are more divergent, since each model implies a certain structure of the variance.
According to our experiments, models which estimate the whole surface of mortality produce better results in terms of the confidence interval. However, in terms of the mean absolute error, the OU and Feller processes perform best of all.
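(A tiny sketch of the two evaluation measures mentioned above, under the assumption that point forecasts and interval bounds are already available as arrays; the numbers are made up.)

    import numpy as np

    def forecast_scores(actual, forecast, lower, upper):
        actual, forecast = np.asarray(actual), np.asarray(forecast)
        rel_err = np.mean(np.abs(forecast - actual) / actual)          # relative error
        coverage = np.mean((np.asarray(lower) <= actual) & (actual <= np.asarray(upper)))
        return rel_err, coverage

    print(forecast_scores(actual=[0.010, 0.012], forecast=[0.011, 0.0115],
                          lower=[0.009, 0.010], upper=[0.013, 0.013]))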

12:00pm–12:30pm:
Mortality Improvement Rates: Modelling and Parameter Uncertainty.
Andres Villegas*, University of New South Wales; Andrew Hunt, Pacific Life Re, London
Rather than looking at mortality rates directly, a number of recent academic studies have looked at modelling rates of improvement in mortality when making projections. Although relatively new in the academic literature, the use of mortality improvement rates has a long-standing tradition in actuarial practice when allowing for improvements in mortality from standard mortality tables. However, mortality improvement rates are difficult to estimate robustly and models of them are subject to high levels of parameter uncertainty, since they are derived by dividing one uncertain quantity by another. Despite this, the studies of mortality improvement rates to date have not investigated parameter uncertainty due to the ad hoc methods used to fit the models to historical data. In this study, we adapt the Poisson model for the numbers of deaths at each age and year, proposed in Brouhns et al. [Insurance: Mathematics and Economics 31 (2002) 373–393], to model mortality improvement rates. This enables models of improvement rates to be fitted using standard maximum likelihood techniques and allows parameter uncertainty to be investigated using a standard bootstrapping approach. We illustrate the proposed modelling approach using data for the USA and England and Wales populations taken from the Human Mortality Database.
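(A minimal sketch of the bootstrap idea only, under simplifying assumptions: improvement rates are computed directly from crude death rates, and parameter uncertainty is approximated by resampling death counts as Poisson variables around the observed counts and recomputing. The paper instead fits a full improvement-rate model by maximum likelihood; the toy counts and exposures below are invented.)

    import numpy as np

    def improvement_rates(deaths, exposure):
        # improvement rate 1 - m(x,t) / m(x,t-1) from death counts and central exposures
        m = deaths / exposure
        return 1.0 - m[:, 1:] / m[:, :-1]

    def poisson_bootstrap_band(deaths, exposure, n_boot=1000, seed=6):
        rng = np.random.default_rng(seed)
        sims = np.array([improvement_rates(rng.poisson(deaths), exposure)
                         for _ in range(n_boot)])
        return np.percentile(sims, [2.5, 97.5], axis=0)       # 95% uncertainty band

    deaths = np.array([[120.0, 112.0, 105.0], [300.0, 290.0, 284.0]])
    exposure = np.array([[10000.0, 10100.0, 10200.0], [15000.0, 15050.0, 15100.0]])
    print(improvement_rates(deaths, exposure))
    print(poisson_bootstrap_band(deaths, exposure))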

Invited Session—Daniel Hernández* @ Auditorio Paraninfo. Claustro San Agustín. Universidad de Cartagena.
Jun 18 @ 10:30 am – 12:30 pm

10:30am–11:00am:
Semimartingale properties of the lower Snell envelope in optimal stopping under model uncertainty
Erick Treviño*, Universidad de Guanajuato
Optimal stopping under model uncertainty is a recent topic of research. The classical approach to characterize the solution of an optimal stopping problem is based on the Snell envelope, which can be seen as the value process as time runs. The analogous concept under model uncertainty is the so-called lower Snell envelope, and in this paper we investigate its structural properties. We give conditions under which it is a semimartingale with respect to one of the underlying probability measures and show how to identify the finite variation process by a limiting procedure. An example illustrates that without our conditions, the semimartingale property does not hold in general.

11:00am–11:30am:
On de Finetti’s problem under a time of ruin constraint
Mauricio Junca*, Universidad de Los Andes
We consider the classic de Finetti problem when the reserves are assumed to follow a spectrally negative Levy process, subject to a constraint on the time of ruin. We introduce the dual problem and show that the complementary slackness conditions are satisfied, so there is no duality gap. Therefore the optimal value function can be obtained as the pointwise infimum of auxiliary value functions indexed by Lagrange multipliers. We also present a series of numerical examples.
Joint work with Camilo Hernández.

11:30am–12:00pm:
Utility maximization in a multi-dimensional semi-martingale setting with nonlinear wealth dynamics
Rafael Serrano*, Universidad del Rosario.
We explore martingale and convex duality techniques to study optimal investment strategies that maximize expected risk-averse utility from consumption and terminal wealth in a multi-dimensional semimartingale market model with absolutely continuous characteristics and non-linear wealth dynamics. This allows us to take into account market frictions such as different borrowing and lending interest rates or short positions with cash collateral and negative rebate rates. Our main result is a sufficient condition for the existence of optimal policies and their explicit characterization in the case of CRRA utility functions. We present numerical examples and some preliminary results for the case in which the investor’s final wealth is liable to deferred capital gains taxes or subject to further downside or expected-loss constraints.

12:00pm–12:30pm:
Optimizing the exercise boundary for the holder of an American option over a parametric family.
José Vidal Alcalá*, Centro de Investigación en Matemáticas, CIMAT
In the setting of American option pricing, we introduce an efficient stochastic optimization algorithm to find the optimal exercise boundary among a parametric family. We use the Calculus of Variations to write down a probabilistic representation of the payout sensitivity with respect to the exercise boundary parameter.
This representation is used in a Monte Carlo estimator after the development of an accurate SDE discretization scheme for stopped diffusions. As an intermediate result, we are able to approximate deltas at the boundary for barrier options. Numerical simulations/analysis of the algorithms used are presented.

Plenary Talk: Self-exciting process in Finance and Insurance for credit risk and longevity risk modelling in heterogeneous portfolios. Nicole El Karoui, LPMA-UPMC. @ Auditorio Paraninfo. Claustro San Agustín. Universidad de Cartagena.
Jun 18 @ 2:00 pm – 3:00 pm

Self-exciting process in Finance and Insurance for credit risk and longevity risk modelling in heterogeneous portfolios.
Nicole El Karoui, LPMA-UPMC.

Recent regulatory evolution in credit risk management suggests considering the credit risk of an aggregated portfolio as generated by a family of intercorrelated firms with default propagation. Redemption risk in life insurance is very sensitive to contagion effects driven by the level of external variables such as inflation and interest rates, but also by the behavior of the other insured. For longevity purposes in an actuarial and demographic context, the individual point of view (individual-based models) allows one to take into account specific individual characteristics (socio-economic status, education level, marital status…) as well as the age of individuals (or events).

The aim is to take into account population heterogeneity in characteristics or age, impacting the rate of evolution, in a way easier to model than from the global point of view. Contagion effects, which are well known in seismology and in neuroscience with spike-and-wave patterns, but also in high-frequency trading, must also be included.

The simplest model of a contagion process is the Hawkes process, of which we give a new interpretation in terms of IBMs, allowing us to develop more complex Markovian models better suited to modelling redemption risk. For the age pyramid of a heterogeneous human population, we propose an extension of traditional birth and death processes. Using as the source of randomness a $\sigma$-finite Poisson measure with characteristics augmented by a thinning parameter, the population process is described as the strong solution of a stochastic differential equation, based on complex (birth, move and death) rate processes that depend on age, characteristics, the past of the population and environmental factors. This strong representation permits easy comparisons.

As an example, we reproduce by simulation the cohort effect, well known in the UK, where a cohort of people born between 1927 and 1940 showed an improvement in life expectancy relative to neighboring cohorts.
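(A short illustration of the building block mentioned above: simulating a one-dimensional Hawkes process with exponential kernel by Ogata's thinning algorithm. Parameter values are arbitrary, and this is far simpler than the IBM population model of the talk.)

    import numpy as np

    def simulate_hawkes(mu, alpha, beta, t_max, seed=7):
        # intensity: lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
        rng = np.random.default_rng(seed)
        events, t = [], 0.0
        while True:
            # between events the intensity only decays, so lambda(t+) bounds it from above
            lam_bar = mu + alpha * np.exp(-beta * (t - np.array(events))).sum()
            t += rng.exponential(1.0 / lam_bar)
            if t >= t_max:
                break
            lam_t = mu + alpha * np.exp(-beta * (t - np.array(events))).sum()
            if rng.random() * lam_bar <= lam_t:   # accept candidate with prob lambda(t)/lam_bar
                events.append(t)
        return np.array(events)

    ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, t_max=200.0)
    print(len(ev), "events; average observed intensity ~", round(len(ev) / 200.0, 2))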

Plenary Talk: Estimation of volatility in presence of high activity jumps and noise. Jean Jacod (UPMC-Paris 6) @ Auditorio Paraninfo. Claustro San Agustín. Universidad de Cartagena.
Jun 18 @ 3:00 pm – 4:00 pm

Estimation of volatility in presence of high activity jumps and noise. Jean Jacod (UPMC-Paris 6)

We consider an Itô semimartingale which is observed along a discrete (regular or not) time grid, within a fixed time interval. The observations are contaminated by noise, and the semimartingale has jumps with a degree of activity bigger than 1. Our aim is to revisit the estimation of the integrated volatility in such a setting: we use a mixture of the pre-averaging method (to eliminate noise) and of the empirical characteristic function method, which has been shown to be efficient (after proper de-biasing) even when the jump activity is bigger than 1, in contrast with most other methods.

This talk is a presentation of a joint work with Viktor Todorov, from Northwestern University.
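(A hedged sketch of the pre-averaging part only, with weight g(x) = min(x, 1-x) and the usual bias correction for i.i.d. noise; the empirical characteristic function de-biasing for high-activity jumps discussed in the talk is not reproduced. Constants and the synthetic data are illustrative.)

    import numpy as np

    def preaveraged_iv(y, theta=0.5):
        # pre-averaging estimator of integrated volatility from noisy log-prices on [0, 1]
        dy = np.diff(y)
        n = len(dy)
        kn = max(2, int(np.ceil(theta * np.sqrt(n))))
        g = np.minimum(np.arange(1, kn) / kn, 1 - np.arange(1, kn) / kn)
        psi1, psi2 = 1.0, 1.0 / 12.0                     # integrals of g'^2 and g^2
        ybar = np.convolve(dy, g[::-1], mode="valid")    # pre-averaged increments
        return ybar @ ybar / (kn * psi2) - psi1 / (2 * kn**2 * psi2) * (dy @ dy)

    rng = np.random.default_rng(8)
    n = 23_400
    x = np.cumsum(rng.normal(0.0, 0.2 / np.sqrt(n), size=n))   # efficient log-price, sigma = 0.2
    y = x + rng.normal(0.0, 5e-4, size=n)                      # microstructure noise
    print(preaveraged_iv(y), "vs true integrated variance", 0.2**2)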

Coffee Break
Jun 18 @ 4:00 pm – 4:30 pm
Invited Session–Greg Taylor* @ Aula Máxima de Derecho. Claustro de San Agustín.
Jun 18 @ 4:30 pm – 6:30 pm

4:30pm–5:00pm:
Asymptotic theory for over-dispersed chain-ladder models
Jonas Harnau*, University of Oxford; Bent Nielsen, University of Oxford
The chain-ladder technique is ubiquitous in non-life insurance claim reserving. In a Poisson model, the chain-ladder technique is maximum likelihood estimation. The equivalence of mean and variance of the Poisson is usually refuted by the data. Often, an over-dispersed Poisson structure in which mean and variance are proportional is then assumed. Then, the chain-ladder technique is maximum quasi-likelihood estimation. An asymptotic theory is provided for this situation. This leads to closed form distribution forecasts involving the t distribution. Further, an asymptotically F distributed test statistic is proposed to test for adequacy of the chain-ladder technique compared to a more general model with calendar effect. A simulation study suggests that both distribution forecasts and test statistic give reasonable approximations in finite samples. The proposed distribution forecasts are compared with the standard bootstrap approach. The results generalise to age-period-cohort models used in other fields.
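(For orientation: the deterministic chain-ladder point estimate that underlies the over-dispersed Poisson model above, on a made-up cumulative triangle; the asymptotic t-based distribution forecasts and the F test of the paper are not implemented here.)

    import numpy as np

    def chain_ladder(triangle):
        # volume-weighted development factors, completed triangle and total reserve
        tri = np.array(triangle, dtype=float)
        n = tri.shape[0]
        factors = []
        for j in range(n - 1):
            mask = ~np.isnan(tri[:, j + 1])
            factors.append(tri[mask, j + 1].sum() / tri[mask, j].sum())
        full = tri.copy()
        for i in range(n):
            for j in range(1, n):
                if np.isnan(full[i, j]):
                    full[i, j] = full[i, j - 1] * factors[j - 1]
        latest_diagonal = np.array([tri[i, n - 1 - i] for i in range(n)])
        reserve = full[:, -1].sum() - latest_diagonal.sum()
        return factors, full, reserve

    nan = np.nan
    tri = [[1000, 1800, 2100, 2200],
           [1100, 2000, 2300, nan],
           [1200, 2150, nan, nan],
           [1300, nan, nan, nan]]
    f, full, reserve = chain_ladder(tri)
    print([round(x, 3) for x in f], round(reserve, 1))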

5:00pm–5:30pm:
Self-assembling insurance claim models
Greg Taylor*, UNSW Australia; Hugh Miller, Taylor Fry Analytics & Actuarial Consulting; Grainne McGuire, Taylor Fry Analytics & Actuarial Consulting
The paper considers claim data sets containing complex features, e.g. simultaneous irregular trends across accident periods, development periods and calendar periods. The literature contains contributions on the modelling of such data sets by various forms of multivariate model, such as the Generalized Linear Model. Such modelling is time-consuming and expensive. The present paper investigates the automation of the modelling process, so that the model assembles itself in the presence of a given data set. This is achieved by means of regularized regression (particularly the lasso) of the claim data on a specified set of spline basis functions as regressors. This form of modelling is applied first to a number of simulated data sets whose properties are fully known. The extent to which the model, applied in an unsupervised fashion, captures the known features embedded in the data is investigated. Subsequently, the unsupervised modelling is applied to a real-world data set. Although this set’s properties are, therefore, strictly unknown, the authors have some 15 years’ experience with it, and are therefore familiar with many of its features. It has been modelled for many years with a Generalized Linear Model, the results of which are compared with those from the self-assembled model. The use of regularized regression in this context requires careful consideration of the tuning parameter(s). This is discussed in some detail. Throughout the exposition, emphasis is also placed on the investigation of the forecast efficiency of the self-assembled models, and on comparison between candidate models.
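(A toy sketch of the self-assembly idea under stated assumptions: log incremental claims regressed on ramp (linear spline) basis functions in the accident, development and calendar directions, with the lasso penalty chosen by cross-validation via scikit-learn. The simulated triangle, basis choice and Gaussian log-scale error are simplifications, not the paper's setup or tuning procedure.)

    import numpy as np
    from sklearn.linear_model import LassoCV

    def ramp_basis(idx, n_levels):
        # linear spline basis max(0, idx - k), one function per knot k
        return np.column_stack([np.maximum(0, idx - k) for k in range(n_levels - 1)])

    rng = np.random.default_rng(9)
    n = 12
    cells = [(i, j) for i in range(n) for j in range(n - i)]   # observed (accident, development) cells
    i_idx = np.array([c[0] for c in cells])
    j_idx = np.array([c[1] for c in cells])
    y = np.log(1000.0) + 0.05 * i_idx - 0.3 * j_idx + rng.normal(0.0, 0.1, len(cells))

    X = np.hstack([ramp_basis(i_idx, n), ramp_basis(j_idx, n), ramp_basis(i_idx + j_idx, n)])
    model = LassoCV(cv=5).fit(X, y)
    print((np.abs(model.coef_) > 1e-8).sum(), "active basis functions out of", X.shape[1])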

5:30pm–6:00pm:
Robust Paradigm Applied to Parameter Reduction in Actuarial Triangle Models
Gary Venter*, The Universe
Robust statistics addresses the impact of outliers on parameter estimation, done from the viewpoint of the model as an over-simplified representation of the process generating the data. This perspective of models being imperfectly specified is what I am calling the robust paradigm. It renders problematic much of classical statistical inference, which assumes that the data is generated from the model. In particular, goodness-of-fit measures are no longer sufficient for comparing models. A somewhat standard response is to base model selection on out-of-sample testing. This is reasonable intuitively anyway and has become common practice even without thinking too much about the robust paradigm, but it is essential under this paradigm. If the data is not coming from the model, how well the model works on other data becomes the key issue. A popular way of standardizing out-of-sample testing is LOO – leave one out – which fits the model to every subset of the data that has one fewer observation than the total sample. Then prediction errors on the omitted points become the basis of model comparison. This can be quite burdensome computationally, but recently a fast approximation to LOO has been developed that makes this approach feasible for almost every model. An approach that has looked towards robust testing and LOO in particular is Lasso estimation. Lasso can be formulated as finding the parameters that minimize the negative loglikelihood plus a selected percentage of the sum of the absolute values of the parameters. This effectively reduces the number of parameters, or at least the degrees of freedom used by the model. However, absent an out-of-sample testing methodology, the selected percentage is left to modeler judgment. Actuarial triangle models, like those for casualty loss development, historically had parameters for every row and column of the triangle. Then Taylor (1977) popularized using diagonal parameters as well. Zehnwirth and associates introduced methods to reduce the resulting surfeit of parameters by using linear trends across the parameters, with occasional trend changes as needed. This talk looks at parameter shrinkage methods like Lasso and random effects for estimating the trend changes, then using LOO for determining the optimal degree of shrinkage. This is also applied to mortality triangles, for which actuaries have used very similar models, but generalized by interaction among the row, column and diagonal terms, as in Renshaw-Haberman (2006). These models can also be used for loss development. Taylor, Greg C., 1977. Separation of Inflation and Other Effects from the Distribution of Non-Life Insurance Claim Delays, ASTIN Bulletin 9, pp. 219–230. Renshaw, A.E., and Haberman, S. 2006. A Cohort-Based Extension to the Lee-Carter Model for Mortality Reduction Factors. Insurance: Mathematics and Economics 38: 556–570.

6:00pm–6:30pm
Sarmanov Family of Bivariate Distributions for Multivariate Loss Reserving Analysis.
Jean-Philippe Boucher*, UQAM; Anas Abdallah, UQAM; Hélène Cossette, Université Laval; Julien Trufin, Université Libre de Bruxelles
The correlation among multiple lines of business plays a critical role in aggregating claims and thus determining loss reserves for an insurance portfolio. We show that the Sarmanov family of bivariate distributions is a convenient choice to capture the dependencies introduced by various sources, including the common calendar year, accident year and development period effects. The density of the bivariate Sarmanov distributions with different marginals can be expressed as a linear combination of products of independent marginal densities. This pseudo-conjugate property greatly reduces the complexity of posterior computations. In a case study, we analyze an insurance portfolio of personal and commercial auto lines from a major US property-casualty insurer.