Results 11–20 of 867
Dynamic Monetary Risk Measures for Bounded Discrete-Time Processes
, 2004
Abstract

Cited by 101 (8 self)
We study time-consistency questions for processes of monetary risk measures that depend on bounded discrete-time processes describing the evolution of financial values. The time horizon can be finite or infinite. We call a process of monetary risk measures time-consistent if it assigns to a process of financial values the same risk irrespective of whether it is calculated directly or in two steps backwards in time, and we show how this property manifests itself in the corresponding process of acceptance sets. For processes of coherent and convex monetary risk measures admitting a robust representation with sigma-additive linear functionals, we give necessary and sufficient conditions for time-consistency in terms of the representing functionals.
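The two-step backward evaluation described in this abstract can be illustrated with the entropic risk measure, a standard example of a time-consistent convex monetary risk measure. The two-period tree, the terminal values, and the risk-aversion parameter below are illustrative choices, not taken from the paper:

```python
import math

# Entropic risk measure rho(X) = (1/gamma) * log E[exp(-gamma * X)], a standard
# time-consistent example. Two-period scenario tree; all numbers are illustrative.
p1 = {"u": 0.6, "d": 0.4}                       # first-period transition probabilities
p2 = {"u": {"u": 0.5, "d": 0.5},                # second-period, given first state
      "d": {"u": 0.7, "d": 0.3}}
X = {("u", "u"): 3.0, ("u", "d"): 1.0,          # terminal financial value per path
     ("d", "u"): 0.5, ("d", "d"): -2.0}
gamma = 2.0                                     # risk-aversion parameter

def rho(outcomes):
    """Entropic risk of a list of (value, probability) pairs."""
    return math.log(sum(p * math.exp(-gamma * x) for x, p in outcomes)) / gamma

# Direct evaluation over all four paths.
direct = rho([(X[(a, b)], p1[a] * p2[a][b]) for a in "ud" for b in "ud"])

# Two-step evaluation backwards in time: condition on the period-1 state...
rho1 = {a: rho([(X[(a, b)], p2[a][b]) for b in "ud"]) for a in "ud"}
# ...then treat -rho1 as a period-1 value and apply rho again.
two_step = rho([(-rho1[a], p1[a]) for a in "ud"])

print(abs(direct - two_step) < 1e-9)  # time-consistency: both evaluations agree
```

Equality here follows from the tower property of conditional expectation; for a general process of risk measures the two evaluations can differ, which is precisely the property the paper characterizes.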
Theory and applications of Robust Optimization
, 2007
Abstract

Cited by 100 (14 self)
In this paper we survey the primary research, both theoretical and applied, in the field of Robust Optimization (RO). Our focus will be on the computational attractiveness of RO approaches, as well as the modeling power and broad applicability of the methodology. In addition to surveying the most prominent theoretical results of RO over the past decade, we will also present some recent results linking RO to adaptable models for multistage decision-making problems. Finally, we will highlight successful applications of RO across a wide spectrum of domains, including, but not limited to, finance, statistics, learning, and engineering.
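As a minimal sketch of the RO modeling idea surveyed here, consider a single linear constraint whose coefficient vector is only known to lie in a box around a nominal value; its robust counterpart has the well-known closed form a0.x + rho * ||x||_1 <= b. All numbers below are illustrative:

```python
import itertools

# Robust counterpart of a linear constraint a.x <= b when a lies in a box
# a0 +/- rho coordinatewise: the worst case equals a0.x + rho * ||x||_1.
a0 = [1.0, -2.0, 0.5]
rho = 0.3
x = [0.8, -0.4, 1.2]

closed_form = sum(a * xi for a, xi in zip(a0, x)) + rho * sum(abs(xi) for xi in x)

# Check against enumerating the box's extreme points, where the worst case
# of a linear function is attained.
brute = max(
    sum((a + s * rho) * xi for a, s, xi in zip(a0, signs, x))
    for signs in itertools.product([-1.0, 1.0], repeat=len(x))
)
print(abs(closed_form - brute) < 1e-9)  # the two worst-case values coincide
```

The computational attractiveness mentioned above comes from exactly this kind of reformulation: the worst case over an infinite uncertainty set collapses to a single tractable expression.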
A Theory of Systemic Risk and
 York University
, 2001
Abstract

Cited by 93 (14 self)
Luigi Zingales for suggesting that the channel of information spillovers be examined as a source of systemic risk, to Amil Dasgupta, John Moore, and seminar participants at Bank of England,
Optimal Execution with Nonlinear Impact Functions and Trading-Enhanced Risk
, 2001
Abstract

Cited by 88 (2 self)
We determine optimal trading strategies for liquidation of a large single-asset portfolio to minimize a combination of volatility risk and market impact costs. We take the market impact cost per share to be a power law function of the trading rate, with an arbitrary positive exponent. This includes, for example, the square-root law that has been proposed based on market microstructure theory. In analogy to the linear model, we define a “characteristic time” for optimal trading, which now depends on the initial portfolio size and decreases as execution proceeds. We also consider a model in which uncertainty of the realized price is increased by demanding rapid execution; we show that optimal trajectories are described by a “critical portfolio size” above which this effect is dominant and below which it may be neglected.
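A sketch of the power-law impact model described above, under the simplifying assumption of a constant-rate liquidation; the parameter names and numbers are illustrative, not the paper's:

```python
# Power-law temporary impact: trading at rate v costs eta * v**k per share,
# with k > 0 arbitrary (k = 1/2 is the square-root law). For a constant-rate
# liquidation of S shares over horizon T the rate is v = S / T.
eta = 0.05
S = 1_000_000.0

def impact_cost(T, k):
    """Total impact cost of liquidating S shares uniformly over horizon T."""
    v = S / T
    return S * eta * v ** k

# Stretching the horizon lowers impact cost for any positive exponent, which
# is what the volatility-risk term in the optimization trades off against.
print(impact_cost(20.0, 0.5) < impact_cost(10.0, 0.5))  # True
```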
The Generalized Hyperbolic Model: Financial Derivatives and Risk Measures
 MATHEMATICAL FINANCE – BACHELIER CONGRESS 2000, GEMAN
, 1998
Abstract

Cited by 84 (12 self)
Statistical analysis of data from the financial markets shows that generalized hyperbolic (GH) distributions allow a more realistic description of asset returns than the classical normal distribution. GH distributions contain as subclasses hyperbolic as well as normal inverse Gaussian (NIG) distributions which have recently been proposed as basic ingredients to model price processes. GH distributions generate in a canonical way Lévy processes, i.e. processes with stationary and independent increments. We introduce a model for price processes which is driven by generalized hyperbolic Lévy motions. This GH model is a generalization of the hyperbolic model developed by Eberlein and Keller (1995). It is incomplete. We derive an option pricing formula for GH driven models using the Esscher transform as martingale measure and compare the prices with classical Black-Scholes prices. The objective of this study is to examine the consistency of our model assumptions with the empirically obser...
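One way to experiment with the NIG subclass mentioned above is its normal variance-mean mixture representation, X = mu + beta*Z + sqrt(Z)*N(0,1) with Z inverse Gaussian, drawing Z by the Michael-Schucany-Haas method. The parameter values below are illustrative, not a calibration from the paper:

```python
import math
import random

# Sampling a normal inverse Gaussian (NIG) variable via its variance-mean
# mixture representation; parameters are illustrative.
alpha, beta, delta, mu = 1.0, 0.0, 1.0, 0.0
gamma = math.sqrt(alpha**2 - beta**2)
rng = random.Random(0)

def sample_ig(mean, shape):
    """Inverse Gaussian IG(mean, shape) via the Michael-Schucany-Haas method."""
    nu = rng.gauss(0.0, 1.0)
    y = nu * nu
    x = mean + mean**2 * y / (2 * shape) - (mean / (2 * shape)) * math.sqrt(
        4 * mean * shape * y + mean**2 * y**2)
    return x if rng.random() <= mean / (mean + x) else mean**2 / x

def sample_nig():
    z = sample_ig(delta / gamma, delta**2)   # IG mixing variable
    return mu + beta * z + math.sqrt(z) * rng.gauss(0.0, 1.0)

xs = [sample_nig() for _ in range(50_000)]
m = sum(xs) / len(xs)
var = sum((x - m) ** 2 for x in xs) / len(xs)
kurt = sum((x - m) ** 4 for x in xs) / len(xs) / var**2 - 3.0
print(kurt > 0.5)  # excess kurtosis ~ 3/(delta*gamma) = 3 here
```

With beta = 0 the distribution is symmetric, yet its positive excess kurtosis shows tails heavier than the normal distribution, in line with the empirical point the abstract makes.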
Robust utility maximization in a stochastic factor model
, 2006
Abstract

Cited by 82 (6 self)
We give an explicit PDE characterization for the solution of a robust utility maximization problem in an incomplete market model, whose volatility, interest rate process, and long-term trend are driven by an external stochastic factor process. The robust utility functional is defined in terms of a HARA utility function with negative risk aversion and a dynamically consistent coherent risk measure, which allows for model uncertainty in the distributions of both the asset price dynamics and the factor process. Our method combines two recent advances in the theory of optimal investments: the general duality theory for robust utility maximization and the stochastic control approach to the dual problem of determining optimal martingale measures.
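The robust part of such a utility functional, taken in isolation and over a finite set of models, reduces to a worst-case expected utility. The sketch below uses log utility and made-up scenarios purely to show that structure (the paper itself works with a HARA utility and a dynamic coherent risk measure):

```python
import math

# Worst-case expected utility over a finite, illustrative set of candidate
# probability models -- the "model uncertainty" ingredient of a robust
# utility functional.
wealth = [80.0, 100.0, 125.0]                                  # terminal wealth per scenario
models = [[0.2, 0.5, 0.3], [0.4, 0.4, 0.2], [0.1, 0.3, 0.6]]   # candidate models

def expected_utility(probs):
    return sum(p * math.log(w) for p, w in zip(probs, wealth))

robust_value = min(expected_utility(q) for q in models)
print(robust_value < expected_utility(models[0]))  # robustness penalizes the value
```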
Pricing and Hedging in Incomplete Markets
 Journal of Financial Economics
, 2001
Abstract

Cited by 82 (8 self)
We present a new approach for positioning, pricing, and hedging in incomplete markets that bridges standard arbitrage pricing and expected utility maximization. Our approach for determining whether an investor should undertake a particular position involves specifying a set of probability measures and associated floors which expected payoffs must exceed in order for the investor to consider the hedged and financed investment to be acceptable. By assuming that the liquid assets are priced so that each portfolio of assets has negative expected return under at least one measure, we derive a counterpart to the first fundamental theorem of asset pricing. We also derive a counterpart to the second fundamental theorem, which leads to unique derivative security pricing and hedging even though markets are incomplete. For products that are not spanned by the liquid assets of the economy, we show how our methodology provides more realistic bid-ask spreads.
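Specialized to finitely many test measures, the acceptability criterion in this abstract can be sketched directly; the measures, floors, and payoffs below are hypothetical:

```python
# A hedged and financed position is acceptable iff its expected payoff under
# every test measure reaches that measure's floor. All numbers illustrative.
tests = [([0.3, 0.4, 0.3], 0.0),     # (scenario probabilities, floor)
         ([0.1, 0.8, 0.1], -0.5)]

def acceptable(payoff):
    return all(sum(p * x for p, x in zip(probs, payoff)) >= floor
               for probs, floor in tests)

print(acceptable([5.0, -1.0, 2.0]))   # meets both floors -> True
print(acceptable([-10.0, 0.0, 0.0]))  # fails the first floor -> False
```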
Coherent Allocation of Risk Capital
 Journal of Risk
, 1999
Abstract

Cited by 80 (0 self)
The allocation problem stems from the diversification effect observed in risk measurements of financial portfolios: the sum of the risk measures of many portfolios is typically larger than the risk of all portfolios taken together. The allocation problem is to apportion this "diversification advantage" to the portfolios in a fair manner, to obtain new, firm-internal risk evaluations of the portfolios. Our approach is axiomatic, in the sense that we first establish arguably necessary properties of an allocation scheme, and then study schemes that fulfill the properties. Important results from the area of game theory find a direct application, and are used here.
Keywords: allocation of risk; coherent risk measure; game theory; Shapley value; Aumann-Shapley prices; RORAC; risk-adjusted performance measure.
1 Introduction
The underlying theme of this paper is the sharing of costs within the different constituents of a firm. We call this sharing "allocation", as it is assumed that a higher au...
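One of the game-theoretic tools the abstract mentions, the Shapley value, can be sketched for a toy firm of three portfolios whose coalition risk is a worst-case loss over a few scenarios; all positions and scenarios are invented for illustration:

```python
import itertools
import math

# Shapley-value allocation of total risk. Coalition risk is the worst-case
# loss of the combined positions over three scenarios (a coherent measure).
positions = {"A": [-2.0, 1.0, 0.5], "B": [1.0, -3.0, 0.5], "C": [0.5, 0.5, -1.0]}
players = list(positions)
n = len(players)

def risk(coalition):
    """Worst-case loss of the coalition's aggregate position (0 for nobody)."""
    if not coalition:
        return 0.0
    total = [sum(positions[name][s] for name in coalition) for s in range(3)]
    return max(-x for x in total)

shapley = {}
for i in players:
    others = [p for p in players if p != i]
    share = 0.0
    for r in range(n):
        for S in itertools.combinations(others, r):
            weight = math.factorial(r) * math.factorial(n - r - 1) / math.factorial(n)
            share += weight * (risk(S + (i,)) - risk(S))  # marginal contribution
    shapley[i] = share

# "Full allocation": the shares sum exactly to the risk of the whole firm.
print(abs(sum(shapley.values()) - risk(tuple(players))) < 1e-9)
```

By construction the Shapley shares sum to the risk of the whole firm, the full-allocation property an axiomatic scheme would require; the diversification advantage is thus apportioned back to the portfolios.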
A theoretical framework for the pricing of contingent claims in the presence of model uncertainty
Abstract

Cited by 78 (0 self)
The aim of this work is to evaluate the cheapest superreplication price of a general (possibly path-dependent) European contingent claim in a context where the model is uncertain. This setting is a generalization of the uncertain volatility model (UVM) introduced by Avellaneda, Levy and Paras. The uncertainty is specified by a family of martingale probability measures which may not be dominated. We obtain a partial characterization result and a full characterization which extends the results of Avellaneda, Levy and Paras in the UVM case.
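In a one-period discrete toy market, the cheapest superreplication price over a family of martingale measures can be approximated by brute force; the market below is illustrative and far simpler than the paper's continuous-time setting:

```python
# Superreplication price as the supremum of the claim's expectation over
# martingale measures, found by grid search over candidate measures q.
S0 = 100.0
terminal = [90.0, 100.0, 110.0]
claim = [max(s - 100.0, 0.0) for s in terminal]   # call struck at 100

best = 0.0
n = 200
for i in range(n + 1):
    for j in range(n + 1 - i):
        q = [i / n, j / n, 1.0 - (i + j) / n]
        # keep only martingale measures: the expected stock price must be S0
        if abs(sum(p * s for p, s in zip(q, terminal)) - S0) > 1e-9:
            continue
        best = max(best, sum(p * c for p, c in zip(q, claim)))

print(best)  # 5.0, attained at q = (0.5, 0.0, 0.5)
```

The supremum is reached by the measure concentrated on the extreme outcomes, which is the discrete analogue of the worst-case volatility driving the UVM price.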
Expected Shortfall: A Natural Coherent Alternative to Value at Risk
 Economic Notes
Abstract

Cited by 75 (9 self)
We discuss the coherence properties of Expected Shortfall (ES) as a financial risk measure. This statistic arises in a natural way from the estimation of the “average of the 100p% worst losses” in a sample of returns to a portfolio. Here p is some fixed confidence level. We also compare several alternative representations of ES which turn out to be more appropriate for certain purposes.
Key words: Expected Shortfall; Risk measure; worst conditional expectation; tail conditional expectation; value-at-risk (VaR); conditional value-at-risk (CVaR); coherence; subadditivity.
1 A four-year impasse
Risk professionals have been looking for a coherent alternative to Value at Risk (VaR) for four years. Since the appearance, in 1997, of Thinking Coherently by Artzner et al [3] followed by Coherent Measures of Risk [4], it was clear to risk practitioners and researchers that the gap between market practice and theoretical progress had suddenly widened enormously. These papers in fact faced for the first time the problem of defining in a clear-cut way what properties a statistic of a portfolio should have in order to be considered a sensible risk measure. The answer to this question was given through a complete characterization of such properties via an axiomatic formulation of the concept of coherent risk measure. With this result, risk management became all of a sudden a science in itself with its own rules correctly defined in a deductive framework. Surprisingly enough, however, VaR, the risk measure adopted as best practice by essentially all banks and regulators, happened to fail the exam for being admitted in this science. VaR is not a coherent risk measure because it simply doesn’t fulfill one of the axioms of coherence.
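The defining estimator quoted in the abstract, the average of the 100p% worst outcomes in a sample, is a one-liner; the return sample below is made up:

```python
# Expected Shortfall as the average of the 100p% worst outcomes in a sample
# of portfolio returns, reported as a positive loss.
returns = [0.02, -0.01, 0.03, -0.05, 0.01, -0.08, 0.00, 0.04, -0.02, 0.015]
p = 0.2  # confidence level: average over the 20% worst outcomes

def expected_shortfall(xs, p):
    k = max(1, int(round(len(xs) * p)))   # number of worst outcomes to average
    worst = sorted(xs)[:k]                # here: [-0.08, -0.05]
    return -sum(worst) / k                # sign flip: report as a positive loss

print(expected_shortfall(returns, p))     # about 0.065
```

Unlike VaR, which reads off a single quantile, this tail average is subadditive across portfolios, which is the coherence property at stake in the paper.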