Results 1–10 of 200
A Risk-Factor Model Foundation for Ratings-Based Bank Capital Rules
 Journal of Financial Intermediation
, 2003
"... When economic capital is calculated using a portfolio model of credit valueatrisk, the marginal capital requirement for an instrument depends, in general, on the properties of the portfolio in which it is held. By contrast, ratingsbased capital rules, including both the current Basel Accord and i ..."
Abstract

Cited by 283 (1 self)
When economic capital is calculated using a portfolio model of credit value-at-risk, the marginal capital requirement for an instrument depends, in general, on the properties of the portfolio in which it is held. By contrast, ratings-based capital rules, including both the current Basel Accord and its proposed revision, assign a capital charge to an instrument based only on its own characteristics. I demonstrate that ratings-based capital rules can be reconciled with the general class of credit VaR models. Contributions to VaR are portfolio-invariant only if (a) there is only a single systematic risk factor driving correlations across obligors, and (b) no exposure in a portfolio accounts for more than an arbitrarily small share of total exposure. Analysis of rates of convergence to asymptotic VaR leads to a simple and accurate portfolio-level add-on charge for undiversified idiosyncratic risk. There is no similarly simple way to address violation of the single-factor assumption.
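A minimal sketch of the portfolio-invariance idea in this abstract, assuming a one-factor Gaussian default model of the kind it describes: obligor i defaults when sqrt(rho)*Z + sqrt(1-rho)*eps_i falls below N^{-1}(PD), with a single systematic factor Z. The parameter values, the Monte Carlo setup, and the normal approximation to the conditional binomial are illustrative choices, not the paper's calibration. As the portfolio becomes fine-grained, simulated VaR of the loss fraction approaches a closed-form asymptotic VaR, so each exposure's charge stops depending on the rest of the portfolio; the gap at small n is the undiversified idiosyncratic risk the add-on charge targets.

```python
import random
from statistics import NormalDist

N = NormalDist()
PD, RHO, Q = 0.01, 0.20, 0.999   # default prob., asset correlation, VaR level
random.seed(0)

def simulated_var(n_obligors, trials=200_000):
    """Simulated Q-quantile of the portfolio loss fraction (equal exposures)."""
    thresh = N.inv_cdf(PD)
    losses = []
    for _ in range(trials):
        z = random.gauss(0.0, 1.0)                        # systematic factor
        p = N.cdf((thresh - RHO ** 0.5 * z) / (1 - RHO) ** 0.5)
        # conditional on z, defaults are independent; a normal approximation
        # to the conditional binomial loss fraction keeps the sketch fast
        frac = p + random.gauss(0.0, 1.0) * (p * (1 - p) / n_obligors) ** 0.5
        losses.append(min(max(frac, 0.0), 1.0))
    losses.sort()
    return losses[int(Q * trials)]

# closed-form asymptotic (infinitely fine-grained) VaR of the loss fraction
asymptotic = N.cdf((N.inv_cdf(PD) + RHO ** 0.5 * N.inv_cdf(Q)) / (1 - RHO) ** 0.5)

print(f"VaR, 50 obligors  : {simulated_var(50):.3f}")     # above asymptotic
print(f"VaR, 5000 obligors: {simulated_var(5000):.3f}")   # near asymptotic
print(f"asymptotic VaR    : {asymptotic:.3f}")
```

The 50-obligor VaR exceeds the asymptotic value; the 5000-obligor VaR essentially matches it, illustrating condition (b) above.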
Pricing, Hedging and Optimally Designing Derivatives via Minimization of Risk Measures
 In Volume on Indifference Pricing, Princeton University Press
, 2005
"... The question of pricing and hedging a given contingent claim has a unique solution in a complete market framework. When some incompleteness is introduced, the problem becomes however more difficult. Several approaches have been adopted in the literature to provide a satisfactory answer to this probl ..."
Abstract

Cited by 56 (5 self)
The question of pricing and hedging a given contingent claim has a unique solution in a complete market framework. When some incompleteness is introduced, however, the problem becomes more difficult. Several approaches have been adopted in the literature to provide a satisfactory answer to this problem, for a particular choice criterion. Among them, Hodges and Neuberger [72] proposed in 1989 a method based on utility maximization. The price of the contingent claim is then obtained as the smallest (resp. largest) amount that leaves the agent indifferent between selling (resp. buying) the claim and doing nothing. The price obtained is the indifference seller's (resp. buyer's) price. Since then, many authors have used this approach, the exponential utility function being most often used (see for instance El Karoui and Rouge [51], Becherer [11], Delbaen et al. [39], Musiela and Zariphopoulou [93] or Mania and Schweizer [89]). In this chapter, we also adopt this exponential utility point of view to start with, in order to find the optimal hedge and price of a contingent claim based on a non-tradable risk. But soon we notice that the right framework to work with is not that of the exponential utility itself but that of the certainty equivalent, which is a convex functional satisfying some nice properties, among which is cash translation invariance. Hence, the results obtained in this particular framework can be immediately extended to functionals satisfying the same properties, in other words to convex risk measures as introduced by Föllmer and Schied [53, 54].
Coherent approaches to risk in optimization under uncertainty
 In Tutorials in Operations Research, INFORMS
, 2007
"... Keywords Decisions often need to be made before all the facts are in. A facility must be built to withstand storms, floods, or earthquakes of magnitudes that can only be guessed from historical records. A portfolio must be purchased in the face of only statistical knowledge, at best, about how marke ..."
Abstract

Cited by 39 (3 self)
Decisions often need to be made before all the facts are in. A facility must be built to withstand storms, floods, or earthquakes of magnitudes that can only be guessed from historical records. A portfolio must be purchased in the face of only statistical knowledge, at best, about how markets will perform. In optimization, this implies that constraints may need to be envisioned in terms of safety margins instead of exact requirements. But what does that really mean in model formulation? What guidelines make sense, and what are the consequences for optimization structure and computation? The idea of a coherent measure of risk in terms of surrogates for potential loss, which has been developed in recent years for applications in financial engineering, holds promise for a far wider range of applications in which the traditional approaches to uncertainty have been subject to criticism. The general ideas and main facts are presented here with the goal of facilitating their transfer to practical work in those areas. Keywords: optimization under uncertainty; safeguarding against risk; safety margins; measures of risk; measures of potential loss; measures of deviation; coherency; value-at-risk; conditional value-at-risk; probabilistic constraints; quantiles; risk envelopes; dual representations; stochastic programming.
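A minimal sketch of the two tail measures named in the keywords, estimated from a loss sample (the helper name and the toy data are illustrative): VaR at level q is the empirical q-quantile, while conditional value-at-risk averages the losses at or beyond it.

```python
def var_cvar(losses, q=0.95):
    """Empirical VaR and CVaR of a list of losses at confidence level q."""
    xs = sorted(losses)
    k = int(q * len(xs))               # index of the empirical q-quantile
    var = xs[k]
    cvar = sum(xs[k:]) / len(xs[k:])   # mean of the worst (1-q) tail
    return var, cvar

losses = [1, 2, 3, 4, 5, 6, 7, 8, 9, 100]
print(var_cvar(losses, q=0.8))   # → (9, 54.5)
```

Note how VaR (9) is blind to the magnitude of the single extreme loss, while CVaR (54.5) is not; CVaR is also subadditive, which is what qualifies it as coherent in the sense discussed above, whereas VaR in general is not.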
Generalized Deviations in Risk Analysis
 Finance and Stochastics
"... General deviation measures are introduced and studied systematically for their potential applications to risk management in areas like portfolio optimization and engineering. Such measures include standard deviation as a special case but need not be symmetric with respect to ups and downs. Their pro ..."
Abstract

Cited by 39 (2 self)
General deviation measures are introduced and studied systematically for their potential applications to risk management in areas like portfolio optimization and engineering. Such measures include standard deviation as a special case but need not be symmetric with respect to ups and downs. Their properties are explored with a mind to generating a large assortment of examples and assessing which may exhibit superior behavior. Connections are shown with coherent risk measures in the sense of Artzner, Delbaen, Eber and Heath, when those are applied to the difference between a random variable and its expectation, instead of to the random variable itself. However, the correspondence is only one-to-one when both classes are restricted by properties called lower range dominance, on the one hand, and strict expectation boundedness, on the other. Dual characterizations in terms of sets called risk envelopes are fully provided.
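A small sketch of the asymmetry point above (function names and sample data are illustrative, not the paper's): standard deviation penalizes ups and downs alike, while a lower semideviation, an example of a deviation measure that is not symmetric, penalizes only outcomes below the mean.

```python
from statistics import mean

def std_dev(xs):
    """Population standard deviation: symmetric in ups and downs."""
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

def lower_semidev(xs):
    """Lower semideviation: only below-mean outcomes contribute."""
    m = mean(xs)
    return (sum(min(x - m, 0.0) ** 2 for x in xs) / len(xs)) ** 0.5

sym = [-2, -1, 0, 1, 2]    # outcomes symmetric around the mean
skew = [-4, 0, 1, 1, 2]    # same mean, one large downside outcome
print(std_dev(sym), lower_semidev(sym))
print(std_dev(skew), lower_semidev(skew))
```

Both samples have mean zero, but only the lower semideviation distinguishes where the dispersion sits: it grows on the downside-skewed sample while ignoring upside spread.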
Constructing uncertainty sets for robust linear optimization
, 2006
"... doi 10.1287/opre.1080.0646 ..."
Optimization of risk measures
 Probabilistic and Randomized Methods for Design under Uncertainty
, 2005
"... Consider a stochastic system whose output variable Z is a real valued random variable. If it depends on some decision vector x ∈ Rn, we can write the relation: Z(ω) = f(x, ω), ω ∈ Ω. ..."
Abstract

Cited by 27 (3 self)
Consider a stochastic system whose output variable Z is a real-valued random variable. If it depends on some decision vector x ∈ R^n, we can write the relation Z(ω) = f(x, ω), ω ∈ Ω.
Stochastic orders and risk measures: Consistency and bounds
 INSURANCE: MATHEMATICS AND ECONOMICS
, 2005
"... We investigate the problem of consistency of risk measures with respect to usual stochastic order and convex order. It is shown that under weak regularity conditions risk measures are consistent with these stochastic orders. This result is used to derive bounds for risk measures of portfolios. As a ..."
Abstract

Cited by 23 (1 self)
We investigate the problem of consistency of risk measures with respect to usual stochastic order and convex order. It is shown that under weak regularity conditions risk measures are consistent with these stochastic orders. This result is used to derive bounds for risk measures of portfolios. As a byproduct, we extend the characterization of Kusuoka (2001) of coherent, law-invariant risk measures with the Fatou property to unbounded random variables.
LP Solvable Models for Portfolio Optimization: A Classification and Computational Comparison
, 2003
"... ..."
Drawdown Measure in Portfolio Optimization
 International Journal of Theoretical and Applied Finance, V
"... We propose a new oneparameter family of risk measures called Conditional Drawdown (CDD). These measures of risk are functionals of the portfolio drawdown (underwater) curve considered in an active portfolio management. For some value of the tolerance parameter α, in the case of a single sample path ..."
Abstract

Cited by 22 (0 self)
We propose a new one-parameter family of risk measures called Conditional Drawdown (CDD). These measures of risk are functionals of the portfolio drawdown (underwater) curve considered in active portfolio management. For some value of the tolerance parameter α, in the case of a single sample path, the drawdown functional is defined as the mean of the worst (1 − α) ∗ 100 % drawdowns. The CDD measure generalizes the notion of the drawdown functional to a multi-scenario case. The CDD measure includes the Maximal Drawdown and Average Drawdown as its limiting cases. We studied the mathematical properties of the CDD and developed efficient optimization techniques for CDD computation and for solving asset allocation problems with the CDD measure. For a particular example, we find the optimal portfolios for a case of Maximal Drawdown, a case of Average Drawdown, and several intermediate cases between these two. The CDD family of risk functionals is similar to Conditional Value-at-Risk (CVaR), which is also called Mean Shortfall, Mean Excess Loss, or Tail Value-at-Risk. Some recommendations on how to select the optimal risk functionals for obtaining practically stable portfolios are provided. We solved a real-life portfolio allocation problem using the proposed measures.
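A sketch of the single-sample-path construction described above (the function names and discretization are assumed, not the paper's code): build the drawdown curve as running peak minus current value, then take CDD at level α as the mean of the worst (1 − α) ∗ 100 % of the drawdown observations. α near 1 recovers the Maximal Drawdown limiting case; α = 0 gives the Average Drawdown.

```python
def drawdowns(values):
    """Drawdown (underwater) curve: running peak minus current value."""
    peak, dd = float("-inf"), []
    for v in values:
        peak = max(peak, v)
        dd.append(peak - v)
    return dd

def cdd(values, alpha):
    """Conditional Drawdown: mean of the worst (1-alpha) share of drawdowns."""
    dd = sorted(drawdowns(values), reverse=True)
    k = max(1, int(round((1 - alpha) * len(dd))))
    return sum(dd[:k]) / k

path = [100, 105, 101, 98, 104, 110, 103, 107]   # toy portfolio-value path
print(cdd(path, alpha=0.75))   # → 7.0 (mean of the two worst drawdowns)
print(cdd(path, alpha=1.0))    # → 7.0 (reduces to the maximal drawdown)
print(cdd(path, alpha=0.0))    # → 2.75 (average drawdown)
```

On this toy path the two worst drawdowns happen to equal the maximum, so the α = 0.75 and α = 1 values coincide; the α = 0 case averages the whole curve.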
What is a good risk measure: bridging the gaps between data, coherent risk measures, and insurance risk measures
, 2006
"... Two main axiomatically based risk measures are the coherent risk measure, which assumes subadditivity for random variables, and the insurance risk measure, which assumes additivity for comonotonic random variables. We propose a new, data based, risk measure, called natural risk statistic, that is ch ..."
Abstract

Cited by 22 (3 self)
Two main axiomatically based risk measures are the coherent risk measure, which assumes subadditivity for random variables, and the insurance risk measure, which assumes additivity for comonotonic random variables. We propose a new, data-based risk measure, called the natural risk statistic, that is characterized by a new set of axioms. The new axioms only require subadditivity for comonotonic random variables, which is consistent with prospect theory in psychology. Compared with the two previous measures, the natural risk statistic includes the tail conditional median, which is more robust than the tail conditional expectation suggested by the coherent risk measure; and, unlike insurance risk measures, the natural risk statistic can also incorporate scenario analysis. The natural risk statistic includes VaR as a special case and therefore shows that VaR, though simple, is not irrational.
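An illustration of the robustness claim above (sample data and helper name are assumed for the sketch): the tail conditional expectation averages losses beyond the VaR level, so a single corrupted extreme observation drags it far away, while the tail conditional median of the same tail barely moves.

```python
from statistics import median

def tail(losses, q):
    """The worst (1-q) share of a loss sample."""
    xs = sorted(losses)
    return xs[int(q * len(xs)):]

losses = list(range(1, 101))                 # losses 1..100
tce = sum(tail(losses, 0.9)) / 10            # tail conditional expectation
tcm = median(tail(losses, 0.9))              # tail conditional median
losses[-1] = 10_000                          # one corrupted outlier
tce2 = sum(tail(losses, 0.9)) / 10
tcm2 = median(tail(losses, 0.9))
print(tce, tce2)   # → 95.5 1085.5  (expectation jumps with the outlier)
print(tcm, tcm2)   # → 95.5 95.5    (median is unchanged)
```

This is exactly the sense in which the tail conditional median, included in the natural risk statistic class, is more robust than the tail conditional expectation.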