Results 1–10 of 16
Inf-convolution of risk measures and optimal risk transfer
Abstract

Cited by 27 (0 self)
We develop a methodology to optimally design a financial issue to hedge nontradable risk on financial markets. The modeling involves a minimization of the risk borne by the issuer given the constraint imposed by a buyer, who enters the transaction if and only if her risk level remains below a given threshold. Both agents also have the opportunity to invest all their residual wealth on financial markets, but they do not have the same access to financial investments. The problem may be reduced to a single inf-convolution problem involving some transformation of the initial risk measures.
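The reduction mentioned here rests on the inf-convolution of two convex risk measures; a minimal sketch in our own notation (ρ_A for the issuer, ρ_B for the buyer, F the structured product), not necessarily the paper's exact formulation:

```latex
(\rho_A \,\square\, \rho_B)(X) \;=\; \inf_{F}\,\bigl\{\rho_A(X - F) + \rho_B(F)\bigr\}
```

so the issuer's constrained design problem, minimizing ρ_A(X − F) over F subject to the buyer's acceptance constraint ρ_B(F) ≤ ρ_B(0), becomes an inf-convolution once the buyer's risk measure is translated by her threshold.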
Pricing, Hedging and Optimally Designing Derivatives via Minimization of Risk Measures
 In Volume on Indifference Pricing, Princeton University Press
, 2005
Abstract

Cited by 23 (1 self)
The question of pricing and hedging a given contingent claim has a unique solution in a complete market framework. When some incompleteness is introduced, however, the problem becomes more difficult. Several approaches have been adopted in the literature to provide a satisfactory answer to this problem, for a particular choice criterion. Among them, Hodges and Neuberger [72] proposed in 1989 a method based on utility maximization. The price of the contingent claim is then obtained as the smallest (resp. largest) amount leaving the agent indifferent between selling (resp. buying) the claim and doing nothing. The price obtained is the indifference seller's (resp. buyer's) price. Since then, many authors have used this approach, the exponential utility function being most often used (see, for instance, El Karoui and Rouge [51], Becherer [11], Delbaen et al. [39], Musiela and Zariphopoulou [93] or Mania and Schweizer [89]). In this chapter, we also adopt this exponential utility point of view as a starting point in order to find the optimal hedge and price of a contingent claim based on a nontradable risk. But we soon notice that the right framework to work with is not that of the exponential utility itself but that of the certainty equivalent, which is a convex functional satisfying some nice properties, among them cash translation invariance. Hence, the results obtained in this particular framework can be immediately extended to functionals satisfying the same properties, in other words to convex risk measures as introduced by Föllmer and Schied [53, 54].
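The cash-invariance property that drives this extension can be checked directly for the exponential certainty equivalent; a short sketch in our own notation (risk aversion γ > 0), not taken verbatim from the chapter:

```latex
C_\gamma(X) := -\tfrac{1}{\gamma}\,\ln \mathbb{E}\bigl[e^{-\gamma X}\bigr],
\qquad
C_\gamma(X + m)
= -\tfrac{1}{\gamma}\,\ln\bigl(e^{-\gamma m}\,\mathbb{E}\bigl[e^{-\gamma X}\bigr]\bigr)
= C_\gamma(X) + m
```

so adding a deterministic cash amount m shifts the certainty equivalent by exactly m, which is the defining translation property of convex risk measures (up to sign conventions).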
The dichotomy between structure and randomness, arithmetic progressions, and the primes
Abstract

Cited by 19 (1 self)
Abstract. A famous theorem of Szemerédi asserts that all subsets of the integers with positive upper density will contain arbitrarily long arithmetic progressions. There are many different proofs of this deep theorem, but they are all based on a fundamental dichotomy between structure and randomness, which in turn leads (roughly speaking) to a decomposition of any object into a structured (low-complexity) component and a random (discorrelated) component. Important examples of these types of decompositions include the Furstenberg structure theorem and the Szemerédi regularity lemma. One recent application of this dichotomy is the result of Green and Tao establishing that the prime numbers contain arbitrarily long arithmetic progressions (despite having density zero in the integers). The power of this dichotomy is evidenced by the fact that the Green–Tao theorem requires surprisingly little technology from analytic number theory, relying instead almost exclusively on manifestations of this dichotomy such as Szemerédi’s theorem. In this paper we survey various manifestations of this dichotomy in combinatorics, harmonic analysis, ergodic theory, and number theory. As we hope to emphasize here, the underlying themes in these arguments are remarkably similar even though the contexts are radically different.
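For reference, the finitary form of the theorem this survey is organized around can be stated as follows (a standard formulation, not quoted from the paper):

```latex
\forall k \ge 3,\ \forall \delta > 0,\ \exists N_0(k,\delta):\quad
N \ge N_0,\ A \subseteq \{1,\dots,N\},\ |A| \ge \delta N
\;\Longrightarrow\; A \text{ contains an arithmetic progression of length } k.
```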
Martingale measures for discrete time processes with infinite horizon
 Mathematical Finance
, 1994
Abstract

Cited by 17 (12 self)
Let (S_t)_{t∈I} be an ℝ^d-valued adapted stochastic process on (Ω, F, (F_t)_{t∈I}, P). A basic problem, occurring notably in the analysis of securities markets, is to decide whether there is a probability measure Q on F, equivalent to P, such that (S_t)_{t∈I} is a martingale with respect to Q. It has been known since the fundamental papers of Harrison–Kreps (1979), Harrison–Pliska (1981) and Kreps (1981) that this problem is intimately related to the notions of "no arbitrage" and "no free lunch" in financial economics. We introduce the intermediate concept of "no free lunch with bounded risk". This is a somewhat more precise version of the notion of "no free lunch": it requires that there be an absolute bound on the maximal loss occurring in the trading strategies considered in the definition of "no free lunch". We give an argument why the condition of "no free lunch with bounded risk" should be satisfied by a reasonable model of the price process (S_t)_{t∈I} of a securities market. We establish the equivalence of the condition of "no free lunch with bounded risk" with the existence of an equivalent martingale measure in the case when the index set I is discrete but (possibly) infinite. A similar theorem was recently obtained by Delbaen (1992) for the case of continuous-time processes with continuous paths. We combine these two theorems to obtain a similar result for the continuous-time case when the process (S_t)_{t∈ℝ₊} is bounded and, roughly speaking, the jumps occur at predictable times.
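The object whose existence is at stake is the equivalent martingale measure; in the notation of the abstract, the question asks for a Q satisfying (a standard sketch):

```latex
Q \sim P
\quad\text{and}\quad
\mathbb{E}_Q\bigl[S_t \,\big|\, \mathcal{F}_s\bigr] = S_s
\quad\text{for all } s \le t,\ s,t \in I .
```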
Zitković, “Optimal consumption from investment and random endowment in incomplete semimartingale markets”
 Ann. Probab
, 2003
Abstract

Cited by 16 (0 self)
Abstract. We consider the problem of maximizing expected utility from consumption in a constrained incomplete semimartingale market with a random endowment process, and establish a general existence and uniqueness result using techniques from convex duality. The notion of asymptotic elasticity of Kramkov and Schachermayer is extended to the time-dependent case. By imposing no smoothness requirements on the utility function in the temporal argument, we can treat both pure consumption and combined consumption/terminal-wealth problems in a common framework. To make the duality approach possible, we provide a detailed characterization of the enlarged dual domain, which is reminiscent of the enlargement of L1 to its topological bidual (L∞)*, a space of finitely additive measures. As an application, we treat the case of a constrained Itô-process market model.
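The convex-duality scheme alluded to pairs the primal utility-maximization value with a conjugate problem posed over the enlarged dual domain; a schematic sketch in our own notation (V the convex conjugate of U, v the dual value function), not the paper's exact formulation:

```latex
u(x) = \sup_{c}\ \mathbb{E}\Bigl[\int_0^T U(t, c_t)\,\mu(dt)\Bigr],
\qquad
V(t,y) = \sup_{x>0}\bigl(U(t,x) - xy\bigr),
\qquad
u(x) = \inf_{y>0}\bigl(v(y) + xy\bigr),
```

where v is defined by optimizing V over the enlarged domain of finitely additive measures mentioned in the abstract.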
F.: Optimal consumption choice under uncertainty with intertemporal substitution
 Ann. Appl. Probab
, 2001
Abstract

Cited by 7 (1 self)
We extend the analysis of the intertemporal utility maximization problem for Hindy–Huang–Kreps utilities reported in Bank and Riedel (1998) to the stochastic case. Existence and uniqueness of optimal consumption plans are established under arbitrary convex portfolio constraints, including the cases of both complete and incomplete markets. For the complete market setting, Kuhn–Tucker-like necessary and sufficient conditions for optimality are given. Using this characterization, we show that optimal consumption plans are obtained by reflecting the associated level of satisfaction on a stochastic lower bound. When uncertainty is generated by a Lévy process and agents exhibit constant relative risk aversion, closed-form solutions are derived. Depending on the structure of the underlying stochastics, optimal consumption occurs at rates, in gulps, or singular to Lebesgue measure.
Keywords: Hindy–Huang–Kreps preferences, non-time-additive utility optimization, intertemporal utility, intertemporal substitution
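Hindy–Huang–Kreps utilities evaluate a level of satisfaction rather than the consumption rate itself; one common parametrization (an assumption on our part, the paper's exact form may differ) takes an exponentially weighted average of past cumulative consumption C:

```latex
Y_t \;=\; \eta\, e^{-\beta t} + \int_0^t \beta\, e^{-\beta(t-s)}\, dC_s ,
\qquad
U(C) \;=\; \mathbb{E}\Bigl[\int_0^T u(t, Y_t)\,dt\Bigr],
```

so consumption in gulps (jumps of C) is meaningful because Y, not dC, enters the felicity u; the reflection result above keeps Y on a stochastic lower bound.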
Intersection bodies and Lp spaces
 Adv. Math
Abstract

Cited by 2 (0 self)
Abstract. We prove that convex intersection bodies are isomorphically equivalent to unit balls of subspaces of Lq for each q ∈ (0, 1). This is done by extending to negative values of p the factorization theorem of Maurey and Nikishin which states that for any 0 < p < q < 1 every Banach subspace of Lp is isomorphic to a subspace of Lq.
FINITELY ADDITIVE SUPERMARTINGALES
Abstract

Cited by 2 (2 self)
Abstract. The concept of finitely additive supermartingales, originally due to Bochner, is revived and developed. We exploit it to study measure decompositions over filtered probability spaces and the properties of the associated Doléans-Dade measure. We obtain versions of the Doob–Meyer decomposition and, as an application, we establish a version of the Bichteler and Dellacherie theorem with no exogenous probability measure.
An Almost Sure Approximation for the Predictable Process in the Doob–Meyer Decomposition Theorem
Abstract

Cited by 2 (1 self)
Summary. We construct the Doob–Meyer decomposition of a submartingale as a pointwise superior limit of decompositions of discrete submartingales suitably built upon discretizations of the original process. This gives, in particular, a direct proof of predictability of the increasing process in the Doob–Meyer decomposition.

1 The Doob–Meyer Theorem

The Doob–Meyer decomposition theorem opened the way towards the theory of stochastic integration with respect to square integrable martingales and, consequently, semimartingales, as described in the seminal paper [7]. According to Kallenberg [4], this theorem is “the cornerstone of the modern probability theory”. It is therefore not surprising that many proofs of it are known. To the author’s knowledge, all the proofs heavily depend on a result due to Doléans-Dade [3], which identifies predictable increasing processes with “natural” increasing processes, as defined by Meyer [6]. In the present paper we develop ideas of another classical paper by K. Murali Rao [8] and construct a sequence of decompositions whose superior limit is pointwise (in (t, ω)) equal to the desired one, and thus we obtain predictability in the easiest possible way.

Let (Ω, F, {F_t}_{t∈[0,T]}, P) be a stochastic basis satisfying the “usual” conditions, i.e. the filtration {F_t} is right-continuous and F_0 contains all P-null sets of F_T. Let (D) denote the class of measurable processes {X_t}_{t∈[0,T]} such that the family {X_τ} is uniformly integrable, where τ runs over all stopping times with respect to {F_t}_{t∈[0,T]}. One of the variants of the Doob–Meyer theorem can be formulated as follows.
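The discrete decompositions the construction builds upon can be illustrated directly; a minimal sketch (function name and example ours, not the paper's), using the classical fact that for a ±1 random walk S the conditional increments of the submartingale X = S² are identically 1, so its compensator is A_n = n:

```python
import random

def doob_decomposition(path_X, cond_increments):
    """Discrete Doob decomposition X_n = M_n + A_n, where
    A_n = sum_{k<=n} E[X_k - X_{k-1} | F_{k-1}] is predictable
    (increasing for a submartingale) and M = X - A is a martingale."""
    A = [0.0]
    for inc in cond_increments:
        A.append(A[-1] + inc)
    M = [x - a for x, a in zip(path_X, A)]
    return M, A

# Illustration: X_n = S_n^2 for a simple +-1 random walk S.
# E[X_n - X_{n-1} | F_{n-1}] = E[2 S_{n-1} xi + xi^2] = 1, so A_n = n
# and M_n = S_n^2 - n is the classical martingale.
random.seed(0)
n = 20
steps = [random.choice([-1, 1]) for _ in range(n)]
S = [0]
for s in steps:
    S.append(S[-1] + s)
X = [s * s for s in S]
M, A = doob_decomposition(X, [1.0] * n)  # conditional increments are all 1
assert all(abs(a - k) < 1e-12 for k, a in enumerate(A))  # A_n = n
assert all(abs(m - (S[k] ** 2 - k)) < 1e-12 for k, m in enumerate(M))
```

Refining such discrete decompositions along finer and finer partitions and taking a pointwise superior limit is, in outline, how the paper recovers the predictable increasing process in continuous time.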
Singular Control of Optional Random Measures: Stochastic Optimization and Representation Problems Arising in the Microeconomic Theory of Intertemporal Consumption Choice
Abstract

Cited by 2 (1 self)
Dissertation for the degree of doctor rerum naturalium (dr. rer. nat.) in Mathematics, submitted to the Mathematisch–Naturwissenschaftliche Fakultät II of Humboldt–Universität zu Berlin by