Results 1–10 of 43
Fractional Brownian motion, random walks, and binary market models
Finance & Stochastics, 2001
"... Abstract. We prove a Donsker type approximation theorem for the fractional Brownian motion in the case H> 1/2. Using this approximation we construct an elementary market model that converges weakly to the fractional analogue of the Black–Scholes model. We show that there exist arbitrage opportuni ..."
Abstract

Cited by 44 (5 self)
 Add to MetaCart
(Show Context)
Abstract. We prove a Donsker-type approximation theorem for fractional Brownian motion in the case H > 1/2. Using this approximation, we construct an elementary market model that converges weakly to the fractional analogue of the Black–Scholes model. We show that arbitrage opportunities exist in this model; one such opportunity is constructed explicitly.
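The flavour of such a random-walk approximation can be sketched numerically. The snippet below is an illustrative stand-in, not the paper's exact construction: it drives a Riemann–Liouville-type kernel (t − s)^(H − 1/2) with ±1 steps and recovers H from the variance scaling Var[X(t)] ∝ t^(2H).

```python
import math
import random

def fbm_rw_approx(H, n, rng):
    """One path of an approximate fractional Brownian motion on [0, 1],
    built from a +/-1 random walk through a Riemann-Liouville-type kernel.
    Illustrative stand-in for a Donsker-type construction (H > 1/2)."""
    steps = [rng.choice((-1.0, 1.0)) for _ in range(n)]
    path = []
    for k in range(1, n + 1):
        t = k / n
        val = sum((t - i / n) ** (H - 0.5) * steps[i] for i in range(k))
        path.append(val / math.sqrt(n))
    return path

def estimated_hurst(H, n=40, paths=500, seed=1):
    """Recover H from the scaling Var[X(t)] ~ t^(2H) by a log-log fit."""
    rng = random.Random(seed)
    acc = [0.0] * n
    for _ in range(paths):
        p = fbm_rw_approx(H, n, rng)
        for k in range(n):
            acc[k] += p[k] ** 2
    xs = [math.log((k + 1) / n) for k in range(n // 4, n)]
    ys = [math.log(acc[k] / paths) for k in range(n // 4, n)]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope / 2  # slope of log-variance vs log-time is about 2H
```

For H = 1/2 the kernel is flat and the construction reduces to the classical Donsker random walk; the long-range dependence for H > 1/2 is what opens the arbitrage opportunities discussed in the paper.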
A Nonlinear Super-Exponential Rational Model of Speculative Financial Bubbles
, 2002
"... Keeping a basic tenet of economic theory, rational expectations, we model the nonlinear positive feedback between agents in the stock market as an interplay between nonlinearity and multiplicative noise. The derived hyperbolic stochastic finitetime singularity formula transforms a Gaussian white ..."
Abstract

Cited by 20 (7 self)
 Add to MetaCart
(Show Context)
Keeping a basic tenet of economic theory, rational expectations, we model the nonlinear positive feedback between agents in the stock market as an interplay between nonlinearity and multiplicative noise. The derived hyperbolic stochastic finite-time singularity formula transforms a Gaussian white noise into a rich time series possessing all the stylized facts of empirical prices, as well as accelerated speculative bubbles preceding crashes. We use the formula to invert the two years of price history prior to the recent crash on the Nasdaq (April 2000) and prior to the crash in the Hong Kong market associated with the Asian crisis in early 1994. These complex price dynamics are captured using only one exponent controlling the explosion, together with the variance and mean of the underlying random walk. This offers a new and powerful detection …
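The super-exponential signature can be illustrated with a toy version of a hyperbolic finite-time singularity. This is a schematic sketch, not the paper's calibrated formula: a deterministic skeleton p(t) ∝ (t_c − t)^(−m) decorated with small multiplicative lognormal noise, with all parameter values assumed for illustration.

```python
import math
import random

def bubble_path(t_c=1.0, m=0.5, n=200, sigma=0.05, seed=0):
    """Schematic super-exponential bubble: a hyperbolic finite-time
    singularity (t_c - t)^(-m) times small lognormal noise. All
    parameter values are illustrative, not fitted to data."""
    rng = random.Random(seed)
    ts = [t_c * k / (n + 1) for k in range(1, n + 1)]  # stops short of t_c
    prices = [(t_c - t) ** (-m) * math.exp(sigma * rng.gauss(0, 1))
              for t in ts]
    return ts, prices
```

The detection idea rests on the fact that the local growth rate of log p(t) = −m log(t_c − t) diverges as t approaches t_c, i.e. the price grows faster than any exponential, unlike an ordinary random walk with drift.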
Multifractal returns and hierarchical portfolio theory
 Quantitative Finance, 2001
"... We extend and test empirically the multifractal model of asset returns based on a multiplicative cascade of volatilities from large time scale to small time scales. Inspired by an analogy between price dynamics and hydrodynamic turbulence [Ghashghaie et al., 1996; Arneodo et al., 1998a], it models t ..."
Abstract

Cited by 15 (3 self)
 Add to MetaCart
(Show Context)
We extend and test empirically the multifractal model of asset returns based on a multiplicative cascade of volatilities from large to small time scales. Inspired by an analogy between price dynamics and hydrodynamic turbulence [Ghashghaie et al., 1996; Arneodo et al., 1998a], it models the time-scale dependence of the probability distribution of returns as a superposition of Gaussian laws, with a lognormal distribution of the Gaussian variances. This multifractal description of asset fluctuations is generalized into a multivariate framework to account simultaneously for correlations across time scales and between a basket of assets. The reported empirical evidence shows that this extension is pertinent for financial modelling. Two sources of non-normality are discussed: at large time scales, the distinction between discretely and continuously discounted returns leads to the usual lognormal deviation from normality; at small time scales, the multiplicative cascade process leads to multifractality and strong deviations from normality. By perturbation expansions, we are able to quantify precisely, on the cumulants of the distribution of returns, the interplay and crossover between these two mechanisms. The second part of the paper applies this theory to portfolio optimisation.
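The cascade construction can be sketched in a few lines. This is a generic discrete lognormal cascade, not the paper's continuous multivariate version, and the intermittency parameter lam2 is an assumed value: each level splits every interval in two and multiplies the local variance by an independent lognormal weight with unit mean; small-scale returns are then Gaussian draws with the cascade-generated variance.

```python
import math
import random

def cascade_returns(depth=12, lam2=0.05, seed=0):
    """Discrete lognormal multiplicative cascade of variances, followed
    by Gaussian returns conditioned on the local variance. lam2 is an
    assumed intermittency parameter, not a fitted one."""
    rng = random.Random(seed)
    var = [1.0]
    for _ in range(depth):
        # each child interval gets an independent weight W with E[W] = 1
        var = [v * math.exp(rng.gauss(-lam2 / 2, math.sqrt(lam2)))
               for v in var for _ in range(2)]
    return [math.sqrt(v) * rng.gauss(0, 1) for v in var]
```

The superposition of Gaussians with lognormal variances produces the fat tails and multifractal scaling described in the abstract; the excess kurtosis grows with the number of cascade levels.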
The economic return of research: the Pareto law and its implications
 European Physical Journal B
"... At what level should government or companies support research? This complex multifaceted question encompasses such qualitative bonus as satisfying natural human curiosity, the quest for knowledge and the impact on education and culture, but one of its most scrutinized component reduces to the asses ..."
Abstract

Cited by 14 (0 self)
 Add to MetaCart
(Show Context)
At what level should governments or companies support research? This complex, multifaceted question encompasses such qualitative benefits as satisfying natural human curiosity, the quest for knowledge, and the impact on education and culture, but one of its most scrutinized components reduces to the assessment of economic performance and wealth creation derived from research. Many studies report evidence of positive economic benefits derived from basic research [1, 2]. In certain areas such as biotechnology, semiconductor physics and optical communications [3], the impact of basic research is direct, while in other disciplines the path from discovery to applications is full of surprises. As a consequence, there are persistent uncertainties in the quantification of the exact economic returns of public expenditure on basic research. This gives little help to policy makers trying to determine what the level of funding should be. Here, we suggest that these uncertainties have a fundamental origin, to be found in the interplay between the intrinsic “fat tail” power law nature of the distribution of …
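The fat-tail point can be made concrete with a textbook inverse-CDF Pareto sampler (a generic illustration; the exponent values below are assumptions, not the paper's estimates): when project returns follow a Pareto law with exponent near 1, a single blockbuster can carry a large share of the aggregate return, which is exactly why empirical averages of research payoffs converge so slowly.

```python
import random

def pareto_sample(alpha, n, rng):
    """Inverse-CDF draws from a Pareto law P(X > x) = x^(-alpha), x >= 1."""
    return [(1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

def top_share(alpha, n=10000, seed=0):
    """Fraction of the total 'return' carried by the single largest draw."""
    rng = random.Random(seed)
    xs = pareto_sample(alpha, n, rng)
    return max(xs) / sum(xs)
```

For an exponent near 1 the top project dominates the total, so a funding agency's realized return is essentially a lottery over rare outliers; for a larger exponent the total is shared broadly and sample means stabilize quickly.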
Highly optimised global organisation of metabolic networks
 IEE Proceedings: Systems Biology 152, 2005
"... Abstract: Highlevel, mathematically precise descriptions of the global organisation of complex metabolic networks are necessary for understanding the global structure of metabolic networks, the interpretation and integration of large amounts of biologic data (sequences, various omics) and ultim ..."
Abstract

Cited by 9 (1 self)
 Add to MetaCart
(Show Context)
Abstract: High-level, mathematically precise descriptions of the global organisation of complex metabolic networks are necessary for understanding the global structure of metabolic networks, for the interpretation and integration of large amounts of biologic data (sequences, various omics) and, ultimately, for the rational design of therapies for disease processes. Metabolic networks are highly organised to execute their function efficiently while tolerating wide variation in their environment. These networks are constrained by physical requirements (e.g. conservation of energy, redox and small moieties) but are also remarkably robust and evolvable. The authors use well-known features of the stoichiometry of bacterial metabolic networks to demonstrate how network architecture facilitates such capabilities, and to develop a minimal abstract metabolism which incorporates the known features of the stoichiometry and respects the constraints on enzymes and reactions. This model shows that the essential functionality and constraints drive the trade-offs between robustness and fragility, as well as the large-scale structure and organisation of the whole network, particularly its high variability. The authors emphasise how domain-specific constraints and trade-offs imposed by the environment are important factors in shaping stoichiometry. Importantly, the consequence of these highly organised trade-offs and tolerances is an architecture with a highly structured modularity that is self-dissimilar and scale-rich.
Introduction
Metabolic networks, which have been extensively studied for decades, are emblematic of how evolution has sculpted biologic systems for optimal function. In addition to unambiguous functional descriptions of core metabolism, this conserved network has recently been described in detail in terms of its stoichiometry (mass and energy balance).
A higher-level, mathematically defined description of the global organisation of complex metabolic networks is critical for a deep understanding of metabolism, from the interpretation of huge amounts of biologic data (sequences, various omics) to the design of therapies for disease processes. The stakes are high for getting the big picture right: biologic data plugged into a distorted model, or interpreted in the context of a flawed universal law, propagates misinterpretations. In flux analyses [1], stoichiometry is treated as a constraint, and fluxes are optimised to satisfy a global objective, typically growth. Previous studies, however, have not directly addressed whether the stoichiometry itself is highly optimised or organised in any sense and contributes to the origins and purpose of complexity in biological networks. Yet biochemistry textbooks describe metabolism as having evolved to be 'highly integrated' with the appearance of a 'coherent design' [2]. Here we explore both the important 'design' (with no implication of a 'designer') features of metabolism and the sense in which stoichiometry itself has highly organised and optimised tolerances and trade-offs (HOT).
Basic features of metabolic networks
Metabolism is essentially a linked series of chemical reactions, which function to synthesise building blocks for usable cellular components and to extract energy and reducing power from the cellular environment, in the context of total organism homeostasis. Constraints on the network are imposed by highly unpredictable intracellular and extracellular environments, as well as by the details of enzyme molecular structure, the cost of making enzymes, and the conservation of atoms, energy and small moieties. The simplest model of metabolic networks is a stoichiometry matrix (s-matrix for short) of chemical reactions, with the metabolites in rows and the reactions in columns; it is defined unambiguously except for permutations of rows
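The s-matrix description can be made concrete with a toy network (the reactions below are invented for illustration, not taken from the paper): metabolites index the rows, reactions the columns, and the product S·v gives each metabolite's net production rate under a flux vector v, so a steady state is a flux vector that balances every internal metabolite.

```python
# Toy network:  r1: A -> B,   r2: B -> C,   r3: 2B -> D
METABOLITES = ["A", "B", "C", "D"]
S = [
    [-1,  0,  0],  # A: consumed by r1
    [ 1, -1, -2],  # B: made by r1, consumed by r2 and (twice) by r3
    [ 0,  1,  0],  # C: made by r2
    [ 0,  0,  1],  # D: made by r3
]

def metabolite_balance(S, fluxes):
    """Net production rate of each metabolite: d[m]/dt = S @ v."""
    return [sum(c * v for c, v in zip(row, fluxes)) for row in S]

def is_steady_state(S, fluxes, internal):
    """Mass balance holds when every internal metabolite nets to zero."""
    rates = metabolite_balance(S, fluxes)
    return all(abs(rates[i]) < 1e-9 for i in internal)
```

Flux-balance analyses of the kind cited in [1] work within exactly this formalism: the stoichiometric constraints S·v = 0 on internal metabolites define the feasible space over which a growth objective is optimised.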
Volatility Fingerprints of Large Shocks: Endogenous Versus Exogenous
 arXiv:cond-mat/0204626, 2003
"... Finance is about how the continuous stream of news gets incorporated into prices. But not all news have the same impact. Can one distinguish the effects of the Sept. 11, 2001 attack or of the coup against Gorbachev on Aug., 19, 1991 from financial crashes such as Oct. 1987 as well as smaller volatil ..."
Abstract

Cited by 7 (4 self)
 Add to MetaCart
Finance is about how the continuous stream of news gets incorporated into prices. But not all news has the same impact. Can one distinguish the effects of the Sept. 11, 2001 attack, or of the coup against Gorbachev on Aug. 19, 1991, from financial crashes such as Oct. 1987, as well as from smaller volatility bursts? Using a parsimonious autoregressive process with long-range memory defined on the logarithm of the volatility, we predict strikingly different response functions of the price volatility to great external shocks compared to what we term endogenous shocks, i.e., those which result from the cooperative accumulation of many small shocks. These predictions are remarkably well confirmed empirically on a hierarchy of volatility shocks. Our theory allows us to classify two classes of events (endogenous and exogenous) with specific signatures and characteristic precursors for the endogenous class. It also explains the origin of endogenous shocks as the coherent accumulation of tiny pieces of bad news, and thus unifies all previous explanations of large crashes, including that of Oct. 1987.
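The exogenous case can be sketched with a toy long-memory process (the kernel and all parameter values are illustrative assumptions, not the paper's calibrated multifractal model): log-volatility is a power-law-weighted sum of past innovations, and a single large external innovation then relaxes slowly, as a power law, rather than being forgotten at an exponential rate.

```python
import random

def log_vol_path(n=800, shock_at=400, shock=5.0, theta=0.4,
                 noise=0.05, seed=0):
    """Toy long-memory log-volatility: omega(t) is a sum of past
    innovations weighted by the power-law kernel (t - s + 1)^(-theta).
    One large exogenous innovation is injected at shock_at."""
    rng = random.Random(seed)
    eps = [rng.gauss(0, noise) for _ in range(n)]
    eps[shock_at] += shock  # the exogenous piece of 'news'
    omega = []
    for t in range(n):
        omega.append(sum(eps[s] / (t - s + 1) ** theta
                         for s in range(t + 1)))
    return omega
```

In the paper's classification, an endogenous burst is instead built up by the kernel's coherent accumulation of many small innovations, which is what produces its characteristic precursory growth and its different (slower) relaxation exponent.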
An economic analogy to thermodynamics.
 Am. J. Phys., 1999
"... We develop analogies between economic systems and thermodynamics, and show how economic quantities can characterize the state of an economic system in equilibrium. We argue that just as a physical system in thermodynamic equilibrium requires a nonmechanical variable ͑the temperature T) to specify i ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
We develop analogies between economic systems and thermodynamics, and show how economic quantities can characterize the state of an economic system in equilibrium. We argue that just as a physical system in thermodynamic equilibrium requires a non-mechanical variable (the temperature T) to specify its state, so does an economic system. In addition, both systems must have a corresponding conjugate quantity, the entropy S. We also develop economic analogies to the free energy, the Maxwell relations, and the Gibbs-Duhem relationship. Assuming that economic utility can be measured, we develop an operational definition of an economic temperature scale. We also develop an analogy to statistical mechanics, which leads to Gaussian fluctuations.
Data Mining for Prediction. Financial Series Case
, 2003
"... Hard problems force innovative approaches and attention to detail, their exploration often contributing beyond the area initially attempted. This thesis investigates the data mining process resulting in a predictor for numerical series. The series experimented with come from financial data – usuall ..."
Abstract

Cited by 5 (0 self)
 Add to MetaCart
Hard problems force innovative approaches and attention to detail, and their exploration often contributes beyond the area initially attempted. This thesis investigates the data mining process, resulting in a predictor for numerical series. The series experimented with come from financial data, which are usually hard to forecast. One approach to prediction is to spot patterns in the past, when we already know what followed them, and to test on more recent data. If a pattern is followed by the same outcome frequently enough, we can gain confidence that it is a genuine relationship. Because this approach does not assume any special knowledge or form of the regularities, the method is quite general and applicable to other time series, not just financial ones. However, the generality puts strong demands on the pattern detection, which must notice regularities in any of the many possible forms. The thesis' quest for automated pattern-spotting involves numerous data mining and optimization techniques: neural networks, decision trees, nearest neighbors, regression, genetic algorithms and others. A comparison of their performance on stock exchange index data is one of the contributions. As no single technique performed sufficiently well, a number of predictors have been put together, forming a voting ensemble. The vote is diversified not only by different training data, as usually done, but also by the learning method and its parameters. An approach is also proposed for speeding up predictor fine-tuning. The algorithm development goes still further: a prediction can only be as good as the training data, hence the need for good data preprocessing. In particular, new multivariate discretization and attribute selection algorithms are presented. The thesis also includes overviews of prediction pitfalls and possible solutions, as well as of ensemble-building for series data with financial characteristics such as noise and many attributes. The Ph.D. thesis consists of an extended background on financial prediction, 7 papers, and 2 appendices.
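The voting-ensemble idea reads, in miniature, like the sketch below. The momentum rules are invented stand-ins for the thesis' trained learners (neural networks, decision trees, etc.); here diversity comes only from a method parameter, whereas the thesis also diversifies training data and learning algorithms.

```python
def momentum_predictor(lookback):
    """Toy stand-in for one trained model: predicts the next move as the
    sign of the net change over the last `lookback` observations."""
    def predict(series):
        window = series[-lookback:]
        return 1 if window[-1] - window[0] > 0 else -1
    return predict

def ensemble_vote(predictors, series):
    """Majority vote over the member predictions (+1 up / -1 down)."""
    score = sum(p(series) for p in predictors)
    return 1 if score > 0 else -1

# a small ensemble diversified by its lookback parameter
ensemble = [momentum_predictor(k) for k in (2, 5, 10)]
```

The design rationale is the usual one for ensembles: individually weak, differently-biased predictors cancel part of each other's errors, so the vote is more stable than any single member on noisy series.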
Importance of Positive Feedbacks and Overconfidence in a Self-Fulfilling Ising Model of Financial Markets
, 2005
"... Following a long tradition of physicists who have noticed that the Ising model provides a general background to build realistic models of social interactions, we study a model of financial price dynamics resulting from the collective aggregate decisions of agents. This model incorporates imitation, ..."
Abstract

Cited by 5 (1 self)
 Add to MetaCart
Following a long tradition of physicists who have noticed that the Ising model provides a general background for building realistic models of social interactions, we study a model of financial price dynamics resulting from the collective aggregate decisions of agents. This model incorporates imitation, the impact of external news, and private information. It has the structure of a dynamical Ising model in which agents have two opinions (buy or sell), with coupling coefficients that evolve in time with a memory of how past news explained realized market returns. We study two versions of the model, which differ in how the agents interpret the predictive power of news. We show that the stylized facts of financial markets are reproduced only when agents are overconfident and misattribute to the news a predictive power that actually arises from herding effects, thereby providing positive feedbacks that lead the model to function close to the critical point. Our model exhibits a rich multifractal structure characterized by a continuous spectrum of exponents of the power-law relaxation of endogenous bursts of volatility, in good agreement with previous analytical …
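A minimal schematic of such a dynamical Ising market is given below: agents on a ring imitate their neighbours, respond to a common news field, and carry private noise. The update rule and all parameters are illustrative; in particular, the paper's key ingredient, coupling coefficients that adapt to how well past news explained returns, is replaced here by a fixed coupling K.

```python
import random

def market_step(spins, K, news, noise, rng):
    """One synchronous update: each agent buys (+1) or sells (-1)
    according to the sign of neighbour imitation + public news +
    private noise, on a ring of agents."""
    n = len(spins)
    out = []
    for i in range(n):
        field = (K * (spins[(i - 1) % n] + spins[(i + 1) % n])
                 + news + noise * rng.gauss(0, 1))
        out.append(1 if field > 0 else -1)
    return out

def magnetization(spins):
    """Average opinion; its change is a proxy for the market return."""
    return sum(spins) / len(spins)
```

Near the critical coupling, small news shocks can tip the whole population coherently, which is the mechanism behind the bursts of volatility the abstract describes; far from criticality, opinions either freeze or stay disordered.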