Results 1-10 of 190
Phase Noise in Oscillators: a Unifying Theory and Numerical Methods for Characterization
 IEEE Transactions on Circuits and Systems
, 2000
Cited by 124 (11 self)
Abstract—Phase noise is a topic of theoretical and practical interest in electronic circuits, as well as in other fields, such as optics. Although progress has been made in understanding the phenomenon, there still remain significant gaps, both in its fundamental theory and in numerical techniques for its characterization. In this paper, we develop a solid foundation for phase noise that is valid for any oscillator, regardless of operating mechanism. We establish novel results about the dynamics of stable nonlinear oscillators in the presence of perturbations, both deterministic and random. We obtain an exact nonlinear equation for phase error, which we solve without approximations for random perturbations. This leads us to a precise characterization of timing jitter and spectral dispersion, for which we develop efficient numerical methods. We demonstrate our techniques on a variety of practical electrical oscillators and obtain good matches with measurements, even at frequencies close to the carrier, where previous techniques break down. Our methods are more than three orders of magnitude faster than the brute-force Monte Carlo approach, which is the only previously available technique that can predict phase noise correctly.
Index Terms—Circuit simulation, Fokker-Planck equations, nonlinear oscillators, oscillator noise, phase noise, stochastic
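The paper's central result, that the phase error of a perturbed oscillator behaves like a Brownian motion so timing-jitter variance grows linearly in time, can be checked with a minimal Monte Carlo sketch (the diffusion constant `c`, step sizes, and path counts below are illustrative, not values from the paper):

```python
import numpy as np

# Monte Carlo check of linear jitter growth: if the phase error alpha(t) is
# a Brownian motion with diffusion constant c, then var[alpha(t)] = c * t.
# All parameter values here are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
c, dt, n_steps, n_paths = 1e-3, 1e-2, 1000, 4000
dW = rng.normal(0.0, np.sqrt(c * dt), size=(n_paths, n_steps))
alpha = np.cumsum(dW, axis=1)       # phase-error sample paths
var_end = alpha[:, -1].var()        # empirical variance at t = n_steps * dt
print(var_end)                      # should be close to c * t = 0.01
```

The same diffusion constant governs the Lorentzian spectral dispersion near the carrier that the paper characterizes.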
Stochasticity in transcriptional regulation: origins, consequences, and mathematical representations
 Biophys. J
, 2001
Cited by 70 (1 self)
ABSTRACT Transcriptional regulation is an inherently noisy process. The origins of this stochastic behavior can be traced to the random transitions among the discrete chemical states of operators that control the transcription rate and to finite-number fluctuations in the biochemical reactions for the synthesis and degradation of transcripts. We develop stochastic models to which these random reactions are intrinsic and a series of simpler models derived explicitly from the first as approximations in different parameter regimes. This innate stochasticity can have both a quantitative and qualitative impact on the behavior of gene-regulatory networks. We introduce a natural generalization of deterministic bifurcations for classification of stochastic systems and show that simple noisy genetic switches have rich bifurcation structures; among them, bifurcations driven solely by changing the rate of operator fluctuations even as the underlying deterministic system remains unchanged. We find stochastic bistability where the deterministic equations predict monostability and vice versa. We derive and solve equations for the mean waiting times for spontaneous transitions between quasi-stable states in these switches.
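The simplest member of this model family, transcript synthesis at a constant rate with first-order degradation, can be simulated exactly with Gillespie's stochastic simulation algorithm; a minimal sketch, with made-up rate constants `k` and `g`:

```python
import numpy as np

# Gillespie simulation of the simplest transcription model: synthesis at
# rate k, first-order degradation at rate g (rates are illustrative).
def gillespie_birth_death(k=10.0, g=1.0, t_end=50.0, seed=0):
    rng = np.random.default_rng(seed)
    t, n = 0.0, 0
    while t < t_end:
        total = k + g * n                  # total reaction propensity
        t += rng.exponential(1.0 / total)  # time to next reaction
        if rng.random() < k / total:
            n += 1                         # one transcript synthesized
        else:
            n -= 1                         # one transcript degraded
    return n

# The stationary copy number is Poisson with mean k/g = 10.
samples = [gillespie_birth_death(seed=s) for s in range(300)]
print(np.mean(samples))
```

Adding random operator-state transitions on top of this birth-death core yields the noisy switches whose bifurcations the abstract discusses.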
Data-Oriented Performance Analysis
 of SHA-3 Candidates on FPGA Accelerated Computers," Design, Automation and Test in Europe, DATE 2011
, 2011
Cited by 61 (4 self)
I have documented that target prices subsumed in downgrade recommendations are the most informative while target prices in coverage reiteration are the least informative. The First Call database enables me to extend the analysis to an intraday frequency. Conducting event studies using high-frequency data is even more critical given that the advent of information technology systems has dramatically changed the landscape of stock trading. The modified approach to event study is relevant to the fast-changing trading environment in today's capital market. For upgrades, there are significant positive market-adjusted returns lasting 20 minutes; for downgrades, there are significant negative market-adjusted returns lasting 25 to 35 minutes. By constructing portfolios on the basis of target price information measures (TP/P, ΔTP/TP1 and ΔTP/P), I also document that the information subsumed in target price revision is more useful than target price alone. Furthermore, the dramatic rise in trading activity coupled with a shift in order imbalances implies that the market interprets recommendations with target price changes broadly as a liquidity event.
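A minimal sketch of the market-adjusted-return computation underlying such an event study, on synthetic per-minute data (the 20-minute reaction window and return magnitudes below are invented for illustration, not the paper's results):

```python
import numpy as np

# Synthetic event study: abnormal return = stock return - market return,
# cumulated minute by minute after the event. The 20-minute reaction
# window and return magnitudes are made up for illustration.
rng = np.random.default_rng(1)
minutes = 40
market = rng.normal(0.0, 1e-4, minutes)                  # per-minute market returns
reaction = np.where(np.arange(minutes) < 20, 5e-4, 0.0)  # post-event drift
stock = market + reaction + rng.normal(0.0, 1e-4, minutes)
abnormal = stock - market                                # market-adjusted returns
car = np.cumsum(abnormal)                                # cumulative abnormal return
print(car[-1])                                           # roughly 20 * 5e-4 = 0.01
```

In an actual study one would average `car` across many events and test its significance against the pre-event return distribution.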
Diffusion Maps, Spectral Clustering and Reaction Coordinates of Dynamical Systems
 Applied and Computational Harmonic Analysis: Special issue on Diffusion Maps and Wavelets
, 2006
Cited by 61 (13 self)
A central problem in data analysis is the low-dimensional representation of high-dimensional data, and the concise description of its underlying geometry and density. In the analysis of large-scale simulations of complex dynamical systems, where the notion of time evolution comes into play, important problems are the identification of slow variables and dynamically meaningful reaction coordinates that capture the long-time evolution of the system. In this paper we provide a unifying view of these apparently different tasks, by considering a family of diffusion maps, defined as the embedding of complex (high-dimensional) data onto a low-dimensional Euclidean space, via the eigenvectors of suitably defined random walks defined on the given datasets. Assuming that the data is randomly sampled from an underlying general probability distribution p(x) = e^{-U(x)}, we show that as the number of samples goes to infinity, the eigenvectors of each diffusion map converge to the eigenfunctions of a corresponding differential operator defined on the support of the probability distribution. Different normalizations of the Markov chain on the graph lead to different limiting differential operators.
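A compact sketch of the diffusion-map construction described above: a Gaussian kernel on the data, row-normalized into a Markov matrix, with the leading nontrivial eigenvectors used as embedding coordinates (the circle dataset and bandwidth `eps` are illustrative choices, not from the paper):

```python
import numpy as np

# Diffusion-map sketch: Gaussian kernel -> row-stochastic Markov matrix ->
# leading nontrivial eigenvectors as embedding coordinates.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)]           # points on the unit circle
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
eps = 0.1                                         # kernel bandwidth (illustrative)
K = np.exp(-d2 / eps)                             # Gaussian affinities
P = K / K.sum(axis=1, keepdims=True)              # random walk on the data
evals, evecs = np.linalg.eig(P)
order = np.argsort(-evals.real)
psi = evecs.real[:, order[1:3]]                   # first nontrivial coordinates
print(evals.real[order[0]])                       # top eigenvalue of a stochastic matrix is 1
```

The different limiting operators the abstract mentions correspond to different exponents in the density normalization applied to `K` before the row normalization.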
RNA folding at elementary step resolution
 RNA
Cited by 49 (7 self)
We study the stochastic folding kinetics of RNA sequences into secondary structures with a new algorithm based on the formation, dissociation, and the shifting of individual base pairs. We discuss folding mechanisms and the correlation between the barrier structure of the conformational landscape and the folding kinetics for a number of
Statistical Mechanics of Nonlinear Nonequilibrium Financial Markets: Applications to Optimized Trading
 MATH. MODELLING
, 1996
Cited by 41 (34 self)
A paradigm of statistical mechanics of financial markets (SMFM) using nonlinear nonequilibrium algorithms, first published in L. Ingber, Mathematical Modelling, 5, 343-361 (1984), is fit to multivariate financial markets using Adaptive Simulated Annealing (ASA), a global optimization algorithm, to perform maximum-likelihood fits of Lagrangians defined by path integrals of multivariate conditional probabilities. Canonical momenta are thereby derived and used as technical indicators in a recursive ASA optimization process to tune trading rules. These trading rules are then used on out-of-sample data, demonstrating that they can profit from the SMFM model and illustrating that these markets are likely not efficient.
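A generic simulated-annealing loop conveys the optimization step the abstract describes; this is plain SA with a simple cooling schedule, not Ingber's ASA, and the toy quadratic cost merely stands in for the negative log-likelihood being fit:

```python
import numpy as np

# Plain simulated annealing (a simpler cousin of ASA) minimizing a toy
# 2-D quadratic that stands in for a negative log-likelihood.
def anneal(cost, x0, n_iter=5000, t0=1.0, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = cost(x)
    best, fbest = x.copy(), fx
    for i in range(n_iter):
        T = t0 / (1 + i)                              # cooling schedule
        cand = x + rng.normal(0.0, step, size=x.shape)
        fc = cost(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x.copy(), fx
    return best, fbest

best, fbest = anneal(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2, [0.0, 0.0])
print(best, fbest)  # best should approach the minimum at (1, -2)
```

ASA differs mainly in its much faster, per-parameter annealing schedule and reannealing of sensitivities, which is what makes the recursive fits described above tractable.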
ENSO theory
 J. Geophys. Res
, 1998
Cited by 35 (7 self)
Abstract. Beginning from the hypothesis by Bjerknes [1969] that ocean-atmosphere interaction was essential to the El Niño-Southern Oscillation (ENSO) phenomenon, the Tropical Ocean-Global Atmosphere (TOGA) decade has not only confirmed this but has supplied detailed theory for mechanisms setting the underlying period and possible mechanisms responsible for the irregularity of ENSO. Essentials of the theory of ocean dynamical adjustment are reviewed from an ENSO perspective. Approaches to simple atmospheric modeling greatly aided development of theory for ENSO atmospheric feedbacks but are critically reviewed for current stumbling blocks for applications beyond ENSO. ENSO theory has benefited from an unusually complete hierarchy of coupled models of various levels of complexity. Most of the progress during the ENSO decade came from models of intermediate complexity, which are sufficiently detailed to compare to observations and to use in prediction but are less complex than coupled general circulation models. ENSO theory in simple models lagged behind ENSO simulation in intermediate models but has provided a useful role in uniting seemingly diverse viewpoints. The process of boiling ENSO theory down to a single consensus model of all aspects of the phenomenon is still a rapidly progressing area, and theoretical limits to ENSO predictability are still in debate, but a thorough foundation for the discussion has been established in the TOGA decade.
Application of statistical mechanics methodology to term-structure bond-pricing models
 Mathl. Comput. Modelling
, 1991
Cited by 32 (28 self)
Recent work in statistical mechanics has developed new analytical and numerical techniques to solve coupled stochastic equations. This paper applies the very fast simulated reannealing and path-integral methodologies to the estimation of the Brennan and Schwartz two-factor term structure model. It is shown that these methodologies can be utilized to estimate more complicated n-factor nonlinear models.
1. CURRENT MODELS OF TERM STRUCTURE
The modern theory of term structure of interest rates is based on equilibrium and arbitrage models in which bond prices are determined in terms of a few state variables. The one-factor models of Cox, Ingersoll and Ross (CIR) [1-4], and the two-factor models of Brennan and Schwartz (BS) [5-9], have been instrumental in the development of the valuation of interest-dependent securities. The assumptions of these models include:
• Bond prices are functions of a number of state variables, one to several, that follow Markov processes.
• Investors are rational and prefer more wealth to less wealth.
• Investors have homogeneous expectations.
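As a concrete instance of the one-factor models mentioned, the CIR short rate dr = a(b − r) dt + σ√r dW can be simulated with a truncated Euler-Maruyama scheme; the parameter values below are invented for the sketch, not estimates from either paper:

```python
import numpy as np

# Euler-Maruyama simulation of the CIR one-factor short-rate model
# dr = a*(b - r) dt + sigma*sqrt(r) dW, with sqrt truncated at zero
# so the discretized paths cannot produce a NaN.
rng = np.random.default_rng(0)
a, b, sigma = 0.5, 0.04, 0.05          # reversion speed, long-run mean, vol
dt, n_steps, n_paths = 1.0 / 252, 2520, 2000
r = np.full(n_paths, 0.04)             # start at the long-run mean
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    r = r + a * (b - r) * dt + sigma * np.sqrt(np.maximum(r, 0.0)) * dW
print(r.mean())  # mean-reverts toward b = 0.04
```

The path-integral methodology the abstract applies works instead with the transition probability of such diffusions, which is what ASA-style fitting maximizes over the data.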
Learning continuous probability distributions with symmetric diffusion networks
 Cognitive Science
, 1993
Cited by 32 (7 self)
In this article we present symmetric diffusion networks, a family of networks that instantiate the principles of continuous, stochastic, adaptive, and interactive propagation of information. Using methods of Markovian diffusion theory, we formalize the activation dynamics of these networks and then show that they can be trained to reproduce entire multivariate probability distributions on their outputs using the contrastive Hebbian learning rule (CHL). We show that CHL performs gradient descent on an error function that captures differences between desired and obtained continuous multivariate probability distributions. This allows the learning algorithm to go beyond expected values of output units and to approximate complete probability distributions on continuous multivariate activation spaces. We argue that learning continuous distributions is an important task underlying a variety of real-life situations that were beyond the scope of previous connectionist networks. Deterministic networks, like backpropagation, cannot learn this task because they are limited to learning average values of independent output units. Previous stochastic connectionist networks could learn probability distributions but they were limited to discrete variables. Simulations show that symmetric diffusion networks can be trained with the CHL rule to approximate discrete and continuous probability distributions of various types.
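The continuous stochastic activation dynamics such networks rely on can be illustrated with a one-unit overdamped Langevin equation, whose stationary samples follow p(x) ∝ exp(−U(x)); here U(x) = x²/2, so the samples should be standard normal. This is only a sketch of the sampling dynamics, not the CHL training procedure:

```python
import numpy as np

# One-unit overdamped Langevin dynamics dx = -U'(x) dt + sqrt(2) dW with
# U(x) = x**2 / 2, whose stationary density is the standard normal.
rng = np.random.default_rng(0)
dt, n_steps, burn_in = 1e-2, 100_000, 5_000
noise = rng.normal(size=n_steps)
x, samples = 0.0, []
for i in range(n_steps):
    x += -x * dt + np.sqrt(2.0 * dt) * noise[i]  # Euler-Maruyama step
    if i >= burn_in:                             # discard transient
        samples.append(x)
print(np.mean(samples), np.var(samples))         # should be near 0 and 1
```

In a full network, U would couple many units through learned symmetric weights, and CHL would adjust those weights by contrasting clamped and free-running statistics of such trajectories.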