Results 1–10 of 400
Phase Noise in Oscillators: a Unifying Theory and Numerical Methods for Characterization
 IEEE Transactions on Circuits and Systems
, 2000
Cited by 203 (22 self)
Abstract—Phase noise is a topic of theoretical and practical interest in electronic circuits, as well as in other fields, such as optics. Although progress has been made in understanding the phenomenon, there still remain significant gaps, both in its fundamental theory and in numerical techniques for its characterization. In this paper, we develop a solid foundation for phase noise that is valid for any oscillator, regardless of operating mechanism. We establish novel results about the dynamics of stable nonlinear oscillators in the presence of perturbations, both deterministic and random. We obtain an exact nonlinear equation for phase error, which we solve without approximations for random perturbations. This leads us to a precise characterization of timing jitter and spectral dispersion, for computing which we develop efficient numerical methods. We demonstrate our techniques on a variety of practical electrical oscillators and obtain good matches with measurements, even at frequencies close to the carrier, where previous techniques break down. Our methods are more than three orders of magnitude faster than the brute-force Monte Carlo approach, which is the only previously available technique that can predict phase noise correctly. Index Terms—Circuit simulation, Fokker-Planck equations, nonlinear oscillators, oscillator noise, phase noise, stochastic
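A well-known consequence of this style of analysis is that, under white-noise perturbations, the phase-error variance grows linearly in time, sigma^2(t) = c*t, and that is exactly what the brute-force Monte Carlo baseline the abstract mentions recovers. A minimal sketch for a pure phase-diffusion toy model (the rate c, step size, and path counts are made-up illustration values, not from the paper):

```python
import numpy as np

def jitter_variance(c=1e-3, dt=1e-3, n_steps=2000, n_paths=500, seed=0):
    """Monte Carlo estimate of phase-error variance vs. time for a
    pure phase-diffusion model: each step adds independent Gaussian
    noise of variance c*dt, so sigma^2(t) should grow like c*t.
    (c, dt, and the path/step counts are illustrative values.)"""
    rng = np.random.default_rng(seed)
    dphi = rng.normal(0.0, np.sqrt(c * dt), size=(n_paths, n_steps))
    phi = np.cumsum(dphi, axis=1)      # phase error along each sample path
    t = dt * np.arange(1, n_steps + 1)
    return t, phi.var(axis=0)          # ensemble variance at each time
```

Fitting a line to the returned variance curve recovers a slope close to c, in line with the linear jitter law.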
Stochasticity in transcriptional regulation: origins, consequences, and mathematical representations
 Biophys. J
, 2001
Cited by 123 (3 self)
ABSTRACT Transcriptional regulation is an inherently noisy process. The origins of this stochastic behavior can be traced to the random transitions among the discrete chemical states of operators that control the transcription rate and to finite number fluctuations in the biochemical reactions for the synthesis and degradation of transcripts. We develop stochastic models to which these random reactions are intrinsic and a series of simpler models derived explicitly from the first as approximations in different parameter regimes. This innate stochasticity can have both a quantitative and qualitative impact on the behavior of gene-regulatory networks. We introduce a natural generalization of deterministic bifurcations for classification of stochastic systems and show that simple noisy genetic switches have rich bifurcation structures; among them, bifurcations driven solely by changing the rate of operator fluctuations even as the underlying deterministic system remains unchanged. We find stochastic bistability where the deterministic equations predict monostability and vice versa. We derive and solve equations for the mean waiting times for spontaneous transitions between quasi-stable states in these switches.
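The "finite number fluctuations" the abstract refers to are what a Gillespie-type stochastic simulation samples exactly. A minimal sketch for a hypothetical birth-death transcription model (constant production at rate k, first-order degradation at rate gamma; the rate values are illustrative, not from the paper):

```python
import random

def gillespie_birth_death(k=10.0, gamma=1.0, t_end=200.0, seed=1):
    """Gillespie simulation of a toy transcription model:
    production (0 -> mRNA) at constant rate k, degradation
    (mRNA -> 0) at rate gamma per molecule."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    times, counts = [0.0], [0]
    while t < t_end:
        a_prod, a_deg = k, gamma * n   # reaction propensities
        a_total = a_prod + a_deg
        # Waiting time to the next reaction is exponential with rate a_total.
        t += rng.expovariate(a_total)
        # Pick the reaction with probability proportional to its propensity.
        n += 1 if rng.random() < a_prod / a_total else -1
        times.append(t)
        counts.append(n)
    return times, counts
```

At stationarity this birth-death process has a Poisson copy-number distribution with mean k/gamma, so the trace fluctuates around 10 for these rates.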
Data-Oriented Performance Analysis of SHA-3 Candidates on FPGA Accelerated Computers
 Design, Automation and Test in Europe, DATE 2011
, 2011
Cited by 102 (7 self)
I have documented that target prices subsumed in downgrade recommendations are the most informative while target prices in coverage reiteration are the least informative. The First Call database enables me to extend the analysis to an intraday frequency. Conducting event studies using high-frequency data is even more critical given that the advent of information technology systems has dramatically changed the landscape of stock trading. The modified approach to event study is relevant to the fast-changing trading environment in today's capital market. For upgrades, there are significant positive market-adjusted returns lasting 20 minutes; for downgrades, there are significant negative market-adjusted returns lasting 25 to 35 minutes. By constructing portfolios on the basis of target price information measures (TP/P, ΔTP/TP1 and ΔTP/P), I also document that the information subsumed in target price revision is more useful than target price alone. Furthermore, the dramatic rise in trading activity coupled with a shift in order imbalances implies that the market interprets recommendations with target price changes broadly as a liquidity event.
Diffusion Maps, Spectral Clustering and Reaction Coordinates of Dynamical Systems
 Applied and Computational Harmonic Analysis: Special issue on Diffusion Maps and Wavelets
, 2006
Cited by 93 (15 self)
A central problem in data analysis is the low dimensional representation of high dimensional data, and the concise description of its underlying geometry and density. In the analysis of large scale simulations of complex dynamical systems, where the notion of time evolution comes into play, important problems are the identification of slow variables and dynamically meaningful reaction coordinates that capture the long time evolution of the system. In this paper we provide a unifying view of these apparently different tasks, by considering a family of diffusion maps, defined as the embedding of complex (high dimensional) data onto a low dimensional Euclidean space, via the eigenvectors of suitably defined random walks on the given datasets. Assuming that the data is randomly sampled from an underlying general probability distribution p(x) = e^{-U(x)}, we show that as the number of samples goes to infinity, the eigenvectors of each diffusion map converge to the eigenfunctions of a corresponding differential operator defined on the support of the probability distribution. Different normalizations of the Markov chain on the graph lead to different limiting differential operators.
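The construction described here — a Gaussian kernel on the data, row normalization into a Markov matrix, and an embedding by the leading nontrivial eigenvectors — can be sketched in a few lines of NumPy. The kernel width eps and the eigenvalue scaling are simplified choices; as the abstract notes, the choice of normalization matters:

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_coords=2):
    """Embed data X (n_samples x n_features) via the eigenvectors of a
    Gaussian-kernel random walk on the data (a minimal sketch)."""
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # squared distances
    K = np.exp(-D2 / eps)                 # Gaussian affinity matrix
    P = K / K.sum(axis=1, keepdims=True)  # row-stochastic Markov matrix
    # P is similar to a symmetric matrix, so its eigenvalues are real;
    # sort them in decreasing order.
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # Drop the trivial constant eigenvector (eigenvalue 1) and scale the
    # next ones by their eigenvalues to form the diffusion coordinates.
    sel = order[1:n_coords + 1]
    return vecs.real[:, sel] * vals.real[sel]
```

On two well-separated clusters, the first diffusion coordinate is approximately piecewise constant, taking opposite signs on the two clusters — the spectral-clustering connection the title alludes to.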
RNA folding at elementary step resolution
 RNA
Cited by 71 (8 self)
We study the stochastic folding kinetics of RNA sequences into secondary structures with a new algorithm based on the formation, dissociation, and the shifting of individual base pairs. We discuss folding mechanisms and the correlation between the barrier structure of the conformational landscape and the folding kinetics for a number of
Fast Numerical Methods for Stochastic Computations: A Review
, 2009
Cited by 62 (2 self)
This paper presents a review of the current state-of-the-art of numerical methods for stochastic computations. The focus is on efficient high-order methods suitable for practical applications, with a particular emphasis on those based on generalized polynomial chaos (gPC) methodology. The framework of gPC is reviewed, along with its Galerkin and collocation approaches for solving stochastic equations. Properties of these methods are summarized by using results from the literature. This paper also attempts to present the gPC-based methods in a unified framework based on an extension of the classical spectral methods into multidimensional random spaces.
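As a concrete instance of the collocation approach reviewed here, consider the scalar decay ODE u' = -k u, u(0) = 1, with the rate k uniform on [a, b]: solve deterministically at Gauss-Legendre nodes and recover the mean of the solution by quadrature. The test problem and parameter values are my own illustration, not from the review:

```python
import numpy as np

def collocation_mean_decay(t=1.0, a=0.5, b=1.5, n_nodes=8):
    """Stochastic collocation sketch for u' = -k*u, u(0) = 1, with
    k ~ Uniform(a, b): evaluate the deterministic solution at mapped
    Gauss-Legendre nodes and average with the quadrature weights."""
    x, w = np.polynomial.legendre.leggauss(n_nodes)  # nodes/weights on [-1, 1]
    k = 0.5 * (b - a) * x + 0.5 * (a + b)            # map nodes to [a, b]
    u = np.exp(-k * t)         # exact deterministic solve at each node
    # Weights sum to 2 on [-1, 1]; the factor 1/2 accounts for the
    # uniform density 1/(b-a) combined with the (b-a)/2 Jacobian.
    return 0.5 * np.dot(w, u)  # E[u(t)] under k ~ Uniform(a, b)
```

Because the integrand is smooth, a handful of nodes already matches the closed-form mean (e^{-at} - e^{-bt}) / ((b - a) t) to near machine precision, which is the spectral accuracy the review emphasizes.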
ENSO theory
 J. Geophys. Res
, 1998
Cited by 59 (9 self)
Abstract. Beginning from the hypothesis by Bjerknes [1969] that ocean-atmosphere interaction was essential to the El Niño-Southern Oscillation (ENSO) phenomenon, the Tropical Ocean-Global Atmosphere (TOGA) decade has not only confirmed this but has supplied detailed theory for mechanisms setting the underlying period and possible mechanisms responsible for the irregularity of ENSO. Essentials of the theory of ocean dynamical adjustment are reviewed from an ENSO perspective. Approaches to simple atmospheric modeling greatly aided development of theory for ENSO atmospheric feedbacks but are critically reviewed for current stumbling blocks for applications beyond ENSO. ENSO theory has benefitted from an unusually complete hierarchy of coupled models of various levels of complexity. Most of the progress during the ENSO decade came from models of intermediate complexity, which are sufficiently detailed to compare to observations and to use in prediction but are less complex than coupled general circulation models. ENSO theory in simple models lagged behind ENSO simulation in intermediate models but has provided a useful role in uniting seemingly diverse viewpoints. The process of boiling ENSO theory down to a single consensus model of all aspects of the phenomenon is still a rapidly progressing area, and theoretical limits to ENSO predictability are still in debate, but a thorough foundation for the discussion has been established in the TOGA decade.
A Method for Selecting the Bin Size of a Time Histogram
, 2007
Cited by 49 (5 self)
The time histogram method is the most basic tool for capturing a time-dependent rate of neuronal spikes. Generally in the neurophysiological literature, the bin size that critically determines the goodness of the fit of the time histogram to the underlying spike rate has been subjectively selected by individual researchers. Here, we propose a method for objectively selecting the bin size from the spike count statistics alone, so that the resulting bar or line graph time histogram best represents the unknown underlying spike rate. For a small number of spike sequences generated from a modestly fluctuating rate, the optimal bin size may diverge, indicating that any time histogram is likely to capture a spurious rate. Given a paucity of data, the method presented here can nevertheless suggest how many experimental trials should be added in order to obtain a meaningful time-dependent histogram with the required accuracy.
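The selection criterion can be illustrated with the published cost function (as I recall it) C(Δ) = (2k̄ − v)/Δ², where k̄ and v are the mean and biased variance of the spike counts at bin width Δ; the width minimizing C is selected. A sketch in which the candidate-width handling and edge details are my own simplifications:

```python
import numpy as np

def select_bin_size(spike_times, widths):
    """For each candidate width, histogram the spikes and evaluate the
    cost C = (2*mean - var) / width**2 on the bin counts; return the
    width with the smallest cost. (A sketch of a count-statistics
    criterion of this kind; the paper's estimator may differ in details.)"""
    spike_times = np.asarray(spike_times)
    t0, t1 = spike_times.min(), spike_times.max()
    best_w, best_cost = None, np.inf
    for w in widths:
        n_bins = max(1, int(np.ceil((t1 - t0) / w)))
        counts, _ = np.histogram(spike_times, bins=n_bins,
                                 range=(t0, t0 + n_bins * w))
        mean, var = counts.mean(), counts.var()  # biased variance, per the criterion
        cost = (2.0 * mean - var) / w ** 2
        if cost < best_cost:
            best_w, best_cost = w, cost
    return best_w
```

For a constant-rate Poisson train the counts have mean ≈ variance, so C ≈ mean/Δ² decreases with Δ and the criterion favors wide bins, matching the intuition that a flat rate needs no fine binning.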
Statistical Mechanics of Nonlinear Nonequilibrium Financial Markets: Applications to Optimized Trading
 MATH. MODELLING
, 1996
Cited by 47 (35 self)
A paradigm of statistical mechanics of financial markets (SMFM) using nonlinear nonequilibrium algorithms, first published in L. Ingber, Mathematical Modelling, 5, 343-361 (1984), is fit to multivariate financial markets using Adaptive Simulated Annealing (ASA), a global optimization algorithm, to perform maximum likelihood fits of Lagrangians defined by path integrals of multivariate conditional probabilities. Canonical momenta are thereby derived and used as technical indicators in a recursive ASA optimization process to tune trading rules. These trading rules are then used on out-of-sample data to demonstrate that they can profit from the SMFM model and to illustrate that these markets are likely not efficient.