Results 1–10 of 25
Representation of spatial orientation by the intrinsic dynamics of the head-direction cell ensemble: A theory
 J. Neurosci
, 1996
Abstract

Cited by 130 (4 self)
The head-direction (HD) cells found in the limbic system in freely moving rats represent the instantaneous head direction of the animal in the horizontal plane regardless of the location of the animal. The internal direction represented by these cells uses both self-motion information for inertially based updating and familiar visual landmarks for calibration. Here, a model of the dynamics of the HD cell ensemble is presented. The stability of a localized static activity profile in the network and a dynamic shift mechanism are explained naturally by synaptic weight distribution components with even and odd symmetry, respectively. Under symmetric weights or symmetric reciprocal connections, a stable activity profile close to the known directional tuning curves will emerge. By adding a slight asymmetry to the weights, the activity profile will shift continuously without ...
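The even/odd weight decomposition can be illustrated with a toy ring-network simulation. This is a minimal sketch under assumed parameters (100 cells, first-harmonic cosine/sine weights, rectification plus crude normalization), not the paper's actual model: the even (cosine) component stabilizes a localized activity bump, and a small odd (sine) component makes the bump rotate.

```python
import numpy as np

# Toy ring of N head-direction cells. All parameters are illustrative.
N = 100
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)
d = theta[:, None] - theta[None, :]

W_even = np.cos(d)            # symmetric component: stabilizes a bump
W_odd = np.sin(d)             # antisymmetric component: rotates the bump
gamma = 0.05                  # strength of the asymmetry (drives rotation)
W = W_even + gamma * W_odd

r = np.exp(np.cos(theta))     # initial bump of activity centered at 0
r /= r.sum()
for _ in range(200):
    u = W @ r
    r = np.maximum(u, 0.0)    # rectification keeps the activity localized
    r /= r.sum()              # normalization acts as crude gain control
```

With `gamma = 0` the bump stays put; increasing `gamma` increases the rotation speed, which is the continuous shift mechanism the abstract describes.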
Portfolio Value-at-Risk with Heavy-Tailed Risk Factors. Mathematical Finance 12
, 2002
Abstract

Cited by 34 (2 self)
This paper develops efficient methods for computing portfolio value-at-risk (VAR) when the underlying risk factors have a heavy-tailed distribution. In modeling heavy tails, we focus on multivariate t distributions and some extensions thereof. We develop two methods for VAR calculation that exploit a quadratic approximation to the portfolio loss, such as the delta-gamma approximation. In the first method, we derive the characteristic function of the quadratic approximation and then use numerical transform inversion to approximate the portfolio loss distribution. Because the quadratic approximation may not always yield accurate VAR estimates, we also develop a low variance Monte Carlo method. This method uses the quadratic approximation to guide the selection of an effective importance sampling distribution that samples risk factors so that large losses occur more often. Variance is further reduced by combining the importance sampling with stratified sampling. Numerical results on a variety of test portfolios indicate that large variance reductions are typically obtained. Both methods developed in this paper overcome difficulties associated with VAR calculation with heavy-tailed risk factors. The Monte Carlo method also extends to the problem of estimating the conditional excess, sometimes known as the conditional VAR.
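The importance-sampling idea can be sketched in a deliberately simplified setting: Gaussian risk factors instead of multivariate t, a single factor, and made-up sensitivities. The loss is approximated by the delta-gamma quadratic form, and the sampling mean is shifted toward the loss region so that large losses occur more often.

```python
import math
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical one-factor delta-gamma loss; the sensitivity values are
# illustrative, not taken from the paper.
delta, gamma_ = 1.0, -0.3
def loss(z):
    return -(delta * z + 0.5 * gamma_ * z * z)   # delta-gamma loss

x = 5.0                  # loss threshold; P(loss > x) is the tail we want
n = 100_000

# Plain Monte Carlo: large losses are rare, so the estimate is noisy.
z = rng.standard_normal(n)
ind = (loss(z) > x).astype(float)
p_mc = ind.mean()

# Importance sampling: sample from N(mu, 1) near the loss boundary and
# reweight each draw by the likelihood ratio phi(z) / phi(z - mu).
mu = -3.5
z_is = rng.standard_normal(n) + mu
w = np.exp(0.5 * mu * mu - mu * z_is)
est = (loss(z_is) > x) * w
p_is = est.mean()
```

Stratifying the draws, as the paper proposes, would reduce the variance further; the mean shift alone already concentrates the samples where the loss indicator is nonzero.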
Weighted analysis of paired microarray experiments
 Statistical Applications in Genetics and Molecular Biology
Abstract

Cited by 5 (3 self)
Copyright © 2005 by the authors. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher, bepress, which has been given certain exclusive rights by the author. Statistical Applications in Genetics and Molecular Biology is produced by The ...
Towards a more complete debt strategy simulation framework. Bank of Canada: Working Paper
, 2002
Abstract

Cited by 5 (1 self)
The views expressed in this paper are those of the author. No responsibility for them should be attributed to the Bank of Canada.
On the Determination Coefficient in Robust Regression
, 2000
Abstract

Cited by 2 (2 self)
In simple linear regression the determination coefficient tells us which percentage of the variance of the response variable is explained by the fitted linear mapping of the explanatory variable. In this paper we examine how to extend the notion of determination coefficient to mean absolute value and least median of squares regression. The concept behind our proposals for the extension originated in an economic context. After dealing with questions of unbiasedness and limiting behaviour for different scales of error variances, we discuss a simple example with different distribution assumptions for the residuals and find that the results meet intuition. AMS SUBJECT CLASSIFICATION: 62G35; 62J05. KEY WORDS: Linear regression; robust regression; determination coefficient; quantile; quantile derivative; mean absolute deviation; least median of squares; unbiasedness.
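One simple way to extend the determination coefficient to mean-absolute-value (L1) regression, in the spirit of what the abstract describes but not necessarily the authors' exact definition, is to replace variances by sums of absolute deviations and the mean by the median:

```python
import numpy as np

# Hypothetical L1 analogue of R^2: the fraction of the total absolute
# deviation (around the median, the L1 analogue of the mean) explained by
# the fitted values. This mirrors the variance decomposition behind the
# classical determination coefficient.
def r1_score(y, y_hat):
    y = np.asarray(y, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    resid = np.abs(y - y_hat).sum()          # absolute residuals of the fit
    total = np.abs(y - np.median(y)).sum()   # total absolute deviation
    return 1.0 - resid / total

y = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
perfect = r1_score(y, y)                          # exact fit
baseline = r1_score(y, np.full(5, np.median(y)))  # constant median fit
```

A perfect fit gives 1 and the constant median fit gives 0, matching the boundary behaviour of the classical determination coefficient.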
Improved likelihood inference in Birnbaum–Saunders regressions. Working paper arXiv:0806.2208v2
, 2009
Abstract

Cited by 2 (2 self)
The Birnbaum–Saunders regression model is commonly used in reliability studies. We address the issue of performing inference in this class of models when the number of observations is small. We show that the likelihood ratio test tends to be liberal when the sample size is small, and we obtain a correction factor which reduces the size distortion of the test. The correction makes the error rate of the test vanish faster as the sample size increases. The numerical results show that the modified test is more reliable in finite samples than the usual likelihood ratio test. We also present an empirical application. Key words: Bartlett correction; Birnbaum–Saunders distribution; Likelihood ratio test; Maximum likelihood estimation.
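The size-distortion-and-correction idea can be sketched in a generic one-parameter model (an exponential rate test, assumed here for simplicity, not the paper's Birnbaum–Saunders setting): the likelihood ratio statistic W overshoots its chi-square reference in tiny samples, and rescaling W by its estimated null mean, a Bartlett-type correction, pulls the test size back toward the nominal level.

```python
import numpy as np

rng = np.random.default_rng(7)

# LR statistic for H0: rate = 1 in an exponential model; sample size and
# replication count are illustrative.
def lr_stat(x):
    n = len(x)
    s = x.sum()
    lam_hat = n / s                              # MLE of the exponential rate
    return 2.0 * (n * np.log(lam_hat) - n + s)   # 2*(l(lam_hat) - l(1))

n, reps = 3, 50_000
W = np.array([lr_stat(rng.exponential(1.0, n)) for _ in range(reps)])

cutoff = 3.841                   # chi2(1) 95% quantile
c = W.mean()                     # empirical Bartlett factor (chi2(1) mean is 1)
rate_raw = (W > cutoff).mean()   # actual size of the uncorrected 5% test
rate_corr = (W / c > cutoff).mean()
```

The paper derives its correction factor analytically rather than by simulation; the empirical rescaling here only illustrates why dividing W by its null mean reduces the size distortion.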
Applications and Extensions of a Technique for Estimator Densities
Abstract

Cited by 1 (1 self)
Abstract—Applications are given of a formula for the exact probability density function of the maximum likelihood estimates of a statistical model, where the data generating model is allowed to differ from the estimation model. The main examples are supported by simulation experiments. Curved exponential families are investigated, for which an approach is described that can be used in many practical situations. The distribution of a maximum likelihood estimator in exponential regression is developed. Nonlinear regression is then considered, with an example of a model discrepancy situation arising in ELISA immunoassays and similar biochemical titrations. An incorrect logistic model is specified for a titration curve that is used for describing the reaction of a chemical sample to applied substrate concentration. A method is suggested to reduce the amount of bias in the estimate of binding affinity. Finally there is a prospective discussion of other possible uses of the technique, including general comparisons of sets of alternative models in frequentist and Bayesian settings, applications to robust estimation and extensions beyond maximum likelihood estimates.
NEW INSIGHTS ON STOCHASTIC COMPLEXITY
Abstract

Cited by 1 (1 self)
The Minimum Description Length (MDL) principle led to various expressions of the stochastic complexity (SC), and the most recent one is given by the negative logarithm of the Normalized Maximum Likelihood (NML). For a better understanding of the properties of the newest SC formula, we relate it to the well-known Generalized Likelihood Ratio Test (GLRT). Additionally, we compare the SC with the Bayesian Information Criterion (BIC) and other model selection rules. Some of the results are discussed in connection with families of models that are widely used in signal processing.
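For a concrete instance of the NML-based stochastic complexity, consider the Bernoulli model, a standard textbook example (not necessarily one of the signal-processing families the paper treats). The stochastic complexity is the maximized negative log-likelihood plus the log of the NML normalizer, and the latter can be compared directly with BIC's (1/2) log n penalty:

```python
import math

def log_nml_denominator(n):
    # log of C_n = sum_k C(n, k) (k/n)^k (1 - k/n)^(n - k), the NML
    # normalizer for n Bernoulli observations; computed in log space.
    terms = []
    for k in range(n + 1):
        t = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
        if 0 < k < n:
            t += k * math.log(k / n) + (n - k) * math.log(1 - k / n)
        terms.append(t)
    m = max(terms)
    return m + math.log(sum(math.exp(t - m) for t in terms))

def stochastic_complexity(ones, n):
    # SC(x) = -log NML(x) = -(maximized log-likelihood) + log C_n
    ll = 0.0
    if 0 < ones < n:
        p = ones / n
        ll = ones * math.log(p) + (n - ones) * math.log(1 - p)
    return -ll + log_nml_denominator(n)

n = 1000
nml_penalty = log_nml_denominator(n)   # ~ (1/2) log(n*pi/2) for Bernoulli
bic_penalty = 0.5 * math.log(n)        # BIC's (k/2) log n with k = 1
```

Both penalties grow like (1/2) log n; the NML normalizer carries the extra constant that BIC discards, which is what makes comparing the two criteria informative.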
A New Method for Analysis of Microarray Gene Expression Assays
, 2005
Abstract
DNA microarray experiments provide a high-throughput way to measure mRNA levels of thousands of genes simultaneously. This technology has been refined during the last 10 years and is now an important tool for molecular biologists. From a statistical point of view, the analysis of data generated with DNA microarrays is far from straightforward. The experiments involve several consecutive steps, each inducing systematic effects and differences in precision. Moreover, microarrays are both time consuming and expensive, so only a few replicates are usually made. Thus, thousands of variables are observed only a few times each and there is a pressing need for quality assessments, which makes traditional statistical methods unsuitable. In this thesis a new method for analysis of paired DNA microarray experiments is presented. The method is based on a generalised linear model with a variance structure which consists of both a gene-independent covariance matrix and a gene-dependent scaling factor. To increase the precision, the latter is assumed to follow a prior distribution with a shape hyperparameter. These assumptions make the method suitable for handling data with quality variations and/or few replicates. Estimators for the covariance matrix as well as the hyperparameter are constructed and general likelihood ratio tests for differentially expressed genes are derived. Simulated datasets are used to show that the method ...
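The variance structure sketched in this abstract, a shared component plus a gene-dependent scaling factor with a prior, is in the spirit of empirical-Bayes variance moderation. The snippet below is a rough illustration of that general idea (the function name, formula, and parameter values are hypothetical, following the common empirical-Bayes form rather than the thesis's exact estimator):

```python
import numpy as np

# Shrink each gene's sample variance s2 (on d residual degrees of freedom)
# toward a prior variance s2_0 carrying d0 prior degrees of freedom; d0
# plays the role of the shape hyperparameter described in the abstract.
def moderated_variance(s2, d, s2_0, d0):
    s2 = np.asarray(s2, dtype=float)
    return (d0 * s2_0 + d * s2) / (d0 + d)

s2 = np.array([0.01, 0.5, 4.0])    # noisy gene-wise variances, few replicates
s2_tilde = moderated_variance(s2, d=2, s2_0=1.0, d0=4)
```

Genes with extreme sample variances are pulled toward the prior value, which stabilizes downstream test statistics when only a few replicates are available.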