Results 1–10 of 136
Thresholding of statistical maps in functional neuroimaging using the false discovery rate.
 NeuroImage
, 2002
Abstract
Cited by 521 (9 self)
Finding objective and effective thresholds for voxelwise statistics derived from neuroimaging data has been a longstanding problem. With at least one test performed for every voxel in an image, some correction of the thresholds is needed to control the error rates, but standard procedures for multiple hypothesis testing (e.g., Bonferroni) tend not to be sensitive enough to be useful in this context. This paper introduces to the neuroscience literature statistical procedures for controlling the false discovery rate (FDR). Recent theoretical work in statistics suggests that FDR-controlling procedures will be effective for the analysis of neuroimaging data. These procedures operate simultaneously on all voxelwise test statistics to determine which tests should be considered statistically significant. The innovation of the procedures is that they control the expected proportion of the rejected hypotheses that are falsely rejected. We demonstrate this approach using both simulations and functional magnetic resonance imaging data from two simple experiments. © 2002 Elsevier Science (USA)
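The FDR-controlling procedure the abstract describes is, in its standard form, the Benjamini–Hochberg step-up rule applied to all voxelwise p-values at once. A rough NumPy sketch follows; this is an illustrative reconstruction, not the authors' code, and the function name and example data are invented:

```python
import numpy as np

def fdr_threshold(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: return the largest p-value
    cutoff such that the expected proportion of false discoveries among
    the rejected voxels is at most q."""
    p = np.sort(np.asarray(p_values).ravel())
    m = p.size
    # Compare each ordered p-value p_(i) against (i / m) * q.
    below = p <= (np.arange(1, m + 1) / m) * q
    if not below.any():
        return 0.0  # no voxel survives thresholding
    return float(p[below.nonzero()[0].max()])

# Toy example: 1000 null p-values plus 50 strongly "active" voxels.
rng = np.random.default_rng(0)
p_vals = np.concatenate([rng.uniform(size=1000), np.full(50, 1e-6)])
threshold = fdr_threshold(p_vals, q=0.05)
n_rejected = int((p_vals <= threshold).sum())
```

Note that, unlike a fixed Bonferroni cutoff, the resulting threshold adapts to the data: the more strong signals present, the more permissive the cutoff becomes.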
Nonparametric Permutation Tests for Functional Neuroimaging: A Primer with Examples
 Human Brain Mapping
, 2001
Abstract
Cited by 396 (9 self)
The statistical analysis of functional mapping experiments usually proceeds at the voxel level, involving the formation and assessment of a statistic image: at each voxel a statistic indicating evidence of the experimental effect of interest, at that voxel, is computed, giving an image of statistics ...
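The voxel-level permutation approach this primer covers can be illustrated with a maximum-statistic relabeling test. The sketch below is a simplified NumPy version under stated assumptions (a mean-difference statistic stands in for a full t-statistic, and the function name is invented):

```python
import numpy as np

def permutation_maxT(group_a, group_b, n_perm=1000, seed=0):
    """Permutation test using the maximum voxelwise statistic.

    group_a, group_b: (subjects, voxels) arrays.  Subject labels are
    randomly reassigned n_perm times; the null distribution of the
    maximum absolute statistic across voxels yields FWE-corrected
    p-values, as in the single-threshold test the primer describes.
    """
    rng = np.random.default_rng(seed)
    data = np.vstack([group_a, group_b])
    n_a = group_a.shape[0]

    def mean_diff(idx_a):
        mask = np.zeros(data.shape[0], dtype=bool)
        mask[idx_a] = True
        return data[mask].mean(axis=0) - data[~mask].mean(axis=0)

    observed = mean_diff(np.arange(n_a))
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        idx = rng.permutation(data.shape[0])[:n_a]   # random relabeling
        max_null[i] = np.abs(mean_diff(idx)).max()   # max over voxels
    # Corrected p-value: how often the null maximum reaches each voxel's stat.
    p_fwe = (max_null[None, :] >= np.abs(observed)[:, None]).mean(axis=1)
    return observed, p_fwe
```

Because the maximum is taken over all voxels in each permutation, a voxel's corrected p-value accounts for the search across the whole image without assuming independence between voxels.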
Voxel-based morphometry—The methods
 NeuroImage
, 2000
Abstract
Cited by 273 (4 self)
At its simplest, voxel-based morphometry (VBM) involves a voxelwise comparison of the local concentration of gray matter between two groups of subjects. The procedure is relatively straightforward and involves spatially normalizing high-resolution images from all the subjects in the study into the same stereotactic space. This is followed by segmenting the gray matter from the spatially normalized images and smoothing the gray-matter segments. Voxelwise parametric statistical tests comparing the smoothed gray-matter images from the two groups are then performed. Corrections for multiple comparisons are made using the theory of Gaussian random fields. This paper describes the steps involved in VBM, with particular emphasis on segmenting gray matter from MR images with nonuniformity artifact. We provide evaluations of the assumptions that underpin the method, including the accuracy of the segmentation and the assumptions made about the statistical distribution of the data. © 2000 Academic Press
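The statistical core of the pipeline just described (smoothing followed by voxelwise two-sample tests) might look roughly like the following. This is a hypothetical NumPy/SciPy sketch, not the paper's implementation: it assumes the images are already spatially normalized and segmented, and omits the Gaussian-random-field correction:

```python
import numpy as np
from scipy import ndimage, stats

def vbm_voxelwise_t(gm_group1, gm_group2, fwhm_vox=8.0):
    """Smooth each subject's gray-matter map, then run a two-sample
    t-test at every voxel.  Inputs are (subjects, x, y, z) arrays of
    gray-matter concentration in a common stereotactic space."""
    # Convert the smoothing-kernel FWHM (in voxels) to a Gaussian sigma.
    sigma = fwhm_vox / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    smooth = lambda imgs: np.stack(
        [ndimage.gaussian_filter(im, sigma) for im in imgs])
    t, p = stats.ttest_ind(smooth(gm_group1), smooth(gm_group2), axis=0)
    return t, p
```

Smoothing makes the data better conform to the Gaussian-field assumptions the paper discusses, and turns each voxel's value into a regional gray-matter "concentration" rather than a hard segmentation label.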
A voxel-based morphometric study of ageing in 465 normal adult human brains.
 NeuroImage
, 2001
Abstract
Cited by 267 (2 self)
Voxel-based morphometry (VBM) is a whole-brain, unbiased technique for characterizing regional cerebral volume and tissue concentration differences in structural magnetic resonance images. We describe an optimized method of VBM to examine the effects of age on grey and white matter and CSF in 465 normal adults. Global grey matter volume decreased linearly with age, with a significantly steeper decline in males. Local areas of accelerated loss were observed bilaterally in the insula, superior parietal gyri, central sulci, and cingulate sulci. Areas exhibiting little or no age effect (relative preservation) were noted in the amygdala, hippocampi, and entorhinal cortex. Global white matter did not decline with age, but local areas of relative accelerated loss and preservation were seen. There was no interaction of age with sex for regionally specific effects. These results corroborate previous reports and indicate that VBM is a useful technique for studying structural brain correlates of ageing through life in humans. © 2001 Academic Press

Key Words: ageing; normal; MRI; voxel-based morphometry.

INTRODUCTION

There is compelling evidence from post mortem and in vivo studies that the brain shrinks with age, but accurate quantification of the specific patterns of age-related atrophy has proved elusive. It is unclear whether there are predictable common patterns of ageing or whether individual human brains respond to the ageing process idiosyncratically. Post-mortem analysis of mammalian brains suggests that there may be a gradient of ageing from the association areas to the primary sensory regions, with the former showing the most prominent correlations between age and atrophy.
Controlling the familywise error rate in functional neuroimaging: a comparative review
 Statistical Methods in Medical Research
, 2003
Abstract
Cited by 173 (7 self)
Functional neuroimaging data embodies a massive multiple testing problem, where 100 000 correlated test statistics must be assessed. The familywise error rate, the chance of any false positives, is the standard measure of Type I errors in multiple testing. In this paper we review and evaluate three approaches to thresholding images of test statistics: Bonferroni, random field, and the permutation test. Owing to recent developments, improved Bonferroni procedures, such as Hochberg's methods, are now applicable to dependent data. Continuous random field methods use the smoothness of the image to adapt to the severity of the multiple testing problem. Also, increased computing power has made both permutation and bootstrap methods applicable to functional neuroimaging. We evaluate these approaches on t images using simulations and a collection of real datasets. We find that Bonferroni-related tests offer little improvement over Bonferroni, while the permutation method offers substantial improvement over the random field method for low smoothness and low degrees of freedom. We also show the limitations of trying to find an equivalent number of independent tests for an image of correlated test statistics.
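The single-step Bonferroni bound and Hochberg's step-up refinement compared in this review can be sketched side by side; the NumPy code below is illustrative (function names and example p-values are invented, not taken from the paper):

```python
import numpy as np

def bonferroni(p, alpha=0.05):
    """Single-step Bonferroni: reject H_i when p_i <= alpha / m.
    Controls the familywise error rate under any dependence."""
    p = np.asarray(p)
    return p <= alpha / p.size

def hochberg(p, alpha=0.05):
    """Hochberg's step-up procedure: find the largest k with
    p_(k) <= alpha / (m - k + 1) and reject the k smallest p-values.
    Less conservative than Bonferroni; valid under positive dependence."""
    p = np.asarray(p)
    m = p.size
    order = np.argsort(p)
    step_up = alpha / (m - np.arange(1, m + 1) + 1)
    passed = p[order] <= step_up
    reject = np.zeros(m, dtype=bool)
    if passed.any():
        k = passed.nonzero()[0].max()
        reject[order[:k + 1]] = True
    return reject
```

Hochberg always rejects at least as many hypotheses as Bonferroni at the same alpha, which is the sense in which "improved Bonferroni procedures" can gain power; the review's point is that in practice this gain is small for imaging data.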
Spatial Pattern Analysis of Functional Brain Images Using Partial Least Squares
 NeuroImage
, 1996
Abstract
Cited by 158 (17 self)
This paper introduces a new tool for functional neuroimage analysis: partial least squares (PLS). It is unique as a multivariate method in its choice of emphasis for analysis, that being the covariance between brain images and exogenous blocks representing either the experiment design or some behavioral measure. What emerges are spatial patterns of brain activity that represent the optimal association between the images and either of the blocks. This process differs substantially from other multivariate methods in that rather than attempting to predict the individual values of the image pixels, PLS attempts to explain the relation between image pixels and task or behavior. Data from a face encoding and recognition PET rCBF study are used to illustrate two types of PLS analysis: an activation analysis of task with images and a brain behavior analysis. The commonalities across the two analyses are suggestive of a general face memory network differentially engaged during encoding and recognition. PLS thus serves as an important extension by extracting new information from imaging data that is not accessible through other currently used univariate and multivariate image analysis tools. © 1996 Academic Press, Inc.
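In its simplest task/behavior form, the decomposition the abstract describes reduces to a singular value decomposition of the image-by-design cross-covariance matrix. A hypothetical NumPy sketch (not the authors' implementation; the function name and toy data are invented):

```python
import numpy as np

def pls_brain_behavior(images, design):
    """Task/behavior PLS sketch: SVD of the cross-covariance between
    column-centered image data (subjects x voxels) and design or
    behavior columns (subjects x measures).  Columns of U are voxel
    saliences; columns of V weight the design/behavior measures."""
    X = images - images.mean(axis=0)
    Y = design - design.mean(axis=0)
    cross_cov = X.T @ Y                        # voxels x measures
    U, s, Vt = np.linalg.svd(cross_cov, full_matrices=False)
    return U, s, Vt.T
```

Each pair of singular vectors is a latent variable: a whole-brain spatial pattern together with the combination of task or behavior measures it covaries with most strongly, which is exactly the "optimal association" emphasized above.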
Global, voxel, and cluster tests, by theory and permutation, for a difference between two groups of structural MR images of the brain
 IEEE Transactions on Medical Imaging
, 1999
Abstract
Cited by 134 (18 self)
Abstract—We describe almost entirely automated procedures for estimation of global, voxel, and cluster-level statistics to test the null hypothesis of zero neuroanatomical difference between two groups of structural magnetic resonance imaging (MRI) data. Theoretical distributions under the null hypothesis are available for 1) global tissue class volumes; 2) standardized linear model [analysis of variance (ANOVA and ANCOVA)] coefficients estimated at each voxel; and 3) the area of spatially connected clusters generated by applying an arbitrary threshold to a two-dimensional (2D) map of normal statistics at voxel level. We describe novel methods for economically ascertaining probability distributions under the null hypothesis, with fewer assumptions, by permutation of the observed data. Nominal Type I error control by permutation testing is generally excellent, whereas theoretical distributions may be overconservative. Permutation has the additional advantage that it can be used to test any statistic of interest, such as the sum of suprathreshold voxel statistics in a cluster (or cluster mass), regardless of its theoretical tractability under the null hypothesis. These issues are illustrated by application to MRI data acquired from 18 adolescents with hyperkinetic disorder and 16 control subjects matched for age and gender. Index Terms—Brain, imaging/mapping, probability distributions, statistics.
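The cluster "mass" statistic mentioned above (the sum of suprathreshold voxel statistics within each spatially connected cluster) is straightforward to compute; a rough SciPy sketch with an invented function name:

```python
import numpy as np
from scipy import ndimage

def cluster_masses(stat_map, threshold):
    """Cluster 'mass': sum of suprathreshold voxel statistics within
    each spatially connected cluster of the thresholded map.  Under
    permutation, a null distribution for the maximum mass can be built
    just as for any other statistic."""
    supra = stat_map > threshold
    labels, n_clusters = ndimage.label(supra)    # connected components
    return ndimage.sum(stat_map, labels, index=np.arange(1, n_clusters + 1))
```

This illustrates the abstract's point about permutation: mass has no convenient theoretical null distribution, but relabeling the groups and recomputing the maximum cluster mass each time gives an exact test anyway.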
The quantitative evaluation of functional neuroimaging experiments: The NPAIRS data analysis framework
 NeuroImage 11: S592
, 2000
Abstract
Cited by 68 (17 self)
We introduce a data-analysis framework and performance metrics for evaluating and optimizing the interaction between activation tasks, experimental designs, and the methodological choices and tools for data acquisition, preprocessing, data analysis, and extraction of statistical parametric maps (SPMs). Our NPAIRS (nonparametric prediction, activation, influence, and reproducibility resampling) framework provides an alternative to simulations and ROC curves by using real PET and fMRI data sets to examine the relationship between prediction accuracy and the signal-to-noise ratios (SNRs) associated with reproducible SPMs. Using cross-validation resampling we plot training–test set predictions of the experimental design variables (e.g., brain-state labels) versus reproducibility ...
An evaluation of thresholding techniques in fMRI analysis
, 2004
Abstract
Cited by 53 (21 self)
This paper reviews and compares individual voxelwise thresholding methods for identifying active voxels in single-subject fMRI datasets. Different error rates are described which may be used to calibrate activation thresholds. We discuss methods which control each of the error rates at a prespecified level α, including simple procedures which ignore spatial correlation among the test statistics as well as more elaborate ones which incorporate this correlation information. The operating characteristics of the methods are shown through a simulation study, indicating that the error rate used has an important impact on the sensitivity of the thresholding method, but that accounting for correlation has little impact. Therefore, the simple procedures described work well for thresholding most single-subject fMRI experiments and are recommended. The methods are illustrated with a real bilateral finger-tapping experiment.
A Framework For Computational Anatomy
, 2002
Abstract
Cited by 48 (16 self)
The rapid collection of brain images from healthy and diseased subjects has stimulated the development of powerful mathematical algorithms to compare, pool, and average brain data across whole populations. Brain structure is so complex and variable that new approaches in computer vision, partial differential equations, and statistical field theory are being formulated to detect and visualize disease-specific patterns. We present some novel mathematical strategies for computational anatomy, focusing on the creation of population-based brain atlases. These atlases describe how the brain varies with age, gender, genetics, and over time. We review applications in Alzheimer's disease, schizophrenia, and brain development, outlining some current challenges in the field.