Results 1 – 10 of 33
2003: Global analyses of sea surface temperature, sea ice, and night marine air temperature since the late Nineteenth Century
 J. Geophysical Research
"... data set, HadISST1, and the nighttime marine air temperature (NMAT) data set, HadMAT1. HadISST1 replaces the global sea ice and sea surface temperature (GISST) data sets and is a unique combination of monthly globally complete fields of SST and sea ice concentration on a 1 ° latitudelongitude grid ..."
Abstract

Cited by 193 (0 self)
data set, HadISST1, and the nighttime marine air temperature (NMAT) data set, HadMAT1. HadISST1 replaces the global sea ice and sea surface temperature (GISST) data sets and is a unique combination of monthly globally complete fields of SST and sea ice concentration on a 1° latitude-longitude grid from 1871. The companion HadMAT1 runs monthly from 1856 on a 5° latitude-longitude grid and incorporates new corrections for the effect on NMAT of increasing deck (and hence measurement) heights. HadISST1 and HadMAT1 temperatures are reconstructed using a two-stage reduced-space optimal interpolation procedure, followed by superposition of quality-improved gridded observations onto the reconstructions to restore local detail. The sea ice fields are made more homogeneous by compensating satellite microwave-based sea ice concentrations for the impact of surface melt effects on retrievals in the Arctic and for algorithm deficiencies in the Antarctic and by making the historical in situ concentrations consistent with the satellite data. SSTs near sea ice are estimated using statistical relationships between SST and sea ice concentration. HadISST1 compares well with other published analyses, capturing trends in global, hemispheric, and regional SST well,
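The reduced-space optimal interpolation idea described in this abstract can be sketched with a toy example. Everything below is synthetic: two leading EOFs stand in for the large-scale modes, and a plain least-squares fit stands in for the full optimal-interpolation step, which would additionally weight by error covariances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "SST" record: 200 months x 50 grid points, built from two
# large-scale spatial modes plus observational noise (all data invented).
t = np.arange(200)
modes = np.stack([np.sin(np.linspace(0, np.pi, 50)),
                  np.cos(np.linspace(0, 2 * np.pi, 50))])
amps = np.stack([np.sin(2 * np.pi * t / 24.0), np.cos(2 * np.pi * t / 60.0)])
field = amps.T @ modes + 0.1 * rng.standard_normal((200, 50))

# Stage 1: leading EOFs (spatial patterns) from the anomaly record.
anom = field - field.mean(axis=0)
_, _, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt[:2]                                   # keep the two leading modes

# Stage 2: for a sparsely observed month, fit the mode amplitudes to the
# 15 available grid points only, then reconstruct the complete field.
obs_idx = rng.choice(50, size=15, replace=False)
truth = anom[100]
coeffs, *_ = np.linalg.lstsq(eofs[:, obs_idx].T, truth[obs_idx], rcond=None)
recon = coeffs @ eofs

rmse = float(np.sqrt(np.mean((recon - truth) ** 2)))
```

In HadISST1 the reduced-space reconstruction is additionally blended with quality-improved gridded observations to restore local detail; the sketch stops at the reduced-space fit.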
An Ensemble Adjustment Kalman Filter for Data Assimilation
, 2001
"... A theory for estimating the probability distribution of the state of a model given a set of observations exists. This nonlinear ..."
Abstract

Cited by 139 (4 self)
A theory for estimating the probability distribution of the state of a model given a set of observations exists. This nonlinear
Analysis of incomplete climate data: Estimation of mean values and covariance matrices and imputation of missing values
, 2001
"... Estimating the mean and the covariance matrix of an incomplete dataset and filling in missing values with imputed values is generally a nonlinear problem, which must be solved iteratively. The expectation maximization (EM) algorithm for Gaussian data, an iterative method both for the estimation of m ..."
Abstract

Cited by 57 (3 self)
Estimating the mean and the covariance matrix of an incomplete dataset and filling in missing values with imputed values is generally a nonlinear problem, which must be solved iteratively. The expectation maximization (EM) algorithm for Gaussian data, an iterative method both for the estimation of mean values and covariance matrices from incomplete datasets and for the imputation of missing values, is taken as the point of departure for the development of a regularized EM algorithm. In contrast to the conventional EM algorithm, the regularized EM algorithm is applicable to sets of climate data, in which the number of variables typically exceeds the sample size. The regularized EM algorithm is based on iterated analyses of linear regressions of variables with missing values on variables with available values, with regression coefficients estimated by ridge regression, a regularized regression method in which a continuous regularization parameter controls the filtering of the noise in the data. The regularization parameter is determined by generalized cross-validation, such as to minimize, approximately, the expected mean squared error of the imputed values. The regularized EM algorithm can estimate, and exploit for the imputation of missing values, both synchronic and diachronic covariance matrices, which may contain information on spatial covariability, stationary temporal covariability, or cyclostationary temporal covariability. A test of the regularized EM algorithm with simulated surface temperature data demonstrates that the algorithm is applicable to typical sets of climate data and that it leads to more accurate estimates of the missing values than a conventional non-iterative imputation technique.
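A stripped-down version of the iterated ridge-regression imputation described above can be written in a few lines. This is a simplified sketch, not the paper's algorithm: the ridge parameter is fixed here, whereas the paper selects it by generalized cross-validation.

```python
import numpy as np

def regularized_em_impute(X, n_iter=50, ridge=0.1):
    """Fill NaNs in X by iterated ridge regressions of missing variables
    on available ones -- a simplified sketch of the regularized EM idea
    (fixed ridge parameter, not chosen by generalized cross-validation)."""
    X = X.copy()
    miss = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[miss] = np.take(col_means, np.where(miss)[1])     # initial mean fill
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        C = np.cov(X, rowvar=False)
        for i in range(X.shape[0]):
            m = miss[i]
            if not m.any() or m.all():
                continue
            o = ~m
            # Ridge-regularized regression of the missing entries of this
            # row on its observed entries.
            Coo = C[np.ix_(o, o)] + ridge * np.eye(int(o.sum()))
            Cmo = C[np.ix_(m, o)]
            X[i, m] = mu[m] + Cmo @ np.linalg.solve(Coo, X[i, o] - mu[o])
    return X
```

On strongly correlated data (the regime of interest here, where variables share large-scale structure), the iterated regression fill is markedly more accurate than a one-shot column-mean fill.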
Model assessment of decadal variability and trends in the tropical Pacific Ocean
 J. Clim
, 1998
"... In this report, global coupled ocean–atmosphere models are used to explore possible mechanisms for observed decadal variability and trends in Pacific Ocean SSTs over the past century. The leading mode of internally generated decadal (�7 yr) variability in the model resembles the observed decadal var ..."
Abstract

Cited by 31 (4 self)
In this report, global coupled ocean–atmosphere models are used to explore possible mechanisms for observed decadal variability and trends in Pacific Ocean SSTs over the past century. The leading mode of internally generated decadal (~7 yr) variability in the model resembles the observed decadal variability in terms of pattern and amplitude. In the model, the pattern and time evolution of tropical winds and oceanic heat content are similar for the decadal and ENSO timescales, suggesting that the decadal variability has a similar “delayed oscillator” mechanism to that on the ENSO timescale. The westward phase propagation of the heat content anomalies, however, is slower and centered slightly farther from the equator (~12° vs 9°N) for the decadal variability. Cool SST anomalies in the midlatitude North Pacific during the warm tropical phase of the decadal variability are induced in the model largely by oceanic advection anomalies. An index of observed SST over a broad triangular region of the tropical and subtropical Pacific indicates a warming rate of ~0.41°C (100 yr)⁻¹ since 1900, ~1.2°C (100 yr)⁻¹ since 1949, and ~2.9°C (100 yr)⁻¹ since 1971. All three warming trends are highly unusual in terms of their duration, with occurrence rates of less than 0.5% in a 2000-yr simulation of internal climate variability using a low-resolution model. The most unusual is the trend since 1900 (96-yr duration): the longest simulated duration of a trend of this magnitude is 85 yr. This
PRACTICAL APPROACHES TO PRINCIPAL COMPONENT ANALYSIS IN THE PRESENCE OF MISSING VALUES
"... Informaatio ja luonnontieteiden tiedekunta ..."
HISTORICAL CLIMATOLOGY IN EUROPE – THE STATE OF THE ART
"... Abstract. This paper discusses the state of European research in historical climatology. This field of science and an overview of its development are described in detail. Special attention is given to the documentary evidence used for data sources, including its drawbacks and advantages. Further, me ..."
Abstract

Cited by 4 (3 self)
Abstract. This paper discusses the state of European research in historical climatology. This field of science and an overview of its development are described in detail. Special attention is given to the documentary evidence used for data sources, including its drawbacks and advantages. Further, methods and significant results of historical-climatological research, mainly achieved since 1990, are presented. The main focus concentrates on data, methods, definitions of the “Medieval Warm Period” and the “Little Ice Age”, synoptic interpretation of past climates, climatic anomalies and natural disasters, and the vulnerability of economies and societies to climate, as well as images and social representations of past weather and climate. The potential of historical climatology for climate modelling research is discussed briefly. Research perspectives in historical climatology are formulated with reference to data, methods, interdisciplinarity and impacts.
A Dimension-Reduction Approach to Space-Time Kalman Filtering
, 1999
"... this article, we present an approach to spacetime prediction that achieves dimension reduction and uses a statistical model that is temporally dynamic and spatially descriptive. That is, it exploits the unidirectional flow of time (in an autoregressive framework) and is spatially "descriptive& ..."
Abstract

Cited by 4 (0 self)
this article, we present an approach to space-time prediction that achieves dimension reduction and uses a statistical model that is temporally dynamic and spatially descriptive. That is, it exploits the unidirectional flow of time (in an autoregressive framework) and is spatially "descriptive" in that the autoregressive process is spatially colored. With the inclusion of a measurement equation, this formulation naturally leads to the development of a spatiotemporal Kalman filter that achieves dimension reduction in the analysis of large spatiotemporal data sets. We use this Kalman filter to predict at times and locations for which we do not have data. The method is applied to a data set of near-surface winds, obtained from a blending of observations and a deterministic atmospheric model, and is shown to perform better than independently applying simple kriging to each time slice of the spatial field. That is, we can improve prediction by exploiting the dynamic structure of the spatial fields evolving in time. The improvement becomes more pronounced as the signal-to-noise ratio decreases.
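The dimension-reduction idea can be illustrated with a toy spatiotemporal Kalman filter: the filter state is the amplitude vector of a few fixed spatial basis functions rather than the full gridded field. Everything below (basis, dynamics, noise levels, observation network) is invented for illustration and does not reproduce the paper's spatially colored autoregression.

```python
import numpy as np

rng = np.random.default_rng(1)

# Spatial basis: 3 smooth functions on 40 grid points; the filter state
# is the 3-vector of basis amplitudes, not the 40-point field.
x_grid = np.linspace(0, 1, 40)
Phi = np.stack([np.ones_like(x_grid),
                np.sin(np.pi * x_grid),
                np.cos(np.pi * x_grid)]).T          # shape (40, 3)

A = 0.9 * np.eye(3)          # AR(1) dynamics of the amplitudes
Q = 0.05 * np.eye(3)         # process-noise covariance
R = 0.1                      # measurement-noise variance

# Simulate a true amplitude trajectory and noisy point observations
# at 10 of the 40 locations.
T = 100
a_true = np.zeros((T, 3))
for k in range(1, T):
    a_true[k] = A @ a_true[k - 1] + rng.multivariate_normal(np.zeros(3), Q)
obs_idx = rng.choice(40, size=10, replace=False)
H = Phi[obs_idx]                                    # shape (10, 3)
y = a_true @ H.T + np.sqrt(R) * rng.standard_normal((T, 10))

# Standard Kalman filter, run entirely in the reduced 3-dim state space.
a, P = np.zeros(3), np.eye(3)
est = np.zeros((T, 3))
for k in range(T):
    a, P = A @ a, A @ P @ A.T + Q                   # predict
    S = H @ P @ H.T + R * np.eye(10)
    K = P @ H.T @ np.linalg.inv(S)
    a = a + K @ (y[k] - H @ a)                      # update
    P = (np.eye(3) - K @ H) @ P
    est[k] = a

# Predict the full field at all 40 locations from the filtered amplitudes.
field_est = est @ Phi.T
```

The covariance recursion runs on 3x3 matrices regardless of grid size, which is the point of the dimension reduction: the cost of the filter scales with the number of retained modes, not the number of grid points.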
Estimation of the Diurnal Variability of Sea Surface Temperatures using Numerical Modelling and the Assimilation of Satellite Observations
, 2006
"... This thesis is concerned with the diurnal cycle of sea surface temperature (SST). The diurnal variability of SSTs are an important feature of the climate system. In order to obtain accurate SST records and reduce errors in satellite derived SST estimates an understanding of the diurnal signals in th ..."
Abstract

Cited by 3 (2 self)
This thesis is concerned with the diurnal cycle of sea surface temperature (SST). The diurnal variability of SSTs is an important feature of the climate system. In order to obtain accurate SST records and reduce errors in satellite-derived SST estimates, an understanding of the diurnal signals in these observations is essential. Satellite-derived SST observations measure the skin and subskin layers, whereas ocean models typically resolve a 5 metre temperature. An understanding of these differences is important for assimilation of SST. In this thesis a one-dimensional mixed layer ocean model is improved and developed with the capability of representing the dominant processes involved in the development of the diurnal cycle of SSTs. The model is forced with operational forecast data and used to build spatial maps of the diurnal warming. The extent of the diurnal warming at a particular location and time is predominantly governed by a nonlinear response to the cloud cover and sea surface wind speeds over the day. The accuracy of the modelled SST is hampered by uncertainty in these forcing variables. A novel algorithm is developed that uses SST observations to derive
Quantitative Study of Smoothing Spline ANOVA Based Fingerprint Methods for Attribution of Global Warming
, 1999
"... A fingerprintbased method for climate change detection and attribution with some novel features is proposed. The method is based on a functional ANOVA (ANalysis Of VAriance) decomposition of a time and space signal, further decomposed into global timetrend and timetrend anomaly as a function of s ..."
Abstract

Cited by 3 (1 self)
A fingerprint-based method for climate change detection and attribution with some novel features is proposed. The method is based on a functional ANOVA (ANalysis Of VAriance) decomposition of a time and space signal, further decomposed into a global time-trend and a time-trend anomaly as a function of space. The method estimates the signal as a component of forced minus background climate model output, and then uses a partial spline model to estimate and test for the existence of signal in historical data. The method is based on the classical detection of signal in noise; however, there are several features apparently novel to the fingerprint literature: in particular, the analysis takes place directly in observation space, anomalies are fitted directly, and there is the possibility of estimating certain parameters of covariance models for the historical data as part of the analysis. Simulation studies using climate model runs from GFDL and NCAR and historical data for NH Winter average...
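The classical detection-of-signal-in-noise setup underlying fingerprint methods can be written down in a few lines. The pattern, noise covariance, and true scaling below are all hypothetical, and the generalized-least-squares scaling estimate is a bare-bones stand-in for the paper's partial-spline machinery.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: a "fingerprint" spatial pattern f from a forced
# model run, observations y = beta * f + internal noise, and a
# generalized-least-squares estimate of the scaling factor beta.
n = 60
f = np.sin(np.linspace(0, np.pi, n))        # model-derived signal pattern
noise_cov = 0.2 * np.eye(n)                 # internal-variability covariance (assumed known)
y = 0.8 * f + rng.multivariate_normal(np.zeros(n), noise_cov)

Cinv = np.linalg.inv(noise_cov)
beta_hat = (f @ Cinv @ y) / (f @ Cinv @ f)  # GLS scaling estimate
var_beta = 1.0 / (f @ Cinv @ f)             # its sampling variance

# Detection: beta significantly greater than zero at roughly 2 sigma.
detected = beta_hat > 2 * np.sqrt(var_beta)
```

Real applications replace the diagonal noise covariance with one estimated from control runs, which is where much of the methodological difficulty (and the paper's covariance-parameter estimation) lives.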
Globality and Optimality in Climate Field Reconstructions from Proxy Data
, 1999
"... A primary objective of paleoclimate research is the characterization of natural climate variability on time scales of years to millennia. We develop here a systematic methodology for the objective and verifiable reconstruction of climate fields from sparse observational networks of proxy data, using ..."
Abstract

Cited by 3 (2 self)
A primary objective of paleoclimate research is the characterization of natural climate variability on time scales of years to millennia. We develop here a systematic methodology for the objective and verifiable reconstruction of climate fields from sparse observational networks of proxy data, using reduced space objective analysis. In this approach we seek to reconstruct only the leading modes of large-scale variability which are observed in the modern climate and resolved in the proxy data. Given explicit assumptions, the analysis produces climate fields, indices, and their associated estimated errors. These may be subsequently checked for consistency with parameter choices and procedural assumptions by comparison with withheld data and results from benchmark experiments. The methodology is applied to the candidate tree ring indicator data set described by Villalba et al. (1999), for the reconstruction of gridded Pacific Ocean basin sea surface temperature (SST) over the interval 10...
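The withheld-data verification mentioned above can be sketched as a calibration/verification split scored with a reduction-of-error (RE) statistic. The index and proxy records below are synthetic stand-ins; real applications calibrate against gridded SST modes and tree-ring networks.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical target index (e.g. an SST-mode amplitude) and three proxy
# records linearly related to it, with proxy noise (all data invented).
T = 120
index = np.sin(2 * np.pi * np.arange(T) / 30)
proxies = np.outer(index, [1.0, 0.6, -0.8]) + 0.3 * rng.standard_normal((T, 3))

cal = slice(60, 120)     # modern, well-observed calibration interval
ver = slice(0, 60)       # withheld early interval used only for verification

# Least-squares calibration of the index on the proxy network,
# then reconstruction over the full record.
G, *_ = np.linalg.lstsq(proxies[cal], index[cal], rcond=None)
recon = proxies @ G

# Reduction of error over the withheld period:
# RE = 1 - MSE(reconstruction) / MSE(calibration-mean benchmark).
bench = index[cal].mean()
re = 1 - np.sum((index[ver] - recon[ver]) ** 2) / np.sum((index[ver] - bench) ** 2)
```

RE > 0 means the reconstruction beats the trivial "climatology" benchmark on data it never saw, which is the kind of consistency check the abstract refers to.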