Results 1 – 10 of 21,765
Exploration, normalization, and summaries of high density oligonucleotide array probe level data.
 Biostatistics,
, 2003
"... SUMMARY In this paper we report exploratory analyses of high-density oligonucleotide array data from the Affymetrix GeneChip® system with the objective of improving upon currently used measures of gene expression. Our analyses make use of three data sets: a small experimental study consisting of f ..."
Cited by 854 (33 self)
of five MGU74A mouse GeneChip® arrays; part of the data from an extensive spike-in study conducted by Gene Logic and Wyeth's Genetics Institute involving 95 HGU95A human GeneChip® arrays; and part of a dilution study conducted by Gene Logic involving 75 HGU95A GeneChip® arrays. We display some
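The cross-array normalization this entry's analyses depend on is often implemented as quantile normalization. A minimal sketch, not the paper's full pipeline; the 4-gene × 3-array intensity matrix is invented for illustration:

```python
import numpy as np

def quantile_normalize(X):
    """Quantile normalization: force every array (column) to share the
    same empirical distribution by replacing each column's values with
    the row-wise mean of the sorted columns, assigned back by rank.
    Ties are broken arbitrarily in this sketch."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # within-column ranks
    mean_quantiles = np.sort(X, axis=0).mean(axis=1)   # shared reference distribution
    return mean_quantiles[ranks]

# Invented 4-gene x 3-array intensity matrix
X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(X)
```

After the transform, every column of `Xn` is a permutation of the same reference quantiles, so the arrays share one distribution.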
Quantal Response Equilibria For Normal Form Games
 Games and Economic Behavior
, 1995
"... We investigate the use of standard statistical models for quantal choice in a game theoretic setting. Players choose strategies based on relative expected utility, and assume other players do so as well. We define a Quantal Response Equilibrium (QRE) as a fixed point of this process, and establish e ..."
Cited by 647 (28 self)
existence. For a logit specification of the error structure, we show that as the error goes to zero, QRE approaches a subset of Nash equilibria and also implies a unique selection from the set of Nash equilibria in generic games. We fit the model to a variety of experimental data sets by using maximum
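The logit specification mentioned in this entry has a simple fixed-point form: each pure strategy is played with probability proportional to exp(λ × expected utility). A hedged sketch using plain fixed-point iteration (practical QRE solvers use damping or homotopy continuation; the matching-pennies payoffs are just an illustration):

```python
import numpy as np

def logit_qre(A, B, lam, iters=500):
    """Plain fixed-point iteration for a logit Quantal Response
    Equilibrium of a bimatrix game: each pure strategy is played with
    probability proportional to exp(lam * expected utility). As lam
    grows, play concentrates on best responses (approaching Nash)."""
    p = np.full(A.shape[0], 1.0 / A.shape[0])  # row player's mixed strategy
    q = np.full(A.shape[1], 1.0 / A.shape[1])  # column player's mixed strategy
    for _ in range(iters):
        p = np.exp(lam * (A @ q)); p /= p.sum()
        q = np.exp(lam * (B.T @ p)); q /= q.sum()
    return p, q

# Matching pennies (zero-sum): the QRE is uniform mixing for any lam
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
p, q = logit_qre(A, -A, lam=2.0)
```

For this game the logit response map fixes the uniform mix, so the iteration sits at the unique equilibrium immediately.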
The Entity-Relationship Model: Toward a Unified View of Data
 ACM Transactions on Database Systems
, 1976
"... A data model, called the entity-relationship model, is proposed. This model incorporates some of the important semantic information about the real world. A special diagrammatic technique is introduced as a tool for database design. An example of database design and description using the model and th ..."
Cited by 1829 (6 self)
ambiguities in these models are analyzed. Possible ways to derive their views of data from the entity-relationship model are presented. Key Words and Phrases: database design, logical view of data, semantics of data, data models, entity-relationship model, relational model, Data Base Task Group, network model
Inductive-Data-Type Systems
, 2002
"... In a previous work ("Abstract Data Type Systems", TCS 173(2), 1997), the last two authors presented a combined language made of a (strongly normalizing) algebraic rewrite system and a typed λ-calculus enriched by pattern-matching definitions following a certain format, called the "General Schem ..."
Cited by 821 (23 self)
AgentSpeak(L): BDI Agents speak out in a logical computable language
, 1996
"... Belief-Desire-Intention (BDI) agents have been investigated by many researchers from both a theoretical specification perspective and a practical design perspective. However, there still remains a large gap between theory and practice. The main reason for this has been the complexity of theorem-prov ..."
Cited by 514 (2 self)
proving or model-checking in these expressive specification logics. Hence, the implemented BDI systems have tended to use the three major attitudes as data structures, rather than as modal operators. In this paper, we provide an alternative formalization of BDI agents by providing an operational and proof
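The data-structure view of the three attitudes described in this entry can be made concrete with a toy interpreter cycle. A minimal sketch; the `Plan` fields and the `at(...)` literals are invented for illustration and are not AgentSpeak(L) syntax:

```python
from dataclasses import dataclass

@dataclass
class Plan:
    """Hypothetical plan rule: when `trigger` is posted and `context`
    holds in the belief base, add the literals in `body`."""
    trigger: str
    context: set
    body: list

def bdi_step(beliefs, events, plans):
    """One cycle of a toy BDI interpreter with beliefs, events, and
    plans held as plain data structures rather than modal formulas."""
    if not events:
        return beliefs
    event = events.pop(0)
    for plan in plans:
        if plan.trigger == event and plan.context <= beliefs:
            beliefs |= set(plan.body)   # execute the plan body
            break
    return beliefs

beliefs = {"at(home)"}
plans = [Plan("go(work)", {"at(home)"}, ["at(work)"])]
beliefs = bdi_step(beliefs, ["go(work)"], plans)
```

The point of the sketch is only that beliefs, events, and plans are ordinary values manipulated by a select-and-apply loop, mirroring how implemented BDI systems sidestep theorem proving.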
A model for technical inefficiency effects in a stochastic frontier production function for panel data
 Empirical Economics
, 1995
"... Abstract: A stochastic frontier production function is defined for panel data on firms, in which the non-negative technical inefficiency effects are assumed to be a function of firm-specific variables and time. The inefficiency effects are assumed to be independently distributed as truncations of nor ..."
Cited by 555 (4 self)
of normal distributions with constant variance, but with means which are a linear function of observable variables. This panel data model is an extension of recently proposed models for inefficiency effects in stochastic frontiers for cross-sectional data. An empirical application of the model is obtained
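The composed-error structure described here (symmetric noise plus a one-sided inefficiency term) can be sketched by simulation. The truncated-at-zero normal for the inefficiency term and the naive rejection loop are assumptions for illustration, not the paper's estimator:

```python
import numpy as np

def simulate_frontier(X, beta, sigma_v, mu, sigma_u, rng):
    """Simulate log-output from a composed-error stochastic frontier:
    y = X @ beta + v - u, with symmetric noise v ~ N(0, sigma_v^2) and
    inefficiency u >= 0 drawn as a truncation at zero of N(mu, sigma_u^2)
    via naive rejection sampling."""
    n = X.shape[0]
    v = rng.normal(0.0, sigma_v, n)
    u = rng.normal(mu, sigma_u, n)
    while (u < 0).any():                      # resample negative draws
        neg = u < 0
        u[neg] = rng.normal(mu, sigma_u, neg.sum())
    return X @ beta + v - u

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5), np.linspace(0.0, 1.0, 5)])
beta = np.array([1.0, 0.5])
y = simulate_frontier(X, beta, 0.0, 0.2, 0.1, rng)  # sigma_v = 0 isolates inefficiency
```

With the noise variance set to zero, every simulated firm lies on or below the frontier `X @ beta`, which is the defining feature of the one-sided error.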
An analysis of transformations
 Journal of the Royal Statistical Society, Series B (Methodological)
, 1964
"... In the analysis of data it is often assumed that observations y1, y2, ..., yn are independently normally distributed with constant variance and with expectations specified by a model linear in a set of parameters θ. In this paper we make the less restrictive assumption that such a normal, homoscedasti ..."
Cited by 1067 (3 self)
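The transformation family Box and Cox analyze is the power family (y^λ − 1)/λ, with log y as the λ → 0 limit, and λ can be chosen by maximizing a profile log-likelihood under the normal model. A sketch using a mean-only model (in the paper λ is estimated jointly with the linear-model parameters; the toy data are invented):

```python
import numpy as np

def box_cox(y, lam):
    """The Box-Cox power transformation: (y**lam - 1)/lam for lam != 0,
    log(y) in the lam -> 0 limit. Requires strictly positive y."""
    y = np.asarray(y, dtype=float)
    if abs(lam) < 1e-8:
        return np.log(y)
    return (y**lam - 1.0) / lam

def profile_loglik(y, lam):
    """Profile log-likelihood of lam under a mean-only normal model, up
    to an additive constant; the (lam - 1) * sum(log y) term is the
    Jacobian of the transformation."""
    z = box_cox(y, lam)
    n = len(y)
    return -0.5 * n * np.log(z.var()) + (lam - 1.0) * np.log(y).sum()

# Grid search for the transformation parameter (toy data)
y = np.array([1.0, 2.0, 3.0, 5.0, 12.0])
grid = np.linspace(-2.0, 2.0, 81)
lam_hat = grid[np.argmax([profile_loglik(y, l) for l in grid])]
```

Setting λ = 1 recovers a shift of the raw data and λ = 0 the log transform, so the grid search interpolates smoothly between "no transformation" and "log".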
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
 In Bayesian Statistics
, 1992
"... Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accurac ..."
Cited by 604 (12 self)
accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper methods from spectral analysis are used to evaluate numerical accuracy formally and construct diagnostics for convergence. These methods are illustrated in the normal linear model
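The convergence diagnostic this entry describes compares an early segment of the chain with a late one, each scaled by its numerical standard error. A hedged sketch: the real diagnostic estimates those errors from the spectral density at frequency zero, while this version uses the naive iid formula, adequate only for weakly autocorrelated draws:

```python
import numpy as np

def geweke_z(chain, first=0.1, last=0.5):
    """Convergence diagnostic in the spirit of Geweke: z-score for the
    difference between the means of an early and a late segment of an
    MCMC chain. Large |z| suggests the chain has not converged. The
    iid variance formula here ignores autocorrelation (a deliberate
    simplification of the spectral-density estimate)."""
    chain = np.asarray(chain, dtype=float)
    n = len(chain)
    a = chain[: int(first * n)]
    b = chain[-int(last * n):]
    nse2 = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(nse2)

rng = np.random.default_rng(1)
z = geweke_z(rng.standard_normal(2000))  # stationary chain: |z| should be small
```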
A Bayesian Framework for the Analysis of Microarray Expression Data: Regularized t-Test and Statistical Inferences of Gene Changes
 Bioinformatics
, 2001
"... Motivation: DNA microarrays are now capable of providing genome-wide patterns of gene expression across many different conditions. The first level of analysis of these patterns requires determining whether observed differences in expression are significant or not. Current methods are unsatisfactory ..."
Cited by 491 (6 self)
due to the lack of a systematic framework that can accommodate noise, variability, and low replication often typical of microarray data. Results: We develop a Bayesian probabilistic framework for microarray data analysis. At the simplest level, we model log-expression values by independent normal
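The regularization the title alludes to can be illustrated by shrinking each gene's sample variance toward a prior value before forming a t-like statistic. A sketch in the spirit of the paper's approach; the hyperparameters `sigma0_sq` and `nu0` are placeholders, and the variance blend is a simplification of the full posterior update:

```python
import math

def regularized_t(x, y, sigma0_sq, nu0):
    """t-like statistic with per-group variances shrunk toward a prior
    value sigma0_sq carrying nu0 pseudo-observations, stabilizing
    inference when each condition has only a few replicates."""
    def shrunk(v):
        n = len(v)
        m = sum(v) / n
        s2 = sum((a - m) ** 2 for a in v) / (n - 1)
        # Weighted blend of prior and sample variance
        return (nu0 * sigma0_sq + (n - 1) * s2) / (nu0 + n - 2), m, n
    vx, mx, nx = shrunk(x)
    vy, my, ny = shrunk(y)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Three replicates per condition (invented intensities)
t = regularized_t([2.1, 2.4, 2.0], [1.1, 1.3, 0.9], sigma0_sq=0.05, nu0=4)
```

With only three replicates an outlying sample variance would dominate an ordinary t-test; the prior weight `nu0` damps that instability.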
Marching cubes: A high resolution 3D surface construction algorithm
 COMPUTER GRAPHICS
, 1987
"... We present a new algorithm, called marching cubes, that creates triangle models of constant density surfaces from 3D medical data. Using a divide-and-conquer approach to generate inter-slice connectivity, we create a case table that defines triangle topology. The algorithm processes the 3D medical d ..."
Cited by 2696 (4 self)
data in scan-line order and calculates triangle vertices using linear interpolation. We find the gradient of the original data, normalize it, and use it as a basis for shading the models. The detail in images produced from the generated surface models is the result of maintaining the inter