Results 1–10 of 777
A quantitative comparison of graph-based models for internet topology
 IEEE/ACM TRANSACTIONS ON NETWORKING
, 1997
Abstract

Cited by 235 (3 self)
Graphs are commonly used to model the topological structure of internetworks, to study problems ranging from routing to resource reservation. A variety of graphs are found in the literature, including fixed topologies such as rings or stars, "well-known" topologies such as the ARPAnet, and randomly generated topologies. While many researchers rely upon graphs for analytic and simulation studies, there has been little analysis of the implications of using a particular model, or how the graph generation method may affect the results of such studies. Further, the selection of one generation method over another is often arbitrary, since the differences and similarities between methods are not well understood. This paper considers the problem of generating and selecting graph models that reflect the properties of real internetworks. We review generation methods in common use, and also propose several new methods. We consider a set of metrics that characterize the graphs produced by a method, and we quantify similarities and differences amongst several generation methods with respect to these metrics. We also consider the effect of the graph model in the context of a specific problem, namely multicast routing.
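One widely used random-topology generator in this literature is the Waxman model. A minimal sketch of that family of generators, with illustrative parameter values that are not taken from the paper:

```python
import math
import random

def waxman_graph(n, alpha=0.4, beta=0.1, seed=0):
    """Waxman random topology: n nodes placed uniformly in the unit square,
    with each pair (u, v) linked with probability
    alpha * exp(-d(u, v) / (beta * L)), where L is the maximum possible
    distance (sqrt(2) in the unit square)."""
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n)]
    L = math.sqrt(2.0)
    edges = []
    for u in range(n):
        for v in range(u + 1, n):
            d = math.dist(pos[u], pos[v])
            if rng.random() < alpha * math.exp(-d / (beta * L)):
                edges.append((u, v))
    return pos, edges

def average_degree(n, edges):
    """One of the simplest metrics used to compare generation methods."""
    return 2 * len(edges) / n
```

Raising alpha increases overall edge density, while raising beta increases the proportion of long edges relative to short ones, which is exactly the kind of knob whose effect on simulation results the paper examines.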
Replicated Microarray Data
 Statistica Sinica
, 2001
Abstract

Cited by 148 (7 self)
cDNA microarrays permit us to study the expression of thousands of genes simultaneously. They are now used in many different contexts to compare mRNA levels between two or more samples of cells. Microarray experiments typically give us expression measurements on a large number of genes, say 10,000–20,000, but with few, if any, replicates for each gene. Traditional methods using means and standard deviations to detect differential expression are not completely satisfactory in this context, and so a different approach seems desirable. In this paper we present an empirical Bayes method for analysing replicated microarray data. Data from all the genes in a replicate set of experiments are combined into estimates of parameters of a prior distribution. These parameter estimates are then combined at the gene level with means and standard deviations to form a statistic B which can be used to decide whether differential expression has occurred. The statistic B avoids the problems of using averages or t-statistics. The method is illustrated using data from an experiment comparing the expression of genes in the livers of SR-BI transgenic mice with that of the corresponding wild-type mice. In addition we present the results of a simulation study estimating the ROC curve of B and three other statistics for determining differential expression: the average and two simple modifications of the usual t-statistic. B was found to be the most powerful of the four, though the margin was not great. The data were simulated to resemble the SR-BI data. Keywords: cDNA microarray, differential expression, empirical Bayes, replication, ROC curve, t-statistic
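The shrinkage idea behind B can be illustrated with a simplified moderated score: each gene's variance is stabilized by pooling information across all genes before forming a t-like ratio. This is only a sketch in the same spirit, not the paper's exact B statistic, and the pooling rule (a median of per-gene variances) is an assumption made here for illustration:

```python
import math
import statistics

def moderated_scores(gene_replicates):
    """For each gene's replicate log-ratios, shrink the per-gene variance
    toward an estimate pooled across all genes, then form a t-like score.
    A simplified empirical-Bayes-flavoured illustration, not the exact
    B statistic of the paper."""
    variances = [statistics.variance(reps) for reps in gene_replicates]
    s0 = statistics.median(variances)  # crude stand-in for the prior
    scores = []
    for reps, v in zip(gene_replicates, variances):
        n = len(reps)
        m = statistics.mean(reps)
        # Adding s0 keeps genes with tiny sample variances from
        # dominating the ranking, the failure mode of the plain t-statistic.
        scores.append(m / math.sqrt((v + s0) / n))
    return scores
```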
MANET simulation studies: The incredibles
 ACM SIGMOBILE Mobile Computing and Communications Review
, 2005
Abstract

Cited by 111 (0 self)
Simulation is the research tool of choice for a majority of the mobile ad hoc network (MANET) community. However, while the use of simulation has increased, the credibility of the simulation results has decreased. To determine the state of MANET simulation studies, we surveyed the 2000–2005 proceedings of the ACM International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc). From our survey, we found significant shortfalls. We present the results of our survey in this paper. We then summarize common simulation study pitfalls found in our survey. Finally, we discuss the tools available that aid the development of rigorous simulation studies. We offer these results to the community with the hope of improving the credibility of MANET simulation-based studies.
The Möbius Framework and Its Implementation
Abstract

Cited by 86 (19 self)
The Möbius framework is an environment for supporting multiple modeling formalisms and solution techniques. Models expressed in formalisms that are compatible with the framework are translated into equivalent models using Möbius framework components. This translation preserves the structure of the models, allowing efficient solutions. The framework is implemented in the tool by a well-defined abstract functional interface. Models and solution techniques interact with one another through the use of the standard interface, allowing them to interact with Möbius framework components, not formalism components. This permits novel combinations of modeling techniques, and will be a catalyst for new research in modeling techniques. This paper describes our approach, focusing on the "atomic model." We describe the formal description of the Möbius components as well as their implementations in our software tool.
A Procedure For Robust Design: Minimizing Variations Caused By Noise Factors And Control Factors
 ASME Journal of Mechanical Design
, 1996
Abstract

Cited by 69 (27 self)
In this paper, we introduce a small variation to current approaches broadly called Taguchi Robust Design Methods. In these methods, there are two broad categories of problems associated with simultaneously minimizing performance variations and bringing the mean on target, namely, Type I: minimizing variations in performance caused by variations in noise factors (uncontrollable parameters), and Type II: minimizing variations in performance caused by variations in control factors (design variables). In this paper, we introduce a variation to the existing approaches to solve both types of problems. This variation embodies the integration of the Response Surface Methodology (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example.
An algorithmic approach to concept exploration in a large knowledge network (automatic thesaurus consultation): symbolic branch-and-bound search vs. connectionist Hopfield net activation
 Journal of the American Society for Information Science
, 1995
Abstract

Cited by 68 (18 self)
This paper presents a framework for knowledge discovery and concept exploration. In order to enhance the concept exploration capability of knowledge-based systems and to alleviate the limitations of the manual browsing approach, we have developed two spreading-activation-based algorithms for concept exploration in large, heterogeneous networks of concepts (e.g., multiple thesauri). One algorithm, which is based on the symbolic AI paradigm, performs a conventional branch-and-bound search on a semantic net representation to identify other highly relevant concepts (a serial, optimal search process). The second algorithm, which is based on the neural network approach, executes the Hopfield net parallel relaxation and convergence process to identify "convergent" concepts for some initial queries (a parallel, heuristic search process). Both algorithms can be adopted for automatic, multiple-thesauri consultation. We tested these two algorithms on a large text-based knowledge network of about 13,000 nodes (terms) and 80,000 directed links in the area of computing technologies. This knowledge network was created from two external thesauri and one automatically generated thesaurus. We conducted experiments to compare the behaviors and performances of the two algorithms with the hypertext-like browsing process. Our experiment revealed that manual browsing achieved higher term recall but lower term precision in comparison to the algorithmic systems. However, it was also a much more laborious and cognitively demanding process. In document retrieval, there were no statistically significant differences in document recall and precision between the algorithms and the manual browsing process. In light of the effort required by the manual browsing process, our proposed algorithmic approach presents a viable option for efficiently traversing large-scale, multiple thesauri (knowledge networks).
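The core mechanism both algorithms build on, spreading activation over a weighted concept network, can be sketched in a few lines. This is a generic illustration of the technique, not the paper's exact branch-and-bound or Hopfield-net procedures; the network, decay, and threshold values are hypothetical:

```python
def spreading_activation(adjacency, seeds, decay=0.5, threshold=0.05,
                         iterations=5):
    """Spread activation from seed concepts through a weighted directed
    concept network; nodes whose activation exceeds the threshold are
    returned as related concepts. adjacency maps each node to a list of
    (neighbour, weight) pairs."""
    activation = {node: 0.0 for node in adjacency}
    for s in seeds:
        activation[s] = 1.0
    for _ in range(iterations):
        nxt = dict(activation)
        for node, neighbours in adjacency.items():
            for neighbour, weight in neighbours:
                # Activation decays with each hop and with link weight.
                nxt[neighbour] = max(nxt[neighbour],
                                     decay * weight * activation[node])
        activation = nxt
    return {n: a for n, a in activation.items()
            if a >= threshold and n not in seeds}
```

Multi-hop neighbours receive progressively weaker activation, which is why the technique surfaces related terms several links away without flooding the whole 13,000-node network.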
The Möbius Modeling Tool
 Proceedings of the 9th International Workshop on Petri Nets and Performance Models
Abstract

Cited by 65 (12 self)
Despite the development of many modeling formalisms and model solution methods, most tool implementations support only a single formalism. Furthermore, models expressed in the chosen formalism cannot be combined with models expressed in other formalisms. This monolithic approach both limits the usefulness of such tools to practitioners, and hampers new and existing formalisms and solvers. This paper describes the method that a new modeling tool, cal led Mobius, uses to eliminate these limitations. Mobius provides an infrastructure to support multiple interacting formalisms and solvers, and is extensible in that new formalisms and solvers can be added to the tool without changing those already implemented. Mobius provides this capability through the use of an abstract functional interface, which provides a formalismindependent interface to models. This allows models expressed in multiple formalisms to interact with each other, and with multiple solvers.
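The idea of a formalism-independent interface can be pictured as a small abstract base class that any model implements and any solver consumes. This is a hypothetical sketch of the pattern; the names do not come from the Möbius tool:

```python
from abc import ABC, abstractmethod

class AtomicModel(ABC):
    """Hypothetical formalism-independent interface: a solver interacts
    with any model only through these methods, never with the internals
    of a particular formalism."""

    @abstractmethod
    def state(self):
        """Return the current model state."""

    @abstractmethod
    def enabled_actions(self):
        """Return the actions (transitions, events, ...) enabled now."""

    @abstractmethod
    def fire(self, action):
        """Apply one action, moving to a successor state."""

class Counter(AtomicModel):
    """A trivial model showing that a solver needs only the interface."""
    def __init__(self):
        self._n = 0
    def state(self):
        return self._n
    def enabled_actions(self):
        return ["increment"] if self._n < 3 else []
    def fire(self, action):
        self._n += 1

def explore(model):
    """A toy 'solver': exhaustively fire actions, collecting states."""
    visited = [model.state()]
    while model.enabled_actions():
        model.fire(model.enabled_actions()[0])
        visited.append(model.state())
    return visited
```

Because `explore` depends only on `AtomicModel`, a Petri-net model, a queueing model, or any other formalism could be dropped in without changing the solver, which is the extensibility property the abstract functional interface is meant to provide.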
Some New Three Level Designs for the Study of Quantitative Variables
, 1960
Abstract

Cited by 64 (0 self)
This article describes some methods which enable us to construct small designs for quantitative factors, while maintaining as much orthogonality of the design as possible. To calculate the D
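The three-level designs this article introduced are now known as Box–Behnken designs. The classic pair-wise construction can be sketched as follows (the choice of pairs and the number of centre runs are illustrative; the article also builds designs from other incomplete block structures):

```python
from itertools import combinations, product

def box_behnken(k, center_runs=3):
    """Three-level design for k quantitative factors: cross a two-level
    factorial over each pair of factors while holding the remaining
    factors at their mid level, then append centre runs."""
    runs = []
    for pair in combinations(range(k), 2):
        for levels in product((-1, 1), repeat=2):
            run = [0] * k
            run[pair[0]], run[pair[1]] = levels
            runs.append(run)
    runs.extend([[0] * k for _ in range(center_runs)])
    return runs
```

For three factors this gives the familiar 15-run design (12 edge points plus 3 centre runs), far smaller than the 27-run full three-level factorial.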
Multi-Modal Identity Verification Using Expert Fusion
 Information Fusion
, 2000
Abstract

Cited by 48 (0 self)
The contribution of this paper is to compare paradigms coming from the classes of parametric and non-parametric techniques to solve the decision fusion problem encountered in the design of a multi-modal biometric identity verification system. The multi-modal identity verification system under consideration is built of d modalities in parallel, each one delivering as output a scalar number, called a score, stating how well the claimed identity is verified. A decision fusion module receiving as input the d scores has to take a binary decision: accept or reject the claimed identity. We have solved this fusion problem using parametric and non-parametric classifiers. The performances of all these fusion modules have been evaluated and compared with other approaches on a multimodal database, containing both vocal and visual biometric modalities. Keywords: Multimodal identity verification, biometrics, decision fusion.
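The fusion module's job, mapping d scores to an accept/reject decision, can be illustrated with the simplest parametric baseline, a weighted sum against a threshold. The weights and threshold here are hypothetical; the paper evaluates richer parametric and non-parametric classifiers for this step:

```python
def fuse_scores(scores, weights, threshold):
    """Decision fusion over d modality scores: accept the claimed
    identity when the weighted sum of the scores exceeds a threshold.
    A minimal linear baseline, not one of the paper's classifiers."""
    assert len(scores) == len(weights)
    fused = sum(w * s for w, s in zip(weights, scores))
    return "accept" if fused > threshold else "reject"
```

In practice the threshold is tuned to trade off false accepts against false rejects on held-out data, which is where trained classifiers outperform a fixed linear rule.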
Imprecision in Engineering Design
 ASME Journal of Mechanical Design
, 1995
Abstract

Cited by 47 (6 self)
Methods for incorporating imprecision in engineering design decision-making are briefly reviewed and compared. A tutorial is presented on the Method of Imprecision (MoI), a formal method, based on the mathematics of fuzzy sets, for representing and manipulating imprecision in engineering design. The results of a design cost estimation example, utilizing a new informal cost specification, are presented. The MoI can provide formal information upon which to base decisions during preliminary engineering design and can facilitate set-based concurrent design.
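The fuzzy-set machinery underlying such methods can be sketched with a triangular preference function and a conservative aggregation rule. This is a generic illustration of fuzzy design preferences, not the MoI's own notation or aggregation operators:

```python
def preference(x, a, b, c):
    """Triangular fuzzy preference: 0 outside [a, c], rising to a peak
    of 1 at b. Used to express how acceptable an imprecise attribute
    value x is to the designer."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def overall_preference(prefs):
    """Conservative ('min') aggregation: a design is only as acceptable
    as its least acceptable attribute."""
    return min(prefs)
```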