Results 1–10 of 79
GADT: A Probability Space ADT for Representing and Querying the Physical World
In ICDE, 2002
Abstract

Cited by 30 (1 self)
Large sensor networks are being widely deployed for measurement, detection, and monitoring applications. Many of these applications involve database systems to store and process data from the physical world. This data has inherent measurement uncertainties that are properly represented by continuous probability distribution functions (pdf's). We introduce a new object-relational data type, the Gaussian ADT (GADT), that models physical data as Gaussian pdf's, and we show that existing index structures can be used as fast access methods for GADT data. We also present a measure-theoretic model of probabilistic data and evaluate GADT in its light.
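As a sketch of the underlying idea (not the paper's object-relational implementation; the class and method names here are illustrative), a Gaussian ADT can store a mean and standard deviation per reading, and a range predicate then returns a probability rather than a boolean:

```python
import math

class GaussianADT:
    """Minimal sketch of a Gaussian probability-space ADT: each stored
    value is a pdf N(mu, sigma^2) rather than a point, and queries
    return probabilities instead of booleans."""

    def __init__(self, mu, sigma):
        self.mu = mu
        self.sigma = sigma

    def cdf(self, x):
        # Normal CDF expressed via the error function.
        return 0.5 * (1.0 + math.erf((x - self.mu) / (self.sigma * math.sqrt(2.0))))

    def prob_in(self, lo, hi):
        # P(lo <= X <= hi): the probabilistic analogue of a range predicate.
        return self.cdf(hi) - self.cdf(lo)

# A sensor reading of 20.0 with measurement noise sigma = 0.5:
reading = GaussianADT(20.0, 0.5)
p = reading.prob_in(19.0, 21.0)  # probability the true value lies in [19, 21]
```

For this reading the query interval spans two standard deviations on each side of the mean, so the returned probability is roughly 0.95, in line with the usual 2-sigma rule.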
Continuity analysis of programs
In SIGPLAN Notices
Abstract

Cited by 28 (7 self)
We present an analysis to automatically determine if a program represents a continuous function, or equivalently, if infinitesimal changes to its inputs can only cause infinitesimal changes to its outputs. The analysis can be used to verify the robustness of programs whose inputs can have small amounts of error and uncertainty, e.g., embedded controllers processing slightly unreliable sensor data, or handheld devices using slightly stale satellite data. Continuity is a fundamental notion in mathematics. However, it is difficult to apply continuity proofs from real analysis to functions that are coded as imperative programs, especially when they use diverse data types and features such as assignments, branches, and loops. We associate data types with metric spaces as opposed to just sets of values, and continuity of typed programs is phrased in terms of these spaces. Our analysis reduces questions about continuity ...
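The distinction the analysis draws can be illustrated with two tiny branching functions (an illustration of the property being verified, not the paper's analysis itself):

```python
def clamp(x, lo, hi):
    # Uses branches, yet is continuous in x: at each branch boundary
    # (x == lo, x == hi) the two sides agree, so an infinitesimal
    # change in x causes only an infinitesimal change in the output.
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

def step(x):
    # Discontinuous at x == 0: the branches disagree at the boundary,
    # so an arbitrarily small change in x can flip the output by 1.
    return 1 if x >= 0 else 0
```

A continuity analysis would accept `clamp` but flag `step`, since a sensor reading hovering near 0 would make `step`'s output unstable under measurement noise.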
Minimal Subset Evaluation: Rapid Warmup for Simulated Hardware State
In Proceedings of the 2001 International Conference on Computer Design, 2001
Abstract

Cited by 23 (1 self)
This paper introduces minimal subset evaluation (MSE) as a way to reduce time spent on large-structure warm-up during the fast-forwarding portion of processor simulations. Warm-up is commonly used prior to full-detail simulation to avoid cold-start bias in large structures like caches and branch predictors. Unfortunately, warm-up can be very time-consuming, often representing 50% or more of total simulation time. Previous techniques have used the entire fast-forward interval to obtain accurate warm-up, which may be prohibitive for large parameter-space searches, or chosen a short but ad-hoc warm-up length that reduces simulation time but may sacrifice accuracy. MSE probabilistically determines a minimally sufficient fraction of the set of fast-forward transactions that must be executed for warm-up to accurately produce state as it would have appeared had the entire fast-forward interval been used for warm-up. The paper describes the mathematical underpinnings of MSE and demonstrates its effectiveness for both single-large-sample and multiple-sample simulation styles. In our experiments, MSE yields errors of less than 1% in IPC measurements with cycle-accurate simulation, while reducing simulation times by an average factor of two or more.
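The motivating observation, that a large structure's state after fast-forwarding is determined almost entirely by a recent fraction of the accesses, can be illustrated with a toy LRU cache (this only illustrates the motivation; it is not the paper's probabilistic MSE machinery, and all names are invented):

```python
import random
from collections import OrderedDict

def warm_cache(trace, capacity):
    """Replay a memory trace through a tiny fully-associative LRU cache
    and return its final contents (the 'warm' state that a detailed
    simulation would start from)."""
    cache = OrderedDict()
    for addr in trace:
        if addr in cache:
            cache.move_to_end(addr)          # refresh recency
        elif len(cache) == capacity:
            cache.popitem(last=False)        # evict the LRU entry
        cache[addr] = True
    return set(cache)

random.seed(0)
trace = [random.randrange(64) for _ in range(10_000)]

full = warm_cache(trace, capacity=16)            # warm up on the whole interval
suffix = warm_cache(trace[-500:], capacity=16)   # warm up on a short suffix only
overlap = len(full & suffix) / 16
```

Warming on only the last 500 of the 10,000 accesses reproduces the same final cache contents as full warm-up, because an LRU cache's final state depends only on the most recent distinct addresses; MSE makes this kind of "how much suffix is enough" question precise and probabilistic.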
Dobbie Z: Processing of gene expression data generated by quantitative real-time RT-PCR. Biotechniques 2002, 32(6):1372-1374, 1376, 1378-1379.
Abstract

Cited by 23 (0 self)
Real-time quantitative PCR represents a highly sensitive and powerful technique for the quantitation of nucleic acids. It has a tremendous potential for the high-throughput analysis of gene expression in research and routine diagnostics. However, the major hurdle is not the practical performance of the experiments themselves but rather the efficient evaluation and the mathematical and statistical analysis of the enormous amount of data gained by this technology, as these functions are not included in the software provided by the manufacturers of the detection systems. In this work, we focus on the mathematical evaluation and analysis of the data generated by real-time quantitative PCR, the calculation of the final results, the propagation of experimental variation of the measured values to the final results, and the statistical analysis. We developed a Microsoft® Excel®-based software application coded in Visual Basic for Applications, called Q-Gene, which addresses these points. Q-Gene manages and expedites the planning, performance, and evaluation of real-time quantitative PCR experiments, as well as the mathematical and statistical analysis, storage, and graphical presentation of the data. The Q-Gene software application is a tool to cope with complex real-time quantitative PCR experiments at a high-throughput scale and considerably expedites and rationalizes the experimental setup, data analysis, and data management while ensuring highest reproducibility.
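The kind of calculation such tools automate can be sketched with the standard efficiency-corrected relative-expression ratio (a common Pfaffl-style formula; the function and the numbers below are illustrative and are not Q-Gene's actual interface):

```python
def relative_expression(e_target, ct_target_control, ct_target_sample,
                        e_ref, ct_ref_control, ct_ref_sample):
    """Efficiency-corrected relative expression ratio: the target gene,
    normalized to a reference gene, sample vs. control.  E is the
    amplification efficiency (2.0 = perfect doubling per cycle); Ct is
    the threshold cycle reported by the instrument."""
    target = e_target ** (ct_target_control - ct_target_sample)
    ref = e_ref ** (ct_ref_control - ct_ref_sample)
    return target / ref

# Target gene crosses threshold 3 cycles earlier in the treated sample,
# reference gene is unchanged: an 8-fold induction at perfect efficiency.
ratio = relative_expression(2.0, 28.0, 25.0, 2.0, 20.0, 20.0)
```

Error propagation, replicate averaging, and statistics over many such ratios are exactly the bookkeeping the abstract says instrument software leaves to the user.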
Three-dimensional segmentation and growth-rate estimation of small pulmonary nodules in helical CT images
In IEEE Trans. Med. Imaging
Abstract

Cited by 15 (2 self)
Outline of a Theory of Strongly Semantic Information
Floridi, L. 1999, Philosophy and Computing – An Introduction (London), 2003
Abstract

Cited by 13 (1 self)
This paper outlines a quantitative theory of strongly semantic information (TSSI) based on truth-values rather than probability distributions. The main hypothesis supported in the paper is that (i) the classic quantitative theory of weakly semantic information (TWSI) is based on probability distributions because (ii) it assumes that truth-values supervene on information, yet (iii) this principle is too weak and generates a well-known semantic paradox, whereas (iv) TSSI, according to which information encapsulates truth, can avoid the paradox and is more in line with the standard conception of what counts as information. After a brief introduction, section two outlines the semantic paradox entailed by TWSI, analysing it in terms of an initial conflict between two requisites of a quantitative theory of semantic information. In section three, three criteria of information equivalence are used to provide a taxonomy of quantitative approaches to semantic information and introduce TSSI. In section four, some further desiderata that should be fulfilled by a quantitative TSSI are explained. From section five to section seven, TSSI is developed on the basis of a calculus of truth-values and semantic discrepancy with respect to a given situation. In section eight, it is shown how TSSI succeeds in solving the paradox. Section nine summarises the main results of the paper and indicates some future developments.
A randomized method for the Shapley value for the voting game
In the Sixth International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS 2007), 2007
Abstract

Cited by 12 (3 self)
The Shapley value is one of the key solution concepts for coalition games. Its main advantage is that it provides a unique and fair solution, but its main problem is that, for many coalition games, the Shapley value cannot be determined in polynomial time. In particular, the problem of finding this value for the voting game is known to be #P-complete in the general case. However, in this paper, we show that there are some specific voting games for which the problem is computationally tractable. For other general voting games, we overcome the problem of computational complexity by presenting a new randomized method for determining the approximate Shapley value. The time complexity of this method is linear in the number of players. We also show, through empirical studies, that the percentage error for the proposed method is always less than 20% and, in most cases, less than 5%.
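A randomized estimator of this kind can be sketched as a Monte Carlo permutation sampler (a generic estimator consistent with the abstract's linear-time-per-sample claim; parameter names and the example game are illustrative, not the paper's exact method):

```python
import random

def approx_shapley(weights, quota, samples=20_000, seed=1):
    """Randomized approximation of the Shapley value for a weighted
    voting game: sample player orderings uniformly and credit the
    pivotal player, i.e. the one whose weight first pushes the growing
    coalition to the quota.  Work per sample is linear in the number
    of players."""
    n = len(weights)
    rng = random.Random(seed)
    value = [0.0] * n
    players = list(range(n))
    for _ in range(samples):
        rng.shuffle(players)
        total = 0
        for p in players:
            before = total
            total += weights[p]
            if before < quota <= total:
                value[p] += 1.0   # p is pivotal in this ordering
                break
    return [v / samples for v in value]

# Toy game: weights [4, 2, 1] with quota 4.  Player 0 alone meets the
# quota and players 1 and 2 together cannot, so player 0 is a dictator
# and the exact Shapley value is [1, 0, 0].
vals = approx_shapley([4, 2, 1], quota=4)
```

Because the estimate for each player is a sample mean of i.i.d. indicator variables, standard concentration bounds govern how many permutations are needed for a target error.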
Combining color and shape information for illumination-viewpoint invariant object recognition
In IEEE Trans. on Image Processing, 2006
Abstract

Cited by 8 (0 self)
In this paper, we propose a new scheme that merges color and shape-invariant information for object recognition. To obtain robustness against photometric changes, color-invariant derivatives are computed first. Color invariance is an important aspect of any object recognition scheme, as color changes considerably with the variation in illumination, object pose, and camera viewpoint. These color-invariant derivatives are then used to obtain similarity-invariant shape descriptors. Shape invariance is equally important as, under a change in camera viewpoint and object pose, the shape of a rigid object undergoes a perspective projection on the image plane. Then, the color and shape invariants are combined in a multidimensional color-shape context which is subsequently used as an index. As the indexing scheme makes use of a color-shape invariant context, it provides a highly discriminative information cue robust against varying imaging conditions. The matching function of the color-shape context allows for fast recognition, even in the presence of object occlusion and clutter. From the experimental results, it is shown that the method recognizes rigid objects with high accuracy in 3D complex scenes and is robust against changing illumination, camera viewpoint, object pose, and noise.
An Analysis of Feasible Solutions for Multi-Issue Negotiation Involving Nonlinear Utility Functions
Abstract

Cited by 6 (0 self)
This paper analyzes bilateral multi-issue negotiation between self-interested agents. Specifically, we consider the case where issues are divisible, there are time constraints in the form of deadlines and discount factors, and the agents have different preferences over the issues. Given these differing preferences, it is possible to reach Pareto-optimal agreements by negotiating all the issues together using a package deal procedure (PDP). However, finding equilibrium strategies for this procedure is not always computationally easy. In particular, if the agents' utility functions are nonlinear, then equilibrium strategies may be hard to compute. In order to overcome this complexity, we explore two different solutions. The first is to use the PDP for linear approximations of the given nonlinear utilities. The second solution is to use a simultaneous procedure (SP) where the issues are discussed in parallel but independently of each other. We then compare these two solutions both in terms of their computational properties (i.e., time complexity of computing an approximate equilibrium and the associated error of approximation) and their economic properties (i.e., the agents' utilities and social welfare of the resulting outcome). By doing so, we show that an approximate equilibrium for the PDP and the SP can be found in polynomial time. In terms of the economic properties, although the PDP is known to generate Pareto-optimal outcomes, we show that, in some cases, which we identify, the SP is better for one of the two agents and also increases the social welfare.
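Why negotiating issues together can beat discussing them independently is visible already in a two-issue toy example with additive linear utilities (an illustration only, not the paper's model or its equilibrium analysis):

```python
# Two divisible issues, additive utilities.  uA[i] / uB[i] is the value
# each agent gets from receiving the whole of issue i; x[i] is A's
# share of issue i (B gets 1 - x[i]).
uA = [4.0, 1.0]   # agent A cares mostly about issue 0
uB = [1.0, 4.0]   # agent B cares mostly about issue 1

def utilities(x):
    a = sum(u * s for u, s in zip(uA, x))
    b = sum(u * (1.0 - s) for u, s in zip(uB, x))
    return a, b

even = utilities([0.5, 0.5])     # issue-by-issue 50/50 split
package = utilities([1.0, 0.0])  # package deal: trade whole issues
```

Splitting each issue 50/50 yields (2.5, 2.5), while the package trade yields (4.0, 4.0): both agents strictly gain, which is why the package deal procedure can reach Pareto-optimal agreements that issue-by-issue splitting misses.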