Results 1 – 10 of 119
Semantic Similarity in a Taxonomy: An Information-Based Measure and its Application to Problems of Ambiguity in Natural Language
, 1999
Abstract

Cited by 416 (9 self)
This article presents a measure of semantic similarity in an IS-A taxonomy based on the notion of shared information content. Experimental evaluation against a benchmark set of human similarity judgments demonstrates that the measure performs better than the traditional edge-counting approach. The article presents algorithms that take advantage of taxonomic similarity in resolving syntactic and semantic ambiguity, along with experimental results demonstrating their effectiveness.

1. Introduction
Evaluating semantic relatedness using network representations is a problem with a long history in artificial intelligence and psychology, dating back to the spreading activation approach of Quillian (1968) and Collins and Loftus (1975). Semantic similarity represents a special case of semantic relatedness: for example, cars and gasoline would seem to be more closely related than, say, cars and bicycles, but the latter pair are certainly more similar. Rada et al. (Rada, Mili, Bicknell, & Blett...
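As a rough illustration of the shared-information-content idea, the sketch below computes a Resnik-style similarity over a tiny hand-made IS-A fragment. The taxonomy, corpus counts, and all names are invented for illustration; the article's own experiments use a large taxonomy and real corpus frequencies.

```python
import math

# Toy IS-A taxonomy (child -> parent); hypothetical, for illustration only.
parent = {
    "car": "vehicle", "bicycle": "vehicle", "gasoline": "substance",
    "vehicle": "artifact", "artifact": "entity", "substance": "entity",
}
counts = {"car": 40, "bicycle": 10, "gasoline": 25}  # hypothetical corpus counts

def ancestors(c):
    """c together with every concept that subsumes it."""
    chain = [c]
    while c in parent:
        c = parent[c]
        chain.append(c)
    return chain

def prob(c):
    """p(c): probability mass of c's subtree in the corpus."""
    total = sum(counts.values())
    return sum(n for w, n in counts.items() if c in ancestors(w)) / total

def resnik_sim(c1, c2):
    """Information content (-log p) of the most informative common subsumer."""
    common = set(ancestors(c1)) & set(ancestors(c2))
    return max(-math.log(prob(c)) for c in common)

# car/bicycle meet at 'vehicle' (informative); car/gasoline only at the root.
print(resnik_sim("car", "bicycle") > resnik_sim("car", "gasoline"))  # True
```

This reproduces the abstract's cars/bicycles vs. cars/gasoline contrast: the pair's similarity is high exactly when their most specific common subsumer is rare, i.e. carries much information.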
SELECTION AND INFORMATION: A CLASS-BASED APPROACH TO LEXICAL RELATIONSHIPS
, 1993
Abstract

Cited by 235 (8 self)
Selectional constraints are limitations on the applicability of predicates to arguments. For example, the statement “The number two is blue” may be syntactically well formed, but at some level it is anomalous — BLUE is not a predicate that can be applied to numbers. According to the influential theory of (Katz and Fodor, 1964), a predicate associates a set of defining features with each argument, expressed within a restricted semantic vocabulary. Despite the persistence of this theory, however, there is widespread agreement about its empirical shortcomings (McCawley, 1968; Fodor, 1977). As an alternative, some critics of the Katz-Fodor theory (e.g. (Johnson-Laird, 1983)) have abandoned the treatment of selectional constraints as semantic, instead treating them as indistinguishable from inferences made on the basis of factual knowledge. This provides a better match for the empirical phenomena, but it opens up a different problem: if selectional constraints are the same as inferences in general, then accounting for them will require a much more complete understanding of knowledge representation and inference than we have at present. The problem, then, is this: how can a theory of selectional constraints be elaborated without first having either an empirically adequate theory of defining features or a comprehensive theory of inference? In this dissertation, I suggest that an answer to this question lies in the representation of conceptual
Predicting the Future of Discrete Sequences From Fractal Representations of the Past
, 2001
Abstract

Cited by 29 (10 self)
We propose a novel approach for building finite memory predictive models similar in spirit to variable memory length Markov models (VLMMs). The models are constructed by first transforming the n-block structure of the training sequence into a geometric structure of points in a unit hypercube, such that the longer the common suffix shared by any two n-blocks, the closer their point representations lie.
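The geometric construction can be illustrated in one dimension for a binary alphabet (a toy version of the paper's unit-hypercube mapping, with the blocks below chosen arbitrarily): each symbol applies a contraction toward 0 or 1, so the most recent symbols dominate the final coordinate, and blocks sharing a suffix of length m land within 2**-m of each other.

```python
def block_point(block, k=0.5):
    """Map a binary n-block (oldest symbol first) to a point in [0, 1]
    by iterating a contraction toward 0 or 1. Later symbols carry the
    larger weight, so two blocks that share a suffix of length m end up
    within 2**-m of each other (for k = 0.5)."""
    x = 0.5
    for s in block:
        x = k * x + (1 - k) * (0.0 if s == "0" else 1.0)
    return x

a = block_point("0101110")
b = block_point("1001110")  # shares the suffix "01110" with a
c = block_point("0101001")  # shares no suffix with a
print(abs(a - b) < abs(a - c))  # True
```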
Spatial Representation of Symbolic Sequences through Iterative Function Systems
 IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans
, 1998
Abstract

Cited by 24 (11 self)
Jeffrey proposed a graphic representation of DNA sequences using Barnsley's iterative function systems. In spite of further developments in this direction, the proposed graphic representation of DNA sequences has been lacking a rigorous connection between its spatial scaling characteristics and the statistical characteristics of the DNA sequences themselves. We 1) generalize Jeffrey's graphic representation to accommodate (possibly infinite) sequences over an arbitrary finite number of symbols, 2) establish a direct correspondence between the statistical characterization of symbolic sequences via Rényi entropy spectra and the multifractal characteristics (Rényi generalized dimensions) of the sequences' spatial representations, 3) show that for general symbolic dynamical systems, the multifractal f_H spectra in the sequence space coincide with the f_H spectra on spatial sequence representations.

Keywords: Multifractal theory, Iterative function systems, Chaos game representation...
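The Rényi entropy spectra mentioned in point 2) can be estimated directly from n-block frequencies. The toy estimator below (not the paper's multifractal machinery; the test sequence is an invented fair-coin sample) shows the quantity being swept as the order q varies.

```python
import math
import random
from collections import Counter

def renyi_block_entropy(seq, n, q):
    """Order-q Rényi entropy (in bits) of the n-block distribution of seq.
    q -> 1 recovers Shannon entropy; sweeping q traces the entropy
    spectrum that the article relates to generalized dimensions."""
    blocks = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(blocks.values())
    ps = [c / total for c in blocks.values()]
    if abs(q - 1.0) < 1e-12:                      # Shannon limit
        return -sum(p * math.log2(p) for p in ps)
    return math.log2(sum(p ** q for p in ps)) / (1 - q)

# For an i.i.d. fair-coin sequence every n-block is (nearly) equally
# likely, so H_q is close to n bits for every q.
random.seed(0)
s = "".join(random.choice("01") for _ in range(20000))
print(renyi_block_entropy(s, 3, 2.0))
```

Rényi entropy is non-increasing in q, so a flat spectrum across q signals a uniform block distribution, while a spread between low-q and high-q values signals multifractality of the corresponding spatial representation.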
Some Foundational Questions Concerning Language
 Journal of Pragmatics
, 1992
Abstract

Cited by 22 (21 self)
First, we argue that the foundations of standard approaches to language studies involve an incoherence in their presuppositions. Second, we present an alternative approach that resolves this incoherence. Third, we discuss how this error manifests itself in categorial grammars and model-theoretic possible worlds semantics. Fourth, we suggest some possible revisions in standard approaches to accommodate them to the alternative that we suggest. We arrive at a fundamentally functional, or pragmatic, conception (an interactive conception) of the nature of language and meaning.
Testing For Nonlinearity Using Redundancies: Quantitative and Qualitative Aspects
 Physica D
, 1995
Abstract

Cited by 19 (7 self)
A method for testing nonlinearity in time series is described, based on information-theoretic functionals (redundancies), linear and nonlinear forms of which allow either qualitative or, after incorporating the surrogate data technique, quantitative evaluation of the dynamical properties of the scrutinized data. An interplay of quantitative and qualitative testing on both the linear and nonlinear levels is analyzed, and the robustness of this combined approach against spurious nonlinearity detection is demonstrated. Evaluation of redundancies and redundancy-based statistics as functions of time lag and embedding dimension can further enhance insight into the dynamics of the system under study.

Keywords: time series, nonlinearity, mutual information, redundancy, surrogate data

1 Introduction
The problem of inferring the dynamics of a system from measured data is a perpetual challenge for time series analysts. Ideas and concepts from nonlinear dynamics and the theory of deterministic chaos have led to a num...
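A minimal sketch of the simplest such functional, the lagged mutual information (the two-variable redundancy), estimated from a histogram. The actual test in the paper additionally needs linear (Gaussian) redundancies and surrogate data; the series below (uniform noise vs. a logistic map) are arbitrary examples.

```python
import math
import random
from collections import Counter

def lagged_mutual_info(x, lag, bins=8):
    """Histogram estimate (in bits) of I(x_t ; x_{t+lag}), the simplest
    two-variable redundancy. Only the nonlinear functional itself is
    shown; a full nonlinearity test also compares against linear
    redundancies and surrogate series."""
    lo, hi = min(x), max(x)
    def bucket(v):
        return min(bins - 1, int((v - lo) / (hi - lo) * bins))
    pairs = [(bucket(x[i]), bucket(x[i + lag])) for i in range(len(x) - lag)]
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(i for i, _ in pairs)
    py = Counter(j for _, j in pairs)
    return sum(c / n * math.log2(c * n / (px[i] * py[j]))
               for (i, j), c in pxy.items())

random.seed(1)
noise = [random.random() for _ in range(5000)]   # no temporal structure
x = [0.3]                                        # logistic map: nonlinear dynamics
for _ in range(4999):
    x.append(4 * x[-1] * (1 - x[-1]))
print(lagged_mutual_info(x, 1) > lagged_mutual_info(noise, 1))  # True
```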
Construction of polygonal interpolants: A maximum entropy approach
 International Journal for Numerical Methods in Engineering
, 2004
Abstract

Cited by 17 (13 self)
In this paper, we establish a link between maximizing (information-theoretic) entropy and the construction of polygonal interpolants. The determination of shape functions on n-gons (n > 3) leads to a non-unique underdetermined system of linear equations. The barycentric coordinates φ_i, which form a partition of unity, are associated with discrete probability measures, and the linear reproducing conditions are the counterpart of the expectations of a linear function. The φ_i are computed by maximizing the uncertainty H(φ_1, φ_2, ..., φ_n) = −∑_{i=1}^n φ_i log φ_i, subject to the above constraints. The description is expository in nature, and the numerical results via the maximum entropy (MAXENT) formulation are compared to those obtained from a few distinct polygonal interpolants. The maximum entropy formulation leads to a feasible solution for φ_i in any convex or non-convex polygon. This study is an instance of the application of the maximum entropy principle, wherein least-biased inference is made on the basis of incomplete information. Copyright © 2004 John Wiley & Sons, Ltd.

KEY WORDS: Shannon entropy; information theory; barycentric coordinates; natural neighbours; Laplace interpolant; meshfree interpolant; data interpolation
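The constrained maximization described above can be sketched with a Newton iteration on the dual variable: in the dual, φ_i ∝ exp(−λ·(x_i − p)), and λ is chosen so the linear reproducing conditions hold. This is a bare-bones illustration of the MAXENT formulation, not the paper's implementation; the node layout and iteration count are invented, and robust solvers add safeguards such as line search.

```python
import math

def maxent_weights(nodes, p, iters=50):
    """Maximum-entropy barycentric coordinates of point p w.r.t. 2-D
    nodes: phi_i proportional to exp(-lam . (x_i - p)), with a Newton
    iteration driving the reproducing residual sum phi_i (x_i - p) to 0."""
    lam = [0.0, 0.0]
    for _ in range(iters):
        w = [math.exp(-lam[0] * (x - p[0]) - lam[1] * (y - p[1]))
             for x, y in nodes]
        Z = sum(w)
        phi = [wi / Z for wi in w]
        # residual of the linear reproducing conditions
        gx = sum(f * (x - p[0]) for f, (x, y) in zip(phi, nodes))
        gy = sum(f * (y - p[1]) for f, (x, y) in zip(phi, nodes))
        # Hessian of log Z = covariance of (x - p) under phi
        hxx = sum(f * (x - p[0]) ** 2 for f, (x, y) in zip(phi, nodes)) - gx * gx
        hyy = sum(f * (y - p[1]) ** 2 for f, (x, y) in zip(phi, nodes)) - gy * gy
        hxy = sum(f * (x - p[0]) * (y - p[1]) for f, (x, y) in zip(phi, nodes)) - gx * gy
        det = hxx * hyy - hxy * hxy
        lam[0] += (hyy * gx - hxy * gy) / det    # Newton step: lam += H^-1 g
        lam[1] += (hxx * gy - hxy * gx) / det
    w = [math.exp(-lam[0] * (x - p[0]) - lam[1] * (y - p[1])) for x, y in nodes]
    return [wi / sum(w) for wi in w]

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
phi = maxent_weights(square, (0.5, 0.5))  # centre: least-biased weights 1/4 each
print([round(f, 3) for f in phi])  # [0.25, 0.25, 0.25, 0.25]
```

By construction the weights are positive, sum to one, and satisfy the linear reproducing conditions, which is what makes the formulation feasible on arbitrary convex polygons.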
Sequence Compaction for Power Estimation: Theory and Practice
 IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN
, 1999
Abstract

Cited by 15 (4 self)
Power estimation has become a critical step in the design of today's integrated circuits (IC's). Power dissipation is strongly input pattern dependent and, hence, to obtain accurate power values one has to simulate the circuit with a large number of vectors that typify the application data. The goal of this paper is to present an effective and robust technique for compacting large sequences of input vectors into much smaller ones such that the power estimates are as accurate as possible and the simulation time is reduced by orders of magnitude. Specifically, this paper introduces the hierarchical modeling of Markov chains as a flexible framework for capturing not only complex spatiotemporal correlations, but also dynamic changes in the sequence characteristics. In addition to this, we introduce and characterize a family of variable-order dynamic Markov models which provide an effective way for accurate modeling of external input sequences that affect the behavior of finite state machines. The new framework is very effective and has a high degree of adaptability. As the experimental results show, large compaction ratios of orders of magnitude can be obtained without significant loss in accuracy (less than 5% on average) for power estimates.
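The core idea, replacing a long input sequence with a much shorter one that preserves its transition statistics, can be sketched with a plain first-order Markov model. This is a toy stand-in for the paper's hierarchical, variable-order models (which capture far richer spatiotemporal correlations); the example sequence and compaction ratio are invented.

```python
import random
from collections import Counter, defaultdict

def compact(seq, out_len, seed=0):
    """Generate a sequence of length out_len whose first-order transition
    statistics mimic those of seq. Only order-1 correlations survive in
    this toy version."""
    trans = defaultdict(Counter)
    for a, b in zip(seq, seq[1:]):
        trans[a][b] += 1                  # learn transition counts
    rng = random.Random(seed)
    out = [rng.choice(seq)]
    while len(out) < out_len:
        nxt = trans[out[-1]]
        if not nxt:                       # dead end: restart anywhere
            out.append(rng.choice(seq))
            continue
        symbols, weights = zip(*nxt.items())
        out.append(rng.choices(symbols, weights=weights)[0])
    return out

# 10,000 'input vectors' compacted 20x with similar symbol statistics.
rng = random.Random(42)
long_seq, state = [], "00"
for _ in range(10000):
    state = rng.choices(["00", "01", "11"], weights=[5, 3, 2])[0] \
        if state == "00" else rng.choices(["00", "11"], weights=[1, 1])[0]
    long_seq.append(state)
short_seq = compact(long_seq, 500)
print(len(short_seq))
```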
Portfolio choice, attention allocation, and price comovement
 Journal of Economic Theory
, 2010
Abstract

Cited by 14 (0 self)
This paper models the attention allocation of portfolio investors. Investors choose the composition of their information subject to an information flow constraint. Given their expected investment strategy in the next period, which is to hold a diversified portfolio, in equilibrium investors choose to observe one linear combination of asset payoffs as a private signal. When investors use this private signal to update information about two assets, changes in one asset affect both asset prices and may lead to asset price comovement. The model also has implications for the transmission of volatility shocks between two assets.
Harmonic Broadcasting Is Bandwidth-Optimal Assuming Constant Bit Rate
 in Proc. Annual ACM-SIAM Symposium on Discrete Algorithms 2002
, 2002
Abstract

Cited by 13 (0 self)
Harmonic broadcasting was introduced by Juhn and Tseng in 1997 as a way to reduce the bandwidth required for video-on-demand broadcasting.
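In the standard harmonic scheme, a video of consumption rate b is cut into n equal segments and segment i loops on a channel of rate b/i, so the total server bandwidth is b times the harmonic number H_n. The sketch below shows that arithmetic (the abstract only states the optimality claim; the 120-minute/30-segment numbers are arbitrary examples):

```python
def harmonic_bandwidth(n):
    """Total server bandwidth, in multiples of the consumption rate b,
    when segment i (i = 1..n) loops on a channel of rate b/i:
    the harmonic number H_n = 1 + 1/2 + ... + 1/n, which grows like ln n."""
    return sum(1.0 / i for i in range(1, n + 1))

def max_startup_delay(n, video_minutes):
    """Worst case a viewer waits: one broadcast slot of the first segment."""
    return video_minutes / n

# A 120-minute video in 30 segments costs about 4x the playback rate
# for a worst-case startup delay of 4 minutes.
n = 30
print(harmonic_bandwidth(n), max_startup_delay(n, 120.0))
```

The logarithmic growth of H_n is what makes the scheme attractive: halving the startup delay (doubling n) adds only about one more unit of ln 2 in bandwidth.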