Results 1-10 of 29
Quantitative languages
Cited by 70 (24 self)
Abstract:
Quantitative generalizations of classical languages, which assign to each word a real number instead of a boolean value, have applications in modeling resource-constrained computation. We use weighted automata (finite automata with transition weights) to define several natural classes of quantitative languages over finite and infinite words; in particular, the real value of an infinite run is computed as the maximum, limsup, liminf, limit average, or discounted sum of the transition weights. We define the classical decision problems of automata theory (emptiness, universality, language inclusion, and language equivalence) in the quantitative setting and study their computational complexity. As the decidability of the language-inclusion problem remains open for some classes of weighted automata, we introduce a notion of quantitative simulation that is decidable and implies language inclusion. We also give a complete characterization of the expressive power of the various classes of weighted automata. In particular, we show that most classes of weighted ...
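The five value functions named in this abstract can be made concrete for an ultimately periodic ("lasso") run, i.e. a finite prefix of weights followed by a cycle repeated forever. A minimal sketch in Python, with function and argument names of our own choosing (the paper defines these measures over arbitrary infinite runs):

```python
def lasso_values(prefix, cycle, lam=0.5):
    """Value of the ultimately periodic run prefix.cycle^omega under the
    maximum, limsup, liminf, limit-average, and discounted-sum measures.
    Illustrative sketch; `lam` is the discount factor (0 < lam < 1)."""
    assert cycle, "the cycle must be non-empty"
    sup_val = max(prefix + cycle)         # maximum over all weights in the run
    limsup_val = max(cycle)               # only weights seen infinitely often matter
    liminf_val = min(cycle)
    limavg_val = sum(cycle) / len(cycle)  # the finite prefix has density zero
    # Discounted sum: finite prefix part plus a geometric series over cycle copies.
    disc_prefix = sum(lam**i * w for i, w in enumerate(prefix))
    disc_cycle = sum(lam**j * w for j, w in enumerate(cycle))
    disc_val = disc_prefix + lam**len(prefix) * disc_cycle / (1 - lam**len(cycle))
    return sup_val, limsup_val, liminf_val, limavg_val, disc_val
```

For example, `lasso_values([3], [1, 2])` evaluates the run with weight sequence 3, 1, 2, 1, 2, ... under all five measures at once.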
Temporal specifications with accumulative values
In LICS, 2011
Cited by 21 (9 self)
Abstract:
There has recently been a significant effort to add quantitative objectives to formal verification and synthesis. We introduce and investigate the extension of temporal logics with quantitative atomic assertions, aiming for a general and flexible framework for quantitative-oriented specifications. At the heart of quantitative objectives lies the accumulation of values along a computation: either the accumulated sum, as with energy objectives, or the accumulated average, as with mean-payoff objectives. We investigate the extension of temporal logics with the prefix-accumulation assertions Sum(v) ≥ c and Avg(v) ≥ c, where v is a numeric variable of the system, c is a constant rational number, and Sum(v) and Avg(v) denote the accumulated sum and average of the values of v from the beginning of the computation up to the current point of time. We also allow the path-accumulation assertions LimInfAvg(v) ≥ c and LimSupAvg(v) ≥ c, referring to the average value along an entire computation. We study the border of decidability for extensions of various temporal logics. In particular, we show that extending the fragment of CTL that has only the EX, EF, AX, and AG temporal modalities with prefix-accumulation assertions, and extending LTL with path-accumulation assertions, results in temporal logics whose model-checking problem is decidable. The extended logics allow us to significantly extend the currently known energy and mean-payoff objectives. Moreover, the prefix-accumulation assertions may be refined with "controlled accumulation", allowing one, for example, to specify constraints on the average waiting time between a request and a grant. On the negative side, we show that the fragment we point to is, in a sense, the maximal logic whose extension with prefix-accumulation assertions permits a decidable model-checking procedure: extending a temporal logic that has the EG or EU modalities, and in particular CTL or LTL, makes the problem undecidable.
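The prefix-accumulation assertions described here can be evaluated pointwise along a trace of v-values. A small sketch with names of our own (the paper interprets these assertions inside temporal-logic formulas over infinite computations):

```python
def prefix_accumulation(values, c, mode="sum"):
    """Truth value of Sum(v) >= c (mode="sum") or Avg(v) >= c (mode="avg")
    at each position of a finite trace of v-values.
    Sketch only; names and the finite-trace restriction are ours."""
    out, total = [], 0
    for i, v in enumerate(values, start=1):
        total += v          # accumulated sum up to the current point of time
        acc = total if mode == "sum" else total / i
        out.append(acc >= c)
    return out
```

For instance, with v recording energy gains and losses, the "sum" mode checks an energy-style objective at every prefix of the trace.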
On the Determinization of Weighted Finite Automata
SIAM J. Comput., 1998
Cited by 21 (0 self)
Abstract:
We study determinization of weighted finite-state automata (WFAs), which has important applications in automatic speech recognition (ASR). We provide the first polynomial-time algorithm to test for the twins property, which determines whether a WFA admits a deterministic equivalent. We also provide a rigorous analysis of a determinization algorithm of Mohri, with tight bounds for acyclic WFAs. Given that WFAs can expand exponentially when determinized, we explore why those used in ASR tend to shrink. The folklore explanation is that ASR WFAs have an acyclic, multipartite structure. We show, however, that there exist such WFAs that always incur exponential expansion when determinized. We then introduce a class of WFAs, also with this structure, whose expansion depends on the weights: some weightings cause them to shrink, while others, including random weightings, cause them to expand exponentially. We provide experimental evidence that ASR WFAs exhibit this weight dependence. ...
Expressiveness and closure properties for quantitative languages
In Proc. of LICS: Logic in Computer Science, IEEE Comp. Soc., 2009
Cited by 20 (9 self)
Abstract:
Weighted automata are nondeterministic automata with numerical weights on transitions. They can define quantitative languages L that assign to each word w a real number L(w). In the case of infinite words, the value of a run is naturally computed as the maximum, limsup, liminf, limit average, or discounted sum of the transition weights. We study expressiveness and closure questions about these quantitative languages. We first show that the set of words with value greater than a threshold can be non-ω-regular for deterministic limit-average and discounted-sum automata, while this set is always ω-regular when the threshold is isolated (i.e., some neighborhood around the threshold contains no word). In the latter case, we prove that the ω-regular language is robust against small perturbations of the transition weights. We next consider automata with transition weights 0 or 1 and show that they are as expressive as general weighted automata in the limit-average case, but not in the discounted-sum case. Third, for quantitative languages L1 and L2, we consider the operations max(L1, L2), min(L1, L2), and 1 − L1, which generalize the boolean operations on languages, as well as the sum L1 + L2. We establish the closure properties of all classes of quantitative languages with respect to these four operations.
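The four operations studied here are pointwise. Modelling a quantitative language as a finite dict from words to values (our simplification for illustration; the paper's languages are total functions on infinite words), they look like:

```python
# Quantitative languages modelled as dicts word -> value: a sketch of the
# pointwise operations max, min, complement (1 - L), and sum from the paper.
# The dict representation and function names are ours.
def qmax(L1, L2): return {w: max(L1[w], L2[w]) for w in L1}
def qmin(L1, L2): return {w: min(L1[w], L2[w]) for w in L1}
def qcomp(L1):    return {w: 1 - v for w, v in L1.items()}
def qsum(L1, L2): return {w: L1[w] + L2[w] for w in L1}
```

The closure question the paper answers is whether, for each automaton class, the result of such an operation is again definable in that class.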
Skew and infinitary formal power series
2005
Cited by 18 (4 self)
Abstract:
We investigate finite-state systems with weights. Departing from the classical theory, in this paper the weight of an action depends not only on the state of the system, but also on the time when it is executed; this reflects the usual human evaluation practice in which later events are considered less urgent and carry less weight than nearer events. We first characterize the terminating behaviors of such systems in terms of rational formal power series. This generalizes a classical result of Schützenberger. Secondly, we deal with non-terminating behaviors and their weights. This includes an extension of the Büchi acceptance condition from finite automata to weighted automata and provides a characterization of these non-terminating behaviors in terms of ω-rational formal power series. This generalizes a classical theorem of Büchi.
Finite Automata Based Compression of Bilevel and Simple Color Images
1996
Cited by 17 (1 self)
Abstract:
We introduce generalized finite automata as a tool for the specification of bilevel images. We describe an inference algorithm for generalized finite automata and a lossy compression system for bilevel and simple color images based on this algorithm and vector quantization. A bilevel multiresolution image is specified by assigning the value 0 or 1 to every node of the infinite quadtree. If the outgoing edges of each node of the quadtree are labeled 0, 1, 2, 3, we get a uniquely labeled path to every node; its label is called the address of the node. The address of a node at depth k is a string of length k over the alphabet {0, 1, 2, 3}. Hence, a bilevel multiresolution image can be specified as a subset of strings over the alphabet {0, 1, 2, 3}, namely the collection of the addresses of the nodes assigned value 1 (black). Regular sets of strings are specified by finite automata or regular expressions [13]. Therefore, finite automata can be used to specify (regular) ...
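The addressing scheme described in this abstract is easy to make concrete: the depth-k quadtree node containing a given pixel has an address that is a length-k word over {0, 1, 2, 3}. A sketch, assuming the quadrant labeling 0 = NW, 1 = NE, 2 = SW, 3 = SE (the paper fixes its own convention):

```python
def quadtree_address(row, col, depth):
    """Address, as a word over {0, 1, 2, 3}, of the depth-`depth` quadtree
    node containing pixel (row, col) of a 2**depth x 2**depth image.
    The digit-to-quadrant mapping (0=NW, 1=NE, 2=SW, 3=SE) is our assumption."""
    digits = []
    size = 1 << depth
    for _ in range(depth):
        size >>= 1                       # halve the current quadrant size
        d = 0
        if col >= size: d += 1; col -= size   # right half adds 1
        if row >= size: d += 2; row -= size   # bottom half adds 2
        digits.append(str(d))
    return "".join(digits)
```

The set of addresses of all black pixels at a fixed resolution is then exactly the string set that a finite automaton would be inferred for.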
Algorithms on Compressed Strings and Arrays
In Proc. 26th Ann. Conf. on Current Trends in Theory and Practice of Informatics, 1999
Cited by 17 (0 self)
Abstract:
We survey the complexity issues related to several algorithmic problems for compressed one- and two-dimensional texts without explicit decompression: pattern matching, equality testing, computation of regularities, subsegment extraction, language membership, and solvability of word equations. Our basic problem is one- and two-dimensional pattern matching together with its variations. For some types of compression the pattern-matching problems are infeasible (NP-hard); for other types they are solvable in polynomial time, and we discuss how to reduce the degree of the corresponding polynomials. In the last decade a new stream of research related to data compression has emerged: algorithms on compressed objects. It has been caused by the increase in the volume of data and the need to store and transmit masses of information in compressed form. The compressed information has to be quickly accessed and processed without explicit decompression. In this paper we consider several ...
Probabilistic Weighted Automata
Cited by 10 (3 self)
Abstract:
Nondeterministic weighted automata are finite automata with numerical weights on transitions. They define quantitative languages L that assign to each word w a real number L(w). The value of an infinite word w is computed as the maximal value of all runs over w, and the value of a run as the maximum, limsup, liminf, limit average, or discounted sum of the transition weights. We introduce probabilistic weighted automata, in which the transitions are chosen in a randomized (rather than nondeterministic) fashion. Under almost-sure semantics (resp. positive semantics), the value of a word w is the largest real v such that the runs over w have value at least v with probability 1 (resp. positive probability). We study the classical questions of automata theory for probabilistic weighted automata: emptiness and universality, expressiveness, and closure under various operations on languages. For quantitative languages, emptiness and universality are defined as whether the value of some (resp. every) word exceeds a given threshold. We prove some of these questions to be decidable, and others undecidable. Regarding expressive power, we show that probabilities allow us to define a wide variety of new classes of quantitative languages, except for discounted-sum automata, where probabilistic choice is no more expressive than nondeterminism. Finally, we give an almost complete picture of the closure of various classes of probabilistic weighted automata under the following pointwise operations on quantitative languages: max, min, sum, and numerical complement.
Efficiency of Fast Parallel Pattern-Searching in Highly Compressed Texts
Cited by 2 (0 self)
Abstract:
We consider the efficiency of NC algorithms for pattern searching in highly compressed one- and two-dimensional texts. "Highly compressed" means that the text can be exponentially large with respect to its compressed version, and "fast" means "in polylogarithmic time". Given an uncompressed pattern P and a compressed version of a text T, the compressed matching problem is to test whether P occurs in T. Two types of closely related compressed representations of one-dimensional texts are considered: Lempel-Ziv encodings (LZ, in short) and restricted LZ encodings (RLZ, in short). For highly compressed texts there is a small difference between them; in extreme situations both of them compress text exponentially, e.g. Fibonacci words of size N have compressed versions of size O(log N) for LZ and restricted LZ encodings. An efficient sequential algorithm for LZ-compressed matching was given in [7]; we show that this algorithm is inherently sequential. Despite the similarities, we prove that LZ-compressed m...
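The Fibonacci-word example mentioned in this abstract is easy to reproduce: the n-th Fibonacci word is built by an n-step recurrence but has length equal to the n-th Fibonacci number, so a word of length N is generated from O(log N) rules, which is why its LZ-style compressed form is so small. A sketch (this is one common variant of the definition; the paper's exact variant may differ):

```python
def fibonacci_word(n):
    """n-th Fibonacci word: w1 = "b", w2 = "a", w_k = w_{k-1} + w_{k-2}.
    len(fibonacci_word(n)) is the n-th Fibonacci number, so the word's
    length grows exponentially in the number of recurrence steps."""
    prev, cur = "b", "a"
    for _ in range(n - 2):
        prev, cur = cur, cur + prev   # concatenate the two previous words
    return cur if n >= 2 else prev
```

Storing only the recurrence step count instead of the word itself is the O(log N)-size description the abstract refers to.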
String Transformation for n-Dimensional Image Compression
Springer-Verlag Berlin Heidelberg, 2002
Cited by 2 (0 self)
Abstract:
Image compression and manipulation by weighted finite automata exploit similarities in the images in order to obtain notable compression ratios and manipulation tools. The investigations are often based on two-dimensional images. A natural ...