Results 1–10 of 24
Alphabet Independent and Dictionary Scaled Matching
, 1996
"... The rapidly growing need for analysis of digitized images in multimedia systems has lead to a variety of interesting problems in multidimensional pattern matching. One of the problems is that of scaled matching, finding all appearances of a pattern in a text in all sizes. Another important proble ..."
Abstract

Cited by 11 (4 self)
 Add to MetaCart
The rapidly growing need for analysis of digitized images in multimedia systems has led to a variety of interesting problems in multidimensional pattern matching. One of these problems is scaled matching: finding all appearances of a pattern in a text in all sizes. Another important problem is dictionary matching: quickly searching a dictionary of preprocessed patterns in order to find all dictionary patterns that appear in the input text. In this paper we provide a simple algorithm for two-dimensional scaled matching. Our algorithm is the first linear-time alphabet-independent scaled matching algorithm. Its running time is O(|T|), where |T| is the text size, and is independent of |Σ|, the size of the alphabet. The main idea behind our algorithm is identifying and exploiting a scaling-invariant property of patterns. Our technique generalizes to produce the first known algorithm for scaled dictionary matching. We can find all appearances of all dictionary pa...
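As a point of reference for the problem this abstract defines, here is a brute-force sketch of two-dimensional scaled matching: it rescales the pattern by every integer factor and verifies each placement directly. This naive check is far slower than the paper's linear-time O(|T|) algorithm; all names are illustrative.

```python
def scale(pattern, k):
    """Replicate every cell of a 2D pattern k times in each dimension."""
    return [[cell for cell in row for _ in range(k)]
            for row in pattern for _ in range(k)]

def occurrences_all_scales(text, pattern):
    """Return (row, col, k) for each occurrence of pattern scaled by k."""
    n, m = len(text), len(pattern)
    hits = []
    for k in range(1, n // m + 1):
        pk = scale(pattern, k)
        side = m * k
        for i in range(n - side + 1):
            for j in range(len(text[0]) - side + 1):
                if all(text[i + r][j:j + side] == pk[r] for r in range(side)):
                    hits.append((i, j, k))
    return hits
```

For an m × m pattern in an n × n text this costs O(n^2 m^2) per scale, which is exactly the overhead the paper's scaling-invariant property is designed to avoid.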
In-place Run-Length 2D Compressed Search
 In Proceedings of the 11th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2000
, 2000
"... The recent explosion in the amount of stored data has necessitated the storage and transmission of data in compressed form. The need to quickly access this data has given rise to a new paradigm in searching, that of compressed matching [1, 8, 10]. The goal of the compressed pattern matching problem ..."
Abstract

Cited by 11 (1 self)
 Add to MetaCart
The recent explosion in the amount of stored data has necessitated the storage and transmission of data in compressed form. The need to quickly access this data has given rise to a new paradigm in searching, that of compressed matching [1, 8, 10]. The goal of the compressed pattern matching problem is to find a pattern in a text without decompressing the text. The criterion of extra space is very relevant to compressed searching. An algorithm is called in-place if the amount of extra space used is proportional to the input size of the pattern. In this paper we present a 2D compressed matching algorithm that is in-place. Let compressed(T) and compressed(P) denote the compressed text and pattern, respectively. The algorithm presented in this paper runs in time O(|compressed(T)| + |P| log σ), where σ is min(|P|, |Σ|) and Σ is the alphabet, for all patterns that have no trivial rows (rows consisting of a single repeating symbol). The amount of space used is O(|compressed(P)|). The compression used is the 2D run-length compression used in FAX transmission.
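A toy row-wise run-length coder in the spirit of the fax-style 2D compression the abstract mentions: each row becomes a list of (symbol, run_length) pairs. Function names are illustrative, not taken from the paper.

```python
def rle_compress(image):
    """Compress each row of a 2D image into (symbol, run_length) pairs."""
    compressed = []
    for row in image:
        runs, i = [], 0
        while i < len(row):
            j = i
            while j < len(row) and row[j] == row[i]:
                j += 1                      # extend the current run
            runs.append((row[i], j - i))
            i = j
        compressed.append(runs)
    return compressed

def rle_decompress(compressed):
    """Invert rle_compress."""
    return [[sym for sym, count in runs for _ in range(count)]
            for runs in compressed]
```

Note that a "trivial row" in the abstract's sense compresses to a single pair, which is why such rows are excluded from the stated time bound.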
Two and Higher Dimensional Pattern Matching in Optimal Expected Time
, 1994
"... Algorithms with optimal expected running time are presented for searching the occurrences of a twodimensional m × m pattern P in a twodimensional n × n text T over an alphabet of size c. The algorithms are based on placing in the text a static grid of test points, determined on ..."
Abstract

Cited by 9 (4 self)
 Add to MetaCart
Algorithms with optimal expected running time are presented for searching the occurrences of a two-dimensional m × m pattern P in a two-dimensional n × n text T over an alphabet of size c. The algorithms are based on placing in the text a static grid of test points, determined only by n, m and c (not dynamically by earlier test results). Using test strings read from the test points, the algorithms eliminate as many potential occurrences of P as possible. The remaining potential occurrences are separately checked for actual occurrences. A suitable choice of the test point set leads to algorithms with expected running time O(n^2 log_c m^2 / m^2) using the uniform Bernoulli model of randomness. This is shown to be optimal by a generalization of a one-dimensional lower bound result by Yao. Experimental results show that the algorithms are efficient in practice, too. The method is also generalized for the k mismatches problem. The resulting algorithm has expected running ti...
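A one-dimensional analog of the static test-point idea sketched above: read a short test string at fixed grid positions, derive candidate alignments from where it occurs inside the pattern, and verify only those candidates. The grid spacing m - q + 1 guarantees every occurrence of the pattern covers at least one full test string. Names and the choice q = 3 are illustrative.

```python
def grid_filter_search(text, pat, q=3):
    """Filter with q-length test strings on a static grid, then verify."""
    n, m = len(text), len(pat)
    q = min(q, m)
    step = m - q + 1                       # grid spacing of test points
    offsets = {}                           # q-gram -> offsets inside pat
    for o in range(m - q + 1):
        offsets.setdefault(pat[o:o + q], []).append(o)
    hits = set()
    for g in range(0, n - q + 1, step):    # static grid, fixed in advance
        for o in offsets.get(text[g:g + q], []):
            start = g - o
            if 0 <= start <= n - m and text[start:start + m] == pat:
                hits.add(start)
    return sorted(hits)
```

On random text most grid reads match nothing inside the pattern, so only a small fraction of positions is ever verified; this is the intuition behind the expected-time bound, not the paper's exact construction.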
Approximate parameterized matching
 In Proc. 12th European Symposium on Algorithms (ESA)
, 2004
"... Abstract Two equal length strings s and s0, over alphabets \Sigma s and \Sigma s0, parameterize match if thereexists a bijection ss: \Sigma s! \Sigma s0, such that ss(s) = s0, where ss(s) is the renaming of each characterof s via ss. Parameterized matching is the problem of finding all parameterize ..."
Abstract

Cited by 8 (3 self)
 Add to MetaCart
Two equal-length strings s and s′, over alphabets Σ_s and Σ_s′, parameterize-match if there exists a bijection π: Σ_s → Σ_s′ such that π(s) = s′, where π(s) is the renaming of each character of s via π. Parameterized matching is the problem of finding all parameterized matches of a pattern string p in a text t, and approximate parameterized matching is the problem of finding, at each location, a bijection π that maximizes the number of characters that are mapped from p to the appropriate |p|-length substring of t. Parameterized matching was introduced as a model for software duplication detection in software maintenance systems and also has applications in image processing and computational biology. For example, approximate parameterized matching models image searching with variable color maps in the presence of errors. We consider the problem for which an error threshold k is given and the goal is to find all locations in t for which there exists a bijection π that maps p into the appropriate |p|-length substring of t with at most k mismatched mapped elements. We show that (1) approximate parameterized matching, when |p| = |t|, is equivalent to the maximum matching problem on graphs, implying that (2) maximum matching is reducible to approximate parameterized matching with threshold k, up to an O(log |t|) factor (this can be achieved by reducing approximate parameterized matching to the problem by using a binary search on the k's). Given the best known maximum matching algorithms, an O(m^1.5) bound, where m = |p| = |t|, is implied for approximate parameterized matching. We show that (3) for the k-threshold problem we can do this in O(m + k^1.5). Our main result (4) is an O(nk^1.5 + mk log m) time algorithm, where m = |p| and n = |t|.
1 Introduction
In the traditional pattern matching model [11, 19], one seeks exact occurrences of a given pattern p in a text t, i.e. text locations where every text symbol is equal to its corresponding pattern symbol. For two equal-length strings
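A minimal check for an exact parameterized match between two equal-length strings, following the definition above: a single pass maintains the bijection π in both directions. The approximate variant with threshold k, which the paper reduces to maximum matching, is not implemented here.

```python
def parameterized_match(s, t):
    """True iff some bijection pi over the alphabets satisfies pi(s) == t."""
    if len(s) != len(t):
        return False
    fwd, bwd = {}, {}          # pi and its inverse, built on the fly
    for a, b in zip(s, t):
        # setdefault records the first mapping seen; any later conflict
        # in either direction means no bijection can exist
        if fwd.setdefault(a, b) != b or bwd.setdefault(b, a) != a:
            return False
    return True
```

For example, "aab" parameterize-matches "xxy" (via a→x, b→y) but not "xyz", since 'a' cannot map to two symbols.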
Two-Dimensional Periodicity in Rectangular Arrays
 Proc. of the 3rd ACM-SIAM Symposium on Discrete Algorithms
, 1992
"... String matching is rich with a variety of algorithmic tools. In contrast, multidimensional matching has had a rather sparse set of techniques. This paper presents a new algorithmic technique for twodimensional matching: periodicity analysis. Its strength appears to lie in the fact that it is inhere ..."
Abstract

Cited by 8 (1 self)
 Add to MetaCart
String matching is rich with a variety of algorithmic tools. In contrast, multidimensional matching has had a rather sparse set of techniques. This paper presents a new algorithmic technique for two-dimensional matching: periodicity analysis. Its strength appears to lie in the fact that it is inherently two-dimensional. Periodicity in strings has been used to solve string matching problems. Multidimensional periodicity, though, is not as simple as it is in strings and was not formally studied or used in pattern matching. In this paper, we define and analyze two-dimensional periodicity in rectangular arrays. One definition of string periodicity is that a periodic string can self-overlap in a particular way. An analogous concept is true in two dimensions. The self-overlap vectors of a rectangle generate a regular pattern of locations where the rectangle may originate. Based on this regularity, we define four categories of periodic arrays: non-periodic, lattice-periodic, line-periodic and...
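A small illustration of the self-overlap idea the abstract describes: a shift (dx, dy) is a candidate period vector of a rectangular array exactly when the array matches itself under that shift. Non-negative shifts only; the function name is illustrative.

```python
def self_overlaps(A, dx, dy):
    """True iff A[i][j] == A[i+dx][j+dy] wherever both cells exist."""
    rows, cols = len(A), len(A[0])
    return all(A[i][j] == A[i + dx][j + dy]
               for i in range(rows - dx)
               for j in range(cols - dy))
```

On a checkerboard, for instance, the diagonal shift (1, 1) is a period vector while (0, 1) is not; the paper's four categories classify arrays by the structure of the full set of such vectors.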
Optimal exact and fast approximate two-dimensional pattern matching allowing rotations
 In Proc. 13th Annual Symposium on Combinatorial Pattern Matching (CPM 2002), LNCS 2373
, 2002
"... Abstract. We give fast filtering algorithms to search for a 2 dimensional pattern in a 2dimensional text allowing any rotation of the pattern. We consider the cases of exact and approximate matching under several matching models, improving the previous results. For a text of size n \Theta n charac ..."
Abstract

Cited by 7 (2 self)
 Add to MetaCart
We give fast filtering algorithms to search for a 2-dimensional pattern in a 2-dimensional text allowing any rotation of the pattern. We consider the cases of exact and approximate matching under several matching models, improving the previous results. For a text of size n × n characters and a pattern of size m × m characters, the exact matching takes average time O(n^2 log m / m^2), which is optimal. If we allow k mismatches of characters, then our best algorithm achieves O(n^2 k log m / m^2) average time, for reasonable k values. For large k, we obtain an O(n^2 k^{3/2} √(log m) / m) average time algorithm. We generalize
Multidimensional Pattern Matching: A Survey
, 1992
"... We review some recent algorithms motivated by computer vision. The problem inspiring this research is that of searching an aerial photograph for all appearances of some object. The issues we discuss are local errors, scaling, compression and dictionary matching. We review deterministic serial te ..."
Abstract

Cited by 7 (0 self)
 Add to MetaCart
We review some recent algorithms motivated by computer vision. The problem inspiring this research is that of searching an aerial photograph for all appearances of some object. The issues we discuss are local errors, scaling, compression and dictionary matching. We review deterministic serial techniques that are used for multidimensional pattern matching and discuss their strengths and weaknesses. College of Computing, Georgia Institute of Technology, Atlanta, Georgia 30332-0280. Partially supported by NSF grant IRI-9013055.
1 Motivation
String Matching is one of the most widely studied problems in computer science [Gal85]. Part of its appeal is in its direct applicability to "real world" problems. The Knuth-Morris-Pratt [KMP77] algorithm is directly implemented in the emacs "s" and UNIX "grep" commands. The longest common subsequence dynamic programming algorithm [CKK72] is implemented in the UNIX "diff" command. The largest overlap heuristic for finding the shortest common s...
Fast Parallel String Prefix-Matching
 Theoret. Comput. Sci.
, 1992
"... An O(log log m) time n log m log log m processor CRCWPRAM algorithm for the string prefixmatching problem over a general alphabet is presented. The algorithm can also be used to compute the KMP failure function in O(log log m) time on m log m log log m processors. These results improve on th ..."
Abstract

Cited by 6 (2 self)
 Add to MetaCart
An O(log log m) time, n log m / log log m processor CRCW-PRAM algorithm for the string prefix-matching problem over a general alphabet is presented. The algorithm can also be used to compute the KMP failure function in O(log log m) time on m log m / log log m processors. These results improve on the running time of the best previous algorithm for both problems, which was O(log m), while preserving the same number of operations.
1 Introduction
String matching is the problem of finding all occurrences of a short pattern string P[1..m] in a longer text string T[1..n]. The classical sequential algorithm of Knuth, Morris and Pratt [12] solves the string matching problem in time that is linear in the length of the input strings. The Knuth-Morris-Pratt [12] string matching algorithm can be easily generalized to find the longest pattern prefix that starts at each text position within the same time bound. We refer to this problem as string prefix-matching. In parallel, the string matching p...
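For reference, this is the KMP failure function the abstract refers to, in its standard sequential linear-time form (the paper's contribution is computing it in O(log log m) parallel time, which is not attempted here).

```python
def kmp_failure(p):
    """failure[i] = length of the longest proper border of p[:i+1]."""
    failure = [0] * len(p)
    k = 0                              # length of current border
    for i in range(1, len(p)):
        while k and p[i] != p[k]:
            k = failure[k - 1]         # fall back to a shorter border
        if p[i] == p[k]:
            k += 1
        failure[i] = k
    return failure
```

The same table drives prefix-matching: scanning the text with it yields, at each position, the longest pattern prefix starting there, within the same linear time bound.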
On the Comparison Complexity of the String Prefix-Matching Problem
 In Proc. 2nd European Symposium on Algorithms, number 855 in Lecture Notes in Computer Science
, 1995
"... In this paper we study the exact comparison complexity of the string prefixmatching problem in the deterministic sequential comparison model with equality tests. We derive almost tight lower and upper bounds on the number of symbol comparisons required in the worst case by online prefixmatchi ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
In this paper we study the exact comparison complexity of the string prefix-matching problem in the deterministic sequential comparison model with equality tests. We derive almost tight lower and upper bounds on the number of symbol comparisons required in the worst case by online prefix-matching algorithms for any fixed pattern and variable text. Unlike previous results on the comparison complexity of string-matching and prefix-matching algorithms, our bounds are almost tight for any particular pattern. We also consider the special case where the pattern and the text are the same string. This problem, which we call the string self-prefix problem, is similar to the pattern preprocessing step of the Knuth-Morris-Pratt string-matching algorithm that is used in several comparison-efficient string-matching and prefix-matching algorithms, including in our new algorithm. We obtain roughly tight lower and upper bounds on the number of symbol comparisons required in the worst case...
On a Conjecture on Bidimensional Words
, 1999
"... We prove that, given a double sequence w over the alphabet A (i.e. a mapping from Z Z 2 to A), if there exists a pair (n 0 ; m 0 ) 2 Z Z 2 such that p w (n 0 ; m 0 ) ! 1 100 n 0 m 0 , then w has a periodicity vector, where p w of w is the complexity function of w. 1 Introduction In combinat ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
We prove that, given a double sequence w over the alphabet A (i.e. a mapping from Z^2 to A), if there exists a pair (n_0, m_0) ∈ Z^2 such that p_w(n_0, m_0) < (1/100) n_0 m_0, then w has a periodicity vector, where p_w is the complexity function of w.
1 Introduction
In combinatorics on words the notions of complexity and periodicity are of fundamental importance. The complexity function of a formal language counts, for any natural number n, the number of words of length n in the language. The complexity function of a word (finite, infinite, bi-infinite) is the complexity function of the formal language whose elements are all the factors (or blocks, or also subwords) of the word. The Morse-Hedlund Theorem states that there exists an important relationship between periodicity and complexity. In particular it states that for any bi-infinite word w, if the number of its different factors of length n is less
Dipartimento di Matematica ed Applicazioni, Università degl...
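A small sketch of the rectangular complexity function p_w(n, m) used in the statement above, for a finite 2D array: it counts the distinct n × m blocks occurring in w. The function name is illustrative.

```python
def block_complexity(w, n, m):
    """p_w(n, m): number of distinct n-by-m blocks in the finite array w."""
    blocks = set()
    for i in range(len(w) - n + 1):
        for j in range(len(w[0]) - m + 1):
            # freeze each block as a tuple of row tuples so it is hashable
            blocks.add(tuple(tuple(w[i + r][j:j + m]) for r in range(n)))
    return len(blocks)
```

On a 3 × 3 checkerboard, for example, there are only two distinct 2 × 2 blocks, reflecting the very low complexity that, by results of the kind proved in the paper, forces a periodicity vector.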