Results 1–6 of 6
Beyond Induction Variables: Detecting and Classifying Sequences Using a Demand-driven SSA Form
 ACM Transactions on Programming Languages and Systems
, 1995
"... this paper we present a practical technique for detecting a broader class of linear induction variables than is usually recognized, as well as several other sequence forms, including periodic, polynomial, geometric, monotonic, and wraparound variables. Our method is based on Factored UseDef (FUD) ..."
Abstract

Cited by 106 (5 self)
In this paper we present a practical technique for detecting a broader class of linear induction variables than is usually recognized, as well as several other sequence forms, including periodic, polynomial, geometric, monotonic, and wraparound variables. Our method is based on Factored Use-Def (FUD) chains, a demand-driven representation of the popular Static Single Assignment form. In this form, strongly connected components of the associated SSA graph correspond to sequences in the source program: we describe a simple yet efficient algorithm for detecting and classifying these sequences. We have implemented this algorithm in Nascent, our restructuring Fortran 90+ compiler, and we present some results showing the effectiveness of our approach.
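The abstract's central observation is that cycles (strongly connected components) in the SSA def-use graph mark candidate sequences such as induction variables. A minimal sketch of that idea follows; the toy SSA graph and the Tarjan-style traversal are illustrative assumptions, not the paper's Nascent implementation:

```python
# Sketch: strongly connected components in an SSA-style def-use graph mark
# candidate sequences (e.g. induction variables). The graph below models
#   i0 = 0;  loop: i1 = phi(i0, i2);  i2 = i1 + 4
# and is an assumed toy example, not output of a real compiler front end.

def sccs(graph):
    """Tarjan's algorithm: return the strongly connected components of `graph`."""
    index, low, on_stack, stack, out = {}, {}, set(), [], []
    counter = [0]

    def visit(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            out.append(comp)

    for v in graph:
        if v not in index:
            visit(v)
    return out

# Def-use edges: each SSA name points at the names that use it.
ssa = {"i0": ["i1"], "i1": ["i2"], "i2": ["i1"]}

# A component with more than one node is a cycle through a phi node,
# i.e. a candidate sequence to classify (here, the {i1, i2} cycle).
cycles = [c for c in sccs(ssa) if len(c) > 1]
print(cycles)
```

Classification would then inspect the operations inside each cycle: a phi plus an addition of a loop-invariant value, as here, is a linear induction variable.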
Optimizing array bound checks using flow analysis
 ACM Letters on Programming Languages and Systems
, 1993
"... Bound checks are introduced in programs for the runtime detection of array bound violations. Compiletime optimizations are employed to reduce the executiontime overhead due to bound checks. The optimizations reduce the program execution time through elimination of checks and propagation of checks ..."
Abstract

Cited by 73 (4 self)
Bound checks are introduced in programs for the runtime detection of array bound violations. Compile-time optimizations are employed to reduce the execution-time overhead due to bound checks. The optimizations reduce the program execution time through elimination of checks and propagation of checks out of loops. An execution of the optimized program terminates with an array bound violation if and only if the same outcome would have resulted during the execution of the program containing all array bound checks. However, the point at which the array bound violation occurs may not be the same. Experimental results indicate that the number of bound checks performed during the execution of a program is greatly reduced using these techniques. Categories and Subject Descriptors: D.2.5 [Software Engineering]: Testing and Debugging—error handling and recovery; D.3.4 [Programming Languages]: Processors—compilers, optimization
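The "propagation of checks out of loops" mentioned above can be illustrated with a small sketch. This is a hand-written analogy, not the paper's flow analysis: when the subscript is known to run over [0, n), the per-iteration checks collapse to one check outside the loop.

```python
# Hedged illustration of hoisting bound checks out of a loop.
# Function names are assumptions for this sketch.

def sum_checked(a, n):
    """Naive version: a bound check executes on every iteration."""
    total = 0
    for i in range(n):
        if not (0 <= i < len(a)):          # n checks at runtime
            raise IndexError(f"index {i} out of bounds")
        total += a[i]
    return total

def sum_hoisted(a, n):
    """Optimized version: i covers [0, n), so checking n - 1 once suffices."""
    if n > 0 and not (n - 1 < len(a)):     # single check, hoisted out
        raise IndexError(f"index {n - 1} out of bounds")
    total = 0
    for i in range(n):
        total += a[i]
    return total
```

Both versions fail exactly when the original would, matching the abstract's if-and-only-if guarantee, though as the abstract notes the hoisted version reports the violation before the loop rather than at the offending iteration.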
Monotonic Evolution: An Alternative to Induction Variable Substitution for Dependence Analysis
"... We present a new approach to dependence testing in the presence of induction variables. Instead of looking for closed form expressions, our method computes monotonic evolution which captures the direction in which the value of a variable changes. This information is then used in the dependence test ..."
Abstract

Cited by 26 (5 self)
We present a new approach to dependence testing in the presence of induction variables. Instead of looking for closed form expressions, our method computes monotonic evolution, which captures the direction in which the value of a variable changes. This information is then used in the dependence test to help determine whether array references are dependence-free. Under this scheme, closed form computation and induction variable substitution can be delayed until after the dependence test and be performed on demand. To improve computational efficiency, we also propose an optimized (non-iterative) dataflow algorithm to compute evolution. Experimental results show that dependence tests based on evolution information match the accuracy of those based on closed-form computation (implemented in Polaris), and when no closed form expressions can be calculated, our method is more accurate than that of Polaris.
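A minimal sketch of the monotonic-evolution idea described above, with assumed names and an assumed lattice; the paper's actual abstraction and dataflow algorithm are richer than this:

```python
# Sketch: track only the *direction* a variable moves per update, never its
# closed form. The four-point lattice and the join rule are assumptions made
# for this illustration.

from enum import Enum

class Evo(Enum):
    CONSTANT = "="
    INCREASING = "+"
    DECREASING = "-"
    UNKNOWN = "?"

def join(a, b):
    """Combine two evolutions; CONSTANT is the identity, conflicts go to UNKNOWN."""
    if a == b:
        return a
    if Evo.CONSTANT in (a, b):
        return b if a == Evo.CONSTANT else a
    return Evo.UNKNOWN

def evolution(steps):
    """Fold the signs of a variable's per-statement updates into one evolution."""
    evo = Evo.CONSTANT
    for step in steps:
        this = (Evo.INCREASING if step > 0
                else Evo.DECREASING if step < 0
                else Evo.CONSTANT)
        evo = join(evo, this)
    return evo
```

If a subscript variable's evolution across iterations is strictly INCREASING (or DECREASING), it takes a distinct value each iteration, so references like `a[v]` from different iterations cannot collide; no closed form is ever computed.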
Induction variable analysis without idiom recognition: Beyond monotonicity
 In Proceedings of the 14th International Workshop on Languages and Compilers for Parallel Computing
, 2001
"... This work is an extension of the previous induction variable analyses based on monotonic evolution [11]. With the same computational complexity, the new algorithm improves the monotonic evolutionbased analysis in two aspects: more accurate dependence testing and the ability to compute closed form e ..."
Abstract

Cited by 14 (1 self)
This work is an extension of previous induction variable analyses based on monotonic evolution [11]. With the same computational complexity, the new algorithm improves the monotonic-evolution-based analysis in two respects: more accurate dependence testing and the ability to compute closed form expressions. The experimental results demonstrate that, when dealing with induction variables, dependence tests based on distance intervals are both efficient and effective compared to closed-form-based dependence tests.

1 Introduction

Dependence analysis is useful to many parallelization and optimization algorithms. To extract dependence information, array subscripts must be compared across statements and loop iterations. However, array subscripts often include variables whose value at each loop iteration is not easily available. An important class of such variables are induction variables. In classical dependence analyses, occurrences of induction variables are often replaced by their closed form expressions. Since most dependence tests handle affine expressions only, this approach applies only to induction variables with affine closed form expressions. To handle more general induction variables, in our previous work we proposed a dependence test based on a lightweight IV
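The "distance intervals" the abstract mentions can be pictured with a small sketch; the representation and names here are assumptions for illustration, not the paper's formulation:

```python
# Sketch: bound an induction variable's per-iteration change by an interval
# [lo, hi] over the possible step values (one per control-flow path).

def step_interval(steps):
    """Collapse the possible per-iteration steps into one bounding interval."""
    return (min(steps), max(steps))

def distinct_across_iterations(interval):
    """If 0 lies outside [lo, hi], the variable strictly moves on every
    iteration, so a subscript a[v] touches a different element each time
    and the loop-carried self-dependence can be ruled out."""
    lo, hi = interval
    return lo > 0 or hi < 0
```

Unlike the pure direction lattice, the interval also bounds *how far* the variable can move, which is what enables the more accurate testing and closed-form recovery claimed above.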
Dependence Testing without Induction Variable Substitution
, 2001
"... We present a new approach to dependence testing in the presence of induction variables. Instead of looking for closed form expressions, our method computes monotonic evolution which captures the direction in which the value of a variable changes. This information is used for dependence testing of ..."
Abstract

Cited by 2 (0 self)
We present a new approach to dependence testing in the presence of induction variables. Instead of looking for closed form expressions, our method computes monotonic evolution, which captures the direction in which the value of a variable changes. This information is used for dependence testing of array references. Under this scheme, closed form computation and induction variable substitution can be delayed until after the dependence test and be performed on demand. The technique can be extended to dynamic data structures, using either pointer-based implementations or standard object-oriented containers. To improve efficiency, we also propose an optimized (non-iterative) dataflow algorithm to compute evolution. Experimental results show that dependence tests based on evolution information match the accuracy of those based on closed-form computation (implemented in Polaris), and when no closed form expressions can be calculated, our method is more accurate than that of Polaris.
Analyses of Pointers . . . DEPENDENCE TESTING
, 2001
"... Many compiler optimizations rely on dependence tests to check the validity of the proposed transformations. Classical dependence tests focus primarily on array accesses with affine subscript expressions. These tests, however, often fall short when dealing with program features such as pointers, irre ..."
Abstract
Many compiler optimizations rely on dependence tests to check the validity of the proposed transformations. Classical dependence tests focus primarily on array accesses with affine subscript expressions. These tests, however, often fall short when dealing with program features such as pointers, irregular subscripts, and object-oriented designs, which are common in today's high performance applications. In this thesis, we present three analyses that enable dependence tests for three types of objects: pointers, induction variables, and container objects. The first analysis is a pointer analysis that is precise enough for iteration-based dependence tests. The analysis is presented in the context of Java. An iteration-based dependence test needs to disambiguate pointers from different iterations. Therefore, our method summarizes memory locations referenced by a pointer at every instance of a static program point. Such pointer information can also be used to eliminate redundant exception checks on Java references. We then present an induction variable analysis that exploits IV information without closed form computation. Array subscripts involving induction variables are non-affine. In classical dependence analyses, such subscripts are handled by replacing occurrences of induction variables by their closed form expressions. Instead of computing closed form expressions, however, our method computes the value change of a variable along different control-flow paths. It then uses this information to discover that array references are dependence-free. The last