Results 1 - 10 of 18,689

Table 1: Features of verification tools

in Tools for Parametric Verification. A Comparison on a Case Study
by Petr Matousek
"... In PAGE 25: ... In Table 1, a brief comparison of the verification tools that were used for parametric verification of the PGM protocol is shown. Acknowledgements The work presented in the paper was made at laboratory LIAFA under the supervision of professor Ahmed Bouajjani and in collaboration with Mihaela Sighireanu from the same institute.... ..."

Table 1: Classification of some replay debuggers. Two criteria were omitted: the failure of the replay (criterion 6) is not a concern in most debugger papers, but we believe that most debuggers are of type 6b, and all debuggers are of type 7a regarding the class of instrumented instructions (criterion 7).

in A Taxonomy of Distributed Debuggers Based on Execution Replay
by Carl Dionne, Marc Feeley, Jocelyn Desbiens 1996
Cited by 3

Table 1. Experimental results comparing our approach, implemented in S-VeT, against the BLAST verification tool

in Using Counterexample Analysis to Minimize the Number of Predicates for Predicate Abstraction
by Thanyapat Sakunkonchak, Satoshi Komatsu, Masahiro Fujita
"... In PAGE 8: ... For comparison in this paper, we use the same options. Table 1 shows the comparison of various benchmarks running BLAST against our approach. The size of the program is defined by the number of branching conditions used.... ..."

Table 2. Key characteristics of different MPI verification approaches

in Automated MPI Correctness Checking What if there was a magic option?
by Patrick Ohly, Werner Krotz-vogel
"... In PAGE 22: ... Apart from MARMOT and Umpire, all of them use distributed checking, just like TAC correctness checking. Table 2 is an updated version of the same table in [10], taking into account more recent publications and adding information about TAC correctness checking. What sets TAC correctness checking apart from all other tools is that it covers all the important MPI-1 checks and pays special attention to usability: it is the only tool that provides special support for investigating problems interactively with a debugger and that uses debug information, function names and stack back traces to provide problem reports that directly map to the source code.... ..."

Table 1: A selection of verification tools

in Introduction Deductive Software Verification
by Reiner Hähnle
"... In PAGE 2: ... The strongest postcondition (weakest precondition) can then be compared to the start state (final state) given in the specification using a first-order theorem prover. Representative state-of-the-art systems (see also Table 1) that employ the embedding approach include the Spec# system and the KeY system. The first, although referred to by its authors as a static program verifier, is an advanced verification condition generator with a theorem prover backend for C# programs annotated with specifications written in a language called Spec#.... ..."

Table 2. University Formal Verification tools.

in unknown title
by unknown authors

Table 3: Steganalysis Tools Verification Tests #

in Abstract Steganalysis in Computer Forensics
by Ahmed Ibrahim

Table 2 Results of the verifications performed in this paper.

in Model Checking Publish/Subscribe Notification for thinkteam
by Maurice H. ter Beek, Mieke Massink, Diego Latella, Stefania Gnesi, Alessandro Forghieri, Maurizio Sebastianis
"... In PAGE 16: ...1. The results of these verifications are summarised in Table 2, with the runtime again given as hours:minutes:seconds.... ..."

Table 1: Verification performance for selected circuits.

in Equivalence Checking Using Cuts and Heaps
by Andreas Kuehlmann , Florian Krohm
"... In PAGE 5: ... Next, the performance of the presented technique was measured for a number of IBM internal circuits. The tests were based on the verification tool Verity and performed on an RS/6000 workstation model 390; the results are shown in Table 1. The second and third columns report the design complexity in terms of the number of inputs, outputs, gates, and transistors.... In PAGE 6: ... Overall, the proposed engine greatly extends the class of designs which can be handled automatically. For example, with the exception of D9000, none of the industrial designs of Table 1 could be verified in a reasonable amount of time using a BDD engine only. 8 Conclusions The paper presents a new method to perform functional comparison of combinational circuits using BDDs, circuit graph hashing, cutpoint guessing, and false negative elimination.... ..."