Results 1 - 10 of 13
Incremental Consistency Checking for Pervasive Context
- In Proc. of the 28th International Conference on Software Engineering, 2006
"... Applications in pervasive computing are typically required to interact seamlessly with their changing environments. To provide users with smart computational services, these applications must be aware of incessant context changes in their environments and adjust their behaviors accordingly. As these ..."
Abstract
-
Cited by 23 (10 self)
- Add to MetaCart
(Show Context)
Applications in pervasive computing are typically required to interact seamlessly with their changing environments. To provide users with smart computational services, these applications must be aware of incessant context changes in their environments and adjust their behaviors accordingly. As these environments are highly dynamic and noisy, the context changes thus acquired could be obsolete, corrupted, or inaccurate. This gives rise to the problem of context inconsistency, which must be detected in a timely manner to prevent applications from behaving anomalously. In this paper, we propose a formal model of incremental consistency checking for pervasive contexts. Based on this model, we further propose an efficient checking algorithm to detect inconsistent contexts. The performance of the algorithm and its advantages over conventional checking techniques are evaluated experimentally using the Cabot middleware.
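The Python fragment below is a minimal illustrative sketch of the incremental checking idea this abstract describes: when a context value changes, only the constraints that mention that context are re-evaluated. The context names, constraints, and thresholds are hypothetical; this is not the paper's formal model or the Cabot middleware.

    # Minimal sketch of incremental context consistency checking:
    # when one context value changes, re-check only the constraints
    # whose dependency list mentions the changed context.

    contexts = {"location": "room_a", "activity": "meeting", "noise_db": 35}

    constraints = {
        "quiet_during_meeting": (["activity", "noise_db"],
                                 lambda c: c["activity"] != "meeting" or c["noise_db"] < 50),
        "valid_location":       (["location"],
                                 lambda c: c["location"] in {"room_a", "room_b", "corridor"}),
    }

    def on_context_change(name, value):
        """Apply one context update and re-check only the affected constraints."""
        contexts[name] = value
        return [cid for cid, (deps, check) in constraints.items()
                if name in deps and not check(contexts)]

    print(on_context_change("noise_db", 70))   # -> ['quiet_during_meeting']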
Partial Constraint Checking for Context Consistency in Pervasive Computing
- ACM Transactions on Software Engineering and Methodology, 19(3), Article 9, 2010
"... Pervasive computing environments typically change frequently in terms of available resources and their properties. Applications in pervasive computing use contexts to capture these changes and adapt their behaviors accordingly. However, contexts available to these applications may be abnormal or imp ..."
Abstract
-
Cited by 17 (8 self)
- Add to MetaCart
Pervasive computing environments typically change frequently in terms of available resources and their properties. Applications in pervasive computing use contexts to capture these changes and adapt their behaviors accordingly. However, contexts available to these applications may be abnormal or imprecise due to environmental noise. This may result in context inconsistencies, which imply that contexts conflict with each other. The inconsistencies may set such an application into a wrong state or lead the application to misadjust its behavior. It is thus desirable to detect and resolve context inconsistencies in a timely way. One popular approach is to detect context inconsistencies when contexts breach certain consistency constraints. Existing constraint checking techniques recheck the entire expression of each affected consistency constraint upon context changes. When a changed context affects only a constraint’s subexpression, rechecking the entire expression can adversely delay the detection of other context inconsistencies. This article proposes a rigorous approach to identifying the parts of previous checking results that can be reused without rechecking the entire expression. We evaluated our work on the Cabot middleware through both simulation experiments and a case study. The experimental results reported that our approach achieved over a fifteenfold improvement in checking performance.
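Below is a minimal sketch, in Python, of the reuse idea described above: cache each subexpression's last result and re-evaluate only the subexpressions whose dependencies include the changed context. The class, constraint names, and contexts are hypothetical; the article's formal treatment and its Cabot implementation are not reproduced here.

    # Partial rechecking sketch: each constraint subexpression declares the
    # contexts it depends on; on a change, only affected subexpressions are
    # re-evaluated and the rest are reused from the cache.

    class PartialChecker:
        def __init__(self, subexprs):
            # subexprs: {name: (dependency set, predicate over the context dict)}
            self.subexprs = subexprs
            self.cache = {}

        def full_check(self, ctx):
            self.cache = {n: p(ctx) for n, (_, p) in self.subexprs.items()}
            return all(self.cache.values())

        def recheck(self, ctx, changed):
            for n, (deps, p) in self.subexprs.items():
                if changed in deps:          # affected subexpression: re-evaluate
                    self.cache[n] = p(ctx)
            return all(self.cache.values())  # unaffected parts: cached results reused

    ctx = {"rfid_zone": "gate2", "gps_zone": "gate2"}
    checker = PartialChecker({
        "zones_agree": ({"rfid_zone", "gps_zone"}, lambda c: c["rfid_zone"] == c["gps_zone"]),
        "zone_known":  ({"rfid_zone"}, lambda c: c["rfid_zone"].startswith("gate")),
    })
    checker.full_check(ctx)
    ctx["gps_zone"] = "gate3"
    print(checker.recheck(ctx, "gps_zone"))  # only 'zones_agree' is re-evaluated -> False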
Automatically Identifying Changes that Impact Code-to-Design Traceability
"... An approach is presented that automatically determines if a given source code change impacts the design (i.e., UML class diagram) of the system. This allows code-to-design traceability to be consistently maintained as the source code evolves. The approach uses lightweight analysis and syntactic diff ..."
Abstract
-
Cited by 15 (5 self)
- Add to MetaCart
(Show Context)
An approach is presented that automatically determines whether a given source code change impacts the design (i.e., the UML class diagram) of the system. This allows code-to-design traceability to be consistently maintained as the source code evolves. The approach uses lightweight analysis and syntactic differencing of the source code changes to determine whether the change alters the class diagram in the context of abstract design. The intent is to support both the simultaneous updating of design documents with code changes and bringing old design documents up to date with current code, given the change history. An efficient tool was developed to support the approach and is applied to an open source system (HippoDraw). The results are evaluated and compared against manual inspection by human experts. The tool performs better than (error-prone) manual inspection.
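As a rough illustration of the kind of syntactic-differencing check the abstract describes, the Python sketch below flags a change as design-impacting when the diff touches class or method declarations rather than only statements inside method bodies. The regular expressions and the example diff are simplified and hypothetical; this is not the authors' tool.

    # Classify a source change as design-impacting if the syntactic diff
    # touches class or method declarations (which appear in the class diagram)
    # rather than only method-body statements.

    import re

    DESIGN_PATTERNS = [
        r"^\s*class\s+\w+",                                   # class declaration
        r"^\s*(public|protected|private).*\(.*\)\s*[;{]?$",   # method declaration
    ]

    def impacts_class_diagram(diff_lines):
        """diff_lines: added/removed lines from a unified diff of OO source code."""
        changed = [l[1:] for l in diff_lines if l.startswith(("+", "-"))]
        return any(re.search(p, l) for l in changed for p in DESIGN_PATTERNS)

    diff = ["-    int count;",              # body-level edit: no design impact
            "+    int count = 0;",
            "+    public void reset();"]    # new method declaration: design impact
    print(impacts_class_diagram(diff))      # -> True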
Automatically Detecting and Tracking Inconsistencies in Software Design Models
- IEEE Transactions on Software Engineering, 2011
"... Abstract—Consistency checkers help engineers find errors (inconsistencies) in software design models. Even if engineers are willing to tolerate inconsistencies, they are better off knowing about their existence to avoid follow-on errors and unnecessary rework. However, current approaches do not dete ..."
Abstract
-
Cited by 14 (3 self)
- Add to MetaCart
(Show Context)
Consistency checkers help engineers find errors (inconsistencies) in software design models. Even if engineers are willing to tolerate inconsistencies, they are better off knowing about their existence to avoid follow-on errors and unnecessary rework. However, current approaches do not detect or track inconsistencies fast enough. This paper presents an automated approach for detecting and tracking inconsistencies in design models in real time (while the model changes). Engineers only need to define consistency rules, in any language and without the manual annotations required by the current state of the art. Our approach automatically identifies how model changes affect these consistency rules. It does this by profiling consistency rules during consistency checking, observing their behavior to understand which parts of the model they evaluate (and by which model changes they are thus affected). The approach is quick, correct, scalable, fully automated, and easy to use, as it does not require any special skills from the engineers who want to use it. We evaluated the approach on 34 models with sizes of up to 162,237 model elements and 24 types of consistency and well-formedness rules. Our empirical evaluation shows that the approach requires only 1.4 ms on average to re-evaluate the consistency of the model after a change, and that its performance is affected not by the model size but only by the number of consistency rules, at the cost of an acceptable, linearly increasing memory consumption. Index Terms: D.2.10 [Design]
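A minimal sketch of the profiling idea, in Python, under the assumption that a rule's scope is simply the set of model elements it reads during evaluation: reads are recorded while each rule runs, and after a change only the rules whose recorded scope contains the changed element are re-evaluated. The model, rule, and element identifiers are hypothetical; this is not the authors' implementation.

    # Model profiling for incremental consistency checking: record which
    # elements a rule reads (its scope) while it executes, then re-evaluate
    # only the rules whose scope contains a changed element.

    class ProfiledModel:
        def __init__(self, elements):
            self.elements = elements        # {element id: attribute dict}
            self.accessed = set()           # filled while a rule executes

        def get(self, eid, attr):
            self.accessed.add(eid)          # profiling: remember what was read
            return self.elements[eid][attr]

    def check_all(model, rules):
        scopes, results = {}, {}
        for rid, rule in rules.items():
            model.accessed = set()
            results[rid] = rule(model)
            scopes[rid] = model.accessed    # scope observed via profiling
        return scopes, results

    def on_change(model, rules, scopes, results, changed_eid):
        for rid, rule in rules.items():
            if changed_eid in scopes[rid]:  # only affected rules are re-run
                model.accessed = set()
                results[rid] = rule(model)
                scopes[rid] = model.accessed
        return results

    model = ProfiledModel({"msg1": {"name": "pay"}, "op1": {"name": "pay"}})
    rules = {"msg_has_operation":
             lambda m: m.get("msg1", "name") == m.get("op1", "name")}
    scopes, results = check_all(model, rules)
    model.elements["op1"]["name"] = "charge"
    print(on_change(model, rules, scopes, results, "op1"))  # {'msg_has_operation': False}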
TQL: A Query Language to Support Traceability
"... A query language for traceability is proposed and presented. The language, TQL, is based in XML and supports queries across multiple artifacts and multiple traceability link types. A number of primitives are defined to allow complex queries to be constructed and executed. Example queries are present ..."
Abstract
-
Cited by 10 (0 self)
- Add to MetaCart
(Show Context)
A query language for traceability is proposed and presented. The language, TQL, is based on XML and supports queries across multiple artifacts and multiple traceability link types. A number of primitives are defined to allow complex queries to be constructed and executed. Example queries are presented in the context of traceability questions. The technical details of the language and issues of implementation are discussed.
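The Python sketch below illustrates only the kind of cross-artifact, link-type-aware query such a language supports; it does not reproduce TQL's actual XML syntax or primitives, and the link records and type names are made up.

    # Traceability links as records, queried by link type and target artifact
    # type; a traceability query language packages queries like this one.

    links = [
        {"source": "REQ-12", "source_type": "requirement",
         "target": "UC-3",   "target_type": "use_case", "link_type": "refines"},
        {"source": "UC-3",   "source_type": "use_case",
         "target": "Billing.java", "target_type": "code", "link_type": "implements"},
    ]

    def query(links, link_type=None, target_type=None):
        """Select traceability links by link type and/or target artifact type."""
        return [l for l in links
                if (link_type is None or l["link_type"] == link_type)
                and (target_type is None or l["target_type"] == target_type)]

    # "Which code artifacts implement something?"
    print(query(links, link_type="implements", target_type="code"))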
Relating Requirements to Implementation via Topic Analysis: Do Topics Extracted from Requirements Make Sense to Managers and Developers?
"... Abstract—Large organizations like Microsoft tend to rely on formal requirements documentation in order to specify and design the software products that they develop. These documents are meant to be tightly coupled with the actual implementation of the features they describe. In this paper we evaluat ..."
Abstract
-
Cited by 6 (3 self)
- Add to MetaCart
(Show Context)
Large organizations like Microsoft tend to rely on formal requirements documentation in order to specify and design the software products that they develop. These documents are meant to be tightly coupled with the actual implementation of the features they describe. In this paper we evaluate the value of high-level, topic-based requirements traceability in the version control system, using Latent Dirichlet Allocation (LDA). We evaluate the LDA topics with practitioners and check whether the extracted topics and trends match the perception that Program Managers and Developers have of the effort put into addressing certain topics. We found that the effort extracted from version control that was relevant to a topic often matched the perception of the managers and developers of what occurred at the time. Furthermore, we found evidence that many of the identified topics made sense to practitioners and matched their perception of what occurred, but for some topics practitioners had difficulty interpreting and labelling them. In summary, we investigate the high-level traceability of requirements topics to version control commits via topic analysis and validate with the actual stakeholders the relevance of these topics extracted from requirements. Keywords: latent Dirichlet allocation (LDA); requirements; version control; traceability; topics; requirements engineering
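For readers unfamiliar with the underlying technique, here is a generic, self-contained LDA example in Python using scikit-learn. It is not the paper's pipeline and uses made-up requirement snippets; it only shows how topics are fitted to requirements text and hints at how commit messages could be scored against the same topics.

    # Fit LDA topics to (toy) requirements text; commit messages could then be
    # scored against the same topics with lda.transform() to relate
    # version-control effort back to requirements topics.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    requirement_docs = [
        "the installer shall verify disk space and prompt for the install path",
        "search results shall be ranked by relevance and filtered by date",
        "the setup wizard shall roll back changes if installation fails",
    ]

    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(requirement_docs)

    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    # Print the top words per topic, the form practitioners would label.
    terms = vectorizer.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[-5:][::-1]]
        print(f"topic {k}: {top}")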
Using fine-grained differencing to evolve traceability links
- In GCT’07, 2007
"... An approach to support the sustained evolution of traceability links is proposed and outlined. A fine-grained differencing approach on the link endpoints is used to maintain the links in a scalable manner. Here scalable refers to large software systems with thousands of links. Details of the link mo ..."
Abstract
-
Cited by 3 (1 self)
- Add to MetaCart
(Show Context)
An approach to support the sustained evolution of traceability links is proposed and outlined. A fine-grained differencing approach on the link endpoints is used to maintain the links in a scalable manner. Here, scalable refers to large software systems with thousands of links. Details of the link model and representation are given, followed by the process used to evolve traceability links.
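Below is a minimal sketch, in Python, of what endpoint-level differencing for link evolution could look like, assuming method-level granularity and a simple hash of each method body. The link identifiers, method names, and status labels are hypothetical; the paper's link model and tool are not reproduced.

    # Decide, from a fine-grained diff of the endpoint elements, whether each
    # traceability link is broken, needs review, or can be kept as-is.

    def evolve_links(links, old_methods, new_methods):
        """links: {link id: method name}; *_methods: {method name: body hash}."""
        status = {}
        for lid, method in links.items():
            if method not in new_methods:
                status[lid] = "endpoint removed: link broken"
            elif old_methods.get(method) != new_methods[method]:
                status[lid] = "endpoint changed: re-verify link"
            else:
                status[lid] = "unchanged: link kept"
        return status

    old = {"Order.total": "h1", "Order.cancel": "h2"}
    new = {"Order.total": "h9"}                 # cancel() deleted, total() edited
    print(evolve_links({"L1": "Order.total", "L2": "Order.cancel"}, old, new))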
Incremental Consistency Checking for Complex Design Rules and Larger Model Changes
"... Abstract. Advances in consistency checking in model-based software development made it possible to detect errors in real-time. However, existing approaches assume that changes come in small quantities and design rules are generally small in scope. Yet activities such as model transformation, re-fact ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
(Show Context)
Advances in consistency checking in model-based software development have made it possible to detect errors in real time. However, existing approaches assume that changes come in small quantities and that design rules are generally small in scope. Yet activities such as model transformation, refactoring, model merging, or repairs may cause larger model changes and hence cause performance problems during consistency checking. The goal of this work is to increase the performance of re-validating design rules. This work proposes an automated and tool-supported approach that re-validates only the affected parts of a design rule. It was empirically evaluated on 19 design rules and 30 small to large design models, and the evaluation shows that the approach reduces the computational cost of consistency checking, with the gains increasing with the size and complexity of the design rules.
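As a rough sketch of re-validating only the affected parts of a rule after a batch of changes, the Python fragment below keeps a cached result per rule part and re-evaluates a part only when its scope intersects the set of changed elements. The rule parts, scopes, and element identifiers are hypothetical; this is not the paper's approach in detail.

    # Re-validate a design rule at sub-rule granularity for a batch of changes:
    # only parts whose scope overlaps the changed elements are re-run.

    def revalidate(rule_parts, cached, changed_elements):
        """rule_parts: {part id: (scope set, evaluate() -> bool)}."""
        for pid, (scope, evaluate) in rule_parts.items():
            if scope & changed_elements:       # affected part only
                cached[pid] = evaluate()
        return all(cached.values())            # rule holds iff every part holds

    parts = {
        "classes_named":   ({"C1", "C2"}, lambda: True),
        "ops_have_params": ({"C3"},       lambda: False),
    }
    cached = {"classes_named": True, "ops_have_params": True}
    print(revalidate(parts, cached, changed_elements={"C3"}))  # -> False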
Automatically Identifying Changes that Impact Code-to-Design Traceability during Evolution
Université de Montréal
"... Un formalisme pour la traçabilité des transformations par Mathieu Lemoine Département d’informatique et de recherche opérationnelle Faculté des arts et des sciences Mémoire présenté à la Faculté des arts et des sciences en vue de l’obtention du grade de Maître ès sciences (M.Sc.) en informatique ..."
Abstract
- Add to MetaCart
A formalism for the traceability of transformations, by Mathieu Lemoine, Département d'informatique et de recherche opérationnelle, Faculté des arts et des sciences. Thesis presented to the Faculté des arts et des sciences in fulfillment of the requirements for the degree of Master of Science (M.Sc.) in computer science.