Results 1–10 of 29
C.: Coupling and importance sampling for statistical model checking. Research Report LSV-12-01, Laboratoire Spécification et Vérification, ENS, 2012
Abstract. Statistical model checking is an alternative verification technique applied to stochastic systems whose size is beyond the ability of numerical analysis. Given a model (most often a Markov chain) and a formula, it provides a confidence interval for the probability that the model satisfies the formula. One of the main limitations of the statistical approach is the explosion in computation time triggered by the evaluation of very small probabilities. To address this problem we develop a new approach based on importance sampling and coupling. The corresponding algorithms have been implemented in our tool COSMOS. We present experiments on several relevant systems, with estimated time reductions reaching a factor of 10–120.
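The coupling machinery in this line of work is tool-specific (COSMOS), but the core importance-sampling estimator is easy to illustrate on a toy chain. A minimal sketch, assuming a gambler's-ruin chain with illustrative parameters (none of which come from the paper): simulate under a biased distribution and reweight each run by its likelihood ratio.

```python
import random

# Toy rare event: a gambler's-ruin chain on {0, ..., N} with up-probability
# p starts at 1; we estimate P(reach N before 0). All parameters are
# illustrative, not taken from the paper or from COSMOS.
p, N, START = 0.2, 10, 1

def importance_sampling_estimate(n_runs, q=0.8):
    """Simulate under biased up-probability q and reweight each run by
    the likelihood ratio of the original chain (up-probability p)."""
    total = 0.0
    for _ in range(n_runs):
        x, weight = START, 1.0
        while 0 < x < N:
            if random.random() < q:              # biased step up
                weight *= p / q                  # likelihood-ratio update
                x += 1
            else:
                weight *= (1 - p) / (1 - q)
                x -= 1
        if x == N:                               # rare event observed
            total += weight
    return total / n_runs                        # unbiased estimate
```

With q = 1 − p the two per-step ratios multiply to 1, so a successful run's weight depends only on its net displacement and the estimator has very low variance; the exact probability here is (1 − r)/(1 − r^N) with r = (1 − p)/p, about 2.9·10⁻⁶, far below what crude Monte Carlo resolves with the same budget.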
Approximate probabilistic analysis of biopathway dynamics
Bioinformatics, 2012
Motivation: Biopathways are often modeled as systems of ordinary differential equations (ODEs). Such systems will usually have many unknown parameters and hence will be difficult to calibrate. Since the data available for calibration will have limited precision, an approximate representation of the ODE dynamics should suffice. One must however be able to efficiently construct such approximations for large models and perform model calibration and subsequent analysis. Results: We present a GPU-based scheme by which a system of ODEs is approximated as a dynamic Bayesian network (DBN). We then construct a model checking procedure for DBNs based on a simple probabilistic linear time temporal logic. The GPU implementation considerably extends the reach of our previous PC-cluster based implementation (Liu et al., 2011b). Further, the key components of our algorithm can serve as the GPU kernel for other Monte Carlo simulation-based analyses of biopathway dynamics. Similarly, our model checking framework is a generic one and can be applied in other systems biology settings. We have tested our methods on three ODE models of biopathways: the EGF-NGF pathway, the segmentation clock network and the MLC-phosphorylation pathway models. The GPU implementation shows significant gains in performance and scalability while the model checking framework turns out to be convenient and efficient for specifying and verifying interesting pathway properties. Availability: The source code is freely available at
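The statistical side of such analyses reduces to estimating P(model ⊨ φ) from independent trajectory samples with a quantified error. A minimal sketch under stand-in dynamics (the noisy drift below is an illustrative placeholder, not the paper's DBN approximation), using a Hoeffding-style confidence interval:

```python
import math, random

# Estimate P(model |= F<=T (x > c)) by sampling trajectories of a toy
# noisy discrete-time system; all dynamics parameters are illustrative.
def trajectory_satisfies(T=50, c=1.5):
    x = 0.0
    for _ in range(T):
        x += 0.05 + random.gauss(0.0, 0.1)   # one discretized noisy step
        if x > c:
            return True                      # "eventually x > c" holds
    return False

def estimate(n, delta=0.05):
    """Point estimate plus a Hoeffding half-width: the true probability
    lies in [p_hat - eps, p_hat + eps] with probability >= 1 - delta."""
    hits = sum(trajectory_satisfies() for _ in range(n))
    p_hat = hits / n
    eps = math.sqrt(math.log(2 / delta) / (2 * n))
    return p_hat, eps
```

The half-width shrinks as 1/√n independently of the model, which is what makes simulation-based checking attractive when the underlying system is too large for numerical analysis.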
Verification of an AFDX infrastructure using simulations and probabilities. Volume 6418 of LNCS, 2010
Abstract. Until recently, there was not a strong need for networking inside aircraft: communications were mainly cabled and handled by Ethernet protocols. The evolution of avionics embedded systems and the number of integrated functions in civilian aircraft has changed the situation. These functions imply a huge increase in the quantity of data exchanged, and thus in the number of connections between functions. Among the mechanisms provided to handle this new complexity, one finds Avionics Full Duplex Switched Ethernet (AFDX), a protocol that simulates a point-to-point network between a source and one or more destinations. The core idea in AFDX is that of Virtual Links (VL), which are used to simulate point-to-point communication between devices. One of the main challenges is to show that the total delivery time for packets on a VL is bounded by some predefined value. This is a difficult problem that also requires providing a formal, yet evolvable, model of the AFDX network. In this paper, we propose a component-based design methodology to describe the behavior of the model. We then propose a stochastic abstraction that not only reduces the complexity of the verification process but also provides quantitative information on the protocol.
Schedulability of Herschel-Planck revisited using statistical model checking
In ISoLA (2), volume 7610 of LNCS, 2012
Abstract. Schedulability analysis is a main concern for many embedded applications due to their safety-critical nature. The classical method of response time analysis provides an efficient technique used in industrial practice. However, the method is based on conservative assumptions about the execution and blocking times of tasks. Consequently, it may falsely declare deadline violations that will never occur during execution. This paper continues the authors' previous work on applying extended timed automata model checking (using the tool UPPAAL) to obtain more exact schedulability analysis, here in the presence of non-deterministic computation times of tasks given by intervals [BCET, WCET]. Considering computation intervals makes the schedulability of the resulting task model undecidable. Our contribution is a combination of model checking techniques that yields some guarantee on the (un)schedulability of the model even in the presence of undecidability. Two methods are considered: symbolic model checking and statistical model checking. Symbolic model checking allows us to conclude schedulability, i.e. the absence of deadline violations, for varying sizes of BCET. However, the symbolic technique over-approximates the considered task model and can therefore not be used to disprove schedulability. As a remedy, we show how statistical model checking may be used to generate concrete counterexamples witnessing non-schedulability. In addition, we apply statistical model checking to obtain more informative performance analysis, e.g. expected response times when the system is schedulable. The methods are demonstrated on a complex satellite software system, yielding new insights useful for the company.
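The interval-based task model lends itself to a direct simulation sketch. The two-task workload and unit-slot scheduler below are hypothetical stand-ins (the UPPAAL models in the paper are far richer): draw each job's execution time uniformly from [BCET, WCET] and count deadline misses over a hyperperiod.

```python
import random

# Hypothetical workload, not from the paper: (period == deadline, BCET, WCET);
# task 0 has the higher fixed priority, scheduling is preemptive, and each
# time unit goes to the highest-priority task with outstanding work.
TASKS = [(5, 1, 2), (12, 3, 6)]
HYPERPERIOD = 60  # lcm of the periods

def run_once():
    """Simulate one hyperperiod; return True iff no deadline is missed."""
    remaining = [0.0] * len(TASKS)           # outstanding work per task
    for t in range(HYPERPERIOD):
        for i, (period, bcet, wcet) in enumerate(TASKS):
            if t % period == 0:
                if remaining[i] > 0:         # previous job missed its deadline
                    return False
                remaining[i] = random.uniform(bcet, wcet)
        for i in range(len(TASKS)):          # run highest-priority pending task
            if remaining[i] > 0:
                # a whole slot is consumed even for a fractional remainder,
                # so fractional execution times are conservatively rounded up
                remaining[i] = max(0.0, remaining[i] - 1.0)
                break
    return all(r == 0 for r in remaining)    # last jobs must finish by t = 60

def miss_probability(n_runs=10000):
    return sum(not run_once() for _ in range(n_runs)) / n_runs
```

For these illustrative numbers, classical response-time analysis already proves the worst case schedulable (R = 10 ≤ 12 for the low-priority task), so the estimated miss probability is 0; widening the WCETs makes misses appear, and the statistical estimate then conveys information that worst-case analysis alone cannot.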
On-the-fly Confluence Detection for Statistical Model Checking, 2013
Statistical model checking is an analysis method that circumvents the state space explosion problem in model-based verification by combining probabilistic simulation with statistical methods that provide clear error bounds. As a simulation-based technique, it can only provide sound results if the underlying model is a stochastic process. In verification, however, models are usually variations of nondeterministic transition systems. The notion of confluence allows the reduction of such transition systems in classical model checking by removing spurious nondeterministic choices. In this presentation, we show that confluence can be adapted to detect and discard such choices on-the-fly during simulation, thus extending the applicability of statistical model checking to a subclass of Markov decision processes. In contrast to previous approaches that use partial order reduction, the confluence-based technique can handle additional kinds of nondeterminism. In particular, it is not restricted to interleavings. We evaluate our approach, which is implemented as part of the modes simulator for the MODEST modelling language, on a set of examples that highlight its strengths and limitations and show the improvements over the partial-order-based method.
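The commutation ("diamond") condition at the heart of confluence is easy to state concretely. A toy sketch, assuming a deterministic successor function and illustrative transitions (all names are placeholders, unrelated to modes/MODEST internals; the full definition also constrains labels and probabilities, this checks only the commutation core):

```python
def confluent(step, state, a, enabled):
    """Transition a commutes in `state` if, for every other enabled
    transition b, doing a then b lands in the same state as b then a,
    i.e. the diamond closes; such an a can be taken during simulation
    without discarding behaviour."""
    return all(step(step(state, a), b) == step(step(state, b), a)
               for b in enabled if b != a)

# Illustrative state space: tuples of counters with two kinds of transitions.
def step(state, t):
    s = list(state)
    if t[0] == "inc":        # ("inc", i): increment counter i
        s[t[1]] += 1
    else:                    # ("set", i, v): overwrite counter i
        s[t[1]] = t[2]
    return tuple(s)
```

Increments of independent counters commute, whereas an increment and an overwrite of the same counter do not, so only the former choice could be discarded as spurious during a simulation run.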
A Simulink Hybrid Heart Model for Quantitative Verification of Cardiac Pacemakers
We develop a novel hybrid heart model in Simulink that is suitable for quantitative verification of implantable cardiac pacemakers. The heart model is formulated at the level of cardiac cells, can be adapted to patient data, and incorporates stochasticity. It is inspired by the timed and hybrid automata network models of Jiang et al. and Ye et al., where probabilistic behaviour is not considered. In contrast to our earlier work, we work directly with the action potential signals that the pacemaker sensor inputs from a specific cell, rather than ECG signals. We validate the model by demonstrating that its composition with a pacemaker model can be used to check safety properties by means of approximate probabilistic verification.
Statistical Model Checking for Distributed Probabilistic-Control Hybrid Automata with Smart Grid Applications, 2011
This technical report is a more detailed version of a published paper [12]. The power industry is currently moving towards a more dynamic, intelligent power grid. This Smart Grid is still in its infancy, and a formal evaluation of the expensive technologies and ideas on the table is necessary before committing to a full investment. In this paper, we argue that a good model for the Smart Grid must match its basic properties: it must be hybrid (both evolving over time and performing control/computation), distributed (multiple concurrently executing entities), and allow for asynchronous communication and stochastic behaviour (to accurately model real-world power consumption). We propose Distributed Probabilistic-Control Hybrid Automata (DPCHA) as a model for this purpose, and extend Bounded LTL to Quantified Bounded LTL in order to adapt and apply existing statistical model-checking techniques. We provide an implementation of a framework for developing and verifying DPCHAs. Finally, we conduct a case study of Smart Grid communications analysis. Keywords: statistical model checking, hybrid automata, hybrid systems, power
Probably Approximately Correct MDP Learning and Control With Temporal Logic Constraints
Abstract—We consider the synthesis of controllers that maximize the probability of satisfying given temporal logic specifications in unknown, stochastic environments. We model the interaction between the system and its environment as a Markov decision process (MDP) with initially unknown transition probabilities. The solution we develop builds on the so-called model-based probably approximately correct MDP (PAC-MDP) method. The algorithm attains an ε-approximately optimal policy with probability 1−δ using samples (i.e. observations), time and space that grow polynomially with the size of the MDP, the size of the automaton expressing the temporal logic specification,
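The flavor of polynomial sample bound alluded to can be made concrete with a back-of-envelope Hoeffding-plus-union-bound calculation; the constants below are illustrative and are not the paper's actual bound.

```python
import math

def samples_per_pair(n_states, n_actions, eps, delta):
    """Samples per state-action pair so that, by Hoeffding's inequality and
    a union bound over all |S|*|A| pairs, every empirical transition
    probability is within eps of the truth with probability >= 1 - delta."""
    pairs = n_states * n_actions
    return math.ceil(math.log(2 * pairs / delta) / (2 * eps ** 2))
```

The count grows only logarithmically in the number of state-action pairs and as 1/ε², i.e. polynomially in the quantities the abstract lists, which is the defining feature of PAC-style guarantees.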
Importance Sampling for Model Checking of Continuous Time Markov Chains
Abstract—Model checking real-time properties of probabilistic systems requires computing transient probabilities of continuous time Markov chains. When the chain is beyond the ability of numerical analysis, a probabilistic framing can only be obtained using simulation. This statistical approach fails when directly applied to the estimation of very small probabilities. Here, combining the uniformization technique and extending our previous results, we design a method that applies to continuous time Markov chains and formulas of a timed temporal logic. The corresponding algorithm has been implemented in our tool COSMOS. We present experiments on a relevant system. Our method produces a reliable confidence interval where classical statistical model checking fails on rare events. Keywords: statistical model checking; rare events; importance sampling; coupling; uniformization
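The uniformization step mentioned here is a standard construction and easy to sketch: turn the CTMC with generator Q into a DTMC P = I + Q/Λ, where Λ is at least the largest exit rate, and weight the powers of P by Poisson probabilities to recover the transient distribution. The 3-state generator below is illustrative, not from the paper.

```python
import math

# Illustrative 3-state CTMC generator (rows sum to zero).
Q = [[-2.0, 2.0, 0.0],
     [1.0, -3.0, 2.0],
     [0.0, 1.0, -1.0]]

def transient(pi0, t, tol=1e-12):
    """pi(t) = sum_k PoissonPMF(k; Lambda*t) * pi0 P^k with P = I + Q/Lambda."""
    n = len(Q)
    lam = max(-Q[i][i] for i in range(n))                 # uniformization rate
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam
          for j in range(n)] for i in range(n)]           # stochastic DTMC kernel
    weight = math.exp(-lam * t)                           # Poisson pmf at k = 0
    term = list(pi0)                                      # pi0 P^k, k = 0
    acc = [weight * x for x in term]
    k = 0
    while weight > tol or k < lam * t:                    # truncate past the mode
        k += 1
        term = [sum(term[i] * P[i][j] for i in range(n)) for j in range(n)]
        weight *= lam * t / k                             # next Poisson pmf
        acc = [a + weight * x for a, x in zip(acc, term)]
    return acc
```

Because every term is a probability vector scaled by a Poisson weight, the truncation error is bounded by the discarded Poisson tail, which is what makes uniformization attractive for the timed properties the abstract targets.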