Results 1–7 of 7
System Identification, Approximation and Complexity
 International Journal of General Systems, 1977
"... This paper is concerned with establishing broadlybased systemtheoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a ..."
Abstract

Cited by 36 (22 self)
This paper is concerned with establishing broadly-based, system-theoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a class of models: a constant one of complexity; and a variable one of approximation induced by an observed behaviour. An admissible model is such that any less complex model is a worse approximation. The general problem of identification is that of finding the admissible subspace of models induced by a given behaviour. It is proved under very general assumptions that, if deterministic models are required, then nearly all behaviours require models of nearly maximum complexity. A general theory of approximation between models and behaviour is then developed, based on subjective probability concepts and semantic information theory. The role of structural constraints such as causality, locality, finite memory, etc., is then discussed as rules of the game. These concepts and results are applied to the specific problem of stochastic automaton, or grammar, inference. Computational results are given to demonstrate that the theory is complete and fully operational. Finally, the formulation of identification proposed in this paper is analysed in terms of Klir’s epistemological hierarchy, and both are discussed in terms of the rich philosophical literature on the acquisition of knowledge.
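The admissibility criterion lends itself to a direct computational reading. Below is a minimal illustrative sketch (my own, not taken from the paper), which reduces complexity and approximation to single numbers over a finite candidate set:

    # Illustrative sketch (not the paper's algorithm): a model is admissible
    # when every strictly less complex candidate is a strictly worse
    # approximation of the observed behaviour (i.e. has higher error).
    def admissible(models):
        """models: list of (complexity, approximation_error) pairs;
        lower error means a better approximation of the behaviour."""
        return [(c, e) for c, e in models
                if all(e2 > e for c2, e2 in models if c2 < c)]

    # (3, 0.1) is admissible: both simpler candidates fit worse.
    # (4, 0.2) is not: a simpler candidate, (3, 0.1), fits better.
    print(admissible([(1, 0.9), (2, 0.5), (3, 0.1), (4, 0.2)]))
    # -> [(1, 0.9), (2, 0.5), (3, 0.1)]

On this reading the admissible models trace, roughly, the efficient frontier of the complexity/approximation trade-off, which makes the result that nearly all behaviours force near-maximal complexity a strong one.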
Tenacious Tortoises: A formalism for argument over rules of inference
 Computational Dialectics (ECAI 2000 Workshop), 2000
"... As multiagent systems proliferate and employ different and more sophisticated formal logics, it is increasingly likely that agents will be reasoning with different rules of inference. Hence, an agent seeking to convince another of some proposition may first have to convince the latter to use a rule ..."
Abstract

Cited by 10 (7 self)
As multiagent systems proliferate and employ different and more sophisticated formal logics, it is increasingly likely that agents will be reasoning with different rules of inference. Hence, an agent seeking to convince another of some proposition may first have to convince the latter to use a rule of inference which it has not thus far adopted. We define a formalism to represent degrees of acceptability or validity of rules of inference, to enable autonomous agents to undertake dialogue concerning inference rules. Even when they disagree over the acceptability of a rule, two agents may still use the proposed formalism to reason collaboratively.
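As a rough illustration of the idea (the numeric scale and the minimum-combination rule here are my own assumptions, not the paper's formalism), inference rules can carry acceptability degrees, and two disagreeing agents can score a shared derivation by its weakest step:

    # Hypothetical sketch: degrees of acceptability in [0, 1] per rule;
    # a joint derivation is only as acceptable as the more sceptical
    # agent's rating of its weakest inference step.
    def joint_confidence(derivation, agent_a, agent_b):
        """derivation: list of rule names; agent_a/agent_b: rule -> degree."""
        return min((min(agent_a[r], agent_b[r]) for r in derivation),
                   default=1.0)

    a = {"modus_ponens": 1.0, "double_negation_elim": 1.0}  # classical
    b = {"modus_ponens": 1.0, "double_negation_elim": 0.4}  # constructive-leaning
    print(joint_confidence(["modus_ponens", "double_negation_elim"], a, b))  # 0.4

Even though agent b does not fully accept double negation elimination, both agents can still attach a definite, if reduced, degree of confidence to the joint conclusion.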
Supervaluationism and Its Logics
"... If we adopt a supervaluational semantics for vagueness, what sort of logic results? As it turns out, the answer depends crucially on how the standard notion of validity as truth preservation is recast. There are several ways of doing this within a supervaluational framework, the main alternative bei ..."
Abstract

Cited by 5 (0 self)
If we adopt a supervaluational semantics for vagueness, what sort of logic results? As it turns out, the answer depends crucially on how the standard notion of validity as truth preservation is recast. There are several ways of doing this within a supervaluational framework, the main alternative being between ‘global’ construals (e.g. an argument is valid if and only if it preserves truth under all precisifications) and ‘local’ construals (an argument is valid if and only if, under all precisifications, it preserves truth). The former alternative is by far more popular, but I argue in favour of the latter, for (i) it does not suffer from a number of serious objections, and (ii) it makes it possible to restore global validity as a defined notion. Supervaluationism is a mixed bag. It is sometimes described as the ‘standard’ theory of vagueness, at least in so far as vagueness is construed as a semantic phenomenon, but exactly what that standard theory amounts to is far from clear. In fact, it is pretty clear that there is not just one supervaluational semantics out there—there are lots of such semantics; and although it is true that they all exploit the same insight, their relative differences are by no means immaterial. For one thing, a lot depends on how exactly supervaluations are constructed, that is, on how exactly we come to establish the truth-value of a given statement. (And when I say that a lot depends on this I mean to say that different explanations may give rise to different philosophical worries, or justify different reactions.) Secondly, and equally importantly, a lot depends on how a given supervaluationary machinery is brought into play when it comes to explaining the logic of the language, that is, not the notion of truth, or ‘supertruth’, as it applies to individual statements, but the notion of validity, or ‘supervalidity’, as it applies to whole arguments. (I am thinking for instance of how different explanations may bear on the question of whether, or to what extent, vagueness involves a departure from classical logic.) Here I want to focus on this second part of the story. However, since the notion of validity depends on the notion of truth—or so one may argue—I also want to comment briefly on the first.
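For concreteness, the two construals can be written out as follows (standard notation, mine rather than the author's: Γ for the premises, φ for the conclusion, p ranging over precisifications, and ⊩ for truth under a precisification):

    % Global validity: supertruth of the premises guarantees
    % supertruth (truth under all precisifications) of the conclusion.
    \Gamma \models_{g} \varphi \iff
      \bigl[\forall p\,\forall\gamma \in \Gamma\,(p \Vdash \gamma)\bigr]
      \Rightarrow \forall p\,(p \Vdash \varphi)

    % Local validity: each precisification individually preserves truth.
    \Gamma \models_{l} \varphi \iff
      \forall p\,\bigl[\forall\gamma \in \Gamma\,(p \Vdash \gamma)
      \Rightarrow p \Vdash \varphi\bigr]

Local validity implies global validity but not conversely, which is why the choice between them matters for how much of classical logic survives.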
Evaluating Defaults
 2002
"... We seek to find normative criteria of adequacy for nonmonotonic logic similar to the criterion of validity for deductive logic. Rather than stipulating that the conclusion of an inference be true in all models in which the premises are true, we require that the conclusion of a nonmonotonic inference ..."
Abstract
We seek to find normative criteria of adequacy for nonmonotonic logic similar to the criterion of validity for deductive logic. Rather than stipulating that the conclusion of an inference be true in all models in which the premises are true, we require that the conclusion of a nonmonotonic inference be true in “almost all” models of a certain sort in which the premises are true. This “certain sort” specification picks out the models that are relevant to the inference, taking into account factors such as specificity and vagueness, and previous inferences. The frequencies characterizing the relevant models reflect known frequencies in our actual world. The criteria of adequacy for a default inference can be extended by thresholding to criteria of adequacy for an extension. We show that this avoids the implausibilities that might otherwise result from the chaining of default inferences. The model proportions, when construed in terms of frequencies, provide a verifiable grounding of default rules, and can become the basis for generating default rules from statistics.
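One natural way to make the “almost all” requirement precise, assuming the relevant models of the premises form a finite set Rel(Γ) (my notation, not necessarily the paper's), is a proportion threshold:

    % Adequacy of a default inference: the conclusion holds in at least
    % a proportion 1 - epsilon of the relevant models of the premises.
    \Gamma \,\mid\!\approx_{\varepsilon}\, \varphi \iff
      \frac{\lvert\{\, M \in \mathrm{Rel}(\Gamma) : M \models \varphi \,\}\rvert}
           {\lvert \mathrm{Rel}(\Gamma) \rvert} \;\ge\; 1 - \varepsilon

Chaining such inferences can compound the slack ε, which is the kind of implausibility the paper's thresholding of extensions is said to avoid.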
Consequence as Inference: Mediæval Proof Theory 1300–1350
 © Peter King, in Medieval Formal Logic (Kluwer 2001), 117–145
"... The first half of the fourteenth century saw a remarkable flowering in accounts of consequences (consequentiae). Logicians began to write independent treatises on consequences, the most wellknown being those by Walter Burleigh (De consequentiis) and Jean Buridan (Tractatus de consequentiis). Conseq ..."
Abstract
The first half of the fourteenth century saw a remarkable flowering in accounts of consequences (consequentiae). Logicians began to write independent treatises on consequences, the most well-known being those by Walter Burleigh (De consequentiis) and Jean Buridan (Tractatus de consequentiis). Consequences also came to be treated systematically in comprehensive works on logic, such as those of Walter Burleigh (both versions of the De puritate artis logicae), William of Ockham (Summa logicae), and, to a lesser extent, Jean Buridan (Summulae de dialectica)—as well as in works written in their wake. The philosophical achievement realized in these various writings was no less than a formulation of a theory of inference: the rules for consequences given by these mediæval authors spell out a natural deduction system in the sense of Jaskowski and Gentzen. (All translations are mine. In what follows I cite the Latin text only when it is not readily available, e.g. for much of Buridan’s Summulae de dialectica, when there is a textual difficulty, or when a point depends on its original phrasing.)
A Pragmatic Justification of Deduction
"... he justification of rational inference making, whether deductive or inductive, has been of interest to many philosophers. The problem of induction has concerned philosophers since the debate was first raised by David Hume.1 Hume argued that if induction is to be justified, nature must be uniform, be ..."
Abstract
The justification of rational inference making, whether deductive or inductive, has been of interest to many philosophers. The problem of induction has concerned philosophers since the debate was first raised by David Hume.1 Hume argued that if induction is to be justified, nature must be uniform, because inductive arguments make predictions based on regularities in nature. Yet we have reasons to think that nature is not uniform. Attempts to justify the use of induction within the sciences, which began with Hume,2 who showed that we cannot use an inductive justification of induction, as this would presuppose that induction is justified, were continued by Kant in his discussion of strict universality.3 Yet an interesting point is elucidated by Carroll4 and Haack:5 that deduction is equally in need of justification, despite the common assumption that deduction is inherently justified. In this paper I will focus on the attempt made by Hans Reichenbach6 to justify induction on pragmatic grounds, in reference to the inductive inferences we make on a regular basis. Other attempts have been made to show that induction is not in
Naturalized Epistemology and Degrees of Knowledge
"... The epistemological approach I’m going to offer in this paper owes much of its prompting motivation to Quine’s ideas, esp. holism and naturalism, as well as to Ferdinand Gonseth’s idoneism. Still, it is but fair for me to point out that several of the mainstays of the present proposal owe very littl ..."
Abstract
The epistemological approach I’m going to offer in this paper owes much of its prompting motivation to Quine’s ideas, esp. holism and naturalism, as well as to Ferdinand Gonseth’s idoneism. Still, it is but fair for me to point out that several of the mainstays of the present proposal owe very little to the influence of the philosophers whose epistemological views have attracted me most — or for that matter to that of other analytical philosophers. I am referring to my acknowledging degrees of truth and existence and, consequently, degrees of knowledge, too. In order to make it easier to follow my reflections below through their sometimes winding course, I now proceed to list the 13 main components of my proposal. (1) There are infinitely many degrees of truth of propositions, or — what amounts to the same — of existence of the states of affairs (facts) those propositions would correspond to. The structure of those degrees is an atomic one (in an algebraic sense, to be explained later on). (2) Accordingly, there may be infinitely many degrees of existence of [the state of affairs consisting in] someone’s believing something.
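The ‘atomic’ structure invoked in (1) is presumably the standard order-theoretic notion, whose explanation the author defers; as a brief gloss (my formulation, not the paper's):

    % a is an atom: a minimal nonzero degree of truth.
    \mathrm{atom}(a) \iff 0 < a \,\wedge\, \neg\exists x\,(0 < x < a)

    % Atomicity: every nonzero degree lies above some atom.
    \forall d\,\bigl(d > 0 \Rightarrow \exists a\,(\mathrm{atom}(a) \wedge a \le d)\bigr)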