This paper discusses three relevant logics that obey Component Homogeneity - a principle that Goddard and Routley introduce in their project of a logic of significance. The paper establishes two main results. First, it establishes a general characterization result for two families of logics that obey Component Homogeneity - that is, we provide a set of necessary and sufficient conditions for their consequence relations. From this, we derive characterization results for S*fde, dS*fde, and crossS*fde. Second, the paper establishes complete sequent calculi for S*fde, dS*fde, and crossS*fde. Among the other accomplishments of the paper, we generalize the semantics from Bochvar, Hallden, Deutsch and Daniels, we provide a general recipe to define containment logics, and we explore the single-premise/single-conclusion fragment of S*fde, dS*fde, and crossS*fde, and the connections between crossS*fde and the logic Eq of equality by Epstein. Also, we present S*fde as a relevant logic of meaninglessness that follows the main philosophical tenets of Goddard and Routley, and we briefly examine three further systems that are closely related to our main logics. Finally, we discuss Routley's criticism of containment logic in light of our results, and overview some open issues.
I develop and defend a truthmaker semantics for the relevant logic R. The approach begins with a simple philosophical idea and develops it in various directions, so as to build a technically adequate relevant semantics. The central philosophical idea is that truths are true in virtue of specific states. Developing the idea formally results in a semantics on which truthmakers are relevant to what they make true. A very natural notion of conditionality is added, giving us relevant implication. I then investigate ways to add conjunction, disjunction, and negation; and I discuss how to justify contraposition and excluded middle within a truthmaker semantics.
This paper has two aims. First, it sets out an interpretation of the relevant logic E of relevant entailment based on the theory of situated inference. Second, it uses this interpretation, together with Anderson and Belnap’s natural deduction system for E, to generalise E to a range of other systems of strict relevant implication. Routley–Meyer ternary relation semantics for these systems are produced and completeness theorems are proven.
The aim of this paper is to explore what insights relevant logics may provide for the understanding of literary fictional narrative. To date, hardly anyone has reflected on the intersection of relevant logics and narratology, and some could think that there is good reason for it. On the one hand, relevance has been a prominent issue in pragmatics, in the tradition of Grice, and Sperber and Wilson; thus framed, relevance is highly context-sensitive, so it seems unsuitable for formal analysis. On the other hand, the very idea of a logic of narrative has been criticized, arguing that logic brings to a stasis the time of human action (Ricœur, II: 29-60), or that its emphasis on rules misses the creative, unpredictable character of literature (De Man)... First, I will briefly introduce relevant logics, with an eye to showing their interest for narratological concerns, rather than to here providing a coherent (let alone comprehensive) survey. Secondly, lest I get drawn into purely abstract discussion, I will analyse several stories in order to give some instances of the kind of topics congenial to narratology that may be addressed with a relevantist toolkit. Thirdly (and lastly), I will expand in more theoretical fashion on certain issues raised in the second section and bring them into connection with pragmatic relevance theory.
This paper sets out to evaluate the claim that Aristotle’s Assertoric Syllogistic is a relevance logic or shows significant similarities with it. I prepare the grounds for a meaningful comparison by extracting the notion of relevance employed in the most influential work on modern relevance logic, Anderson and Belnap’s Entailment. This notion is characterized by two conditions imposed on the concept of validity: first, that some meaning content is shared between the premises and the conclusion, and second, that the premises of a proof are actually used to derive the conclusion. Turning to Aristotle’s Prior Analytics, I argue that there is evidence that Aristotle’s Assertoric Syllogistic satisfies both conditions. Moreover, Aristotle at one point explicitly addresses the potential harmfulness of syllogisms with unused premises. Here, I argue that Aristotle’s analysis allows for a rejection of such syllogisms on formal grounds established in the foregoing parts of the Prior Analytics. In a final section I consider the view that Aristotle distinguished between validity on the one hand and syllogistic validity on the other. Following this line of reasoning, Aristotle’s logic might not be a relevance logic, since relevance is part of syllogistic validity and not, as modern relevance logic demands, of general validity. I argue that the reasons to reject this view are more compelling than the reasons to accept it and that we can, cautiously, uphold the result that Aristotle’s logic is a relevance logic.
The system R, or more precisely its pure implicational fragment R→, is considered by relevance logicians as the most important. Another central system of relevance logic has been the logic E of entailment, which was supposed to capture strict relevant implication. A further system of relevance logic is RM, or R-mingle. The question is whether adding the mingle axiom to R→ yields RM→, the pure implicational fragment of that system. As concerns the weak systems, there are at least two approaches to the problem. First of all, it is possible to restrict the validity of some theorems. On another approach, we can investigate even weaker logics which have no theorems and are characterized only by rules of deducibility.
This interesting and imaginative monograph is based on the author’s PhD dissertation supervised by Saul Kripke. It is dedicated to Timothy Smiley, whose interpretation of PRIOR ANALYTICS informs its approach. As suggested by its title, this short work demonstrates conclusively that Aristotle’s syllogistic is a suitable vehicle for fruitful discussion of contemporary issues in logical theory. Aristotle’s syllogistic is represented by Corcoran’s 1972 reconstruction. The review studies Lear’s treatment of Aristotle’s logic, his appreciation of the Corcoran-Smiley paradigm, and his understanding of modern logical theory. In the process Corcoran and Scanlan present new, previously unpublished results. Corcoran regards this review as an important contribution to contemporary study of PRIOR ANALYTICS: both the book and the review deserve to be better known.
Relevant logics provide an alternative to classical implication that is capable of accounting for the relationship between the antecedent and the consequent of a valid implication. Relevant implication is usually explained in terms of the information required to assess a proposition. By doing so, relevant implication introduces a number of cognitively relevant aspects into the definition of logical operators. In this paper, we aim to take a closer look at the cognitive feature of relevant implication. For this purpose, we develop a cognitively-oriented interpretation of the semantics of relevant logics. In particular, we provide an interpretation of Routley-Meyer semantics in terms of conceptual spaces and we show that it meets the constraints of the algebraic semantics of relevant logic.
In 1942 Haskell B. Curry presented what is now called Curry's paradox, which can be found in a logic independently of its stand on negation. In recent years there has been a revitalised interest in non-classical solutions to the semantic paradoxes. In this article a non-classical resolution of Curry's paradox and Shaw-Kwei's paradox is proposed, without rejecting any contraction postulate. In addition, the relevant paraconsistent logics Č_n^#, 1 ≤ n < ω, in fact provide an effective way of circumventing the triviality of da Costa's paraconsistent set theories NF_n^C.
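The role contraction plays in Curry's paradox can be sketched as follows; this is a standard textbook reconstruction, not the article's own formulation. Let C be a sentence equivalent to C → p, for an arbitrary p:

```latex
\begin{align*}
1.\quad & C \leftrightarrow (C \to p) && \text{(Curry sentence, for arbitrary $p$)}\\
2.\quad & C \to (C \to p) && \text{(from 1, left to right)}\\
3.\quad & C \to p && \text{(2, contraction: } (A \to (A \to B)) \to (A \to B)\text{)}\\
4.\quad & C && \text{(from 1, right to left, and 3)}\\
5.\quad & p && \text{(3, 4, modus ponens)}
\end{align*}
```

Since p was arbitrary, any logic with a Curry sentence, modus ponens, and contraction trivializes; hence the interest of a resolution that keeps contraction.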
What does it mean for the laws of logic to fail? My task in this paper is to answer this question. I use the resources that Routley/Sylvan developed with his collaborators for the semantics of relevant logics to explain a world where the laws of logic fail. I claim that the non-normal worlds that Routley/Sylvan introduced are exactly such worlds. To disambiguate different kinds of impossible worlds, I call such worlds logically impossible worlds. At a logically impossible world, the laws of logic fail. In this paper, I provide a definition of logically impossible worlds. I then show that there is nothing strange about admitting such worlds.
Relevance logic has become ontologically fertile. No longer is the idea of relevance restricted in its application to purely logical relations among propositions, for as Dunn has shown in his (1987), it is possible to extend the idea in such a way that we can distinguish also between relevant and irrelevant predications, as for example between “Reagan is tall” and “Reagan is such that Socrates is wise”. Dunn shows that we can exploit certain special properties of identity within the context of standard relevance logic in a way which allows us to discriminate further between relevant and irrelevant properties, as also between relevant and irrelevant relations. The idea yields a family of ontologically interesting results concerning the different ways in which attributes and objects may hang together. Because of certain notorious peculiarities of relevance logic, however, Dunn’s idea breaks down where the attempt is made to have it bear fruit in application to relations among entities which are of homogeneous type.
Logic arguably plays a role in the normativity of reasoning. In particular, there are plausible norms of belief/disbelief whose antecedents are constituted by claims about what follows from what. But is logic also relevant to the normativity of agnostic attitudes? The question here is whether logical entailment also puts constraints on what kinds of things one can suspend judgment about. In this paper I address that question and I give a positive answer to it. In particular, I advance two logical norms of agnosticism, where the first one allows us to assess situations in which the subject is agnostic about the conclusion of a valid argument and the second one allows us to assess situations in which the subject is agnostic about one of the premises of a valid argument.
Epistemic two-dimensional semantics is a theory in the philosophy of language that provides an account of meaning which is sensitive to the distinction between necessity and apriority. While this theory is usually presented in an informal manner, I take some steps in formalizing it in this paper. To do so, I define a semantics for a propositional modal logic with operators for the modalities of necessity, actuality, and apriority that captures the relevant ideas of epistemic two-dimensional semantics. I also describe some properties of the logic that are interesting from a philosophical perspective, and apply it to the so-called nesting problem.
A logic is called 'paraconsistent' if it rejects the rule called 'ex contradictione quodlibet', according to which any conclusion follows from inconsistent premises. While logicians have proposed many technically developed paraconsistent logical systems and contemporary philosophers like Graham Priest have advanced the view that some contradictions can be true, and advocated a paraconsistent logic to deal with them, until recent times these systems have been little understood by philosophers. This book presents a comprehensive overview of paraconsistent logical systems to change this situation. The book includes almost every major author currently working in the field. The papers are at the cutting edge of the literature: some discuss current debates and others present important new ideas. The editors have avoided papers about technical details of paraconsistent logic, concentrating instead on works that discuss more 'big picture' ideas. Different treatments of paradoxes take centre stage in many of the papers, but there are also several papers on how to interpret paraconsistent logic and some on how it can be applied to the philosophy of mathematics, the philosophy of language, and metaphysics.
We propose a new account of indicative conditionals, giving acceptability and logical closure conditions for them. We start from Adams’ Thesis: the claim that the acceptability of a simple indicative equals the corresponding conditional probability. The Thesis is widely endorsed, but arguably false and refuted by empirical research. To fix it, we submit, we need a relevance constraint: we accept a simple conditional 'If φ, then ψ' to the extent that (i) the conditional probability p(ψ|φ) is high, provided that (ii) φ is relevant for ψ. How (i) should work is well-understood. It is (ii) that holds the key to improve our understanding of conditionals. Our account has (i) a probabilistic component, using Popper functions; (ii) a relevance component, given via an algebraic structure of topics or subject matters. We present a probabilistic logic for simple indicatives, and argue that its (in)validities are both theoretically desirable and in line with empirical results on how people reason with conditionals.
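The two-part acceptability condition can be sketched computationally. The following is a minimal illustration, not the paper's own formalism: it uses a toy four-world probability space in place of Popper functions, and stands in for the algebraic topic structure with a crude set-overlap test; the weights, threshold, and topic sets are all invented for the example.

```python
from itertools import product

# Toy probability space: four worlds assigning truth values to the atoms.
atoms = ["phi", "psi"]
worlds = [dict(zip(atoms, vals)) for vals in product([True, False], repeat=2)]
weights = {0: 0.4, 1: 0.1, 2: 0.1, 3: 0.4}  # invented weights; phi, psi correlated

def prob(pred):
    return sum(weights[i] for i, w in enumerate(worlds) if pred(w))

def conditional_prob(consequent, antecedent):
    p_ant = prob(antecedent)
    return prob(lambda w: antecedent(w) and consequent(w)) / p_ant if p_ant else None

def acceptable(consequent, antecedent, topic_cons, topic_ant, threshold=0.75):
    """Accept 'If phi, then psi' iff (i) p(psi|phi) is high, and (ii) phi is
    relevant to psi -- here crudely modelled as shared subject matter."""
    high = (conditional_prob(consequent, antecedent) or 0) >= threshold
    return high and bool(topic_ant & topic_cons)

phi = lambda w: w["phi"]
psi = lambda w: w["psi"]
print(conditional_prob(psi, phi))                               # 0.8
print(acceptable(psi, phi, {"weather", "plans"}, {"weather"}))  # True: both clauses hold
```

On this toy model the conditional clears the probability threshold and the topics overlap, so the conditional is accepted; with disjoint topic sets it would be rejected however high p(ψ|φ) is.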
In this paper I will develop a view about the semantics of imperatives, which I term Modal Noncognitivism, on which imperatives might be said to have truth conditions (dispositionally, anyway), but on which it does not make sense to see them as expressing propositions (hence does not make sense to ascribe to them truth or falsity). This view stands against “Cognitivist” accounts of the semantics of imperatives, on which imperatives are claimed to express propositions, which are then enlisted in explanations of the relevant logico-semantic phenomena. It also stands against the major competitors to Cognitivist accounts—all of which are non-truth-conditional and, as a result, fail to provide satisfying explanations of the fundamental semantic characteristics of imperatives (or so I argue). The view of imperatives I defend here improves on various treatments of imperatives on the market in giving an empirically and theoretically adequate account of their semantics and logic. It yields explanations of a wide range of semantic and logical phenomena about imperatives—explanations that are, I argue, at least as satisfying as the sorts of explanations of semantic and logical phenomena familiar from truth-conditional semantics. But it accomplishes this while defending the notion—which is, I argue, substantially correct—that imperatives could not have propositions, or truth conditions, as their meanings.
One logic or many? I say—many. Or rather, I say there is one logic for each way of specifying the class of all possible circumstances, or models, i.e., all ways of interpreting a given language. But because there is no unique way of doing this, I say there is no unique logic except in a relative sense. Indeed, given any two competing logical theories T1 and T2 (in the same language) one could always consider their common core, T, and settle on that theory. So, given any language L, one could settle on the minimal logic T0 corresponding to the common core shared by all competitors. That would be a way of resisting relativism, as long as one is willing to redraw the bounds of logic accordingly. However, such a minimal theory T0 may be empty if the syntax of L contains no special ingredients the interpretation of which is independent of the specification of the relevant L-models. And generally—I argue—this is indeed the case.
This paper contends that Stoic logic (i.e. Stoic analysis) deserves more attention from contemporary logicians. It sets out how, compared with contemporary propositional calculi, Stoic analysis is closest to methods of backward proof search for Gentzen-inspired substructural sequent logics, as they have been developed in logic programming and structural proof theory, and produces its proof search calculus in tree form. It shows how multiple similarities to Gentzen sequent systems combine with intriguing dissimilarities that may enrich contemporary discussion. Much of Stoic logic appears surprisingly modern: a recursively formulated syntax with some truth-functional propositional operators; analogues to cut rules, axiom schemata and Gentzen’s negation-introduction rules; an implicit variable-sharing principle and deliberate rejection of Thinning and avoidance of paradoxes of implication. These latter features mark the system out as a relevance logic, where the absence of duals for its left and right introduction rules puts it in the vicinity of McCall’s connexive logic. Methodologically, the choice of meticulously formulated meta-logical rules in lieu of axiom and inference schemata absorbs some structural rules and results in an economical, precise and elegant system that values decidability over completeness.
I want to model a finite, fallible cognitive agent who imagines that p in the sense of mentally representing a scenario—a configuration of objects and properties—correctly described by p. I propose to capture imagination, so understood, via variably strict world quantifiers, in a modal framework including both possible and so-called impossible worlds. The latter secure lack of classical logical closure for the relevant mental states, while the variability of strictness captures how the agent imports information from actuality in the imagined non-actual scenarios. Imagination turns out to be highly hyperintensional, but not logically anarchic. Section 1 sets the stage and impossible worlds are quickly introduced in Sect. 2. Section 3 proposes to model imagination via variably strict world quantifiers. Section 4 introduces the formal semantics. Section 5 argues that imagination has a minimal mereological structure validating some logical inferences. Section 6 deals with how imagination under-determines the represented contents. Section 7 proposes additional constraints on the semantics, validating further inferences. Section 8 describes some welcome invalidities. Section 9 examines the effects of importing false beliefs into the imagined scenarios. Finally, Sect. 10 hints at possible developments of the theory in the direction of two-dimensional semantics.
An exact truthmaker for A is a state which, as well as guaranteeing A’s truth, is wholly relevant to it. States with parts irrelevant to whether A is true do not count as exact truthmakers for A. Giving semantics in this way produces a very unusual consequence relation, on which conjunctions do not entail their conjuncts. This feature makes the resulting logic highly unusual. In this paper, we set out formal semantics for exact truthmaking and characterise the resulting notion of entailment, showing that it is compact and decidable. We then investigate the effect of various restrictions on the semantics. We also formulate a sequent-style proof system for exact entailment and give soundness and completeness results.
Agents require a constant flow, and a high level of processing, of relevant semantic information, in order to interact successfully among themselves and with the environment in which they are embedded. Standard theories of information, however, are silent on the nature of epistemic relevance. In this paper, a subjectivist interpretation of epistemic relevance is developed and defended. It is based on a counterfactual and metatheoretical analysis of the degree of relevance of some semantic information i to an informee/agent a, as a function of the accuracy of i understood as an answer to a query q, given the probability that q might be asked by a. This interpretation of epistemic relevance vindicates a strongly semantic theory of information, according to which semantic information encapsulates truth. It accounts satisfactorily for several important applications and interpretations of the concept of relevant information in a variety of philosophical areas. And it interfaces successfully with current philosophical interpretations of causal and logical relevance.
We introduce a number of logics to reason about collective propositional attitudes that are defined by means of the majority rule. It is well known that majoritarian aggregation is subject to irrationality, as the results in social choice theory and judgment aggregation show. The proposed logics for modelling collective attitudes are based on a substructural propositional logic that allows for circumventing inconsistent outcomes. Individual and collective propositional attitudes, such as beliefs, desires, obligations, are then modelled by means of minimal modalities to ensure a number of basic principles. In this way, a viable consistent modelling of collective attitudes is obtained.
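The kind of majoritarian irrationality at issue can be illustrated with the classic discursive dilemma from judgment aggregation; the sketch below is that standard textbook example, not the paper's own system, and the agent names and the agenda {p, q, p&q} are invented for illustration.

```python
from itertools import product

# Discursive dilemma: three agents vote on p, q, and the conjunction p & q.
# Each individual judgment set is logically consistent, yet the majority is not.
judgments = {
    "agent1": {"p": True,  "q": True,  "p&q": True},
    "agent2": {"p": True,  "q": False, "p&q": False},
    "agent3": {"p": False, "q": True,  "p&q": False},
}

def majority(issue):
    votes = [j[issue] for j in judgments.values()]
    return votes.count(True) > len(votes) / 2

collective = {issue: majority(issue) for issue in ["p", "q", "p&q"]}
print(collective)  # {'p': True, 'q': True, 'p&q': False}

# Consistency check: is there a classical valuation matching the collective set?
consistent = any(
    p == collective["p"] and q == collective["q"] and (p and q) == collective["p&q"]
    for p, q in product([True, False], repeat=2)
)
print(consistent)  # False: issue-wise majority voting yields an inconsistent outcome
```

Accepting p and q while rejecting p & q has no classical model, which is the sort of outcome the paper's substructural base logic is designed to circumvent.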
ABSTRACT: A detailed presentation of Stoic theory of arguments, including truth-value changes of arguments, Stoic syllogistic, Stoic indemonstrable arguments, Stoic inference rules (themata), including cut rules and antilogism, argumental deduction, elements of relevance logic in Stoic syllogistic, the question of completeness of Stoic logic, Stoic arguments valid in the specific sense, e.g. "Dio says it is day. But Dio speaks truly. Therefore it is day." A more formal and more detailed account of the Stoic theory of deduction can be found in S. Bobzien, Stoic Syllogistic, OSAP 1996.
It is natural to think that our ordinary practices in giving explanations for our actions, for what we do, commit us to claiming that content properties are causally relevant to physical events such as the movements of our limbs and bodies, and events which these in turn cause. If you want to know why my body ambulates across the street, or why my arm went up before I set out, we suppose I have given you an answer when I say that I wanted to greet a friend on the other side of the street, and thought that my arm's going up would be interpreted by him as a signal to stop for a moment. This widely held view might be disputed, but I shall not argue for it in this paper. I want to start with the view that our beliefs and desires and other propositional attitudes are causally relevant, in virtue of their modes and particular contents, to our movements, in order to investigate the consequences for analyses of thought content. For this purpose, I argue, in sec. II, for three necessary conditions on causal relevance: (a) a nomic sufficiency condition, (b) a logical independence condition, and (c) a screening-off condition. In sec. III, I apply these conditions to relational and functional theories of thought content, arguing that these theories cannot accommodate the causal relevance of content properties to our behaviour. I argue further that, on two plausible assumptions, one about the dependence of the mental on the physical, and the other about the availability in principle of causal explanations of our movements in terms of our non-relational physical properties, content properties can be causally relevant only if they are nomically type-correlated, relative to certain circumstances, with non-relational physical properties of our bodies. In sec. IV, I respond to a number of objections that might be advanced against this conclusion.
A. J. Ayer’s empiricist criterion of meaning was supposed to have sorted all statements into nonsense on the one hand, and tautologies or genuinely factual statements on the other. Unfortunately for Ayer, it follows from classical logic that his criterion is trivial—it classifies all statements as either tautologies or genuinely factual, but none as nonsense. However, in this paper, I argue that Ayer’s criterion of meaning can be defended from classical proofs of its triviality by the adoption of a relevant logic—an idea which is motivated because, according to Ayer, the genuinely factual statements are those which observation is relevant to.
We analyze the logical form of the domain knowledge that grounds analogical inferences and generalizations from a single instance. The form of the assumptions which justify analogies is given schematically as the "determination rule", so called because it expresses the relation of one set of variables determining the values of another set. The determination relation is a logical generalization of the different types of dependency relations defined in database theory. Specifically, we define determination as a relation between schemata of first order logic that have two kinds of free variables: (1) object variables and (2) what we call "polar" variables, which hold the place of truth values. Determination rules facilitate sound rule inference and valid conclusions projected by analogy from single instances, without implying what the conclusion should be prior to an inspection of the instance. They also provide a way to specify what information is sufficiently relevant to decide a question, prior to knowledge of the answer to the question. (shrink)
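The connection to database dependency relations can be sketched as a functional-dependency check over a set of cases: "X determines Y" holds when any two cases agreeing on the X-attributes also agree on the Y-attributes. The sketch below is illustrative only; the data and the helper determines are invented, not drawn from the paper.

```python
# A determination rule "X determines Y" over a set of cases: the logical
# analogue of a functional dependency in database theory. Illustrative data.
cases = [
    {"nationality": "US", "language": "English", "name": "Ann"},
    {"nationality": "US", "language": "English", "name": "Bob"},
    {"nationality": "FR", "language": "French",  "name": "Chloe"},
]

def determines(xs, ys, cases):
    """True iff the attributes xs determine the attributes ys across all cases."""
    mapping = {}
    for case in cases:
        key = tuple(case[x] for x in xs)
        val = tuple(case[y] for y in ys)
        if mapping.setdefault(key, val) != val:
            return False  # same X-values, different Y-values: no determination
    return True

print(determines(["nationality"], ["language"], cases))  # True
print(determines(["language"], ["name"], cases))         # False
```

With such a rule in hand, a single inspected instance licenses an analogical projection: having seen one US case that speaks English, a new US case may be projected to speak English, without the rule itself dictating which language that would be in advance.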
This paper defines the form of prior knowledge that is required for sound inferences by analogy and single-instance generalizations, in both logical and probabilistic reasoning. In the logical case, the first order determination rule defined in Davies (1985) is shown to solve both the justification and non-redundancy problems for analogical inference. The statistical analogue of determination that is put forward is termed 'uniformity'. Based on the semantics of determination and uniformity, a third notion of "relevance" is defined, both logically and probabilistically. The statistical relevance of one function in determining another is put forward as a way of defining the value of information: The statistical relevance of a function F to a function G is the absolute value of the change in one's information about the value of G afforded by specifying the value of F. This theory provides normative justifications for conclusions projected by analogy from one case to another, and for generalization from an instance to a rule. The soundness of such conclusions, in either the logical or the probabilistic case, can be identified with the extent to which the corresponding criteria (determination and uniformity) actually hold for the features being related.
I argue against abductivism about logic, which is the view that rational theory choice in logic happens by abduction. Abduction cannot serve as a neutral arbiter in many foundational disputes in logic because, in order to use abduction, one must first identify the relevant data. Which data one deems relevant depends on what I call one's conception of logic. One's conception of logic is, however, not independent of one's views regarding many of the foundational disputes that one may hope to solve by abduction.
The reasoning process of analogy is characterized by a strict interdependence between a process of abstraction of a common feature and the transfer of an attribute of the Analogue to the Primary Subject. The first reasoning step is regarded as an abstraction of a generic characteristic that is relevant for the attribution of the predicate. The abstracted feature can be considered from a logic-semantic perspective as a functional genus, in the sense that it is contextually essential for the attribution of the predicate, i.e. that it is pragmatically fundamental (i.e. relevant) for the predication, or rather the achievement of the communicative intention. While the transfer of the predicate from the Analogue to the analogical genus and from the genus to the Primary Subject is guaranteed by the maxims (or rules of inference) governing the genus-species relation, the connection between the genus and the predicate can be complex, characterized by various types of reasoning patterns. The relevance relation can hide implicit arguments, such as an implicit argument from classification, an evaluation based on values, consequences or rules, a causal relation, or an argument from practical reasoning.
This chapter focuses on alternative logics. It discusses a hierarchy of logical reform. It presents case studies that illustrate particular aspects of the logical revisionism discussed in the chapter. The first case study is of intuitionistic logic. The second case study turns to quantum logic, a system proposed on empirical grounds as a resolution of the antinomies of quantum mechanics. The third case study is concerned with systems of relevance logic, which have been the subject of an especially detailed reform program. Finally, the fourth case study is paraconsistent logic, perhaps the most controversial of serious proposals.
2nd edition. Many-valued logics are those logics that have more than the two classical truth values, to wit, true and false; in fact, they can have from three to infinitely many truth values. This property, together with truth-functionality, provides a powerful formalism to reason in settings where classical logic—as well as other non-classical logics—is of no avail. Indeed, originally motivated by philosophical concerns, these logics soon proved relevant for a plethora of applications ranging from switching theory to cognitive modeling, and they are today in more demand than ever, due to the realization that inconsistency and vagueness in knowledge bases and information processes are not only inevitable and acceptable, but also perhaps welcome. The main modern applications of (any) logic are to be found in the digital computer, and we thus require the practical knowledge of how to computerize—which also means automate—decisions (i.e. reasoning) in many-valued logics. This, in turn, necessitates a mathematical foundation for these logics. This book provides both this mathematical foundation and practical knowledge in a rigorous, yet accessible, text, while at the same time situating these logics in the context of the satisfiability problem (SAT) and automated deduction. The main text is complemented with a large selection of exercises, a plus for the reader wishing to not only learn about, but also do something with, many-valued logics.
The contemporary versions of the ontological argument that originated from Charles Hartshorne are formalized proofs based on unique modal theories. The simplest well-known theory of this kind arises from the B system of modal logic by adding two extra-logical axioms: "If the perfect being exists, then it necessarily exists" and "It is possible that the perfect being exists". In the paper a similar argument is presented; however, none of the systems of modal logic is relevant to it. Its only premises are the former axiom and, instead of the latter, the new axiom: "If the perfect being doesn't exist, it necessarily doesn't". The main goal of the work is to prove that the new axiom is no more controversial than the old one and -- in consequence -- that the whole strength of the modal ontological argument lies in the set of its extra-logical premises. In order to do that, three arguments are formulated: ontological, "cosmological" and metalogical.
This paper starts by indicating the analysis of Hempel's conditions of adequacy for any relation of confirmation (Hempel, 1945) as presented in Huber (submitted). There I argue contra Carnap (1962, Section 87) that Hempel felt the need for two concepts of confirmation: one aiming at plausible theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. The main part of the paper consists in working out the claim that one can have Hempel's cake and eat it too - in the sense that there is a logic of theory assessment that takes into account both of the two conflicting aspects of plausibility and informativeness. According to the semantics of this logic, a is an acceptable theory for evidence β if and only if a is both sufficiently plausible given β and sufficiently informative about β. This is spelt out in terms of ranking functions (Spohn, 1988) and shown to represent the syntactically specified notion of an assessment relation. The paper then compares these acceptability relations to explanatory and confirmatory consequence relations (Flach, 2000) as well as to nonmonotonic consequence relations (Kraus et al., 1990). It concludes by relating the plausibility-informativeness approach to Carnap's positive relevance account, thereby shedding new light on Carnap's analysis as well as solving another problem of confirmation theory.
The paper discusses approaches to Epistemic Contextualism that model the satisfaction of the predicate ‘know’ in a given context C in terms of the notion of belief/fact-matching throughout a contextually specified similarity sphere of worlds that is centred on actuality. The paper offers three counterexamples to approaches of this type and argues that they lead to insurmountable difficulties. I conclude that what contextualists (and Subject-Sensitive Invariantists) have traditionally called the ‘epistemic standards’ of a given context C cannot be explicated in terms of a contextually specified similarity sphere that is centred on actuality. The mentioned accounts of epistemic relevance and thus the corresponding accounts of the context-sensitivity (or subject-sensitivity) of ‘knows’ are to be rejected.
The paper surveys the currently available axiomatizations of common belief (CB) and common knowledge (CK) by means of modal propositional logics. (Throughout, knowledge, whether individual or common, is defined as true belief.) Section 1 introduces the formal method of axiomatization followed by epistemic logicians, especially the syntax-semantics distinction, and the notion of a soundness and completeness theorem. Section 2 explains the syntactical concepts, while briefly discussing their motivations. Two standard semantic constructions, Kripke structures and neighbourhood structures, are introduced in Sections 3 and 4, respectively. It is recalled that Aumann's partitional model of CK is a particular case of a definition in terms of Kripke structures. The paper also restates the well-known fact that Kripke structures can be regarded as particular cases of neighbourhood structures. Section 3 reviews the soundness and completeness theorems proved w.r.t. the former structures by Fagin, Halpern, Moses and Vardi, as well as related results by Lismont. Section 4 reviews the corresponding theorems derived w.r.t. the latter structures by Lismont and Mongin. A general conclusion of the paper is that the axiomatization of CB does not require as strong systems of individual belief as was originally thought; only monotonicity has thus far proved indispensable. Section 5 explains another consequence of general relevance: despite the "infinitary" nature of CB, the axiom systems of this paper admit of effective decision procedures, i.e., they are decidable in the logician's sense.
Imagine a dog tracing a scent to a crossroads, sniffing all but one of the exits, and then proceeding down the last without further examination. According to Sextus Empiricus, Chrysippus argued that the dog effectively employs disjunctive syllogism, concluding that since the quarry left no trace on the other paths, it must have taken the last. The story has been retold many times, with at least four different morals: (1) dogs use logic, so they are as clever as humans; (2) dogs use logic, so using logic is nothing special; (3) dogs reason well enough without logic; (4) dogs reason better for not having logic. This paper traces the history of Chrysippus's dog, from antiquity up to its discussion by relevance logicians in the twentieth century.
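The dog's inference is classical disjunctive syllogism: from $A \lor B$ and $\neg A$, conclude $B$. As a minimal illustration (my own sketch, not from the paper), a brute-force truth-table check confirms its classical validity — the very rule that relevance logicians, who take up the story in the twentieth century, famously reject:

```python
from itertools import product

def disjunctive_syllogism_valid():
    """Check that (A or B), not-A entails B in every classical valuation."""
    for a, b in product([True, False], repeat=2):
        premises = (a or b) and (not a)
        if premises and not b:
            return False  # a counterexample: premises true, conclusion false
    return True

print(disjunctive_syllogism_valid())  # True: no classical counterexample
```

In relevance logics such as R and E, the rule fails because the premise $\neg A$ need not be relevantly connected to the conclusion $B$.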
Our question is: can we embed minimal negation in implicative logics weaker than I→? Previous results show how to define minimal negation in the positive fragment of the relevance logic R and in contractionless intuitionistic logic. Is it possible to endow weaker positive logics with minimal negation? This paper proves that minimal negation can be embedded in even such a weak system as Anderson and Belnap’s minimal positive logic.
In this work we propose an encoding of Reiter’s Situation Calculus solution to the frame problem into the framework of a simple multimodal logic of actions. In particular we present the modal counterpart of the regression technique. This gives us a theorem proving method for a relevant fragment of our modal logic.
Section 1 reviews Strawson’s logic of presuppositions. Strawson’s justification is critiqued and a new justification proposed. Section 2 extends the logic of presuppositions to cases when the subject class is necessarily empty, such as (x)((Px & ~Px) → Qx). The strong similarity of the resulting logic with Richard Diaz’s truth-relevant logic is pointed out. Section 3 further extends the logic of presuppositions to sentences with many variables, and a certain valuation is proposed. It is noted that, given this valuation, Gödel’s sentence becomes neither true nor false. The similarity of this outcome with Goldstein and Gaifman’s solution of the Liar paradox, which is discussed in section 4, is emphasized. Section 5 returns to the definition of meaningfulness; the meaninglessness of certain sentences with empty subjects and of the Liar sentence is discussed. The objective of this paper is to show how all of the above-mentioned concepts are interrelated.
Desires matter. How are we to understand the intentionality of desire? According to the two classical views, desire is either a positive evaluation or a disposition to act: to desire a state is to positively evaluate it or to be disposed to act to realize it. This Ph.D. Dissertation examines these conceptions of desire and proposes a deontic alternative inspired by Meinong. On this view, desiring is representing a state of affairs as what ought to be or, if one prefers, as what should be. Desire involves a deontic manner of representing: a norm of the ought-to-be type features in desire’s intentional mode, as opposed to content. The dissertation is structured in three parts. In order to defend this conception, I formulate three main desiderata for a promising theory of the intentionality of desire in the introduction (§0). The first concerns desire’s direction of fit, i.e. the intuition that the world should conform to our desires. The second concerns the death of desire principle, i.e. the intuition that one cannot desire what one represents as actual. The last pertains to desire’s role in psychological explanations, i.e. the intuition that desires can explain some mental states and be explained by other mental states. The first part examines the main conceptions of desire in light of these desiderata. I argue that the classical pictures of desire do not adequately meet our desiderata. The first chapter is devoted to the evaluative conception (§1), while the second examines the motivational approach (§2). Following these criticisms, I then present the deontic view of desire (§3). In the second part, I defend this conception with the help of three arguments.
The main idea is that appealing to norms of the ought-to-be type can satisfy our chief desiderata: the world should conform to norms (world-to-mind direction of fit, §4), norms are grounded on values and in turn ground obligations (explanation, §5), and norms are about non-actual states of affairs (death of desire principle, §6). In the last part, I develop the deontic view to draw a cartography of the various types of desire. Some desires are correct, while others are inappropriate. This distinction is explained by the deontic conception, as it matches that between states of affairs that ought to obtain and states that should not obtain (§7). Two case studies are examined: caprice and the impermissibility of desire aggregation. Intuitively, hopes, wishes, or urges are types of desire. The next chapter presents a typology inspired by the deontic view and the types of norms there are (§8). The last chapter discusses the main objections to the deontic approach (§9). In conclusion, I show the relevance of the deontic view for several debates in philosophy of mind and ethics. Desires are crucial because they are the ‘eye’ of what should be.
Philosophers have spilled a lot of ink over the past few years exploring the nature and significance of grounding. Kit Fine has made several seminal contributions to this discussion, including an exact treatment of the formal features of grounding [Fine, 2012a]. He has specified a language in which grounding claims may be expressed, proposed a system of axioms which capture the relevant formal features, and offered a semantics which interprets the language. Unfortunately, the semantics Fine offers faces a number of problems. In this paper, I review the problems and offer an alternative that avoids them. I offer a semantics for the pure logic of ground that is motivated by ideas already present in the grounding literature, and for which a natural axiomatization capturing central formal features of grounding is sound and complete. I also show how the semantics I offer avoids the problems faced by Fine’s semantics.
Charles S. Peirce (1839-1914) made relevant contributions to deductive logic, but he was primarily interested in the logic of science, and more especially in what he called 'abduction' (as opposed to deduction and induction), which is the process whereby hypotheses are generated in order to explain surprising facts. Indeed, Peirce considered abduction to be at the heart not only of scientific research, but of all ordinary human activities. Nevertheless, in spite of Peirce's work and writings in the field of methodology of research, scant attention has been paid to the logic of discovery over the last hundred years, despite an impressive development not only of scientific research but also of logic. Having this in mind, the exposition is divided into five parts: 1) a brief presentation of Peirce, focusing on his work as a professional scientist; 2) an exposition of the classification of inferences by the young Peirce: deduction, induction and hypothesis; 3) a sketch of the notion of abduction in the mature Peirce; 4) an exposition of the logic of surprise; and finally, by way of conclusion, 5) a discussion of this peculiar ability of guessing understood as a rational instinct.
There is a natural story about what logic is that sees it as tied up with two operations: a ‘throw things into a bag’ operation and a ‘closure’ operation. In a pair of recent papers, Jc Beall has fleshed out the account of logic this leaves us with in more detail. Using Beall’s exposition as a guide, this paper points out some problems with taking the second operation to be closure in the usual sense. After pointing out these problems, I then turn to fixing them in a restricted case and modulo a few simplifying assumptions. In a follow-up paper, the simplifications and restrictions will be removed.
The theory of imperatives is philosophically relevant since, in building it, some long-standing problems need to be addressed, and presumably some new ones are waiting to be discovered. The relevance of the theory of imperatives for philosophical research is remarkable, but usually recognized only within the field of practical philosophy. Nevertheless, the emphasis can be put on problems of theoretical philosophy. Proper understanding of imperatives is likely to raise doubts about some of our deeply entrenched and tacit presumptions. In philosophy of language it is the presumption that declaratives provide the paradigm for sentence form; in philosophy of science it is the belief that theory construction is independent of language practice; in logic it is the conviction that logical meaning relations are constituted out of logical terminology; in ontology it is the view that language use is free from ontological commitments. The list is not exhaustive; it includes only those presumptions that this paper concerns.
The paper presents an exhaustive menu of nonmonotonic logics. The options are individuated in terms of the principles they reject. I locate, e.g., cumulative logics and relevance logics on this menu. I highlight some frequently neglected options, and I argue that these neglected options are particularly attractive for inferentialists.
Val Plumwood’s 1993 paper, “The politics of reason: towards a feminist logic” (henceforth POR) attempted to set the stage for what she hoped would begin serious feminist exploration into formal logic – not merely its historical abuses, but, more importantly, its potential uses. This work offers us: (1) a case for there being feminist logic; and (2) a sketch of what it should resemble. The former goal of Plumwood’s paper encourages feminist theorists to reject anti-logic feminist views. The paper’s latter aim is even more challenging. Plumwood’s critique of classical negation (and classical logic) as a logic of domination asks us to recognize that particular logical systems are weapons of oppression. Against anti-logic feminist theorists, Plumwood argues that there are other logics besides classical logic, such as relevant logics, which are suited for feminist theorizing. Some logics may oppress while others may liberate. We provide details about the sources and context for her rejection of classical logic and motivation for promoting relevant logics as feminist.
The anti-exceptionalist debate brought into play the problem of what the relevant data for logical theories are and how such data affect the validities accepted by a logical theory. In the present paper, I start from Laudan's reticulated model of science to analyze one aspect of this problem, namely the role of logical data within the process of revision of logical theories. For this, I argue that the ubiquitous nature of logical data is responsible for the proliferation of several distinct methodologies for logical theories. The resulting picture is coherent with the Laudanean view that agreement and disagreement between scientific theories take place at different levels. From this perspective, one is able to articulate other kinds of divergence that consider not only the inferential aspects of a given logical theory, but also the epistemic aims and the methodological choices that drive its development.