General Session

Titles and Abstracts

7th UNILOG'2022

KEYNOTE SPEAKERS

Jeonbuk National University, South Korea

"Implicational partial gaggles and their representations" (Joint work with the late J. M. Dunn)

This is a continuation of the paper “Implicational partial-Galois logics and their relational semantics,” in which we introduced implicational partial-Galois logics as special kinds of implicational tonoid logics. First note that implicational tonoid logics are logics combining two classes of generalized logics, one of which is the class of weakly implicative logics introduced by Cintula and the other of which is the class of gaggle logics introduced by Dunn. In this paper, we extend this investigation to representations.

First, as preliminaries we recall the implicational partial gaggle matrices, called “implicational partial-Galois matrices,” introduced in [1]. We then introduce their representations, which are based on labeling and tonic types. We in particular consider embeddability theorems for such matrices. Finally, we generalize these to assertional implicational partial-Galois algebras and then consider their corresponding representations.

REFERENCES

[1] E. Yang and J. M. Dunn, "Implicational Partial Galois Logics: Relational Semantics", Logica Universalis 15 (2021), pp. 457–476.


Université d’Artois and member of CRIL lab and CNRS, France

"Defeasible Description Logics"

Description Logics (DLs) are a family of logic-based knowledge representation formalisms with appealing computational properties and a variety of applications at the confluence of artificial intelligence, databases and other areas. In particular, DLs are well-suited for representing and reasoning about ontologies and therefore they stand as the formal foundations of the Semantic Web. The different DL formalisms that have been proposed in the literature provide us with a wide choice of constructors in the object language. Nevertheless, these are intended to represent only classical, unquestionable knowledge, and are therefore unable to express and cope with the different aspects of uncertainty and vagueness that often show up in everyday life.

Examples of these comprise the various guises of exceptions, typicality (and atypicality), approximations and many others, as usually encountered in the different forms of human quotidian reasoning. A similar argument can be put forward when moving to the level of entailment, that of the sanctioned conclusions from an ontology. DL systems provide for a variety of (standard and non-standard) reasoning services, but the underlying notion of entailment remains classical and therefore, depending on the application one has in mind, DLs inherit most of the criticisms raised in the development of the so-called non-classical logics. In this talk, I make a case for endowing DLs and their associated reasoning services with the ability to cope with defeasibility.

I start by introducing the notion of defeasible class subsumption, which allows for the specification of and reasoning about defeasible inheritance, and give it an intuitive semantics in terms of preference relations. Next I show how to take defeasibility to the level of entailment through the notion of rational closure of a defeasible ontology. Of particular interest is the fact that our constructions do not negatively affect decidability or complexity of reasoning for an important class of DLs. Finally, I show how our semantic definitions are fruitful in extending DLs with further defeasible constructs at the object level and in providing a notion of contextual defeasible subsumption. These allow for an even more fine-grained treatment of exceptions in reasoning with defeasible inheritance.

REFERENCES

[1] K. Britz, G. Casini, T. Meyer, K. Moodley, U. Sattler, I. Varzinczak, "Principles of KLM-style defeasible description logics", ACM Transactions on Computational Logic, 22 (1), (2021).

[2] I. Varzinczak, "Defeasible Description Logics", Künstliche Intelligenz, 34 (4), (2020): 539–542.


Ecole des Mines, Paris, France

"Logic and algebra in the exchanges between Bertrand Russell and Louis Couturat"

Russell proposes the "discovery of the fundamental ideas of Mathematics, and of the necessary judgments (axioms) that one must accept in reasoning about these ideas" (R 18.7.98). He is speaking here of the future The Principles of Mathematics, which would be published in 1903. This discovery is that of Logic, which he conceives from very heterogeneous elements: classical logic; the 1898 work of his master Whitehead, A Treatise on Universal Algebra; Frege's work on arithmetic; Peano's symbolism in the various editions of the Formulaire; Pieri for the hypothetico-deductive method; Weierstrass and Dedekind for rigor and the relations between number and magnitude; then Cantor on sets and transfinite numbers, and the role of the principle of induction in distinguishing finite numbers from infinite numbers. He would later say that logic is an experimental science whose object is mathematics (RMM 1906).

Couturat follows Russell in his construction of logic, but pulls it as far as possible towards algebra, in particular Boole's and Schröder's, to which, in the end, he surrenders, finding Russell's work made for "the angels" (on reading the first volume of Principia Mathematica, co-written with Alfred North Whitehead, 1913). Couturat remains divided between logic and algebra, because he moves from one to the other, from algebra to logic, while Russell builds the conditions of a conception, a design, of logic.

REFERENCES

Anne-Françoise Schmid (ed.), Bertrand Russell et Louis Couturat, Correspondance sur la philosophie, la logique et la politique (1897-1913), Paris, Kimé, 2001, 2 volumes.

Oliver Schlaudt and Anne-Françoise Schmid (eds.), Louis Couturat: The History of Modern Symbolic Logic and Other French Manuscripts, Cham, Switzerland, Springer Nature, 2021.

University of Sevilla, Spain

"The Logic of Medical-Veterinary Diagnosis"


My proposal consists in an application of the modern understanding of abductive reasoning [9] to Ancient Near East hippiatric diagnosis [2, 5, 8]. These ancient medical texts [1] use the same kind of reasoning as some modern medical diagnosis [3], namely abduction. Here, we understand abduction in terms of ignorance-preserving reasoning [10]. It is neither deduction nor induction. The hypothesis of an illness is conjectured by the physician, who ends up acting despite the lack of definitive confirmation, i.e. by means of a full abduction [1, 3].

After having explained the Akkadian medical diagnosis in terms of abductive inference, we will offer an analysis of Akkadian and Ugaritic hippiatry. We will analyse this logical relation in the text RS 17.120 (KTU 1.85) [6, 4] and in BAM 159 [7] by taking into account the contexts and three aspects: the linguistic conditional and the subjunctive inferential relation; the difference between the sign/symptoms and the illness; and the clear defeasibility of the relation between sign/symptoms and treatment.

After a formal analysis, we might see how the “explanatory” part of the abductive inference is completely absent in the Ugaritic texts. As the “explanatory” aspect of scientific reasoning, and concretely of abductive reasoning, has been considered a fundamental part, we might conclude that this diagnosis is not an abduction. Nevertheless, my point is precisely that this “explanatory” aspect is neither necessary nor sufficient in abduction. There is another part of the schema that we need to take into account as an important part, even when we deal with an “explanatory” abduction. This is exactly the part we have in Ugaritic and Akkadian hippiatry: the further action, which leads us to a practical reasoning.

REFERENCES

[1] C. Barés Gómez. Abduction in Akkadian medical diagnosis. Journal of Applied Logics - IfCoLog Journal of Logics and their Applications, 5(8): 1697–1722, 2018.

[2] C. Barés Gómez. Lógica, Conocimiento y Abducción, chapter Un análisis de la inferencia en la práctica médico-veterinaria antigua. Los textos hipiátricos de Ugarit, pp. 265–284. College Publications, 2021.

[3] C. Barés Gómez and M. Fontaine. Medical reasoning in public health emergencies: Below high standards of accuracy. Teorema, 40(1), 2021.

[4] P. Bordreuil and D. Pardee. Manuel d'Ougaritique, volume 2. Paul Geuthner, Paris, 2004.

[5] C. Cohen. The Ugaritic hippiatric texts and BAM 159. Journal of the Ancient Near Eastern Society, 15, pp. 1–12, 1983.

[6] C. Cohen and D. Sivan. The Ugaritic Hippiatric Texts. A Critical Edition. American Oriental Series, Essay 9, 1983.

[7] F. Köcher. Die babylonisch-assyrische Medizin in Texten und Untersuchungen. De Gruyter, 1963–80.

[8] D. Pardee. Trente ans de recherches sur les textes et les soins hippiatriques en langue ougaritique. Pallas 101: Trousse du vétérinaire dans l'antiquité et au moyen âge - instruments et pratiques, 2016.

[9] C.S. Peirce. Collected Papers of Charles Sanders Peirce. Harvard University Press, Cambridge, 1931-1958.

[10] J. Woods. Errors of Reasoning. Naturalizing the Logic of Inference. College Publications, London, 2013.


Blender Logic, USA

"Acquiring Immunity to Paradox"


Logic (overwhelmingly FOL) provides the “abstract” material used to build almost all of the world’s assertion-processing software (e.g., banking). Despite this great achievement, FOL limits software in domains where the information to be reasoned over does not fit classical constraints. For applied logic, the boundaries of classical relevance are most clearly seen in knowledge-sensitive domains (e.g., where information is incomplete or where background knowledge needs to be modified at the moment of interpretation). For pure logic, these boundaries are most clearly seen in paradoxical phenomena.

In this talk, beginning with the Liar and its strengthened versions, we examine a range of sentences that produce paradoxical results for classical logic, describe the practical relevance of these paradoxes, and show that the various non-classical ‘fixes’ discussed in the literature address the symptoms of the disease without providing immunity to the cause. For example, dialetheism, the view that there are true contradictions, addresses Liar symptoms by legitimizing them, i.e. allowing sentences to be true and false, without providing rules for determining the circumstances under which true contradictions should or should not be allowed.

Then we introduce a new kind of logic called LC (consistent with notions illuminated in the Tractatus) that provides computable rules for sorting wffs into genuine and pseudo propositions (with Liar-like wffs being pseudo propositions) and that is immune to the paradoxes caused by Liar- or Curry-like sentences. We apply LC to a variety of such sentences and real-world problems to show how LC is immune to paradox without in any way losing the beneficial attributes of classical logic or restricting non-classical reasoning about pseudo propositions.


REFERENCES

Thomsen, E. (2003): OLAP Solutions: Building Multidimensional Information Systems. John Wiley & Sons.

Thomsen, E. and Smith, B. (2018): Ontology-based Fusion of Sensor Data and Natural Language. Journal of Applied Ontology.

Barwise, J. and Etchemendy, J. (1989): The Liar: An Essay on Truth and Circularity. Oxford University Press.

Among others, see: Priest, G. (1979): The Logic of Paradox; Priest, G. (1981): The Logic of Paradox Revisited; Priest, G. (2000): Truth and Contradiction; Beall, Jc. (2009), Spandrels of Truth;

Thomsen, E. (2018): A Tractarian Resolution to the Ontological Argument. Journal of Applied Logic (for an informal treatment); Beverley, J., Thomsen, E. and Hull, D.: Acquiring Immunity to Paradox (in review; for a formal exposition).

Wittgenstein, L. (1973): Tractatus logico-philosophicus (1921). Madrid: Alianza



CONTRIBUTING SPEAKERS

CNRS – UMR 8011 Sciences, Normes, Décision

Paris, France

"Which logic for the self-ascription of attitudes?"


What should the semantics of ascriptions of the form “I φ that p” look like, where “I φ” plays the part of the main clause and “that p” that of the subordinate declarative clause providing the content of whatever is φ-ed?

The properties at stake are the doxastic positive introspective property for bimodal propositional logic and the positive introspective property for epistemic propositional logic holding in all Kripke structures. Are the axiom schemata of Kraus and Lehmann’s bimodal system KBCD (Kraus and Lehmann 1988) and of Voorbraak’s S5+KD45 system (Voorbraak 1992) legitimate? The systems have an advantage: they provide separate accessibility relations for individual knowledge and individual belief such that the bridge axiom schema holds. This yields a coherent picture of what the semantics could look like. The crux of the matter is whether these axiom schemata are psychologically and epistemically fit when mediate warrants must play an epistemic role, as is often the case with self-ascriptions.

A surveyability constraint might be imposed so that I successfully positively introspect whether I φ that p only provided my introspection doesn’t tolerate increases in length and complexity beyond a certain level, whether the appropriate mental states have a representational content or just phenomenological properties. In case an argument grounds the modal claim that I could be the bearer of an attitude when positive introspection fails to deliver a warrant, it will be inquired whether a bimodal system without a bridge axiom could provide separate accessibility relations for individual knowledge (Ki) and individual belief (Bi) while also correctly capturing both the failure and the modal claim.
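For reference, the schemata at issue can be written as follows. These are the standard textbook formulations of positive introspection and the bridge principle; the exact axiomatizations of Kraus and Lehmann's KBCD and Voorbraak's S5+KD45 may differ in detail:

```latex
% Positive introspection for knowledge (axiom 4):
K_i\varphi \rightarrow K_i K_i\varphi
% Positive introspection for belief:
B_i\varphi \rightarrow B_i B_i\varphi
% Bridge axiom linking knowledge and belief:
K_i\varphi \rightarrow B_i\varphi
```

Semantically, each introspection schema corresponds to transitivity of the associated accessibility relation in the Kripke structures, while the bridge axiom relates the two relations to each other.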

REFERENCES

S. Kraus and D. Lehmann, "Knowledge, Belief and Time", Theoretical Computer Science, vol. 58 (1988), pp. 155–174.

F. Voorbraak, "Generalized Kripke Models for Epistemic Logic", Morgan Kaufmann Publishers, San Mateo, 1992, pp. 214–228.


University of St Andrews, Scotland

"Consequence and Signification in Fourteenth-Century Logic"


Forty years ago, Niels Green-Pedersen [1] listed five different accounts of valid consequence, variously promoted by logicians in the early fourteenth century and discussed by Niels Drukken of Denmark in his commentary on Aristotle's Prior Analytics, written in Paris in the late 1330s. Two of these arguably fail to give defining conditions: truth preservation was shown by Buridan and others to be at best necessary but not sufficient; incompatibility of the opposite of the conclusion with the premises is merely circular if incompatibility is analysed in terms of consequence. Buridan was perhaps the first to define consequence in terms of preservation of what Spade [2, p.81] called firmness, that is, signifying as things are. Swyneshed pinpointed a sophism which threatens to undermine this analysis. Bradwardine turned it around: he suggested that a necessary condition on consequence was that the premises signify everything the conclusion signifies. Dumbleton gave counterexamples to Bradwardine's suggestion where the conclusion arguably signifies more than, or even completely differently from, the premises. Yet a long-standing tradition held that some species of validity depend on the conclusion being in some way contained in the premises (see [3]). What, if anything, does signification have to do with consequence?

REFERENCES

1. Niels Green-Pedersen, "Nicholas Drukken of Dacia's Commentary on the Prior Analytics, with special regard to the theory of consequences", Cahiers de l'Institut du Moyen Age Grec et Latin, 37, 1981, pp. 42-79.

2. Paul Vincent Spade, William Heytesbury, On Insoluble Sentences, Pontifical Institute of Mediaeval Studies, Toronto, 1979.

3. Catarina Dutilh Novaes, "Medieval Theories of Consequence", The Stanford Encyclopedia of Philosophy (Fall 2020 Edition), Edward N. Zalta (ed.).

University of Leipzig, Germany

"Diagrammatic Reasoning in Euclid"


As the paradigm of ancient mathematical practice, the Euclidean demonstration was for the longest time read as an axiomatic system. This paper will critically reflect on recent attempts to describe the Elements as instead presenting a system of natural deduction. According to this new reading, Euclid’s procedure does not employ Common Notions, Postulates, and Definitions as premises on the basis of which theorems are proven, but instead provides the rules of reasoning. Euclidean demonstrations are here understood as not merely diagram-based, but as inherently diagrammatic. The diagrams are thus read as possessing non-natural meaning and as essentially general. Euclidean demonstrations are furthermore claimed to extend our knowledge by virtue of being, in Kant’s words, synthetic a priori. I will instead consider Poincaré’s and Schröder’s understanding of the axioms as hidden definitions; Hilbert’s understanding of axiomatic systems as ordering systems that justify axioms internally, rather than primarily in terms of rules of inference; and lastly Pieri’s understanding of elementary geometry as hypothetico-deductive. I will argue that while these earlier views concur that the axioms do not provide the starting points of reasoning, they possess certain explanatory advantages over the recent reading, most notably that they can accommodate variance of mathematical systems, which the natural-deduction reading of Euclid’s demonstrations cannot.

REFERENCES

[1] Hilbert, D.: The foundations of mathematics, translated by Stephan Bauer-Menglerberg and Dagfinn Føllesdal. In: van Heijenoort, J. (1967). From Frege to Gödel: A Source Book in Mathematical Logic, 1879–1931, pp. 464–479. Harvard University Press, Cambridge (1927)

[2] Lachtermann, D. R.: The Ethics of Geometry: A Genealogy of Modernity. Routledge, New York and London (1989)

[3] Macbeth, D.: Diagrammatic Reasoning in Euclid’s Elements. In: van Kerkhove, B., van Bendegem, J. P., de Vuyst, J. (eds.) Philosophical Perspectives on Mathematical Practice 12, pp. 235-267. College Publications, London (2010)

[4] Manders, K.: The Euclidean Diagram. In: Mancosu, P. (ed.) The Philosophy of mathematical practice, pp. 80-133. Oxford University Press (2004)

[5] Pieri, M.: I Principii della geometria di posizione composti in sistema logico deduttivo, Memorie della Reale Accademia delle Scienze di Torino (Series 2), 48. pp. 1–62. (1898)

[6] Poincaré, H.: On the Foundations of Geometry. The Monist, 9, pp. 1–43. (1898)

[7] Russell, B.: Mathematical Logic as Based on the Theory of Types. In: Logic and Knowledge: Essays 1901-50. Ed. by R. C. Marsh. Routledge, London and New York (1956)

[8] Schröder, E.: Algebra der Logik. Vols. I–III. reprint Chelsea (1966)

[9] Stein, H.: Logos, Logic, and Logistiké: Some Philosophical Remarks on the Nineteenth Century Transformation of Mathematics. In: Aspray, W., Kitcher, P. (eds.) History and Philosophy of Mathematics. Vol. XI. Minnesota Studies in the Philosophy of Science. University of Minnesota Press, Minneapolis, MN (1988)

[10] Tennant, N.: The withering away of formal semantics? Mind and Language, 1, pp. 302-318 (1986)

Université Paris 8 Vincennes-Saint-Denis, LPPC, France

"Logical Bases of Modal Navigation"


From the age of 4, a human being has prodigious skills in modal navigation, with the ability to extract himself/herself from the present to travel at high speed in universes of possible worlds: mental time travel, fictional navigation (games, novels, movies, virtual reality…), epistemic navigation (mind-reading of other people's worlds), mind-wandering and fantasy life, esthetic travel (e.g., through pictorial or musical works), spiritual travel, and, last but not least, logical travel (reasoning generally involves considering multiple alternatives).

However, while powerful formal tools, such as possible-worlds semantics, have long been available to deal with inferences in modal logic, the logical bases of modal navigation remain little discussed.

In order to advance in this direction, we will first specify the representational tools necessary for any modal navigation, based on the conceptual framework of representational spaces, where a (universe of) world(s) is constituted from fragments of presence and links that unify these fragments (just as hypertext links associate pages on the World Wide Web) [1]. The operation of modal abstraction will be introduced as allowing one to modalize (and more generally to contextualize) a content fragment as belonging to a specific world (or, more generally, to a context), with marking by a symbol that refers to this world [2].

We will then specify the logical conditions for an adequate representation of the universes of possible worlds in modal navigation and will propose a quantified modal language using terms of possible worlds. We will show that it is possible to account for navigational dynamics between worlds — with the inter-world tracking necessary for safe high-speed navigation — by implementing the world terms as nodes in an activation network [3].

Finally, we will specify how modal navigation can serve as a paradigm for the general study of mental navigation, as a possible world can be considered a specific case of a context.

REFERENCES

[1] Plagnol, A.: Espaces de représentation : théorie élémentaire et psychopathologie. Editions du CNRS, Paris (2004)

[2] Plagnol, A.: "Logic and theory of representation". In: Beziau, J.-Y, Desclés, J.-P., Moktefi, A., Pascu, A-C. (eds.), Logic in question: Talks From The Annual Sorbonne Logic Workshop (2011-2019). Springer, Cham (in press)

[3] Plagnol, A. : Principes de navigation dans les mondes possibles — Tome 1 : Fondations. Editions Terra Cotta, Garches (2019)

University of Namur, Belgium

"The Ways of Partiality and Paraconsistency"


This talk aims at identifying some of the ways in which partial and paraconsistent logics violate the law of excluded middle and the law of non-contradiction respectively. In this connection, three many-valued logics and three constructive logics are addressed. Among the many-valued logics, we consider Kleene's strong three-valued logic, Priest's logic of paradox, and Dunn-Belnap's four-valued logic.

Among the constructive logics, we investigate intuitionistic logic, dual-intuitionistic logic, and bi-intuitionistic logic. To propose a unified understanding of these partial and paraconsistent logics, two steps mark out this talk. First, we define a four-valued extension of the temporal logic KtT4, which is the modal logic obtained from the minimal temporal logic Kt by requiring the accessibility relation to be reflexive (which corresponds to the axiom T) and transitive (which corresponds to the axiom 4). A relational semantics and a labelled sequent calculus are set out. Note that this labelled sequent calculus is based on an internalisation of the relational semantics of KtT4 into a four-sided sequent calculus. Second, we show some useful properties of this four-valued extension of KtT4.
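Written with the future-necessity operator G and the past-necessity operator H of Kt, the two frame conditions just mentioned correspond, in one standard presentation, to the axiom schemata:

```latex
% T (reflexivity of the accessibility relation):
G\varphi \rightarrow \varphi \qquad H\varphi \rightarrow \varphi
% 4 (transitivity):
G\varphi \rightarrow GG\varphi \qquad H\varphi \rightarrow HH\varphi
```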

We start by pointing out that bi-intuitionistic logic and Dunn-Belnap's four-valued logic can be faithfully embedded into that logic. Then, different forms of the original cut rule are shown to be admissible in the sequent calculus. Finally, we argue that this extension satisfies several duality properties that provide a fresh insight into the relationship between partiality and paraconsistency.

School of Technology & Computer Science, Tata Institute of Fundamental Research, Mumbai, India

"Voting Theory and Semantics of Computation"


We examine interconnections between voting theory, which studies individual and collective social choice and the aggregation of choices, and the theory of formal program logics in computer science. It is well known that the quest to characterize sequential algorithms using denotational semantics of the λ-calculus led to the notion of 'stability.' It was erroneously believed that stability captures the notion of 'sequentiality,' until the discovery of a stable function which is not sequential. In voting theory, the process of aggregating individual choices in order to determine a collective choice which satisfies necessary requirements leads to an impossibility result. We show that any aggregation of choices which gives rise to an impossibility also leads to a unique function satisfying the sequentiality conditions of formal program logics.

REFERENCES

[1] A.K.Sen: Collective Choice and Social Welfare, Oliver & Boyd (1970).

[2] K.Suzumura: Rational Choice, Collective Decisions and Social Welfare, Cambridge University Press (1983).

[3] G.Winskel: The Formal Semantics of Programming Languages, MIT Press (1993).


Palacký University Olomouc, Czech Republic

"Specification of Tenses in Tichý’s Transparent Intensional Logic and Prior’s Temporal Logic"


Pavel Tichý stressed in his paper ‘The Logic of Temporal Discourse’ that contemporary systems of temporal logic lacked the ability to properly formalise temporal discourse. The problematic features of temporal discourse that Tichý pointed out are, for instance, the difference between Past Simple and Present Perfect. In the paper, he proposed the differentiation between these in his system of logic, Transparent Intensional Logic (TIL).

When Tichý discussed contemporary temporal systems of logic, he could not omit the systems of Arthur N. Prior, who introduced modern temporal logic. Therefore, his critique was also targeted at Prior. Tichý considered, however, only Prior’s standard temporal systems of logic.

Patrick Blackburn pointed out that Prior postulated hybrid systems of logic. These systems have greater expressive power than standard temporal systems of logic. Therefore, they could be used for specifications of tenses that Tichý found troublesome. The aim of my talk is to demonstrate that even though Prior’s formalisation is by no means as detailed as Tichý’s, it is able to capture the basic difference between these two tenses.

REFERENCES

P.Blackburn, "Arthur Prior and Hybrid Logic", Synthese 150, 329–372 (2006)

P.Blackburn and K.F. Jørgensen, "Reichenbach, Prior and Hybrid Tense Logic", Synthese 193, 3677–3689 (2016)

R.Bäuerle, "Tense Logics and Natural Language", Synthese 40, 225–230 (1979)

M.Duží, B.Jespersen and P.Materna, Procedural Semantics for Hyperintensional Logic. Springer Verlag, Berlin (2010)

S.T.Kuhn and P. Portner, "Tense and Time". In: Gabbay, D.M. Guenthner F. (eds.), Handbook of Philosophical Logic, vol 7. pp. 277–346. Springer Verlag, Dordrecht (2002)

A.N.Prior, Past, Present and Future. Clarendon Press, Oxford (1967)

A.N.Prior, "Tense Logic and the Logic of Earlier and Later". In: Hasle, P., Øhrstrøm, P., Bräuner, T., and Copeland, B. J., (eds.). Papers on Time and Tense, pp. 117–138. Oxford University Press, Oxford (2003)

A.N.Prior, "Now". In: Hasle, P., Øhrstrøm, P., Bräuner, T., and Copeland, B. J., (eds.). Papers on Time and Tense, pp. 171–193. Oxford University Press, Oxford (2003)

P.Tichý, "The Logic of Temporal Discourse", Linguistics and Philosophy 3(3), 343–369, (1980)

IUSS - School for Advanced Studies, Pavia, Italy

"Epistemic Overdetermination Strikes Back. A New Hope for the A Priori"


Recent work in the epistemology of logic has taken an empiricist turn in the understanding of its subject matter. Two key Quinean intuitions have been embraced by logical anti-exceptionalists: that

i) logical theorizing is non-exceptional – not special – and contiguous to science;

ii) logical theories are justified and adopted via abductive methodology (e.g., inference to the best explanation), which in turn is based on scientifically constrained premises that concern theoretical virtues.

I argue that an indispensable feature of this methodology cannot be coherently accounted for by anti-exceptionalists: epistemic overdetermination, i.e., the phenomenon whereby the same logical claim can be justified on the basis of multiple sources of evidence, including both a priori and a posteriori ones. I contend that logical anti-exceptionalists should either revise their epistemic framework to allow for a defeasible notion of a priori justification or avoid relying on a priori evidence altogether.

However, each of these strategies has its own drawbacks: anti-exceptionalists about logic can account for epistemic overdetermination – and, consequently, for an empiricist epistemology of logic – only if they either recognize that logic can be a priori justified, or they admit the implausible claim that some crucial criteria underlying the inference to the best explanation should be abandoned, e.g., the adequacy to the data criterion.

REFERENCES

1. A.Casullo, "Epistemic Overdetermination and A Priori Justification". Philosophical Perspectives. 19, 41-58 (2005)

2. A.Goldman, "Philosophical intuitions: Their target, their source, and their epistemic status". Grazer Philosophische Studien. 74(1):1-26 (2007)

3. O.Hjortland, "Anti-exceptionalism about Logic". Philosophical Studies. 174 (3): 631–658 (2017)

4. P.Maddy, Second Philosophy: A Naturalistic Method. Oxford University Press, Oxford (2007)

5. B.Martin, "Identifying Logical Evidence". Synthese 198: 9069-9095 (2021).

6. J.Peregrin, and P. Svoboda, "Moderate anti-exceptionalist and earthborn logic". Synthese 199: 8781-8806 (2021)

7. G.Priest, "Logical Disputes and the A Priori". Logique et Analyse. 59 (236): 347-366 (2016)

8. G.Russell, "Metaphysical analyticity and the epistemology of logic". Philosophical Studies. 171(1):161-175 (2014)

9. J.Shieber, "A partial defense of intuitions on naturalist grounds". Synthese. 187(2):321-341 (2012)

10. T.Williamson, "Semantic Paradoxes and Abductive Methodology". In B.Armour-Garb (ed.), Reflections on the Liar, pp. 325-346. Oxford University Press, Oxford (2017)

Department of Philosophy

University of Costa Rica, Costa Rica

"Singular propositions and Aristotelian (assertoric) syllogistic theory"

As widely recognized, one of the important problems surrounding Aristotelian (assertoric) syllogistic theory concerns the logical role that singular propositions might play in this theoretical framework. This presentation will initially focus on this problem.

We’ll show first that Aristotle’s work doesn’t provide an unambiguous answer to the problem. Then, we’ll consider post-Aristotelian solutions, which assimilate singular propositions to categorical propositions. Although these solutions partially hit the target, we’ll see that they lack a full semantic grounding. This grounding is required to constitute philosophically adequate elucidations of the issue.

An attempt to fill the above semantic gap might be conducted along the lines of Nino Cocchiarella’s interpretation of singular propositions. His theory constitutes a sortalist approach to proper names and would provide the semantic foundation lacking in the post-Aristotelian proposals. However, as we’ll point out, a Cocchiarellan-inspired solution wouldn’t conform to the Aristotelian truth conditions for singular propositions. Moreover, this solution and, in general, any attempt to interpret singular propositions as categorical might conflict with Aristotle’s view of universals.

The above difficulty leads us to explore the alternative of extending Aristotle’s syllogistic to singular propositions, instead of attempting to assimilate them to categorical propositions. For this purpose, we’ll propose two formal sortal logics that will capture such a syllogistic theory as extended to singular propositions. One of the systems is an axiomatic system, and the other a natural deduction system. We’ll show that Aristotelian intuitions ground both systems.

Université Catholique de Louvain-la-Neuve, Belgium

"Relevant Tonk"


Prior (1960) showed how adding tonk-introduction and tonk-elimination to a non-trivial transitive logic results in a trivial logic. In reaction, Belnap (1962) set up an influential doctrine according to which inference rules can define a new logical constant relative to a background logic L if they are conservative and uniquely defining relative to L. Building on recent non-transitive approaches to relevant entailment (Tennant 2017; Verdée, De Bal and Samonek 2019), we identify a non-transitive and non-reflexive logic relative to which Belnap's two conditions are met for the tonk rules. Yet it can be shown that extending the non-transitive relevant core of classical logic by the tonk rules yields a logic that is non-reflexive, non-transitive, but, more importantly, not truth-preserving in general. Thus although the tonk rules satisfy the requirements of Belnap's doctrine relative to this context of relevant deducibility, they still display a deviant logical behavior, by turning a truth-preserving relevant logic into a non-truth-preserving one. If the trivialization argument offered by Prior is sufficient to conclude that the tonk rules fail to define a new connective in a transitive context of deducibility, then this loss of truth-preservation should also be sufficient to claim that the tonk rules fail to define a new connective in this context of deducibility relative to which Belnap's existence and uniqueness conditions are met. We take this as evidence against the Belnap doctrine, which needs to be rejected or at least amended.
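Prior's rules for tonk, in their usual natural-deduction form, graft the introduction rule of disjunction onto the elimination rule of conjunction:

```latex
\frac{A}{A \;\mathrm{tonk}\; B}\;(\text{tonk-I})
\qquad\qquad
\frac{A \;\mathrm{tonk}\; B}{B}\;(\text{tonk-E})
```

In any transitive consequence relation, chaining tonk-I and tonk-E yields A ⊢ B for arbitrary A and B; this is the trivialization that a non-transitive context of deducibility blocks.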

REFERENCES

N. Belnap, "Tonk, plonk and plink", Analysis 22(6):130-134 (1962)

N. Tennant, Core Logic, Oxford University Press (2017)

P. Verdée, I. De Bal and A. Samonek, "A non-transitive relevant implication corresponding to classical consequence", Australasian Journal of Logic 16(2):10-40 (2019)

University of Torino, Italy

"An Abstract Theory of Definitions with an Application to Self-referential Truth"


Tarski [3] emphasised the "considerable analogies" we can observe between some concepts related to the notion of "inference" and corresponding concepts belonging to the notion of "defining". Some years before, Tarski [2] had distilled the mathematical features of inference within the framework of an abstract notion of "logical consequence". Since then, the study of abstract consequence relations, motivated by their logical interpretations, has developed into a mature field of research (see, for instance, Martin & Pollard [1]); however, no analogous development for the abstract features of definitions has been recorded. I will offer a formalisation of an abstract theory of definitions within the same framework used for abstract consequence relations. More precisely, I will show that it is possible to express in terms of a given consequence relation some of the concepts (such as conservativeness, relative consistency, eliminability, etc.) that are involved in formulating the classical theory of definitions for first-order languages, and that some results about the mutual relationships between these concepts can be established in this abstract setting. From the outlined theory I will generalise an abstract theory of partial definitions, namely, a theory of those sets of axioms which act as a definition for only a fragment of the entire language. As an application, this latter theory will be used to develop an abstract theory of self-referential truth.

REFERENCES

[1] N. M. Martin & S. Pollard: Closure Spaces and Logic, Kluwer Academic Publishers (1996).

[2] A. Tarski, Über einige fundamentale Begriffe der Metamathematik, Comptes Rendus des séances de la Société des Sciences et des Lettres de Varsovie 23, cl. III, 22-29 (1930).

[3] A. Tarski, Einige methodologische Untersuchungen über die Definierbarkeit der Begriffe, Erkenntnis 5, 80-100 (1935).

University of Pécs, Hungary

"Amalgamation and Densification in Classes of Involutive Commutative Residuated Lattices"


The amalgamation property is quite rare for general varieties. Its variants are closely related to various syntactic interpolation properties of substructural logics; hence its investigation in varieties of residuated lattices is of particular interest. In this talk, the amalgamation property is investigated in some classes of non-divisible, non-integral, and non-idempotent involutive commutative residuated lattices. We shall prove that the classes of odd and even totally ordered, involutive, commutative residuated lattices fail the amalgamation property, and that their subclasses formed by their idempotent-symmetric algebras have the amalgamation property. Finally, it is shown that the variety of semilinear, idempotent-symmetric, odd, involutive, commutative residuated chains has the amalgamation property, and hence also the transferable injections property. A variety V admits densification if every chain in V can be embedded into a dense chain in V. Densification is a key component in proving standard completeness of a fuzzy logic. We prove in an algebraic manner that the variety of semilinear, odd, involutive, commutative residuated lattices and the variety of semilinear, idempotent-symmetric, odd, involutive, commutative residuated lattices admit densification. In residuated lattices, the amalgamation property has been analyzed mostly in varieties in which the algebras are linear or semilinear (i.e., subdirect products of linearly ordered ones), or conic or semiconic. In addition, the classes investigated in the literature have mostly been either divisible and integral, or idempotent. The scope of the present talk is to investigate the amalgamation property in some varieties of residuated lattices which are neither divisible nor integral nor idempotent.

REFERENCE

[1] S. Jenei, "Amalgamation and densification in classes of involutive commutative residuated lattices", arXiv preprint (2020), https://arxiv.org/abs/2012.14181

Supported by the MEC_R 140883 project of the National Research, Development and Innovation Office, Hungary

LanCog, Centre of Philosophy of the University of Lisbon Portugal

"The Overdetermined Future"


One area in which philosophers have traditionally thought it might be reasonable to appeal to truth-value gaps is that of future contingents. Given that the future is open or unsettled, statements about contingent aspects of the future can plausibly be taken to be neither true nor false. In this context, JC Beall (2012) made an interesting proposal, claiming that, in order to capture the openness or unsettledness of the future, truth-value gluts would work as well as gaps. On such a view, the future would be overdetermined, rather than underdetermined. There would be too many truths about the future, with some contradicting others. Beall sketched a semantics for a logic with future contradictions, according to which the present, however, would be fully consistent. Contradictory future times would have to somehow be 'consistentized' before becoming present. It is not my intention to defend this view of time, but I do think that Beall has a point here, and one that is worth developing and exploring further. I shall try to develop the view in two ways, by providing the view with a metaphysical basis and by suggesting an enrichment of the models used in the semantics.

REFERENCE

JC Beall, "Future Contradictions", Australasian Journal of Philosophy 90, 547-557 (2012)

University of Milan (Università degli Studi di Milano), Italy

"Peirce’s Regenerated Logic: the Dismissal of Substance and Being"


The paper aims to clarify why Charles S. Peirce, after his research on the logic of relatives and quantification theory (1870-1897), eventually abandoned the categories of ‘substance’ and ‘being’, widely adopted from Aristotle onwards, at least in Indo-European logic. To reach this goal, the paper focuses on the evolution of Peirce’s theory of the proposition, which is representative of one of the most characteristic features of his thought: the interconnection of logic, semiotics and language. Those two categories disappear from Peirce’s writings from the mid-1890s onwards, and his logical account in those years of ‘subject’ and ‘copula’ (usually associated with the categories of ‘substance’ and ‘being’) may shed some new light on why he omits ‘substance’ and ‘being’ in favor of just three categories (Quality/Firstness, Relation/Secondness, Representation/Thirdness). The paper is divided into three sections. The first section summarizes Peirce’s early account of both categories and the proposition. The second focuses on his semiotic and logical interpretation of the proposition, provided after his reflection upon the logic of relatives. According to Peirce’s view, the copula is no longer a necessary part of the proposition, and the subject should not be confused with the grammatical subject. Finally, the third section reconnects the results that Peirce reached at the propositional level to his dismissal of the categories of ‘substance’ and ‘being’.

Serge Robert

Department of Philosophy, Université du Québec à Montréal, Montréal, Québec, Canada

"The Cognitive Foundations of Logical Connectives and Fuzzy Reasoning Contexts"


The underlying thesis behind this contribution is that logical connectives are not arbitrary constructions and that they express specific cognitive procedures that we tend to use in the treatment of information.

In order to study the cognitive functions of connectives, we suggest a metalogical investigation of fuzzy connectives. Fuzzy logics have developed looser definitions of the connectives, so that they can represent more flexible procedures of reasoning than the ones classical logic offers. This raises the question of how far such flexibility can be taken. In other words, which formal properties can a connective lose and still be the same connective? How can we distinguish between the essential and the dispensable properties of a connective? For example, would a non-commutative disjunction still be a disjunction; would it still play the cognitive role of a disjunction? The idea is that, in deciding about the properties of connectives, logicians are implicitly guided by the cognitive function we tend to attribute to the different connectives. So, by trying to answer the previous questions about connectives, this investigation will help to shed some light on their specific cognitive functions.

There are many different fuzzy logics, defining connectives in various ways. We will look at some of them (Zadeh, Łukasiewicz, Kleene, Reichenbach, Gödel, Goguen…) and see whether they preserve various classical properties of the connectives. Looking at these fuzzy logics, we will see how monotonicity can be loosened and what can happen to the idempotence, commutativity, associativity, distributivity and duality properties of connectives like conjunction and disjunction. We will also look at the conditional connective, how it can be weakened so as to avoid properties like reflexivity or contraposition, and how it deals with other principles like Ex falso sequitur quodlibet.

Through such an investigation of fuzzy connectives, we will draw some conclusions about the essential and dispensable properties of connectives (like negation, conjunction, disjunction, conditional and biconditional) and about the function of each of them in our cognitive activity of reasoning. This will also lead to insights about possible new non-classical logics.

REFERENCES

1. Klir, G. and Yuan, B. (1995) Fuzzy Sets and Fuzzy Logic: Theory and Applications, Englewood Cliffs, N.J., Prentice-Hall.

2. Negnevitsky, M. (2011) Artificial Intelligence, Addison Wesley, 3rd ed., ch. 4.


Istanbul Technical University, Turkey

"Breaking the Spell of Existence"


Following Russell and Quine, the “current orthodoxy” (as Graham Priest calls it) is inclined to interpret the particular quantifier ‘some’ as ‘there exists’, and to assume that the domain of quantification comprises only existent objects: existent in reality. In my talk, I will argue that the current take on existence is rooted in some wrong-headed conceptions of logical validity. I will first distinguish between three different conceptions of logical validity: derivational, formal and model-theoretic. Then, I will explain why the derivational and the formal conceptions of validity might lead someone to believe that the domain of discourse should include only existent objects. Finally, I will argue that in order to overcome the “current orthodoxy”, we do not need to revise the rules that standardly govern our understanding of the existential quantifier; rather, we need a better understanding of the role that existence actually plays in model-theoretic semantics.

Vangelis Triantafyllou

University of Ioannina, Greece

"Ancient Logic as a Form of Geometry"


Our subject is Aristotle’s logic, as presented in the Prior Analytics, and the investigation of the hypothesis that logical symbolism and methodology were, in these early stages, of a geometrical nature, the gradual algebraization that occurred historically being one of the main reasons that some of the earlier passages on logic often appear enigmatic, since their original purpose was to be treated as text that accompanies certain diagrams and offers commentary on them. After examining passages from ancient texts that can be taken as revealing the extent to which diagrammatic methods were seen as applicable to various fields of study, we outline certain characteristics of the Aristotelian text that can be seen as indicating the use of such methods in the study of logic. We then move on to offer an interpretation of syllogisms, as well as of the main proof methods utilized by Aristotle, as specific forms of operations performed on certain diagrams, operations that can be seen as analogous to the constructions that occur in Euclid’s Elements. Finally, we present a diagrammatic decision procedure for the syllogistic.

REFERENCES

[1] Corcoran, J.: Completeness of an ancient logic. The Journal of Symbolic Logic, Vol. 37, No. 4, pp.696-702 (1972)

[2] Heath, T. L.: Mathematics in Aristotle. Oxford at the Clarendon Press (1949)

[3] Heath, T. L.: The Thirteen Books of Euclid’s Elements. Cambridge: at the University Press (1968)

[4] Ross, D.: Aristotle’s Prior and Posterior Analytics, a revised text with Introduction and Commentary. Oxford at the Clarendon Press (1957)

[5] Smiley, T.: What is a syllogism? Journal of Philosophical Logic, 2 (1): pp. 136-154 (1973)

Department of Mathematics and Statistics

Idaho State University, Pocatello, ID, USA

"Reconstructing Constructivism"

Is there a middle ground between Constructive Mathematics [B] and what might be called Default Mathematics? Both use First Order Logic and Zermelo–Fraenkel Set Theory with the Axiom of Choice (ZFC) to prove theorems, but constructivists avoid infinite processes. The difference is highlighted by the standard proofs of the Bolzano–Weierstrass Theorem, which asserts that a bounded sequence has a convergent subsequence. One step of the proof uses First Order Logic and ZFC to show that when a set of infinite cardinality is partitioned into two disjoint subsets, then at least one of the subsets has infinite cardinality. But constructivists balk at the next step, which is to choose a subset of infinite cardinality, because that requires complete knowledge of at least one subset. Constructivists call this "omniscience." So, while default mathematics accepts the theorem that a bounded random sequence has a convergent subsequence, constructivist mathematics may not.

Identifying randomness with high Kolmogorov complexity leads to the idea that the theorem remains constructively true in the case of a sequence of low complexity. For example, the evidently low-complexity sequence sin(n), where n is a natural number, is difficult to compute, but enough of its behavior is understood to decide which subset(s) have infinite cardinality when the interval [-1, 1] is repeatedly bisected.
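The bisection argument just described can be sketched computationally. The following is a minimal illustration, not taken from the abstract: `bisect_subsequence` is a hypothetical name, and the finite counts over a sampled prefix only approximate the infinite-cardinality decision that constructivists object to.

```python
import math

def bisect_subsequence(seq, lo=-1.0, hi=1.0, depth=10):
    """Repeatedly bisect [lo, hi], keeping a half that still contains
    'many' terms of the sequence -- a finite proxy for the
    infinite-cardinality test in the Bolzano-Weierstrass proof.
    The nested intervals close in on a limit point of the sequence."""
    for _ in range(depth):
        mid = (lo + hi) / 2.0
        in_left = sum(1 for x in seq if lo <= x < mid)
        in_right = sum(1 for x in seq if mid <= x <= hi)
        # Choosing the fuller half is exactly the step where
        # constructivists would demand "omniscience" in general.
        if in_left >= in_right:
            hi = mid
        else:
            lo = mid
    return lo, hi

# The bounded, low-complexity sequence sin(1), sin(2), ..., sin(10000)
seq = [math.sin(n) for n in range(1, 10001)]
lo, hi = bisect_subsequence(seq)
# After 10 bisections the surviving interval has width 2/2**10.
```

For sin(n), equidistribution guarantees that every kept half really does contain infinitely many terms, which is what makes the decision constructively available for this low-complexity sequence.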

When a mathematician imagines a sequence a_n, there is an implicit assumption that some formula or algorithm (e.g., a recurrence relation) is present; that would constitute low complexity. Alternatively, in Signal Processing this is described as the data being "sparse." [D] This appears to be a natural psychological hypothesis, since much of mathematics was developed to describe natural processes, which lead to sparse sequences and signals even if the underlying rule is not known. In other words, much of mathematics was developed under an unacknowledged hypothesis of low complexity.

This paper posits low complexity as an explicit hypothesis and tries to make the relationship between low complexity and theorems explicit. The additional hypothesis creates a middle ground between constructive and default mathematics in which theorems about random signals or data remain constructively valid as long as the signal is "not too random."

REFERENCES

[B] Bishop, Errett, Constructive Analysis. NY: Wiley (1958).

[D] Donoho, D., "Compressed Sensing," IEEE Transactions on Information Theory, 52 (April 2006), no. 3, 1289–1306.

ICAM Paris – Center for Ethics, Technology and Society (CETS)

School for Advanced Studies in Social Sciences (EHESS - GSPR)

"Elements of Special and General Dialectics"


Dialectics can be understood in a classical way as a rational method of discussion and argumentation that makes it possible to justify a position which is not demonstrably true, but only probable. Unlike logic, where it is permitted to conclude a reasoning positively by means of a demonstration, the only proof in the strict sense within its reach is negative, when a discourse is caught in a contradiction. It remains that, in dialectics, the dynamics of change or evolution of a position can be understood in a dialogical sense (Plato, Aristotle) or in a non-dialogical sense (Hegel, Marx). In this latter version, the dialectic is only the mirror of the process of reality itself, that of nature and of spirit, and it is presented as a necessary chain which the (often caricatured) triad 'Thesis-Antithesis-Synthesis' strives to account for. Certain developments in dialectics (Lupasco, Walton, Van Eemeren, Grootendorst) make it possible to relativize the conflict between its dialogical and its non-dialogical versions and, on this basis, to give it an overall unity. It is well known that dialectics is not conventionally divided according to a coherent construction, as may be the case in logic, where it is customary to distinguish several stages (logic of propositions, predicates, relations, modalities) and several sectors (semantics, syntax, pragmatics). It can be suggested that, analogously to logic, dialectics can be divided into several stages (dialectics of positions, arguments, contradictions and changes). It can also be divided, by derivation from the classical distinction between thesis, antithesis and synthesis, into several sectors (topics, agonics and harmonics).
Thus, we can assume, in the wake of a tradition that goes back to Aristotle and was enriched by Hegel, that dialectics comprises three parts: a topics, which relates to the arguments put forward in support of a position (the 'Pro' model); an agonics, which relates to the arguments put forward in support of a position and those put forward in support of the opposite position, possibly in a discussion which brings out the contradictions of the two positions, i.e. their opposition (the 'Pro'/'Contra' model); and finally a harmonics, which relates to the arguments put forward in support of two or more positions presented as contrary, according to a relation of opposition, and for which it is important to produce a composition (the 'Pro'/'Contra'/'Alter' model). However, as in logic, the task is to escape the problem posed by its aletheiotropy and epistemotropy, i.e. its centering on truth and science (Lavelle, Special and General Logic, 2018). This problem also affects dialectics insofar as it remains centered on the stake of veracity, or verisimilitude (approximation of truth), even in the case of moral or artistic discussion or argumentation. In the project of a general dialectics, contrasted with a special dialectics centered on veracity or verisimilitude, we face the same challenge of moving from homology to heterology and then taking into account, in discussion and argumentation, multiple rationalities (epistemic, technical, aesthetic, ethical). This challenge amounts to asking to what extent the following are possible: (a) a general topics which widens the field of argumentation of a position beyond the stake of veracity or verisimilitude; (b) a general agonics which extends the field of opposition by considering the contradiction of positions of distinct orders; and (c) a general harmonics which opens the field of composition to positions marked by their difference or contradiction, but also by their difference in the order of rationality.

REFERENCES

BOUCQUIAUX, L., LECLERCQ, B. (2017) Logique formelle et argumentation, Louvain-la-Neuve, De Boeck.

COUNET, J.-M. (2012) "La formalisation de la logique de Hegel. Bilan de quelques tentatives", Logique et analyse, 55, 218.

DA COSTA, N. (1997) Logiques classiques et non classiques. Essai sur les fondements de la logique, Paris, Masson.

DAFERMOS, M. (2018) "Relating Dialogue and Dialectics. A philosophical Perspective", Dialogical Pedagogy, 6.

HARRIS, E. (1987) Lire la logique de Hegel, Lausanne, L’Age d’Homme.

KEIFF, L, RAHMAN, S. (2010) "La dialectique, entre logique et rhétorique", Revue de métaphysique et de morale, 2, 66.

LAVELLE, S. (2018) Special and General Logic, Beziau, d’Ottaviano, Costa-Leite, Aftermath of the Logical Paradise, CLE-UNicamp, n°81

MONTMINY, M. (2015) Raisonnement et pensée critique, Montréal, Presses Universitaires de Montréal.

VAN EEMEREN, F., GROOTENDORST, R. (1992) Argumentation, Communication and Fallacies, Routledge.

WALTON, D. (1998) The New Dialectic. Conversational Contexts of Argument, The University of Toronto Press.