1. Debirupa Basu, Shri Shikshayatan College, Kolkata
Title: Modal Cognitivism: A reaction to Logic of Imperatives (slides)
Abstract: It is a fact that imperatives are neither true nor false. This fact has a far-reaching consequence, in that it raises the question of what role cognitivism can play in this context. Many thinkers have accepted a cognitivist account while considering the semantics of imperatives. Some cognitivists, favouring reductionism, accept that imperatives express propositions; some others, however, accept that imperatives are non-truth-conditional. The former attempt lacks strong grounds of defence, while the latter has been welcomed to a greater degree by thinkers, especially empiricists and logicians. In logic, a full-fledged non-reductionist system is found in the writings of Peter B. M. Vranas. There are, however, scholars who have reservations about accepting the latter account. Such an approach is found in Logic and Semantics for Imperatives, where Nate Charlow deviates from this account by offering an account named Modal Cognitivism. The present paper is a humble attempt to reflect upon this account. The focus is on the issue whether such an account can explain imperative inference, pure or mixed.
Title: Constraint LTL with Remote Access (slides)
Abstract: Constraint Linear Temporal Logic (CLTL) is an extension of LTL that is interpreted on sequences of valuations of variables over an infinite domain. The atomic formulas are interpreted as constraints on the valuations. The atomic formulas can constrain valuations at the current position and at positions a fixed distance apart (e.g., the previous position, the second previous position, and so on). The satisfiability problem for CLTL is known to be PSPACE-complete. We generalize CLTL to let atomic formulas access positions that are unboundedly far away in the past. We annotate the sequence of valuations with letters from a finite alphabet and use regular expressions over the finite alphabet to control how atomic formulas access past positions. We prove that the satisfiability problem for this extension of the logic is decidable in cases where the domain is dense and open with respect to a linear order (e.g., rational numbers with the usual linear order). We prove that it is also decidable over the integers with linear order and equality.
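For orientation, here is a sketch of a plain CLTL formula (the concrete syntax is an assumption, and the remote-access extension of the talk is not shown):

\[ \varphi \;=\; \mathsf{G}\,(x < \mathsf{X}x) \]

where \(\mathsf{X}x\) denotes the value of the variable x at the next position. The formula \(\varphi\) holds on a sequence of valuations exactly when the value of x strictly increases at every step, and it is satisfiable both over the rationals and over the integers with the usual order.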
Title: Gradability of predicates, dialetheism and an uncertainty principle in logic (slides)
Abstract: Paul Égré, in his paper ‘Respects for Contradictions’ [Egr19], expounded the gradability of the predicate of a sentence in the context of dialetheism. Dialetheism is the view which considers some sentences to be both true and false [Pri06]. It accommodates true contradictions. There is a possibility of confusing the notion of contradiction with the notion of lexical ambiguity. Égré attempts to define acceptable contradictions in two distinct linguistic forms: one with ‘and’ descriptions and the other with ‘neither’ descriptions. Formal logic considers these two senses to be equivalent; in ordinary language, however, they may not be the same. The form of contradiction in logic is ‘X is p and not p’. Égré and Zehr pick two sentences in ordinary language and show that they are not the same. In this context Égré refers to the notion of presupposition failure. Multidimensional adjectives arise in the attempt to fit the logical form of contradiction to the contradictions found in ordinary language. In this context we introduce a type of uncertainty principle in logic. The flavor of this principle is reminiscent of Heisenberg’s uncertainty principle from physics.
References:
[Egr19] Paul Égré. “Respects for Contradictions”. In: Graham Priest on Dialetheism and Paraconsistency. Ed. by Can Başkent and Thomas Macaulay Ferguson. Springer Verlag, 2019, pp. 39–57.
[Pri06] Graham Priest. In Contradiction: A Study of the Transconsistent. New York: Oxford University Press, 2006.
Title: Proof theory through the lens of Hilbert's 24th problem (slides)
Abstract: At the turn of the previous century, David Hilbert famously posed 23 problems at the International Congress of Mathematicians that set the narrative for research in mathematics and logic for at least a third of a century. Proof theory was one of the subjects born as a result of Hilbert's problems being actively tackled. In 2000, a 24th problem was found in Hilbert's notebook; it asks for a criterion of simplicity in mathematical proofs and the development of a proof theory with the power to prove that a given proof is the simplest possible. In this talk, I will present various interpretations of the 24th problem and view major developments in proof theory through the lens of those interpretations.
Title: Complexity Study of Reasoning about Knowledge and Public Observations (slides)
Abstract: This talk explores the model checking and satisfiability problems in logical systems for reasoning about the knowledge and observations of intelligent agents and their interactions. We examine the foundations laid by Kripke and Hintikka, which paved the way for the development of dynamic counterparts such as Public announcement logic and its extensions, like Dynamic epistemic logic and Public observation logic. Complexity studies for these logics stem from similar studies in temporal logics initiated by Halpern and Vardi, among others.
More specifically, we focus on Public observation logic and analyze its model checking and satisfiability problems. By investigating these problems, we uncover insights into effective and efficient reasoning about knowledge in multi-agent scenarios. Moreover, we explore how these insights can be applied in solving standard problems in relevant domains like planning.
Title: Clean Rough Randomness vs Others (slides)
Abstract: A number of generalizations of stochastic and information-theoretic randomness are known in the literature. In earlier research, it was shown by the present author and others that probabilistic methods are not compatible with handling meaning in vague and dynamic contexts of rough reasoning (and therefore meaningful AIML) [1, 2]. Based on these, new non-stochastic concepts of clean rough randomness were introduced and fruitfully applied by her in [5, 6]. Many loose subjective probabilistic concepts are actually available in rough sets [8–10], and there is room for exploration. Essentially, a phenomenon is clean roughly random if it can be modeled by general rough sets or a derived process thereof. It is realized in concrete situations through functions and predicates of different types, such as those of large-minded reasoners (LMRs). The latter are used to construct very efficient granular algorithms for discovering tolerance relations from information tables in [6]. Rough random functions are sometimes related to overlap functions, aggregations, and weakenings thereof [4, 7], and axiomatic granularity assumptions are optional in related considerations [3]. In the present talk, the concepts and their scope are explained, contrasted with those of subjective probability, and further research directions are provided.
References
1. Mani, A.: Types of Probabilities Associated With Rough Membership Functions. In: Bhattacharyya, S., et al. (eds.) Proceedings of ICRCICN’2015: IEEE Xplore. pp. 175–180. IEEE Computer Society (2015), http://dx.doi.org/10.1109/ICRCICN.2015.7434231
2. Mani, A.: Probabilities, Dependence and Rough Membership Functions. International Journal of Computers and Applications 39(1), 17–35 (2016). https://doi.org/10.1080/1206212X.2016.1259800
3. Mani, A.: Comparative Approaches to Granularity in General Rough Sets. In: Bello, R., et al. (eds.) IJCRS 2020, LNAI, vol. 12179, pp. 500–518. Springer (2020)
4. Mani, A.: Algebraic models for qualified aggregation in general rough sets, and reasoning bias discovery. In: Campagner, A., et al. (eds.) Rough Sets. IJCRS 2023, LNAI, vol. 14481, pp. 137–153. Springer Nature (2023), https://link.springer.com/chapter/10.1007/978-3-031-50959-9
5. Mani, A.: Rough randomness and its application. Journal of the Calcutta Mathematical Society, in press, 1–15 (2023). https://doi.org/10.5281/zenodo.7762335, https://zenodo.org/record/7762335
Title: Coarser Equivalences for Causal Concurrency (slides)
Abstract: Trace theory (formulated by Mazurkiewicz in 1987) is a principled framework for defining equivalence relations on concurrent program runs based on a commutativity relation over the set of atomic steps taken by individual program threads. Its simplicity, elegance, and algorithmic efficiency make it useful in many different contexts, including program verification and testing. It is well understood that the larger the equivalence classes are, the more benefits they bring to the algorithms and applications that use them.
In this talk I will go over some preliminary work on relaxations of trace equivalence, with the goal of maintaining its algorithmic advantages for answering the causal concurrency question: are two events ordered the same way in all possible equivalent executions? I will first go over the algorithmic hardness of the largest appropriate relaxation of trace equivalence, called reads-from equivalence, which is used extensively in the context of testing and model checking of shared-memory multi-threaded programs. The same problem can be decided in constant space for trace equivalence.
Next, I will introduce a new commutativity-based notion of equivalence called grain equivalence that is strictly more relaxed than trace equivalence and yet yields a constant-space algorithm for the same problem. This notion of equivalence uses commutativity of grains, which are sequences of atomic steps, in addition to the standard commutativity from trace theory. I will discuss the distinct cases in which the grains are contiguous subwords of the input program run and in which they are not, formulate the precise definition of causal concurrency in each case, and show that both can be decided in constant space, despite being strict relaxations of the notion of causal concurrency based on trace equivalence.
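As a minimal illustration of the causal concurrency question for plain trace equivalence (an assumption-laden sketch with a toy read/write dependence relation, not the algorithms or the grain equivalence of the talk), the following Python snippet computes the happens-before partial order of a single run: two events are ordered the same way in all trace-equivalent executions exactly when they are related by a chain of dependent events in run order.

from itertools import combinations

def dependent(e1, e2):
    """Hypothetical dependence relation: same thread, or conflicting accesses.
    Events are (thread, op, variable) triples with op in {"r", "w"}."""
    (t1, op1, var1), (t2, op2, var2) = e1, e2
    same_thread = t1 == t2
    conflict = var1 == var2 and "w" in (op1, op2)
    return same_thread or conflict

def causally_ordered(run):
    """Return the set of index pairs (i, j), i < j, such that run[i] occurs before
    run[j] in every trace-equivalent reordering of `run` (the happens-before order)."""
    n = len(run)
    hb = {(i, j) for i, j in combinations(range(n), 2) if dependent(run[i], run[j])}
    changed = True                       # transitive closure of dependent pairs in run order
    while changed:
        changed = False
        for i, j in list(hb):
            for k in range(j + 1, n):
                if (j, k) in hb and (i, k) not in hb:
                    hb.add((i, k))
                    changed = True
    return hb

# Example: thread t1 writes x, thread t2 writes y, thread t1 then reads y.
run = [("t1", "w", "x"), ("t2", "w", "y"), ("t1", "r", "y")]
hb = causally_ordered(run)
print((0, 1) in hb)  # False: events 0 and 1 are concurrent and can be reordered
print((1, 2) in hb)  # True: the write of y happens before the read of y in every equivalent run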
Title: Logicality of Metaphors (slides)
Abstract: A revolutionary change in our understanding of reason is actually a radical change in our understanding of ourselves. The shift in our understanding shows that reason is not literal, but largely metaphorical. This paper gives a picture of the literal and the metaphorical, shows their different natures, and gives a higher place to the metaphorical. The literal is true to fact, conventional, and represents constant values which cannot be modified. The commonsense theory, or literal-truth paradigm, which is bound up with the correspondence theory of truth, holds that the truth or falsity of a sentence is determined only by how it relates to the world. Metaphorical language describes something through comparison with something different and explains concepts through interpretations or contexts. For the literalists, metaphor is simply derived from the literal. There can be metaphorical uses of language, but on this view those uses are either indirectly literal or meaninglessly imaginative, in that they do not express literal concepts. This paper tries to show that metaphorical concepts stand in their own right and do not need to be derived from the literal. Metaphor places things in a new light, showing new dimensions of truth and experiences of a different world altogether.
Title: Specification Synthesis with Constrained Horn Clauses (slides)
Abstract: Many practical problems in software development, verification, and testing rely on specifications. The problem of specification synthesis is, given a program and a postcondition, to automatically find relational constraints, called specifications, for the undefined functions in the program. For specifications to be useful, they need to be both correct and maximal.
The problem of specification synthesis can be encoded as logical formulas in the Floyd-Hoare style. The resulting restricted class of first-order formulas is called constrained Horn clauses (CHCs), which are gaining popularity as a logical intermediate representation among program verification tools. Finding correct specifications can be reduced to the satisfiability of CHCs. Furthermore, maximal specifications can be found by iteratively refining solutions to CHCs.
In this talk, I will introduce the problem setting and provide an overview of our algorithmic framework to solve this problem.
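As a hedged illustration of how verification conditions become CHCs (a generic sketch using Z3's Horn engine; the loop and the predicate name are invented here and this is not the speakers' framework), the clauses below leave the predicate Inv uninterpreted and ask the solver for an interpretation, which is exactly the "find an unknown relational constraint" pattern described above.

from z3 import Ints, Function, IntSort, BoolSort, ForAll, Implies, And, Not, SolverFor

x, x1 = Ints("x x1")
Inv = Function("Inv", IntSort(), BoolSort())           # unknown predicate to be synthesized

s = SolverFor("HORN")
# Clauses for: x := 0; while (x < 10) { x := x + 1 }; assert(x >= 0)
s.add(ForAll([x], Implies(x == 0, Inv(x))))                                   # initiation
s.add(ForAll([x, x1], Implies(And(Inv(x), x < 10, x1 == x + 1), Inv(x1))))    # consecution
s.add(ForAll([x], Implies(And(Inv(x), Not(x < 10)), x >= 0)))                 # postcondition

print(s.check())   # sat: some interpretation of Inv proves the postcondition
print(s.model())   # one such interpretation of Inv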
Title: Consistency checking: logic and law (slides)
Abstract: The notion of consistency is central to logic: we call a theory consistent if no contradiction can be derived from it, or, in model-theoretic terms, if we can construct a model for it. One of the major domains of social practice where logical principles play a critical implicit role is that of law. Not surprisingly, consistency is important in law as well. However, the notion of consistency is considerably nuanced in the legal domain. Some legal philosophers call a legal system consistent if, and only if, there are no norm-conflicts in the strict sense of norm-contradictions in the system. There are also so-called consistency doctrines in jurisprudence, which seek to ensure that regulatory shifts caused by policy changes based on regulatory whim are not permitted. The recent EU Climate Law has been criticised in terms of its consistency with the “do no significant harm” principle. Then there are also procedural notions of consistency involving compliance.
We suggest that logical formalisation of these aspects of consistency is worthwhile, with the objective of algorithmic consistency checking.
Title: Some new 3-valued Paraconsistent logics (slides)
Abstract: In the context of classical logic, the notions of consequence and inconsistency are equivalent in the sense that, taking either one as the primitive notion, the other can be obtained. Paraconsistent logics are those logics in which not everything may be derived from an inconsistent set of premises (non-explosion). As a result, in any paraconsistent logic the notions of consequence and inconsistency are not equivalent in the sense of classical logic. In [1] the authors present two equivalent sets of axioms for certain paraconsistent logics, one for consequence, the other for inconsistency. Various ways of obtaining explosion using different properties of logical connectives in the classical context have been explored; those properties are then divided in such a way that explosion is prevented. Based on these divisions, a tree diagram is formulated, which shows different possible branches of paraconsistent logics. One can observe that, for all these paraconsistent logics, the two axiom sets mentioned above are equivalent. Some known paraconsistent logics can be placed under a few branches of the diagram, but some of the branches remain empty. We shall present some new 3-valued paraconsistent logics that can be placed under some of the other branches of this tree diagram.
References:
[1] S. Dutta and M. K. Chakraborty. Consequence-inconsistency interrelation: in the framework of paraconsistent logics. In J.-Y. Beziau, M. Chakraborty, and S. Dutta, editors, New Directions in Paraconsistent Logic, volume 152, pages 269–283. Springer India, 2015.
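As a small, standard illustration of the paraconsistency at issue in the abstract above (this is Priest's three-valued logic LP, used here only as a reference point; it is not one of the new logics of the talk), the following Python snippet checks consequence by enumerating three-valued valuations and shows that explosion fails while an ordinary inference such as conjunction introduction still holds.

from itertools import product

T, B, F = 2, 1, 0                 # LP truth values: true, both, false (ordered F < B < T)
DESIGNATED = {T, B}               # designated values

def neg(a):     return 2 - a      # LP negation
def conj(a, b): return min(a, b)  # LP conjunction
def disj(a, b): return max(a, b)  # LP disjunction

def entails(premises, conclusion, atoms):
    """premises and conclusion are functions from a valuation dict to a truth value;
    entailment = every valuation designating all premises designates the conclusion."""
    for values in product((T, B, F), repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(p(v) in DESIGNATED for p in premises) and conclusion(v) not in DESIGNATED:
            return False
    return True

# Explosion: p, not-p |= q ?  Fails, e.g. at v(p) = B, v(q) = F.
print(entails([lambda v: v["p"], lambda v: neg(v["p"])], lambda v: v["q"], ["p", "q"]))   # False
# Conjunction introduction: p, q |= p and q.  Still holds.
print(entails([lambda v: v["p"], lambda v: v["q"]], lambda v: conj(v["p"], v["q"]), ["p", "q"]))  # True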
Title: Program verification using small models (slides)
Abstract: We explore the use of methods from model theory in the verification of array programs. Given a program P and an assertion A to be checked of P, we formulate these respectively as sentences φ_P and φ_A of multi-sorted first-order logic. The sentence φ_P is such that, modulo a small set φ_ax of axioms, the models of φ_P are in 1-1 correspondence with (suitably abstract representations of) the executions of P on arbitrary input arrays. Likewise, the models of φ_A restricted to the class of models of φ_P ∧ φ_ax correspond to (suitably abstracted) executions of P that satisfy the assertion A.
Our approach to verifying the program P against the assertion A consists of showing that the sentence α_{P,A} := φ_ax ∧ φ_P ∧ ¬φ_A has the small model property. That is, if α_{P,A} has a model of some finite size, then there is also such a model in which the domains interpreting the sorts all have sizes bounded by a small function of P and A. The desired verification reduces to searching for models of α_{P,A} of sizes within the mentioned bound. We examine specific classes of programs and assertions, and establish the small model property for α_{P,A} by showing that large models of α_{P,A} admit local reductions that are effected by "removal of iterations" in the executions of P that they correspond to.
Title: Laugh loud to remain healthy (slides)
Abstract: The logic of imperatives as an extended system has been practiced by both eastern and western thinkers over the past few decades. In the Indian tradition it is mainly the Mīmāṁsā school of thought that has dealt with imperative statements as Vedic injunctions. In the recent past, computer scientists have constructed a system of imperative logic, for use in special education and robotics, employing only the Vidhivākya/injunction, though there are other forms of imperatives, such as suggestion, invitation, prayer, advice, etc., which also occur as premises and conclusions of inferences used in daily life. Such a system has been named the MIRA formalism by Bama Srinivasan and Ranjani Parthasarathi. In this system, imperatives, both affirmative and negative, are treated either as conditional or as unconditional. The system deals with a basic vocabulary, different binary connectives, and rules of validity, and offers proofs of soundness and completeness. A worry, however, remains regarding the representation of conditional imperatives constructed out of different types of motivation. Keeping in mind the Mīmāṁsā doctrine of Vedic injunctions, an attempt has been made in the present paper to offer an interpretation different from the one offered by the system under consideration.
Title: Cumulativity in Default Logic (slides)
Abstract: Default logic is a theory of non-monotonic reasoning introduced by Reiter in 1980, and it has evolved into one of the most prominent such methods, with multiple real-life applications. In classical logic, a statement is either true or false, and the addition of new information can only expand the set of provable truths. However, in many real-world scenarios, knowledge is uncertain, incomplete, and subject to revision. Default logic was designed to address these issues and provide a framework for making inferences in the presence of incomplete or uncertain information.
This talk will be about the property of cumulativity in mathematical logic. I begin with the background of default logic and then present cumulativity in the general sense and how it impacts monotonic and non-monotonic logic differently. Then, I discuss the non-cumulative nature of the original default logic with examples, the significance of incorporating cumulativity into default logic, and the several types of cumulativity that have been widely studied in the past.
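For concreteness, here is the standard counterexample to cumulativity in Reiter's default logic, essentially Makinson's example (the talk's own examples may differ). Cumulativity, or cautious monotony, would require that adding an already sceptically entailed formula to the facts does not change what is entailed. Consider

\[ W = \emptyset, \qquad D = \left\{ \frac{\;:\,a\;}{a},\ \frac{a \vee b \,:\, \neg a}{\neg a} \right\}. \]

The theory (W, D) has the single extension Th({a}), so both a and a ∨ b are sceptically entailed. Adding the entailed formula a ∨ b to W creates a second extension Th({a ∨ b, ¬a}), in which a fails; hence a is no longer sceptically entailed, and cumulativity is violated.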
Title: A Mīmāṁsā Inspired Journey into Logic with real-time Applications in Computer Science (slides)
Abstract: Mīmāṁsā, an Indian philosophical system, focuses on interpreting Vedic texts. This system presents various methodologies that have proven beneficial in diverse contexts, including legal systems, agriculture, and logic. This presentation will delve into some of the concealed logical perspectives within Mīmāṁsā philosophy, offering insights that can be readily applied in real-time applications such as Robotics, Natural Language Processing, Reinforcement Learning and Automated Reasoning.
Title: Cardinality in a Parallel Universe and a Question (slides)
Abstract: The presentation starts with an overview of generalized algebra-valued models of set theories; well-known Boolean-valued models and Heyting-valued models arise as particular cases. We then define ordinal numbers and cardinal numbers in a class of algebra-valued models, which also contains the two-valued Boolean-valued model. We eventually discuss a few important properties of cardinal numbers and cardinal arithmetic. Unexpectedly, these properties remain the same as in classical set theory. This gives rise to an important question about the comparison between the classical and non-classical foundations of mathematics.
Title: Encoding control system evolution and scheduling behaviours to check control safety properties (slides)
Abstract: Cyber-physical systems are developed in a layered manner, beginning with the control law (i.e. controller synthesis) for the given control application, followed by the software implementation as a set of tasks. However, timing uncertainties at the implementation layer can lead to unforeseen patterns of deadline misses of the control task, resulting in possible violation of control safety properties. Detecting and reproducing these violations requires precise analysis of the interaction between control and scheduling layers.
We show how to jointly encode the system behaviour – consisting of control trajectories and task schedules – as a set of constraints. These constraints can be solved by off-the-shelf satisfiability-modulo-theory (SMT) solvers. Solutions to these constraints yield exact error traces. An abstraction-refinement scheme helps improve solver scalability to realistic safety-critical systems.
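As a hedged sketch of this kind of joint encoding (the plant, gains, horizon, and hold-on-miss policy below are invented for illustration and are not the speakers' model), the following Python/Z3 snippet constrains a scalar control loop together with a deadline hit/miss pattern and asks the solver for a trajectory that violates a safety bound; any satisfying assignment is a concrete error trace.

from z3 import Real, Bool, If, Or, Solver, sat

a, b, K = 1.2, 1.0, 0.5          # toy plant and feedback gains (u = -K*x when the deadline is met)
horizon, bound = 6, 1.0          # bounded horizon and safety bound |x| <= bound

s = Solver()
x = [Real(f"x_{k}") for k in range(horizon + 1)]      # plant state trajectory
u = [Real(f"u_{k}") for k in range(horizon)]          # applied control input
miss = [Bool(f"miss_{k}") for k in range(horizon)]    # schedule: did the control task miss its deadline?

s.add(x[0] >= -1, x[0] <= 1)                          # admissible initial states
for k in range(horizon):
    fresh = -K * x[k]                                 # actuation computed when the deadline is met
    held = u[k - 1] if k > 0 else 0                   # on a miss, hold the previous input (0 initially)
    s.add(u[k] == If(miss[k], held, fresh))           # scheduling layer: hit or miss
    s.add(x[k + 1] == a * x[k] + b * u[k])            # control layer: plant dynamics
s.add(sum(If(m, 1, 0) for m in miss) <= 2)            # at most two deadline misses in the window
s.add(Or([Or(xk > bound, xk < -bound) for xk in x]))  # the safety property is violated somewhere

if s.check() == sat:
    m = s.model()
    print("miss pattern:", [m.eval(mk) for mk in miss])
    print("error trace:", [m.eval(xk) for xk in x])
else:
    print("no violation within the horizon")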