Propositional Logic: Introduction
Propositional logic, also known as zeroth-order logic, sentential logic, or statement logic, is the branch of logic that studies ways of joining and modifying entire propositions, statements, or sentences to form more complicated propositions, statements, or sentences. It also studies the logical relationships and properties that derive from these methods of combining or altering statements. One common way of combining statements is to join two simpler propositions with the word “and”. The complex statement formed when two statements are joined together with “and” is true if and only if both component statements are true.
Because of this, an argument of the following form is logically valid:
- If the train arrives late and there are no taxis at the station, then John is late for his meeting.
- The train arrives late and there are no taxis at the station.
- Therefore, John is late for his meeting.
A proposition is simply a statement, and propositional logic studies the ways statements can interact with each other. Propositional logic treats the simplest statements as indivisible units; hence, it does not study those logical properties and relations that depend on parts of statements that are not themselves statements, such as the subject and predicate of a statement. The most thoroughly researched branch of propositional logic is classical truth-functional propositional logic, which studies logical operators and connectives used to produce complex statements whose truth-values depend entirely on the truth-values of the simpler statements making them up, and in which it is assumed that every statement is either true or false and not both. Propositional logic is thus concerned with propositions and their interrelationships. A proposition is a statement or sentence that is either true or false. If a proposition is false, we say its truth value is false; if a proposition is true, we say its truth value is true.
Understanding propositional logic first requires a study of logical operators. A logical operator is any word or phrase used either to modify one statement to make a different statement, or to join multiple statements together to form a more complicated statement. In English, words such as “and”, “or”, “not”, “if … then …”, “because”, and “necessarily” are all operators. A logical operator is said to be truth-functional if the truth-values of the statements it is used to construct always depend entirely on the truth-values of the statements from which they are constructed. The English word “and” is truth-functional: a compound statement joined together with “and” is true if both of the joined statements are true, and false if either or both are false. The English word “or” is truth-functional: a compound statement joined together with “or” is true if at least one of the joined statements is true, and false if both are false. The English word “not” is truth-functional: the negation of a statement is true if and only if the statement negated is false.
Some logical operators are not truth-functional. One example of an English operator that is not truth-functional is the word “necessarily”. Whether a statement formed using this operator is true or false does not depend entirely on the truth-value of the statement to which the operator is applied. For example, both of the following statements are true:
- 3 + 2 = 5.
- Someone is reading an article in an encyclopedia.
Now consider the same statements modified with the “necessarily” operator:
- Necessarily, 3 + 2 = 5.
- Necessarily, someone is reading an article in an encyclopedia.
Here, the first example is true and the second is false. Hence, the truth-value of a statement formed using the operator “necessarily” does not depend entirely on the truth-value of the statement modified.
Classical truth-functional propositional logic is the branch of truth-functional propositional logic that assumes there are only two possible truth-values a statement can have, “true” and “false”, and that every statement is either true or false but not both. “Non-classical” propositional logic, by contrast, considers possibilities such as:
- A proposition’s having a truth-value other than true or false.
- A proposition’s having either an indeterminate truth-value or lacking a truth-value altogether.
- A proposition’s being both true and false.
History
The study of logic as an independent discipline began with the work of Aristotle (384-322 BCE). Generally, Aristotle’s writings on logic dealt with the logic of categories and quantifiers such as “all” and “some”, which are not treated in propositional logic. However, in his metaphysical writings Aristotle espoused two principles of great importance in propositional logic, which have since come to be called the Law of Excluded Middle and the Law of Contradiction. Interpreted in propositional logic, the Law of Excluded Middle is the principle that every statement is either true or false; the Law of Contradiction is the principle that no statement is both true and false. These are cornerstones of classical propositional logic. Aristotle, and his successor at the Lyceum, Theophrastus (d. 287 BCE), did recognize a need for a doctrine of “complex” propositions: propositions involving conjunctions (statements joined by the connective “and”), disjunctions (statements joined by the connective “or”) and conditionals (statements joined by the connective “if … then …”). However, their investigations into propositional logic seem to have been very minor.
More serious attempts to study statement operators such as “and”, “or” and “if … then …” were made by the Stoic philosophers in the late 3rd century BCE. Since most of their original works are lost, we cannot make any definite claims about who first investigated which areas of propositional logic, but we do know from the writings of Sextus Empiricus that Diodorus Cronus and his pupil Philo debated whether the truth of a conditional statement depends entirely on its not being the case that its antecedent (if-clause) is true while its consequent (then-clause) is false, or whether it requires some sort of stronger connection between antecedent and consequent. This debate continues to have relevance for the modern discussion of conditionals. The Stoic philosopher Chrysippus (roughly 280-205 BCE) did the most to advance Stoic propositional logic, by marking out different ways of forming complex premises for arguments and listing valid inference schemata for each. Chrysippus suggested that the following inference schemata are the most basic:
- If first, then second; but the first; therefore the second.
- If first, then second; but not the second; therefore, not the first.
- Not both first and second; but the first; therefore, not the second.
- Either first or second [and not both]; but the first; therefore, not the second.
- Either first or second; but not the second; therefore the first.
These inference schemata correspond very closely to the basic principles of a contemporary system of natural deduction for propositional logic. For example, the first two rules above correspond to the rules of modus ponens and modus tollens, respectively. Chrysippus and other Stoics expanded upon these basic inference schemata with less basic ones; this work is preserved in the writings of Diogenes Laertius, Sextus Empiricus and, later, Cicero.
The Stoics’ work was advanced in small steps in the centuries that followed, by the second-century logician Galen (roughly 129-210 CE), the sixth-century philosopher Boethius (roughly 480-525 CE), and later by Peter Abelard (1079-1142), William of Ockham (1288-1347), and others. Much of their work involved introducing improved terminology, furthering the discussion of the relationships between operators, and producing better formalizations of the principles of Aristotle or Chrysippus. Abelard seems to have been the first to differentiate exclusive disjunction from inclusive disjunction, and to suggest that inclusive disjunction is the more important notion for the development of a relatively simple logic of disjunctions.
The next major step in the development of propositional logic came only much later, with the advent of symbolic logic in the work of logicians such as Augustus De Morgan (1806-1871) and George Boole (1815-1864) in the mid-19th century. Boole was primarily interested in developing a mathematical-style “algebra” to replace Aristotelian syllogistic logic, employing the numeral “1” for the universal class, the numeral “0” for the empty class, the addition notation “x + y” for the union of classes x and y, the multiplication notation “xy” for the intersection of classes x and y, and so on, so that statements of syllogistic logic could be treated in quasi-mathematical fashion as equations. For example:
- “No x is y” could be written as “xy = 0”
Boole noticed that if an equation such as “x = 0” is read as “x is false” and “x = 1” is read as “x is true”, the rules given for his logic of classes can be transformed into a logic for propositions, with “xy = 1” reinterpreted as saying that x and y are both true, and “x + y = 1” reinterpreted as saying that at least one of x and y is true. Boole’s work sparked rapid interest in logic among mathematicians. Later, “Boolean algebras” formed the basis of the truth-functional propositional logic used in computer design and programming.
In the late 19th century, Gottlob Frege (1848-1925) presented logic as a branch of systematic inquiry more fundamental than mathematics or algebra, and presented the first modern axiomatic calculus for logic in his 1879 work Begriffsschrift. From Frege’s axiomatization it is possible to distill the first complete axiomatization of classical truth-functional propositional logic. Frege was also the first to systematically argue that all truth-functional connectives could be defined in terms of negation and the material conditional.
In the 20th century, Bertrand Russell gave a different complete axiomatization of propositional logic, considered on its own, in his 1906 paper “The Theory of Implication”. He then produced another axiomatization, using disjunction and negation as primitives, in the 1910 work Principia Mathematica, written with A. N. Whitehead. In 1913, the American logician H. M. Sheffer published the first proof that all truth-functional operators can be defined in terms of a single binary operator. In 1917, the French logician Jean Nicod discovered that it was possible to axiomatize propositional logic using the Sheffer stroke together with only a single axiom schema and a single inference rule.
Natural deduction systems for classical truth-functional propositional logic were developed and popularized in the work of Gerhard Gentzen in the mid-1930s.
Syntax and Semantics
A proposition is a declarative statement that has either the truth value “true” or the truth value “false”. For example, the following statements are propositions:
- Everyone born on Sunday has purple hair.
- The sun is shining.
The following statement is not a proposition:
- “P is less than 5”. Unless we give a specific value of P, we cannot say whether the statement is true or false.
Among contemporary logicians, the basic rules and principles of classical truth-functional propositional logic are almost entirely agreed upon and can be stated in a definitive way. This is most easily done using a simplified logical language that deals only with simple statements, considered as indivisible units, and complex statements joined together by means of truth-functional connectives.
In an ordinary language, a statement would never consist of a single word, but would always at least consist of a noun or pronoun along with a verb. In propositional logic there are two types of sentences: simple and compound. Simple sentences express simple facts about the world and are often called proposition constants or, sometimes, logical constants. Compound sentences are formed from simpler sentences and express logical relationships between the sentences of which they are composed.
Because propositional logic treats simple statements as indivisible wholes and does not consider their smaller parts, its language uses distinct symbols ‘p’, ‘q’, ‘r’, etc., in place of complete statements. These symbols are called propositional variables. For example, consider the atomic sentences:
p: ‘I won the lottery last week.’
q: ‘I purchased a lottery ticket.’
r: ‘I won last week’s sweepstakes.’
In place of the truth-functional operators “and”, “or”, “if … then …”, “if and only if”, and “not”, the logical signs ‘∧’, ‘∨’, ‘→’, ‘⇔’, and ‘¬’ are used, respectively. These logical signs are called connectives, and they are used to connect the propositional variables.
Connectives
Generally, in propositional logic we use five connectives which are:
- OR (∨)
- AND (∧)
- Negation/ NOT (¬)
- Implication / if-then (→)
- If and only if (⇔)
OR (∨)
This operation on two propositions p and r (written as p ∨ r) is true if at least one of the propositional variables p or r is true.
The ‘OR’ truth table is as follows:
| p | r | p ∨ r |
|---|---|---|
| True | True | True |
| True | False | True |
| False | True | True |
| False | False | False |
Example:
p: ‘I won the lottery last week.’
r: ‘I won last week’s sweepstakes.’
Sentence: ‘I won the lottery last week, or I won last week’s sweepstakes.’
This declarative sentence is denoted by p ∨ r and is called the disjunction of p and r.
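As a small illustration, the disjunction table above can be reproduced in Python, where the built-in `or` operator behaves like ∨ on Boolean values (and `and`, `not` behave like ∧ and ¬ in the same way); this is only a sketch, not notation used elsewhere in this article.

```python
from itertools import product

# Reproduce the 'OR' truth table: p ∨ r is True when at least one operand is True.
for p, r in product([True, False], repeat=2):
    print(f"{p!s:5} | {r!s:5} | {p or r}")
```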
AND (∧)
This operation on two propositions p and r (written as p ∧ r) is true if both the propositional variables p and r are true.
The ‘AND’ truth table is as follows:
| p | r | p ∧ r |
|---|---|---|
| True | True | True |
| True | False | False |
| False | True | False |
| False | False | False |
Example:
p: ‘I won the lottery last week.’
r: ‘I won last week’s sweepstakes.’
Sentence: ‘Last week I won the lottery and the sweepstakes.’
This declarative sentence is denoted by p ∧ r and is called the conjunction of p and r.
Negation (¬)
This operation on a proposition p (written as ¬p) is true when p is false, and false when p is true.
The ‘Negation’ truth table is as follows:
| p | ¬p |
|---|---|
| True | False |
| False | True |
Example:
p: ‘I won the lottery last week.’
Sentence: ‘I did not win the lottery last week.’
This declarative sentence is denoted by ¬p and is called the negation of p.
Implication / if-then (→)
An implication p → q is the proposition “if p, then q”. p → q is false if p is true and q is false; in all other cases it is true.
The ‘Implication’ truth table is as follows:
| p | q | p → q |
|---|---|---|
| True | True | True |
| True | False | False |
| False | True | True |
| False | False | True |
Example:
p: ‘I won the lottery last week.’
q: ‘I purchased a lottery ticket.’
Sentence: ‘If I won the lottery last week, then I purchased a lottery ticket.’
This declarative sentence is denoted by p → q (q is a logical consequence of p); p is called the assumption or antecedent of p → q, and q its conclusion or consequent.
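Python has no built-in implication operator, but since p → q is false only when p is true and q is false, it can be expressed as ¬p ∨ q; the helper function name below is chosen only for this illustration.

```python
def implies(p: bool, q: bool) -> bool:
    """p → q: false only when p is True and q is False (equivalent to ¬p ∨ q)."""
    return (not p) or q

# Reproduce the 'Implication' truth table.
for p, q in [(True, True), (True, False), (False, True), (False, False)]:
    print(f"{p!s:5} | {q!s:5} | {implies(p, q)}")
```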
Bi-conditional / If and only if (⇔)
This operation on two propositions p and q (written as p ⇔ q) is true when p and q have the same truth value, i.e. both are true or both are false.
The ‘bi-conditional’ truth table is as follows:
| p | q | p ⇔ q |
|---|---|---|
| True | True | True |
| True | False | False |
| False | True | False |
| False | False | True |
A bi-conditional is a combination of implication and a reverse implication.
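On Boolean values, p ⇔ q holds exactly when p and q have the same truth value, so in Python it can be written as a simple equality test; the small check below confirms that it agrees with an implication taken in both directions, which is what a combination of implication and reverse implication means. The function name is illustrative only.

```python
def iff(p: bool, q: bool) -> bool:
    """p ⇔ q: true exactly when p and q have the same truth value."""
    return p == q

# p ⇔ q is equivalent to (p → q) ∧ (q → p), with → written as ¬antecedent ∨ consequent.
for p, q in [(True, True), (True, False), (False, True), (False, False)]:
    both_ways = ((not p) or q) and ((not q) or p)
    assert iff(p, q) == both_ways
print("p ⇔ q agrees with (p → q) ∧ (q → p) on all four rows")
```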
We can now use these rules for constructing propositions repeatedly. For example, we are now in a position to form the proposition
p ∧ q → ¬r ∨ q
It means that ‘if p and q then not r or q’.
However, there is a potential ambiguity in this reading. There are two ways to remove ambiguity.
- One way is the insertion of brackets, as in: (p ∧ q) → ((¬r) ∨ q). This assertion is now unambiguous.
- The other way is to adopt certain conventions about the binding priorities of the symbols.
Convention: ¬ binds more tightly than ∨ and ∧, and ∨ and ∧ bind more tightly than →. The implication → is right-associative: expressions of the form p → q → r denote p → (q → r).
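The effect of the bracketing can be checked mechanically. The sketch below evaluates the fully bracketed reading (p ∧ q) → ((¬r) ∨ q) for every combination of truth values, with the implication rewritten as ¬(p ∧ q) ∨ ((¬r) ∨ q); the function name is an assumption made just for this example.

```python
from itertools import product

def bracketed(p: bool, q: bool, r: bool) -> bool:
    # (p ∧ q) → ((¬r) ∨ q), with → rewritten as "not antecedent, or consequent".
    return (not (p and q)) or ((not r) or q)

for p, q, r in product([True, False], repeat=3):
    print(p, q, r, "->", bracketed(p, q, r))
```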
The treatment of semantics in propositional logic is similar to its treatment in algebra. Algebra is not concerned with the real-world significance of variables; what matters are the relationships among the values of the variables expressed in the equations, and algebraic methods are designed to respect those relationships, independent of what the variables represent. Likewise, logic is unconcerned with the real-world significance of proposition constants. What matters is the relationship between the truth values of simple sentences and the truth values of the compound sentences that contain them, and logical reasoning methods are independent of the significance of proposition constants.
A truth assignment is a function assigning a truth value to each of the proposition constants of the propositional vocabulary. The digit 1 is used as a synonym for true and 0 as a synonym for false, and we refer to the value of a constant or expression under a truth assignment i by writing i as a superscript on that constant or expression.
Here is an example of a truth assignment for a propositional vocabulary with just three proposition constants p, q, and r:
pⁱ = 1
qⁱ = 0
rⁱ = 1
The following is another truth assignment for the same vocabulary.
pⁱ = 0
qⁱ = 0
rⁱ = 1
These formulas are not themselves sentences in Propositional Logic: Propositional Logic does not allow superscripts and does not use the = symbol.
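Although such formulas are not sentences of Propositional Logic itself, a truth assignment is easy to model in ordinary code, for instance as a mapping from proposition constants to 1 or 0; the dictionaries below encode the two assignments shown above.

```python
# The first truth assignment above: pⁱ = 1, qⁱ = 0, rⁱ = 1.
assignment_i = {"p": 1, "q": 0, "r": 1}

# The second truth assignment over the same vocabulary.
another_assignment = {"p": 0, "q": 0, "r": 1}

print(assignment_i["p"], assignment_i["q"], assignment_i["r"])  # 1 0 1
```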
Evaluation
Given a truth assignment for the proposition constants, evaluation is the process of determining the truth values of compound sentences.
There is a simple technique for evaluating complex sentences: substitute the true and false values of the proposition constants into the sentence, forming an expression of 1s, 0s, and logical operators; use the operator semantics to evaluate sub-expressions with these truth values as arguments; and repeat, working from the inside out, until we have a truth value for the sentence as a whole.
As an example, consider the truth assignment i shown below:
pⁱ = 1
qⁱ = 0
rⁱ = 1
Using the evaluation method, we can see that i satisfies (p ∨ q) ∧ (¬q ∨ r):
(p ∨ q) ∧ (¬ q ∨ r)
(1 ∨ 0) ∧ (¬ 0 ∨ 1)
1 ∧ (¬ 0 ∨ 1)
1 ∧ (1 ∨ 1)
1 ∧ 1
1
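The inside-out procedure shown above can also be written as a short recursive evaluator. The representation of sentences as nested tuples and the function name below are assumptions made for this sketch, not notation from the text.

```python
# A sentence is either a proposition constant (a string such as "p") or a tuple
# of the form ("not", s), ("and", s1, s2), or ("or", s1, s2).
def evaluate(sentence, assignment):
    if isinstance(sentence, str):
        return assignment[sentence]          # look up a proposition constant
    op = sentence[0]
    if op == "not":
        return 1 - evaluate(sentence[1], assignment)
    if op == "and":
        return evaluate(sentence[1], assignment) and evaluate(sentence[2], assignment)
    if op == "or":
        return evaluate(sentence[1], assignment) or evaluate(sentence[2], assignment)
    raise ValueError(f"unknown operator: {op}")

# (p ∨ q) ∧ (¬q ∨ r) under the assignment i shown above: p = 1, q = 0, r = 1.
sentence = ("and", ("or", "p", "q"), ("or", ("not", "q"), "r"))
i = {"p": 1, "q": 0, "r": 1}
print(evaluate(sentence, i))  # prints 1, so i satisfies the sentence
```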
Now consider the truth assignment j defined below:
pʲ = 0
qʲ = 1
rʲ = 0
In this case, j does not satisfy (p ∨ q) ∧ (¬q ∨ r):
(p ∨ q) ∧ (¬q ∨ r)
(0 ∨ 1) ∧ (¬1 ∨ 0)
1 ∧ (¬1 ∨ 0)
1 ∧ (0 ∨ 0)
1 ∧ 0
0
Using this technique, the truth of an arbitrary sentence in a language can be evaluated, at a cost proportional to the size of the sentence. However, in some cases it is possible to economize and do even better. For example, when evaluating a conjunction, if we find that the first conjunct is false, then there is no need to evaluate the second conjunct, since the sentence as a whole must be false.
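Python's `and` and `or` operators already short-circuit in exactly this way, as the small example below illustrates; the helper function is purely illustrative.

```python
def second_conjunct():
    print("evaluating the second conjunct...")
    return True

# The first conjunct is False, so the second conjunct is never evaluated.
result = False and second_conjunct()
print(result)  # prints False; the message above never appears
```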
Satisfaction
Satisfaction is the reverse of evaluation: given a compound sentence, we try to figure out which truth assignments satisfy it. A method based on truth tables is an effective procedure for finding the truth assignments that satisfy a Propositional Logic sentence. A truth table for a propositional language shows all of the possible truth assignments for the proposition constants in the language: the columns correspond to the proposition constants of the language, and the rows correspond to the different truth assignments for those constants.
Here is an example of a truth table for a propositional language with just three proposition constants (p, q, and r). Each column in the table corresponds to one proposition constant, and each row in the table corresponds to a single truth assignment.
| p | q | r |
|---|---|---|
| 1 | 1 | 1 |
| 1 | 1 | 0 |
| 1 | 0 | 1 |
| 1 | 0 | 0 |
| 0 | 1 | 1 |
| 0 | 1 | 0 |
| 0 | 0 | 1 |
| 0 | 0 | 0 |
Note: for a propositional language with n proposition constants, there are n columns in the truth table and 2ⁿ rows.
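The 2ⁿ rows of such a truth table can be generated mechanically. The sketch below enumerates all eight truth assignments for the constants p, q, and r in the same order as the table above.

```python
from itertools import product

constants = ["p", "q", "r"]

# 2**len(constants) rows, listing 1 before 0 to match the table above.
for values in product([1, 0], repeat=len(constants)):
    assignment = dict(zip(constants, values))
    print(assignment)
```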
To solve a satisfaction problem, we start with a truth table for the proposition constants of the language and then process the sentence, placing an x next to the rows of the truth table corresponding to truth assignments that do not satisfy the sentence. The truth assignments remaining at the end of this process are all of the truth assignments that satisfy the input sentence.
As an example, consider the sentence p ∨ q ⇒ q ∧ r. All truth assignments that satisfy this sentence can be found by constructing a truth table for p, q, and r and placing an x next to each row that does not satisfy the sentence; the remaining rows are the satisfying assignments.
| p | q | r |   |
|---|---|---|---|
| 1 | 1 | 1 |   |
| 1 | 1 | 0 | x |
| 1 | 0 | 1 | x |
| 1 | 0 | 0 | x |
| 0 | 1 | 1 |   |
| 0 | 1 | 0 | x |
| 0 | 0 | 1 |   |
| 0 | 0 | 0 |   |
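The marking procedure for p ∨ q ⇒ q ∧ r can be sketched directly in code: enumerate the eight assignments and print an x next to those that do not satisfy the sentence, with the implication rewritten as ¬(p ∨ q) ∨ (q ∧ r).

```python
from itertools import product

# Mark with x every truth assignment that does not satisfy p ∨ q ⇒ q ∧ r.
for p, q, r in product([1, 0], repeat=3):
    satisfied = (not (p or q)) or (q and r)
    print(" " if satisfied else "x", p, q, r)
```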
The truth table method is computationally complex: the size of a truth table for a language grows exponentially with the number of proposition constants in that language. The method works well when the number of constants is small, but it becomes impractical when the number is large, and it can be tedious even for moderate-sized problems. For only 16 proposition constants, there are already 65,536 truth assignments. This is the main disadvantage of the truth table method.