logic

logic
/ˈlojik/, n.
1. the science that investigates the principles governing correct or reliable inference.
2. a particular method of reasoning or argumentation: We were unable to follow his logic.
3. the system or principles of reasoning applicable to any branch of knowledge or study.
4. reason or sound judgment, as in utterances or actions: There wasn't much logic in her move.
5. convincing forcefulness; inexorable truth or persuasiveness: the irresistible logic of the facts.
6. Computers. See logic circuit.
[1325-75; ME logik < L logica, n. use of neut. pl. (in ML taken as fem. sing.) of Gk logikós of speech or reason. See LOGO-, -IC]
Syn. 4. sense, cogency.

* * *

I
Study of inference and argument.

Inferences are rule-governed steps from one or more propositions, known as premises, to another proposition, called the conclusion. A deductive inference is one that is intended to be valid, where a valid inference is one in which the conclusion must be true if the premises are true (see deduction; validity). All other inferences are called inductive (see induction). In a narrow sense, logic is the study of deductive inferences. In a still narrower sense, it is the study of inferences that depend on concepts that are expressed by the "logical constants," including: (1) propositional connectives such as "not," (symbolized as ¬), "and" (symbolized as ∧), "or" (symbolized as ∨), and "if-then" (symbolized as ⊃), (2) the existential and universal quantifiers, "(∃x)" and "(∀x)," often rendered in English as "There is an x such that ..." and "For any (all) x, ...," respectively, (3) the concept of identity (expressed by "="), and (4) some notion of predication. The study of the logical constants in (1) alone is known as the propositional calculus; the study of (1) through (4) is called first-order predicate calculus with identity. The logical form of a proposition is the entity obtained by replacing all nonlogical concepts in the proposition by variables. The study of the relations between such uninterpreted formulas is called formal logic. See also deontic logic; modal logic.
II
(as used in expressions)
logic many valued
logic philosophy of

* * *

Introduction

the study of propositions and their use in argumentation.

The major task of logic is to establish a systematic way of deducing the logical consequences of a set of sentences. In order to accomplish this, it is necessary first to identify or characterize the logical consequences of a set of sentences. The procedures for deriving conclusions from a set of sentences then need to be examined to verify that all logical consequences, and only those, are deducible from that set. Finally, in recent times, the question has been raised whether all the truths regarding some domain of interest can be contained in a specifiable deductive system.

From its very beginning, the field of logic has been occupied with arguments, in which certain statements, the premises, are asserted in order to support some other statement, the conclusion. If the premises are intended to provide conclusive support for the conclusion, the argument is a deductive one. If the premises are intended to support the conclusion only to a lesser degree, the argument is called inductive. A logically correct deductive argument is termed valid, while an acceptable inductive argument is called cogent. The notion of support is further elucidated by the observation that the truth of the premises of a valid deductive argument necessitates the truth of the conclusion: it is impossible for the premises to be true and the conclusion false. The truth of the premises of a cogent inductive argument, on the other hand, confers only a probability of truth on its conclusion: it is possible for the premises to be true while the conclusion is false.

Logic is not concerned with discovering premises that will persuade an audience to accept, or to believe, the conclusion; that is the subject of rhetoric. The notion of rational persuasion is sometimes used by logicians in the sense that, if one were to accept the premises of a valid deductive argument, it would not be rational to reject the conclusion; one would in effect be contradicting oneself in practice. The case of inductive logic will be considered below.

From the above characterization of arguments, it is evident that they are always advanced in some language, either a natural language such as English or Chinese or, possibly, a specialized technical language such as mathematics. To develop rules for determining the validity of deductive arguments, the statements that make up the argument must be analyzed to see how they relate to one another. The analysis of the logical forms of arguments can be accomplished most perspicuously if the statements of the argument are framed in some canonical form. Restating them in such a regimented format also avoids various ambiguities or other defects of the original statements.

When they are stated in a natural language, some arguments appear to give support to their conclusions, or to confute a thesis, when in fact they do not. Such a defective, although apparently correct, argument is called a fallacy. Some of these errors in argument occur often enough that types of such fallacies are given special names. For example, if one were to attack the premises of an argument by casting aspersions on the character of the proponent of the argument, this would be characterized as committing an ad hominem fallacy. The character of the proponent of an argument has no relevance to the validity of the argument. There are several other fallacies of relevance, such as threatening the audience (argumentum ad baculum) or appealing to their feelings of pity (argumentum ad misericordiam).

The other major grouping of fallacies concerns those apparently correct arguments whose plausibility depends on some ambiguity. For an argument to be valid it is required that the terms occurring in the argument retain one meaning throughout. Subtle shifts of meaning that destroy the correctness of any argument can occur in natural language expressions:

Today chain-smokers are rapidly disappearing.
Karen is a chain-smoker.
Therefore, today Karen is rapidly disappearing.
Clearly what is intended in the first premise is that the class of chain-smokers is becoming a smaller class, not that the individuals in the class are undergoing any change. A well-known, classic example of incorrect reasoning based on an ambiguity arising from the grammatical construction employed, the so-called amphiboly, is the case of Croesus, king of Lydia in the 6th century BC, who was considering invading Persia. When he consulted the oracle at Delphi, he is reported to have received the following reply: “If Croesus goes to war with Cyrus (the king of Persia), he will destroy a mighty kingdom.” Croesus inferred that his campaign would be successful, but in fact he lost, and consequently his own mighty kingdom was destroyed.

Categorical propositions
One of the first and best-known—and most successful—attempts to provide a regimented framework within which some important deductive arguments could be recognized as valid or invalid was that of Aristotle. Many arguments are composed of premises and conclusions that are stated or could be restated as categorical propositions. Categorical propositions may be distinguished first by their quality, either affirmative or negative. An affirmative categorical proposition asserts that all or some of a class of objects are included in another class of objects (e.g., “All whales are mammals”), while a negative categorical proposition asserts that all or some of a class of objects are not included in another class of objects (e.g., “Some pets are not dogs”).

Secondly, categorical propositions may be distinguished by their quantity, either universal or particular. When the assertion is that all of a class of objects are or are not included in another class of objects, the proposition is universal. When only some (precisely, at least one) of a class are or are not included in another, the proposition is particular.

The two distinguishing features above lead to four types of categorical proposition:

A: universal affirmative    All A's are B's.
E: universal negative       No A's are B's.
I: particular affirmative   Some A's are B's.
O: particular negative      Some A's are not B's.

The letters to the left, A, E, I, and O, are the standard labels for these types of propositions. The expressions in the right column are schematic sentences, requiring, in this case, English phrases referring to classes of objects in place of A and B. Some examples of categorical propositions in this standard form are:
● A: All games are enjoyable activities.
● E: No wars are enjoyable activities.
● I: Some women are soldiers.
● O: Some women are not soldiers.
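These four forms can be given a modern set-theoretic reading, with each class modeled as a finite set: A is subset-hood, E is disjointness, I is nonempty intersection, and O is nonempty difference. The sketch below uses invented classes for illustration; note that, unlike traditional logic, this reading gives A and E propositions no existential import ("All A's are B's" counts as true when the class A is empty).

```python
def prop_A(a, b):  # A: All A's are B's.
    return a <= b

def prop_E(a, b):  # E: No A's are B's.
    return a.isdisjoint(b)

def prop_I(a, b):  # I: Some A's are B's.
    return bool(a & b)

def prop_O(a, b):  # O: Some A's are not B's.
    return bool(a - b)

# Invented classes for illustration:
whales  = {"humpback", "blue whale"}
mammals = {"humpback", "blue whale", "dog"}
pets    = {"dog", "cat", "goldfish"}
dogs    = {"dog"}

print(prop_A(whales, mammals))  # All whales are mammals -> True
print(prop_O(pets, dogs))       # Some pets are not dogs -> True
```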

Not all arguments in ordinary contexts are expressed in categorical propositions. Indeed, most are not. The sample A proposition above would more likely be expressed as: “All games are enjoyable.” But enjoyable is an adjective and does not refer to a class of objects. The adjective must be replaced by a noun phrase to obtain a proper categorical proposition. In all cases, propositions must be expressed using two noun phrases joined by the appropriate copula, a form of the verb to be.

Original: Some sailors are dancing.
Rewritten: Some sailors are persons who are dancing.
(Note that “Some sailors are dancers” is not quite right, since a dancer may not actually be dancing at the moment.)

Most languages contain many more verbs than the standard copula; hence, there are many grammatical statements that do not use variations of this verb. These sentences must be rewritten as well:

Original: All dogs bark.
Rewritten: All dogs are animals that bark.
Even variations of the verb to be must be rewritten:
Original: Some lucky person will win the lottery.
Rewritten: Some lucky persons are persons who will win the lottery.

Another difficulty with the requirement that all arguments be expressed using categorical propositions is that some arguments involve reference to one individual. The sentence “Socrates is a Greek” is considered to be a singular proposition. Some logicians allow such sentences in arguments and treat them as universal categorical propositions. It is usually better, however, to rewrite such sentences as explicit categorical propositions:

All persons identical to Socrates are Greeks.
The class referred to by the subject term “persons identical to Socrates” has one and only one object in it—namely, Socrates himself.

A natural language usually has various rhetorical devices for expressing quantifiers, and some languages—English, for example—occasionally do not even express the quantifier, letting the grammatical construction convey that information instead. We find “A cow is a mammal” referring to cows in general, so it would be regimented as “All cows are mammals.” Examples of noncategorical quantifiers along with appropriate translations into categorical propositions are:

Original: A few scientists are dullards.
Rewritten: Some scientists are dullards.
Original: Not everyone who runs for office is elected.
Rewritten: Some persons who run for office are not elected persons.
Original: All entrants can't be winners.
Rewritten: Some entrants are not winners.
Original: Automobiles are not toys.
Rewritten: No automobiles are toys.

Conditional sentences have the form “If . . . , then . . . .” If the antecedent (“if” clause) and the consequent (“then” clause) refer to the same class of objects, the conditional can be rewritten in categorical form. Otherwise, it cannot be rewritten and must be dealt with differently (see below Other argument forms). Some conditionals whose antecedent and consequent refer to the same class of objects are:
● If an animal is a tiger, (then) it's a carnivore.
● If it's a snake, then it's not a mammal.
● A student will succeed if he or she studies assiduously.

(Note the reversal of the clauses.)

These are rewritten in categorical form as:
● All tigers are carnivores.
● No snakes are mammals.
● All students who study assiduously are students who will succeed.

When the antecedent and consequent refer to different classes, such rewriting is not possible (e.g., “If the president is reelected, then I shall never vote again”).

Finally there are such locutions as “Only” (or “None but”), “The only,” and “All except” (or “All but”). When it is asserted that only A's are B's, it is not claimed that A's are B's. Rather, it is claimed that, if anything is a B, then it is also an A. So, for example, if it is asserted that only entrants are prizewinners, no one is asserting that all entrants will win a prize. What is asserted is that all prizewinners are entrants. The case “The only” is quite different. Here, “The only winners are Texans” is expressed by the proposition “All winners are Texans.” The phrase “All except” introduces an exceptive proposition. It requires two categorical propositions to state everything asserted by an exceptive proposition. The statement “All except crew members abandoned ship” asserts that everyone who was not a crew member abandoned ship and that no crew member abandoned ship. Thus, two categorical propositions are needed to express this exceptive proposition:

All non-crew members are persons who abandoned ship.
No crew members are persons who abandoned ship.

Immediate inference
The simplest possible arguments that can be constructed from categorical propositions are those with one premise and, of course, one conclusion. These are called immediate inferences. In order to characterize the valid arguments with one premise, it is necessary to consider various transformations of a categorical proposition. One transformation switches the subject and predicate terms of a proposition, resulting in a proposition called the converse of the original.

     Original                   Converse
A:   All A's are B's.           All B's are A's.
E:   No A's are B's.            No B's are A's.
I:   Some A's are B's.          Some B's are A's.
O:   Some A's are not B's.      Some B's are not A's.

Only in the cases of E and I propositions can one immediately infer the converse. That is, only these inferences by conversion are correct:

No snakes are birds.
∴ No birds are snakes.

Some cats are pets.
∴ Some pets are cats.

The obverse of a proposition is a more complicated transformation. The quality of the proposition is changed from affirmative to negative (or from negative to affirmative), and the predicate term is replaced by its negation (frequently formed by prefixing “non-”). Thus, “All A's are B's” becomes “No A's are non-B's,” and similarly for the other three categorical propositions. The obverse of any categorical proposition is logically equivalent to the original and hence may be immediately inferred from it:

No snakes are birds.

∴ All snakes are non-birds.

Some cats are pets.

∴ Some cats are not non-pets.

All whales are mammals.

∴ No whales are non-mammals.

Some dogs are not friendly animals.

∴ Some dogs are non-friendly animals.

The contrapositive of a categorical proposition is formed by converting the proposition (switching subject and predicate terms) and then negating both the subject and predicate. Only in the cases of A and O propositions can the contrapositive be inferred as a valid conclusion:

All whales are mammals.

∴ All non-mammals are non-whales.

Some pets are not cats.

∴ Some non-cats are not non-pets.

In the cases of E and I propositions, the contrapositive does not follow as a valid conclusion.
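The validity pattern just described (conversion for E and I only, obversion always, contraposition for A and O only) can be checked by brute force on the set-theoretic reading: enumerate every interpretation of A and B as subclasses of a small universe and test whether the transformed proposition holds in every interpretation where the original does. A minimal Python sketch, assuming that reading:

```python
from itertools import combinations

U = {1, 2, 3}  # a small universe of discourse

def subsets(s):
    s = list(s)
    return [set(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def A(a, b): return a <= b              # All A's are B's.
def E(a, b): return a.isdisjoint(b)     # No A's are B's.
def I(a, b): return bool(a & b)         # Some A's are B's.
def O(a, b): return bool(a - b)         # Some A's are not B's.

def entails(p, q):
    # p entails q iff q holds in every interpretation where p holds.
    return all(q(a, b) for a in subsets(U) for b in subsets(U) if p(a, b))

def conv(f):      # conversion: switch subject and predicate
    return lambda a, b: f(b, a)

def contrap(f):   # contraposition: convert, then negate both terms
    return lambda a, b: f(U - b, U - a)

obv = {A: lambda a, b: E(a, U - b),     # obversion: change the quality
       E: lambda a, b: A(a, U - b),     # and negate the predicate term
       I: lambda a, b: O(a, U - b),
       O: lambda a, b: I(a, U - b)}

print([entails(f, conv(f)) for f in (A, E, I, O)])     # [False, True, True, False]
print([entails(f, contrap(f)) for f in (A, E, I, O)])  # [True, False, False, True]
print(all(entails(f, g) and entails(g, f) for f, g in obv.items()))  # True
```

A three-element universe is enough to expose the counterexamples here; where the check reports True, the inference is in fact valid on any universe.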

These immediate inferences are frequently employed to transform propositions in an argument into a form that enables the more complex argument to be analyzed.

Categorical syllogisms
The next more complex form of argument is one with two categorical propositions as premises and one categorical proposition as conclusion. When arguments of this type have exactly three terms occurring throughout the argument and when the predicate term of the conclusion occurs in the first premise and the subject term of the conclusion occurs in the second premise, the argument is called a categorical syllogism.

The pattern of the types of categorical propositions as they occur in a syllogism, frequently indicated by the appropriate letters (A, E, I, O), is called the mood of the syllogism. Thus, possible moods are AAA, AIO, EIO, and so on. Within a given mood, the terms can occur in various patterns. The pattern in which the terms S, M, and P (subject, middle, and predicate) are arranged is called the figure of the syllogism. For instance, in the first premise the predicate term of the conclusion may appear first as the subject of the premise or it may occur last as the predicate of the premise. This is also true for the subject term of the conclusion when it occurs in the second premise. There are four possibilities:

The four figures are conventionally displayed by the order of terms in the two premises:

Figure 1: M-P, S-M
Figure 2: P-M, S-M
Figure 3: M-P, M-S
Figure 4: P-M, M-S

Thus a syllogism in the fourth figure, with mood AAA, is called AAA-4:

All P's are M's.        All cantaloupes are fruits.
All M's are S's.        All fruits are seed-bearers.
∴ All S's are P's.      ∴ All seed-bearers are cantaloupes.

Intuitively, it is obvious that this is not a valid argument. The task of logic is to show why a syllogism is valid or not. An example of a valid syllogism is EIO in the second figure:

No P's are M's.             No scientists are children.
Some S's are M's.           Some infants are children.
∴ Some S's are not P's.     ∴ Some infants are not scientists.

The validity of a syllogism depends on the relations among the classes referred to by the terms of the argument. If all of one class is contained in a second class and none of the second class is in a third, then none of the first class is in the third either. Using this principle and others like it, logicians have been able to establish which syllogisms are valid and which are not.
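This class-relations principle lends itself to mechanical checking: a syllogistic form is valid just in case no assignment of classes to S, M, and P makes both premises true and the conclusion false. A Python sketch on the set-theoretic reading, applied to the two examples above (a three-element universe happens to suffice to expose the counterexample to AAA-4; the True verdict for EIO-2 reflects a genuinely valid form):

```python
from itertools import combinations

U = {1, 2, 3}  # a small universe of discourse

def subsets(s):
    s = list(s)
    return [set(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def valid(premise1, premise2, conclusion):
    # Valid iff no assignment of classes S, M, P makes both premises
    # true and the conclusion false.
    return all(conclusion(s, m, p)
               for s in subsets(U) for m in subsets(U) for p in subsets(U)
               if premise1(s, m, p) and premise2(s, m, p))

# AAA-4: All P's are M's; all M's are S's; therefore all S's are P's.
print(valid(lambda s, m, p: p <= m,
            lambda s, m, p: m <= s,
            lambda s, m, p: s <= p))         # False: invalid

# EIO-2: No P's are M's; some S's are M's; therefore some S's are not P's.
print(valid(lambda s, m, p: p.isdisjoint(m),
            lambda s, m, p: bool(s & m),
            lambda s, m, p: bool(s - p)))    # True: valid
```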

Arguments presented in ordinary contexts, even when statable in categorical propositions, may not be simple syllogisms. Often essential premises are not stated, because they are so obvious and trivial as not to require mentioning. When an essential premise is not stated, the argument is called an enthymeme. Enthymematic arguments need to have their hidden premises made explicit before a test for validity can be made. In addition, arguments often contain more than two premises. Indeed, some arguments can be structured as a sequence of syllogisms, where preliminary conclusions are expressly drawn and then are used as premises in later syllogisms. Such a chain of subarguments is called a sorites. The English logician and novelist Lewis Carroll devised clever, whimsical sorites that have entertained students for more than 100 years. For instance, in Symbolic Logic (1896) he presented the following argument, whose conclusion was left unexpressed:

All my sons are slim.
No child of mine is healthy who takes no exercise.
All gluttons who are children of mine are fat.
No daughter of mine takes any exercise.
In addition, certain crucial premises of this argument—such as “No slim persons are fat persons”—have not been expressed.
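One way to make such chaining mechanical is to encode each premise, together with the hidden premises, as a one-step implication about an arbitrary child and close the rule set under contraposition; repeatedly applying the rules then yields everything that follows. The sketch below is an illustration only: the encoding, the literal names, and the reading of the suppressed conclusion (that no gluttonous child of mine is healthy) are a reconstruction, not Carroll's own formulation.

```python
# Literals are strings; "~" marks negation.
def neg(p):
    return p[1:] if p.startswith("~") else "~" + p

rules = [
    ("son", "slim"),            # All my sons are slim.
    ("~exercise", "~healthy"),  # No child of mine is healthy who takes no exercise.
    ("glutton", "fat"),         # All gluttons who are children of mine are fat.
    ("daughter", "~exercise"),  # No daughter of mine takes any exercise.
    ("slim", "~fat"),           # Hidden: no slim persons are fat persons.
    ("~son", "daughter"),       # Hidden: every child of mine is a son or a daughter.
]
# Every implication licenses its contrapositive as well.
rules += [(neg(b), neg(a)) for a, b in rules]

def consequences(start):
    # Forward-chain from a starting literal to everything derivable.
    reached, frontier = {start}, [start]
    while frontier:
        p = frontier.pop()
        for a, b in rules:
            if a == p and b not in reached:
                reached.add(b)
                frontier.append(b)
    return reached

# glutton -> fat -> ~slim -> ~son -> daughter -> ~exercise -> ~healthy
print("~healthy" in consequences("glutton"))  # True
```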

Other argument forms
The argument form most discussed and studied from the time of Aristotle to the early 19th century was the syllogism. But Aristotle himself noted that some arguments were expressed in propositions other than categorical ones. The following argument, for instance, has for its first premise a hypothetical proposition:

If all men are born equals, then all slaves are unjustly treated persons.
All men are born equals.
∴ All slaves are unjustly treated persons.
This is a hypothetical argument, often called a hypothetical syllogism. Hypothetical propositions have the form “If . . . , then . . . ,” where the word “then” is often omitted. When, as above, the conclusion is obtained by the second premise's affirming the antecedent, the argument is said to be by modus ponens. The conclusion in this case is the consequent of the hypothetical first premise.

A hypothetical argument can also be conducted by denying the consequent of the hypothetical premise and thereby concluding with a denial of the antecedent of the hypothetical. This form of hypothetical argument is called modus tollens, and the denials in either case are frequently expressed by the contradictory of the proposition at issue, either the antecedent or consequent of the hypothetical. An example of a modus tollens hypothetical argument is

If some persons are persons with rights to freedom, then all persons are persons with rights to freedom.
Not all persons are persons with rights to freedom.
∴ No persons are persons with rights to freedom.
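Both forms can be verified by the truth-table method: an argument form is valid just in case every assignment of truth values that makes the premises true also makes the conclusion true. A minimal Python sketch, assuming the truth-functional (material) reading of "if-then"; the classic fallacy of affirming the consequent is included for contrast:

```python
from itertools import product

def implies(p, q):  # truth function for "if p then q"
    return (not p) or q

def valid(premises, conclusion):
    # Valid iff every truth assignment making all premises true
    # also makes the conclusion true.
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if all(prem(p, q) for prem in premises))

# Modus ponens: if p then q; p; therefore q.
print(valid([implies, lambda p, q: p], lambda p, q: q))           # True

# Modus tollens: if p then q; not q; therefore not p.
print(valid([implies, lambda p, q: not q], lambda p, q: not p))   # True

# Affirming the consequent (a fallacy): if p then q; q; therefore p.
print(valid([implies, lambda p, q: q], lambda p, q: p))           # False
```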

Disjunctions are propositions in which the predicate is asserted to belong to one or another subject, or one or another predicate is asserted to belong to a subject: “Either A's or B's are C's, or A's are either B's or C's.” Another more complex disjunction takes two categorical propositions as alternatives: “Either A's are B's, or C's are D's.” A disjunctive argument (sometimes called a disjunctive syllogism) contains one of the three above disjunctive forms as one premise and the denial of one of the alternatives (disjuncts) as the second premise. The valid conclusion in these cases is the other alternative. A simple and traditional example is

Either God is unjust, or no men are eternally punished creatures.
God is not unjust.
∴ No men are eternally punished creatures.
The singular proposition here (“God is unjust”) is treated as a universal categorical proposition.

Sometimes the alternatives are meant to be exclusive—that is, if one is true, the other is false. When such is the case, a valid disjunctive argument can then be constructed by affirming one of the alternatives in a premise and subsequently concluding a denial of the other alternative. Thus,

Either Bacon or Shakespeare is the author of Hamlet.
Shakespeare is the author of Hamlet.
∴ Bacon is not the author of Hamlet.
Unfortunately, it is not always evident whether the disjunction is to be taken in the inclusive or the exclusive sense, and the careful logician will usually explicitly assert “A or B, but not both.” Examples of ambiguity of disjunction abound: “Newton or Leibniz is the discoverer of the calculus (possible codiscoverers)”; “All diplomats are liars or failures.”
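The difference between the two readings of "or" can be made precise with truth tables. In the sketch below, the disjunctive syllogism (denying one disjunct) comes out valid on either reading, while affirming one disjunct, as in the Bacon/Shakespeare argument, is valid only when the disjunction is exclusive:

```python
from itertools import product

def valid(premises, conclusion):
    # Valid iff every truth assignment making all premises true
    # also makes the conclusion true.
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if all(prem(p, q) for prem in premises))

inclusive = lambda p, q: p or q    # "p or q, possibly both"
exclusive = lambda p, q: p != q    # "p or q, but not both"

# Disjunctive syllogism: p or q; not p; therefore q.
print(valid([inclusive, lambda p, q: not p], lambda p, q: q))  # True
print(valid([exclusive, lambda p, q: not p], lambda p, q: q))  # True

# Affirming one disjunct: p or q; p; therefore not q.
print(valid([inclusive, lambda p, q: p], lambda p, q: not q))  # False
print(valid([exclusive, lambda p, q: p], lambda p, q: not q))  # True
```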

A combination of a disjunction and hypothetical propositions as premises gives rise to a type of argument known as a dilemma. The hypothetical propositions offer alternatives, either one of which leads to a (frequently unpalatable) conclusion. When the conclusions of both alternatives are the same, it is a simple dilemma; when they differ, it is a complex dilemma. If the antecedent of the hypothetical proposition is affirmed, and thus the consequent is also affirmed as conclusion, the argument is constructive. When the consequent is denied, and thus the antecedent is denied as conclusion, the argument is called destructive. Some illustrations of these types of dilemmas are displayed below. (For ease of reading, these propositions are not written in categorical form but are expressed as they would be colloquially.)

Simple constructive:

If a science furnishes useful facts, it is worthy of being cultivated; and if the study of it exercises the reasoning powers, it is worthy of being cultivated. But either a science furnishes useful facts, or its study exercises the reasoning powers. Therefore it is worthy of being cultivated.
(William Stanley Jevons, Elementary Lessons in Logic [1870].)

Complex constructive:

If there is censorship of the press, abuses of power will be concealed; and if there is no censorship, truth will be sacrificed to sensation. But there must either be censorship or not. Therefore either abuses of power will be concealed, or truth will be sacrificed to sensation.
(Horace William Brindley Joseph, An Introduction to Logic [1916].)

Destructive:

If this person were wise, he would not speak irreverently of Scripture in jest; and if he were good, he would not do so in earnest. But he does it either in jest or earnest. Therefore he is either not wise or not good.
(Richard Whately, Elements of Logic [1826].)
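The dilemma forms can likewise be certified by the truth-table method. In the sketch below, p and q stand for the two antecedents and r and s for their consequents; the simple constructive dilemma is the special case in which the consequents coincide:

```python
from itertools import product

def implies(a, b):  # truth function for "if a then b"
    return (not a) or b

# Complex constructive dilemma (the censorship example):
# if p then r; if q then s; p or q; therefore r or s.
def valid_complex():
    return all(r or s
               for p, q, r, s in product([True, False], repeat=4)
               if implies(p, r) and implies(q, s) and (p or q))

# Simple constructive dilemma (the science example) is the case r = s:
# if p then r; if q then r; p or q; therefore r.
def valid_simple():
    return all(r
               for p, q, r in product([True, False], repeat=3)
               if implies(p, r) and implies(q, r) and (p or q))

print(valid_simple(), valid_complex())  # True True
```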

Symbolic logic (formal logic)
A number of developments during the Renaissance and immediately thereafter—the period of the emergence of modern science—led to increasing dissatisfaction with the traditional logic of the syllogism. In particular, the development of functional relations in natural science, the shift of interest from geometry to algebra in mathematics, the concern for the logical foundations of mathematics, and the call for a language that would reveal logical relations by its very notation (compare Gottfried Wilhelm Leibniz' characteristica universalis) led to the developments in the 19th century that can be called the algebra of logic. Famously, the British mathematician and logician Augustus De Morgan (1847) found fault with the syllogism by pointing out that it cannot (easily) deal with the simple relational inference:

All horses are animals.
∴ All heads of horses are heads of animals.

Although various abbreviations were accomplished through symbols, even in the works of Aristotle himself, the use of symbols in an explicit formal system, the precursor of modern symbolic logic, began with George Boole (1847) and Ernst Schröder (1890–1905), was developed further by Gottlob Frege (1879), and finally culminated in the Principia Mathematica of Bertrand Russell and Alfred North Whitehead (1910–13). The formal systems of modern symbolic logic differ from earlier logical studies that used symbols in that, in the former, totally artificial languages are rigorously developed using special symbols for precisely defined logical concepts. The rules of this language, both the syntactic rules for deduction and the semantic rules for interpreting expressions, are explicitly and precisely stated. The development of these symbolic formal systems within which deductive arguments can be represented yields a number of distinct advantages. A high degree of rigour can be attained. The sharp separation of semantics from syntax leads to a clear distinction between the validity of an argument (semantics) and the deducibility of the conclusion from axioms and premises (syntax). Additionally, the formal system, once made totally explicit, can itself be the object of study.

The logical relations among whole sentences are the basis of the modern symbolic approach. In effect, hypothetical and disjunctive arguments rather than the categorical syllogism become the centre of attention. Beginning with simple sentences that have no simpler sentences as components, one constructs compound sentences using sentential connectives. The truth value (either true or false) of the compound sentence then depends on the truth values of its components in a clear and explicit manner, according to the function represented by the sentential connective. For instance, the propositional truth function called conjunction, which is frequently represented by “·” or “&,” has the value true when both the conjoined propositions have the value true; otherwise it has the value false. In other words, if p and q are arbitrary propositions, the sentence “p·q” represents a true proposition just in case both p and q are true propositions themselves. The formalization of these truth functions and the statement of the rules for inferring new sentences from earlier ones (the rules of inference) result in a formal system called the propositional calculus (PC).
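The truth-functional behaviour of the connectives is easy to exhibit directly, using Python's Boolean values to stand in for true and false. The sketch below prints the truth table for conjunction and then confirms a sample equivalence (one of De Morgan's laws) by checking every row:

```python
from itertools import product

def conj(p, q): return p and q   # the truth function written "p·q"
def disj(p, q): return p or q    # inclusive disjunction "p v q"
def neg(p):     return not p     # negation "~p"

# Truth table for conjunction: true only when both components are true.
for p, q in product([True, False], repeat=2):
    print(p, q, conj(p, q))

# Two compounds are logically equivalent when they agree on every row;
# e.g. one of De Morgan's laws: p·q is equivalent to ~(~p v ~q).
print(all(conj(p, q) == neg(disj(neg(p), neg(q)))
          for p, q in product([True, False], repeat=2)))  # True
```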

Yet PC cannot deal with arguments formerly handled by the categorical syllogism. Some way of dealing with the internal structure of simple sentences needs to be developed. The great power of modern logic is based on the important notion of a propositional function. A propositional function acts on a domain of individuals and has the value true or false, depending on which individual (or individuals) is the argument of the function. Thus, “___ is an even number” represents a propositional function whose value is true whenever the blank is filled by a numeral referring to an even number and false when the number is odd.

Instead of using expressions with blank spaces, which can be confusing if there is more than one blank, logicians utilize what are termed individual variables, expressions that hold open a place in a sentence fragment for the name of some individual. Individual variables are frequently lowercase letters from the end of the alphabet. So the example in the previous paragraph would be written: “x is an even number.” This expression can become a sentence when the variable “x” is replaced by the name of some thing—a true sentence when that thing is an even number. There are other ways to convert such expressions into sentences. One can prefix the expression with a universal quantifier, “For all x.” Now the resulting sentence, “For all x, x is an even number,” expresses the false proposition that everything is an even number. Furthermore, prefixing the expression with an existential quantifier, “There is at least one x,” yields the true sentence, “There is at least one thing such that it is an even number.”
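Over a finite domain, the two quantifiers reduce to iterated conjunction and disjunction, which is exactly what Python's all() and any() compute. A sketch using the running example (the ten-element domain is an arbitrary choice for illustration):

```python
domain = range(10)  # a finite domain of individuals: 0 through 9

def is_even(x):  # the propositional function "x is an even number"
    return x % 2 == 0

# "For all x, x is an even number" is false for this domain:
print(all(is_even(x) for x in domain))   # False
# "There is at least one x such that x is an even number" is true:
print(any(is_even(x) for x in domain))   # True
```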

Being an even number is a property that some individuals can have. Expressions that attribute a property to an individual are (monadic) predicates. It is customary to express simple predicates by uppercase letters placed before the individual term. Thus if E is used for the predicate “is an even number,” the expression Ex is intended to represent “x is an even number.” Using monadic predicates, quantifiers, individual variables, and the sentential connectives developed in PC, it is possible to express all the categorical syllogisms and subsequently determine their validity. When rules of inference and possibly axioms are introduced, this system is called the monadic predicate calculus. When relations are asserted to hold between two or more individuals, additional, n-adic, predicates enter the language. For example, using the uppercase letter L to express the dyadic relation of being less than, and taking a and b to be any (not necessarily different) numbers, one can assert that a is less than b by writing: Lab. The notation of dyadic relation symbols allows a simple expression, and solution, of De Morgan's problem, mentioned above, about heads of horses. One may even introduce the notion of predicate variables; but, as long as there is no quantification over predicate variables, the resulting formal system is called the lower predicate calculus (LPC).
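De Morgan's horses inference can be illustrated in a small finite model, with the dyadic predicate rendered as a set of ordered pairs (all individual names below are invented). The class of heads of members of a class is the image of that class under the head-of relation, and taking images preserves inclusion, which is what the syllogistic could not easily express:

```python
# H is the dyadic relation "x is the head of y", given as ordered pairs.
horses  = {"dobbin"}
animals = {"dobbin", "rex"}
H       = {("dobbin_head", "dobbin"), ("rex_head", "rex")}

def heads_of(cls):
    # the class of heads of members of cls: {x : H(x, y) for some y in cls}
    return {x for x, y in H if y in cls}

print(horses <= animals)                      # premise: all horses are animals
print(heads_of(horses) <= heads_of(animals))  # conclusion holds in this model
```

Because the image operation is monotone with respect to inclusion, the conclusion holds in every model of the premise, not just in this one; that is what makes the inference valid.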

One further extension of LPC is usually made in modern logic. One special dyadic relation, represented by the equality sign, “=,” placed between two terms, is taken to be the identity relation. Depending on the type of formal system that is being considered, either axioms of identity (e.g., “Everything is self-identical”) are adopted or else rules of inference governing transformations (e.g., “From any conclusion ϕ containing the name a and an earlier line of derivation, a = b, infer a new conclusion ϕ′ containing b for some occurrences of a”) are added to the earlier rules of the system. The resulting system, which in effect restricts the possible interpretations of LPC to the identity relation for the dyadic predicate “=,” is called LPC with identity (or sometimes first-order logic with identity). Several considerations suggest that this is the most comprehensive logical system possible and that any other additions will no longer result in all logical truths, and only logical truths, as theorems.

In formal systems the emphasis shifts from arguments to deducing conclusions. The rules of inference of the system allow various transformations on, or inferences from, initial sequences of symbols. When no additional material assumptions are used, the final line of any such derivation is called a theorem of logic. When, however, assumptions about some field of inquiry are incorporated into the formal system, the theorems derived by using the rules of the system are theorems of the material theory. Thus, if certain postulates about the behaviour of moving bodies are laid down, one would derive theorems of kinematics—and similarly for arithmetic, geometry, and so on.

Modern logic in the last part of the 20th century can be divided into four major areas of investigation. The first area is proof theory, the study of the properties of formal systems and the derivations that can be accomplished within them. The second area is model theory, which investigates the various structures about which formal theories can be constructed. Here the emphasis is on what cannot be validly deduced from a set of material hypotheses. One attempts to find structures about which the hypotheses are true and yet for which a particular statement is false. Third is recursion theory, which deals with the decidability of whether a sentence is deducible from a given set of premises. This study has led to theories of computability, or the existence of mechanical procedures for solving problems associated with deducibility. Finally, there is the broad area of the foundations of mathematics (mathematics, foundations of), especially the logical grounding of the basic notions of set theory.

Applications of the formal methods of logic have burgeoned with the development of novel semantic devices such as "possible worlds." It is now possible to provide a semantics for various modal logics dealing with such topics as necessarily true propositions, known propositions (as distinct from those merely believed), obligatory actions, and the structure of temporal relations. Previously, formulas of modal logic were merely uninterpreted sequences of symbols with no clear meanings. In addition, grammatical studies within the general field of linguistics have benefited from the seminal work of the American logician Richard Montague (1970) and subsequent developments.
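The possible-worlds idea can be made concrete on a finite model. In the sketch below, with invented world names and an invented accessibility relation, "necessarily p" (□p) is evaluated at a world w as "p holds at every world accessible from w":

```python
# A minimal sketch of possible-worlds (Kripke) semantics for the modal
# operator "necessarily": □p is true at world w just in case p is true
# at every world accessible from w. Worlds, the accessibility relation,
# and the valuation of p are all invented for illustration.

access = {"w1": {"w2", "w3"}, "w2": {"w2"}, "w3": set()}
true_at = {"p": {"w2", "w3"}}          # the proposition p holds at w2, w3

def holds(world, prop):
    return world in true_at[prop]

def necessarily(world, prop):
    return all(holds(v, prop) for v in access[world])

print(necessarily("w1", "p"))  # True: p holds at both accessible worlds
print(necessarily("w3", "p"))  # True vacuously: w3 accesses no world
print(holds("w1", "p"))        # False: p itself fails at w1
```

Different constraints on the accessibility relation (reflexivity, transitivity, and so on) yield the different modal logics mentioned above.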

Inductive (induction) logic
Inductive arguments are intended to support their conclusions only to some degree; the premises do not necessitate the conclusion. Traditionally, the study of inductive logic was confined to either arguments by analogy or else methods of arriving at generalizations (generalization) on the basis of a finite number of observations. A typical argument by analogy proceeds from the premise that two objects are observed to be similar with respect to a number of attributes to the conclusion that the two objects are also similar with respect to another attribute. The strength of such arguments depends on the degree to which the attributes in question are related to each other.

The methods appropriate to inductive generalizations have been studied by modern philosophers from Francis Bacon in the early 17th century to William Whewell and John Stuart Mill in the 19th century. Proper inductive generalizations require that the observed instances referred to in the premises be obtained according to a careful method of varying the circumstances of observations, a rigorous search for exceptional cases, and attempts to detect correlations or dependencies among the various phenomena.

In the 20th century, most notably in the work of Hans Reichenbach (Reichenbach, Hans) (1938), a distinction has been made between the context of discovery and the context of justification, between the nonlogical process for arriving at a general hypothesis and the logical relations that obtain between the hypothesis and the evidence for it—the so-called hypothetico-deductive method. In modern inductive logic, the probability (probability theory) calculus, or some variant of it, is called upon to explicate the notion of how observed evidence logically supports a theoretical hypothesis.
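A minimal sketch of how the probability calculus can explicate evidential support, using Bayes' theorem: evidence E confirms hypothesis H when the posterior P(H | E) exceeds the prior P(H). The numerical values below are invented for illustration:

```python
# Bayesian confirmation sketch: compute P(H | E) from a prior P(H) and
# the likelihoods P(E | H) and P(E | not-H), then compare it with the
# prior. All probabilities here are hypothetical.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H | E) by Bayes' theorem, with P(E) expanded over H and ¬H."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.3            # P(H): initial credence in the hypothesis
post = posterior(prior, p_e_given_h=0.9, p_e_given_not_h=0.2)

print(round(post, 3))  # 0.659: the evidence raises the probability of H
print(post > prior)    # True: on this explication, E confirms H
```

On this account the degree of support is a logical relation between hypothesis and evidence, in keeping with Reichenbach's separation of the context of justification from the context of discovery.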

Morton L. Schagrin

The best starting point for exploring any of the topics in logic is D. Gabbay and F. Guenthner (eds.), Handbook of Philosophical Logic, 4 vol. (1983–89), a comprehensive reference work. See also Gerald J. Massey, Understanding Symbolic Logic (1970), an introductory text; and Robert E. Butts and Jaakko Hintikka, Logic, Foundations of Mathematics, and Computability Theory (1977), a collection of conference papers.

* * *

Universalium. 2010.
