CHAPTER 9: INDUCTION

9.1 Statement Strength

Strong statement: is true only under specific circumstances; the world must be just so in order for it to be true. E.g., every vertebrate has a heart; Bob's house is the third on the left on Main Street.

Weak statement: is true under a wide variety of possible circumstances; it says nothing specific and demands little of the world for its truth. E.g., something is happening somewhere; some people are sort of weird.

Rules of relative strength

Rule 1: If statement A deductively implies statement B, but B does not deductively imply A, then A is stronger than B. E.g., all cows are horned; some cows are horned

Rule 2: If statement A is logically equivalent to statement B, then A and B are equal in strength. E.g., John loves Mary; Mary is loved by John

9.2 Statistical Syllogism

Two types of inductive arguments

Statistical inductive argument: an inductive argument which does not presuppose the uniformity of nature.

E.g., 98% of college freshmen can read beyond the 6th-grade level; David is a college freshman; therefore, David can read beyond the 6th-grade level.

Humean inductive argument: an inductive argument which presupposes the uniformity of nature.

E.g., each of the 100 college freshmen surveyed knows how to spell "logic"; therefore, if we ask another college freshman, he or she will also know how to spell "logic".

Two interpretations of inductive probability

Logical: the inductive probability is the percentage figure in the statistical premise divided by 100

Subjective: inductive probability is a measure of a particular rational person’s degree of belief in the conclusion, given its premises

Statistical Syllogism (general to specific)

Allows us to arrive at a conclusion concerning a member of a set from statistics concerning a set of individuals

n percent of F are G; x is F; therefore, x is G
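The schema above, read under the logical interpretation of inductive probability, can be sketched in code. This is an illustration only; the function name and structure are my own, not from the text.

```python
# Sketch of the statistical syllogism "n percent of F are G; x is F;
# therefore, x is G", with the logical interpretation of inductive
# probability (the percentage figure divided by 100).

def statistical_syllogism(percent_of_f_that_are_g: float, x_is_f: bool) -> float:
    """Return the inductive probability of the conclusion "x is G"."""
    if not x_is_f:
        raise ValueError("The syllogism requires the premise that x is F.")
    return percent_of_f_that_are_g / 100.0

# 98% of college freshmen read beyond the 6th-grade level; David is a freshman.
prob_david_reads = statistical_syllogism(98, x_is_f=True)  # 0.98
```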

9.3 Statistical Generalization (specific to general)

Statistical generalization (specific to general): allows us to arrive at a conclusion concerning an entire population from a premise concerning a random sample of that population

Formula

n percent of s (number) randomly selected F are G;

therefore, about n percent of all F are G

Fallacy of small sample: the conclusion is too strong to be supported by the sample size given in the premises
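Why small samples license only weak conclusions can be shown with a short simulation (invented data, not from the text): draw random samples from a population in which exactly 60% of F are G and compare how far the sample percentage strays from the truth for small versus large samples.

```python
import random

def sample_percentage(population, sample_size, rng):
    """Percentage of G-members (1s) in a random sample of the population."""
    sample = rng.sample(population, sample_size)
    return 100.0 * sum(sample) / sample_size

rng = random.Random(0)              # fixed seed so the run is reproducible
population = [1] * 600 + [0] * 400  # exactly 60% of this population are G

small_errors = [abs(sample_percentage(population, 5, rng) - 60) for _ in range(1000)]
large_errors = [abs(sample_percentage(population, 200, rng) - 60) for _ in range(1000)]

# The average error shrinks as the sample grows: the strong conclusion
# "about n percent of ALL F are G" needs the larger sample to support it.
avg_small = sum(small_errors) / len(small_errors)
avg_large = sum(large_errors) / len(large_errors)
```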

9.4 Inductive Generalization and Simple Induction

Inductive generalization (specific to general): allows us to arrive at a conclusion concerning an entire population when it is not possible to obtain a random sample (e.g., they may involve future events)

Formula

n percent of s (number) thus-far-observed F are G;

therefore, about n percent of all F are G

Fallacy of biased sample: attempts to apply statistical generalization with a nonrandom sampling technique (a type of hasty generalization)

The success of statistical generalization depends on randomness of sampling

Simple induction

Inductive generalization where the population in the conclusion is reduced to one individual (the argument is strengthened by weakening the conclusion)

The argument gets stronger when the percentage is over 50, but weaker when under 50

Formula

n percent of the s thus-far-observed F are G

Therefore, if one more F is observed, it will be G

9.5 Induction by Analogy

Formula

F_{1}x & F_{2}x
& . . . & F_{n}x [object x has several properties]

F_{1}y & F_{2}y
& . . . & F_{n}y [object y has these same properties]

Gy [object y has an additional property]

Therefore, Gx [object x probably has that additional property too]
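The analogical schema above can be sketched with sets of properties. The example objects and the idea that strength grows with the number of shared relevant properties are illustrative assumptions, not part of the formula itself.

```python
# Sketch of induction by analogy: x and y share properties F1..Fn,
# y also has G, so x probably has G as well.

def analogy(props_x: set, props_y: set, g: str) -> tuple[bool, int]:
    """Return (whether G transfers to x by analogy, number of shared properties)."""
    shared = props_x & props_y
    applicable = g in props_y and len(shared) > 0
    return applicable, len(shared)

# Hypothetical example: Mars resembles Earth in several respects.
mars  = {"orbits the sun", "rotates", "has an atmosphere"}
earth = {"orbits the sun", "rotates", "has an atmosphere", "supports life"}

applies, n_shared = analogy(mars, earth, "supports life")
# Three shared properties; a relevant disanalogy would count against this.
```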

Relevant disanalogy: contrary evidence to analogical arguments

9.6 Mill's Methods

Two-step procedure for determining the cause of an observed effect

1. Formulate a list of suspected causes (including the actual cause)

2. Rule out by observation as many suspected causes as possible down to one

Four kinds of causes

Necessary cause (causally necessary condition): a condition needed to produce a certain effect

The effect will never occur without the cause (although the cause can occur without the effect)

e.g., fuel is a necessary cause of fire (fire will never occur without fuel, although fuel can occur without fire)

Conditional statement: if a given effect (fire) occurs, then a given cause (fuel) must be present

Sufficient cause (causally sufficient condition): a condition which always produces a certain effect

The cause will never occur without the effect (although the effect can occur without the cause)

e.g., decapitation is a sufficient cause of death in higher animals (decapitation never occurs without death, although death can occur without decapitation)

Conditional statement: If you have a given cause (decapitation of higher animals), then it will always result in a given effect (death)

Necessary and sufficient causes: a condition that is needed to produce and will always produce a certain effect

The effect will never occur without the cause, nor the cause without the effect

e.g., the presence of a massive body is a necessary and sufficient cause for the presence of a gravitational field

Conditional statement: a given cause (massive body) occurs if and only if a given effect (gravitational field) occurs

Causal dependence of one variable quantity on another: a variable quantity B is causally dependent on a second variable quantity A if a change in A always produces a corresponding change in B

e.g., the brightness of an object B varies inversely with the square of the distance A from that object

Conditional statement: if you change a given variable quantity (distance) then this will result in a change to another variable quantity (brightness)

Method of Agreement (necessary causes of E)

A deductive procedure for ruling out suspected causally necessary conditions, with the goal of narrowing the list down to one

e.g., several students get sick eating at the cafeteria; examine what they ate in common

Mill’s wording: “If two or more instances of the phenomenon under investigation have only one circumstance in common, the circumstance in which alone all the instances agree, is the cause (or effect) of the given phenomenon.”
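The cafeteria example amounts to a set intersection: a suspected necessary cause is ruled out if any instance of the effect occurred without it. A minimal sketch with invented data:

```python
# Method of Agreement: keep only circumstances present in EVERY instance
# of the effect E; anything E occurred without cannot be necessary for E.

def method_of_agreement(instances_with_effect):
    surviving = set(instances_with_effect[0])
    for circumstances in instances_with_effect[1:]:
        surviving &= set(circumstances)  # absent here => not necessary
    return surviving

# Hypothetical data: what each sick student ate.
sick_students_ate = [
    {"salad", "soup", "fish"},
    {"fish", "fries"},
    {"fish", "soup", "cake"},
]
suspected_necessary = method_of_agreement(sick_students_ate)  # {"fish"}
```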

Method of Difference (sufficient causes of E)

A procedure for narrowing down a list of suspected sufficient causes for an effect E by rejecting any item on the list that occurs without E, with the goal of narrowing the list down to one

e.g., several students get sick eating at the cafeteria; rule out any suspected food that was also eaten by students who did not get sick

Mill’s wording: “If an instance in which the phenomenon under investigation occurs, and an instance in which it does not occur, have every circumstance in common save one, that one occurring only in the former; the circumstance in which alone the two instances differ, is the effect, or the cause, or an indispensable part of the cause, of the phenomenon.”
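The difference-side filter is a set subtraction: a suspected sufficient cause is rejected if it ever occurs without the effect. A sketch, continuing the invented cafeteria data:

```python
# Method of Difference: reject any suspected sufficient cause of E that
# occurs in an instance where E is absent.

def method_of_difference(suspects, instances_without_effect):
    surviving = set(suspects)
    for circumstances in instances_without_effect:
        surviving -= set(circumstances)  # occurred without E => not sufficient
    return surviving

# Hypothetical data: foods eaten by students who did NOT get sick.
suspects = {"salad", "soup", "fish"}
healthy_students_ate = [{"salad", "fries"}, {"soup", "cake"}]
suspected_sufficient = method_of_difference(suspects, healthy_students_ate)  # {"fish"}
```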

Method of Agreement and Difference (necessary and sufficient causes of E)

A procedure for eliminating items from a list of suspected necessary and sufficient causes of an effect E by simultaneously applying the methods of agreement and difference

Mill’s wording: “If two or more instances in which the phenomenon occurs have only one circumstance in common, while two or more instances in which it does not occur have nothing in common save the absence of that circumstance: the circumstance in which alone the two sets of instances differ, is the effect, or cause, or a necessary part of the cause, of the phenomenon.”
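The joint method simply applies both filters at once: a necessary-and-sufficient cause must be present in every instance with E and absent from every instance without E. A sketch with invented data:

```python
# Joint Method of Agreement and Difference.

def agreement_and_difference(with_effect, without_effect):
    surviving = set(with_effect[0])
    for circumstances in with_effect[1:]:
        surviving &= set(circumstances)   # agreement: present whenever E occurs
    for circumstances in without_effect:
        surviving -= set(circumstances)   # difference: never occurs without E
    return surviving

cause = agreement_and_difference(
    [{"salad", "fish"}, {"fish", "soup"}],  # instances where E occurred
    [{"salad", "soup"}],                    # instance where E did not occur
)  # {"fish"}
```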

Method of Concomitant variation (quantities on which the magnitude of E is causally dependent)

A procedure for narrowing down a list of variable magnitudes suspected of being the cause of a specific change in the magnitude of an effect E, where a variable is rejected if it remains constant throughout the change in E

e.g., give differing portions of contaminated food to different people to see how sick they get

Mill’s wording: “Whatever phenomenon varies in any manner whenever another phenomenon varies in some particular manner, is either a cause or an effect of that phenomenon, or is connected with it through some fact of causation.”
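The elimination step here is: reject any suspected variable that stays constant while the magnitude of E changes. A sketch with invented observations:

```python
# Method of Concomitant Variation: a variable that never changes across
# observations cannot be what the magnitude of E causally depends on.

def concomitant_variation(observations, effect_key):
    """Keep the variables that actually varied while E varied."""
    variables = [k for k in observations[0] if k != effect_key]
    return {
        k for k in variables
        if len({obs[k] for obs in observations}) > 1  # constant => ruled out
    }

# Hypothetical data: sickness severity varies with portion eaten,
# while water drunk stays constant.
obs = [
    {"portion_eaten": 1, "water_drunk": 2, "sickness": 1},
    {"portion_eaten": 2, "water_drunk": 2, "sickness": 3},
    {"portion_eaten": 3, "water_drunk": 2, "sickness": 5},
]
candidates = concomitant_variation(obs, "sickness")  # {"portion_eaten"}
```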

9.7 Scientific Theories

Terms:

Scientific theory: an account of some natural phenomenon which in conjunction with further known facts or conjectures (auxiliary hypotheses) enables us to deduce consequences which can be tested by observation

Model: a physical or mathematical structure claimed to be analogous in some respect to the phenomenon for which the theory provides an account

Confirmation through successful prediction

A theory’s predictions are deduced from the theory (plus auxiliaries); if the prediction proves false, then either the theory or one of the auxiliaries must be false

If we are confident of the auxiliaries, then the fault rests with the theory

Confidence in a theory should never be absolute: even if all predictions so far are successful, there may be some untested prediction that is false

A theory becomes more probable with more successful predictions

Principle of scientific probability: If E is some initial body of evidence (including auxiliary hypotheses) and C is the additional verification of some of the theory’s predictions, the probability of the theory given E & C is higher than the probability of the theory given E alone
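The principle can be illustrated with a one-line Bayesian calculation (my framing, not the text's): since the prediction C is deduced from the theory T plus auxiliaries, P(C given T and E) = 1, so verifying C raises the theory's probability whenever C was not already certain. The numbers below are illustrative.

```python
# Bayesian sketch of confirmation through successful prediction.
# Because C is deduced from T (plus auxiliaries), P(C | T & E) = 1.

def posterior_given_verified_prediction(prior_t, prob_c_given_not_t):
    """P(T | E & C) by Bayes' theorem, assuming P(C | T & E) = 1."""
    prob_c = prior_t * 1.0 + (1 - prior_t) * prob_c_given_not_t
    return prior_t * 1.0 / prob_c

prior = 0.3  # P(T | E): illustrative value
posterior = posterior_given_verified_prediction(prior, prob_c_given_not_t=0.5)
# posterior > prior: each verified prediction makes the theory more probable,
# though confidence never reaches certainty.
```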