It is enlightening to try to define terms properly when trying to understand the foundations of a scientific domain. In the case of the learning phenomenon, the distinction between deduction and induction is a crucial one.
Deductive reasoning consists of combining logical statements according to certain agreed-upon rules in order to obtain new statements. This is how mathematicians prove theorems from axioms: proving a theorem is nothing but combining a small set of axioms according to certain rules. Of course, this does not mean proving a theorem is a simple task, but it could, in principle, be automated.
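To illustrate how mechanical this process can be, here is a minimal sketch in Python (the facts and rules are invented for illustration, not a real theorem prover): forward chaining repeatedly applies modus ponens until no new statement can be derived.

```python
# Minimal illustrative sketch (facts and rules invented for this example):
# mechanical deduction by forward chaining, i.e. repeatedly applying
# modus ponens ("A implies B" and "A" give "B") until nothing new follows.

axioms = {"socrates is a man"}
implications = [
    ("socrates is a man", "socrates is mortal"),
    ("socrates is mortal", "socrates will die"),
]

def deduce(facts, implications):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in implications:
            if premise in facts and conclusion not in facts:
                facts.add(conclusion)  # each step is a valid, checkable rule application
                changed = True
    return facts

print(deduce(axioms, implications))
# {'socrates is a man', 'socrates is mortal', 'socrates will die'}
```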
Inductive reasoning consists of constructing the axioms from observations of their supposed consequences. This is what scientists, physicists for example, do: observing natural phenomena, they postulate the laws of Nature.
Both deduction and induction have limitations. One limitation of deduction is exemplified by Gödel's incompleteness theorem, which essentially states that for any rich enough (and consistent) set of axioms, one can produce statements that can neither be proved nor disproved.
Induction, on the other hand, is limited in that it is impossible to prove that an inductively obtained statement is correct. At most, one can empirically observe that the deductions made from this statement do not contradict experiments; one can never be sure that no future observation will contradict the statement.
Hello Olivier,
As you point out, deduction and induction are two powerful reasoning paradigms, even if they both have limitations. Deduction goes from the general to the particular, and induction goes from the particular to the general. Following another post explaining the difference between probability and statistics, one could say that deduction is to induction what probability is to statistics.
One thing to note is that induction alone is not that useful: the induction of a model (a piece of general knowledge) is interesting only if you can use it, i.e. if you can apply it to new situations by somehow going from the general to the particular.
Two other kinds of reasoning may also be mentioned, namely abduction and analogy.
Like deduction, abduction relies on knowledge expressed through general rules. Like deduction, it goes from the general to the particular, but it does so in an unusual manner, since it infers causes from consequences: from "A implies B" and "B", A can be inferred. This kind of inference may seem strange or weak at first glance. In fact, it can be very powerful too. For example, most of a doctor's work consists of inferring diseases from symptoms, which is exactly what abduction is about: "I know the general rule which states that flu implies fever. I'm observing fever, so there must be flu." However, abduction is not able to build new general rules: induction must have been involved at some point to state that "flu implies fever".
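To make the inference pattern explicit, here is a minimal sketch (the rule base and helper function are invented for illustration): abduction searches the known rules for causes whose consequences match the observation, and returns them as hypotheses rather than proofs.

```python
# Minimal illustrative sketch (rule base invented for this example):
# abduction proposes causes whose known consequences match an observation.

rules = {
    "flu": {"fever", "fatigue"},   # "flu implies fever and fatigue"
    "sunburn": {"red skin"},
}

def abduce(observation, rules):
    # From "A implies B" and "B", propose A.  The result is only a plausible
    # hypothesis, not a logically valid conclusion.
    return [cause for cause, effects in rules.items() if observation in effects]

print(abduce("fever", rules))  # ['flu'] -- a hypothesis, not a proof
```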
Analogy is another kind of reasoning, which goes from the particular to the particular. The most basic form of analogy is based on the assumption that similar situations have similar properties: from "P(a)" and "a ~ b", "P(b)" can be inferred. More complex analogy-based learning schemes, involving several situations and recombinations, can also be considered. Lots of lawyers use analogical reasoning to analyse new problems based on previous cases. What is really interesting about analogy is that it completely bypasses the model construction: instead of going from the particular to the general, and then from the general to the particular, it goes directly from the particular to the particular.
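As a rough illustration (the cases and the similarity measure below are invented, in the spirit of the lawyer example), analogy can be sketched as reusing the outcome of the most similar known case, without ever building a general rule:

```python
# Minimal illustrative sketch (cases and similarity measure invented):
# analogy transfers a property from a known case to a similar new case,
# going from the particular to the particular with no general model.

known_cases = {
    "case A": ({"contract breach", "written agreement"}, "damages awarded"),
    "case B": ({"verbal agreement", "no witnesses"}, "claim dismissed"),
}

def similarity(f1, f2):
    # crude overlap measure: fraction of shared features
    return len(f1 & f2) / len(f1 | f2)

def by_analogy(new_features, cases):
    # From P(a) and a ~ b, conclude P(b): reuse the outcome of the closest case.
    best_features, outcome = max(cases.values(),
                                 key=lambda case: similarity(new_features, case[0]))
    return outcome

print(by_analogy({"contract breach", "late delivery"}, known_cases))
# 'damages awarded'
```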
Posted by: Nicolas Stroppa | September 23, 2005 at 09:42 PM
Hi Nicolas,
Thanks for the comment.
You are perfectly right, I omitted these, and I am glad that you pointed them out.
It seems to me that abduction is just a special type of deduction, in the sense that abductive reasoning consists of applying logical rules to combine statements and obtain new ones. In your examples, if you just use the contrapositive statement, you can get to the conclusion by standard deduction.
So I may be wrong, but abduction does not seem to form a different class from deduction.
However, what you call analogy is closely related to what some people call transduction (the term was coined by Vapnik): the fact that you make predictions on particular instances without building a full model, but directly from the observations at hand.
This seems to be quite different from deduction and induction. An interesting question would be what kind of limitations this type of reasoning has.
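As a toy illustration of what I mean (the data and the nearest-neighbour rule below are just an example I made up, not Vapnik's formulation), a transductive prediction can be made directly from the labelled observations, without ever writing down a model:

```python
# Toy illustrative sketch (data invented; nearest-neighbour chosen only as an example):
# a prediction on a particular test point made directly from the labelled
# observations at hand, without building an explicit model first.

labelled = [((1.0, 1.0), "+"), ((1.2, 0.8), "+"), ((4.0, 4.2), "-")]

def predict(x, labelled):
    def squared_distance(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    # label x by its closest labelled neighbour: particular-to-particular inference
    nearest = min(labelled, key=lambda pair: squared_distance(x, pair[0]))
    return nearest[1]

print(predict((1.1, 0.9), labelled))  # '+'
```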
There are possibly many other interesting questions related to analogy/transduction. For example, there has been some debate in the ML community about whether it is better to do transduction than induction. V. Vapnik strongly advocates transduction as a more powerful method than induction, but I do not completely agree with his views...
I would be interested to know your opinion, as someone who has worked on analogy.
Posted by: Olivier Bousquet | September 26, 2005 at 11:50 PM