Empiricism

Any theory emphasizing sense-experience (including introspection) rather than reason or intuition as the basis for either some or all of our knowledge; ‘basis’ referring usually to justification, though sometimes to psychological origin.

Empiricism can concern either concepts or propositions, rejecting the a priori status of most or all of them. For John Locke (1632–1704) all concepts (‘ideas’) were empirical, but propositions connecting them could be known a priori. Extreme empiricists confine our knowledge to immediate experience (and introspection), and so fall into the egocentric predicament.

Less extreme ones allow other knowledge to be reached from such experience, or confine their empiricism to certain spheres. Empiricists usually allow propositions of mathematics and logic to be a priori, but insist that they are analytic (that is, roughly, reducible to tautologies) rather than synthetic (embodying substantive knowledge).

Also see: rationalism

Source:
D Odegard, ‘Locke as an Empiricist’, Philosophy (1965)

History

Background

A central concept in science and the scientific method is that conclusions must be empirically based on the evidence of the senses. Both natural and social sciences use working hypotheses that are testable by observation and experiment. The term semi-empirical is sometimes used to describe theoretical methods that make use of basic axioms, established scientific laws, and previous experimental results in order to engage in reasoned model building and theoretical inquiry.

Philosophical empiricists hold that no knowledge is properly inferred or deduced unless it is derived from sense-based experience.[7] This view is commonly contrasted with rationalism, which states that knowledge may be derived from reason independently of the senses. The contrast was not absolute, however: John Locke held that some knowledge (e.g. knowledge of God’s existence) could be arrived at through intuition and reasoning alone, and Robert Boyle, a prominent advocate of the experimental method, held that we have innate ideas.[8][9] Conversely, the main continental rationalists (Descartes, Spinoza, and Leibniz) were also advocates of the empirical “scientific method”.[10][11]

Early empiricism

Between 600 and 200 BCE

Between 600 and 200 BCE, the Vaisheshika school of Hindu philosophy, founded by the ancient Indian philosopher Kanada, accepted perception and inference as the only two reliable sources of knowledge.[12][13][14] This is enumerated in his work Vaiśeṣika Sūtra.

c. 330 BCE – 400 CE

The earliest Western proto-empiricists were the Empiric school of ancient Greek medical practitioners, founded in 330 BCE.[15] Its members rejected the three doctrines of the Dogmatic school, preferring to rely on the observation of phantasiai (i.e., phenomena, the appearances).[16] The Empiric school was closely allied with the Pyrrhonist school of philosophy, which made the philosophical case for their proto-empiricism.

The notion of tabula rasa (“clean slate” or “blank tablet”) connotes a view of mind as an originally blank or empty recorder (Locke used the words “white paper”) on which experience leaves marks. This denies that humans have innate ideas. The notion dates back to Aristotle, c. 350 BCE:

What the mind (nous) thinks must be in it in the same sense as letters are on a tablet (grammateion) which bears no actual writing (grammenon); this is just what happens in the case of the mind. (Aristotle, On the Soul, 3.4.430a1).

Aristotle’s explanation of how this was possible was not strictly empiricist in a modern sense, but rather based on his theory of potentiality and actuality, and experience of sense perceptions still requires the help of the active nous. These notions contrasted with Platonic notions of the human mind as an entity that pre-existed somewhere in the heavens, before being sent down to join a body on Earth (see Plato’s Phaedo and Apology, as well as others). Aristotle was taken to give sense perception a more important role than Plato did, and commentators in the Middle Ages summarized one of his positions as “nihil in intellectu nisi prius fuerit in sensu” (Latin for “nothing in the intellect without first being in the senses”).

This idea was later developed in ancient philosophy by the Stoic school, from about 330 BCE. Stoic epistemology generally emphasized that the mind starts blank, but acquires knowledge as the outside world is impressed upon it.[17] The doxographer Aetius summarizes this view as “When a man is born, the Stoics say, he has the commanding part of his soul like a sheet of paper ready for writing upon.”[18]

Islamic Golden Age and Pre-Renaissance (5th to 15th centuries CE)

During the Middle Ages (from the 5th to the 15th century CE) Aristotle’s theory of tabula rasa was developed by Islamic philosophers, starting with Al Farabi (c. 872 – 951 CE), developing into an elaborate theory by Avicenna (c. 980 – 1037)[19] and demonstrated as a thought experiment by Ibn Tufail.[20] For Avicenna (Ibn Sina), for example, the tabula rasa is a pure potentiality that is actualized through education, and knowledge is attained through “empirical familiarity with objects in this world from which one abstracts universal concepts”, developed through a “syllogistic method of reasoning in which observations lead to propositional statements which when compounded lead to further abstract concepts”. The intellect itself develops from a material intellect (al-‘aql al-hayulani), a potentiality, to the active intellect (al-‘aql al-fa’il), “the state of the human intellect in conjunction with the perfect source of knowledge”.[19] So the immaterial “active intellect”, separate from any individual person, is still essential for understanding to occur.

In the 12th century CE the Andalusian Muslim philosopher and novelist Abu Bakr Ibn Tufail (known as “Abubacer” or “Ebn Tophail” in the West) included the theory of tabula rasa as a thought experiment in his Arabic philosophical novel, Hayy ibn Yaqdhan in which he depicted the development of the mind of a feral child “from a tabula rasa to that of an adult, in complete isolation from society” on a desert island, through experience alone. The Latin translation of his philosophical novel, entitled Philosophus Autodidactus, published by Edward Pococke the Younger in 1671, had an influence on John Locke’s formulation of tabula rasa in An Essay Concerning Human Understanding.[20]

A similar Islamic theological novel, Theologus Autodidactus, was written by the Arab theologian and physician Ibn al-Nafis in the 13th century. It also dealt with the theme of empiricism through the story of a feral child on a desert island, but departed from its predecessor by depicting the development of the protagonist’s mind through contact with society rather than in isolation from society.[21]

During the 13th century Thomas Aquinas incorporated into scholasticism the Aristotelian position that the senses are essential to the mind. Bonaventure (1221–1274), one of Aquinas’ strongest intellectual opponents, offered some of the strongest arguments in favour of the Platonic idea of the mind.

Renaissance Italy

In the late Renaissance various writers began to question the medieval and classical understanding of knowledge acquisition in a more fundamental way. In political and historical writing Niccolò Machiavelli and his friend Francesco Guicciardini initiated a new realistic style of writing. Machiavelli in particular was scornful of writers on politics who judged everything in comparison to mental ideals and demanded that people should study the “effectual truth” instead. Their contemporary, Leonardo da Vinci (1452–1519) said, “If you find from your own experience that something is a fact and it contradicts what some authority has written down, then you must abandon the authority and base your reasoning on your own findings.”[22]

Significantly, an empirical metaphysical system was developed by the Italian philosopher Bernardino Telesio which had an enormous impact on the development of later Italian thinkers, including Telesio’s students Antonio Persio and Sertorio Quattromani, his contemporaries Tommaso Campanella and Giordano Bruno, and later British philosophers such as Francis Bacon, who regarded Telesio as “the first of the moderns.”[23] Telesio’s influence can also be seen in the French philosophers René Descartes and Pierre Gassendi.[23]

The decidedly anti-Aristotelian and anti-clerical music theorist Vincenzo Galilei (c. 1520 – 1591), father of Galileo and the inventor of monody, made use of the experimental method in successfully solving musical problems: first, problems of tuning, such as the relationship of pitch to string tension and mass in stringed instruments, and to the volume of air in wind instruments; and second, problems of composition, through his various suggestions to composers in his Dialogo della musica antica e moderna (Florence, 1581). The Italian word he used for “experiment” was esperienza. He was the essential pedagogical influence upon the young Galileo, his eldest son (cf. Coelho, ed., Music and Science in the Age of Galileo Galilei), arguably one of the most influential empiricists in history. Through his tuning research Vincenzo found the underlying truth at the heart of the misunderstood myth of ‘Pythagoras’ hammers’ (the squares of the numbers concerned yielded those musical intervals, not the actual numbers, as had been believed). Through this and other discoveries that demonstrated the fallibility of traditional authorities, a radically empirical attitude developed, which he passed on to Galileo and which regarded “experience and demonstration” as the sine qua non of valid rational enquiry.

British empiricism

British empiricism, a retrospective characterization, emerged during the 17th century as an approach to early modern philosophy and modern science. Although both were integral to this overarching transition, Francis Bacon, in England, advocated empiricism around 1620, whereas René Descartes, in France, upheld rationalism around 1640, a distinction drawn by Immanuel Kant, in Germany, around 1780. (Bacon’s natural philosophy was influenced by the Italian philosopher Bernardino Telesio and by the Swiss physician Paracelsus.)[23] Contributing later in the 17th century, Thomas Hobbes and Baruch Spinoza are retrospectively identified as an empiricist and a rationalist, respectively. During the 18th-century Enlightenment, George Berkeley, in Ireland, and David Hume, in Scotland, became leading exponents of empiricism, following the lead set in the late 17th century by John Locke, in England, hence the dominance of empiricism in British philosophy.

In response to the early-to-mid-17th century “continental rationalism,” John Locke (1632–1704) proposed in An Essay Concerning Human Understanding (1689) a very influential view wherein the only knowledge humans can have is a posteriori, i.e., based upon experience. Locke is famously attributed with holding the proposition that the human mind is a tabula rasa, a “blank tablet”, in Locke’s words “white paper”, on which the experiences derived from sense impressions over the course of a person’s life are written. There are two sources of our ideas: sensation and reflection. In both cases, a distinction is made between simple and complex ideas. The former are unanalysable, and are broken down into primary and secondary qualities. Primary qualities are essential for the object in question to be what it is; without its specific primary qualities, an object would not be what it is. For example, an apple is an apple because of the arrangement of its atomic structure; if an apple were structured differently, it would cease to be an apple. Secondary qualities are the sensory information we can perceive from its primary qualities; an apple, for example, can be perceived in various colours, sizes, and textures and still be identified as an apple. Its primary qualities thus dictate what the object essentially is, while its secondary qualities define its attributes. Complex ideas combine simple ones, and divide into substances, modes, and relations. According to Locke, our knowledge of things is a perception of ideas that are in accordance or discordance with each other, a view very different from Descartes’s quest for certainty.

A generation later, the Irish Anglican bishop George Berkeley (1685–1753) determined that Locke’s view immediately opened a door that would lead to eventual atheism. In response to Locke, he put forth in his Treatise Concerning the Principles of Human Knowledge (1710) an important challenge to empiricism in which things only exist either as a result of their being perceived, or by virtue of the fact that they are an entity doing the perceiving. (For Berkeley, God fills in for humans by doing the perceiving whenever humans are not around to do it.) In his text Alciphron, Berkeley maintained that any order humans may see in nature is the language or handwriting of God.[24] Berkeley’s approach to empiricism would later come to be called subjective idealism.[25][26]

The Scottish philosopher David Hume (1711–1776) responded to Berkeley’s criticisms of Locke, as well as to other disagreements among early modern philosophers, and moved empiricism to a new level of skepticism. Hume argued, in keeping with the empiricist view, that all knowledge derives from sense experience, but he accepted that this has implications not normally acceptable to philosophers. He wrote, for example, “Locke divides all arguments into demonstrative and probable. On this view, we must say that it is only probable that all men must die or that the sun will rise to-morrow, because neither of these can be demonstrated. But to conform our language more to common use, we ought to divide arguments into demonstrations, proofs, and probabilities—by ‘proofs’ meaning arguments from experience that leave no room for doubt or opposition.”[27] And,[28]

“I believe the most general and most popular explication of this matter, is to say [See Mr. Locke, chapter of power.], that finding from experience, that there are several new productions in matter, such as the motions and variations of body, and concluding that there must somewhere be a power capable of producing them, we arrive at last by this reasoning at the idea of power and efficacy. But to be convinced that this explication is more popular than philosophical, we need but reflect on two very obvious principles. First, That reason alone can never give rise to any original idea, and secondly, that reason, as distinguished from experience, can never make us conclude, that a cause or productive quality is absolutely requisite to every beginning of existence. Both these considerations have been sufficiently explained: and therefore shall not at present be any farther insisted on.”

— Hume, Section XIV, “Of the Idea of Necessary Connexion”, A Treatise of Human Nature

Hume divided all of human knowledge into two categories: relations of ideas and matters of fact (see also Kant’s analytic-synthetic distinction). Mathematical and logical propositions (e.g. “that the square of the hypotenuse is equal to the sum of the squares of the two sides”) are examples of the first, while propositions involving some contingent observation of the world (e.g. “the sun rises in the East”) are examples of the second. All of people’s “ideas”, in turn, are derived from their “impressions”. For Hume, an “impression” corresponds roughly with what we call a sensation. To remember or to imagine such impressions is to have an “idea”. Ideas are therefore the faint copies of sensations.[29]

Hume maintained that no knowledge, even the most basic beliefs about the natural world, can be conclusively established by reason. Rather, he maintained, our beliefs are more a result of accumulated habits, developed in response to accumulated sense experiences. Among his many arguments Hume also added another important slant to the debate about scientific method—that of the problem of induction. Hume argued that it requires inductive reasoning to arrive at the premises for the principle of inductive reasoning, and therefore the justification for inductive reasoning is a circular argument.[29] Among Hume’s conclusions regarding the problem of induction is that there is no certainty that the future will resemble the past. Thus, as a simple instance posed by Hume, we cannot know with certainty by inductive reasoning that the sun will continue to rise in the East, but instead come to expect it to do so because it has repeatedly done so in the past.[29]

Hume concluded that such things as belief in an external world and belief in the existence of the self were not rationally justifiable. According to Hume these beliefs were to be accepted nonetheless because of their profound basis in instinct and custom. Hume’s lasting legacy, however, was the doubt that his skeptical arguments cast on the legitimacy of inductive reasoning, allowing many skeptics who followed to cast similar doubt.
