Argumentation theory, or argumentation, is the interdisciplinary study of how conclusions can be reached through logical reasoning; that is, claims based, soundly or not, on premises. It includes the arts and sciences of civil debate, dialogue, conversation, and persuasion. It studies rules of inference, logic, and procedural rules in both artificial and real world settings.
Argumentation includes deliberation and negotiation, which are concerned with collaborative decision-making procedures. It also encompasses eristic dialogue, the branch of social debate in which victory over an opponent is the primary goal. This art and science is often the means by which people protect their beliefs or self-interests—or choose to change them—in rational dialogue, in common parlance, and during the process of arguing.
Argumentation is used in law, for example in trials, in preparing an argument to be presented to a court, and in testing the validity of certain kinds of evidence. Also, argumentation scholars study the post hoc rationalizations by which organizational actors try to justify decisions they have made irrationally.
Typically an argument has an internal structure, comprising the following:
An argument has one or more premises and one conclusion.
Often classical logic is used as the method of reasoning so that the conclusion follows logically from the assumptions or support. One challenge is that if the set of assumptions is inconsistent then anything can follow logically from inconsistency. Therefore, it is common to insist that the set of assumptions be consistent. It is also good practice to require the set of assumptions to be the minimal set, with respect to set inclusion, necessary to infer the consequent. Such arguments are called MINCON arguments, short for minimal consistent. Such argumentation has been applied to the fields of law and medicine. A second school of argumentation investigates abstract arguments, where 'argument' is considered a primitive term, so no internal structure of arguments is taken into account.
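The MINCON idea can be made concrete for propositional logic with a brute-force search over truth assignments. The sketch below is illustrative only, not any particular published system; all function names are our own. A subset of premises counts as a MINCON argument for a goal if it is consistent, entails the goal, and contains no smaller subset that already does so:

```python
from itertools import combinations, product

# Formulas are represented as truth-functions over variable assignments.
VARS = ["p", "q"]

def assignments():
    for values in product([True, False], repeat=len(VARS)):
        yield dict(zip(VARS, values))

def consistent(formulas):
    # Some assignment makes every formula true.
    return any(all(f(a) for f in formulas) for a in assignments())

def entails(formulas, goal):
    # No assignment makes every premise true and the goal false.
    return all(goal(a) for a in assignments() if all(f(a) for f in formulas))

def mincon_arguments(premises, goal):
    """Subsets of `premises` that are consistent, entail `goal`,
    and are minimal with respect to set inclusion."""
    found = []
    for r in range(1, len(premises) + 1):  # smaller subsets first
        for subset in combinations(premises, r):
            if consistent(subset) and entails(subset, goal):
                if not any(set(s) <= set(subset) for s in found):
                    found.append(subset)
    return found

# Premises: p, p -> q, and an irrelevant q -> p; goal: q.
p       = lambda a: a["p"]
p_imp_q = lambda a: (not a["p"]) or a["q"]
q_imp_p = lambda a: (not a["q"]) or a["p"]
goal_q  = lambda a: a["q"]

args = mincon_arguments([p, p_imp_q, q_imp_p], goal_q)
print(len(args))  # 1 -- the single minimal argument {p, p -> q}
```

Note how the irrelevant premise q -> p is excluded: the minimality requirement strips away everything not needed to infer the consequent.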
In its most common form, argumentation involves an individual and an interlocutor or opponent engaged in dialogue, each contending differing positions and trying to persuade each other. Other types of dialogue in addition to persuasion are eristic, information seeking, inquiry, negotiation, deliberation, and the dialectical method (Douglas Walton). The dialectical method was made famous by Plato and his use of Socrates critically questioning various characters and historical figures.
Argumentation theory had its origins in foundationalism, a theory of knowledge (epistemology) in the field of philosophy. It sought to find the grounds for claims in the forms (logic) and materials (factual laws) of a universal system of knowledge. But argument scholars gradually rejected Aristotle's systematic philosophy and the idealism in Plato and Kant. They questioned and ultimately discarded the idea that argument premises take their soundness from formal philosophical systems. The field thus broadened.
Karl R. Wallace's seminal essay, "The Substance of Rhetoric: Good Reasons", Quarterly Journal of Speech 44 (1963), led many scholars to study "marketplace argumentation" – the ordinary arguments of ordinary people. The seminal essay on marketplace argumentation is Ray Lynn Anderson and C. David Mortensen's "Logic and Marketplace Argumentation", Quarterly Journal of Speech 53 (1967): 143–150. This line of thinking led to a natural alliance with late developments in the sociology of knowledge. Some scholars drew connections with recent developments in philosophy, namely the pragmatism of John Dewey and Richard Rorty. Rorty has called this shift in emphasis "the linguistic turn".
In this new hybrid approach argumentation is used with or without empirical evidence to establish convincing conclusions about issues which are moral, scientific, epistemic, or of a kind that science alone cannot answer. Out of pragmatism and many intellectual developments in the humanities and social sciences, "non-philosophical" argumentation theories grew which located the formal and material grounds of arguments in particular intellectual fields. These theories include informal logic, social epistemology, ethnomethodology, speech acts, the sociology of knowledge, the sociology of science, and social psychology. These new theories are not non-logical or anti-logical. They find logical coherence in most communities of discourse. These theories are thus often labeled "sociological" in that they focus on the social grounds of knowledge.
In general, the label "argumentation" is used by communication scholars such as (to name only a few) Wayne E. Brockriede, Douglas Ehninger, Joseph W. Wenzel, Richard Rieke, Gordon Mitchell, Carol Winkler, Eric Gander, Dennis S. Gouran, Daniel J. O'Keefe, Mark Aakhus, Bruce Gronbeck, James Klumpp, G. Thomas Goodnight, Robin Rowland, Dale Hample, C. Scott Jacobs, Sally Jackson, David Zarefsky, and Charles Arthur Willard, while the term "informal logic" is preferred by philosophers, stemming from University of Windsor philosophers Ralph H. Johnson and J. Anthony Blair. Harald Wohlrapp developed a criterion for validity (Geltung, Gültigkeit) as freedom from objections.
Trudy Govier, Douglas Walton, Michael Gilbert, Harvey Siegel, Michael Scriven, and John Woods (to name only a few) are other prominent authors in this tradition. Over the past thirty years, however, scholars from several disciplines have commingled at international conferences such as those hosted by the University of Amsterdam (the Netherlands) and the International Society for the Study of Argumentation (ISSA). Other international conferences include the biennial conference held at Alta, Utah, sponsored by the (US) National Communication Association and the American Forensics Association, and conferences sponsored by the Ontario Society for the Study of Argumentation (OSSA).
Some scholars (such as Ralph H. Johnson) construe the term "argument" narrowly, as exclusively written discourse or even discourse in which all premises are explicit. Others (such as Michael Gilbert) construe the term "argument" broadly, to include spoken and even nonverbal discourse, for instance the degree to which a war memorial or propaganda poster can be said to argue or "make arguments". The philosopher Stephen Toulmin has said that an argument is a claim on our attention and belief, a view that would seem to authorize treating, say, propaganda posters as arguments. The dispute between broad and narrow theorists is of long standing and is unlikely to be settled. The views of the majority of argumentation theorists and analysts fall somewhere between these two extremes.
The study of naturally occurring conversation arose from the field of sociolinguistics. It is usually called conversation analysis. Inspired by ethnomethodology, it was developed in the late 1960s and early 1970s principally by the sociologist Harvey Sacks and, among others, his close associates Emanuel Schegloff and Gail Jefferson. Sacks died early in his career, but his work was championed by others in his field, and CA has now become an established force in sociology, anthropology, linguistics, speech-communication and psychology. It is particularly influential in interactional sociolinguistics, discourse analysis and discursive psychology, as well as being a coherent discipline in its own right. Recently CA techniques of sequential analysis have been employed by phoneticians to explore the fine phonetic details of speech.
Empirical studies and theoretical formulations by Sally Jackson and Scott Jacobs, and several generations of their students, have described argumentation as a form of managing conversational disagreement within communication contexts and systems that naturally prefer agreement.
The basis of mathematical truth has been the subject of long debate. Frege in particular sought to demonstrate (see Gottlob Frege, The Foundations of Arithmetic, 1884, and Begriffsschrift, 1879) that arithmetical truths can be derived from purely logical axioms and therefore are, in the end, logical truths. The project was developed by Russell and Whitehead in their Principia Mathematica. If an argument can be cast in the form of sentences in symbolic logic, then it can be tested by the application of accepted proof procedures. This has been carried out for arithmetic using the Peano axioms. Be that as it may, an argument in mathematics, as in any other discipline, can be considered valid only if it can be shown that it cannot have true premises and a false conclusion.
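The validity criterion just stated can be tested mechanically for propositional arguments: an argument form is valid if and only if no truth assignment makes every premise true and the conclusion false. The function below is an illustrative sketch (the names are our own), contrasting a valid form, modus ponens, with the fallacy of affirming the consequent:

```python
from itertools import product

def valid(premises, conclusion, variables):
    """True iff no assignment makes all premises true and the conclusion false."""
    for values in product([True, False], repeat=len(variables)):
        a = dict(zip(variables, values))
        if all(p(a) for p in premises) and not conclusion(a):
            return False  # counterexample found: true premises, false conclusion
    return True

p_imp_q = lambda a: (not a["p"]) or a["q"]  # p -> q

# Modus ponens (p -> q, p, therefore q) is valid...
print(valid([p_imp_q, lambda a: a["p"]], lambda a: a["q"], ["p", "q"]))   # True
# ...affirming the consequent (p -> q, q, therefore p) is not.
print(valid([p_imp_q, lambda a: a["q"]], lambda a: a["p"], ["p", "q"]))  # False
```

The counterexample to the second form is the assignment p = false, q = true, under which both premises hold but the conclusion fails.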
Perhaps the most radical statement of the social grounds of scientific knowledge appears in Alan G. Gross's The Rhetoric of Science (Cambridge: Harvard University Press, 1990). Gross holds that science is rhetorical "without remainder", meaning that scientific knowledge itself cannot be seen as an idealized ground of knowledge. Scientific knowledge is produced rhetorically, meaning that it has special epistemic authority only insofar as its communal methods of verification are trustworthy. This thinking represents an almost complete rejection of the foundationalism on which argumentation was first based.
Interpretive argumentation is pertinent to the humanities, hermeneutics, literary theory, linguistics, semantics, pragmatics, semiotics, analytic philosophy and aesthetics. Topics in conceptual interpretation include aesthetic, judicial, logical and religious interpretation. Topics in scientific interpretation include scientific modeling.
Legal arguments are spoken presentations to a judge or appellate court by a lawyer, or by parties when representing themselves, of the legal reasons why they should prevail. Oral argument at the appellate level accompanies written briefs, which also advance the argument of each party in the legal dispute. A closing argument, or summation, is the concluding statement of each party's counsel reiterating the important arguments for the trier of fact, often the jury, in a court case. A closing argument occurs after the presentation of evidence.
Political arguments are used by academics, media pundits, candidates for political office and government officials. Political arguments are also used by citizens in ordinary interactions to comment about and understand political events. The rationality of the public is a major question in this line of research. Political scientist Samuel L. Popkin coined the expression "low information voters" to describe most voters who know very little about politics or the world in general.
In practice, a "low information voter" may not be aware of legislation that their representative has sponsored in Congress. A low-information voter may base their ballot box decision on a media sound-bite, or a flier received in the mail. It is possible for a media sound-bite or campaign flier to present a political position for the incumbent candidate that completely contradicts the legislative action taken in the Capitol on behalf of the constituents. It may only take a small percentage of the overall voting group who base their decision on the inaccurate information, a voting bloc of 10 to 12%, to swing an overall election result. When this happens, the constituency at large may have been duped or fooled. Nevertheless, the election result is legal and confirmed. Savvy political consultants will take advantage of low-information voters and sway their votes with disinformation because it can be easier and sufficiently effective. Fact checkers have come about in recent years to help counter the effects of such campaign tactics.
Psychology has long studied the non-logical aspects of argumentation. For example, studies have shown that simple repetition of an idea is often a more effective method of argumentation than appeals to reason. Propaganda often utilizes repetition. Nazi rhetoric has been studied extensively as, inter alia, a repetition campaign.
Empirical studies of communicator credibility and attractiveness, sometimes labeled charisma, have also been tied closely to empirically occurring arguments. Such studies bring argumentation within the ambit of persuasion theory and practice.
Some psychologists such as William J. McGuire believe that the syllogism is the basic unit of human reasoning. They have produced a large body of empirical work around McGuire's famous title "A Syllogistic Analysis of Cognitive Relationships". A central line of this way of thinking is that logic is contaminated by psychological variables such as "wishful thinking", in which subjects confound the likelihood of predictions with the desirability of the predictions. People hear what they want to hear and see what they expect to see. If planners want something to happen they see it as likely to happen. If they hope something will not happen, they see it as unlikely to happen. Thus smokers think that they personally will avoid cancer, promiscuous people practice unsafe sex, and teenagers drive recklessly.
Stephen Toulmin and Charles Arthur Willard have championed the idea of argument fields, the former drawing upon Ludwig Wittgenstein's notion of language games (Sprachspiele), the latter drawing from communication and argumentation theory, sociology, political science, and social epistemology. For Toulmin, the term "field" designates discourses within which arguments and factual claims are grounded. For Willard, the term "field" is interchangeable with "community", "audience", or "readership". Along similar lines, G. Thomas Goodnight has studied "spheres" of argument and sparked a large literature created by younger scholars responding to or using his ideas. The general tenor of these field theories is that the premises of arguments take their meaning from social communities.
Field studies might focus on social movements, issue-centered publics (for instance, pro-life versus pro-choice in the abortion dispute), small activist groups, corporate public relations campaigns and issue management, scientific communities and disputes, political campaigns, and intellectual traditions. In the manner of a sociologist, ethnographer, anthropologist, participant-observer, and journalist, the field theorist gathers and reports on real-world human discourses, gathering case studies that might eventually be combined to produce high-order explanations of argumentation processes. This is not a quest for some master language or master theory covering all specifics of human activity. Field theorists are agnostic about the possibility of a single grand theory and skeptical about the usefulness of such a theory. Theirs is a more modest quest for "mid-range" theories that might permit generalizations about families of discourses.
By far the most influential theorist has been Stephen Toulmin, the Cambridge-educated philosopher and educator, best known for his Toulmin model of argument. What follows below is a sketch of his ideas.
Toulmin has argued that absolutism (represented by theoretical or analytic arguments) has limited practical value. Absolutism is derived from Plato's idealized formal logic, which advocates universal truth; thus absolutists believe that moral issues can be resolved by adhering to a standard set of moral principles, regardless of context. By contrast, Toulmin asserts that many of these so-called standard principles are irrelevant to real situations encountered by human beings in daily life.
To describe his vision of daily life, Toulmin introduced the concept of argument fields; in The Uses of Argument (1958), Toulmin states that some aspects of arguments vary from field to field, and are hence called "field-dependent", while other aspects of argument are the same throughout all fields, and are hence called "field-invariant". The flaw of absolutism, Toulmin believes, lies in its unawareness of the field-dependent aspect of argument; absolutism assumes that all aspects of argument are field invariant.
Toulmin's theories seek to avoid the defects of absolutism without resorting to relativism: relativism, Toulmin asserted, provides no basis for distinguishing between a moral and an immoral argument. In Human Understanding (1972), Toulmin suggests that anthropologists have been tempted to side with relativists because they have noticed the influence of cultural variations on rational arguments; in other words, the anthropologist or relativist overemphasizes the importance of the "field-dependent" aspect of arguments, and becomes unaware of the "field-invariant" elements. In an attempt to provide solutions to the problems of absolutism and relativism, Toulmin attempts throughout his work to develop standards that are neither absolutist nor relativist for assessing the worth of ideas.
Toulmin believes that a good argument can succeed in providing good justification to a claim, which will stand up to criticism and earn a favourable verdict.
In The Uses of Argument (1958), Toulmin introduced what became known as the Toulmin Model of Argument, which broke argument into six interrelated components: claim, data, warrant, qualifier, backing, and rebuttal.
The first three elements, "claim", "data", and "warrant", are considered the essential components of practical arguments, while the second triad, "qualifier", "backing", and "rebuttal", may not be needed in some arguments.
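One way to make the six components concrete is as a simple data structure. The class and field layout below are our own illustrative sketch; the example argument, however, is Toulmin's own well-known Harry/Bermuda illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ToulminArgument:
    claim: str                       # the conclusion being argued for
    data: str                        # the facts offered in support (grounds)
    warrant: str                     # the link licensing the step from data to claim
    qualifier: Optional[str] = None  # strength of the claim ("presumably", ...)
    backing: Optional[str] = None    # support for the warrant itself
    rebuttal: Optional[str] = None   # conditions under which the claim fails

arg = ToulminArgument(
    claim="Harry is a British subject.",
    data="Harry was born in Bermuda.",
    warrant="A man born in Bermuda will generally be a British subject.",
    qualifier="presumably",
    backing="The statutes governing British nationality.",
    rebuttal="Unless both his parents were aliens, or he has become naturalized.",
)
print(arg.qualifier)  # presumably
```

The optional fields mirror the point above: a practical argument is complete with claim, data, and warrant alone, while qualifier, backing, and rebuttal are supplied only when needed.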
When first proposed, this layout of argumentation was based on legal arguments and was intended to be used to analyze the rationality of arguments typically found in the courtroom; in fact, Toulmin did not realize that this layout would be applicable to the fields of rhetoric and communication until his works were introduced to rhetoricians by Wayne Brockriede and Douglas Ehninger. Their "Decision by Debate" (1963) streamlined Toulmin's terminology and broadly introduced his model to the field of debate. Only after he published Introduction to Reasoning (1979) were the rhetorical applications of this layout mentioned in his works.
Toulmin's Human Understanding (1972) asserts that conceptual change is evolutionary. This book attacks Thomas Kuhn's explanation of conceptual change in The Structure of Scientific Revolutions. Kuhn held that conceptual change is a revolutionary (as opposed to an evolutionary) process in which mutually exclusive paradigms compete to replace one another. Toulmin criticizes the relativist elements in Kuhn's thesis, as he points out that the mutually exclusive paradigms provide no ground for comparison; in other words, Kuhn's thesis has made the relativists' error of overemphasizing the "field variant" while ignoring the "field invariant", or commonality shared by all argumentation or scientific paradigms.
Toulmin proposes an evolutionary model of conceptual change comparable to Darwin's model of biological evolution. On this reasoning, conceptual change involves innovation and selection. Innovation accounts for the appearance of conceptual variations, while selection accounts for the survival and perpetuation of the soundest conceptions. Innovation occurs when the professionals of a particular discipline come to view things differently from their predecessors; selection subjects the innovative concepts to a process of debate and inquiry in what Toulmin considers a "forum of competition". The soundest concepts will survive the forum of competition as replacements or revisions of the traditional conceptions.
From the absolutists' point of view, concepts are either valid or invalid regardless of context; from a relativist's perspective, one concept is neither better nor worse than a rival concept from a different cultural context. From Toulmin's perspective, the evaluation depends on a process of comparison, which determines whether one concept improves explanatory power more than its rival concepts.
In Cosmopolis (1990), Toulmin traces the quest for certainty back to Descartes and Hobbes, and lauds Dewey, Wittgenstein, Heidegger and Rorty for abandoning that tradition.
Scholars at the University of Amsterdam in the Netherlands have pioneered a rigorous modern version of dialectic under the name pragma-dialectics. The intuitive idea is to formulate clear-cut rules that, if followed, will yield rational discussion and sound conclusions. Frans H. van Eemeren, the late Rob Grootendorst, and many of their students have produced a large body of work expounding this idea.
The dialectical conception of reasonableness is given by ten rules for critical discussion, all instrumental to achieving a resolution of the difference of opinion (from van Eemeren, Grootendorst, & Snoeck Henkemans, 2002, pp. 182–183). The theory postulates this as an ideal model, not something one expects to find as an empirical fact. The model can, however, serve as an important heuristic and critical tool for testing how closely reality approximates this ideal and for pointing to where discourse goes wrong, that is, where the rules are violated. Any such violation constitutes a fallacy. Although not primarily focused on fallacies, pragma-dialectics provides a systematic approach to dealing with them in a coherent way.
Doug Walton developed a distinctive philosophical theory of logical argumentation built around a set of practical methods to help a user identify, analyze and evaluate arguments in everyday conversational discourse and in more structured areas such as debate, law and scientific fields. There are four main components: argumentation schemes, dialogue structures, argument mapping tools, and formal argumentation systems. The method uses the notion of commitment in dialogue as the fundamental tool for the analysis and evaluation of argumentation rather than the notion of belief. Commitments are statements that the agent has expressed or formulated, and has pledged to carry out, or has publicly asserted. According to the commitment model, agents interact with each other in a dialogue in which each takes its turn to contribute speech acts. The dialogue framework uses critical questioning as a way of testing plausible explanations and finding weak points in an argument that raise doubt concerning the acceptability of the argument.
Walton's logical argumentation model takes a different view of proof and justification from that taken in the dominant epistemology in analytical philosophy, which is based on a justified-true-belief framework. On the logical argumentation approach, knowledge is seen as a form of belief commitment firmly fixed by an argumentation procedure that tests the evidence on both sides and uses standards of proof to determine whether a proposition qualifies as knowledge. On this evidence-based approach, scientific knowledge must be seen as defeasible.
Efforts have been made within the field of artificial intelligence to perform and analyze the act of argumentation with computers. Argumentation has been used to provide a proof-theoretic semantics for non-monotonic logic, starting with the influential work of Dung (1995). Computational argumentation systems have found particular application in domains where formal logic and classical decision theory are unable to capture the richness of reasoning, domains such as law and medicine. In Elements of Argumentation, Philippe Besnard and Anthony Hunter show how classical logic-based techniques can be used to capture key elements of practical argumentation.
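In Dung's abstract setting, arguments are primitive atoms and "attacks" is simply a binary relation between them; different semantics then select defensible sets of arguments. The sketch below (function names are our own) computes the grounded extension, the least fixed point of the characteristic function, by iterating from the empty set:

```python
def acceptable(arg, defenders, attacks):
    """arg is acceptable w.r.t. `defenders` if every attacker of arg
    is itself attacked by some member of `defenders`."""
    attackers = {a for (a, b) in attacks if b == arg}
    return all(any((d, att) in attacks for d in defenders) for att in attackers)

def grounded_extension(arguments, attacks):
    # Iterate the characteristic function F(S) = {a : a acceptable w.r.t. S}
    # starting from the empty set until a fixed point is reached.
    current = set()
    while True:
        nxt = {a for a in arguments if acceptable(a, current, attacks)}
        if nxt == current:
            return current
        current = nxt

# a attacks b, and b attacks c: a is unattacked, and a defends c against b.
args = {"a", "b", "c"}
atts = {("a", "b"), ("b", "c")}
print(sorted(grounded_extension(args, atts)))  # ['a', 'c']
```

This captures the non-monotonic flavor mentioned above: adding a new argument that attacks "a" could overturn the previously accepted conclusions, so acceptance is defeasible rather than cumulative.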
Within computer science, the ArgMAS workshop series (Argumentation in Multi-Agent Systems), the CMNA workshop series, and now the COMMA Conference, are regular annual events attracting participants from every continent. The journal Argument & Computation is dedicated to exploring the intersection between argumentation and computer science. ArgMining is a workshop series dedicated specifically to the related argument mining task.
At the start of Topics VIII.5, Aristotle distinguishes three types of dialogue by their different goals: (1) the truly dialectical debate, which is concerned with training (gumnasia), with critical examination (peira), or with inquiry (skepsis); (2) the didactic discussion, concerned with teaching; and (3) the competitive (eristic, contentious) type of debate in which winning is the only concern.
Toulmin's 1958 work, The Uses of Argument, remains essential reading in the field of argumentation.
In place of the traditional epistemological view of knowledge as justified true belief, Walton's approach argues that artificial intelligence and law need an evidence-based epistemology.