Jeff Rubard
2010-02-02 01:50:42 UTC
http://www.amazon.com/collected-Gerhard-Gentzen-foundations-mathematics/dp/072042254X/ref=sr_1_2?ie=UTF8&s=books&qid=1265075393&sr=1-2
-----
Here is a new version of the essay, a previously revised version of
which I posted earlier today [Dec 19 2003 - Ed.]. Comments on either
would be welcome, especially any comments to the effect that something
is missing in this "scientific" edit which was welcome in the original.
(Note: the source document for the text of this version is a hand-coded
DocBook XML file -- and perhaps this is notable not for the coding, but
for the hand.)
Jeff Rubard
Typing And Proofing: Political Significance and the Sequent Calculus
Chapter 1. Typing and Proofing: Political Significance and the
Sequent Calculus
Men make their own history, but they do not make it as they
please;
they do not make it under self-selected circumstances, but under
circumstances existing already, given and transmitted from the
past.
The tradition of all dead generations weighs like a nightmare on
the
brains of the living. And just as they seem to be occupied with
revolutionizing themselves and things, creating something that
did
not exist before, precisely in such epochs of revolutionary
crisis
they anxiously conjure up the spirits of the past to their
service,
borrowing from them names, battle slogans, and costumes in order
to
present this new scene in world history in time-honored disguise
and
borrowed language. Thus Luther put on the mask of the Apostle
Paul,
the Revolution of 1789-1814 draped itself alternately in the
guise
of the Roman Republic and the Roman Empire, and the Revolution
of
1848 knew nothing better to do than to parody, now 1789, now the
revolutionary tradition of 1793-1795. In like manner, the
beginner who has learned a new language always translates it back into
his mother tongue, but he assimilates the spirit of the new language
and
expresses himself freely in it only when he moves in it without
recalling the old and when he forgets his native tongue. -- Marx, *The
Eighteenth Brumaire of Louis Bonaparte*
the old major who used to take me to the Capitol when the Senate
and
the House of Representatives were in session had been in the
commissary of the Confederate Army and had very beautiful manners
so
the attendants bowed to the old major except for the pages who
were
little boys not much older than your brother was a page in the
Senate once and occasionally a Representative or a Senator would
look at him with slit eyes may be somebody and bow or shake
hearty
or raise a hand -- John Dos Passos, *USA*
What Is Proof Theory? Proof Theory's Not Semantics
The consideration of formal logic in conjunction with politics -- not
political science, but the bonafide article -- may seem strange,
because
it frankly is. Political life is seen by many as fundamentally
irrational, and various theoretical traditions devoted to classifying
politics as a "matter of custom" grew up around the wreckage of
classical liberalism in the 20th century. Such wreckage as the world
had
never seen -- entire cities laid to waste by powerful new weapons,
and
entire peoples under the sway of finely-grained systems of deception
and
domination. With such events as a backdrop, it may seem foolish to
advance new theories as to the true constitution and interests of a
polity; and such foreboding is well-taken, as theories serve roles
other
than systematizing observations and can get caught in their own
leading-strings, as it were. But perhaps methods exist which will allow
us to proceed carefully and thoroughly, and not only with respect to
the concrete political legacies of the modern era.
It is with this conviction -- less than a belief, more than
hypothesis
-- that I introduce this series of essays, and I will accordingly
begin
this essay by placing a strange term from formal logic down on the
page:
proof theory. Proof theory is a subdiscipline of formal logic
dedicated
to the study of methods for proving theorems: it originated with the
mathematicians of the University of Gottingen in the 1930s, in
particular David Hilbert and Gerhard Gentzen. This might seem like a
less than auspicious start if we are going to think that such
theories
have a bearing on the political life of contemporary liberal
democracies, and perhaps it is so. But perhaps not, and I will begin
to
make that argument by invoking a more familiar use of a logical term:
the complaint that an argument is "just semantics". Whether semantic
systems are themselves important for the political system is an issue
I
will address later: but for now, let me define the term "proof
theory"
recursively -- that is, give a method for generating the object of
study
by repeatedly drawing a distinction.
Proof theory is not semantics; it is not that the two are not closely
linked -- they are nearly inseparable -- but the material of
proof-theoretical research is formed by hiving off from questions of
meaning. In other words, the proof theorist (who is not exactly a
mathematician, at least with this hat on) studies what can be said
about
an utterance when its meaning is unknown or indeterminate. But not
just
singly: whether their meaning be independent or based in larger
concerns, utterances are notoriously gregarious creatures and their
collective behavior can be profitably studied. In saying this, we
have
almost entered the province of sociology -- and we will hear more
about
that later. But let me reaffirm that the logical status of a set of
utterances can be studied without regard to any particular use they
find
in an arena of society. It is not so much that the logician studies
assertions in a formal way, as that he or she studies more or less
formal uses of them -- social occurrences they repeatedly effect.
So I have provisionally established an object of study for the proof
theorist. What would count as an acceptable tool with which to study
proofs as they are -- ranging from rough and unfinished, to
sophisticated and potentially faulty? Well, let's consider the
criteria
for "object-oriented" computer programming languages such as
Smalltalk
or C++. There are four functions such a language must allow:
abstraction, encapsulation, polymorphism and inheritance. Abstraction
allows an object to be "opaque" in its functioning within a program,
encapsulation lets the data stored within the object be "regulated"
by
internal principles, polymorphism allows the aforementioned "hiving
off"
of similar but different objects, and inheritance allows
characteristics
of the "parent" object to be transmitted to the new object. But what
are
these "objects" I speak of? *Not* goals, but rather objects as we
understand them in the physical world and occasionally suppose to
exist
otherwise. To use a little bit of the stipulated precision of German,
"object-oriented" languages are *gegenstaendlich* and not
*zweckrational*.
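A few lines of code can illustrate the four functions at once (a toy
example of my own devising, not drawn from any particular language
specification):

```python
# Abstraction: callers use render() without seeing the internals.
# Encapsulation: the stored text is kept behind that interface.
# Inheritance: Negation receives Formula's machinery from its parent.
# Polymorphism: the same message, render(), behaves differently per object.

class Formula:
    def __init__(self, text):
        self._text = text

    def render(self):
        return self._text

class Negation(Formula):
    def render(self):
        return "not (" + super().render() + ")"

for f in (Formula("A"), Negation("A")):
    print(f.render())
# A
# not (A)
```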
Thusly, if we are attempting to study theoretical entities like
numbers
or concepts in an objective spirit such concerns as motivate
theoretical
computer scientists are very much to the point, and I shall be
looking
for such "objectual" systems to the best of my ability in these
essays.
But before we proceed, what profit might there be in taking a step
back
from the sophisticated computer systems for logically processing
information which are in place today? Perhaps to a certain extent
this
point was put here for us to learn, as programming languages
originally
derive from pure logical concerns. But here is a more constructive
suggestion: contemporaneously with the development of proof theory, a
new systematic approach to semantics ("model theory") came into being
in
various locales. Model theory was motivated by discovery of
limitations
on the structurally desirable properties which could be attributed to
several common logical systems -- and another subdiscipline,
computational complexity theory, has developed to continue such
gainsaying of a logic's suitability for computer programs.
Perhaps we could say that model theory occupies a space between these
two; and if so, questions concerning "the politics of meaning" might
be
provisionally decided by considering all three aspects of systems of
political discourse. So it is demonstrated that occasional acceptance
of the criticism "that's just semantics" does not call a halt to
consideration of such issues; and having made this provisional
determination, we can press forward -- in no particular spirit.
Commitments there always are, however; and these essays are
fundamentally intended for young people of an extremely critical cast
of
mind, such that they avoid certain pitfalls in thought -- that is,
certain "language-games" which never play out to everyone's
advantage.
And although the tools offered here are no substitute for a realm of
freedom beyond that known by all previous history (they are frankly
no
substitute for silence and a knowing look) I am very much of the
conviction that quite a bit of harm avoidance, rather than "risk
reduction", is possible today with concerted group effort -- with the
stress placed on "concerted".
The Form Of The Matter: An Introduction To The Sequent Calculus
Gentzen invented two popular proof systems, the sequent calculus and
natural deduction; and as natural deduction is the more popular of
the
two (for the purposes of instruction in formal logic and analytic
philosophy) it is perhaps desirable that I take a minute and explain
its
limitations. Prior to Gentzen, all logical work was conducted in the
familiar axiom-lemma-theorem format: by contrast, Gentzen's systems
offer ways to understand chains of reasoning backwards as well as
forwards, with a savings in elegance and conceptual economy which
pays
off while considering metatheoretical issues. Both natural deduction
and
the sequent calculus involve two sets of rules for each logical
connective: natural deduction has "introduction rules", for when a
statement introducing a connective can be proven, and "elimination
rules" for proving things from such a statement. Natural deduction is
easy to learn and use, but it forgoes the methodological shortcut
embodied in the cut rule, without which even some common mathematical
proofs become impossibly long (cf. George Boolos, "Don't Eliminate
Cut").
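For instance, natural deduction's two sets of rules for conjunction run
(in the usual notation, not Gentzen's own symbolism):

  ∧-introduction: from A and B, infer A∧B.
  ∧-elimination:  from A∧B, infer A; likewise, from A∧B, infer B.

The introduction rule says when a conjunction may be asserted; the
elimination rules say what may be proven from one.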
The sequent calculus consists of three sets of rules for manipulating
its primitive object, the sequent -- and these provide all the
necessary
mores for any sort of commonly accepted proof. The sequent, which
consists of an "antecedent" and a "succedent", is not quite a formula;
rather, it is a schema which represents a number of formulae having
the
same basic structure -- and this feature of the sequent is not
without
its theoretical consequences, as I will show later. But more
importantly, this economy of symbolism permits a wider variety of
formal
manipulations to be "codified". The three sets are "left" rules, for
introducing a connective in the antecedent; "right" rules, for
introducing one in the succedent; and "structural" rules governing all
chains of reasoning. Of the three, the structural rules unique to the
sequent calculus are the most theoretically important; variation of
them
allows us to formulate many "substructural logics" with interesting
properties. (Several of these logics are explained by Greg Restall in
his recent book.)
But in this essay I will be drawing attention to the bipartite
distinction between "left" and "right" rules in a spirit quite
foreign
to philosophical logic as it has traditionally been practiced, as
well
as provisionally formulating a definition of logical analysis
exceeding
the traditional canons of inference (that discipline's bailiwick).
First
I will give the set of rules for Gentzen's original sequent calculus
LK,
based on the version given by Wikipedia but with a slight
modification.
In place of the traditional "sequent connective", variously rendered
as
a "turnstile" or a double arrow, I am putting the "double turnstile"
as
something of a challenge. The double turnstile, which forms the basis
of
the glyph announcing these essays, ordinarily represents model-
theoretic
consequence (inferences valid on any construal of their
constituents).
Old hands will feel it is out of place here, but the distinction
between
derivability in a particular formal system and the "structural
inference" of the sequent is already made: the latter does not permit
of
"subscription". In other words, I am proposing that model-theoretic
consequence for particular logics devolves from sequent connections:
that proof exceeds truth, not the other way around.
Axiom:

        (I)
    ---------
     A |= A

Left logical rules:                  Right logical rules:

Conjunction:

     Γ, A |= Δ                         Γ |= A, Δ    Σ |= B, Π
    ------------- (∧L1)               ------------------------- (∧R)
     Γ, A∧B |= Δ                        Γ, Σ |= A∧B, Δ, Π

     Γ, B |= Δ
    ------------- (∧L2)
     Γ, A∧B |= Δ

Disjunction:

     Γ, A |= Δ    Σ, B |= Π            Γ |= A, Δ
    ------------------------- (∨L)    ------------- (∨R1)
     Γ, Σ, A∨B |= Δ, Π                 Γ |= A∨B, Δ

                                       Γ |= B, Δ
                                      ------------- (∨R2)
                                       Γ |= A∨B, Δ

Material implication:

     Γ |= A, Δ    Σ, B |= Π            Γ, A |= B, Δ
    ------------------------- (→L)    --------------- (→R)
     Γ, Σ, A→B |= Δ, Π                 Γ |= A→B, Δ

Negation:

     Γ |= A, Δ                         Γ, A |= Δ
    ------------- (¬L)                ------------- (¬R)
     Γ, ¬A |= Δ                        Γ |= ¬A, Δ

Universal quantifier:

     Γ, A[t/x] |= Δ                    Γ |= A[y/x], Δ
    ----------------- (∀L)            ------------------ (∀R)*
     Γ, ∀x A |= Δ                      Γ |= ∀x A, Δ

Existential quantifier:

     Γ, A[y/x] |= Δ                    Γ |= A[t/x], Δ
    ----------------- (∃L)*           ------------------ (∃R)
     Γ, ∃x A |= Δ                      Γ |= ∃x A, Δ

    (* In (∀R) and (∃L) the variable y must not occur free in the
    conclusion.)

Left structural rules:               Right structural rules:

Weakening:

     Γ |= Δ                            Γ |= Δ
    ----------- (WL)                  ----------- (WR)
     Γ, A |= Δ                         Γ |= A, Δ

Contraction:

     Γ, A, A |= Δ                      Γ |= A, A, Δ
    --------------- (CL)              --------------- (CR)
     Γ, A |= Δ                         Γ |= A, Δ

Interchange:

     Γ1, A, B, Γ2 |= Δ                 Γ |= Δ1, A, B, Δ2
    -------------------- (IL)         --------------------- (IR)
     Γ1, B, A, Γ2 |= Δ                 Γ |= Δ1, B, A, Δ2

Cut:                                 Mix (multicut):

     Γ |= A, Δ    Σ, A |= Π            Γ |= A...A, Δ    Σ, A...A |= Π
    ------------------------- (Cut)   ------------------------------ (Mix)
     Γ, Σ |= Δ, Π                      Γ, Σ |= Δ, Π

    (In Mix, A...A stands for one or more occurrences of the
    cutformula A on each side.)
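To fix ideas, the sequent and one right rule can be sketched in a few
lines of code (a toy illustration of my own, not part of Gentzen's
apparatus; all names are invented):

```python
# Toy sketch: sequents as pairs of formula tuples, plus the axiom (I)
# and the right conjunction rule (∧R).
from dataclasses import dataclass

@dataclass(frozen=True)
class Sequent:
    antecedent: tuple  # formulae read conjunctively
    succedent: tuple   # formulae read disjunctively

def axiom(a):
    """(I): the sequent A |= A."""
    return Sequent((a,), (a,))

def and_right(left, right, a, b):
    """(∧R): from Γ |= A, Δ and Σ |= B, Π infer Γ, Σ |= A∧B, Δ, Π."""
    assert a in left.succedent and b in right.succedent
    delta = tuple(x for x in left.succedent if x != a)
    pi = tuple(x for x in right.succedent if x != b)
    return Sequent(left.antecedent + right.antecedent,
                   (("∧", a, b),) + delta + pi)

# Deriving A, B |= A∧B from two axioms:
s = and_right(axiom("A"), axiom("B"), "A", "B")
print(s.antecedent, s.succedent)  # ('A', 'B') (('∧', 'A', 'B'),)
```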
What of the other symbolic "peculiarities" of the sequent calculus?
The
gamma, delta and sigma symbols stand for groupings of formulae
("context"). Additionally, the commas separating components of a
sequent have different meanings depending on whether they appear in
the antecedent or the succedent; in the former case they are
conjunctive, in the latter case disjunctive. However, *pace* Nuel
Belnap's Display
Logic, it seems clear that Gentzen's inclusion of commas in the class
of
"improper symbols" was well-founded, as improper symbols in logic
belong
to a class of symbols someone once called "circumstructure" without
anyone paying too much attention. Gentzen's comma could well be taken
as
paradigmatic of a circumstructure -- a piece of "the furniture of
reasoning" -- rather than an invitation to impose arbitrary
structures
on proofs. To the contrary, the rules of the sequent calculus suffice
for rendering all the well-formed formulae of classical and
intuitionistic logic (about which more later), but more importantly
rendering them in a manner perspicuous for metamathematical proofs.
Proofs formulated in the sequent calculus without using the cut rule
obey the subformula property: only subformulae of the derived formula
appear in the tree representing the steps of the deduction. Gentzen
had
an illuminating thing to say about such proofs: "The final result is,
as
it were, gradually built up from its constituent elements. The proof
represented by the derivation is not roundabout in that it contains
only
concepts which recur in the final result." This is an eminently
desirable property from the point of view of demonstrating
consistency,
but unfortunately it almost never appears in any kind of proof we
would
be tempted to engage in while thinking practically: even in
mathematical logic, "rules of thumb" are an essential part of
economically proving
theorems (that is, within the amount of time allotted to the luckiest
of
us on Earth). The cut (together with "weakening", which could perhaps
more accurately be called dilation) is a catch-all device for
including such "rules of thumb" as valid forms of inference, but it
does not
thereby show that such rules are "conservative extensions" of the
other
rules: for this purpose, a cut-elimination theorem must be proven.
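A small example of the subformula property (my illustration, written
out linearly rather than as a tree): a cut-free derivation of the
sequent A∧B |= B∧A uses only subformulae of its end-sequent.

  1. B |= B              (I)
  2. A∧B |= B            (∧L2) from 1
  3. A |= A              (I)
  4. A∧B |= A            (∧L1) from 3
  5. A∧B, A∧B |= B∧A     (∧R) from 2 and 4
  6. A∧B |= B∧A          (CL) from 5

Every formula occurring in the derivation (A, B, A∧B, B∧A) is a
subformula of the final sequent: the result is, as Gentzen says,
gradually built up from its constituent elements.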
Gentzen's proofs of cut-elimination theorems (*Hauptsaetze*) for
classical and intuitionistic logic employ a variant of the cut rule,
the
mix rule. The purpose of the mix rule is to make prooftrees for a cut
rule tractable, to demonstrate that the cut formula can in theory be
systematically eliminated from any antecedents of a sequent involving
it. Although there are now methods for proving cut-elimination
theorems
which do not use the mix rule, the element differentiating the mix
rule
from the cut rule (the repetition of the cutformula) is worth
considering. What the principle behind the mix rule (in whatever
form)
states is that, although unlike natural deduction the sequent
calculus
does not require that a rule of inference be justified in terms of
immediately obvious applicability of "introduction rules", a valid
cutformula must be genuinely "rulish", in principle capable of
multiple
instantiation; a formula which is not repeatable is merely a glyph
without proper content. The resonances of that formulation are
intentional and deserve further study; we find more than an echo of
Kant's "empirical realism", but quite a bit less than a full account
of
Kantian theoretical philosophy in terms of modern logic.
Another way to explain the difference between cut and mix, quite a
bit
more to the point, is in terms of natural-language pragmatics. Cut is
the formal equivalent of "what is said", the literal meaning conveyed
by
an utterance in one or another way; there are no cut-and-dried rules
for
determining when one is entitled to employ an espied meaning in
reasoning, yet we do it all the time. Mix, on the other hand,
represents
the mechanism by which we can spot implicatures (the metaphorical
equivalence of what is said to some other statement) given context.
If
this analogy seems arbitrary, let me say that in the proof of
cut-elimination for LK the "rank" and "degree" of arbitrary
cutformulae
are used; "rank" is the depth of the derivation necessary to generate
the cutformula, "degree" is the number of logical symbols appearing
in
the formula. But if this seems (perhaps disturbingly) similar to the
concept of "typing" as it is used in mathematical logic and
programming
languages, there is more than a similarity here.
Gentzen's cut-elimination proof makes tacit use of what has come to
be
called the "Curry-Howard isomorphism" or "formulas as types", where a
formula's prooftree is represented by typing of the usual sort (u,v),
where u and v represent domains for a propositional function and the
formula its range. (Such issues are explained in detail in a highly
readable introduction to proof theory, Girard's *Proofs and Types*,
made available by its translators on the Internet, as it has gone out
of print, at nick.dcs.qmul.ac.uk/~pt/stable/Proofs+Types.html.)
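The correspondence can be glimpsed in miniature (my illustration, with
invented names): an implication corresponds to a function type, and a
proof of a formula to a program inhabiting that type.

```python
# "Formulas as types": a proof of A → (B → A) is a program which, given
# evidence for A, returns a function that ignores evidence for B.

def k(a):
    """Inhabits A → (B → A)."""
    return lambda b: a

def modus_ponens(f, a):
    """From proofs of A → B and of A, a proof of B (function application)."""
    return f(a)

evidence = modus_ponens(k("proof of A"), "proof of B")
print(evidence)  # proof of A
```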
In other words abductive inference, reasoning to the best explanation
--
such as is used in "guesstimating" the context-dependent import of
utterances -- has quite a lot to do with these (fallible) rules of
proof: but this is not itself a matter of "best explanation", but
rather
a (partial) homology permitting of (unreliable) exploitation in
considering issues of natural-language pragmatics. That is, provided
an
extreme linguistic atomism is not in play, as cut-elimination rules
out
systems where every element may potentially obey a completely
idiosyncratic logic but not a "propositional" or aspect-dependent
character to utterances: which is today frequently but
problematically
represented as a function of possible worlds to statements by
students
of pragmatics, a topic which will itself be addressed under a
different aspect in the excursus to this essay.
Placing Politics: Telling Your Left From Right In A
Multi-Dimensional World
So let us go on and consider one class of such issues, problems of
political life and its accompanying discourse. To begin: variants of
the
sequent calculus, the "one-sided" Gentzen-Schuette systems, only
permit
statements in the succedent. These are adequate for the purposes of
metamathematically characterizing classical and intuitionistic logic,
but there are a few reasons to think that the "natural" valences of
right and left for inhabitants of the modern era are not irrelevant
here
and deserve a great deal more scrutiny than they usually get (we
shall
have occasion to consider many such "equivalences" in later essays).
To
begin such a consideration: if the terms "right" and "left"
spatialize
the political sphere, they do not do it in any straightforward way.
The
"binary opposition" derives from an early parliamentary seating
arrangement during the French Revolution, where radical members sat
on
the king's left, royalists on his right, and moderates in the middle;
so
the distinction does not serve as an absolute measure of the
political
spectrum (unless there is a tacit populist premise involved).
However,
the distinction traditionally repeats itself within political
parties,
and the question of what is marked out by such a distinction and how
it
affects the party as a whole has been raised time and again -- perhaps
most thought-provokingly by Lenin, who advocated "democratic
centralism" as a course for the Bolshevik party such that they might
avoid both "infantile" left-wing purism and ineffective reformism;
the
"democratic centralist" method was supposed to ensure party unity
while
allowing free and open debate on matters of policy and principle. As
I
mentioned, the one-sided systems are fully adequate for the purpose
of
mathematical proofs; one might even be tempted to understand them as
a
"translation" of traditional proof methods into Gentzen's idiom. But
it
seems to me that what two-sided, multivalent sequent calculi are
uniquely suited to "displaying" is the rhetorical effectivity of
political statements (a question raised by Althusser), what is spoken
of
as "running to the left" or to the right in electoral politics. Here
I
am construing "political" in its original, wide signification of
social
life in general, and raising the question of what active supporters of
a
political party are doing while their candidate is raising hopes.
This hypothesis hinges on a conjecture about the linguistic structure
of
the political sphere: that the interpretations abroad concerning past
historical events are generally favorable to the political left,
whereas
the interpretations of contemporary life "in fashion", those that
have
weight with decision-makers, are generally favorable to the political
right. When this is connected up with the logical truism that
antecedents were made for succedents and vice versa, this suggests
that
there are two well-founded political strategies: leftists should try
to
show that the lessons maximally radical past movements teach us have
as
consequences "minimally conservative" positions (that is, constitute
the lower bound of political positions conservatives must take
seriously) and rightists should try to show that minimally "radical"
(that is, maximally feudalistic) past political movements have as
"trivial" consequences equalitarian shibboleths of the day.
These strategies apply for left-right disputes within a single party
as
well, but whether there is something of the inevitable in the outcome
is
an open question indeed. Gentzen's personal association with the Nazi
party has been the subject of some discussion recently (in Germany,
that
is); and this formulation may suggest that any truth-content in
political rhetoric is being erased by considering such issues using
his
logical tools. However, no other area of discourse is as
well-established in popular sentiment as being a "social technology":
a
means to produce desired effects in a social formation, rather than
existing solely for the purpose of conducting inquiry. And so it
seems
that simply giving the view that the business of politics is
"manufacturing consent" a sharper formulation does no great harm; and
a
third element (as well as the question of the social effectivity of
rhetorical "models", which as I said is one for another day) prevents
such tactics from constituting the whole of discourse about politics.
The famous aphorism of Charles Peguy, "Everything begins as a
mystique
and ends as a politique", suggests a third attitude to political
rhetoric -- one of distaste and resignation: and this attitude is
most
markedly present in "libertarian" political outlooks, not only in
right-wing defenders of free enterprise and property-rights against
"nanny states", but in libertarian leftist movements battling the
control of individual life by profit-motivated organizations.
In America such groups have historically drawn inspiration from
anarcho-syndicalism, but this is something of a rag-bag term,
including
theorists and practical politicians who would not accept such a label
and excluding some who came very close to practicing the ideal of
liberation from the state. What makes abstentionism (pointed
non-participation in elections) a political tactic is its recognition
of
the reality of the political sphere; the Amish are not
abstentionists.
What makes abstentionism (which can be practiced within parliamentary
parties, as the Mississippi Freedom delegation at the 1964 Democratic
convention demonstrated) a "libertarian" tactic is its denial of the
reality of a particular piece of political discourse, assigning to it
the character of "para-politics". The distinction between politics
and
para-politics cuts across the legitimacy/illegitimacy distinction:
the
Continental Congress was illegitimate by the standards of British
rule
but genuinely political, whereas the Nazi party came to power by
"legitimate" means which were then perverted to serve
extra-parliamentary interests. Gentzen's sequent formulation of
intuitionism, LJ, is exactly like LK save for its not permitting more
than one formula in the succedent; and this suggests itself as an
excellent framework for understanding (and practicing) abstentionism.
Intuitionism is the metamathematical doctrine that mathematical truth
cannot exceed means of proof, but it is not as "puritanical" in this
as
is commonly supposed: intuitionistic choice functions permit the use
of
mathematical objects generated by infinite processes of effective
construction, rather than banning the concept of uncountable infinity
from mathematics entirely. And in this intuitionism resembles another
Dutch specialty, council communism.
Council communism gives perhaps the purest formulation of the strict
political demand of communism, that political sovereignty cannot
exceed
the legitimacy granted it by participation and that political bodies
which do not meet this criterion (parliaments) ought to be discarded
in
favor of ones that do (worker's councils). Political intuitionism --
an
abstemious attitude to electoral politics, wherein political activity
is
limited solely to arguing for the legality of particular exercises of
personal freedom, and the illegality of particular exercises of
individual coercion, -- is a means of preserving what is of value in
liberal democracy (uniformity of treatment under the rule of law)
without legitimating what is not (excesses of the bureaucratic
state),
and which does not deny the reality of the polity's claim on us.
"Minarchism" doesn't sound so good if you've ever tried to prove much
without ex falso, the rule that from a contradiction one may infer
anything one likes.
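To make the single-succedent restriction concrete: in LK the law of
excluded middle is derivable precisely because two formulae can stand
in the succedent at once, and LJ blocks exactly that step. (The
derivation below is my illustration, not Gentzen's.)

  1. A |= A              (I)
  2. |= A, ¬A            (¬R) from 1
  3. |= A∨¬A, ¬A         (∨R1) from 2
  4. |= ¬A, A∨¬A         (IR) from 3
  5. |= A∨¬A, A∨¬A       (∨R2) from 4
  6. |= A∨¬A             (CR) from 5

Step 2 already places two formulae in the succedent, which LJ forbids;
hence excluded middle is intuitionistically underivable.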
Excursus On Superstructural Logic
Earlier I mentioned the field of "substructural logics", those with
restricted canons of inference. But here I would like to consider an
opposite tendency in thought; namely, an attempt to extend
formalization
such that social phenomena formerly grouped under the heading
"non-logical conduct" can be rigorously understood in their symbolic
articulation. A
natural fit between the sequent calculus and political rhetoric,
should
such a thing be found to exist, would be worked by politics being one
of
Talcott Parsons' "generalized media of communication" -- a way in
which
action is coordinated without consensus. This may suggest there are
other areas of social interaction which have "templates" provided by
other formal systems. I believe similar links exist between the
Hilbert
systems and scientific truth, natural deduction and the discourse of
love in its various (though not manifold) forms; finally, the
resolution
calculus and money or visuality. I will try to explain such claims in
the essays to come; but that such conjectures are made at all
presupposes a degree of "flex" in the logical bailiwick which is not
always supposed to exist. As I mentioned, many thinkers hold that
politics is not "rational" in the sense of being axiomatic, but rather
is constituted by cultural *Gestalten* which are impossible to "pin
down": one notable case is Michael Oakeshott.
But accepting this overview in the abstract does not bar one from
considering "scientific knowledge" to play a significant role in
politics, including scientific knowledge defined in terms of strict
(modally rigid) laws. It seems that provisions should be made for the
inclusion of such rules in sequent calculi (at least on grounds of
realism), but the presentation above suggests that existing
techniques for including extended rules of inference (such as
modalities) in a sequent calculus are misleading at best, since
"intensional" operators are of a fundamentally different class.
Intensional operators are "non-truth-functional"; the meaning of a
statement involving an intensional operator is not purely dependent
on
the meaning of the other constituents of that statement, "external
factors" must be considered. It may be objected that quantifiers are
non-truth-functional and still included in the sequent calculus
proper;
but as indicated above, the left and right quantifier rules are not
introduction rules simpliciter, as they contain significant
substitutional constraints (whereas both structural and connective
rules
have no restrictions).
So instead, I would propose that intensional operators generally be
understood in terms of the rules Weakening and Cut. Operators for
"normal" modal logics are special cases of Weakening, justified in
terms
of closure under Cut for certain types-as-formulas -- i.e., the
applicability of the Necessitation rule, where "necessarily p" means
that p is a theorem. This does not provide for an "elegant"
representation of rules of inference pertaining to modal logic, but
such
elegance is rather contrary to the nature of modal logic; Hilbert
calculi are ideally suited to weighing the subtle differences between
modal systems, which distinctions form the bulk of the study of modal
logic, and which are not well-represented in natural deduction and
sequent calculi (although they are based in significant model-
theoretic
differences). But it does provide for an understanding of
"intensionality" which naturally connects it to the more general
theoretical notion of superstructure and de-links it from the concept
of
"circumstructure" or normative fact introduced in the third essay in
this series. Objections mounted to intensional logic on Brentanian
grounds may be less strenuous if it is understood that what is being
talked about is not quite the "appearance" of traditional
epistemology,
but patterns in social discourse about such appearances which subsist
independently of individual glosses on experience.
And objections mounted to "essentialist metaphysics" conducted in the
guise of reflections on modality may abate if the multiplicity of
modal systems and operators currently studied is well understood. For
example, one "logical modality", the difference operator, says nothing
about necessity or possibility; rather, it states that some other
possible world in a model satisfies a proposition, and its dual states
that every other possible world in a model satisfies a given
proposition.
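The difference operator's truth conditions can be stated in a few
lines of Python over a finite model (a sketch under my own
assumptions; the toy model and function names are invented for
illustration):

```python
# D phi holds at w iff some world other than w satisfies phi;
# the dual of D holds at w iff every other world satisfies phi.

def diff(worlds, sat, phi, w):
    """Difference operator: phi is true at some world distinct from w."""
    return any(v != w and sat(phi, v) for v in worlds)

def diff_dual(worlds, sat, phi, w):
    """Dual: phi is true at every world distinct from w."""
    return all(sat(phi, v) for v in worlds if v != w)

# a toy model: three worlds and a valuation for one proposition letter
worlds = {1, 2, 3}
val = {("p", 1): True, ("p", 2): False, ("p", 3): True}
sat = lambda phi, w: val[(phi, w)]
```

Note that neither clause mentions an accessibility relation: the
operator quantifies over all worlds other than the point of
evaluation, which is what makes it "logical" rather than
"metaphysical".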
Blackburn, de Rijke and Venema introduce the difference operator with
grins, likening it to a hammer in the bottom of a tool chest only
used
when no other tool will do, but in reality it and its dual are very
powerful tools for understanding a concept never before analyzed in a
logical manner, Carl Schmitt's friend/enemy distinction. The
distinction
between friend and enemy in Schmitt is not one between personal
friend
and personal foe, but the founding division of politics: according to
Schmitt, when considering members of other nations in belligerent
situations, there is some one thing they are all not, namely
possessing
a groundless commitment to our way of life. But the point of "modal
political theory" in this style is not to demonstrate Althusser's
concern with the "authorless theater", global structural constraints
on
social life (which requires quite a bit more thinking).
It is rather intended to show that Schmitt's other obsession, the
*ius
publicum Europaeum* (the opposite of 20th-century partisan or total
warfare) can be understood as positing a (unchecked) homology between
the mores of one country and some similar configuration in another
system, i.e. employing the aforementioned difference operator and not
its dual. (Whether Schmitt's later distinction between *Ordnung*,
"ordering", and *Ortung*, "localization", can be seen as underpinning
this is a matter for further research.) And thusly the bumper-sticker
saw "War does not determine who is right, only who is left" can be
seen
in light of this as really saying something like: the discourse of
unchecked political conflict derives from antecedents of said
conflict,
indicates the depth of support for a position in past solidarities,
rather than creating new ones. Diplomats have traditionally reasoned
in
this way all the time; so if "conservative revolutionaries" are to
judge
"rationalism in politics" impossible, the cogency of employing
scientific discourse in politics is what needs to be knocked down,
not
the straw man of a "scientific mode of dress", on pain of nothing new
under the gun; and a "traditional" suggestion is that this thought
disturbs them not at all.
-----
To add what I feel is a necessary adjunct to this (that is, a comment
which is not an organic part of the essay but modulates certain
elements of it which plausibly cause undesirable distortions in public
discourse concerning these topics), what I am naming as "visuality" is
not the "logic of painting" as envisioned by Clement Greenberg; that
would be something more like Freud's "scopic drive", or Adornian
"unsealing of the concept". If this seems vague and wishy-washy, it
is: but it is absolutely essential (to my mind, the logician's chief
task) to defend the legitimacy of drawing distinctions in thought,
whether they be found scientific or not. Many partisans of the
physical or "putatively physical" sciences would not agree, citing the
ill-defined category "junk science" as a blight upon civilization; and
I am inclined to agree, but also to define the category quite
differently than many (illegal/unethical research).
I feel compelled to preserve this right for others, but to the extent
that my thinking about such matters as these is scientific I also
utilize it. The description of "communicative intentions" as a
function
from possible worlds onto sentences is not an accurate rendering of
the
locus classicus for such thoughts, Stalnaker's partial formalization
of
Gricean pragmatics, but I feel the idea of "possible worlds
semantics"
mapping ill-defined "points" onto clearly-defined "truth-values" to
beg
all kinds of questions, or rather to beg off regarding certain
questions
about "fusion of inferences" as a *positive process* (as it is
described
by Gentzen), which forces the description of forms of discourse into
excessively "rationalistic" and individualistic conformances.
In other words, I'm not talking about you here, whoever you happen to
be; I'm talking about a dead guy people don't know much about. But
if
you feel this essay to be oppressive, you have a right to that.
Jeff Rubard
-----
http://search.barnesandnoble.com/The-Spirit-of-the-Laws/Charles-de-Montesquieu/e/9780521369749/?itm=3
Figure.it.out.
-----
Typing And Proofing: Political Significance and the Sequent Calculus
Chapter 1. Typing and Proofing: Political Significance and the
Sequent Calculus
Men make their own history, but they do not make it as they
please;
they do not make it under self-selected circumstances, but under
circumstances existing already, given and transmitted from the
past.
The tradition of all dead generations weighs like a nightmare on
the
brains of the living. And just as they seem to be occupied with
revolutionizing themselves and things, creating something that
did
not exist before, precisely in such epochs of revolutionary
crisis
they anxiously conjure up the spirits of the past to their
service,
borrowing from them names, battle slogans, and costumes in order
to
present this new scene in world history in time-honored disguise
and
borrowed language. Thus Luther put on the mask of the Apostle
Paul,
the Revolution of 1789-1814 draped itself alternately in the
guise
of the Roman Republic and the Roman Empire, and the Revolution
of
1848 knew nothing better to do than to parody, now 1789, now the
revolutionary tradition of 1793-1795. In like manner, the
beginner
who has learned a new language always translates it back into his
mother tongue, but he assimilates the spirit of the new language
and
expresses himself freely in it only when he moves in it without
recalling the old and when he forgets his native tongue. Marx,
18th
Brumaire
the old major who used to take me to the Capitol when the Senate
and
the House of Representatives were in session had been in the
commissary of the Confederate Army and had very beautiful manners
so
the attendants bowed to the old major except for the pages who
were
little boys not much older than your brother was a page in the
Senate once and occasionally a Representative or a Senator would
look at him with slit eyes may be somebody and bow or shake
hearty
or raise a hand -- John Dos Passos, *USA*
What Is Proof Theory? Proof Theory's Not Semantics
The consideration of formal logic in conjunction with politics -- not
political science, but the bona fide article -- may seem strange,
because it frankly is. Political life is seen by many as fundamentally
irrational, and various theoretical traditions devoted to classifying
politics as a "matter of custom" grew up around the wreckage of
classical liberalism in the 20th century. Such wreckage as the world
had
never seen -- entire cities laid to waste by powerful new weapons,
and
entire peoples under the sway of finely-grained systems of deception
and
domination. With such events as a backdrop, it may seem foolish to
advance new theories as to the true constitution and interests of a
polity; and such foreboding is well-taken, as theories serve roles
other
than systematizing observations and can get caught in their own
leading-strings, as it were. But perhaps methods exist which will
allow
us to proceed carefully and thoroughly, not only with respect to
concrete political legacies of the modern era.
It is with this conviction -- less than a belief, more than
hypothesis
-- that I introduce this series of essays, and I will accordingly
begin
this essay by placing a strange term from formal logic down on the
page:
proof theory. Proof theory is a subdiscipline of formal logic
dedicated
to the study of methods for proving theorems: it originated with the
mathematicians of the University of Goettingen in the 1920s and 1930s,
in particular David Hilbert and Gerhard Gentzen. This might seem like a
less than auspicious start if we are going to think that such
theories
have a bearing on the political life of contemporary liberal
democracies, and perhaps it is so. But perhaps not, and I will begin
to
make that argument by invoking a more-familiar use of a logical term:
the complaint that an argument is "just semantics". Whether semantic
systems are themselves important for the political system is an issue
I
will address later: but for now, let me define the term "proof
theory"
recursively -- that is, give a method for generating the object of
study
by repeatedly drawing a distinction.
Proof theory is not semantics; it is not that the two are not closely
linked -- they are nearly inseparable -- but the material of
proof-theoretical research is formed by hiving off from questions of
meaning. In other words, the proof theorist (who is not exactly a
mathematician, at least with this hat on) studies what can be said
about
an utterance when its meaning is unknown or indeterminate. But not
just
singly: whether their meaning be independent or based in larger
concerns, utterances are notoriously gregarious creatures and their
collective behavior can be profitably studied. In saying this, we
have
almost entered the province of sociology -- and we will hear more
about
that later. But let me reaffirm that the logical status of a set of
utterances can be studied without regard to any particular use they
find
in an arena of society. It is not so much that the logician studies
assertions in a formal way, as that he or she studies more or less
formal uses of them -- social occurrences they repeatedly effect.
So I have provisionally established an object of study for the proof
theorist. What would count as an acceptable tool with which to study
proofs as they are -- ranging from rough and unfinished, to
sophisticated and potentially faulty? Well, let's consider the
criteria
for "object-oriented" computer programming languages such as
Smalltalk
or C++. There are four functions such a language must allow:
abstraction, encapsulation, polymorphism and inheritance. Abstraction
allows an object to be "opaque" in its functioning within a program,
encapsulation lets the data stored within the object be "regulated" by
internal principles, polymorphism allows the aforementioned "hiving
off" of similar but different objects which nonetheless answer to the
same operations, and inheritance allows characteristics of the
"parent" object to be transmitted to the new object. But what are
these "objects" I speak of? *Not* goals, but rather objects as we
understand them in the physical world and occasionally suppose to
exist
otherwise. To use a little bit of the stipulated precision of German,
"object-oriented" languages are *gegenstaendlich* and not
*zweckrational*.
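The four criteria can be illustrated with a short Python sketch (the
class names `Proof` and `SequentProof` are invented for the example,
not drawn from any system discussed here):

```python
# Abstraction: callers see only the interface, not the workings.
class Proof:
    def __init__(self, conclusion):
        self._steps = []              # encapsulation: state kept internal
        self.conclusion = conclusion

    def describe(self):
        return f"proof of {self.conclusion}"

# Inheritance: the child reuses the parent's structure.
class SequentProof(Proof):
    # Polymorphism: the same message elicits different behavior.
    def describe(self):
        return f"sequent-calculus proof of {self.conclusion}"

proofs = [Proof("A -> A"), SequentProof("A -> A")]
descriptions = [p.describe() for p in proofs]
```

The same message, `describe`, is answered differently by each object,
while the list of steps stays hidden behind the interface.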
Thusly, if we are attempting to study theoretical entities like
numbers or concepts in an objective spirit, such concerns as motivate
theoretical computer scientists are very much to the point, and I
shall be looking for such "objectual" systems to the best of my
ability in these essays.
But before we proceed, what profit might there be in taking a step
back
from the sophisticated computer systems for logically processing
information which are in place today? Perhaps to a certain extent
this
point was put here for us to learn, as programming languages
originally
derive from pure logical concerns. But here is a more constructive
suggestion: contemporaneously with the development of proof theory, a
new systematic approach to semantics ("model theory") came into being
in
various locales. Model theory was motivated by discovery of
limitations
on the structurally desirable properties which could be attributed to
several common logical systems -- and another subdiscipline,
computational complexity theory, has developed to continue such
gainsaying of a logic's suitability for computer programs.
Perhaps we could say that model theory occupies a space between these
two; and if so, questions concerning "the politics of meaning" might
be
provisionally decided by considering all three aspects of systems of
political discourse. So it is demonstrated that sometime acceptance
of the criticism "that's just semantics" does not call a halt to
consideration of such issues; and having made this provisional
determination, we can press forward -- in no particular spirit.
Commitments there always are, however; and these essays are
fundamentally intended for young people of an extremely critical cast
of
mind, such that they avoid certain pitfalls in thought -- that is,
certain "language-games" which never play out to everyone's
advantage.
And although the tools offered here are no substitute for a realm of
freedom beyond that known by all previous history (they are frankly
no
substitute for silence and a knowing look) I am very much of the
conviction that quite a bit of harm avoidance, rather than "risk
reduction", is possible today with concerted group effort -- with the
stress placed on "concerted".
The Form Of The Matter: An Introduction To The Sequent Calculus
Gentzen invented two popular proof systems, the sequent calculus and
natural deduction; and as natural deduction is the more popular of
the
two (for the purposes of instruction in formal logic and analytic
philosophy) it is perhaps desirable that I take a minute and explain
its
limitations. Prior to Gentzen, all logical work was conducted in the
familiar axiom-lemma-theorem format: by contrast, Gentzen's systems
offer ways to understand chains of reasoning backwards as well as
forwards, with a savings in elegance and conceptual economy which
pays
off while considering metatheoretical issues. Both natural deduction
and
the sequent calculus involve two sets of rules for each logical
connective: natural deduction has "introduction rules", for when a
statement introducing a connective can be proven, and "elimination
rules" for proving things from such a statement. Natural deduction is
easy to learn and use, but contains methodological shortcuts which
make
even some common mathematical proofs impossibly long (cf. George
Boolos,
"Don't Eliminate Cut").
The sequent calculus consists of three sets of rules for manipulating
its primitive object, the sequent -- and these provide all the
necessary mores for any sort of commonly accepted proof. The sequent,
which consists of an "antecedent" and a "succedent", is not quite a
formula; rather, it is a schema which represents a number of formulae
having the same basic structure -- and this feature of the sequent is
not without its theoretical consequences, as I will show later. But
more importantly, this economy of symbolism permits a wider variety of
formal manipulations to be "codified". The three sets are "left"
rules, for introducing a connective in the antecedent; "right" rules,
for introducing one in the succedent; and "structural" rules governing
all chains of reasoning. Of the three, the structural rules unique to
the sequent calculus are the most theoretically important; variation
of them allows us to formulate many "substructural logics" with
interesting properties. (Several of these logics are explained by Greg
Restall in his recent book, *An Introduction to Substructural
Logics*.)
But in this essay I will be drawing attention to the bipartite
distinction between "left" and "right" rules in a spirit quite
foreign
to philosophical logic as it has traditionally been practiced, as
well
as provisionally formulating a definition of logical analysis
exceeding
the traditional canons of inference (that discipline's bailiwick).
First
I will give the set of rules for Gentzen's original sequent calculus
LK,
based on the version given by Wikipedia but with a slight
modification.
In place of the traditional "sequent connective", variously rendered
as
a "turnstile" or a double arrow, I am putting the "double turnstile"
as
something of a challenge. The double turnstile, which forms the basis
of
the glyph announcing these essays, ordinarily represents model-
theoretic
consequence (inferences valid on any construal of their
constituents).
Old hands will feel it is out of place here, but the distinction
between
derivability in a particular formal system and the "structural
inference" of the sequent is already made: the latter does not permit
of
"subscription". In other words, I am proposing that model-theoretic
consequence for particular logics devolves from sequent connections:
that proof exceeds truth, not the other way around.
Axiom:

    A |= A   (I)

Left logical rules:

  Conjunction:

    Γ, A |= Δ                Γ, B |= Δ
    ----------- (∧L1)        ----------- (∧L2)
    Γ, A∧B |= Δ              Γ, A∧B |= Δ

  Disjunction:

    Γ, A |= Δ    Σ, B |= Π
    ----------------------- (∨L)
    Γ, Σ, A∨B |= Δ, Π

  Material implication:

    Γ |= A, Δ    Σ, B |= Π
    ----------------------- (→L)
    Γ, Σ, A→B |= Δ, Π

  Negation:

    Γ |= A, Δ
    ---------- (¬L)
    Γ, ¬A |= Δ

  Universal quantifier (t any term):

    Γ, A[t/x] |= Δ
    --------------- (∀L)
    Γ, ∀x A |= Δ

  Existential quantifier (y not free in the conclusion):

    Γ, A[y/x] |= Δ
    --------------- (∃L)
    Γ, ∃x A |= Δ

Right logical rules:

  Conjunction:

    Γ |= A, Δ    Σ |= B, Π
    ----------------------- (∧R)
    Γ, Σ |= A∧B, Δ, Π

  Disjunction:

    Γ |= A, Δ                Γ |= B, Δ
    ----------- (∨R1)        ----------- (∨R2)
    Γ |= A∨B, Δ              Γ |= A∨B, Δ

  Material implication:

    Γ, A |= B, Δ
    ------------- (→R)
    Γ |= A→B, Δ

  Negation:

    Γ, A |= Δ
    ---------- (¬R)
    Γ |= ¬A, Δ

  Universal quantifier (y not free in the conclusion):

    Γ |= A[y/x], Δ
    --------------- (∀R)
    Γ |= ∀x A, Δ

  Existential quantifier (t any term):

    Γ |= A[t/x], Δ
    --------------- (∃R)
    Γ |= ∃x A, Δ

Structural rules:

  Weakening:

    Γ |= Δ                Γ |= Δ
    --------- (WL)        --------- (WR)
    Γ, A |= Δ             Γ |= A, Δ

  Contraction:

    Γ, A, A |= Δ          Γ |= A, A, Δ
    ------------- (CL)    ------------- (CR)
    Γ, A |= Δ             Γ |= A, Δ

  Interchange:

    Γ1, A, B, Γ2 |= Δ          Γ |= Δ1, A, B, Δ2
    ------------------ (IL)    ------------------ (IR)
    Γ1, B, A, Γ2 |= Δ          Γ |= Δ1, B, A, Δ2

  Cut, and Mix (multicut), where Aⁿ and Aᵐ stand for n and m ≥ 1
  occurrences of A:

    Γ |= A, Δ    Σ, A |= Π        Γ |= Aⁿ, Δ    Σ, Aᵐ |= Π
    ---------------------- (Cut)  -------------------------- (Mix)
    Γ, Σ |= Δ, Π                  Γ, Σ |= Δ, Π
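As a sanity check on the table above, a sequent can be encoded as a
pair of tuples and a rule as a function from premise-sequents to a
conclusion-sequent. Here is a sketch of (∧R) in Python (the tuple
encoding is my own illustrative choice, not standard notation):

```python
# A sequent Γ |= Δ is encoded as (antecedent, succedent), two tuples
# of formulae; compound formulae are tagged tuples like ("and", A, B).

def conj_right(seq1, seq2):
    """(∧R): from Γ |= A, Δ and Σ |= B, Π infer Γ, Σ |= A∧B, Δ, Π."""
    gamma, (a, *delta) = seq1
    sigma, (b, *pi) = seq2
    return (tuple(gamma) + tuple(sigma),
            (("and", a, b),) + tuple(delta) + tuple(pi))

# two axiom instances, A |= A and B |= B
conclusion = conj_right((("A",), ("A",)), (("B",), ("B",)))
```

Applied to the axioms A |= A and B |= B, the rule yields
A, B |= A∧B, exactly as in a two-line LK derivation.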
What of the other symbolic "peculiarities" of the sequent calculus?
The
gamma, delta and sigma symbols stand for groupings of formulae
("context"). Additionally, the commas separating components of a
sequent
have different meanings depending on whether they appear in the
antecedent or the succedent; in the former case they are disjunctive,
in
the latter case conjunctive. However, *pace* Nuel Belnap's Display
Logic, it seems clear that Gentzen's inclusion of commas in the class
of
"improper symbols" was well-founded, as improper symbols in logic
belong
to a class of symbols someone once called "circumstructure" without
anyone paying too much attention. Gentzen's comma could well be taken
as
paradigmatic of a circumstructure -- a piece of "the furniture of
reasoning" -- rather than an invitation to impose arbitrary
structures
on proofs. To the contrary, the rules of the sequent calculus suffice
for rendering all the well-formed formulae of classical and
intuitionistic logic (about which more later), but more importantly
rendering them in a manner perspicuous for metamathematical proofs.
Proofs formulated in the sequent calculus without using the cut rule
obey the subformula property: only subformulae of the derived formula
appear in the tree representing the steps of the deduction. Gentzen
had
an illuminating thing to say about such proofs: "The final result is,
as
it were, gradually built up from its constituent elements. The proof
represented by the derivation is not roundabout in that it contains
only
concepts which recur in the final result." This is an eminently
desirable property from the point of view of demonstrating
consistency,
but unfortunately it almost never appears in any kind of proof we
would
be tempted to engage in while thinking practically: even in
mathematical logic, "rules of thumb" are an essential part of
economically proving theorems (that is, within the amount of time
allotted to the luckiest of us on Earth). The cut (together with
"weakening", which could perhaps more accurately be called dilation)
is a catch-all device for including such "rules of thumb" as valid
forms of inference, but it does not thereby show that such rules are
"conservative extensions" of the other rules: for this purpose, a
cut-elimination theorem must be proven.
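The subformula property turns on nothing more than the recursive
notion of a subformula, which is easy to exhibit (a sketch on an
invented tuple encoding of formulae):

```python
# Enumerate the subformulae of a formula; a cut-free LK proof of a
# sequent contains only formulae drawn from such sets.

def subformulae(phi):
    """Return the set of subformulae of phi, phi itself included."""
    if isinstance(phi, str):           # atomic formula
        return {phi}
    subs = {phi}
    for part in phi[1:]:               # skip the connective tag
        subs |= subformulae(part)
    return subs

phi = ("imp", ("and", "A", "B"), "A")
parts = subformulae(phi)
```

Gentzen's remark that the final result is "gradually built up from
its constituent elements" is precisely the claim that a cut-free
derivation never leaves this set.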
Gentzen's proofs of cut-elimination theorems (*Hauptsaetze*) for
classical and intuitionistic logic employ a variant of the cut rule,
the
mix rule. The purpose of the mix rule is to make prooftrees for a cut
rule tractable, to demonstrate that the cut formula can in theory be
systematically eliminated from any antecedents of a sequent involving
it. Although there are now methods for proving cut-elimination
theorems
which do not use the mix rule, the element differentiating the mix
rule
from the cut rule (the repetition of the cutformula) is worth
considering. What the principle behind the mix rule (in whatever
form)
states is that, although unlike natural deduction the sequent
calculus
does not require that a rule of inference be justified in terms of
immediately obvious applicability of "introduction rules", a valid
cutformula must be genuinely "rulish", in principle capable of
multiple
instantiation; a formula which is not repeatable is merely a glyph
without proper content. The resonances of that formulation are
intentional and deserve further study; we find more than an echo of
Kant's "empirical realism", but quite a bit less than a full account
of
Kantian theoretical philosophy in terms of modern logic.
Another way to explain the difference between cut and mix, quite a
bit
more to the point, is in terms of natural-language pragmatics. Cut is
the formal equivalent of "what is said", the literal meaning conveyed
by
an utterance in one or another way; there are no cut-and-dried rules
for
determining when one is entitled to employ an espied meaning in
reasoning, yet we do it all the time. Mix, on the other hand,
represents
the mechanism by which we can spot implicatures (the metaphorical
equivalence of what is said to some other statement) given context.
If
this analogy seems arbitrary, let me say that in the proof of
cut-elimination for LK the "rank" and "degree" of arbitrary
cutformulae
are used; "rank" is the depth of the derivation necessary to generate
the cutformula, "degree" is the number of logical symbols appearing
in
the formula. But if this seems (perhaps disturbingly) similar to the
concept of "typing" as it is used in mathematical logic and
programming
languages, there is more than a similarity here.
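"Degree" in the sense just described is a simple recursive count,
sketched here on the same sort of tuple encoding (my illustration,
not Gentzen's notation):

```python
# Degree of a formula: the number of logical symbols occurring in it.
# Gentzen's cut-elimination argument inducts on degree and rank.

def degree(phi):
    """Count the connective occurrences in phi; atoms count zero."""
    if isinstance(phi, str):
        return 0
    return 1 + sum(degree(part) for part in phi[1:])

cutformula = ("imp", ("and", "A", "B"), ("not", "A"))
d = degree(cutformula)
```

The induction works because every reduction step either lowers the
degree of the cutformula or, at equal degree, lowers its rank.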
Gentzen's cut-elimination proof makes tacit use of what has come to
be
called the "Curry-Howard isomorphism" or "formulas as types", where a
formula's prooftree is represented by typing of the usual sort (u,v),
where u and v represent domains for a propositional function and the
formula its range. (Such issues are explained in detail in a recent,
highly readable introduction to proof theory, Girard's *Proofs and
Types*; recently made available by its translators on the Internet,
as
it has gone out of print, at
nick.dcs.qmul.ac.uk/~pt/stable/Proofs+Types.html).
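The formulas-as-types idea can be made tangible in a few lines: a
proof of the formula A → (B → A) corresponds to a program of the
matching function type. A Python sketch, with type hints standing in
for the formula (the name `k_combinator` is my label for the standard
proof-term, not anything from Girard's text):

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")

def k_combinator(a: A) -> Callable[[B], A]:
    """A proof-term for A -> (B -> A): given a proof of A,
    return a function that ignores a proof of B and yields A."""
    return lambda b: a

# applying the proof-term: modus ponens twice
result = k_combinator("proof of A")("proof of B")
```

Under the isomorphism, (→R) corresponds to lambda-abstraction and Cut
to function application, which is one way of putting why
cut-elimination doubles as program execution.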
In other words abductive inference, reasoning to the best explanation
--
such as is used in "guesstimating" the context-dependent import of
utterances -- has quite a lot to do with these (fallible) rules of
proof: but this is not itself a matter of "best explanation", but
rather
a (partial) homology permitting of (unreliable) exploitation in
considering issues of natural-language pragmatics. That is, provided
an extreme linguistic atomism is not in play: cut-elimination rules
out systems where every element may potentially obey a completely
idiosyncratic logic, but not a "propositional" or aspect-dependent
character to utterances -- which is today frequently but
problematically represented as a function from possible worlds to
statements by students of pragmatics, a topic which will itself be
addressed under a different aspect in the excursus to this essay.
Placing Politics: Telling Your Left From Right In A
Multi-Dimensional World
So let us go on and consider one class of such issues, problems of
political life and its accompanying discourse. To begin: variants of
the
sequent calculus, the "one-sided" Gentzen-Schuette systems, only
permit
statements in the succedent. These are adequate for the purposes of
metamathematically characterizing classical and intuitionistic logic,
but there are a few reasons to think that the "natural" valences of
right and left for inhabitants of the modern era are not irrelevant
here
and deserve a great deal more scrutiny than they usually get (we
shall
have occasion to consider many such "equivalences" in later essays).
To
begin such a consideration: if the terms "right" and "left"
spatialize
the political sphere, they do not do it in any straightforward way.
The
"binary opposition" derives from an early parliamentary seating
arrangement during the French Revolution, where radical members sat
on
the king's left, royalists on his right, and moderates in the middle;
so
the distinction does not serve as an absolute measure of the
political
spectrum (unless there is a tacit populist premise involved).
However,
the distinction traditionally repeats itself within political
parties,
and the question of what is marked out by such a distinction and how
it affects the party as a whole has been raised time and again --
perhaps most thought-provokingly by Lenin, who advocated "democratic
centralism" as a course for the Bolshevik party such that they might
avoid both "infantile" left-wing purism and ineffective reformism;
the
"democratic centralist" method was supposed to ensure party unity
while
allowing free and open debate on matters of policy and principle. As
I
mentioned, the one-sided systems are fully adequate for the purpose
of
mathematical proofs; one might even be tempted to understand them as
a
"translation" of traditional proof methods into Gentzen's idiom. But
it
seems to me that what two-sided, multivalent sequent calculi are
uniquely suited to "displaying" is the rhetorical effectivity of
political statements (a question raised by Althusser), what is spoken
of
as "running to the left" or to the right in electoral politics. Here
I
am construing "political" in its original, wide signification of
social
life in general, and raising the question of what active supporters of
a
political party are doing while their candidate is raising hopes.
This hypothesis hinges on a conjecture about the linguistic structure
of
the political sphere: that the interpretations abroad concerning past
historical events are generally favorable to the political left,
whereas
the interpretations of contemporary life "in fashion", those that
have
weight with decision-makers, are generally favorable to the political
right. When this is connected up with the logical truism that
antecedents were made for succedents and vice versa, this suggests
that
there are two well-founded political strategies: leftists should try
to
show that the lessons maximally radical past movements teach us have
as consequences "minimally conservative" positions (that is, constitute
the lower bound of political positions conservatives must take
seriously) and rightists should try to show that minimally "radical"
(that is, maximally feudalistic) past political movements have as
"trivial" consequences equalitarian shibboleths of the day.
These strategies apply for left-right disputes within a single party
as
well, but whether there is something of the inevitable in the outcome
is
an open question indeed. Gentzen's personal association with the Nazi
party has been the subject of some discussion recently (in Germany,
that
is); and this formulation may suggest that any truth-content in
political rhetoric is being erased by considering such issues using
his
logical tools. However, no other area of discourse is as
well-established in popular sentiment as being a "social technology":
a
means to produce desired effects in a social formation, rather than
existing solely for the purpose of conducting inquiry. And so it
seems
that simply giving the view that the business of politics is
"manufacturing consent" a sharper formulation does no great harm; and
a
third element (as well as the question of the social effectivity of
rhetorical "models", which as I said is one for another day) prevents
such tactics from constituting the whole of discourse about politics.
The famous aphorism of Charles Peguy, "Everything begins as a
mystique
and ends as a politique", suggests a third attitude to political
rhetoric -- one of distaste and resignation: and this attitude is
most
markedly present in "libertarian" political outlooks, not only in
right-wing defenders of free enterprise and property-rights against
"nanny states", but in libertarian leftist movements battling the
control of individual life by profit-motivated organizations.
In America such groups have historically drawn inspiration from
anarcho-syndicalism, but this is something of a rag-bag term,
including
theorists and practical politicians who would not accept such a label
and excluding some who came very close to practicing the ideal of
liberation from the state. What makes abstentionism (pointed
non-participation in elections) a political tactic is its recognition
of
the reality of the political sphere; the Amish are not
abstentionists.
What makes abstentionism (which can be practiced within parliamentary
parties, as the Mississippi Freedom delegation at the 1964 Democratic
convention demonstrated) a "libertarian" tactic is its denial of the
reality of a particular piece of political discourse, assigning to it
the character of "para-politics". The distinction between politics
and
para-politics cuts across the legitimacy/illegitimacy distinction:
the
Continental Congress was illegitimate by the standards of British
rule
but genuinely political, whereas the Nazi party came to power by
"legitimate" means which were then perverted to serve
extra-parliamentary interests. Gentzen's sequent formulation of
intuitionism, LJ, is exactly like LK save for its not permitting more
than one formula in the succedent; and this suggests itself as an
excellent framework for understanding (and practicing) abstentionism.
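The LJ restriction is simple enough to state as a predicate on
encoded sequents (a sketch; the pair-of-tuples encoding is my own
illustrative convention):

```python
# Gentzen's LJ: sequents whose succedent contains at most one formula.

def admissible_in_lj(sequent):
    """Any antecedent is allowed, but at most one succedent formula."""
    antecedent, succedent = sequent
    return len(succedent) <= 1

classical_only = (("A",), ("A", "B"))   # fine in LK, barred in LJ
shared = (("A",), ("A",))               # fine in both systems
```

The multiple-succedent sequents barred here are exactly those whose
classical reading "A or B follows" outruns what has been
constructively established for either disjunct.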
Intuitionism is the metamathematical doctrine that mathematical truth
cannot exceed means of proof, but it is not as "puritanical" in this
as
is commonly supposed: intuitionistic choice sequences permit the use
of mathematical objects generated by infinite processes of effective
construction, rather than banning the concept of uncountable infinity
from mathematics entirely. And in this intuitionism resembles another
Dutch specialty, council communism.
Council communism gives perhaps the purest formulation of the strict
political demand of communism, that political sovereignty cannot
exceed
the legitimacy granted it by participation and that political bodies
which do not meet this criterion (parliaments) ought to be discarded
in favor of ones that do (workers' councils). Political intuitionism
-- an abstemious attitude to electoral politics, wherein political
activity is limited solely to arguing for the legality of particular
exercises of personal freedom and the illegality of particular
exercises of individual coercion -- is a means of preserving what is
of value in liberal democracy (uniformity of treatment under the rule
of law) without legitimating what is not (excesses of the bureaucratic
state), while not denying the reality of the polity's claim on us.
"Minarchism" doesn't sound so good if you've ever tried to prove much
without ex falso, the rule that from a contradiction one may infer
anything one likes.
Excursus On Superstructural Logic
Earlier I mentioned the field of "substructural logics", those with
restricted canons of inference. But here I would like to consider an
opposite tendency in thought; namely, an attempt to extend
formalization such that social phenomena formerly grouped under the
heading "non-logical conduct" can be rigorously understood in their
symbolic articulation. A natural fit between the sequent calculus and
political rhetoric, should such a thing be found to exist, would be
worked by politics being one of Talcott Parsons' "generalized media of
communication" -- a way in which action is coordinated without
consensus. This may suggest there are other areas of social
interaction which have "templates" provided by other formal systems. I
believe similar links exist between the Hilbert systems and scientific
truth, natural deduction and the discourse of love in its various
(though not manifold) forms, and finally the resolution calculus and
money or visuality. I will try to explain such claims in the essays to
come; but that such conjectures are made at all presupposes a degree
of "flex" in the logical bailiwick which is not always supposed to
exist. As I mentioned, many thinkers hold that politics is not
"rational" in the sense of being axiomatic, but rather is constituted
by cultural *Gestalten* which are impossible to "pin down": one
notable case is Michael Oakeshott.
But accepting this overview in the abstract does not bar one from
considering "scientific knowledge" to play a significant role in
politics, including scientific knowledge defined in terms of strict
(modally rigid) laws. It seems that provisions should be made for the
inclusion of such rules in sequent calculi (at least on grounds of
realism); but the presentation above suggests that existing
techniques for including extended rules of inference (such as
modalities) in a sequent calculus are misleading at best, since
"intensional" operators are of a fundamentally different class.
Intensional operators are "non-truth-functional": the meaning of a
statement involving an intensional operator is not purely dependent
on the meaning of the other constituents of that statement;
"external factors" must be considered. It may be objected that
quantifiers are non-truth-functional and still included in the
sequent calculus proper; but as indicated above, the left and right
quantifier rules are not introduction rules simpliciter, as they
contain significant substitutional constraints (whereas the
structural and connective rules carry no such restrictions).
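The "substitutional constraints" in question are the eigenvariable
conditions; in standard presentations (the schemas below are textbook
material, not the essay's own construction) the relevant quantifier
rules read:

```latex
% Right rule for the universal quantifier and left rule for the
% existential, with their eigenvariable side condition: the variable
% y must not occur free in Gamma, Delta, or the quantified formula.
% No structural or connective rule carries such a condition.
\[
\frac{\Gamma \vdash \Delta,\ A[y/x]}
     {\Gamma \vdash \Delta,\ \forall x\, A}\ (\forall R)
\qquad
\frac{A[y/x],\ \Gamma \vdash \Delta}
     {\exists x\, A,\ \Gamma \vdash \Delta}\ (\exists L)
\]
```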
So instead, I would propose that intensional operators generally be
understood in terms of the rules Weakening and Cut. Operators for
"normal" modal logics are special cases of Weakening, justified in
terms of closure under Cut for certain types-as-formulas -- i.e., the
applicability of the Necessitation rule, where "necessarily p" means
that p is a theorem. This does not provide for an "elegant"
representation of rules of inference pertaining to modal logic, but
such elegance is rather contrary to the nature of modal logic;
Hilbert calculi are ideally suited to weighing the subtle differences
between modal systems, which distinctions form the bulk of the study
of modal logic and are not well-represented in natural deduction and
sequent calculi (although they are based in significant
model-theoretic differences). But it does provide for an
understanding of "intensionality" which naturally connects it to the
more general theoretical notion of superstructure and de-links it
from the concept of "circumstructure" or normative fact introduced in
the third essay in this series. Objections mounted to intensional
logic on Brentanian grounds may be less strenuous if it is understood
that what is being talked about is not quite the "appearance" of
traditional epistemology, but patterns in social discourse about such
appearances which subsist independently of individual glosses on
experience.
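To make the Weakening-and-Cut reading concrete: the Necessitation
rule of the normal modal logics applies only to theorems, i.e. to
sequents with an empty antecedent, which is why it resists
presentation as an ordinary left or right rule. The schema below,
together with the K axiom, is the standard characterization of
"normality"; it is supplied for reference, not as the essay's own
apparatus.

```latex
% Necessitation: only theorems may be prefixed with the box.
% With the K axiom, this characterizes the "normal" modal logics.
\[
\frac{\vdash A}{\vdash \Box A}\ (\text{Nec})
\qquad\qquad
\vdash \Box(A \to B) \to (\Box A \to \Box B)\ (\text{K})
\]
```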
And objections mounted to "essentialist metaphysics" conducted in the
guise of reflections on modality may abate if the multiplicity of
modal systems and operators currently studied is well understood. For
example, one "logical modality", the difference operator, does not
say anything about necessity or possibility; it states that some
other possible world in a model satisfies a proposition, and its dual
states that every other possible world in a model satisfies a given
proposition. Blackburn, de Rijke and Venema introduce the difference
operator with grins, likening it to a hammer in the bottom of a tool
chest only used when no other tool will do; but in reality it and its
dual are very powerful tools for understanding a concept never before
analyzed in a logical manner, Carl Schmitt's friend/enemy
distinction. The distinction between friend and enemy in Schmitt is
not one between personal friend and personal foe, but the founding
division of politics: according to Schmitt, when considering members
of other nations in belligerent situations, there is some one thing
they all lack, namely a groundless commitment to our way of life. But
the point of "modal political theory" in this style is not to
demonstrate Althusser's concern with the "authorless theater", global
structural constraints on social life (which requires quite a bit
more thinking).
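The semantic clauses for the difference operator and its dual are
simple enough to sketch executably. The toy model below (the world
names and the valuation are my own illustration, not drawn from
Blackburn, de Rijke and Venema) checks "D phi" -- phi holds at some
world other than the current one -- and its dual, phi holding at
every other world.

```python
# A minimal sketch of the "difference operator" D and its dual:
# D(phi) holds at world w iff phi holds at SOME world distinct from w;
# the dual holds at w iff phi holds at EVERY world distinct from w.

def sat_D(worlds, valuation, phi, w):
    """D phi at w: some world v != w satisfies phi."""
    return any(v != w and valuation(phi, v) for v in worlds)

def sat_D_dual(worlds, valuation, phi, w):
    """[D] phi at w: every world v != w satisfies phi."""
    return all(valuation(phi, v) for v in worlds if v != w)

# Toy model: three worlds; 'p' holds at w1 and w2 only.
worlds = {"w0", "w1", "w2"}
facts = {("p", "w1"), ("p", "w2")}
val = lambda phi, v: (phi, v) in facts

print(sat_D(worlds, val, "p", "w0"))       # True: p holds elsewhere
print(sat_D_dual(worlds, val, "p", "w0"))  # True: p holds at every other world
print(sat_D_dual(worlds, val, "p", "w1"))  # False: p fails at w0
```

Note that neither clause mentions an accessibility relation: the
difference operator quantifies over all worlds of the model except
the current one, which is what distinguishes it from the box and
diamond.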
It is rather intended to show that Schmitt's other obsession, the
*ius publicum Europaeum* (the opposite of 20th-century partisan or
total warfare), can be understood as positing an (unchecked) homology
between the mores of one country and some similar configuration in
another system, i.e. employing the aforementioned difference operator
and not its dual. (Whether Schmitt's later distinction between
*Ordnung*, "ordering", and *Ortung*, "localization", can be seen as
underpinning this is a matter for further research.) And thus the
bumper-sticker saw "War does not determine who is right, only who is
left" can be seen as really saying something like: the discourse of
unchecked political conflict derives from antecedents of said
conflict, and indicates the depth of support for a position in past
solidarities rather than creating new ones. Diplomats have
traditionally reasoned in this way; so if "conservative
revolutionaries" are to judge "rationalism in politics" impossible,
the cogency of employing scientific discourse in politics is what
needs to be knocked down, not the straw man of a "scientific mode of
dress", on pain of nothing new under the gun; and a "traditional"
suggestion is that this thought disturbs them not at all.
-----
To add what I feel is a necessary adjunct to this (that is, a comment
which is not an organic part of the essay but modulates certain
elements of it which plausibly cause undesirable distortions in
public discourse concerning these topics): what I am naming as
"visuality" is not the "logic of painting" as envisioned by Clement
Greenberg; that would be something more like Freud's "scopic drive",
or the Adornian "unsealing of the concept". If this seems vague and
wishy-washy, it is: but it is absolutely essential (to my mind, the
logician's chief task) to defend the legitimacy of drawing
distinctions in thought, whether they be found scientific or not.
Many partisans of the physical or "putatively physical" sciences
would not agree, citing the ill-defined category "junk science" as a
blight upon civilization; and I am inclined to agree, but also to
define the category quite differently than many (illegal/unethical
research).
I feel compelled to preserve this right for others, but to the extent
that my thinking about such matters as these is scientific I also
utilize it. The description of "communicative intentions" as a
function from possible worlds onto sentences is not an accurate
rendering of the locus classicus for such thoughts, Stalnaker's
partial formalization of Gricean pragmatics; but I feel the idea of
"possible worlds semantics" mapping ill-defined "points" onto
clearly-defined "truth-values" begs all kinds of questions -- or
rather begs off regarding certain questions about the "fusion of
inferences" as a *positive process* (as it is described by Gentzen)
-- and so forces the description of forms of discourse into
excessively "rationalistic" and individualistic conformances.
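For readers without the locus classicus to hand: Stalnaker models
assertion as the intersection of the "context set" (the worlds
compatible with what conversational participants take for granted)
with the proposition asserted. A minimal sketch of that picture --
the one the paragraph above takes issue with -- using hypothetical
world names of my own:

```python
# Stalnaker-style assertion update: a proposition is a set of worlds,
# and a successful assertion shrinks the context set by intersection.
# World names ("w0" etc.) are illustrative, not from any source.

def assert_update(context_set, proposition):
    """Worlds surviving the assertion: those in both sets."""
    return context_set & proposition

context = {"w0", "w1", "w2", "w3"}
its_raining = {"w0", "w1"}  # worlds where the asserted sentence is true

print(sorted(assert_update(context, its_raining)))  # ['w0', 'w1']
```

The elegance of the picture is exactly what is at stake: everything
here is eliminative (worlds are discarded), whereas a "fusion of
inferences" in Gentzen's sense would be a positive, constructive
process.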
In other words, I'm not talking about you here, whoever you happen to
be; I'm talking about a dead guy people don't know much about. But
if
you feel this essay to be oppressive, you have a right to that.
Jeff Rubard
-----
http://search.barnesandnoble.com/The-Spirit-of-the-Laws/Charles-de-Montesquieu/e/9780521369749/?itm=3
Figure.it.out.