*(This post has been modified to correct some errors and add some clarifications in response to comments from alert readers. Thanks for the corrections!)*

Today, we're going to take a brief diversion from category theory to play with some logic. There are some really neat connections between variant logics and category theory. I'm planning on showing a bit about one of those connections, involving a system called *linear logic*. But the easiest way to present things like linear logic is using a mechanism based on **sequent calculus**.

Sequent calculus is a deduction system for performing reasoning in first-order predicate logic. But its notation and general principles are useful for all sorts of reasoning systems, including many different logics, all sorts of type theories, and so on. The specific sequent calculus that I'm going to talk about is sometimes called system LK; the general category of systems that use this basic kind of rule is called Gentzen systems.

The sequent calculus consists of a set of inference rules, each of which is normally written like a fraction: the top of the fraction is what you know before applying the rule; the bottom is what you can conclude. The statements that the rules manipulate are called *sequents*, and they're always of the form:

CONTEXTS, Predicates :- CONTEXTS, Predicates

The "CONTEXTS" are sets of predicates that you already know are true. The ":-" is read "entails"; it means that the *conjuction* of the statements and contexts to the left of it can prove the *disjunction* of the statements to the right of it. In predicate logic, the conjuction is logical and, and disjunction is logical or, so you can read the statements as if "," is "∧" on the left of the ":-", and "∨" on the right. *(Note: this paragraph was modified to correct a dumb error that I made that was pointed out by commenter Canuckistani.)*

Contexts are generally written using capital Greek letters; predicates are generally written using uppercase English letters. We often put the name of an inference rule to the right of the separator line for the rule.

For example, look at the following sequent:

Γ :- Δ

————— Weakening-Left

Γ,A :- Δ

This rule is named Weakening-Left; the top says "given Γ, everything in Δ can be proved"; and the bottom says "using Γ plus the fact that A is true, everything in Δ can be proved". The full rule basically says: if Δ is provable given Γ, then it will still be provable when A is added to Γ; in other words, adding a true fact won't invalidate any proofs that were valid before the addition of A. *(Note: this paragraph was modified to correct an error pointed out by a commenter.)*
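Weakening is simple enough to express directly as an operation on sequents. Here's a sketch in Python, with a sequent encoded as a pair of tuples of formula strings; the encoding is my own, purely for illustration:

```python
# A sequent Γ :- Δ as a (left, right) pair of tuples of formula strings.
def weaken_left(sequent, a):
    """Weakening-Left: from Γ :- Δ, conclude Γ, A :- Δ."""
    left, right = sequent
    return (left + (a,), right)

s = (("P", "Q"), ("R",))    # P, Q :- R
print(weaken_left(s, "A"))  # (('P', 'Q', 'A'), ('R',))
```

Note that the operation only ever *adds* to the left side; nothing that was provable before can be lost, which is the point of the rule.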

The sequent calculus is nothing but a complete set of rules that you can use to perform any inference in predicate calculus. A few quick syntactic notes, and I'll show you the full set of rules.

1. Uppercase greek letters are contexts.

2. Uppercase english letters are *statements*.

3. Lowercase English letters are *terms*; that is, the objects that predicates can reason about, or variables representing objects.

4. A[b] is a statement A that contains the term b in some way.

5. A[b/c] means A with the term "b" replaced by the term "c".
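The substitution notation in (5) can be made concrete with a little recursive function. Here formulas are nested tuples; again, the encoding is a throwaway of my own, just to pin down what "b replaced by c" means:

```python
def subst(formula, b, c):
    """A[b/c]: replace every occurrence of the term b in A with c."""
    if formula == b:
        return c
    if isinstance(formula, tuple):
        return tuple(subst(part, b, c) for part in formula)
    return formula

A = ("P", "y", ("f", "y"))  # A[y], a statement containing the term y
print(subst(A, "y", "x"))   # ('P', 'x', ('f', 'x'))
```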

-------

First, two very basic rules:

1.

———— (Identity)

A :- A

2. Γ :- A, Δ     Σ, A :- Π

—————————————— (Cut)

Γ,Σ :- Δ, Π
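Cut is the interesting one: it's the rule that lets you use A as a lemma, proving it in one branch and consuming it in another. Here's a sketch of it as an operation on sequents, encoded as pairs of tuples of formula strings (the encoding is mine, not part of LK):

```python
def cut(s1, s2, a):
    """Cut: from Γ :- A, Δ and Σ, A :- Π, conclude Γ, Σ :- Δ, Π.
    The cut formula a is discharged from both premises."""
    (gamma, delta), (sigma, pi) = s1, s2
    assert a in delta and a in sigma, "cut formula must appear in both premises"
    delta1 = tuple(x for x in delta if x != a)
    sigma1 = tuple(x for x in sigma if x != a)
    return (gamma + sigma1, delta1 + pi)

s1 = (("G",), ("A", "D"))  # Γ :- A, Δ
s2 = (("S", "A"), ("P",))  # Σ, A :- Π
print(cut(s1, s2, "A"))    # (('G', 'S'), ('D', 'P'))
```

The famous cut-elimination theorem says that any proof using this rule can be rewritten into one that doesn't; but proofs without cut can be enormously larger.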

Now, there are a bunch of rules that have right and left counterparts. They're duals of each other: move formulas across the ":-" and switch from ∧ to ∨ or vice versa.

3. Γ, A :- Δ

————————— (Left And 1)

Γ, A ∧ B :- Δ

4. Γ :- A, Δ

————————— (Right Or 1)

Γ :- A ∨ B, Δ

5. Γ, B :- Δ

————————— (Left And 2)

Γ,A ∧ B :- Δ

6. Γ :- B, Δ

————————— (Right Or 2)

Γ :- A ∨ B, Δ

7. Γ, A :- Δ     Σ, B :- Π

————————————— (Left Or)

Γ,Σ, A ∨ B :- Δ,Π

8. Γ :- A, Δ     Σ :- B, Π

—————————————— (Right And)

Γ,Σ :- A ∧ B, Δ,Π

9. Γ :- A,Δ

————————— (Left Not)

Γ, ¬A :- Δ

10. Γ,A :- Δ

——————— (Right Not)

Γ :- ¬A, Δ

11. Γ :- A, Δ     Σ, B :- Π

————————————— (Left Implies)

Γ, Σ, A → B :- Δ,Π

11a. Γ, A :- B, Δ

————————————— (Right Implies)

Γ :- A → B, Δ

12. Γ, A[y] :- Δ *(y any term)*

————————————— (Left Forall)

Γ, ∀x A[y/x] :- Δ

13. Γ :- A[y], Δ *(y not free in the bottom sequent)*

————————————— (Right Forall)

Γ :- ∀x A[y/x], Δ

14. Γ, A[y] :- Δ *(y not free in the bottom sequent)*

————————————— (Left Exists)

Γ, ∃x A[y/x] :- Δ

15. Γ :- A[y], Δ *(y any term)*

————————————— (Right Exists)

Γ :- ∃x A[y/x], Δ

16. Γ :- Δ

—————— (Left Weakening)

Γ, A :- Δ

17. Γ :- Δ

—————— (Right Weakening)

Γ :- A, Δ

18. Γ, A, A :- Δ

——————— (Left Contraction)

Γ,A :- Δ

19. Γ :- A, A, Δ

——————— (Right Contraction)

Γ :- A, Δ

20. Γ, A, B, Δ :- Σ

————————— (Left Permutation)

Γ,B, A, Δ :- Σ

21. Γ :- Δ, A, B, Σ

————————— (Right Permutation)

Γ :- Δ, B, A, Σ

Here's an example of how we can use sequents to derive A ∨ ¬ A:

1. Context empty. Apply Identity.

2. A :- A. Apply Right Not.

3. empty :- ¬A, A. Apply Right Or 2.

4. empty :- A ∨ ¬A, A. Apply Right Permutation.

5. empty :- A, A ∨ ¬A. Apply Right Or 1.

6. empty :- A ∨ ¬A, A ∨ ¬A. Apply Right Contraction.

7. empty :- A ∨ ¬ A
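To convince yourself that this really is purely mechanical, here's the same derivation replayed in Python. Each function implements one rule as a syntactic operation on a sequent encoded as a (left, right) pair of tuples of formula strings; all the encoding choices are mine, for illustration only.

```python
Not = lambda a: "¬" + a
Or = lambda a, b: "(" + a + " ∨ " + b + ")"

def right_not(seq, a):       # Γ, A :- Δ   ⇒   Γ :- ¬A, Δ
    left, right = seq
    return (tuple(x for x in left if x != a), (Not(a),) + right)

def right_or(seq, a, b):     # rewrite the first right formula into A ∨ B
    left, right = seq
    assert right[0] in (a, b)
    return (left, (Or(a, b),) + right[1:])

def right_permute(seq):      # swap the first two right-hand formulas
    left, right = seq
    return (left, (right[1], right[0]) + right[2:])

def right_contract(seq):     # collapse a duplicated right-hand formula
    left, right = seq
    assert right[0] == right[1]
    return (left, right[1:])

s = (("A",), ("A",))         # A :- A                 (Identity)
s = right_not(s, "A")        # :- ¬A, A               (Right Not)
s = right_or(s, "A", "¬A")   # :- (A ∨ ¬A), A         (Right Or 2)
s = right_permute(s)         # :- A, (A ∨ ¬A)         (Right Permutation)
s = right_or(s, "A", "¬A")   # :- (A ∨ ¬A), (A ∨ ¬A)  (Right Or 1)
s = right_contract(s)        # :- (A ∨ ¬A)            (Right Contraction)
print(s)                     # ((), ('(A ∨ ¬A)',))
```

Every step is just string shuffling; at no point does anything need to know what ¬ or ∨ *mean*.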

If you look *carefully* at the rules, they actually make a lot of sense. The only ones that look a bit strange are the quantifier rules; for those, you need to pay careful attention to the side conditions: in Right Forall and Left Exists, the variable y must not occur free anywhere else in the bottom sequent.

A lot of logics can be described using Gentzen systems: type theories, temporal logics, and all manner of other systems. They're a very powerful tool for describing inference systems.


Does Rule #4 (Right Or 2) contain an "AND" where it should have an "OR"?

Important Missing Information!!!1!!one!!

The commas are to be treated as ANDs on the left hand side of the ENTAILS and are to be treated as ORs on the right hand side.

You could say I know a lot about logic. However, I'm not used to sequent calculus.

You summarize your first sequent as "Using Γ plus the fact that A is true, everything in Δ can be proved." I strongly doubt that you really want to say "the fact that A is true" in this context. The sequent should still apply if A is false... except I'm not sure what it means for A to be true or false here. But the sequent should still make sense even if A is a contradiction.

You then summarize the sequent again as "adding a true fact won't create an inconsistency." But that doesn't seem like it's in the spirit of the sequent at all. The sequent says, "after adding a new assumption, you can still derive what you could derive before." But it leaves open the possibility that you may be able to derive new things... possibly an inconsistency. I agree that if your new assumption is "true", then that shouldn't create an inconsistency, but that doesn't seem to be what the sequent says.

Basically, the sequent seems to make a claim in the direction of how *much* is entailed by Gamma,A. Whereas a claim of the form "does not create inconsistency" would be a claim in the direction of how *little* is entailed by Gamma,A. Big difference.

I suspect we can't write "adding a true fact won't create an inconsistency" in sequent form. Well, that's partly because I don't know how you say "true". It might be easier to write "adding a tautology won't create an inconsistency", but I need a new symbol: ":/-". Sorry, that looks terrible, but it's supposed to mean "does not entail". Then I can write something like this:

empty :- B

------------------------------

Γ, B :/- ¬A, A

("If B is a tautology, then adding B to any context does not entail a contradiction.") But I don't think that counts as a sequent. Oops---I just realized that even if it is a sequent, it isn't "valid" as written (consider the case where Gamma is already inconsistent).

EJ:

We're running up against one of my frequent problems; it's difficult to try to write about formal math using informal terms. I tend to err on the side of intuition: I try to write in a way that puts giving people an intuitive understanding of what things mean ahead of being as precise as possible. Usually that works out well; sometimes, as in this case, it gets me into trouble :-)

What I meant by the sequent preventing inconsistency was "If I can prove Δ before adding A, then adding A won't change things in a way that means that something that was true before won't be true now - so everything in Δ will still be provable after the addition of A". I was thinking of that kind of inconsistency: "X is provable from Γ before A, but with A added, X is no longer provable".

Similarly with the prose about "the fact that A is true"; for an informal presentation, talking about "A" as "true" provides an intuitive grasp on what's going on, even if it isn't the ideal formal presentation.

Canuckistani:

I don't think it's really correct to say that the "," to the left of the entails means "OR". (In fact, it's a little iffy to say what anything *means* in sequent calculus, since SC is purely syntactic; the semantics comes from predicate logic.) I may be mis-recalling, but I think that the right-hand side of an entailment is a list of distinct statements, each of which is provable given the left hand side. I've updated the article to try to clarify that.

Canuckistani:

I just went back to a logic text to double-check. You're right, and I'm wrong: the LHS of an entailment is a conjunction (AND in FOPL), and the RHS of an entailment *is* a disjunction (OR in FOPL).

I'll go fix the article now. Thanks for the correction, and sorry for getting it wrong even after you pointed it out!

Oh, I don't know jack about the sequent calculus. You should credit whoever wrote the article on the cut-elimination theorem in Wikipedia.

However, spotting the error in the Right Or 2 rule was all me.

What's the point of dealing specifically with sequences, since the permutation and contraction rules make them functionally identical to sets?

Canuckistani:

The point of dealing with sequences and permutation/contraction rules is because the idea of a calculus like SC is to provide a purely syntactic mechanism for manipulating symbolic expressions. Using sets requires you to understand the ideas of expansion, contraction, and permutation; using syntactic rules that perform the permutations and contractions that are needed allows you to avoid needing even the semantics of simple sets: it's a purely syntactic system that operates on strings of symbols.
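You can see the purely-syntactic point in a toy example: using nothing but adjacent swaps (permutation) and deletion of adjacent duplicates (contraction), a sequence of symbols can be massaged until it behaves exactly like a set. A Python sketch (the encoding and the sorting-as-permutation trick are mine):

```python
def normalize(seq):
    """Reduce a sequence of formula strings using only two syntactic
    moves: contract an adjacent duplicate, or permute (swap) two
    adjacent entries. The result is canonical, i.e. set-like."""
    seq = list(seq)
    changed = True
    while changed:
        changed = False
        for i in range(len(seq) - 1):
            if seq[i] == seq[i + 1]:                      # contraction
                del seq[i + 1]
                changed = True
                break
            if seq[i] > seq[i + 1]:                       # permutation
                seq[i], seq[i + 1] = seq[i + 1], seq[i]
                changed = True
                break
    return tuple(seq)

print(normalize(("B", "A", "B", "A")))  # ('A', 'B')
```

This is also exactly what you give up in substructural logics like linear logic: drop contraction, and ("A", "A") is no longer interchangeable with ("A",), so formulas become resources that are consumed when used.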

Could you do another article presenting these rules from an intuitionistic perspective?

Wikipedia says, "Surprisingly, some small changes in the rules of LK suffice to turn it into a proof system for intuitionistic logic. To this end, one has to restrict [the system] to sequents with exactly one formula on the right-hand side, and modify the rules to maintain this invariant."

I don't see how this can be done. Thanks!