\chapter{Propositional model-theory}
\label{ch:prop}

\setcounter{section}{-1}
\section{Propositional formulas}\label{sect:prop-form}

This chapter is inspired in part by Chang and Keisler
\cite[\S~1.2]{MR0409165}, who describe the subject to be discussed
here as `{}``toy'' model theory'. 

Usually, the term \tech{model-theory} refers to first-order
model-theory, because the logic it uses is first-order logic.  The
notion of \tech{structure} defined above in \S~\ref{S:structures} is
the notion
as used in first-order model-theory.  A structure provides an
interpretation for certain symbols; also, as we shall see, a structure
can be a \tech{model} for sets of \tech{sentences.}  

The concepts of structure, interpretation, model, and sentence, have
analogues in the simpler context of propositional logic.  In this logic, a
\tech{truth-assignment} will take the place of a structure.  A
truth-assignment will provide an interpretation for 
\tech{propositional formulas,} and will serve as a
model for sets of propositional formulas.  

  Our official signature for propositional logic will be
  \begin{equation*}
    \{\to,\lnot{}\},
  \end{equation*}
although we may introduce other connectives as abbreviations.  With
the elements of our signature, along with parentheses, we shall build
up \defn{propositional formulas} from a countably infinite set
\begin{equation*}
  \PVar
\end{equation*}
of
\defn{propositional variables}.  For us, this set will be
\begin{equation*}
  \{P_k:k\in\varN\}; 
\end{equation*}
however, we establish:

\begin{notation}
  Bold-face letters $\sP$, $\sQ$, and $\sR$, will stand for
  members of the set $\PVar$.
\end{notation} 
The set of
{propositional formulas} will be called
\begin{equation*}
\PFm;
\end{equation*}
we shall give a set-theoretic definition of this set.

A formula is a certain \tech{string} of symbols.  A \defn{string} is
just a function on some $n$ in $\varN$.  So, a string is just an
$n$-tuple; but we shall generally write a string as
\begin{equation*}
  s_0s_1\dotsb s_{n-1}
\end{equation*}
instead of $(s_0,s_1,\dots,s_{n-1})$.  Each of the expressions $s_k$
here stands for an \defn{entry} in the string.  The \defn{length} of the
string is $n$: we may write
\begin{equation*}
  \length{s_0s_1\dotsb s_{n-1}}=n.
\end{equation*}
The string \defn{begins} with $s_0$ and \defn{ends}
with $s_{n-1}$.  (The unique
string of length $0$ has no entries, and no beginning or ending.)  If the
string $s_0s_1\dotsb s_{n-1}$ is in $\PFm$, then each entry $s_i$ in
the string will be:
\begin{mylist}
  \item
an element of the set $\PVar$ of variables, or 
\item
one of the connectives $\to$ and $\lnot$, or
\item
one of the parentheses $($ or $)$.
\end{mylist}
We shall want to refer to strings by single letters, such as
$\sA$ and $\sB$.  As such, these letters are strings
of length $1$; but they \emph{stand for} strings of other lengths.  I
am writing the letters in boldface as a reminder that they are not
literally strings of symbols in our propositional logic.\footnote{In
  technical terms, they are
\tech{syntactical variables:} they are certain symbols of
  the \tech{syntax language.}  The latter is the language that we are using
  to talk about the \tech{object language,} which in this case is the
  language of propositional logic, which uses
  the symbols just listed above.  See \cite[\S~8]{MR18:631a}.}  For
the same reason, $\sP$, $\sQ$, and $\sR$, are written in bold face; they
are not variables in $\PFm$, but they \emph{stand for} variables.

For the moment, let $S$\label{strings} be the set of \emph{all}
strings of entries from the set 
\begin{equation*}
  \PVar\cup\{\to,\lnot,(,)\}.
\end{equation*}
That is, let
\begin{equation*}
S=  \bigcup_{n\in\varN}(\PVar\cup\{\to,\lnot,(,)\})^n.
\end{equation*}
We shall define $\PFm$ as a subset of $S$.  To do so, we let $\family U$
be the set of all subsets $N$ of $S$ such that:
\begin{mylist}
  \item
each variable (considered as a string of length $1$) is in $N$;
\item
if $\sA$ is in $N$, then $\lnot\sA$ is in $N$;
\item
if $\sA$ and $\sB$ are in $N$, then $(\sA\to\sB)$ is in $N$.
\end{mylist}
To put this another way,\footnote{And in a way that uses the formal
  symbolism of propositional and first-order logic!} for the moment,
  let $f$ be the singulary operation such that $f(\sA)$ is
the string $\lnot \sA$, and let $g$ be the binary operation such
that $g(\sA,\sB)$ is the string $(\sA\to\sB)$, for all $\sA$ and $\sB$
in $S$.  Then
\begin{multline*}
  \family U=\{\Xi\in\pow{S}:\PVar\included \Xi\land
\Forall X(X\in \Xi\to f(X)\in\Xi)\land{}\\
{}\land \Forall X\Forall Y(X\in\Xi\land Y\in \Xi\to g(X,Y)\in\Xi)\}.
\end{multline*}
Now we can define
\begin{equation*}
  \PFm=\bigcap\family U.  
\end{equation*}
This is an \tech{inductive} definition, as will be discussed further
in the next section.  In particular, it means that every element of
$\PFm$ can be displayed as the `trunk' or `root' of a \tech{tree}
whose `leaves' are variables:

\begin{example}\label{example:tree}
The string $(P_0\to(\lnot P_0\to P_1))$ belongs to $\PFm$, because it
is constructed from the variables $P_0$ and $P_1$ in a way permitted by
the conditions for being an element of $\PFm$.  This is suggested by
the following picture:
  \begin{equation*}
  \xymatrix@!0{
&& *+[o][F]{P_0} \ar@{-}[dr]&&&\\
&&&*+[F]{\lnot P_0} \ar@{-}[dr] && *+[o][F]{P_1} \ar@{-}[dl]\\
*+[o][F]{P_0} \ar@{-}[dr] &&&& *+[F]{(\lnot P_0\to P_1)} \ar@{-}[dlll] &\\
& *+[F]{(P_0\to(\lnot P_0\to P_1))} &&&&
}
\end{equation*}
One might draw this picture upside down:
\begin{equation*}
  \xymatrix@!0{
& *+[F]{(P_0\to(\lnot P_0\to P_1))} &&&&\\
&&&& *+[F]{(\lnot P_0\to P_1)} \ar@{-}[ulll] &\\
&&&*+[F]{\lnot P_0} \ar@{-}[ur] &&&\\
*+[o][F]{P_0} \ar@{-}[uuur]&& 
*+[o][F]{P_0} \ar@{-}[ur]&&& *+[o][F]{P_1} \ar@{-}[uul]\\ 
}
\end{equation*}
Thus one can depict a
  formula as being built \emph{up} from the variables, rather than
  \emph{down} from them.
\end{example}
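The bottom-up construction in the example can be imitated in code.  The
following Python sketch is an illustration outside the formal development:
the token encoding (variables as \texttt{'P0'}, \texttt{'P1'}, \dots) and the
names \texttt{neg} and \texttt{imp} are our own, standing for the operations
$f$ and $g$ above.

```python
# A string over PVar ∪ {->, ~, (, )} is modelled as a tuple of tokens.

def neg(A):
    """The operation f: A |-> ~A."""
    return ('~',) + A

def imp(A, B):
    """The operation g: (A, B) |-> (A -> B)."""
    return ('(',) + A + ('->',) + B + (')',)

P0, P1 = ('P0',), ('P1',)

# build (P0 -> (~P0 -> P1)) from the leaves up, as in the tree above
F = imp(P0, imp(neg(P0), P1))
```

Each application of \texttt{neg} or \texttt{imp} corresponds to one interior
node of the tree drawn in the example.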

In this context, a \defn{tree} is a set $T$ equipped with a partial
ordering $\leq$ such that, for each $a$ in $T$, the set
$\{x:x\leq a\}$ is finite and is totally ordered by $\leq$.
(Here a partial ordering is just a reflexive, antisymmetric,
and transitive relation.  In a more general definition of tree, each
set $\{x:x\leq a\}$ is only required to be \emph{well-ordered} by
$\leq$; the set may be infinite.) 

\begin{exercises}
  \item
As in Example~\ref{example:tree}, draw trees for some formulas, such
as 
\begin{mylist}
  \item
$(P_0\to(P_1\to P_0))$,
\item
$((P_0\to(P_1\to P_2))\to((P_0\to P_1)\to(P_0\to P_2)))$,
\item
$((\lnot P_0\to\lnot P_1)\to(P_1\to P_0))$.
\end{mylist}
\item
Show that the binary relation
\begin{equation*}
  \{(\sA,\sB)\in\PFm^2:\length{\sA}\leq\length{\sB}\}
\end{equation*}
on $\PFm$ is a reflexive and transitive relation $\mathrel R$
such that 
\begin{myitem}
  \item
  $\sA\mathrel R\lnot\sA$, 
\item
$\sA\mathrel R(\sA\to\sB)$, and 
\item
$\sB\mathrel R(\sA\to\sB)$,
\end{myitem}
 for all $\sA$ and $\sB$ in $\PFm$.
\item\label{exercise:sub-f}
The \defn{sub-formulas} of a formula are the formulas that appear in
the tree for that formula.
Give a precise definition of the notion of sub-formula: that
is, define a partial ordering $\leq$ of $\PFm$ such that $\sA\leq\sB$
if and only if $\sA$ is a sub-formula of $\sB$.  (Your answer may be
that $\leq$ is the intersection of a family of partial orderings of
$\PFm$.) 
\end{exercises}




\section{Induction}\label{sect:induction}

The definition of $\PFm$ is \defn{inductive,} because it makes the next
theorem possible.
Recall that, to prove by \defn{induction} that a certain set $A$ of
natural numbers
contains \emph{all} natural numbers, one proves two things:
\begin{mylist}
  \item
that $A$ contains $0$,
and 
\item
that if $A$ contains $n$, then $A$ contains $n+1$ (no matter which
natural number $n$ is).  
\end{mylist}
With the
following theorem, a similar method will be available for showing that a
certain set of propositional formulas contains \emph{all}
propositional formulas.  This method can be called \defn{induction on
  (the complexity of) formulas.}  The proof that the method works is
almost immediate from the definition of $\PFm$:

\begin{theorem}[Induction on Formulas]\label{thm:induction}
  Suppose $N$ is a set of propositional formulas such that:
\begin{mylist}
  \item
each variable (considered as a string of length $1$) is in $N$;
\item
if $\sF$ is in $N$, then $\lnot\sF$ is in $N$;
\item
if $\sF$ and $\sG$ are in $N$, then $(\sF\to\sG)$ is in $N$.
\end{mylist}
Then $N=\PFm$.
\end{theorem}

\begin{proof}
  Let $N$ be as supposed.  Then, in particular, $N\included\PFm$.
  Moreover, $N$ is a
  member of the set $\family U$ given in the definition of $\PFm$; so
  $\bigcap\family U\included N$.  But $\PFm=\bigcap\family U$.
  Therefore  $N\included\PFm$ and $\PFm\included N$; so $N=\PFm$. 
\end{proof}

\begin{notation}
  Bold-face letters $\sF$, $\sG$, and $\sH$, (and variants like $\sF'$
  and $\sG_k$,) will
always stand for formulas (that is,
elements of $\PFm$).  
\end{notation}

Suppose $g:\PVar\to\PFm$.  Then $g$ determines a certain function, to be denoted
\begin{equation*}
  \sF\longmapsto\sF(g),
\end{equation*}
with domain $\PFm$.  This function can be called \defn{substitution
  with respect to $g$}.  The reason for the unusual way of denoting
  the function will become clear. 
Suppose $g(P_k)$ is $\sG_k$ for each $k$ in $\varN$.  If
exactly $m$ entries in $\sF$ are variables, so that $\sF$ can be
written as
\begin{equation*}
  \dots P_{k_0}\dots P_{k_1}\dots\dotsb\dots P_{k_{m-1}}\dots,
\end{equation*}
then the formula $\sF(g)$ is
\begin{equation*}
  \dots \sG_{k_0}\dots \sG_{k_1}\dots\dotsb\dots \sG_{k_{m-1}}\dots
\end{equation*}
  If the variables that
appear in $\sF$ belong to the set $\{P_0,\dots,P_{n-1}\}$, 
then $\sF$ can be denoted
\begin{equation*}
  \sF(P_0,\dots,P_{n-1}),
\end{equation*}
and
$\sF(g)$ can be denoted
\begin{equation*}
  \sF(\sG_0,\dots,\sG_{n-1}).
\end{equation*}
It would be a tedious exercise, but a practicable one, to write down an
expression for the $k$th entry in $\sF(\sG_0,\dots,\sG_{n-1})$, for
arbitrary $k$.  That
$\sF(\sG_0,\dots,\sG_{n-1})$ is actually \emph{in} $\PFm$ is a
consequence of the following:

\begin{theorem}
  Let $g:\PVar\to\PFm$.  Then the function $\sF\mapsto\sF(g)$ on $\PFm$
  has co-domain $\PFm$.
\end{theorem}

\begin{proof}
We use induction on formulas to prove
\begin{equation*}
  \{\sF:\sF(g)\in\PFm\}=\PFm.
\end{equation*}
The argument has three parts:
\begin{mylist}
  \item
By assumption, $\sP(g)$, which is $g(\sP)$, is in $\PFm$.
\item
Suppose $\sF(g)$ is a formula $\sH$.  Then substitution with respect
to $g$ in $\lnot\sF$ results in
$\lnot\sH$, which is in $\PFm$ by definition (of $\PFm$).
\item
Suppose $\sF(g)$ and $\sG(g)$ are formulas $\sH$ and $\sH'$
respectively.  Then $(\sF\to\sG)(g)$ is 
$(\sH\to\sH')$, which again is in $\PFm$ by definition.
\end{mylist}
This completes the induction and the proof.
\end{proof}
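On the string representation, substitution with respect to $g$ is a single
pass over the entries, replacing each variable entry by the corresponding
formula.  A Python sketch (the token encoding is our own illustration, not
part of the formal development):

```python
def substitute(F, g):
    """Substitution with respect to g: F is a tuple of tokens, and g
    maps variable tokens to token tuples.  Each variable entry is
    replaced by the corresponding string; every other entry is copied."""
    out = []
    for tok in F:
        out.extend(g.get(tok, (tok,)))   # variables absent from g stay put
    return tuple(out)

# substitute (P1 -> P2) for P0 in ~P0
g = {'P0': ('(', 'P1', '->', 'P2', ')')}
result = substitute(('~', 'P0'), g)
```

Here a variable not in the domain of the dictionary is left unchanged, so the
sketch treats such a $g$ as the identity there.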

If the foregoing discussion of substitution seems too informal or
imprecise, let it be noted that the operation $\sF\mapsto\sF(g)$ can be
defined \tech{recursively,} by means of Theorem~\ref{thm:recursion}
below.  However, substitution makes sense for sets of strings that do
\emph{not} admit definition by recursion or even proof by induction.

\begin{exercises}
\item
Prove by induction that every formula has as many left as right
parentheses. 
\item
Prove by induction that an entry $\lnot$ is never preceded by a
variable in any formula.
  \item
For each $k$ in $\varN$, there is a function $h_k$ from $\PFm$ to
$\varN$ such that $h_k(\sF)$ is the number of times that $P_k$ appears
in $\sF$.  Using this, find the length of $\sF(\sG_0,\dots,\sG_{n-1})$ in
terms of $\length{\sF}$ and the $\length{\sG_k}$.
\item
Supposing  $\sF(\sG_0,\dots,\sG_{n-1})$ is $s_0s_1\cdots s_{M-1}$,
what is $s_k$?
\end{exercises}

\section{Recursion}

Recall that functions from $\varN$ into an arbitrary set $B$ can be
defined by \defn{recursion:} 
If
\begin{mylist}
  \item
$c\in B$, and 
\item
$f:B\to B$, 
\end{mylist}
then there is exactly one
function $g$ from $\varN$ to $B$ such that 
\begin{mylist}
  \item
$g(0)=c$, and
\item
$g(n+1)=f(g(n))$ for all $n$ in $\varN$.  
\end{mylist}
That there is \emph{exactly} one such function $g$ means:
\begin{mylist}
  \item
there is at most one such function, and
\item
there is at least one such function.
\end{mylist}
The first of these claims can be proved by induction; the second
requires more.  Similarly, induction on formulas gives us the
following: 

\begin{lemma}\label{lem:uniqueness}
  Suppose $A$ is a set, and
  \begin{mylist}
    \item
  $h_0:\PVar\to A$,
\item
$h_1$ is a singulary operation on $A$, and 
\item
$h_2$ a binary operation on $A$.  
  \end{mylist}
Then there is at most one function $h$ on
  $\PFm$ such that 
  \begin{mylist}
    \item
$h$ agrees with $h_0$ on $\PVar$;
\item
$h(\lnot\sF)=h_1(h(\sF))$, for all $\sF$;
\item
$h((\sF\to\sG))=h_2(h(\sF),h(\sG))$, for all $\sF$ and $\sG$.
  \end{mylist}
\end{lemma}

\begin{proof}
  Suppose $h'$ and $h''$ are such functions $h$.  Let $D$
  consist of
  those formulas $\sF$ such that $h'(\sF)=h''(\sF)$.  Then $D$
  contains the variables; and if $D$ contains $\sF$ and $\sG$, then
  $D$ contains $\lnot\sF$ and $(\sF\to\sG)$.  By induction, $D$ is
  $\PFm$, so $h'=h''$.
\end{proof}

If there \emph{is} a function $h$ as described in the Lemma, then it
is said to be defined by \defn{recursion.}
To prove that recursively defined functions exist at all, we
shall use the following.

\begin{lemma}\label{lem:formulas}
Every formula  in $\PFm$ meets the following conditions:
\begin{mylist}
\item
it has positive length;
\item
if it has length $1$, then it is a variable;
\item
if it has length greater than $1$, then it begins with $\lnot$
or $($; 
  \item
if it begins with
$\lnot$, then it is $\lnot \sF$ for some \emph{formula} $\sF$; 
\item
if it begins with $($, then it is $(\sF\to \sG)$ for some
  \emph{formulas} $\sF$ and~$\sG$.
\end{mylist}
\end{lemma}

\begin{proof}
We use induction on formulas.  Let $N$ be the subset of $\PFm$ containing
every formula that meets the given conditions.  

We first show that $N$ contains the variables.  Let $\sP$ be a variable.
Then $\sP$ has length $1$, which is positive;
so $\sP$ meets the first two conditions.  Since $\sP$ does not have
length greater than $1$, it trivially meets the third condition.
  Since $\sP$ begins with a
variable, it does not begin with $($ or $\lnot$; hence $\sP$ trivially
meets the remaining two conditions.  Therefore $\sP$ is in $N$.

Suppose $N$ contains $\sF$.  Then $\sF$ has positive length, so
$\lnot\sF$ is a formula that has length greater than $1$ and
satisfies the five conditions.  Hence $\lnot\sF$ is in $N$.

Suppose finally that $N$ contains $\sF$ and $\sG$.  Then $(\sF\to\sG)$
is a formula that meets the five conditions, so it is in $N$.

Hence $N=\PFm$ by Theorem~\ref{thm:induction}. 
\end{proof}

So that we can talk clearly about formulas, we declare that:
\begin{mylist}
\item
  every formula $\lnot\sF$ is a \defn{negation;}
\item
every formula $(\sF\to\sG)$ is an \defn{implication,} with \defn{antecedent}
$\sF$ and \defn{consequent} $\sG$.  
\end{mylist}
The last lemma formalizes the
straightforward observation that every formula is a variable, a
negation, or an implication, and is \emph{only} one of these.  
To prove that recursively defined functions on $\PFm$ exist, we shall
need to know that every implication has a \emph{unique} antecedent and
consequent.
To prove \emph{this,} it will be useful to have the following definitions.
An \defn{initial segment} of the string $s_0s_1\dotsb s_{n-1}$ is one
of the strings 
$s_0s_1\dotsb s_{k-1}$, where $k\leq n$.  This initial segment is
\defn{proper} if $k<n$.

The proof of the following lemma requires \tech{strong induction} on
the \emph{lengths} of formulas.  Recall that, to prove by \defn{strong
  induction} that a subset
$A$ of $\varN$ \emph{is} $\varN$, one proves that, for all $n$ in
$\varN$, if $A$ contains every natural
number that is \emph{less} than $n$, then $A$ contains $n$.

\begin{lemma}\label{lem:no-pis}
  No proper initial segment of a formula is a formula.
\end{lemma}

\begin{proof}
Let $N$ be the set of formulas of which no proper
initial segment is a formula.  We shall prove by strong induction on
the lengths of formulas that
$N=\PFm$.  Suppose $N$ contains all formulas \emph{shorter} than a
formula $\sF$.
By Lemma~\ref{lem:formulas}, we know that $\sF$ is a variable $\sP$ or a
formula $\lnot
\sG$ or $(\sG\to \sH)$.  The only
proper initial segment of $\sP$ is the empty string, which is not a
formula.  Any nonempty proper initial segment of $\lnot \sG$ is $\lnot
\sA$ for some proper initial segment $\sA$ of $\sG$; so $\sA$ is
not a formula, by our
strong inductive hypothesis; hence $\lnot \sA$ is not a formula,
again by Lemma~\ref{lem:formulas}.
Finally, say $\sF$ is $(\sG\to \sH)$.  Any initial segment of $\sF$ that is a
formula is
$(\sG'\to \sH')$ for some formulas $\sG'$ and $\sH'$ (by
Lemma~\ref{lem:formulas}).  Then one of $\sG$ and $\sG'$
is an initial segment of the other.  But each one is shorter than $\sF$;
so by strong inductive hypothesis, $\sG$ and $\sG'$ are the same formula.
Then $\sH'$ is an initial segment of $\sH$; so \emph{these} formulas
must be the same.  Thus, in all cases, $\sF$ is in $N$.  By strong
induction, $N=\PFm$. 
\end{proof}


\begin{theorem}[Unique Readability of Formulas]\label{thm:urf}
Every formula is a variable, a negation, or an implication,
but only one of these; and every implication has a unique antecedent
and consequent.
\end{theorem}

\begin{proof}
  We know the first part by Lemma~\ref{lem:formulas}.  For the second
  part, suppose $(\sF\to\sG)$ and $(\sF'\to\sG')$ are the same
  formula.  Then one of $\sF$ and $\sF'$ is an initial segment of the
  other, so they are the same by Lemma~\ref{lem:no-pis}; hence $\sG$
  and $\sG'$ are the same. 
\end{proof}
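Unique readability is exactly what makes a parser possible.  Scanning an
implication from the left, the main arrow is the only arrow at parenthesis
depth $1$, since every other implication inside the formula carries its own
pair of parentheses.  A Python sketch of such a splitter (the token encoding
is our own, for illustration):

```python
def parse(s):
    """Split a formula, given as a tuple of tokens, into its unique
    immediate constituents.  Tokens: variables 'P0', 'P1', ...,
    plus '~', '->', '(', ')'."""
    if len(s) == 1:                      # a variable
        return ('var', s[0])
    if s[0] == '~':                      # a negation ~F
        return ('not', s[1:])
    # an implication (F -> G): every arrow inside F or G lies within
    # its own parentheses at depth >= 2, so the main arrow is the
    # unique '->' at depth 1
    depth = 0
    for i, tok in enumerate(s):
        if tok == '(':
            depth += 1
        elif tok == ')':
            depth -= 1
        elif tok == '->' and depth == 1:
            return ('imp', s[1:i], s[i + 1:-1])

s = ('(', 'P0', '->', '(', '~', 'P0', '->', 'P1', ')', ')')
```

By Unique Readability, the pair returned in the implication case does not
depend on which occurrence of the arrow the scan might have found: there is
only one.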

Now we can prove that recursively defined functions on $\PFm$ exist.
Note carefully how the proof relies on the
previous theorem. 

\begin{theorem}[Recursion on Formulas]\label{thm:recursion}
  Suppose $A$ is a set, and
  \begin{mylist}
    \item
  $h_0:\PVar\to A$,
\item
$h_1$ is a singulary operation on $A$, and 
\item
$h_2$ a binary operation on $A$.  
  \end{mylist}
Then there is a unique function $h$ on
  $\PFm$ such that 
  \begin{mylist}
    \item
$h$ agrees with $h_0$ on $\PVar$;
\item
$h(\lnot\sF)=h_1(h(\sF))$, for all $\sF$;
\item
$h((\sF\to\sG))=h_2(h(\sF),h(\sG))$, for all $\sF$ and $\sG$.
  \end{mylist}
\end{theorem}

\begin{proof}
  By Lemma~\ref{lem:uniqueness}, we now need only 
prove that a function $h$ \emph{does} exist as desired.  Let
$\family U$ be the set of
all subsets $R$ of $\PFm\times A$ such that:
\begin{mylist}
  \item
$(\sP,h_0(\sP))\in R$ for all variables $\sP$;
\item
if $(\sF,b)\in R$, then $(\lnot\sF,h_1(b))\in R$;
\item
if $(\sF,b)\in R$ and $(\sG,c)\in R$, then $((\sF\to\sG),h_2(b,c))\in R$.
\end{mylist}
Let $S=\bigcap\family U$.  Then $S\in\family U$, so $S$ has the
properties desired of $h$, except perhaps for being a function.  To
prove that $S$ \emph{is} a function on $\PFm$, let $D$ be the set of
formulas $\sF$ for which there is a unique $b$ in $A$ such that
$(\sF,b)\in S$.  We proceed again by induction on formulas:
\begin{mylist}
  \item
By its definition, $S$ contains every ordered pair $(\sP,h_0(\sP))$.
If $b\neq h_0(\sP)$, then $S\setminus\{(\sP,b)\}\in\family U$, so
$S\included S\setminus\{(\sP,b)\}$, which means
$(\sP,b)\notin S$.  Hence $\sP\in D$.
 \item
Suppose $\sF\in D$.  Then $(\sF,b)\in S$ for some unique $b$.  Hence
$S$ contains $(\lnot\sF,h_1(b))$, 
but if $c\neq h_1(b)$, then $S\setminus\{(\lnot\sF,c)\}\in \family U$,
so $(\lnot\sF,c)\notin S$.  Therefore $\lnot\sF\in D$.
\item
Suppose finally that $\sF$ and $\sG$ are in $D$, and $(\sF,b)$ and
$(\sG,c)$ are in $S$.  By Theorem~\ref{thm:urf}, if $(\sF'\to\sG')$
is the same formula as $(\sF\to\sG)$, then $\sF'$ is $\sF$, and $\sG'$
is $\sG$.  Hence, if $d\neq h_2(b,c)$, then
$S\setminus\{((\sF\to\sG),d)\}\in\family U$, so $((\sF\to\sG),d)\notin
S$.  Therefore $(\sF\to\sG)\in D$.
\end{mylist}
We can conclude that $D=\PFm$, so $S$ is a function $h$ as desired.
\end{proof}

\begin{example}
There is a unique singulary operation $\sF\mapsto\sF^*$ on $\PFm$
  such that
  \begin{mylist}
    \item
$\sP^*=\sP$ for each $\sP$;
\item
$(\lnot\sF)^*=\lnot\sG$, where $\sG$ is $\sF^*$, for each formula $\sF$;
\item
$(\sF\to\sG)^*=\lnot(\sG^*\to\sF^*)$ for all formulas $\sF$ and $\sG$.
  \end{mylist}
In the notation of the Recursion Theorem, $\sF\mapsto\sF^*$ is the
function $h$
when $h_0$ is $\id_{\PVar}$, and $h_1$ is $\sF\mapsto\lnot\sF$, and
$h_2$ is $(\sF,\sG)\mapsto\lnot(\sG\to\sF)$.
Hence, for example,
\begin{align*}
  (\sP\to(\lnot\sQ\to\sR))^*
&=\lnot((\lnot\sQ\to\sR)^*\to\sP^*)\\
&=\lnot(\lnot(\sR^*\to\lnot\sQ^*)\to\sP)\\
&=\lnot(\lnot(\sR\to\lnot\sQ)\to\sP),
\end{align*}
whereas
$((\sP\to\lnot\sQ)\to\sR)^*$ is the formula
$\lnot(\sR\to\lnot(\lnot\sQ\to\sP))$.
\end{example}
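The Recursion Theorem has a direct computational reading once formulas are
given by their parse trees rather than as strings: the function $h$ is
computed by structural recursion.  A Python sketch (the parse-tree encoding
is our own illustration), applied to the operation $\sF\mapsto\sF^*$ of the
example:

```python
def recurse(h0, h1, h2):
    """The unique h with h(P) = h0(P), h(~F) = h1(h(F)), and
    h((F -> G)) = h2(h(F), h(G)), for formulas given as parse trees
    ('var', name), ('not', F), ('imp', F, G)."""
    def h(F):
        if F[0] == 'var':
            return h0(F[1])
        if F[0] == 'not':
            return h1(h(F[1]))
        return h2(h(F[1]), h(F[2]))
    return h

# the operation F |-> F* of the example:
# h0 = identity, h1 = negation, h2(b, c) = ~(c -> b)
star = recurse(lambda p: ('var', p),
               lambda b: ('not', b),
               lambda b, c: ('not', ('imp', c, b)))

P, Q, R = ('var', 'P0'), ('var', 'P1'), ('var', 'P2')
result = star(('imp', ('imp', P, ('not', Q)), R))
```

With the variables $P_0$, $P_1$, $P_2$ in the roles of $\sP$, $\sQ$, $\sR$,
the computed \texttt{result} is the parse tree of
$\lnot(\sR\to\lnot(\lnot\sQ\to\sP))$, agreeing with the last claim of the
example.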

The proof of the Recursion Theorem can be modified so as to yield the
following more general result.  (The reader should check the details.)

\begin{porism}\label{por:recursion}
  Suppose
  \begin{myitem}
    \item
  $h_0:\PVar\to A$, 
\item
$h_1:\PFm\times A\to A$, and
\item
 $h_2:(\PFm\times A)^2\to A$.  
  \end{myitem}
Then there is a unique function $h$ on
  $\PFm$ such that 
  \begin{mylist}
    \item
$h$ agrees with $h_0$ on $\PVar$;
\item
$h(\lnot\sF)=h_1(\sF,h(\sF))$ for all $\sF$;
\item
$h((\sF\to\sG))=h_2((\sF,h(\sF)),(\sG,h(\sG)))$ for all
 $\sF$ and $\sG$.
  \end{mylist}
\end{porism}

We used Unique Readability (Theorem~\ref{thm:urf}) to prove the
Recursion Theorem,~\ref{thm:recursion};
conversely, Unique Readability follows from
Porism~\ref{por:recursion}.  Indeed, 
using the notation of the Porism,
let $A$ be $\PFm$, let $h_0$ and $h_1$ be chosen arbitrarily,
and let $h_2$ be 
\begin{equation*}
  ((\sF,\sF'),(\sG,\sG'))\longmapsto(\sF,\sG).
\end{equation*}
Let $h$ be the function guaranteed by the Porism.  Then
$h((\sF\to\sG))=(\sF,\sG)$.  Thus $h$ selects, from an
implication, its antecedent and consequent.  Since $h$ is a function,
the antecedent and consequent are unique.

Note well that the Recursion Theorem is \emph{not} a consequence of
the Induction Theorem:

\begin{example}\label{example:no-recursion}
Suppose we define $\PFm$ without using parentheses.  We shall still be
  able to use induction, but if we are not careful, we shall not have
  definitions by recursion.
  Indeed, say we define $\mathrm{nPF}$ (for `not $\PFm$') so that:
  \begin{mylist}
    \item
each variable is in $\mathrm{nPF}$;
\item
if $\sA$ is in $\mathrm{nPF}$, then so is $\lnot\sA$;
\item
if $\sA$ and $\sB$ are in $\mathrm{nPF}$, then so is
$\sA\to\sB$.
  \end{mylist}
Then proof by induction in $\mathrm{nPF}$ is possible.  However, suppose we
try to define a function~$f$ from $\mathrm{nPF}$ into $\PFm$ so as to
send every element
of the former to its `equivalent' in the latter:
\begin{mylist}
  \item
$f(\sP)=\sP$;
\item
$f(\lnot\sF)=\lnot f(\sF)$;
\item
$f(\sF\to\sG)=(f(\sF)\to f(\sG))$.
\end{mylist}
Then $f(\sP\to \sQ)=(\sP\to\sQ)$ for all variables $\sP$ and $\sQ$; but
$f(P_0\to P_1\to P_2)$ must be both $(P_0\to(P_1\to P_2))$ and $((P_0\to
P_1)\to P_2)$, which is absurd, since these are different formulas,
and $f$ is a function.

A correct way to avoid using parentheses is to use 
\tech{\L ukasiewicz-} or \tech{Polish notation,} writing
$\mathord{\to}\;\sF\;\sG$ 
instead of $(\sF\to\sG)$.  Details are left to the reader.  See also
\S~\ref{sect:notation} below.
\end{example}
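That \L ukasiewicz-notation restores unique readability can be seen by
writing the reader: each connective announces how many immediate sub-formulas
follow, so a single left-to-right pass parses the string with no parentheses
at all.  A Python sketch (the token encoding is ours, for illustration):

```python
def read_formula(tokens, i=0):
    """Read one Polish-notation formula starting at index i; return the
    parse tree together with the index just past it.  Tokens are '->',
    '~', or a variable."""
    t = tokens[i]
    if t == '~':                      # ~ is followed by one formula
        F, j = read_formula(tokens, i + 1)
        return ('not', F), j
    if t == '->':                     # -> is followed by two formulas
        F, j = read_formula(tokens, i + 1)
        G, k = read_formula(tokens, j)
        return ('imp', F, G), k
    return ('var', t), i + 1          # a variable stands alone

# -> P0 -> ~ P0 P1   corresponds to   (P0 -> (~P0 -> P1))
tree, end = read_formula(('->', 'P0', '->', '~', 'P0', 'P1'))
```

The returned index makes the reader deterministic: after each sub-formula
there is exactly one place where reading can resume, which is the
computational content of unique readability for this notation.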

\begin{exercises}
\item
Give a recursive definition of the set of sub-formulas of a formula.
(See \S~\ref{sect:prop-form}, Exercise~\ref{exercise:sub-f}.)
  \item
Prove Porism \ref{por:recursion}.
\item
Prove the Recursion Theorem,~\ref{thm:recursion}, in case all formulas
are written in \L ukasiewicz-notation (see
Example~\ref{example:no-recursion}). 
\end{exercises}

\section{Syntactic entailment}\label{sect:syn-entail}

Now suppose $\Sigma$ is a subset of $\PFm$.  We shall define, on
$\PFm$, a
singulary relation called \defn{syntactic entailment by $\Sigma$}.
We shall denote the relation by
\begin{equation*}
  \Sigma\proves.
\end{equation*}  
If 
$\sF$ satisfies the relation, we shall write
\begin{equation*}
  \Sigma\proves\sF.
\end{equation*}
This can be read as \Eng{$\Sigma$ syntactically entails $\sF$}, or
\Eng{$\sF$ is a syntactic consequence of~$\Sigma$.}  
The relation is defined by the following conditions:
\begin{mylist}
  \item
$\Sigma\proves(\sF\to(\sG\to \sF))$;
\item
$\Sigma\proves((\sF\to(\sG\to \sH))\to((\sF\to\sG)\to(\sF\to\sH)))$;
\item
$\Sigma\proves((\lnot\sF\to\lnot\sG)\to(\sG\to \sF))$;
\item
if $\sF\in\Sigma$, then $\Sigma\proves\sF$;
\item
if $\Sigma\proves\sF$, and $\Sigma\proves(\sF\to\sG)$, then
$\Sigma\proves\sG$.
\end{mylist}
Thus the set of syntactic consequences of $\Sigma$ is defined
inductively; that is, a claim analogous to Theorem~\ref{thm:induction}
can be proved (just as easily).  Looking ahead to their semantics, we
can name the three
families of formulas just given:
\begin{mylist}
  \item
$(\sF\to(\sG\to \sF))$ is \defn{Affirmation of the Consequent;}
\item
$((\sF\to(\sG\to \sH))\to((\sF\to\sG)\to(\sF\to\sH)))$ is
  \defn{Self-Distribution of Implication;}
\item
$((\lnot\sF\to\lnot\sG)\to(\sG\to \sF))$ is \defn{Contraposition.}
\end{mylist}
Every such formula will be called an \defn{Axiom.}
The rule that $\Sigma\proves\sG$, if $\Sigma\proves\sF$ and
$\Sigma\proves(\sF\to\sG)$, is the \defn{Rule of Detachment} (or
\defn{\emph{Modus Ponens}}); in particular, $\sG$ can be `detached'
from $(\sF\to\sG)$ by means of $\sF$.  So the set of syntactic
consequences of
$\Sigma$ is the smallest set of formulas that contains the elements of
$\Sigma$ and the Axioms and is \emph{closed} under application of the
Rule of Detachment. 

\begin{lemma}\label{lem:immediate}
  If $\Sigma\proves\sF$, and $\Sigma\included
\Tau$, then $\Tau\proves\sF$.
\end{lemma}

\begin{proof}
  Immediate.\footnote{For the typographically minded:  The letter
  $\Tau$ in the statement of the theorem is a Greek capital
  \emph{tau,} in upright font, rather than a Latin $T$, in italic font.}
\end{proof}

Syntactic entailment of a formula is established fundamentally by the
presence of the formula at the root of an appropriate \emph{tree;}
but the essential 
information in such a tree can be expressed as a \tech{deduction} or
\tech{formal proof.}  First, an example of a syntactic entailment:

\begin{lemma}\label{lem:FtoF}
  $\emptyset\proves (\sF\to \sF)$.
\end{lemma}

\begin{proof}
  We have the following sequence of observations:
  \begin{myenum}
    \item
$\emptyset\proves (\sF\to((\sF\to\sF)\to\sF))$ [by Affirmation of the
      Consequent]; 
\item
$\emptyset\proves ((\sF\to((\sF\to\sF)\to\sF))\to
  ((\sF\to(\sF\to\sF))\to (\sF\to \sF)))$ [by %\\ \mbox{}\hfill
 Self-Distribution of
  Implication];
\item
$\emptyset\proves ((\sF\to(\sF\to\sF))\to (\sF\to \sF))$
  [by Detachment from (2) by (1)];
\item
$\emptyset\proves (\sF\to(\sF\to\sF))$ [by Affirmation of the Consequent];
\item
$\emptyset\proves(\sF\to\sF)$ [by Detachment from (3) by (4)].
  \end{myenum}
This completes the proof.
\end{proof}

By the last two lemmas, we have $\Sigma\proves(\sF\to\sF)$ always.

The preceding proof can be written as a tree, thus:
 \begin{equation*}
   \xymatrix@C=4em@!0{
 &&&&
 *+[F-:<3pt>]{((\sF\to((\sF\to\sF)\to\sF))\to
   ((\sF\to(\sF\to\sF))\to (\sF\to \sF)))} \ar@{-}[dddl] \\
 &&*+[F-:<3pt>]{(\sF\to((\sF\to\sF)\to\sF))} \ar@{-}[ddr] &&\\
 *+[F-:<3pt>]{(\sF\to(\sF\to\sF))} \ar@{-}[ddr] &&&&\\
 &&&*+[F]{((\sF\to(\sF\to\sF))\to (\sF\to \sF))}
 \ar@{-}[dll] &\\
 &*+[F]{(\sF\to\sF)}&&&}
 \end{equation*}
Stripped further of explanatory details, the proof can be written as
the following string of length $5$ (the entries of the string are
themselves strings, namely, formulas): 
\begin{gather}\label{eqn:proof-line1}
 (\sF\to((\sF\to\sF)\to\sF)),\\
 ((\sF\to((\sF\to\sF)\to\sF))\to
  ((\sF\to(\sF\to\sF))\to (\sF\to \sF))),\\
 ((\sF\to(\sF\to\sF))\to (\sF\to \sF)),\\
 (\sF\to(\sF\to\sF)),\\\label{eqn:proof-line5}
(\sF\to\sF).  
\end{gather}
This string is a \tech{deduction} of $(\sF\to\sF)$ from $\emptyset$
(or any other set of formulas).

By definition, a \defn{(formal) proof} or \defn{deduction of $\sH$ from
  $\Sigma$} is
a string of formulas, ending
with $\sH$, such that each entry $\sG$ in the string
\begin{mylist}
  \item
is an Axiom, or 
\item
is an element of $\Sigma$, or
\item
is preceded (somewhere in the string) by formulas $\sF$ and
$(\sF\to\sG)$.
\end{mylist}
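This definition is directly checkable by machine: entry by entry, one
verifies that each formula is an Axiom, a member of $\Sigma$, or detachable
from two earlier entries.  A Python sketch (formulas as parse trees; the
encoding and the structural matching of the Axiom schemes are our own
illustration):

```python
# Formulas are parse trees: ('var', name), ('not', F), ('imp', F, G).

def is_axiom(F):
    """Match F against the three Axiom schemes."""
    if F[0] != 'imp':
        return False
    a, b = F[1], F[2]
    # Affirmation of the Consequent: (A -> (B -> A))
    if b[0] == 'imp' and b[2] == a:
        return True
    # Self-Distribution: ((A -> (B -> C)) -> ((A -> B) -> (A -> C)))
    if (a[0] == 'imp' and a[2][0] == 'imp' and b[0] == 'imp'
            and b[1] == ('imp', a[1], a[2][1])
            and b[2] == ('imp', a[1], a[2][2])):
        return True
    # Contraposition: ((~A -> ~B) -> (B -> A))
    return (a[0] == 'imp' and a[1][0] == 'not' and a[2][0] == 'not'
            and b == ('imp', a[2][1], a[1][1]))

def is_deduction(seq, sigma):
    """Each entry must be an Axiom, a member of sigma, or preceded by
    some F and (F -> G)."""
    for i, G in enumerate(seq):
        if not (is_axiom(G) or G in sigma
                or any(('imp', F, G) in seq[:i] for F in seq[:i])):
            return False
    return True

# the five-entry deduction of (P0 -> P0) given above
def imp(a, b):
    return ('imp', a, b)

P = ('var', 'P0')
PP = imp(P, P)
proof = [imp(P, imp(PP, P)),
         imp(imp(P, imp(PP, P)), imp(imp(P, PP), PP)),
         imp(imp(P, PP), PP),
         imp(P, PP),
         PP]
```

Running the checker on the five-entry string above confirms that it is a
deduction from the empty set, while the single formula $(P_0\to P_0)$ by
itself is not.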

\begin{lemma}\label{lem:pis-proof}
  Every initial segment of a formal proof from a set of formulas
is itself a formal proof from that set.
\end{lemma}

\begin{proof}
  Immediate.
\end{proof}

\begin{theorem}\label{thm:proves}
A formula $\sG$ has a formal proof from $\Sigma$ if and only if
  $\Sigma\proves\sG$. 
\end{theorem}

\begin{proof}
By strong induction on the lengths of formal proofs, we first show
that every formula with a formal proof from $\Sigma$ is a syntactic
consequence of $\Sigma$.  Suppose the claim is true for all formulas
with proofs of length less than $n$, and now $\sG$ has a formal proof
of length $n$.  If $\sG$ is an Axiom or an element of $\Sigma$, then
$\Sigma\proves\sG$.  If, in its proof, $\sG$ is
preceded by $\sF$ and $(\sF\to\sG)$, then, by inductive hypothesis and
Lemma~\ref{lem:pis-proof},
$\Sigma\proves\sF$ and $\Sigma\proves(\sF\to\sG)$, hence
$\Sigma\proves\sG$.  Hence $\Sigma\proves\sH$ for all formulas $\sH$
that are formally provable from $\Sigma$.

We prove the converse by induction on syntactic consequences.  Every
Axiom, and every element of $\Sigma$, is a formal proof of itself from
$\Sigma$.  Suppose $\sF$ has formal proof $\sA$, and $(\sF\to\sG)$ has formal
proof $\sB$, from $\Sigma$.  Then $\sA\;\sB\;\sG$ is a formal proof of
$\sG$ from $\Sigma$.  Hence every syntactic consequence of $\Sigma$ is
formally provable from it.
\end{proof}

Establishing syntactic entailment by means of the original definition
or by formal proof is usually quite tedious.  We shall develop
short-cuts.  First, let us introduce a simpler notation for formulas, in
the next section.

\section{Notation}\label{sect:notation}

We have chosen a signature for our propositional formulas, namely
$\{\to,\lnot\}$.  We have also chosen a `style' of notation, namely
\defn{infix} notation.  There are alternatives (as mentioned in
Example~\ref{example:no-recursion}).  

What we are calling \Eng{syntactic consequence} seems to have its origin
in the \emph{Begriffsschrift} \cite{MR0263601} of Gottlob Frege,
published in 1879.  (The title can be rendered as `ideography' or `concept
writing').  In Frege's work, what we call formulas appear not as strings, but 
as two-dimensional 
figures.  For example, our three Axioms correspond to Frege's
Judgments~(1),~(2), and---almost---(28); he writes them as follows:
\setlength{\unitlength}{1.2pt}
\begin{center}
\hfill
  \begin{picture}(40,66)(0,-63)
  \put(0,0){\line(1,0){30}}
  \put(30,-3){$\;\sF$}
  \put(20,-10){\line(0,1){10}}  
  \put(20,-10){\line(1,0){10}}  
  \put(30,-13){$\;\sG$}
  \put(10,-20){\line(0,1){20}}
  \put(10,-20){\line(1,0){20}}
  \put(30,-23){$\;\sF$}
  \linethickness{1.5pt}
  \put(0,-5){\line(0,1){10}}
  \end{picture}
\hfill
  \begin{picture}(50,66)(0,-63)
  \put(0,0){\line(1,0){40}}
  \put(40,-3){$\;\sH$}
  \put(30,-10){\line(0,1){10}}
  \put(30,-10){\line(1,0){10}}
  \put(40,-13){$\;\sF$}
  \put(20,-20){\line(0,1){20}}
  \put(20,-20){\line(1,0){20}}
  \put(40,-23){$\;\sG$}
  \put(30,-30){\line(0,1){10}}
  \put(30,-30){\line(1,0){10}}
  \put(40,-33){$\;\sF$}
  \put(10,-40){\line(0,1){40}}
  \put(10,-40){\line(1,0){30}}
  \put(40,-43){$\;\sH$}
  \put(30,-50){\line(0,1){10}}
  \put(30,-50){\line(1,0){10}}
  \put(40,-53){$\;\sG$}
  \put(20,-60){\line(0,1){20}}
  \put(20,-60){\line(1,0){20}}
  \put(40,-63){$\;\sF$}
  \linethickness{1.5pt}
  \put(0,-5){\line(0,1){10}}
\end{picture}
\hfill
  \begin{picture}(40,66)(0,-63)
  \put(0,0){\line(1,0){30}}
  \put(30,-3){$\;\sF$}
  \put(20,-10){\line(0,1){10}}  
  \put(20,-10){\line(1,0){10}}  
  \put(30,-13){$\;\sG$}
  \put(10,-20){\line(0,1){20}}
  \put(10,-20){\line(1,0){20}}
  \put(25,-24){\line(0,1){4}}
  \put(30,-23){$\;\sG$}
  \put(20,-30){\line(0,1){10}}
  \put(20,-30){\line(1,0){10}}
  \put(25,-34){\line(0,1){4}}
  \put(30,-33){$\;\sF$}
  \linethickness{1.5pt}
  \put(0,-5){\line(0,1){10}}
  \end{picture}
\hfill\mbox{}
\end{center}
This style of writing formulas never caught on, except in the
following sense:  To assert a \Eng{judgment} whose \Eng{content} is $A$, Frege
writes
\begin{center}
  \begin{picture}(30,10)(0,-5)
  \put(0,0){\line(1,0){20}}
  \put(20,-3){$\;A$}
  \linethickness{1.5pt}
  \put(0,-5){\line(0,1){10}}
  \end{picture}  
\end{center}
The vertical bar here is the \Eng{judgment stroke,} while the horizontal
is merely the \Eng{content stroke.}
Frege's notation appears to be the origin of
our own symbol~$\proves$.

I propose to modify
our style of writing formulas by removing excess parentheses.
When this is done, for example, our three 
Axioms become
\begin{gather*}
\sF\to\sG\to\sF,\\
(\sF\to\sG\to \sH)\to(\sF\to\sG)\to\sF\to\sH,\\
(\lnot\sF\to\lnot\sG)\to\sG\to \sF.
\end{gather*}
In this abbreviated system, we can again define formulas inductively,
albeit in a more complicated way.  The set of these formulas can be
called $\PFm'$.
Every formula in $\PFm'$ will be a variable, a negation, or an
implication.  Then:
\begin{mylist}
\item
$\PVar$ is the set of variables in $\PFm'$.
  \item
If $\sF$ is a variable or a negation in $\PFm'$, then $\lnot \sF$ is a
negation in $\PFm'$. 
\item
If $\sF$ is an implication in $\PFm'$, then $\lnot(\sF)$ is a negation
in $\PFm'$.
\item
If $\sF$ is a variable or a negation in $\PFm'$, and $\sG$ is in
$\PFm'$, then $\sF\to \sG$ is an 
implication in $\PFm'$.
\item
If $\sF$ is an implication in $\PFm'$, and $\sG$ is in $\PFm'$, then
$(\sF)\to \sG$ is an implication in $\PFm'$.
\end{mylist}
Thus, no formula by itself will be enclosed in parentheses; but an
implication must be so enclosed when it is negated or used as the
\emph{antecedent} of another implication.  It is left to the reader to
formulate $\PFm'$ as an intersection of sets, so that the analogue of
Theorem~\ref{thm:induction} follows.  
Written as a string of elements of $\PFm'$, the deduction given
earlier as Lines~(\ref{eqn:proof-line1}--\ref{eqn:proof-line5})
becomes
\begin{gather*}
 \sF\to(\sF\to\sF)\to\sF,\\
 (\sF\to(\sF\to\sF)\to\sF)\to
  (\sF\to\sF\to\sF)\to \sF\to \sF,\\
 (\sF\to\sF\to\sF)\to \sF\to \sF,\\
 \sF\to\sF\to\sF,\\
\sF\to\sF.  
\end{gather*}
It is also left to the reader to
formulate and prove an analogue of Theorem~\ref{thm:recursion}, so
that the following can then be proved:

\begin{theorem}\label{thm:simpler}
  There is a unique bijection $\sF\mapsto\overline{\sF}$ from $\PFm$
  to $\PFm'$ such that
  \begin{mylist}
    \item
$\overline{\sP}=\sP$ for all variables $\sP$;
\item
$\overline{\lnot\sF}=
  \begin{cases}
    \lnot\overline{\sF},&\text{ if $\sF$ is a variable or
  negation;}\\
    \lnot(\overline{\sF}),&\text{ if $\sF$ is an
  implication;}
  \end{cases}$
\item
$\overline{(\sF\to\sG)}=
  \begin{cases}
      \overline{\sF}\to\overline{\sG},&\text{ if $\sF$ is
  a variable or negation;}\\
      (\overline{\sF})\to\overline{\sG},&\text{ if $\sF$
  is an implication.}
\end{cases}$
  \end{mylist}
The inverse of this map is a function $\sF\mapsto\overline{\sF}$ from
$\PFm'$ to $\PFm$ such that
\begin{mylist}
  \item
$\overline{\sP}=\sP$ for all variables $\sP$;
\item
$\overline{\lnot\sF}=\lnot\overline{\sF}$;
\item
$\overline{\lnot(\sF)}=\lnot\overline{\sF}$;
\item
$\overline{\sF\to\sG}=(\overline{\sF}\to\overline{\sG})$; 
\item
$\overline{(\sF)\to\sG}=(\overline{\sF}\to\overline{\sG})$.
\end{mylist}
\end{theorem}

\begin{proof}
In the notation of Porism~\ref{por:recursion}, let $A$ be the set of
strings of the symbols in $\PVar\cup\{\to,\lnot,(,)\}$, let $h_0$ be
the inclusion of $\PVar$ in $A$, and let
\begin{align*}
  h_1(\sF,\sA)&=
  \begin{cases}
    \lnot\sA,&\text{ if $\sF$ is a variable or negation;}\\
    \lnot(\sA),&\text{ if $\sF$ is an implication;}
  \end{cases}\\
  h_2((\sF,\sA),(\sG,\sB))&=
  \begin{cases}
    \sA\to\sB,&\text{ if $\sF$ is a variable or negation;}\\
    (\sA)\to\sB,&\text{ if $\sF$ is an implication.}
  \end{cases}
\end{align*}
Then the function from $\PFm$ to $\PFm'$ exists uniquely as desired,
by the Porism.  This function is bijective, with inverse as claimed
(details are left to the reader). 
\end{proof}
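The recursion in the theorem can be realized as a printer for formula
trees.  The sketch below assumes formulas are given as nested tuples
\texttt{('var', k)}, \texttt{('not', f)}, \texttt{('imp', f, g)} (our
own convention, not the text's), and writes \texttt{\~{}} and
\texttt{->} for $\lnot$ and $\to$.

```python
def show(f):
    """Write a formula tree in the minimal-parenthesis style of PFm':
    an implication is parenthesized only when it is negated or is
    the antecedent of another implication."""
    if f[0] == 'var':
        return 'P%d' % f[1]
    if f[0] == 'not':
        body = show(f[1])
        return '~(%s)' % body if f[1][0] == 'imp' else '~' + body
    left = show(f[1])
    if f[1][0] == 'imp':
        left = '(%s)' % left
    return left + ' -> ' + show(f[2])

# Axiom 1 with F = P0 and G = P1:
ax1 = ('imp', ('var', 0), ('imp', ('var', 1), ('var', 0)))
```

For example, \texttt{show(ax1)} yields \texttt{P0 -> P1 -> P0}, with no
parentheses at all.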

\begin{exercises}
  \item
Give a precise definition of $\PFm'$.  (One way to proceed might be as
follows:  Let $S$ be as defined on p.~\pageref{strings}, and let
$\family U$ comprise the subsets $N$ of $S\times 2$ such that
\begin{mylist}
  \item
$(\sP,0)\in N$ for each $\sP$ in $\PVar$;
\item
if $(\sA,0)\in N$, then $(\lnot\sA,0)\in N$;
\item
if $(\sA,1)\in N$, then $(\lnot(\sA),0)\in N$;
\item
if $(\sF,0)\in N$, and $(\sG,e)\in N$, then
$(\sF\to\sG,1)\in N$;
\item
if $(\sF,1)\in N$, and $(\sG,e)\in N$, then
$((\sF)\to\sG,1)\in N$.
\end{mylist}
Now extract $\PFm'$.)
\item
Complete the proof of Theorem~\ref{thm:simpler}.
\end{exercises}

\section{Theorems}

Henceforth, let us write propositional formulas in the style of
$\PFm'$ established in the previous section.

A syntactic consequence of $\emptyset$ can be called a
\defn{Theorem.} To express that a formula is a Theorem, we can write
\begin{equation*}
  \proves
\end{equation*}
in front of it, instead of $\emptyset\proves{}$.  Thus, by
Lemma~\ref{lem:FtoF}, we have
\begin{equation*}
  \proves\sF\to\sF.
\end{equation*}
As promised at the end of \S~\ref{sect:syn-entail}, we now start to
develop more
efficient methods of establishing Theorems and other instances of
syntactic entailment.

\begin{theorem}[Deduction]\label{thm:deduction}
$\Sigma\proves\sF\to\sG\Iff\Sigma\cup\{\sF\}\proves\sG$.
\end{theorem}

\begin{proof}
The easy direction is ($\Rightarrow$), which is left to the reader.
The other direction uses strong induction on the lengths of formal
proofs. 

Suppose $\Sigma\cup\{\sF\}\proves\sG$, so that, by
Theorem~\ref{thm:proves}, there is a formal proof
of $\sG$ from $\Sigma\cup\{\sF\}$.  With respect to this proof,
there are three possibilities for $\sG$:

If $\sG$ is an Axiom, or is one of the formulas in $\Sigma$, then
$\Sigma\proves\sG$; but $\sG\to\sF\to \sG$ is an Axiom in any case, so
also $\Sigma\proves\sG\to\sF\to \sG$; hence
$\Sigma\proves\sF\to \sG$ by Detachment.

If $\sG$ is $\sF$, then $\proves\sF\to \sG$ by Lemma
\ref{lem:FtoF}, so $\Sigma\proves\sF\to \sG$ by Lemma~\ref{lem:immediate}. 

Finally, suppose that, in its formal proof, $\sG$ is preceded by $\sH$
and $\sH\to\sG$.  If $\Sigma\proves\sF\to\sH$ and
$\Sigma\proves\sF\to\sH\to\sG$,
then by Self-Distributivity of Implication, and the Rule of
Detachment, $\Sigma\proves\sF\to\sG$, and we are done.  Thus, if
$\Sigma$ does \emph{not} syntactically entail $\sF\to\sG$, then it
fails to entail at least one of $\sF\to\sH\to\sG$ and $\sF\to\sH$.  But each of
the formulas $\sH\to\sG$ and $\sH$ 
has a shorter proof from $\Sigma\cup\{\sF\}$ than $\sG$
does, by Lemma~\ref{lem:pis-proof}.  By strong induction, we are done.
\end{proof}
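The induction in this proof is effective: it converts a formal proof of
$\sG$ from $\Sigma\cup\{\sF\}$, line by line, into a formal proof of
$\sF\to\sG$ from $\Sigma$.  A sketch, again assuming a representation
of formulas as nested tuples \texttt{('var', k)}, \texttt{('not', f)},
\texttt{('imp', f, g)} (our own convention):

```python
def imp(a, b): return ('imp', a, b)

def is_axiom(x):
    """Is x an instance of one of the three Axiom schemes?"""
    if x[0] != 'imp':
        return False
    a, b = x[1], x[2]
    if b[0] == 'imp' and b[2] == a:                      # Axiom 1
        return True
    if (a[0] == 'imp' and a[2][0] == 'imp' and b[0] == 'imp'
            and b[1] == imp(a[1], a[2][1])
            and b[2] == imp(a[1], a[2][2])):             # Axiom 2
        return True
    if (a[0] == 'imp' and a[1][0] == 'not' and a[2][0] == 'not'
            and b == imp(a[2][1], a[1][1])):             # Axiom 3
        return True
    return False

def deduce(proof, f, sigma):
    """Turn a formal proof from sigma | {f} into a formal proof of
    f -> (its last line) from sigma, case by case as in the theorem."""
    out = []
    for i, g in enumerate(proof):
        if g == f:
            # emit the five-line proof of f -> f (as in Lemma FtoF)
            a1 = imp(f, imp(imp(f, f), f))
            step = imp(imp(f, imp(f, f)), imp(f, f))
            out += [a1, imp(a1, step), step, imp(f, imp(f, f)), imp(f, f)]
        elif is_axiom(g) or g in sigma:
            out += [g, imp(g, imp(f, g)), imp(f, g)]
        else:
            # g follows by Detachment from some earlier h and h -> g;
            # f -> h and f -> (h -> g) already occur in `out`
            h = next(y for y in proof[:i] if imp(y, g) in proof[:i])
            a2 = imp(imp(f, imp(h, g)), imp(imp(f, h), imp(f, g)))
            out += [a2, imp(imp(f, h), imp(f, g)), imp(f, g)]
    return out

# example: a proof of P1 from sigma | {P0}, where sigma = {P0 -> P1}
P0, P1 = ('var', 0), ('var', 1)
sigma = {imp(P0, P1)}
out = deduce([P0, imp(P0, P1), P1], P0, sigma)
```

Every line of \texttt{out} is an Axiom instance, a member of
\texttt{sigma}, or follows by Detachment from earlier lines of
\texttt{out}, and the last line is $\sF\to\sG$.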

\begin{lemma}\label{lem:several}
The following formulas are Theorems:
\begin{mylist}
\item\label{item:contrad}
$\lnot \sG\to \sG\to \sF$;
\item\label{item:double-neg}
$\lnot\lnot \sF\to \sF$;
\item\label{item:other-way}
$\sF\to\lnot\lnot \sF$;
\item\label{item:other-contrap}
$(\sF\to \sG)\to \lnot \sG\to\lnot \sF$;
\item\label{item:imp}
$\sF\to\lnot \sG\to\lnot(\sF\to \sG)$;
\item\label{item:two-cases}
$(\sF\to \sG)\to(\lnot \sF\to \sG)\to \sG$.
\end{mylist}
\end{lemma}

\begin{proof}
The following is a formal proof from $\lnot \sG$: 
\begin{equation*}
\lnot \sG\fcom 
\lnot \sG\to \lnot \sF\to \lnot \sG\fcom 
\lnot \sF\to \lnot \sG\fcom 
(\lnot \sF\to\lnot\sG)\to \sG\to \sF\fcom 
\sG\to \sF.
\end{equation*}
So $\lnot \sG\proves\sG\to \sF$.
By the Deduction Theorem, $\proves\lnot\sG\to\sG\to\sF$.

As a special case of what we have just shown, we have
$\lnot\lnot \sF\proves\lnot \sF\to\lnot\lnot\lnot \sF$.  From
Contraposition, we
get $\lnot\lnot \sF\proves\lnot\lnot \sF\to \sF$; then, by both
directions of the Deduction Theorem, we get $\lnot\lnot
\sF\proves\sF$, then $\proves\lnot\lnot \sF\to\sF$.

The remaining parts are left to the reader.
\end{proof}

\begin{exercises}
  \item
Prove the easy direction of the Deduction
Theorem,~\ref{thm:deduction}, and supply missing details of the proof
of the other direction.
\item
Prove the remainder of Lemma~\ref{lem:several}.
\end{exercises}

\section{Logical entailment}\label{sect:logical}

A \defn{truth-assignment} is a function from $\PVar$ to $2$.  Let
$\epsilon$ be such a function.  It determines a
substitution
$\sF\mapsto\sF(\epsilon)$ as in \S~\ref{sect:induction}, although $0$
and $1$
are not formulas in $\PFm$.  By recursion, the truth-assignment $\epsilon$
uniquely determines a function $\sF\mapsto\named{\sF}(\epsilon)$ as
follows (where $+$ and $\cdot$ are as in $\F_2$):
\begin{equation*}
  \named{\sF}(\epsilon)=
  \begin{cases}
  \epsilon(\sP),&\text{ if $\sF$ is $\sP$;}\\
  1+\named{\sG}(\epsilon),&\text{ if $\sF$ is $\lnot\sG$;}\\
% 1+\named{\sG}(\epsilon)\cdot(1+\named{\sH}(\epsilon)),
 1+\named{\sG}(\epsilon)+\named{\sG}(\epsilon)\cdot\named{\sH}(\epsilon),
&\text{ if
  $\sF$ is $\sG\to\sH$.}
  \end{cases}
\end{equation*}
%For $\epsilon(P_k)$, we can write $e_k$.  
A formula $\sF$ can be called
\defn{$n$-ary} if each variable that is an entry in $\sF$ belongs to
the set $\{P_k:k<n\}$.  In this case, $\named{\sF}(\epsilon)$ depends only on
the $n$-tuple 
%$(e_0,\dots,e_{n-1})$. 
$(\epsilon(P_0),\dots,\epsilon(P_{n-1}))$.
(This is obvious, but can be
confirmed by induction on formulas.)  
Denoting this $n$-tuple more briefly by $\tuple e$, we may write
\begin{equation*}
  \named{\sF}(\tuple e)
\end{equation*}
instead of $\named{\sF}(\epsilon)$.
We may then refer to $\tuple e$ as an
   \defn{$n$-ary truth-assignment.}  The number $\named{\sF}(\tuple
   e)$ is the \defn{truth-value} of $\sF$ with respect to $\epsilon$ or
   $\tuple e$.  In particular, $\sF$ is \defn{true in} $\epsilon$ (or
   $\tuple e$) if
   $\named{\sF}(\epsilon)=1$; otherwise, $\sF$ is \defn{false in}
   $\epsilon$. 

The truth-values of $\sF$ with respect to all
   truth-assignments can be given in a \defn{truth-table} with $2^n$
   rows. 
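The recursion defining $\named{\sF}(\epsilon)$ is immediately
computable.  A sketch in Python, where formulas are nested tuples
\texttt{('var', k)}, \texttt{('not', f)}, \texttt{('imp', f, g)} (our
own convention) and arithmetic is modulo $2$, as in $\F_2$:

```python
from itertools import product

def value(f, e):
    """The truth-value of f under the n-ary truth-assignment e,
    computed by the recursion in the text (arithmetic in F_2)."""
    if f[0] == 'var':
        return e[f[1]]
    if f[0] == 'not':
        return (1 + value(f[1], e)) % 2
    g, h = value(f[1], e), value(f[2], e)
    return (1 + g + g * h) % 2

def truth_table(f, n):
    """The 2^n rows of the truth-table of an n-ary formula."""
    return [(e, value(f, e)) for e in product((0, 1), repeat=n)]
```

For example, \texttt{truth\_table} applied to $P_0\to P_1$ yields a
value of $0$ only in the row $(1,0)$.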

The following may seem obvious, once it is understood:

   \begin{theorem}[Associativity]\label{thm:associativity}
     Suppose $\sF$ is an $n$-ary formula, and $\sH$ is the formula
     $\sF(\sG_0,\dots,\sG_{n-1})$, and $\tuple e$ and $\tuple f$ are
     truth-assignments (of appropriate arity) such that
     \begin{equation*}
       \named{\sG}_k(\tuple e)=f_k
     \end{equation*}
for each $k$ in $n$.  Then
\begin{equation*}
  \named{\sF}(\tuple f)=\named{\sH}(\tuple e).
\end{equation*}
   \end{theorem}

   \begin{proof}
     We use induction on $\sF$.  If $\sF$ is a variable, then it is
     $P_k$ for some $k$ in $n$, so $\sH$ is $\sG_k$, and
     \begin{equation*}
       \named{\sH}(\tuple e)=
       \named{\sG}_k(\tuple e)=f_k=\named{P}_k(\tuple f)=\named{\sF}(\tuple f).
     \end{equation*}
Suppose the claim is true when $\sF$ is $\sF_0$ or $\sF_1$.  If now
$\sF$ is $\lnot\sF_0$, then $\sH$ is
$\lnot\sF_0(\sG_0,\dots,\sG_{n-1})$, which we can write as 
$\lnot\sH_0$, so that
\begin{align*}
\named{\sH}(\tuple e)
&=1+\named{\sH}_0(\tuple e)\\
&=1+\named{\sF}_0(\tuple f)\qquad\text{[by inductive hypothesis]}\\
&=\named{\sF}(\tuple f).
   \end{align*}
The remaining case, where $\sF$ is $(\sF_0\to\sF_1)$, is left to the
reader. 
   \end{proof}

In the present context, we can think of truth as a \emph{relation}
from $2^{\PVar}$ to $\PFm$, namely the relation
\begin{equation*}
  \{(\xi,X)\in 2^{\PVar}\times\PFm:\named X(\xi)=1\}.
\end{equation*}
We may denote this relation by
\begin{equation*}
  \models.
\end{equation*}
Hence, instead of $\named{\sF}(\epsilon)=1$, we may write
\begin{equation}\label{eqn:models}
  \epsilon\models\sF.
\end{equation}
The complement of the truth-relation can be denoted
\begin{equation*}
  \nmodels.
\end{equation*}
Hence we can express a fundamental fact as follows:

\begin{lemma}\label{lem:tf}
For all truth-assignments $\epsilon$ and formulas $\sF$, we have
  \begin{equation}\label{eqn:tf}
     \epsilon\models\sF\Iff \epsilon\nmodels\lnot\sF;
   \end{equation}
likewise,
   \begin{equation}\label{eqn:ft}
     \epsilon\nmodels\sF\Iff \epsilon\models\lnot\sF.
   \end{equation}
\end{lemma}

\begin{proof}
  Suppose $e\in2$.  Then $e=1\Iff e\neq0\Iff e+1=0$.  Let $\sG$ be
  $\lnot\sF$.  Then 
  \begin{align*}
\epsilon\models\sF
&\Iff \named{\sF}(\epsilon)=1\\
&\Iff1+\named{\sF}(\epsilon)=0\\
&\Iff\named{\sG}(\epsilon)=0\\
&\Iff\named{\sG}(\epsilon)\neq1\\
&\Iff\epsilon\nmodels\sG\\
&\Iff\epsilon\nmodels\lnot\sF.
  \end{align*}
The other equivalence follows immediately.
\end{proof}

From the truth-relation, we obtain three new functions, as follows.
\begin{mylist}
  \item
A \defn{model} of a set of formulas is a truth-assignment in which
every element of the set is true.  If $\Sigma$ is a set of formulas,
let 
\begin{equation*}
  \Mod{\Sigma}
\end{equation*}
be the set of its models.  This is the set
\begin{equation*}
  \bigcap_{X\in\Sigma}\{\xi\in 2^{\PVar}:\xi\models X\}. 
\end{equation*}
We now have a function $\Xi\mapsto\Mod\Xi$ from $\pow{\PFm}$ to
$\pow{2^{\PVar}}$. 
\item
The \defn{theory} of a set of truth-assignments is the set of formulas
that are true in all of the truth-assignments.  If $A$ is a set of
truth-assignments, let
\begin{equation*}
  \Th A
\end{equation*}
be its theory.  This is the set
\begin{equation*}
  \bigcap_{\xi\in A}\{X\in\PFm:\xi\models X\}.
\end{equation*}
So we have a function $\Xi\mapsto\Th\Xi$ from $\pow{2^{\PVar}}$ to
$\pow{\PFm}$. 
\item
The \defn{logical consequences} of a set of formulas are the formulas
that are true in every model of the original set.  The logical
consequences of $\Sigma$ compose a set
\begin{equation*}
  \conseq\Sigma.
\end{equation*}
This is the set $\bigcap_{\xi\in\Mod\Sigma}\{X\in\PFm:\xi\models X\}$,
which is
\begin{equation*}
  \Th{\Mod\Sigma}.
\end{equation*}
So we have a singulary operation $\Xi\mapsto\conseq\Xi$ on
$\pow{\PFm}$. 
\end{mylist}
We shall not study the operation $\Xi\mapsto\Mod{\Th\Xi}$ on
$\pow{2^{\PVar}}$. 
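Restricted to $n$-ary formulas and $n$-ary truth-assignments, the
three functions can be computed by brute force.  A sketch (formulas as
nested tuples \texttt{('var', k)}, \texttt{('not', f)},
\texttt{('imp', f, g)}, our own convention; $\Th$ is restricted to a
given finite stock of formulas, since $\PFm$ itself is infinite):

```python
from itertools import product

def value(f, e):
    # the F_2 recursion defining the truth-value of f under e
    if f[0] == 'var':
        return e[f[1]]
    if f[0] == 'not':
        return (1 + value(f[1], e)) % 2
    g, h = value(f[1], e), value(f[2], e)
    return (1 + g + g * h) % 2

def mod(sigma, n):
    """Mod(sigma), restricted to n-ary truth-assignments."""
    return {e for e in product((0, 1), repeat=n)
            if all(value(f, e) == 1 for f in sigma)}

def th(assignments, stock):
    """Th(A), restricted to a finite stock of candidate formulas."""
    return {f for f in stock if all(value(f, e) == 1 for e in assignments)}

def con(sigma, stock, n):
    """Con(sigma) = Th(Mod(sigma)), under the same restrictions."""
    return th(mod(sigma, n), stock)
```

For instance, within the stock $\{P_0,\;P_1,\;P_1\to P_0\}$, the
consequences of $\{P_0\}$ are $P_0$ and $P_1\to P_0$, but not $P_1$.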

If $T$ is a set of formulas that is the theory of \emph{some} set of
truth-assignments, then $T$ can be called a \defn{theory,} simply.

If $\sF$ is a logical consequence of $\Sigma$, we may say also that
$\Sigma$ \defn{logically entails} $\sF$.  So we have several ways of
saying the same thing:
\begin{mylist}
  \item
 $\sF$ is a logical consequence of $\Sigma$;
\item
$\Sigma$ logically entails $\sF$;
\item
$\sF\in\conseq\Sigma$.
\end{mylist}
A fourth way is $\Sigma\models\sF$; but I shall avoid this notation,
lest it be confused with the notation introduced on
Line~\eqref{eqn:models}, which has a different meaning.

The logical consequences of $\emptyset$ are called \defn{tautologies;}
these are the formulas that are true in \emph{every} truth-assignment.

Note well that the definition of 
logical entailment is not inductive: there is (at the moment) no obvious way
to prove by induction that a given set of formulas contains all
logical consequences of $\Sigma$ (or even all tautologies).

\begin{lemma}\label{lem:reversing}
  The operations $\Xi\mapsto\Mod\Xi$ and $\Xi\mapsto\Th\Xi$ are
  inclusion-reversing, that is,
  \begin{mylist}
    \item
$\Sigma\included\Tau\implies\Mod\Tau\included\Mod\Sigma$, and
\item
$A\included B\implies\Th B\included \Th A$,
  \end{mylist}
for all sets
$\Sigma$ and $\Tau$ of formulas, and all sets $A$ and $B$
  of truth-assign\-ments.
\end{lemma}

\begin{proof}
  This is a purely set-theoretic fact, as the reader should check.
\end{proof}

\begin{theorem}\label{thm:closure}
  Let $\Sigma$ and $\Tau$ be subsets of $\PFm$.
  \begin{mylist}
\item
    $\Sigma\included\conseq\Sigma$.
\item
If $\Sigma\included\Tau$, then $\conseq\Sigma\included\conseq\Tau$.
\item
$\conseq{\conseq\Sigma}=\conseq\Sigma$.
  \end{mylist}
\end{theorem}

\begin{proof}
  In a model of $\Sigma$, every element of $\Sigma$ is true by
  definition. This proves the first claim.

For the second claim, use Lemma~\ref{lem:reversing} twice.

For the last claim, we have
$\conseq\Sigma\included\conseq{\conseq\Sigma}$
by the first claim.  Suppose now $\sF\in\conseq{\conseq\Sigma}$, and
$\epsilon$ is a model of $\Sigma$.  Then $\epsilon$ is a model of
$\conseq\Sigma$, so $\epsilon\models\sF$.  Thus
$\sF\in\conseq\Sigma$. 
\end{proof}

\begin{exercises}
  \item
Complete the proof of Theorem~\ref{thm:associativity}.
\item
Use Lemma~\ref{lem:tf} to show
$\epsilon\models\sF\Iff\epsilon\models\lnot\lnot\sF$. 
\item
Prove Lemma~\ref{lem:reversing}.
\item
Prove that $\Sigma$ is a theory if and only if $\conseq\Sigma=\Sigma$.
\item
Can you find a formula $\sF$ such that $\conseq{\{\sF\}}=\PFm$?
\item
Can you find a formula $\sG$ such that $\conseq{\{\sG\}}=\emptyset$?
\item
Suppose $\sH\in\conseq{\{\sF\to\sG\}}$.  Does it follow that $\sH$ is
a logical consequence of $\lnot\sF$ or of $\sG$?
\item
Suppose $\sH$ logically entails either $\lnot\sF$ or $\sG$; does it
entail $\sF\to\sG$?
\item
If $\sH\in\conseq{\{\lnot\sF\}}\cup\conseq{\{\sG\}}$, must
$\sH\in\conseq{\{\sF\to\sG\}}$?
\item
If $\sH\in\conseq{\{\lnot\sF\}}\cap\conseq{\{\sG\}}$, must
$\sH\in\conseq{\{\sF\to\sG\}}$?
\item\label{exercise:caps}
Show that
 $\Mod{\bigcup_{i\in I}\Sigma_i}=\bigcap_{i\in I}\Mod{\Sigma_i}$.
\item\label{exercise:cups}
Show that $\Mod{\{\sF\}}\cup\Mod{\{\sG\}}=\Mod{\{\lnot \sF\to \sG\}}$.
\item
Show that $\Mod{\{\sF\}}\comp=\Mod{\{\lnot \sF\}}$.
\end{exercises}

\section{Compactness}

A set of formulas with a model can be called \defn{satisfiable.}  

\begin{lemma}\label{lem:sat}
$\Sigma$ logically entails $\sF$ if and only if
  $\Sigma\cup\{\lnot\sF\}$ is not satisfiable. 
\end{lemma}

\begin{proof}
Suppose $\Sigma$ does \emph{not} logically entail $\sF$.
Then $\Sigma$ has a model $\epsilon$ in which $\sF$ is false.  Hence
$\epsilon\models\lnot\sF$ by Lemma~\ref{lem:tf},
\eqref{eqn:ft},  so
$\epsilon$ is a model of $\Sigma\cup\{\lnot\sF\}$.

Suppose conversely that $\Sigma\cup\{\lnot\sF\}$ has a model.  Then
$\sF$ is false in this model, again by Lemma~\ref{lem:tf},
\eqref{eqn:ft}, so $\sF$ is not a logical consequence of $\Sigma$. 
\end{proof}

A
set of formulas whose every \emph{finite} subset has a model can be
called \defn{finitely satisfiable.}

\begin{lemma}
  If $\Sigma$ is finitely satisfiable, then the same is true of
  $\Sigma\cup\{\sF\}$ or $\Sigma\cup\{\lnot\sF\}$.
\end{lemma}

\begin{proof}
  Suppose neither $\Sigma\cup\{\sF\}$
nor $\Sigma\cup\{\lnot\sF\}$ is finitely satisfiable.  Then $\Sigma$
has finite subsets $\Tau_0$ and $\Tau_1$ such that neither
$\Tau_0\cup\{\sF\}$ nor $\Tau_1\cup\{\lnot\sF\}$ is satisfiable.  Then
also $\Tau_0\cup\{\lnot\lnot\sF\}$ is not satisfiable (why?); hence
$\lnot\sF\in\conseq{\Tau_0}$ and $\sF\in\conseq{\Tau_1}$, by
Lemma~\ref{lem:sat}.  Therefore, every model $\epsilon$ of
$\Tau_0\cup\Tau_1$ is a
model of $\Tau_0$ and $\Tau_1$ (by Lemma~\ref{lem:reversing}), hence
$\epsilon$
is a model of $\{\sF,\lnot\sF\}$.  There can be no such models
$\epsilon$ (why not?); so 
$\Tau_0\cup\Tau_1$ is not satisfiable.  But this is a finite subset of
$\Sigma$; hence $\Sigma$ is not finitely satisfiable.  
\end{proof}


\begin{theorem}[Compactness]
Every finitely satisfiable set of formulas is satisfiable.
\end{theorem}

\begin{proof}
Let $\Sigma$ be finitely satisfiable.
By strong recursion, we first define a function $n\mapsto\sF_n$ from
$\varN$ into $\PFm$.  Suppose $\{\sF_k:k<n\}$ has been defined.  We then
let $\sF_n$ be $P_n$, if $\Sigma\cup\{\sF_k:k<n\}\cup\{P_n\}$ is
finitely satisfiable; otherwise, $\sF_n$ is $\lnot P_n$.  This completes
the recursive definition.

We now observe by induction that every set $\Sigma\cup\{\sF_k:k<n\}$
is finitely satisfiable.  Indeed, it is true by assumption when $n=0$;
and if it is true when $n=m$, then it is true when $n=m+1$, by the
last lemma and the definition of the $\sF_k$.

Every finite subset of $\Sigma\cup\{\sF_k:k\in\varN\}$ is a finite
subset of  $\Sigma\cup\{\sF_k:k<n\}$ for some $n$.  We have just seen
that $\Sigma\cup\{\sF_k:k<n\}$ is finitely satisfiable; therefore the
whole set  $\Sigma\cup\{\sF_k:k\in\varN\}$ is finitely satisfiable.

Now let $\epsilon$ be the truth-assignment given by
\begin{equation}
  \epsilon(P_k)=
  \begin{cases}
    1,& \text{ if }\sF_k=P_k;\\
0,& \text{ if }\sF_k=\lnot P_k.
  \end{cases}
\end{equation}
This is a model of $\Sigma$.  Indeed, suppose $\sG\in\Sigma$.  Then
$\sG$ is $n$-ary for some $n$.  The finite set
$\{\sG\}\cup\{\sF_k:k<n\}$ has a
model $\zeta$.  In particular, $\zeta$ must agree with $\epsilon$ on
$\{P_k:k<n\}$ (why?); so $\epsilon\models\sG$.
\end{proof}

There are sets $\Sigma$ of formulas such that \emph{every} finite
subset of $\Sigma$ has a model that is not a model of $\Sigma$ itself.

\begin{example}\label{example:compactness}
  Let $\Sigma_n$ comprise the formulas
  \begin{equation*}
P_0\to P_1\to\dotsb\to P_k,
  \end{equation*}
where $k<n$.  So $\Sigma_0$ is empty, and $\Sigma_1=\{P_0\}$, and we
have a chain
\begin{equation*}
  \Sigma_0\included\Sigma_1\included\Sigma_2\included\dotsb.
\end{equation*}
Let $\Sigma=\bigcup_{n\in\varN}\Sigma_n$.  Then every finite subset of
$\Sigma$ is a subset of some $\Sigma_n$.  Let
$\epsilon_n$ be the truth-assignment such that
\begin{equation*}
  \epsilon_n(P_k)=1\Iff k<n.
\end{equation*}
Then $\epsilon_n$ is a model of $\Sigma_n$, but not of $\Sigma_{n+1}$
(why?), hence not of $\Sigma$.
\end{example}
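The claims of this example can be checked mechanically for small $n$.
A sketch, with formulas as nested tuples \texttt{('var', k)},
\texttt{('not', f)}, \texttt{('imp', f, g)} (our own convention):

```python
def value(f, e):
    # the F_2 recursion for the truth-value of f under e
    if f[0] == 'var':
        return e[f[1]]
    if f[0] == 'not':
        return (1 + value(f[1], e)) % 2
    g, h = value(f[1], e), value(f[2], e)
    return (1 + g + g * h) % 2

def chain(k):
    """The formula P_0 -> P_1 -> ... -> P_k (grouped to the right)."""
    f = ('var', k)
    for i in range(k - 1, -1, -1):
        f = ('imp', ('var', i), f)
    return f

def eps(n, m):
    """The assignment eps_n, on the variables P_0, ..., P_{m-1}."""
    return tuple(1 if k < n else 0 for k in range(m))
```

For each small $n$, the assignment $\epsilon_n$ satisfies every
formula of $\Sigma_n$ but falsifies $P_0\to\dotsb\to P_n$.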

If a set $A$ is a \emph{finite} subset of a set $B$, we may denote
this by
\begin{equation*}
  A\fincluded B.
\end{equation*}
Now one consequence of the Compactness Theorem can be expressed as
follows: 

\begin{corollary}\label{cor:finitary}
  $\displaystyle\conseq{\Sigma}=\bigcup_{\Xi\fincluded\Sigma}\conseq\Xi$.
\end{corollary}

\begin{proof}
  By Theorem~\ref{thm:closure}, it is enough to show that
  \begin{equation*}
      \conseq{\Sigma}\included\bigcup_{\Xi\fincluded\Sigma}\conseq\Xi.
  \end{equation*}
  Suppose $\sF$ is \emph{not} a member of the union.  Then,
  for each finite subset $\Tau$ of $\Sigma$, the set $\conseq{\Tau}$
  does not contain $\sF$, and so the set
  $\Tau\cup\{\lnot\sF\}$ is satisfiable, by Lemma~\ref{lem:sat}.  This
  means $\Sigma\cup\{\lnot\sF\}$ is finitely satisfiable; so it is
  satisfiable, by the Compactness Theorem.  Therefore
  $\sF\notin\conseq\Sigma$, again by Lemma~\ref{lem:sat}. 
\end{proof}

\begin{exercises}
  \item
If $\Tau\cup\{\sF\}$ is not satisfiable, why is
$\Tau\cup\{\lnot\lnot\sF\}$ not satisfiable?
\item
Why has the set $\{\sF,\lnot\sF\}$ no models?
\item
In the proof of the Compactness Theorem, why does $\zeta$ agree with
$\epsilon$ on $\{P_k:k<n\}$?
\item
In Example~\ref{example:compactness}, prove by induction that
$\epsilon_n\in\Mod{\Sigma_n}\setminus\Mod{\Sigma_{n+1}}$.
\item
Suppose $I$ is a set, and there is a function $i\mapsto\sF_i$ from
$I$ into $\PFm$, such that
\begin{equation*}
  \bigcup_{i\in I}\Mod{\{\sF_i\}}=2^{\PVar}.
\end{equation*}
Prove that $I$ has a finite subset $J$ such that $\bigcup_{i\in
  J}\Mod{\{\sF_i\}}=2^{\PVar}$. 
\end{exercises}


\section{Generalizations}

The concepts of the previous section are instances of more general
concepts. 

For an arbitrary set $\Omega$,
a singulary operation $X\mapsto\cl X$ on
$\pow{\Omega}$ is called a  
\defn{closure-operator} on $\Omega$ if it is:
\begin{mylist}
  \item
\defn{increasing} (that is, $A\included\cl A$ for all subsets $A$ of
$\Omega$); 
\item
\defn{monotone} (that is, $\cl A\included\cl B$
  whenever $A\included B\included\Omega$); and
\item
\defn{idempotent} (that is, $\cl{\cl A}=\cl A$ for all $A$ in $\pow{\Omega}$).
\end{mylist}
The terminology is potentially confusing: a closure-operator on
$\Omega$ is an operation on $\pow{\Omega}$ (not on $\Omega$).

The closure-operator $X\mapsto\cl X$ is called \defn{finitary} if
\begin{equation*}
  \cl A=\bigcup_{X\fincluded A}\cl X
\end{equation*}
for all $A$ in $\pow\Omega$.

\begin{examples}{\mbox{}}
\item
If $\Omega$ is a topological space, the function taking a subset of
$\Omega$ to its topological 
closure is a closure-operator on $\Omega$ (usually not finitary).
  \item
If $G$ is a group, the function $X\mapsto\gpgen X$ taking a subset of
$G$ to the group that it generates is a finitary closure-operator on
$G$. 
\item
On $\PFm$, the function
$\Xi\mapsto\conseq\Xi$ is a closure-operator, by
Theorem~\ref{thm:closure}; it is finitary, by
Corollary~\ref{cor:finitary}.  
\item
On any set, the identity-function $X\mapsto X$ is trivially a finitary
closure-operator.
\end{examples}

Closure-operators can arise from a \tech{Galois correspondence}
between two sets.  Suppose $A$ and $B$ are sets, and $R$ is a relation
from $A$ to $B$.  
If $C\included A$, and $D\included B$, let
\begin{gather*}
  C'=\bigcap_{x\in C}\{y\in B:x\mathrel Ry\}=\{y\in B:\Forall x(x\in
  C\to x\mathrel R y)\},\\
  D'=\bigcap_{y\in D}\{x\in A:x\mathrel Ry\}=\{x\in A:\Forall y(y\in
  D\to x\mathrel R y)\}.
\end{gather*}
So we have functions $X\mapsto X'$ from $\pow A$ to $\pow B$, and from
$\pow B$ to $\pow A$.  These functions are inclusion-reversing; so the
functions $X\mapsto X''$ are inclusion-preserving (monotone). 
Moreover,
\begin{align*}
  C''
&=\{x\in A:\Forall y(y\in C'\to x\mathrel R y)\}\\
&=\{x\in A:\Forall y(\Forall z(z\in
  C\to z\mathrel R y)\to x\mathrel R y)\},
\end{align*}
so $C\included C''$; similarly, $D\included D''$.  Replacing $C$ with
$D'$, we get $D'\included D'''$; but since $D\included D''$, we get
also $D'''\included D'$; thus, $D'=D'''$.  Similarly, $C'=C'''$.
Therefore the functions $X\mapsto X''$ are closure-operators on $A$
and $B$.  Also, the functions $X\mapsto X'$ are bijections---each the
inverse of the other---between $\{X':X\included B\}$ and
$\{X':X\included A\}$; the existence of such functions is a
\defn{Galois correspondence.}
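The two prime operations, and the identities just derived, can be
illustrated on a small finite relation.  A sketch (the sets and the
relation below are arbitrary examples of our own):

```python
def rprime(C, R, B):
    """C' = {y in B : x R y for all x in C}."""
    return {y for y in B if all((x, y) in R for x in C)}

def lprime(D, R, A):
    """D' = {x in A : x R y for all y in D}."""
    return {x for x in A if all((x, y) in R for y in D)}

# a small example of a relation R from A to B
A, B = {0, 1, 2}, {'a', 'b', 'c'}
R = {(0, 'a'), (0, 'b'), (1, 'b'), (2, 'c')}
C = {0, 1}
Cp = rprime(C, R, B)    # C'
Cpp = lprime(Cp, R, A)  # C''
```

Here one can confirm that $C\included C''$ and that $C'''=C'$, as the
general argument predicts.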

Exercises \ref{exercise:caps} and \ref{exercise:cups} in
\S~\ref{sect:logical} show that the sets $\Mod\Sigma$ are the
\defn{closed} sets in a \defn{topology} on $2^{\PVar}$.  Then the
Compactness Theorem can be understood as the topological statement
that this topology is compact.



\section{Completeness}

An arbitrary singulary operation $\Xi\mapsto\Pi(\Xi)$ on $\pow{\PFm}$
can be called
\begin{mylist}
  \item
\defn{sound,} if always $\Pi(\Sigma)\included\conseq\Sigma$;
\item
\defn{complete,} if always $\conseq\Sigma\included\Pi(\Sigma)$.
\end{mylist}

We shall show that logical entailment is the same
relation as syntactic entailment; that is, the operation
\begin{equation*}
  \Xi\longmapsto\{X\in \PFm:\Xi\proves X\}
\end{equation*}
on $\pow{\PFm}$ is sound and complete.

\begin{theorem}[Soundness]
  If $\Sigma\proves\sF$, then $\sF\in\conseq\Sigma$.
\end{theorem}

\begin{proof}
  We use induction on the set of syntactic consequences of $\Sigma$ to
  show that it is a subset of $\conseq\Sigma$. 
  All Axioms are tautologies; hence they are logical consequences of
  $\Sigma$ by Theorem~\ref{thm:closure}.  By the same theorem,  
all elements of $\Sigma$ are logical
  consequences of $\Sigma$.  Finally, suppose $\sF$ and $\sF\to\sG$
  are logical consequences of $\Sigma$, and $\epsilon$ is a model of
  $\Sigma$.  Then $\named{\sF}(\epsilon)=1$.  Also, writing $\sH$ for
  $\sF\to\sG$, we have
  \begin{equation*}
    1=\named{\sH}(\epsilon)=1+\named{\sF}(\epsilon)+
    \named{\sF}(\epsilon)\cdot\named{\sG}(\epsilon)
    =1+1+1\cdot\named{\sG}(\epsilon)= \named{\sG}(\epsilon),
  \end{equation*}
so $\epsilon\models\sG$.  This completes the induction and the proof.
\end{proof}

Proving completeness will take more work.\footnote{The following lemma
  corresponds to one found in Church \cite[*151, p.~98]{MR18:631a};
  the origin is not clear.}

\begin{lemma}\label{lem:eval}
Suppose $\tuple e$ is an $n$-ary truth-assignment, and suppose
$\Sigma$ is a set of formulas that, for each $k$ in $n$, contains
$\begin{cases}
    P_k,&\text{ if }e_k=1;\\
    \lnot P_k,&\text{ if }e_k=0.
  \end{cases}$
Then
\begin{equation*}
  \Sigma\proves \sF'
\end{equation*}
for all $n$-ary formulas $\sF$, where $\sF'$ is the formula
$\begin{cases}
\sF,&\text{ if }\named{\sF}(\tuple e)=1;\\
\lnot \sF, &\text{ if }\named{\sF}(\tuple e)=0.
 \end{cases}$
\end{lemma}


\begin{proof}
When $k<n$, let $\sP_k'=\begin{cases}
    P_k,&\text{ if }e_k=1;\\
    \lnot P_k,&\text{ if }e_k=0.
  \end{cases}$
Suppose $\{\sP_k':k<n\}\included\Sigma$.  Let $\sF'$ be defined as in
the statement of the theorem.  We proceed by induction on $n$-ary
formulas.  

If $\sF$ is a variable $P_k$, where $k<n$, then $\sF'$ is $\sP_k'$,
which is in $\Sigma$, so $\Sigma\proves \sF'$. 

Suppose $\Sigma\proves\sG'$, and $\sF$ is
$\lnot\sG$.  There are two cases to consider. 
\begin{mylist}
  \item
If $\named{\sF}(\tuple e)=1$, then $\named{\sG}(\tuple
e)=0$, so $\sF'$ is $\sF$, but $\sG'$ is $\lnot\sG$, which is $\sF$,
that is, $\sF'$.  
\item
If
$\named{\sF}(\tuple e)=0$, then $\named{\sG}(\tuple e)=1$, so $\sG'$
is $\sG$, but $\sF'$ is $\lnot\sF$, which is $\lnot\lnot\sG$, that is,
$\lnot\lnot\sG'$. 
\end{mylist}
In either case, we have $\proves\sG'\to\sF'$, by
Lemmas~\ref{lem:FtoF} and~\ref{lem:several}; hence $\Sigma\proves\sF'$
by inductive
hypothesis and Detachment.

Suppose finally that $\Sigma\proves\sG'$ and $\Sigma\proves\sH'$, and
$\sF$ is $\sG\to\sH$.
There are three cases to
consider:
\begin{mylist}
  \item
$\named{\sG}(\tuple e)=0$;
\item
$\named{\sH}(\tuple e)=1$;
\item
 $\named{\sG}(\tuple e)=1$ and $\named{\sH}(\tuple e)=0$.
\end{mylist}
Details are left to the reader.  This completes the proof.
\end{proof}

\begin{theorem}[Completeness]
If $\sF\in\conseq\Sigma$, then $\Sigma\proves\sF$.
\end{theorem}

\begin{proof}
Suppose $\sF\in\conseq\Sigma$.  By Compactness (rather,
Corollary~\ref{cor:finitary}), $\Sigma$ has a finite subset $\Tau$
such that $\sF\in\conseq\Tau$.  Write $\Tau$ as
$\{\sF_0,\dots,\sF_{m-1}\}$, and $\sF$ as $\sF_m$.  Then the formula
\begin{equation*}
  \sF_0\to\dotsb\to\sF_m
\end{equation*}
is a tautology (the proof of this is left to the reader).  Call this
tautology $\sG$, and suppose it is $n$-ary.  We shall show by
induction on $n$ that $\sG$ is a Theorem.

Let
$\sP_k'\in\{P_k,\lnot P_k\}$ for each $k$ in $n$.  By the previous
lemma, we have
\begin{equation}\label{eqn:completeness}
\sP_0',\dots,\sP_{n-1}'\proves \sG.
\end{equation}
If $n=0$, we are done.  (However, there are no nullary formulas, since
every formula contains at least one variable.)
Suppose that $\sG$ is a Theorem under the assumption that $n=\ell$; but
now suppose $n=\ell+1$.  From Entailment~\eqref{eqn:completeness} in
this case,
by the Deduction Theorem (\ref{thm:deduction}), remembering that
$\sP_{\ell}'$ can be either $P_{\ell}$ or $\lnot P_{\ell}$, we have
\begin{align*}
\sP_0',\dots,\sP_{\ell-1}'&\proves P_{\ell}\to \sG,\\
\sP_0',\dots,\sP_{\ell-1}'&\proves \lnot P_{\ell}\to \sG,
\end{align*}
so $\sP_0',\dots,\sP_{\ell-1}'\proves \sG$ by
Lemma~\ref{lem:several}, part~\ref{item:two-cases}, and Detachment.
By inductive hypothesis, $\sG$ is a Theorem.  Finally, since
$\proves\sG$, while each $\sF_k$ with $k<m$ belongs to $\Sigma$, we
obtain $\Sigma\proves\sF_m$, that is, $\Sigma\proves\sF$, by $m$
applications of Detachment.
\end{proof}
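The fact used in this proof, and left to the reader in the exercises,
namely that $\sF_m\in\conseq{\{\sF_0,\dots,\sF_{m-1}\}}$ exactly when
$\sF_0\to\dotsb\to\sF_m$ is a tautology, can be checked by brute force
in small cases.  A sketch, with formulas as nested tuples
\texttt{('var', k)}, \texttt{('not', f)}, \texttt{('imp', f, g)} (our
own convention):

```python
from itertools import product

def value(f, e):
    # the F_2 recursion for the truth-value of f under e
    if f[0] == 'var':
        return e[f[1]]
    if f[0] == 'not':
        return (1 + value(f[1], e)) % 2
    g, h = value(f[1], e), value(f[2], e)
    return (1 + g + g * h) % 2

def tautology(f, n):
    """Is f true in every n-ary truth-assignment?"""
    return all(value(f, e) == 1 for e in product((0, 1), repeat=n))

def entails(premises, f, n):
    """Is f true in every n-ary model of the premises?"""
    return all(value(f, e) == 1
               for e in product((0, 1), repeat=n)
               if all(value(g, e) == 1 for g in premises))

def nest(formulas):
    """The formula F_0 -> F_1 -> ... -> F_m, grouped to the right."""
    f = formulas[-1]
    for g in reversed(formulas[:-1]):
        f = ('imp', g, f)
    return f
```

For instance, $\{P_0\to P_1,\;P_0\}$ logically entails $P_1$, and
correspondingly $(P_0\to P_1)\to P_0\to P_1$ is a tautology.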

\begin{exercises}
\item
Complete the proof of Lemma~\ref{lem:eval}.
\item
Prove by induction that, if
$\sF_m\in\conseq{\{\sF_0,\dots,\sF_{m-1}\}}$, then the formula
$\sF_0\to\dotsb\to\sF_m$ is a tautology.
\end{exercises}
