\documentclass[draft]{article}

\input theorem
\input prooftree
\input lineardvi

\def\rtype{{\mathrm{type}}}
\def\rctx{\ {\mathrm{ctx}}}
\def\rkind{{\mathrm{kind}}}
\def\rok{{\mathrm{ok}}}
\def\rid{\mathsf{id}}
\def\tcheck{\Leftarrow}
\def\synth{\Rightarrow}

\def\easyrule#1#2{
\begin{prooftree}
#1 \justifies #2
\end{prooftree}}
\title{TinyLF: A Compact Definition of a Logical Framework}
\author{Jason Reed}
\begin{document}
\maketitle
\section{Introduction}
By pushing some modern ideas concerning LF to their logical
conclusion, one can obtain a very concise definition of a logical
framework, which has a comparably simple metatheory. This logical
framework generalizes LF slightly by including a limited form of
polymorphism.  The main point, however, is not this extension per se,
but that we can support the {\em same} feature set as LF in a much
more compact way.

The central ideas are naturally the recently ubiquitous notion of {\em
restricting attention to only canonical forms} of
$\lambda$-expressions, and the consequent need to define {\em
hereditary substitution}, pioneered by Watkins et al.\ in CLF.

Another main contributing idea is that of spine form, which
considers not just single function applications but applications of
heads to entire vectors of arguments at once. Spine form still
describes function abstractions in terms of one-at-a-time
$\lambda$-binding of variables. The motivation is to find a convenient
syntax in which variable binding also takes place in a `vectorized'
fashion. This is obviously related to the idea of pattern-matching in
functional languages. However, in the present case, given the usual
normal/atomic distinction that serves to enforce $\beta$-normal,
$\eta$-long terms, we are able to demand that functions provide the
{\em entire} pattern of their arguments, coalescing all potential
curried arguments into one sequence.

But the other salient recent piece of work where sequences of terms play an
important role is the Contextual Modal Type Theory designed to give a
logical account of operations such as raising and lowering of
metavariables in unification. In it, modal variables appear under {\em
substitutions} that fill in the variable's arguments all at once.  We
take essentially the same approach while dropping the {\em closedness}
requirement on substitutions for modal variables, making our variables
still `contextual' but not modal.

\section{Syntax}
The complete syntax of the language is as follows:
$$\begin{tabular}{rlll}
Base Classifiers&$v$&$::=$&$R \celse \rtype$\\
Classifiers&$V$&$::=$&$\Pi \Psi .v$\\
Atomic Terms&$R$&$::=$&$x[\sigma]$\\
Terms&$M$&$::=$&$ \lambda \hat \Psi  . R$\\
Substitutions&$ \sigma $&$::=$&$\rid \celse M . \sigma$\\
Contexts&$\Gamma$&$::=$&$\cdot \celse\Gamma, x : V$\\
\end{tabular}$$

The syntax $\hat \Psi$ stands for just the sequence of variables found
in $\Psi$, without their associated types. All contexts are to be understood
as dependent, with $V$s on the right being in the scope of variables on their left.
We will use $=$ to mean syntactic equality up to $\alpha$-conversion.

The terms $M$ of LF correspond to terms here; their consecutive
$\lambda$s have simply been merged into one context. For example, the LF
term 
$$\lambda x . \lambda y . \lambda z . z$$
corresponds to
$$\lambda (x,y,z) . z[\rid]$$ Variable occurrences are always {\em
under substitution}, the substitution being the analogue to the spine
attached to a root in spine form.  Indeed, all substitutions are
vectors of terms. To carry out a substitution, we will write
$\{\sigma/\Psi\}$, substituting the items found in $\sigma$ for the
variables in $\Psi$. We call these term vectors substitutions rather
than spines because they associate the opposite way.

Both the types and kinds of LF become {\em classifiers} in the present
system. A base classifier is either a base type $x[\sigma]$, which corresponds to
a base type $a\cdot S$ in spine-form LF, or else it is the base kind $\rtype$.
This merger is reminiscent of (but not quite identical to) the
uniformity typical of Pure Type Systems. The difference is that while
we allow type variables $x : \rtype$ to appear in contexts, the only
things that can be substituted for $x$ are {\em base} types.

Because variables in the context can have classifiers such as $\Pi
\Psi . \rtype$ that are essentially kinds, we have no need for a
separate notion of signature: the context by itself can contain a
mixture of declarations of type families and constants.

\section{Typing}

The judgments are

$$\begin{tabular}{ll}
$\Gamma \prov V \tcheck \rok$&$V$ is a well-formed classifier\\
$\Gamma \prov M \tcheck V$&$M$ checks at classifier $V$\\
$\Gamma \prov R \synth v$&$R$ synthesizes classifier $v$\\
$\Gamma \prov \sigma \tcheck \Psi$&$\sigma$ is a valid substitution for $\Psi$\\
$\Gamma \prov \Psi \rctx $&$\Psi$ is a well-formed context\\
\end{tabular}$$

Well-formedness of classifiers is defined by two rules, one that 
allows formation of types and another that forms kinds:

$$
\easyrule
{\Gamma \prov \Psi \rctx \qquad 
\Gamma, \Psi \prov R \synth \rtype}
{\Gamma \prov \Pi \Psi . R \tcheck \rok}
\qquad
\easyrule
{\Gamma \prov \Psi \rctx}
{\Gamma \prov \Pi \Psi . \rtype \tcheck \rok}
$$
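For example, supposing $\mathsf{nat} : \Pi\cdot.\rtype \in \Gamma$ (an
illustrative declaration, not fixed by the rules), the left rule forms the
type of a successor constructor, and the right rule forms the kind of a
binary relation on $\mathsf{nat}$:
$$\Gamma \prov \Pi(x : \Pi\cdot.\mathsf{nat}[\rid]) . \mathsf{nat}[\rid] \tcheck \rok
\qquad
\Gamma \prov \Pi(x : \Pi\cdot.\mathsf{nat}[\rid],\ y : \Pi\cdot.\mathsf{nat}[\rid]) . \rtype \tcheck \rok$$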

The remaining judgments are defined by the following rules:
$$
\easyrule
{\Gamma, \Psi \prov R \synth  v' \qquad v = v'}
{\Gamma \prov \lambda \hat \Psi . R \tcheck \Pi \Psi . v}
\qquad 
\easyrule
{ x : \Pi \Psi . v  \in \Gamma \qquad \Gamma \prov \sigma \tcheck \Psi}
{\Gamma \prov x[\sigma] \synth v\{\sigma/ \Psi\}}
$$

$$
\easyrule{}
{\Gamma \prov \rid \tcheck \cdot }
\qquad
\easyrule
{\Gamma \prov M \tcheck V\{\sigma / \Psi\} \qquad \Gamma \prov \sigma \tcheck \Psi}
{\Gamma \prov M.\sigma \tcheck (\Psi, x : V) }
$$

$$
\easyrule{}
{ \Gamma\prov \cdot  \rctx }
\qquad
\easyrule
{\Gamma\prov \Psi \rctx \qquad  \Gamma,\Psi \prov V \tcheck \rok }
{ \Gamma\prov \Psi, x : V \rctx}
$$
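As a small worked derivation (our own example), let $\Gamma = a :
\Pi\cdot.\rtype$ and abbreviate $V_a = \Pi\cdot.a[\rid]$. Checking the
identity function at $a$ bottoms out in the empty substitution, for which
$a[\rid]\{\rid/\cdot\} = a[\rid]$:
$$
\easyrule
{\[
x : V_a \in \Gamma, x : V_a
\qquad
\Gamma, x : V_a \prov \rid \tcheck \cdot
\justifies
\Gamma, x : V_a \prov x[\rid] \synth a[\rid]
\]}
{\Gamma \prov \lambda (x) . x[\rid] \tcheck \Pi (x : V_a) . a[\rid]}
$$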

Let $X$ stand for any syntactic object among $v, V, M, R, \sigma, \Gamma$.
All that remains to define is substitution $X\{\sigma/\Psi\}$. It is
defined recursively according to a lexicographic order on the structure of first $\Psi$,
then $X$.
We say $(M/x : V) \in \{\sigma/\Psi\}$ if, for some
$n$, the $n^{th}$ item from the left in $\sigma$ is $M$, and the
$n^{th}$ item from the right in $\Psi$ is $x : V$.
Almost all cases are homomorphic. Abbreviating $\theta = \{\sigma/\Psi\}$,
we have

$$(M_1 . \ldots . M_n . \rid)\theta = (M_1\theta . \ldots . M_n\theta . \rid)$$
$$(V_1,\ldots, V_n)\theta = (V_1\theta, \ldots, V_n \theta)$$
$$(\Pi \Delta . v)\theta = \Pi (\Delta\theta) . (v\theta)
\qquad 
(\lambda \hat \Delta . R)\theta = \lambda \hat \Delta . (R\theta)$$
$$(\rtype)\theta = \rtype$$
The only interesting case is
$$( x[\rho])\theta = 
\cases{
 R\{\rho\theta/\Delta\}&if $(\lambda \hat \Delta . R/x:\Pi \Delta . V) \in \theta $;\cr
x[\rho\theta]&otherwise.
\cr
}$$
This completes the definition of TinyLF.
\section{Results}
The things we are trying to prove are fairly standard, and so too are the methods for the most part.
In preparation for the substitution property, we show that substitutions commute properly.
In preparation for the identity property, we show that $\eta$-expansions are two-sided units with respect to substitution.

\begin{lemma}
\label{simple.type}
For any substitution $\theta$, $X \{\sigma / \Psi \} = X \{\sigma / \Psi\theta \}$.
\end{lemma}
\begin{proof}
By induction on $\Psi$, then $X$. The interesting case is when $X = x[\rho]$,
and $(\lambda \hat \Delta . R/x:\Pi \Delta . V) \in \{\sigma/\Psi\} $.
In this case $(\lambda\hat \Delta. R/ x : \Pi (\Delta\theta) . (V\theta)) \in \{\sigma / \Psi\theta\}$.
We reason that 
\begin{tabbing}
$R\{\rho\{\sigma/\Psi\} / \Delta\}$\\
$= R\{\rho \{\sigma / \Psi\theta \} / \Delta\}$\` i.h. on $\rho < x[\rho]$\\
$= R\{\rho \{\sigma / \Psi\theta \} / \Delta\theta\}$\` i.h. on $\Delta < \Psi$\\
\end{tabbing}
\cqed
\end{proof}
\begin{lemma}
If $FV(X) \cap \hat \Psi = \emptyset$, then 
$X \{\sigma/\Psi\} = X$.
\label{triv.subst}
\end{lemma}
\begin{proof}
Straightforward induction.
\cqed
\end{proof}

The above lemma is exemplary of the fact that substitution
$\{\sigma/\Psi\}$, as should be expected from a hereditary
substitution, does not really care about the exact structure of the
types $\Psi$, but only the $\Pi$-skeleton of them.

Let $J$ stand for an arbitrary judgment of the system.

\begin{lemma}[Weakening]
If $\Gamma, \Gamma' \prov J$, then $\Gamma, \Delta, \Gamma' \prov J$.
\label{weakening}
\end{lemma}
\begin{proof}
Straightforward induction.
\cqed
\end{proof}

\begin{lemma}
If $(M/x:V)\in \{\sigma/\Psi\}$ and $\Gamma \prov \sigma \tcheck \Psi$,
then $\Gamma \prov M \tcheck V\{\sigma/\Psi\}$.
\label{extract}
\end{lemma}
\begin{proof}
Straightforward induction.
\cqed
\end{proof}

\begin{lemma}[Interchange]
\label{interchange}
Let $\theta = \{\sigma/\Psi\}$. If $FV(\sigma) \cap \hat \Delta = \emptyset$, then 
$X \{\rho/\Delta\}\theta = X \theta \{\rho\theta / \Delta\}$.
\end{lemma}
\begin{proof}
By induction on first the unordered pair $(\Delta,\Psi)$, and subsequently $X$.
For all the homomorphism cases, it's just $X$ that gets smaller. This includes
the case of $X = x[\omega]$ where $x$ is in neither $\hat \Delta$ nor $\hat \Psi$.
The interesting cases are
\begin{itemize}
\item $x\in \hat\Psi$ and $(\lambda \hat \Omega . R / x : \Pi \Omega . v) \in \{\sigma/\Psi\} = \theta$.
In this case we reason that
\begin{tabbing}
$x[\omega] \{\rho/\Delta\}\theta $\\
$= x[\omega\{\rho/\Delta\}] \theta $\\
$= R\{\omega\{\rho/\Delta\} \theta / \Omega\} $\\
$= R\{\omega\theta\{\rho\theta/\Delta\}  / \Omega\} $\` i.h. on $\omega < x[\omega]$\\
$= R\{\rho\theta/\Delta\}\{\omega\theta\{\rho\theta/\Delta\}  / \Omega\} $ \` Lemma~\ref{triv.subst}\\
$= R\{\omega\theta/\Omega\}  \{\rho\theta / \Delta\}$\` i.h. on $(\Omega,\Delta) < (\Delta,\Psi)$\\
$= x[\omega] \theta \{\rho\theta / \Delta\}$\\
\end{tabbing}
To justify the second induction hypothesis appeal, we need $FV(\rho\theta) \cap \hat \Omega = \emptyset $,
but this is true because the variables $\hat \Omega$ are bound inside $\sigma$.
\item $x\in \hat\Delta$ and $(\lambda \hat \Omega . R / x : \Pi \Omega . v) \in \{\rho/\Delta\}$.
In this case we reason that
\begin{tabbing}
$x[\omega] \{\rho/\Delta\}\theta$\\
$= R\{\omega\{\rho/\Delta\}/\Omega\} \theta$\\
$= R\theta\{\omega\{\rho/\Delta\}\theta/\Omega\} $\` i.h. on $(\Omega,\Psi) < (\Delta, \Psi)$\\
$= R\theta\{\omega\theta\{\rho\theta/\Delta\}/\Omega\} $\` i.h. on $\omega < x[\omega]$\\
$= R\{\omega \theta \{\rho\theta / \Delta\} / \Omega\}  $\` Lemma~\ref{triv.subst}\\
$= x[\omega \theta]  \{\rho\theta / \Delta\}$\\
$= x[\omega] \theta \{\rho\theta / \Delta\}$\\
\end{tabbing}
\end{itemize}
\cqed
\end{proof}

\begin{lemma}[Substitution]
If $\Gamma, \Psi, \Gamma' \prov J$, and $\Gamma \prov \sigma \tcheck \Psi$,
then $\Gamma, \Gamma'\theta \prov J\theta$, where $\theta = \{\sigma/\Psi\}$.
\end{lemma}
\begin{proof}
By induction on first the $\Pi$-skeleton structure of $\Psi$ (ignoring the identity of all occurring variables and all embedded substitutions) and subsequently the derivation of $J$.
The interesting cases are:
\begin{itemize}
\item[Case:]
$$
\easyrule
{\Gamma,\Psi,\Gamma' \prov M \tcheck V\{\rho / \Delta\} \qquad \Gamma,\Psi,\Gamma' \prov \rho \tcheck \Delta}
{\Gamma,\Psi,\Gamma' \prov M.\rho \tcheck (\Delta, x : V) }
$$
Reason that
\begin{tabbing}
$\Gamma, \Gamma'\theta \prov \rho\theta \tcheck \Delta\theta$\` i.h.\\
$\Gamma, \Gamma'\theta \prov M \theta \tcheck V\{\rho/\Delta\}\theta$\` i.h.\\
$\Gamma, \Gamma'\theta \prov M \theta \tcheck V\theta\{\rho\theta/\Delta\}$\` Lemma~\ref{interchange}\\
$\Gamma, \Gamma'\theta \prov M \theta \tcheck V\theta\{\rho\theta/\Delta\theta\}$\` Lemma~\ref{simple.type}\\
\end{tabbing}
and form the derivation
$$
\easyrule
{\Gamma,\Gamma'\theta \prov M\theta \tcheck V\theta\{\rho\theta / \Delta\theta\} \qquad \Gamma,\Gamma'\theta \prov \rho\theta \tcheck \Delta\theta}
{\Gamma,\Gamma'\theta \prov M\theta.\rho\theta \tcheck (\Delta\theta, x : V\theta) }
$$

\item[Case:]
$$\easyrule
{ x : \Pi \Delta . v  \in \Gamma,\Psi,\Gamma' \qquad \Gamma,\Psi,\Gamma' \prov \rho \tcheck \Delta}
{\Gamma,\Psi,\Gamma' \prov x[\rho] \synth v\{\rho/ \Delta\}}
$$
To show:
$$\Gamma,\Gamma'\theta \prov x[\rho]\theta \synth v\{\rho/ \Delta\}\theta$$

By Lemmas~\ref{interchange} and~\ref{simple.type} it suffices to show
$$\Gamma,\Gamma'\theta \prov x[\rho]\theta \synth v\theta \{\rho\theta/ \Delta\theta\}$$
and we know already by i.h. that
$$\Gamma, \Gamma'\theta \prov \rho\theta \tcheck \Delta\theta$$
If $(\lambda \hat \Delta . R/x:\Pi \Delta . v) \in \theta $,
then by Lemma~\ref{extract} we know we have a derivation that looks like
$$
\easyrule
{\Gamma,\Delta\theta \prov R \synth v\theta}
{\Gamma \prov \lambda \hat \Delta . R \tcheck (\Pi \Delta . v)\theta}$$
and by weakening and i.h. (on $\Delta\theta < \Psi$) and Lemma~\ref{simple.type} we get
$$\Gamma,\Gamma'\theta \prov R\{\rho\theta/\Delta\} \synth v\theta \{\rho\theta/ \Delta\theta\}$$
as required. Otherwise, observe that $(\Gamma,\Gamma')\theta = \Gamma,(\Gamma'\theta)$ by Lemma~\ref{triv.subst},
and form the derivation
$$
\easyrule
{ x : \Pi \Delta\theta . v\theta \in \Gamma,\Gamma'\theta \qquad \Gamma,\Gamma'\theta \prov \rho\theta \tcheck \Delta\theta }
{\Gamma,\Gamma'\theta \prov x[\rho\theta] \synth v\theta \{\rho\theta/ \Delta\theta\}}
$$

\end{itemize}
\cqed
\end{proof}

Here is how we define $\eta_V(x)$, the $\eta$-expansion of a variable at a classifier $V$,
and $I_\Psi$, the expanded identity substitution for the context $\Psi$.
$$\eta_{\Pi \Psi . v} (x) = \lambda\hat \Psi . x [I_\Psi]$$
$$I_{\cdot} = \rid \qquad I_{\Psi, x : V} = \eta_V(x) . I_\Psi$$
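For instance (our own example), at the second-order classifier $V =
\Pi(x : \Pi\cdot.a[\rid]).a[\rid]$ the definitions unfold as
$$\eta_V(f) = \lambda(x) . f[I_{(x : \Pi\cdot.a[\rid])}]
= \lambda(x) . f[\eta_{\Pi\cdot.a[\rid]}(x).\rid]
= \lambda(x) . f[(\lambda().x[\rid]).\rid]$$
where $\lambda()$ abstracts over the empty context.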
\begin{lemma}[Unit Laws for $\eta$-expansion]\ 
\begin{enumerate}
\item If $\Gamma \prov R \synth v$, then $R\{I_\Psi / \Psi\} = R$
\item If $\Gamma \prov \rho \tcheck \Delta$, then $I_\Delta\{\rho / \Delta\} = \rho$
\item If $\Gamma \prov \rho \tcheck \Delta$, then $\rho\{I_\Psi / \Psi\} = \rho$
\item If $\Gamma \prov M \tcheck V$, then $M\{I_\Psi / \Psi\} = M$
\item If $\Gamma \prov V \tcheck \rok$, then $V\{I_\Psi / \Psi\} = V$
\item If $\Gamma \prov \Delta \rctx$, then $\Delta\{I_\Psi / \Psi\} = \Delta$
\end{enumerate}
\end{lemma}
\begin{proof}
By induction.
\begin{enumerate}
\item Say $R = x[\rho]$. The interesting case is when $(\lambda \hat \Delta . x[I_\Delta]/x:\Pi \Delta . v) \in \{I_\Psi / \Psi\}$.
Here we want to show $x[\rho] = x[I_\Delta\{\rho\{I_\Psi/\Psi\} /\Delta\}]$. By one application of i.h. this
is the same as $x[I_\Delta\{\rho /\Delta\}]$, and by another we're done.
\item By the typing of $\rho$ we know that
$$I_\Delta = \eta_{V_1}(x_1).\ldots.\eta_{V_n}(x_n).\rid$$
$$\rho = (\lambda \hat \Psi_1 . R_1) . \ldots . (\lambda \hat \Psi_n . R_n) . \rid$$
supposing that $V_i = \Pi \Psi_i . v_i$. We want to show
$$(\eta_{V_1}(x_1).\ldots.\eta_{V_n}(x_n).\rid)\{\rho / \Delta\} = \rho$$
so it suffices to show
$$\eta_{V_i}(x_i)\{\rho / \Delta\} = \lambda \hat\Psi_i . R_i$$
Let's carry out the eta-expansion on the left:
$$\lambda \hat\Psi_i . x_i[I_{\Psi_i}]\{\rho / \Delta\}$$
We know that $(\lambda \hat \Psi_i . R_i / x_i : V_i) \in \{\rho/\Delta\}$, so this is
$$\lambda \hat\Psi_i . R_i\{I_{\Psi_i}\{\rho / \Delta\}/\Psi_i\}$$
However, $FV(I_{\Psi_i}) \cap \hat \Delta = \emptyset$, so this is
$$\lambda \hat\Psi_i . R_i\{I_{\Psi_i}/\Psi_i\}$$
which by induction hypothesis is
$$\lambda \hat\Psi_i . R_i$$
as required.
\item Straightforward.
\item Straightforward.
\item Straightforward.
\item Straightforward.
\end{enumerate}

\cqed
\end{proof}
\begin{lemma}[Identity]
Suppose $\prov \Gamma \rctx$.
\begin{enumerate}
\item If $x : V \in \Gamma$, then $\Gamma \prov \eta_V(x) \tcheck V$.
\item If $\Gamma\prov \Psi \rctx$, then $\Gamma, \Psi \prov I_\Psi \tcheck \Psi$.
\end{enumerate}
\end{lemma}
\begin{proof} By induction.
\begin{enumerate}
\item
Form the derivation
$$
\easyrule
{\[
x : \Pi \Psi . v \in \Gamma \qquad \Gamma,\Psi \prov I_\Psi \tcheck \Psi
\justifies
\Gamma,\Psi \prov  x [I_\Psi] \synth  v\{I_\Psi / \Psi\}
\]}
{\Gamma \prov \lambda\hat \Psi . x [I_\Psi] \tcheck \Pi \Psi . (v\{I_\Psi / \Psi\})}
$$
which by the previous lemma is as required.
\item The empty context case is immediate. Otherwise, form the derivation
$$
\easyrule
{
\Gamma, \Psi, x : V \prov \eta_V(x) \tcheck V \{I_\Psi / \Psi\}\qquad
\Gamma, \Psi \prov I_\Psi \tcheck \Psi
}
{\Gamma, \Psi, x : V \prov \eta_V(x) . I_\Psi \tcheck \Psi, x : V}
$$
from uses of the i.h. and previous lemma.
\end{enumerate}
\cqed
\end{proof}
\end{document}
