Did some cooking with Edwin yesterday, and we came up with an ExTT-like way of separating phases. The idea is to annotate context entries and the typing judgment with a level number (low numbers are more dynamic), with
That is, higher-level stuff can only be used for higher-level things. We require of our design that the following be admissible
So, contextuals and star (this stratification is orthogonal to stratification for consistency):
Next, the slight generalisation of ExTT to have a funny λ
Plus whatever conversion apparatus…
Next comes the deletion operator, ‘lightning’. The idea is just to erase types and everything which belongs at a higher level than the given index, leaving a ‘run-time’ λ-term.
Lightning extends to contexts in the obvious way, erasing the type annotations and the high-level entries. Any types which are present get replaced by a dummy token which can never be inspected. I claim
where, on untyped terms, the judgment is just scoping. The point is that variables whose bindings get erased only get used in places which also get erased. If lightning strikes the binding occurrence, it cannot spare any other occurrence.
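To make this concrete, here is a minimal sketch in Python of a toy tuple-encoded syntax — my own hypothetical encoding for illustration, not Epigram's or ExTT's actual one — in which binders and applications carry a level, and lightning at index k strikes out everything above k, squashing any surviving type to the dummy token.

```python
# Toy level-annotated terms (hypothetical encoding; low levels are more dynamic):
#   ("var", x)            variable
#   ("lam", l, x, body)   lambda whose binder lives at level l
#   ("app", l, f, a)      application whose argument lives at level l
#   ("star",)             a stand-in for type-level stuff

DUMMY = ("dummy",)  # erased types become a token which can never be inspected

def erase(k, t):
    """'Lightning' at index k: strike out every binder and argument whose
    level exceeds k, and replace types by the dummy token."""
    tag = t[0]
    if tag == "var":
        return t
    if tag == "lam":
        _, l, x, body = t
        if l > k:                       # binder struck out entirely
            return erase(k, body)
        return ("lam", x, erase(k, body))
    if tag == "app":
        _, l, f, a = t
        if l > k:                       # argument struck out with its binder
            return erase(k, f)
        return ("app", erase(k, f), erase(k, a))
    return DUMMY                        # "star": types get the dummy token
```

For instance, erasing `("lam", 1, "a", ("lam", 0, "x", ("var", "x")))` at index 0 strikes the level-1 binder and leaves the run-time identity `("lam", "x", ("var", "x"))` — the scoping claim holds because the struck variable `a` had no surviving uses.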
Now, I claim that lightning commutes with reduction, in that every reduction before erasure yields at most one reduction afterwards.
This is easily checked for β, δ, γ and structural rules.
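The β case can be sanity-checked mechanically. Below is a self-contained fragment (toy tuple syntax defined inline, my own hypothetical encoding) showing that a β-step on a high-level redex vanishes after erasure — "at most one reduction afterwards" here means zero — while a low-level redex survives.

```python
def erase(k, t):
    # 'lightning' at index k: drop binders/arguments above level k,
    # squash types to a dummy token
    if t[0] == "var":
        return t
    if t[0] == "lam":
        _, l, x, b = t
        return erase(k, b) if l > k else ("lam", x, erase(k, b))
    if t[0] == "app":
        _, l, f, a = t
        return erase(k, f) if l > k else ("app", erase(k, f), erase(k, a))
    return ("dummy",)

def subst(x, s, t):
    # naive substitution; fine for the closed examples below
    if t[0] == "var":
        return s if t[1] == x else t
    if t[0] == "lam":
        _, l, y, b = t
        return t if y == x else ("lam", l, y, subst(x, s, b))
    if t[0] == "app":
        _, l, f, a = t
        return ("app", l, subst(x, s, f), subst(x, s, a))
    return t

def step(t):
    # one beta-step at the root, if there is a redex there
    if t[0] == "app" and t[2][0] == "lam":
        _, _, x, b = t[2]
        return subst(x, t[3], b)
    return None

# A high-level redex: a level-1 lambda applied to a type.
t = ("app", 1, ("lam", 1, "a", ("lam", 0, "x", ("var", "x"))), ("star",))
before = erase(0, step(t))   # reduce, then strike
after = erase(0, t)          # strike straight away
assert before == after == ("lam", "x", ("var", "x"))  # the beta-step vanished

# A low-level redex survives erasure as exactly one run-time beta-step.
t2 = ("app", 0, ("lam", 0, "x", ("var", "x")), ("var", "y"))
assert erase(0, t2)[0] == "app"              # still a redex after erasure
assert erase(0, step(t2)) == ("var", "y")    # and it contracts to the same thing
```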
We may now extend Edwin’s existing analysis of constructor and eliminator optimisation by making sure that all type-family arguments (e.g. motives) get quantified at a high level. Who says we can’t have an erasure semantics?
Now with real marvosym lightning. Don’t be too afraid.
Ta.