It is a common refrain here at Carnegie Mellon University that structural operational semantics (SOS) is the best form of dynamics, and that contextual dynamics (“Indiana-style semantics”) do not combat the bureaucratic tendency in the way that they are usually claimed to. This is true of one formulation of contextual dynamics, but I would like to demonstrate a more fine-grained approach that improves on both SOS and standard contextual dynamics.
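To fix intuitions before going further, here is a minimal sketch (in OCaml, with all names my own) of the two styles for a toy language of numerals and addition. SOS spells out a congruence rule for every position in which reduction may occur, whereas the contextual formulation isolates a single reduction rule and pushes the congruence bureaucracy into a decomposition function:

```ocaml
(* A hypothetical toy language: numerals and addition. *)
type tm = Num of int | Plus of tm * tm

(* Structural operational semantics: one explicit congruence
   rule per position in which reduction may occur. *)
let rec step_sos : tm -> tm option = function
  | Num _ -> None
  | Plus (Num m, Num n) -> Some (Num (m + n))
  | Plus (Num m, e2) ->
    (match step_sos e2 with
     | Some e2' -> Some (Plus (Num m, e2'))
     | None -> None)
  | Plus (e1, e2) ->
    (match step_sos e1 with
     | Some e1' -> Some (Plus (e1', e2))
     | None -> None)

(* Contextual dynamics: a single reduction rule, applied after
   decomposing a term into an evaluation context and a redex. *)
type ctx = Hole | PlusL of ctx * tm | PlusR of int * ctx

let rec plug (c : ctx) (e : tm) : tm =
  match c with
  | Hole -> e
  | PlusL (c', e2) -> Plus (plug c' e, e2)
  | PlusR (m, c') -> Plus (Num m, plug c' e)

let rec decompose : tm -> (ctx * tm) option = function
  | Num _ -> None
  | Plus (Num m, Num n) -> Some (Hole, Plus (Num m, Num n))
  | Plus (Num m, e2) ->
    (match decompose e2 with
     | Some (c, r) -> Some (PlusR (m, c), r)
     | None -> None)
  | Plus (e1, e2) ->
    (match decompose e1 with
     | Some (c, r) -> Some (PlusL (c, e2), r)
     | None -> None)

let step_ctx (e : tm) : tm option =
  match decompose e with
  | Some (c, Plus (Num m, Num n)) -> Some (plug c (Num (m + n)))
  | _ -> None
```

The two relations agree, but notice that the contextual version has merely relocated the congruence rules into `decompose` rather than eliminated them; it is exactly this sleight of hand that the more fine-grained approach addresses.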
Dialectical materialism, far from being merely the definitive theory of political and historical movement, also serves as the guiding light for the progressive mathematician in its manifestation as the science of adjoint functors in category theory.
As Lawvere stated in his influential note Quantifiers and Sheaves, the essential move in mathematics is to identify the principal contradictions of a theory in the form of pairs of adjoint functors, and to “weaponize” them as slogans to drive the further development (and generalization) of the thing. In this note, I would like to make a few remarks about the pervasiveness of dialectical phenomena, both in the mathematics itself as well as in the practical engagement in mathematical activity.
This post is an introduction to the container-oriented generalization of the core Brouwerian (co)data structures, inspired by Ghani, Hancock and Pattinson. I am not introducing anything novel; I’m merely taking their framework and showing how to round up the usual suspects of Brouwerian mathematics in their generalized, “family-friendly” setting. I am using Constable et al.’s Computational Type Theory + Induction-Recursion as my metalanguage, but other variants of type theory may be used as well.
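As a reminder of the basic notion (the notation below is one common choice): a container is a type $S$ of shapes together with a family $P$ of positions over shapes, and its extension is the functor

$$\llbracket S \lhd P \rrbracket(X) \;\triangleq\; \sum_{s : S} \big(P(s) \to X\big)$$

For instance, lists arise from the container with shapes $\mathbb{N}$ and positions $\mathsf{Fin}(n)$: a list is a length together with a tuple of elements indexed by positions below that length.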
Here, I present a judgemental reconstruction of LCF-style tactic systems, called Modernized LCF, which admits various extensions including validations (checkable certificates). The purpose is to present the structure and meaning of refinement proof directly as a logical theory with a meaning explanation, in contrast to the standard practice of taking refinement proof as an extra-logical “finishing touch”.
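For orientation, here is the classic LCF recipe that this reconstruction takes as its point of departure, sketched in OCaml; the types `goal` and `derivation` are crude stand-ins of my own invention. A tactic sends a goal to a list of subgoals together with a validation, which assembles derivations of the subgoals into a derivation of the original goal:

```ocaml
(* Hypothetical stand-ins for the real judgements and evidence. *)
type goal = string
type derivation = Axiom of goal | Rule of string * derivation list

(* A tactic: subgoals plus a validation reassembling their derivations. *)
type tactic = goal -> goal list * (derivation list -> derivation)

(* The identity tactic: one subgoal, validated trivially. *)
let id_tac : tactic =
  fun g -> ([g], function [d] -> d | _ -> failwith "arity")

(* Split a list at index n, for routing derivations to validations. *)
let rec split n xs =
  if n = 0 then ([], xs)
  else match xs with
    | x :: rest -> let (a, b) = split (n - 1) rest in (x :: a, b)
    | [] -> failwith "arity"

(* Sequencing: run t1, then t2 on each resulting subgoal,
   composing the validations. *)
let then_tac (t1 : tactic) (t2 : tactic) : tactic = fun g ->
  let (subgoals, v1) = t1 g in
  let results = List.map t2 subgoals in
  let all_subgoals = List.concat_map fst results in
  let validate ds =
    let rec go results ds =
      match results with
      | [] -> []
      | (gs, v2) :: rest ->
        let (here, there) = split (List.length gs) ds in
        v2 here :: go rest there
    in
    v1 (go results ds)
  in
  (all_subgoals, validate)
```

The validations here are exactly the “checkable certificates” alluded to above; the point of the judgemental reconstruction is to give this apparatus a meaning explanation rather than treat it as extra-logical plumbing.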
This year’s Oregon Programming Languages Summer School, organized by Bob Harper, Greg Morrisett and Zena Ariola, has come to a close after two amazing weeks. I wanted to take a moment to thank everyone who made my time here so pleasant, with special regards to Bob Harper and Mark Bickford, who patiently answered my numerous questions about Nuprl and Brouwerian mathematics.
Recent conversations have convinced me that there are several misconceptions about the status of side effects in a type theoretic setting, and how to reason about their benignity. Briefly, I will clarify this matter by proposing how one might go about integrating free assignables and non-determinism into Computational Type Theory. I do not answer all questions raised by this proposal, but I intend to be provocative and suggest a rough path toward realizing this goal.
Briefly, Diaconescu’s theorem states that the axiom of choice and extensionality together imply the principle of the excluded middle (PEM). In 100 Years of Zermelo’s Axiom of Choice, Martin-Löf distinguishes an intensional version of AC from an extensional one, and argues that whilst the former is a theorem of intuitionistic type theory, the latter leads to taboo. In this note, I hope to clarify a few points, and demonstrate that the culprit is not so much extensionality as an infelicitous interpretation of the quantifiers. When viewed through this lens, it becomes evident that the problem all along was setoids, not extensionality.
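For reference, the intensional form of AC that Martin-Löf shows to be a theorem of type theory is the following (writing $\pi_1, \pi_2$ for the projections); its proof term merely reshuffles the evidence already present:

$$\Big(\prod_{x:A}\sum_{y:B} C(x,y)\Big) \to \sum_{f : A \to B}\,\prod_{x:A} C(x, f\,x)$$

$$\lambda h.\ \big(\lambda x.\,\pi_1(h\,x),\ \lambda x.\,\pi_2(h\,x)\big)$$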
In 1994, Per Martin-Löf wrote Analytic and Synthetic Judgement in Type Theory, in which he convincingly showed that undecidability phenomena should be understood in terms of synthetic judgement, and demonstrated how the judgements of one theory may be made the propositions of another.