Merge branch 'dev'
This commit is contained in:
commit
98cad057d5
1
.gitignore
vendored
Normal file
1
.gitignore
vendored
Normal file
|
@ -0,0 +1 @@
|
|||
html/
|
|
@ -1,6 +1,13 @@
|
|||
Change log
|
||||
=========
|
||||
|
||||
Version 1.6.0
|
||||
-------------
|
||||
|
||||
This version mainly contains changes to the report.
|
||||
|
||||
This is the version I submit for my MSc.
|
||||
|
||||
Version 1.5.0
|
||||
-------------
|
||||
Prove postulates in `Cat.Wishlist`:
|
||||
|
|
8
Makefile
8
Makefile
|
@ -1,5 +1,11 @@
|
|||
build: src/**.agda
|
||||
agda src/Cat.agda
|
||||
agda --library-file ./libraries src/Cat.agda
|
||||
|
||||
clean:
|
||||
find src -name "*.agdai" -type f -delete
|
||||
|
||||
html:
|
||||
agda --html src/Cat.agda
|
||||
|
||||
upload: html
|
||||
scp -r html/ remote11.chalmers.se:www/cat/doc/
|
||||
|
|
2
doc/.gitignore
vendored
2
doc/.gitignore
vendored
|
@ -10,3 +10,5 @@
|
|||
*.idx
|
||||
*.ilg
|
||||
*.ind
|
||||
*.nav
|
||||
*.snm
|
|
@ -3,5 +3,74 @@ Talk about structure of library:
|
|||
|
||||
What can I say about reusability?
|
||||
|
||||
Misc
|
||||
====
|
||||
Meeting with Andrea May 18th
|
||||
============================
|
||||
|
||||
App. 2 in HoTT gives typing rule for pathJ including a computational
|
||||
rule for it.
|
||||
|
||||
If you have this computational rule definitionally, then you wouldn't
|
||||
need to use `pathJprop`.
|
||||
|
||||
In discussion-section I mention HITs. I should remove this or come up
|
||||
with a more elaborate example of something you could do, e.g.
|
||||
something with pushouts in the category of sets.
|
||||
|
||||
The type Prop is a type where terms are *judgmentally* equal not just
|
||||
propositionally so.
|
||||
|
||||
Maybe mention that Andreas Källberg is working on proving the
|
||||
initiality conjecture.
|
||||
|
||||
Intensional Type Theory (ITT): Judgmental equality is decidable
|
||||
|
||||
Extensional Type Theory (ETT): Reflection is enough to make judgmental
|
||||
equality undecidable.
|
||||
|
||||
Reflection : a ≡ b → a = b
|
||||
|
||||
ITT does not have reflection.
|
||||
|
||||
HoTT ~ ITT + axiomatized univalence
|
||||
Agda ~ ITT + K-rule
|
||||
Coq ~ ITT (no K-rule)
|
||||
Cubical Agda ~ ITT + Path + Glue
|
||||
|
||||
Prop is impredicative in Coq (whatever that means)
|
||||
|
||||
Prop ≠ hProp
|
||||
|
||||
Comments about abstract
|
||||
-----
|
||||
|
||||
Pattern matching for paths (?)
|
||||
|
||||
Intro
|
||||
-----
|
||||
Main feature of judgmental equality is the conversion rule.
|
||||
|
||||
Conor explained: K + eliminators ≡ pat. matching
|
||||
|
||||
Explain judgmental equality independently of type-checking
|
||||
|
||||
Soundness for equality means that if `x = y` then `x` and `y` must be
|
||||
equal according to the theory/model.
|
||||
|
||||
Decidability of `=` is a necessary condition for typechecking to be
|
||||
decidable.
|
||||
|
||||
Canonicity is a nice-to-have though without canonicity terms can get
|
||||
stuck. If we postulate results about judgmental equality, e.g. funext,
|
||||
then we can construct a term of type natural number that is not a
|
||||
numeral. Therefore stating canonicity with natural numbers:
|
||||
|
||||
∀ t . ⊢ t : N , ∃ n : N . ⊢ t = sⁿ 0 : N
|
||||
|
||||
is a sufficient condition to get a well-behaved equality.
|
||||
|
||||
Eta-equality for RawFunctor means that the associative law for
|
||||
functors holds definitionally.
|
||||
|
||||
Computational property for funExt is only relevant in two places in my
|
||||
whole formulation. Univalence and gradLemma do not influence any
|
||||
proofs.
|
||||
|
|
|
@ -1,21 +1,24 @@
|
|||
\chapter*{Abstract}
|
||||
The usual notion of propositional equality in intensional type-theory is
|
||||
restrictive. For instance it does not admit functional extensionality or
|
||||
univalence. This poses a severe limitation on both what is \emph{provable} and
|
||||
the \emph{re-usability} of proofs. Recent developments have, however, resulted
|
||||
in cubical type theory which permits a constructive proof of these two important
|
||||
notions. The programming language Agda has been extended with capabilities for
|
||||
working in such a cubical setting. This thesis will explore the usefulness of
|
||||
this extension in the context of category theory.
|
||||
The usual notion of propositional equality in intensional type-theory
|
||||
is restrictive. For instance it does not admit functional
|
||||
extensionality nor univalence. This poses a severe limitation on both
|
||||
what is \emph{provable} and the \emph{re-usability} of proofs. Recent
|
||||
developments have however resulted in cubical type theory which
|
||||
permits a constructive proof of these two important notions. The
|
||||
programming language Agda has been extended with capabilities for
|
||||
working in such a cubical setting. This thesis will explore the
|
||||
usefulness of this extension in the context of category theory.
|
||||
|
||||
The thesis will motivate and explain why propositional equality in cubical Agda
|
||||
is more expressive than in standard Agda. Alternative approaches to Cubical Agda
|
||||
will be presented and their pros and cons will be explained. It will emphasize
|
||||
why it is useful to have a constructive interpretation of univalence. As an
|
||||
example of this two formulations of monads will be presented: Namely monaeds in
|
||||
the monoidal form an monads in the Kleisli form.
|
||||
The thesis will motivate the need for univalence and explain why
|
||||
propositional equality in cubical Agda is more expressive than in
|
||||
standard Agda. Alternative approaches to Cubical Agda will be
|
||||
presented and their pros and cons will be explained. As an example of
|
||||
the application of univalence two formulations of monads will be
|
||||
presented: Namely monads in the monoidal form and monads in the
|
||||
Kleisli form and under the univalent interpretation it will be shown
|
||||
how these are equal.
|
||||
|
||||
Finally the thesis will explain the challenges that a developer will face when
|
||||
working with cubical Agda and give some techniques to overcome these
|
||||
difficulties. It will also try to suggest how furhter work can help allievate
|
||||
some of these challenges.
|
||||
Finally the thesis will explain the challenges that a developer will
|
||||
face when working with cubical Agda and give some techniques to
|
||||
overcome these difficulties. It will also try to suggest how further
|
||||
work can help alleviate some of these challenges.
|
||||
|
|
1
doc/acknowledgement.tex
Normal file
1
doc/acknowledgement.tex
Normal file
|
@ -0,0 +1 @@
|
|||
\chapter*{Acknowledgements}
|
13
doc/acknowledgements.tex
Normal file
13
doc/acknowledgements.tex
Normal file
|
@ -0,0 +1,13 @@
|
|||
\chapter*{Acknowledgements}
|
||||
I would like to thank my supervisor Thierry Coquand for giving me a
|
||||
chance to work on this interesting topic. I would also like to thank
|
||||
Andrea Vezzosi for some very long and very insightful meetings during
|
||||
the project. It is fascinating and almost uncanny how quickly Andrea
|
||||
can conjure up various proofs. I also want to recognize the support
|
||||
of Knud Højgaards Fond who graciously sponsored me with a 20.000 DKK
|
||||
scholarship which helped toward sponsoring the two years I have spent
|
||||
studying abroad. I would also like to give a warm thanks to my fellow
|
||||
students Pierre Kraft and Nachiappan Villiappan who have made the time
|
||||
spent working on the thesis way more enjoyable. Lastly I would like to
|
||||
give a special thanks to Valentina Méndez who has been a great moral
|
||||
support throughout the whole process.
|
74
doc/appendix/abstract-funext.tex
Normal file
74
doc/appendix/abstract-funext.tex
Normal file
|
@ -0,0 +1,74 @@
|
|||
\chapter{Non-reducing functional extensionality}
|
||||
\label{app:abstract-funext}
|
||||
In two places in my formalization the computational behaviour of
|
||||
functional extensionality was used. The reduction behaviour can be
|
||||
disabled by marking functional extensionality as abstract. Below the
|
||||
fully normalized goal and context with functional extensionality
|
||||
marked as abstract are shown. The excerpts are from the module
|
||||
%
|
||||
\begin{center}
|
||||
\sourcelink{Cat.Category.Monad.Voevodsky}
|
||||
\end{center}
|
||||
%
|
||||
where this is also written as a comment next to the proofs. When
|
||||
functional extensionality is not abstract the goal and current value
|
||||
are the same. It is of course necessary to show the fully normalized
|
||||
goal and context, otherwise the reduction behaviour is not forced.
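For concreteness, a minimal sketch of what such an abstract marking
looks like is given below. The definition mirrors the $\funExt$ term
from the main text; universe levels are omitted and the cubical path
primitives are assumed to be in scope, so the exact form in the
formalization may differ.
%
\begin{verbatim}
-- Sketch: marking funExt as abstract prevents the type checker from
-- unfolding its definition, so its reduction behaviour is disabled.
abstract
  funExt : {A : Set} {B : A → Set} {f g : (x : A) → B x}
         → ((x : A) → f x ≡ g x) → f ≡ g
  funExt η = λ i x → η x i
\end{verbatim}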
|
||||
|
||||
\subsubsection*{First goal}
|
||||
Goal:
|
||||
\begin{verbatim}
|
||||
PathP (λ _ → §2-3.§2 omap (λ {z} → pure))
|
||||
(§2-fromMonad
|
||||
(.Cat.Category.Monad.toKleisli ℂ
|
||||
(.Cat.Category.Monad.toMonoidal ℂ (§2-3.§2.toMonad m))))
|
||||
(§2-fromMonad (§2-3.§2.toMonad m))
|
||||
\end{verbatim}
|
||||
Have:
|
||||
\begin{verbatim}
|
||||
PathP
|
||||
(λ i →
|
||||
§2-3.§2 K.IsMonad.omap
|
||||
(K.RawMonad.pure
|
||||
(K.Monad.raw
|
||||
(funExt (λ m₁ → K.Monad≡ (.Cat.Category.Monad.toKleisliRawEq ℂ m₁))
|
||||
i (§2-3.§2.toMonad m)))))
|
||||
(§2-fromMonad
|
||||
(.Cat.Category.Monad.toKleisli ℂ
|
||||
(.Cat.Category.Monad.toMonoidal ℂ (§2-3.§2.toMonad m))))
|
||||
(§2-fromMonad (§2-3.§2.toMonad m))
|
||||
\end{verbatim}
|
||||
\subsubsection*{Second goal}
|
||||
Goal:
|
||||
\begin{verbatim}
|
||||
PathP (λ _ → §2-3.§1 omap (λ {X} → pure))
|
||||
(§1-fromMonad
|
||||
(.Cat.Category.Monad.toMonoidal ℂ
|
||||
(.Cat.Category.Monad.toKleisli ℂ (§2-3.§1.toMonad m))))
|
||||
(§1-fromMonad (§2-3.§1.toMonad m))
|
||||
\end{verbatim}
|
||||
Have:
|
||||
\begin{verbatim}
|
||||
PathP
|
||||
(λ i →
|
||||
§2-3.§1
|
||||
(RawFunctor.omap
|
||||
(Functor.raw
|
||||
(M.RawMonad.R
|
||||
(M.Monad.raw
|
||||
(funExt
|
||||
(λ m₁ → M.Monad≡ (.Cat.Category.Monad.toMonoidalRawEq ℂ m₁)) i
|
||||
(§2-3.§1.toMonad m))))))
|
||||
(λ {X} →
|
||||
fst
|
||||
(M.RawMonad.pureNT
|
||||
(M.Monad.raw
|
||||
(funExt
|
||||
(λ m₁ → M.Monad≡ (.Cat.Category.Monad.toMonoidalRawEq ℂ m₁)) i
|
||||
(§2-3.§1.toMonad m))))
|
||||
X))
|
||||
(§1-fromMonad
|
||||
(.Cat.Category.Monad.toMonoidal ℂ
|
||||
(.Cat.Category.Monad.toKleisli ℂ (§2-3.§1.toMonad m))))
|
||||
(§1-fromMonad (§2-3.§1.toMonad m))
|
||||
\end{verbatim}
|
BIN
doc/assets/isomorphism.pdf
Normal file
BIN
doc/assets/isomorphism.pdf
Normal file
Binary file not shown.
Before Width: | Height: | Size: 266 KiB After Width: | Height: | Size: 266 KiB |
|
@ -46,7 +46,8 @@
|
|||
{\Huge\@title}\\[.5cm]
|
||||
{\Large A formalization of category theory in Cubical Agda}\\[6cm]
|
||||
\begin{center}
|
||||
\includegraphics[width=\linewidth,keepaspectratio]{isomorphism.png}
|
||||
\includegraphics[width=\linewidth,keepaspectratio]{assets/isomorphism.pdf}
|
||||
%% \includepdf{isomorphism.pdf}
|
||||
\end{center}
|
||||
% Cover text
|
||||
\vfill
|
||||
|
|
|
@ -1,45 +1,53 @@
|
|||
\chapter{Conclusion}
|
||||
This thesis highlighted some issues with the standard inductive definition of
|
||||
propositional equality used in Agda. Functional extensionality and univalence
|
||||
are examples of two propositions not admissible in Intensional Type Theory
|
||||
(ITT). This has a big impact on what is provable and the reusability of proofs.
|
||||
This issue is overcome with an extension to Agda's type system called Cubical
|
||||
Agda. With Cubical Agda both functional extensionality and univalence are
|
||||
admissible. Cubical Agda is more expressive, but there are certain issues that
|
||||
arise that are not present in standard Agda. For one thing ITT and standard Agda
|
||||
enjoys Uniqueness of Identity Proofs (UIP). This is not the case in Cubical
|
||||
Agda. In stead there exists a hierarchy of types with increasing
|
||||
\nomen{homotopical structure}. It turns out to be useful to built the
|
||||
This thesis highlighted some issues with the standard inductive
|
||||
definition of propositional equality used in Agda. Functional
|
||||
extensionality and univalence are examples of two propositions not
|
||||
admissible in Intensional Type Theory (ITT). This has a big impact on
|
||||
what is provable and the reusability of proofs. This issue is
|
||||
overcome with an extension to Agda's type system called Cubical Agda.
|
||||
With Cubical Agda both functional extensionality and univalence are
|
||||
admissible. Cubical Agda is more expressive, but there are certain
|
||||
issues that arise that are not present in standard Agda. For one
|
||||
thing Agda enjoys Uniqueness of Identity Proofs (UIP) though a flag
|
||||
exists to turn this off. This feature is not present in Cubical Agda.
|
||||
Rather than having unique identity proofs cubical Agda gives rise to a
|
||||
hierarchy of types with increasing \nomen{homotopical
|
||||
structure}{homotopy levels}. It turns out to be useful to build the
|
||||
formalization with this hierarchy in mind as it can simplify proofs
|
||||
considerably. Another issue one must overcome in Cubical Agda is when a type has
|
||||
a field whose type depends on a previous field. In this case paths between such
|
||||
types will be heterogeneous paths. This problem is related to Cubical Agda not
|
||||
having the K-rule \TODO{Not mentioned anywhere in the report}. In practice it
|
||||
turns out to be considerably more difficult to work heterogeneous paths than
|
||||
with homogeneous paths. The thesis demonstrated some techniques to overcome
|
||||
these difficulties, such as based path-induction.
|
||||
considerably. Another issue one must overcome in Cubical Agda is when
|
||||
a type has a field whose type depends on a previous field. In this
|
||||
case paths between such types will be heterogeneous paths. In
|
||||
practice it turns out to be considerably more difficult to work with
|
||||
heterogeneous paths than with homogeneous paths. The thesis
|
||||
demonstrated the application of some techniques to overcome these
|
||||
difficulties, such as based path induction.
|
||||
|
||||
This thesis formalized some of the core concepts from category theory including;
|
||||
categories, functors, products, exponentials, Cartesian closed categories,
|
||||
natural transformations, the yoneda embedding, monads and more. Category theory
|
||||
is an interesting case-study for the application of Cubical Agda for two reasons
|
||||
in particular: Because category theory is the study of abstract algebra of
|
||||
functions, meaning that functional extensionality is particularly relevant.
|
||||
Another reason is that in category theory it is commonplace to identify
|
||||
isomorphic structures and univalence allows for making this notion precise. This
|
||||
thesis also demonstrated another technique that is common in category theory;
|
||||
namely to define categories to prove properties of other structures.
|
||||
Specifically a category was defined to demonstrate that any two product objects
|
||||
in a category are isomorphic. Furthermore the thesis showed two formulations of
|
||||
monads and proved that they indeed are equivalent: Namely monads in the
|
||||
monoidal- and Kleisli- form. The monoidal formulation is more typical to
|
||||
category theoretic formulations and the Kleisli formulation will be more
|
||||
familiar to functional programmers. In the formulation we also saw how paths can
|
||||
be used to extract functions. A path between two types induce an isomorphism
|
||||
between the two types. This e.g. permits developers to write a monad instance
|
||||
for a given type using the Kleisli formulation. By transporting along the path
|
||||
between the monoidal- and Kleisli- formulation one can reuse all the operations
|
||||
and results shown for monoidal- monads in the context of kleisli monads.
|
||||
This thesis formalizes some of the core concepts from category theory
|
||||
including: categories, functors, products, exponentials, Cartesian
|
||||
closed categories, natural transformations, the Yoneda embedding,
|
||||
monads and more. Category theory is an interesting case study for the
|
||||
application of cubical Agda for two reasons in particular: Because
|
||||
category theory is the study of abstract algebra of functions, meaning
|
||||
that functional extensionality is particularly relevant. Another
|
||||
reason is that in category theory it is commonplace to identify
|
||||
isomorphic structures. Univalence allows for making this notion
|
||||
precise. This thesis also demonstrated another technique that is
|
||||
common in category theory; namely to define categories to prove
|
||||
properties of other structures. Specifically a category was defined
|
||||
to demonstrate that any two product objects in a category are
|
||||
isomorphic. Furthermore the thesis showed two formulations of monads
|
||||
and proved that they indeed are equivalent: Namely monads in the
|
||||
monoidal- and Kleisli- form. The monoidal formulation is more typical
|
||||
of category-theoretic formulations and the Kleisli formulation will be
|
||||
more familiar to functional programmers. It would have been very
|
||||
difficult to make a similar proof with setoids, and the proof would be
|
||||
very difficult to read. In the formulation we also saw how paths can
|
||||
be used to extract functions. A path between two types induces an
|
||||
isomorphism between the two types. This e.g.\ permits developers to
|
||||
write a monad instance for a given type using the Kleisli formulation.
|
||||
By transporting along the path between the monoidal- and Kleisli-
|
||||
formulation one can reuse all the operations and results shown for
|
||||
monoidal monads in the context of Kleisli monads.
|
||||
%%
|
||||
%% problem with inductive type
|
||||
%% overcome with cubical
|
||||
|
|
371
doc/cubical.tex
371
doc/cubical.tex
|
@ -1,36 +1,36 @@
|
|||
\chapter{Cubical Agda}
|
||||
\section{Propositional equality}
|
||||
Judgmental equality in Agda is a feature of the type-system. Its something that
|
||||
can be checked automatically by the type-checker: In the example from the
|
||||
introduction $n + 0$ can be judged to be equal to $n$ simply by expanding the
|
||||
definition of $+$.
|
||||
Judgmental equality in Agda is a feature of the type system. It is
|
||||
something that can be checked automatically by the type checker: In
|
||||
the example from the introduction $n + 0$ can be judged to be equal to
|
||||
$n$ simply by expanding the definition of $+$.
|
||||
|
||||
On the other hand, propositional equality is something defined within the
|
||||
language itself. Propositional equality cannot be derived automatically. The
|
||||
normal definition of judgmental equality is an inductive data-type. Cubical Agda
|
||||
discards this type in favor of a new primitives that has certain computational
|
||||
properties exclusive to it.
|
||||
On the other hand, propositional equality is something defined within
|
||||
the language itself. Propositional equality cannot be derived
|
||||
automatically. The normal definition of judgmental equality is an
|
||||
inductive data type. Cubical Agda discards this type in favor of some
|
||||
new primitives.
|
||||
|
||||
Exceprts of the source code relevant to this section can be found in appendix
|
||||
\S\ref{sec:app-cubical}.
|
||||
Most of the source code related to this section is implemented in
|
||||
\cite{cubical-demo}; it can be browsed online as hyperlinked and syntax-
|
||||
highlighted HTML. The links can be found at the beginning of
|
||||
section \S\ref{ch:implementation}.
|
||||
|
||||
\subsection{The equality type}
|
||||
The usual notion of judgmental equality says that given a type $A \tp \MCU$ and
|
||||
two points of $A$; $a_0, a_1 \tp A$ we can form the type:
|
||||
The usual notion of judgmental equality says that given a type $A \tp
|
||||
\MCU$ and two points hereof $a_0, a_1 \tp A$ we can form the type:
|
||||
%
|
||||
\begin{align}
|
||||
a_0 \equiv a_1 \tp \MCU
|
||||
\end{align}
|
||||
%
|
||||
In Agda this is defined as an inductive data-type with the single constructor
|
||||
for any $a \tp A$:
|
||||
In Agda this is defined as an inductive data type with the single
|
||||
constructor $\refl$ that for any $a \tp A$ gives:
|
||||
%
|
||||
\begin{align}
|
||||
\refl \tp a \equiv a
|
||||
\end{align}
|
||||
%
|
||||
For any $a \tp A$.
|
||||
|
||||
There also exists a related notion of \emph{heterogeneous} equality which allows
|
||||
for equating points of different types. In this case given two types $A, B \tp
|
||||
\MCU$ and two points $a \tp A$, $b \tp B$ we can construct the type:
|
||||
|
@ -39,8 +39,8 @@ for equating points of different types. In this case given two types $A, B \tp
|
|||
a \cong b \tp \MCU
|
||||
\end{align}
|
||||
%
|
||||
This is likewise defined as an inductive data-type with a single constructors
|
||||
for any $a \tp A$:
|
||||
This likewise has the single constructor $\refl$ that for any $a \tp
|
||||
A$ gives:
|
||||
%
|
||||
\begin{align}
|
||||
\refl \tp a \cong a
|
||||
|
@ -53,107 +53,106 @@ heterogeneous paths respectively.
|
|||
Judgmental equality in Cubical Agda is encapsulated with the type:
|
||||
%
|
||||
\begin{equation}
|
||||
\Path \tp (P \tp I → \MCU) → P\ 0 → P\ 1 → \MCU
|
||||
\Path \tp (P \tp \I → \MCU) → P\ 0 → P\ 1 → \MCU
|
||||
\end{equation}
|
||||
%
|
||||
$I$ is a special data-type (\TODO{that also has special computational properties
|
||||
AFAIK}) called the index set. $I$ can be thought of simply as the interval on
|
||||
the real numbers from $0$ to $1$. $P$ is a family of types over the index set
|
||||
$I$. I will sometimes refer to $P$ as the ``path-space'' of some path $p \tp
|
||||
\Path\ P\ a\ b$. By this token $P\ 0$ then corresponds to the type at the
|
||||
left-endpoint and $P\ 1$ as the type at the right-endpoint. The type is called
|
||||
$\Path$ because it is connected with paths in homotopy theory. The intuition
|
||||
behind this is that $\Path$ describes paths in $\MCU$ -- i.e. between types. For
|
||||
a path $p$ for the point $p\ i$ the index $i$ describes how far along the path
|
||||
one has moved. An inhabitant of $\Path\ P\ a_0\ a_1$ is a (dependent-) function,
|
||||
$p$, from the index-space to the path-space:
|
||||
The special type $\I$ is called the index set. The index set can be
|
||||
thought of simply as the interval on the real numbers from $0$ to $1$
|
||||
(both inclusive). The family $P$ over $\I$ will be referred to as the
|
||||
\nomenindex{path space} given some path $p \tp \Path\ P\ a\ b$. By
|
||||
that token $P\ 0$ corresponds to the type at the left endpoint of $p$.
|
||||
Likewise $P\ 1$ is the type at the right endpoint. The type is called
|
||||
$\Path$ because the idea has roots in homotopy theory. The intuition
|
||||
is that $\Path$ describes\linebreak[1] paths in $\MCU$. I.e.\ paths
|
||||
between types. For a path $p$ the expression $p\ i$ can be thought of
|
||||
as a \emph{point} on this path. The index $i$ describes how far along
|
||||
the path one has moved. An inhabitant of $\Path\ P\ a_0\ a_1$ is a
|
||||
(dependent) function from the index set to the path space:
|
||||
%
|
||||
$$
|
||||
p \tp \prod_{i \tp I} P\ i
|
||||
p \tp \prod_{i \tp \I} P\ i
|
||||
$$
|
||||
%
|
||||
Which must satisfy being judgmentally equal to $a_0$ (respectively $a_1$) at the
|
||||
endpoints. I.e.:
|
||||
Which must satisfy being judgmentally equal to $a_0$ at the
|
||||
left endpoint and equal to $a_1$ at the other end. I.e.:
|
||||
%
|
||||
\begin{align*}
|
||||
p\ 0 & = a_0 \\
|
||||
p\ 1 & = a_1
|
||||
\end{align*}
|
||||
%
|
||||
The notion of ``homogeneous equalities'' is recovered when $P$ does not depend
|
||||
on its argument:
|
||||
The notion of \nomenindex{homogeneous equalities} is recovered when $P$ does not
|
||||
depend on its argument. That is for $A \tp \MCU$ and $a_0, a_1 \tp A$ the
|
||||
homogenous equality between $a_0$ and $a_1$ is the type:
|
||||
%
|
||||
$$
|
||||
a_0 \equiv a_1 \defeq \Path\ (\lambda i \to A)\ a_0\ a_1
|
||||
a_0 \equiv a_1 \defeq \Path\ (\lambda\;i \to A)\ a_0\ a_1
|
||||
$$
|
||||
%
|
||||
For $A \tp \MCU$, $a_0, a_1 \tp A$. I will generally prefer to use the notation
|
||||
$a \equiv b$ when talking about non-dependent paths and use the notation
|
||||
$\Path\ (\lambda i \to P\ i)\ a\ b$ when the path-space is of particular
|
||||
interest.
|
||||
I will generally prefer to use the notation $a \equiv b$ when talking
|
||||
about non-dependent paths and use the notation $\Path\ (\lambda\; i
|
||||
\to P\ i)\ a\ b$ when the path space is of particular interest.
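In Agda syntax the non-dependent case can be expressed as a special
case of the path type. The following is only a sketch; universe levels
are omitted and the primitive is assumed to be available under the
name \texttt{Path} as above:
%
\begin{verbatim}
-- Homogeneous equality as a path in a constant family.
_≡_ : {A : Set} → A → A → Set
_≡_ {A = A} a₀ a₁ = Path (λ _ → A) a₀ a₁
\end{verbatim}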
|
||||
|
||||
With this definition we can also recover reflexivity. That is, for any $A \tp
|
||||
\MCU$ and $a \tp A$:
|
||||
%
|
||||
\begin{equation}
|
||||
\begin{aligned}
|
||||
\refl & \tp \Path (\lambda i \to A)\ a\ a \\
|
||||
\refl & \defeq \lambda i \to a
|
||||
\refl & \tp a \equiv a \\
|
||||
\refl & \defeq \lambda\; i \to a
|
||||
\end{aligned}
|
||||
\end{equation}
|
||||
%
|
||||
Here the path-space is $P \defeq \lambda i \to A$ and it satsifies $P\ i = A$
|
||||
definitionally. So to inhabit it, is to give a path $I \to A$ which is
|
||||
judgmentally $a$ at either endpoint. This is satisfied by the constant path;
|
||||
i.e. the path that stays at $a$ at any index $i$.
|
||||
Here the path space is $P \defeq \lambda\; i \to A$ and it satisfies
|
||||
$P\ i = A$ definitionally. So to inhabit it, is to give a path $\I \to
|
||||
A$ which is judgmentally $a$ at either endpoint. This is satisfied by
|
||||
the constant path; i.e.\ the path that is constantly $a$ at any index
|
||||
$i \tp \I$.
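As a sketch in Agda syntax (under the same assumptions as above) the
constant path is simply a lambda abstraction that ignores the interval
variable:
%
\begin{verbatim}
-- Reflexivity: the path that is constantly a.
refl : {A : Set} {a : A} → a ≡ a
refl {a = a} = λ i → a
\end{verbatim}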
|
||||
|
||||
It is also surpisingly easy to show functional extensionality with which we can
|
||||
construct a path between $f$ and $g$ -- the function defined in the introduction
|
||||
(section \S\ref{sec:functional-extensionality}).
|
||||
%% module _ {ℓa ℓb} {A : Set ℓa} {B : A → Set ℓb} where
|
||||
%% funExt : {f g : (x : A) → B x} → ((x : A) → f x ≡ g x) → f ≡ g
|
||||
Functional extensionality is the proposition, given a type $A \tp \MCU$, a
|
||||
family of types $B \tp A \to \MCU$ and functions $f, g \tp \prod_{a \tp A}
|
||||
B\ a$:
|
||||
It is also surprisingly easy to show functional extensionality.
|
||||
Functional extensionality is the proposition that given a type $A \tp
|
||||
\MCU$, a family of types $B \tp A \to \MCU$ and functions $f, g \tp
|
||||
\prod_{a \tp A} B\ a$ gives:
|
||||
%
|
||||
\begin{equation}
|
||||
\label{eq:funExt}
|
||||
\funExt \tp \prod_{a \tp A} f\ a \equiv g\ a \to f \equiv g
|
||||
\funExt \tp \left(\prod_{a \tp A} f\ a \equiv g\ a \right) \to f \equiv g
|
||||
\end{equation}
|
||||
%
|
||||
%% p = λ i a → p a i
|
||||
So given $p \tp \prod_{a \tp A} f\ a \equiv g\ a$ we must give a path $f \equiv
|
||||
g$. That is a function $I \to \prod_{a \tp A} B\ a$. So let $i \tp I$ be given.
|
||||
%% p = λ\; i a → p a i
|
||||
So given $η \tp \prod_{a \tp A} f\ a \equiv g\ a$ we must give a path $f \equiv
|
||||
g$. That is a function $\I \to \prod_{a \tp A} B\ a$. So let $i \tp \I$ be given.
|
||||
We must now give an expression $\phi \tp \prod_{a \tp A} B\ a$ satisfying
|
||||
$\phi\ 0 \equiv f\ a$ and $\phi\ 1 \equiv g\ a$. This necessitates that the
|
||||
expression must be a lambda-abstraction, so let $a \tp A$ be given. Now we can
|
||||
apply $a$ to $p$ and get the path $p\ a \tp f\ a \equiv g\ a$. And this exactly
|
||||
satisfied the conditions for $\phi$. In conclustion \ref{eq:funExt} is inhabited
|
||||
apply $η$ to $a$ and get the path $η\ a \tp f\ a \equiv g\ a$. And this exactly
|
||||
satisfies the conditions for $\phi$. In conclusion \ref{eq:funExt} is inhabited
|
||||
by the term:
|
||||
%
|
||||
\begin{equation}
|
||||
\label{eq:funExt}
|
||||
\funExt\ p \defeq λ i\ a → p\ a\ i
|
||||
\end{equation}
|
||||
\begin{equation*}
|
||||
\funExt\ η \defeq λ\; i\ a → η\ a\ i
|
||||
\end{equation*}
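Spelled out in Agda syntax the construction is just a swap of the two
abstractions. This is only a sketch: universe levels are omitted and
the path primitives from above are assumed.
%
\begin{verbatim}
module _ {A : Set} {B : A → Set} {f g : (x : A) → B x} where
  -- Functional extensionality: given pointwise paths, swap the order
  -- of the interval argument and the function argument.
  funExt : ((x : A) → f x ≡ g x) → f ≡ g
  funExt η = λ i x → η x i
\end{verbatim}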
|
||||
%
|
||||
With this we can now prove the desired equality $f \equiv g$ from section
|
||||
\S\ref{sec:functional-extensionality}:
|
||||
With $\funExt$ in place we can now construct a path between
|
||||
$\var{zeroLeft}$ and $\var{zeroRight}$ -- the functions defined in the
|
||||
introduction \S\ref{sec:functional-extensionality}:
|
||||
%
|
||||
\begin{align*}
|
||||
p & \tp f \equiv g \\
|
||||
p & \defeq \funExt\ \lambda n \to \refl
|
||||
p & \tp \var{zeroLeft} \equiv \var{zeroRight} \\
|
||||
p & \defeq \funExt\ \var{zrn}
|
||||
\end{align*}
|
||||
%
|
||||
Paths have some other important properties, but they are not the focus of
|
||||
this thesis. \TODO{Refer the reader somewhere for more info.}
|
||||
Here $\var{zrn}$ is the proof from \ref{eq:zrn}.
|
||||
%
|
||||
\section{Homotopy levels}
|
||||
In ITT all equality proofs are identical (in a closed context). This means that,
|
||||
in some sense, any two inhabitants of $a \equiv b$ are ``equally good'' -- they
|
||||
do not have any interesting structure. This is referred to as Uniqueness of
|
||||
Identity Proofs (UIP). Unfortunately it is not possible to have a type-theory
|
||||
with both univalence and UIP. In stead we have a hierarchy of types with an
|
||||
increasing amount of homotopic structure. At the bottom of this hierarchy we
|
||||
have the set of contractible types:
|
||||
In ITT all equality proofs are identical (in a closed context). This
|
||||
means that, in some sense, any two inhabitants of $a \equiv b$ are
|
||||
``equally good''. They do not have any interesting structure. This is
|
||||
referred to as Uniqueness of Identity Proofs (UIP). Unfortunately it
|
||||
is not possible to have a type theory with both univalence and UIP.
|
||||
Instead, in cubical Agda we have a hierarchy of types with an increasing
|
||||
amount of homotopic structure. At the bottom of this hierarchy is the
|
||||
set of contractible types:
|
||||
%
|
||||
\begin{equation}
|
||||
\begin{aligned}
|
||||
|
@ -165,11 +164,12 @@ have the set of contractible types:
|
|||
\end{equation}
|
||||
%
|
||||
The first component of $\isContr\ A$ is called ``the center of contraction''.
|
||||
Under the propositions-as-types interpretation of type-theory $\isContr\ A$ can
|
||||
Under the propositions-as-types interpretation of type theory $\isContr\ A$ can
|
||||
be thought of as ``the true proposition $A$''. And indeed $\top$ is
|
||||
contractible:
|
||||
%
|
||||
\begin{equation*}
|
||||
\var{tt} , \lambda x \to \refl \tp \isContr\ \top
|
||||
(\var{tt} , \lambda\; x \to \refl) \tp \isContr\ \top
|
||||
\end{equation*}
|
||||
%
|
||||
It is a theorem that if a type is contractible, then it is isomorphic to the
|
||||
|
@ -184,23 +184,23 @@ The next step in the hierarchy is the set of mere propositions:
|
|||
\end{aligned}
|
||||
\end{equation}
|
||||
%
|
||||
$\isProp\ A$ can be thought of as the set of true and false propositions. And
|
||||
One can think of $\isProp\ A$ as the set of true and false propositions. And
|
||||
indeed both $\top$ and $\bot$ are propositions:
|
||||
%
|
||||
\begin{align*}
|
||||
λ \var{tt}\ \var{tt} → refl & \tp \isProp\ ⊤ \\
|
||||
λ\varnothing\ \varnothing & \tp \isProp\ ⊥
|
||||
(λ\; \var{tt}, \var{tt} → refl) & \tp \isProp\ ⊤ \\
|
||||
λ\;\varnothing\ \varnothing & \tp \isProp\ ⊥
|
||||
\end{align*}
|
||||
%
|
||||
$\varnothing$ is used here to denote an impossible pattern. It is a theorem that
|
||||
if a mere proposition $A$ is inhabited, then so is it contractible. If it is not
|
||||
inhabited it is equivalent to the empty-type (or false
|
||||
proposition).\TODO{Cite!!}
|
||||
The term $\varnothing$ is used here to denote an impossible pattern. It is a
|
||||
theorem that if a mere proposition $A$ is inhabited, then it is also contractible.
|
||||
If it is not inhabited it is equivalent to the empty-type (or false
|
||||
proposition).
|
||||
|
||||
I will refer to a type $A \tp \MCU$ as a \emph{mere} proposition if I want to
|
||||
I will refer to a type $A \tp \MCU$ as a \emph{mere proposition} if I want to
|
||||
stress that we have $\isProp\ A$.
|
||||
|
||||
Then comes the set of homotopical sets:
|
||||
The next step in the hierarchy is the set of homotopical sets:
|
||||
%
|
||||
\begin{equation}
|
||||
\begin{aligned}
|
||||
|
@ -209,13 +209,14 @@ Then comes the set of homotopical sets:
|
|||
\end{aligned}
|
||||
\end{equation}
|
||||
%
|
||||
I will not give an example of a set at this point. It turns out that proving
|
||||
e.g. $\isProp\ \bN$ is not so straight-forward (see \cite[\S3.1.4]{hott-2013}).
|
||||
There will be examples of sets later in this report. At this point it should be
|
||||
noted that the term ``set'' is somewhat conflated; there is the notion of sets
|
||||
from set-theory, in Agda types are denoted \texttt{Set}. I will use it
|
||||
consistently to refer to a type $A$ as a set exactly if $\isSet\ A$ is a
|
||||
proposition.
|
||||
I will not give an example of a set at this point. It turns out that
|
||||
proving e.g.\ $\isProp\ \bN$ directly is not so straightforward (see
|
||||
\cite[\S3.1.4]{hott-2013}). Hedberg's theorem states that any type
|
||||
with decidable equality is a set. There will be examples of sets later
|
||||
in this report. At this point it should be noted that the term ``set''
|
||||
is somewhat conflated; there is the notion of sets from set theory, and in
|
||||
Agda types are denoted \texttt{Set}. I will use it consistently to
|
||||
refer to a type $A$ as a set exactly if $\isSet\ A$ is inhabited.
|
||||
|
||||
As the reader may have guessed the next step in the hierarchy is the type:
|
||||
%
|
||||
|
@ -228,8 +229,9 @@ As the reader may have guessed the next step in the hierarchy is the type:
|
|||
%
|
||||
And so it continues. In fact we can generalize this family of types by indexing
|
||||
them with a natural number. For historical reasons, though, the bottom of the
|
||||
hierarchy, the contractible types, is said to be a \nomen{-2-type}, propositions
|
||||
are \nomen{-1-types}, (homotopical) sets are \nomen{0-types} and so on\ldots
|
||||
hierarchy, the contractible types, is said to be a \nomen{-2-type}{homotopy
|
||||
levels}, propositions are \nomen{-1-types}{homotopy levels}, (homotopical)
|
||||
sets are \nomen{0-types}{homotopy levels} and so on\ldots
|
||||
|
||||
Just as with paths, homotopical sets are not at the center of focus for this
|
||||
thesis. But I mention here some properties that will be relevant for this
|
||||
|
@ -238,90 +240,171 @@ exposition:
|
|||
Proposition: Homotopy levels are cumulative. That is, if $A \tp \MCU$ has
|
||||
homotopy level $n$ then it also has level $n + 1$.
|
||||
|
||||
Let $\left\Vert A \right\Vert = n$ denote that the level of $A$ is $n$.
|
||||
Proposition: For any homotopic level $n$ this is a mere proposition.
|
||||
For any level $n$ it is the case that to be of level $n$ is a mere proposition.
|
||||
%
|
||||
\section{A few lemmas}
|
||||
Rather than getting into the nitty-gritty details of Agda I venture to take a
|
||||
more ``combinator-based'' approach. That is, I will use theorems about paths
|
||||
already that have already been formalized. Specifically the results come from
|
||||
the Agda library \texttt{cubical} (\TODO{Cite}). I have used a handful of
|
||||
results from this library as well as contributed a few lemmas myself.\footnote{The module \texttt{Cat.Prelude} lists the upstream dependencies. As well my contribution to \texttt{cubical} can be found in the git logs \TODO{Cite}.}
|
||||
Rather than getting into the nitty-gritty details of Agda I venture to
|
||||
take a more ``combinator-based'' approach. That is I will use
|
||||
theorems about paths that have already been formalized.
|
||||
Specifically the results come from the Agda library \texttt{cubical}
|
||||
(\cite{cubical-demo}). I have used a handful of results from this
|
||||
library as well as contributed a few lemmas myself%
|
||||
\footnote{The module \texttt{Cat.Prelude} lists the upstream
|
||||
dependencies. Additionally, my contribution to \texttt{cubical} can be
|
||||
found in the git logs which are available at
|
||||
\hrefsymb{https://github.com/Saizan/cubical-demo}{\texttt{https://github.com/Saizan/cubical-demo}}.
|
||||
}.
|
||||
|
||||
These theorems are all purely related to homotopy theory and cubical Agda and as
|
||||
such not specific to the formalization of Category Theory. I will present a few
|
||||
of these theorems here, as they will be used later in chapter
|
||||
\ref{ch:implementation} throughout.
|
||||
These theorems are all purely related to homotopy type theory and as
|
||||
such not specific to the formalization of Category Theory. I will
|
||||
present a few of these theorems here as they will be used throughout
|
||||
chapter \ref{ch:implementation}. They should also give the reader some
|
||||
intuition about the path type.
|
||||
|
||||
\subsection{Path induction}
|
||||
\label{sec:pathJ}
|
||||
The induction principle for paths intuitively gives us a way to reason about a
|
||||
type-family indexed by a path by only considering if said path is $\refl$ (the
|
||||
``base-case''). For \emph{based path induction}, that equality is \emph{based}
|
||||
at some element $a \tp A$.
|
||||
The induction principle for paths intuitively gives us a way to reason
|
||||
about a type family indexed by a path by only considering if said path
|
||||
is $\refl$ (the \nomen{base case}{path induction}). For \emph{based
|
||||
path induction}, that equality is \emph{based} at some element $a
|
||||
\tp A$.
|
||||
|
||||
Let a type $A \tp \MCU$ and an element of the type $a \tp A$ be given. $a$ is said to be the base of the induction. Given a family of types:
|
||||
\pagebreak[3]
|
||||
\begin{samepage}
|
||||
Let a type $A \tp \MCU$ and an element of the type $a \tp A$ be
|
||||
given. $a$ is said to be the base of the induction.\linebreak[3] Given
|
||||
a family of types:
|
||||
%
|
||||
$$
|
||||
P \tp \prod_{a' \tp A} \prod_{p \tp a ≡ a'} \MCU
|
||||
D \tp \prod_{b \tp A} \prod_{p \tp a ≡ b} \MCU
|
||||
$$
|
||||
%
|
||||
And an inhabitant of $P$ at $\refl$:
|
||||
And an inhabitant of $D$ at $\refl$:
|
||||
%
|
||||
$$
|
||||
p \tp P\ a\ \refl
|
||||
d \tp D\ a\ \refl
|
||||
$$
|
||||
%
|
||||
We have the function:
|
||||
%
|
||||
\begin{equation}
|
||||
\pathJ\ P\ p \tp \prod_{a' \tp A} \prod_{p \tp a ≡ a'} P\ a\ p
|
||||
\pathJ\ D\ d \tp \prod_{b \tp A} \prod_{p \tp a ≡ b} D\ b\ p
|
||||
\end{equation}
|
||||
\end{samepage}%
|
||||
|
||||
A simple application of $\pathJ$ is for proving that $\var{sym}$ is an
|
||||
involution. Namely for any set $A \tp \MCU$, points $a, b \tp A$ and a path
|
||||
between them $p \tp a \equiv b$:
|
||||
%
|
||||
\begin{equation}
|
||||
\label{eq:sym-invol}
|
||||
\var{sym}\ (\var{sym}\ p) ≡ p
|
||||
\end{equation}
|
||||
%
|
||||
I will not give an example of using $\pathJ$ here. An application can be found
|
||||
later in \ref{eq:pathJ-example}.
|
||||
The proof will be by induction on $p$ and will be based at $a$. That
|
||||
is $D$ will be the family:
|
||||
%
|
||||
\begin{align*}
|
||||
D & \tp \prod_{b' \tp A} \prod_{p \tp a ≡ b'} \MCU \\
|
||||
D\ b'\ p' & \defeq \var{sym}\ (\var{sym}\ p') ≡ p'
|
||||
\end{align*}
|
||||
%
|
||||
The base case will then be:
|
||||
%
|
||||
\begin{align*}
|
||||
d & \tp \var{sym}\ (\var{sym}\ \refl) ≡ \refl \\
|
||||
d & \defeq \refl
|
||||
\end{align*}
|
||||
%
|
||||
The reason $\refl$ proves this is that $\var{sym}\ \refl = \refl$ holds
|
||||
definitionally. In summary \ref{eq:sym-invol} is inhabited by the term:
|
||||
%
|
||||
\begin{align*}
|
||||
\pathJ\ D\ d\ b\ p
|
||||
\tp
|
||||
\var{sym}\ (\var{sym}\ p) ≡ p
|
||||
\end{align*}
|
||||
%
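A hypothetical Agda rendering of this proof is sketched below. It
assumes that $\pathJ$ takes its arguments in the order given above and
that $\var{sym}\ \refl$ reduces to $\refl$; the names are illustrative
and may differ from the library.
%
\begin{verbatim}
module _ {A : Set} {a : A} where
  symInvolution : {b : A} (p : a ≡ b) → sym (sym p) ≡ p
  symInvolution {b} p = pathJ D d b p
    where
    -- The family we do induction over, based at a.
    D : (b' : A) → a ≡ b' → Set
    D b' p' = sym (sym p') ≡ p'
    -- Base case: sym (sym refl) reduces to refl definitionally.
    d : D a refl
    d = refl
\end{verbatim}
%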
|
||||
Another application of $\pathJ$ is for proving associativity of $\trans$. That
|
||||
is, given a type $A \tp \MCU$, elements of $A$, $a, b, c, d \tp A$ and paths
|
||||
between them $p \tp a \equiv b$, $q \tp b \equiv c$ and $r \tp c \equiv d$ we
|
||||
have the following:
|
||||
%
|
||||
\begin{equation}
|
||||
\label{eq:cum-trans}
|
||||
\trans\ p\ (\trans\ q\ r) ≡ \trans\ (\trans\ p\ q)\ r
|
||||
\end{equation}
|
||||
%
|
||||
In this case the induction will be based at $c$ (the left-endpoint of $r$) and
|
||||
over the family:
|
||||
%
|
||||
\begin{align*}
|
||||
T & \tp \prod_{d' \tp A} \prod_{r' \tp c ≡ d'} \MCU \\
|
||||
T\ d'\ r' & \defeq \trans\ p\ (\trans\ q\ r') ≡ \trans\ (\trans\ p\ q)\ r'
|
||||
\end{align*}
|
||||
%
|
||||
So the base case is proven with $t$ which is defined as:
|
||||
%
|
||||
\begin{align*}
|
||||
\trans\ p\ (\trans\ q\ \refl) & ≡
|
||||
\trans\ p\ q \\
|
||||
& ≡
|
||||
\trans\ (\trans\ p\ q)\ \refl
|
||||
\end{align*}
|
||||
%
|
||||
Here we have used the proposition $\trans\ p\ \refl \equiv p$ without proof. In
|
||||
conclusion \ref{eq:cum-trans} is inhabited by the term:
|
||||
%
|
||||
\begin{align*}
|
||||
\pathJ\ T\ t\ d\ r
|
||||
\end{align*}
|
||||
%
|
||||
We shall see another application of path induction in \ref{eq:pathJ-example}.
|
||||
|
||||
\subsection{Paths over propositions}
|
||||
\label{sec:lemPropF}
|
||||
Another very useful combinator is $\lemPropF$:
|
||||
|
||||
To `promote' this to a dependent path we can use another useful combinator;
|
||||
$\lemPropF$. Given a type $A \tp \MCU$ and a type family on $A$; $P \tp A \to
|
||||
\MCU$. Let $\var{propP} \tp \prod_{x \tp A} \isProp\ (P\ x)$ be the proof that
|
||||
$P$ is a mere proposition for all elements of $A$. Furthermore say we have a
|
||||
path between some two elements in $A$; $p \tp a_0 \equiv a_1$ then we can built
|
||||
a heterogeneous path between any two elements of $p_0 \tp P\ a_0$ and $p_1 \tp
|
||||
P\ a_1$:
|
||||
Another very useful combinator is $\lemPropF$: Given a type $A \tp
|
||||
\MCU$ and a type family on $A$; $D \tp A \to \MCU$. Let $\var{propD}
|
||||
\tp \prod_{x \tp A} \isProp\ (D\ x)$ be the proof that $D$ is a mere
|
||||
proposition for all elements of $A$. Furthermore say we have a path
|
||||
between two elements of $A$; $p \tp a_0 \equiv a_1$, then we can
|
||||
build a heterogeneous path between any two elements $d_0 \tp
|
||||
D\ a_0$ and $d_1 \tp D\ a_1$.
|
||||
%
|
||||
$$
|
||||
\lemPropF\ \var{propP}\ p \defeq \Path\ (\lambda\; i \mto P\ (p\ i))\ p_0\ p_1
|
||||
\lemPropF\ \var{propD}\ p \tp \Path\ (\lambda\; i \mto D\ (p\ i))\ d_0\ d_1
|
||||
$$
|
||||
%
|
||||
This is quite a mouthful. So let me try to show how this is a very general and
|
||||
useful result.
|
||||
Note that $d_0$ and $d_1$, though points of the same family, have
|
||||
different types. This is quite a mouthful. So let me try to show how
|
||||
this is a very general and useful result.
|
||||
|
||||
Often when proving equalities between elements of some dependent types
|
||||
$\lemPropF$ can be used to boil this complexity down to showing that the
|
||||
dependent parts of the type are mere propositions. For instance, saw we have a type:
|
||||
$\lemPropF$ can be used to boil this complexity down to showing that
|
||||
the dependent parts of the type are mere propositions. For instance
|
||||
say we have a type:
|
||||
%
|
||||
$$
|
||||
T \defeq \sum_{a \tp A} P\ a
|
||||
T \defeq \sum_{a \tp A} D\ a
|
||||
$$
|
||||
%
|
||||
For some proposition $P \tp A \to \MCU$. If we want to prove $t_0 \equiv t_1$
|
||||
for two elements $t_0, t_1 \tp T$ then this will be a pair of paths:
|
||||
For some proposition $D \tp A \to \MCU$. That is we have $\var{propD}
|
||||
\tp \prod_{a \tp A} \isProp\ (D\ a)$. If we want to prove $t_0 \equiv
|
||||
t_1$ for two elements $t_0, t_1 \tp T$ then this will be a pair of
|
||||
paths:
|
||||
%
|
||||
%
|
||||
\begin{align*}
|
||||
p \tp & \fst\ t_0 \equiv \fst\ t_1 \\
|
||||
& \Path\ (\lambda i \to P\ (p\ i))\ \snd\ t_0 \equiv \snd\ t_1
|
||||
& \Path\ (\lambda\; i \to D\ (p\ i))\ (\snd\ t_0)\ (\snd\ t_1)
|
||||
\end{align*}
|
||||
%
|
||||
Here $\lemPropF$ directly allow us to prove the latter of these:
|
||||
Here $\lemPropF$ directly allows us to prove the latter of these given
|
||||
that we have already provided $p$.
|
||||
%
|
||||
$$
|
||||
\lemPropF\ \var{propP}\ p
|
||||
\tp \Path\ (\lambda i \to P\ (p\ i))\ \snd\ t_0 \equiv \snd\ t_1
|
||||
\lemPropF\ \var{propD}\ p
|
||||
\tp \Path\ (\lambda\; i \to D\ (p\ i))\ (\snd\ t_0)\ (\snd\ t_1)
|
||||
$$
|
||||
%
|
||||
\subsection{Functions over propositions}
|
||||
|
@ -335,9 +418,9 @@ $$
|
|||
\subsection{Pairs over propositions}
|
||||
\label{sec:propSig}
|
||||
%
|
||||
$\sum$-types preserve propositionality whenever its first component is a
|
||||
proposition, and its second component is a proposition for all points of in the
|
||||
left type.
|
||||
$\sum$-types preserve propositionality whenever the first component is
|
||||
a proposition, and the second component is a proposition for all
|
||||
points of the left type.
|
||||
%
|
||||
$$
|
||||
\mathit{propSig} \tp \isProp\ A \to \left(\prod_{a \tp A} \isProp\ (P\ a)\right) \to \isProp\ \left(\sum_{a \tp A} P\ a\right)
|
||||
|
|
|
@ -1,74 +1,139 @@
|
|||
\chapter{Perspectives}
|
||||
\section{Discussion}
|
||||
In the previous chapter the practical aspects of proving things in Cubical Agda
|
||||
were highlighted. I also demonstrated the usefulness of separating ``laws'' from
|
||||
``data''. One of the reasons for this is that dependencies within types can lead
|
||||
to very complicated goals. One technique for alleviating this was to prove that
|
||||
certain types are mere propositions.
|
||||
In the previous chapter the practical aspects of proving things in
|
||||
Cubical Agda were highlighted. I also demonstrated the usefulness of
|
||||
separating ``laws'' from ``data''. One of the reasons for this is that
|
||||
dependencies within types can lead to very complicated goals. One
|
||||
technique for alleviating this was to prove that certain types are
|
||||
mere propositions.
|
||||
|
||||
\subsection{Computational properties}
|
||||
Another aspect (\TODO{That I actually did not highlight very well in the
|
||||
previous chapter}) is the computational nature of paths. Say we have
|
||||
formalized this common result about monads:
|
||||
The new contribution of cubical Agda is that it has a constructive
|
||||
proof of functional extensionality\index{functional extensionality}
|
||||
and univalence\index{univalence}. This means that in particular that
|
||||
the type checker can reduce terms defined with these theorems. So one
|
||||
interesting result of this development is how much this influenced the
|
||||
development. In particular having a functional extensionality that
|
||||
``computes'' should simplify some proofs.
|
||||
|
||||
\TODO{Some equation\ldots}
|
||||
I have tested this by using a feature of Agda where one can mark
|
||||
certain bindings as being \emph{abstract}. This means that the
|
||||
type-checker will not try to reduce that term further during type
|
||||
checking. I tried making univalence and functional extensionality
|
||||
abstract. It turns out that the conversion behaviour of univalence is
|
||||
not used anywhere. For functional extensionality there are two places
|
||||
in the whole solution where the reduction behaviour is used to
|
||||
simplify some proofs. This is in showing that the maps between the
|
||||
two formulations of monads are inverses. See the notes in this
|
||||
module:
|
||||
%
|
||||
\begin{center}
|
||||
\sourcelink{Cat.Category.Monad.Voevodsky}
|
||||
\end{center}
|
||||
%
|
||||
|
||||
By transporting this to the Kleisli formulation we get a result that we can use
|
||||
to compute with. This is particularly useful because the Kleisli formulation
|
||||
will be more familiar to programmers e.g. those coming from a background in
|
||||
Haskell. Whereas the theory usually talks about monoidal monads.
|
||||
I will not reproduce it in full here as the type is quite involved.
|
||||
Instead I have put this in a source listing in \ref{app:abstract-funext}.
|
||||
The method used to find in which places the computational behaviour of
|
||||
these proofs is needed has the caveat of only working for places that
|
||||
directly or transitively use these two proofs. Fortunately though
|
||||
the code is structured in such a way that this is the case. So in
|
||||
conclusion the way I have structured these proofs means that the
|
||||
computational behaviour of functional extensionality and univalence
|
||||
has not been so relevant.
|
||||
|
||||
\TODO{Mention that with postulates we cannot do this}
|
||||
Barring this the computational behaviour of paths can still be useful.
|
||||
E.g.\ if a programmer wants to reuse functions that operate on
|
||||
monoidal monads to work with a monad in the Kleisli form that the
|
||||
programmer has specified. To make this idea concrete, say we are
|
||||
given some function $f \tp \Kleisli \to T$. A path $p
|
||||
\tp \Monoidal \equiv \Kleisli$ induces a map $\coe\ p \tp \Monoidal
|
||||
\to \Kleisli$. We can compose $f$ with this map to get $f \comp
|
||||
\coe\ p \tp \Monoidal \to T$. Of course, since that map was
|
||||
constructed with an isomorphism these maps already exist and could be
|
||||
used directly. So this is arguably only interesting when one also
|
||||
wants to prove properties of applying such functions.
|
||||
|
||||
\subsection{Reusability of proofs}
|
||||
The previous example also illustrate how univalence unifies two otherwise
|
||||
disparate areas: The category-theoretic study of monads; and monads as in
|
||||
functional programming. Univalence thus allows one to reuse proofs. You could
|
||||
say that univalence gives the developer two proofs for the price of one.
|
||||
The previous example illustrates how univalence unifies two otherwise
|
||||
disparate areas: The category-theoretic study of monads; and monads as
|
||||
in functional programming. Univalence thus allows one to reuse proofs.
|
||||
You could say that univalence gives the developer two proofs for the
|
||||
price of one. As an illustration of this I proved that monads are
|
||||
groupoids. I initially proved this for the Kleisli
|
||||
formulation\footnote{Actually doing this directly turned out to be
|
||||
tricky as well, so I defined an equivalent formulation which was not
|
||||
formulated with a record, but purely with $\sum$-types.}. Since the
|
||||
two formulations are equal under univalence, substitution directly
|
||||
gives us that this also holds for the monoidal formulation. This of
|
||||
course generalizes to any family $P \tp 𝒰 → 𝒰$ where $P$ is inhabited
|
||||
at either formulation (i.e.\ either $P\ \Monoidal$ or $P\ \Kleisli$
|
||||
holds).
|
||||
|
||||
The introduction (section \S\ref{sec:context}) mentioned an often
|
||||
employed-technique for enabling extensional equalities is to use the
|
||||
setoid-interpretation. Nowhere in this formalization has this been necessary,
|
||||
$\Path$ has been used globally in the project as propositional equality. One
|
||||
interesting place where this becomes apparent is in interfacing with the Agda
|
||||
standard library. Multiple definitions in the Agda standard library have been
|
||||
designed with the setoid-interpretation in mind. E.g. the notion of ``unique
|
||||
existential'' is indexed by a relation that should play the role of
|
||||
propositional equality. Likewise for equivalence relations, they are indexed,
|
||||
not only by the actual equivalence relation, but also by another relation that
|
||||
serve as propositional equality.
|
||||
%% Unfortunately we cannot use the definition of equivalences found in the
|
||||
%% standard library to do equational reasoning directly. The reason for this is
|
||||
%% that the equivalence relation defined there must be a homogenous relation,
|
||||
%% but paths are heterogeneous relations.
|
||||
The introduction (section \S\ref{sec:context}) mentioned that a
|
||||
typical way of getting access to functional extensionality is to work
|
||||
with setoids. Nowhere in this formalization has this been necessary,
|
||||
$\Path$ has been used globally in the project for propositional
|
||||
equality. One interesting place where this becomes apparent is in
|
||||
interfacing with the Agda standard library. Multiple definitions in
|
||||
the Agda standard library have been designed with the
|
||||
setoid-interpretation in mind. E.g.\ the notion of \emph{unique
|
||||
existential} is indexed by a relation that should play the role of
|
||||
propositional equality. Equivalence relations are likewise indexed,
|
||||
not only by the actual equivalence relation but also by another
|
||||
relation that serves as propositional equality.
|
||||
%% Unfortunately we cannot use the definition of equivalences found in
|
||||
%% the standard library to do equational reasoning directly. The
|
||||
%% reason for this is that the equivalence relation defined there must
|
||||
%% be a homogenous relation, but paths are heterogeneous relations.
|
||||
|
||||
In the formalization at present a significant amount of energy has been put
|
||||
towards proving things that would not have been needed in classical Agda. The
|
||||
proofs that some given type is a proposition were provided as a strategy to
|
||||
simplify some otherwise very complicated proofs (e.g.
|
||||
\ref{eq:proof-prop-IsPreCategory} and \label{eq:productPath}). Often these
|
||||
proofs would not be this complicated. If the J-rule holds definitionally the
|
||||
proof-assistant can help simplify these goals considerably. The lack of the
|
||||
J-rule has a significant impact on the complexity of these kinds of proofs.
|
||||
In the formalization at present a significant amount of energy has
|
||||
been put towards proving things that would not have been needed in
|
||||
classical Agda. The proofs that some given type is a proposition were
|
||||
provided as a strategy to simplify some otherwise very complicated
|
||||
proofs (e.g.\ \ref{eq:proof-prop-IsPreCategory}
|
||||
and \ref{eq:productPath}). Often these proofs would not be this
|
||||
complicated. If the J-rule holds definitionally the proof-assistant
|
||||
can help simplify these goals considerably. The lack of the J-rule has
|
||||
a significant impact on the complexity of these kinds of proofs.
|
||||
|
||||
\TODO{Universe levels.}
|
||||
\subsection{Motifs}
|
||||
An oft-used technique in this development is using based path
|
||||
induction to prove certain properties. One particular challenge that
|
||||
arises when doing so is that Agda is not able to automatically infer
|
||||
the family that one wants to do induction over. For instance in the
|
||||
proof $\var{sym}\ (\var{sym}\ p) ≡ p$ from \ref{eq:sym-invol} the
|
||||
family that we chose to do induction over was $D\ b'\ p' \defeq
|
||||
\var{sym}\ (\var{sym}\ p') ≡ p'$. However, if one interactively tries
|
||||
to give this hole, all the information that Agda can provide is that
|
||||
one must provide an element of $𝒰$. Agda could be more helpful in this
|
||||
context, perhaps even infer this family in some situations. In this
|
||||
very simple example this is of course not a big problem, but there are
|
||||
examples in the source code where this gets more involved.
|
||||
|
||||
\section{Future work}
|
||||
\subsection{Agda \texttt{Prop}}
|
||||
Jesper Cockx's work extending the universe-level laws for Agda and the
|
||||
\texttt{Prop}-type.
|
||||
|
||||
\subsection{Compiling Cubical Agda}
|
||||
\label{sec:compiling-cubical-agda}
|
||||
Compilation of program written in Cubical Agda is currently not supported. One
|
||||
issue here is that the backends does not provide an implementation for the
|
||||
cubical primitives (such as the path-type). This means that even though the
|
||||
path-type gives us a computational interpretation of functional extensionality,
|
||||
univalence, transport, etc., we do not have a way of actually using this to
|
||||
compile our programs that use these primitives. It would be interesting to see
|
||||
practical applications of this. The path between monads that this library
|
||||
exposes could provide one particularly interesting case-study.
|
||||
Compilation of programs written in Cubical Agda is currently not
|
||||
supported. One issue here is that the backends do not provide an
|
||||
implementation for the cubical primitives (such as the path-type).
|
||||
This means that even though the path-type gives us a computational
|
||||
interpretation of functional extensionality, univalence, transport,
|
||||
etc., we do not have a way of actually using this to compile our
|
||||
programs that use these primitives. It would be interesting to see
|
||||
practical applications of this.
|
||||
|
||||
\subsection{Higher inductive types}
|
||||
This library has not explored the usefulness of higher inductive types in the
|
||||
context of Category Theory.
|
||||
\subsection{Proving laws of programs}
|
||||
Another interesting thing would be to use the Kleisli formulation of
|
||||
monads to prove properties of functional programs. The existence of
|
||||
univalence will make it possible to re-use proofs stated in terms of
|
||||
the monoidal formulation in this setting.
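
As a small illustration of the kind of law one might derive for programs
purely from the Kleisli laws stated in this report (the derivation below
is only a sketch), a $\var{pure}$ step preceding any Kleisli arrow can be
eliminated:
%
\begin{align*}
  (\bind\ \pure) \rrr (\bind\ g)
    & ≡ \bind\ (\pure \fish g)
    && \text{since $(\bind\ f) \rrr (\bind\ g) ≡ \bind\ (f \fish g)$} \\
    & ≡ \bind\ g
    && \text{since $\pure \fish f ≡ f$}
\end{align*}
%
Such equations could then be transported to the monoidal formulation
along the equivalence between the two.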
|
||||
|
||||
%% \subsection{Higher inductive types}
|
||||
%% This library has not explored the usefulness of higher inductive types
|
||||
%% in the context of Category Theory.
|
||||
|
||||
\subsection{Initiality conjecture}
|
||||
A fellow student at Chalmers, Andreas Källberg, is currently working
|
||||
on proving the initiality conjecture. He will be using this library
|
||||
to do so.
|
||||
|
|
9
doc/feedback-meeting-andrea.txt
Normal file
9
doc/feedback-meeting-andrea.txt
Normal file
|
@ -0,0 +1,9 @@
|
|||
App. 2 in HoTT gives typing rule for pathJ including a computational
|
||||
rule for it.
|
||||
|
||||
If you have this computational rule definitionally, then you wouldn't
|
||||
need to use `pathJprop`.
|
||||
|
||||
In discussion-section I mention HITs. I should remove this or come up
|
||||
with a more elaborate example of something you could do, e.g.
|
||||
something with pushouts in the category of sets.
|
|
@ -3,7 +3,7 @@ I've written this as an appendix because 1) the aim of the thesis changed
|
|||
drastically from the planning report/proposal 2) partly I'm not sure how to
|
||||
structure my thesis.
|
||||
|
||||
My work so far has very much focused on the formalization, i.e. coding. It's
|
||||
My work so far has very much focused on the formalization, i.e.\ coding. It's
|
||||
unclear to me at this point what I should have in the final report. Here I will
|
||||
describe what I have managed to formalize so far and what outstanding challenges
|
||||
I'm facing.
|
||||
|
@ -57,8 +57,8 @@ composition and identity, laws; preservation of identity and composition) plus
|
|||
the extra condition that it is univalent - namely that you can get an equality
|
||||
of two objects from an isomorphism.
|
||||
|
||||
I make no distinction between a pre-category and a real category (as in the
|
||||
[HoTT]-sense). A pre-category in my implementation would be a category sans the
|
||||
I make no distinction between a pre category and a real category (as in the
|
||||
[HoTT]-sense). A pre category in my implementation would be a category sans the
|
||||
witness to univalence.
|
||||
|
||||
I also prove that being a category is a proposition. This gives rise to an
|
||||
|
|
File diff suppressed because it is too large
Load diff
|
@ -1,194 +1,270 @@
|
|||
\chapter{Introduction}
|
||||
This thesis is a case-study in the application of cubical Agda in the
|
||||
context of category theory. At the center of this is the notion of
|
||||
\nomenindex{equality}. In type-theory there are two pervasive notions
|
||||
of equality: \nomenindex{judgmental equality} and
|
||||
\nomenindex{propositional equality}. Judgmental equality is a property
|
||||
of the type system. Propositional equality on the other hand is usually
|
||||
defined \emph{within} the system. When introducing definitions this
|
||||
report will use the symbol $\defeq$. Judgmental equalities will be
|
||||
denoted with $=$ and for propositional equalities the notation
|
||||
$\equiv$ is used.
|
||||
|
||||
The rules of judgmental equality are related to $β$- and
|
||||
$η$-reduction which gives a notion of computation in a given type
|
||||
theory.
|
||||
%
|
||||
There are some properties that one usually wants judgmental equality to
|
||||
satisfy. It must be \nomenindex{sound}, enjoy \nomenindex{canonicity}
|
||||
and be a \nomenindex{congruence relation}. Soundness means that things
|
||||
judged to be equal are equal with respect to the \nomenindex{model}
|
||||
of the theory or the \emph{meta theory}. It must be a congruence
|
||||
relation because otherwise the relation certainly does not adhere to
|
||||
our notion of equality. One would be able to conclude things like: $x
|
||||
\equiv y \rightarrow f\ x \nequiv f\ y$. Canonicity means that any
|
||||
well typed term evaluates to a \emph{canonical} form. For example for
|
||||
a closed term $e \tp \bN$ it will be the case that $e$ reduces to $n$
|
||||
applications of $\mathit{suc}$ to $0$ for some $n$; $e =
|
||||
\mathit{suc}^n\ 0$. Without canonicity terms in the language can get
|
||||
``stuck'' meaning that they do not reduce to a canonical form.
|
||||
|
||||
To work as a programming language it is necessary for judgmental
|
||||
equality to be \nomenindex{decidable}. Being decidable simply means
|
||||
that an algorithm exists to decide whether two terms are equal.
|
||||
For any practical implementation the decidability must also be
|
||||
effectively computable.
|
||||
|
||||
For propositional equality the decidability requirement is relaxed. It
|
||||
is not in general possible to decide the correctness of logical
|
||||
propositions (cf.\ Hilbert's \emph{Entscheidungsproblem}).
|
||||
|
||||
There are two flavors of type-theory. \emph{Intensional-} and
|
||||
\emph{extensional-} type theory (ITT and ETT respectively). Identity
|
||||
types in extensional type theory are required to be
|
||||
\nomen{propositions}{proposition}. That is, a type with at most one
|
||||
inhabitant. In extensional type theory the principle of reflection
|
||||
%
|
||||
$$a ≡ b → a = b$$
|
||||
%
|
||||
is enough to make type checking undecidable. This report focuses on
|
||||
Agda which at a glance can be thought of as a version of intensional
|
||||
type theory. Pattern-matching in regular Agda lets one prove
|
||||
\nomenindex{Uniqueness of Identity Proofs} (UIP). UIP states that any
|
||||
two identity proofs are propositionally identical.
|
||||
|
||||
The usual notion of propositional equality in ITT is quite
|
||||
restrictive. In the next section a few motivating examples will
|
||||
highlight this. There exist techniques to circumvent these problems,
|
||||
as we shall see. This thesis will explore an extension to Agda that
|
||||
redefines the notion of propositional equality and as such is an
|
||||
alternative to these other techniques. The extension is called cubical
|
||||
Agda. Cubical Agda drops UIP as this does not permit
|
||||
\nomenindex{functional extensionality} and
|
||||
\nomenindex{univalence}. What makes this extension particularly
|
||||
interesting is that it gives a \emph{constructive} interpretation of
|
||||
univalence. What all this means will be elaborated in the following
|
||||
sections.
|
||||
%
|
||||
\section{Motivating examples}
|
||||
%
|
||||
In the following two sections I present two examples that illustrate some
|
||||
limitations inherent in ITT and -- by extension -- Agda.
|
||||
In the following two sections I present two examples that illustrate
|
||||
some limitations inherent in ITT and -- by extension -- Agda.
|
||||
%
|
||||
\subsection{Functional extensionality}
|
||||
\label{sec:functional-extensionality}%
|
||||
Consider the functions:
|
||||
%
|
||||
\begin{multicols}{2}
|
||||
\noindent
|
||||
\begin{equation*}
|
||||
f \defeq (n \tp \bN) \mto (0 + n \tp \bN)
|
||||
\end{equation*}
|
||||
\begin{equation*}
|
||||
g \defeq (n \tp \bN) \mto (n + 0 \tp \bN)
|
||||
\end{equation*}
|
||||
\end{multicols}
|
||||
\begin{align*}%
|
||||
\var{zeroLeft} & \defeq \lambda\; (n \tp \bN) \to (0 + n \tp \bN) \\
|
||||
\var{zeroRight} & \defeq \lambda\; (n \tp \bN) \to (n + 0 \tp \bN)
|
||||
\end{align*}%
|
||||
%
|
||||
$n + 0$ is \nomen{definitionally} equal to $n$, which we write as $n + 0 = n$.
|
||||
This is also called \nomen{judgmental} equality. We call it definitional
|
||||
equality because the \emph{equality} arises from the \emph{definition} of $+$
|
||||
which is:
|
||||
The term $n + 0$ is \nomenindex{definitionally} equal to $n$, which we
|
||||
write as $n + 0 = n$. This is also called \nomenindex{judgmental
|
||||
equality}. We call it definitional equality because the
|
||||
\emph{equality} arises from the \emph{definition} of $+$ which is:
|
||||
%
|
||||
\begin{align*}
|
||||
+ & \tp \bN \to \bN \to \bN \\
|
||||
n + 0 & \defeq n \\
|
||||
+ & \tp \bN \to \bN \to \bN \\
|
||||
n + 0 & \defeq n \\
|
||||
n + (\suc{m}) & \defeq \suc{(n + m)}
|
||||
\end{align*}
|
||||
%
|
||||
Note that $0 + n$ is \emph{not} definitionally equal to $n$. $0 + n$ is in
|
||||
normal form. I.e.; there is no rule for $+$ whose left-hand-side matches this
|
||||
expression. We \emph{do}, however, have that they are \nomen{propositionally}
|
||||
equal, which we write as $n + 0 \equiv n$. Propositional equality means that
|
||||
there is a proof that exhibits this relation. Since equality is a transitive
|
||||
relation we have that $n + 0 \equiv 0 + n$.
|
||||
|
||||
Unfortunately we don't have $f \equiv g$.\footnote{Actually showing this is
|
||||
outside the scope of this text. Essentially it would involve giving a model
|
||||
for our type theory that validates all our axioms but where $f \equiv g$ is
|
||||
not true.} There is no way to construct a proof asserting the obvious
|
||||
equivalence of $f$ and $g$ -- even though we can prove them equal for all
|
||||
points. This is exactly the notion of equality of functions that we are
|
||||
interested in; that they are equal for all inputs. We call this
|
||||
\nomen{point-wise equality}, where the \emph{points} of a function refers
|
||||
to its arguments.
|
||||
|
||||
In the context of category theory functional extensionality is e.g. needed to
|
||||
show that representable functors are indeed functors. The representable functor
|
||||
for a category $\bC$ and a fixed object in $A \in \bC$ is defined to be:
|
||||
Note that $0 + n$ is \emph{not} definitionally equal to $n$. This is
|
||||
because $0 + n$ is in normal form. I.e.\ there is no rule for $+$
|
||||
whose left hand side matches this expression. We do however have that
|
||||
they are \nomen{propositionally}{propositional equality} equal, which
|
||||
we write as $n \equiv 0 + n$. Propositional equality means that there
|
||||
is a proof that exhibits this relation. We can do induction over $n$
|
||||
to prove this:
|
||||
%
|
||||
\begin{align*}
|
||||
\fmap \defeq X \mto \Hom_{\bC}(A, X)
|
||||
\end{align*}
|
||||
\begin{align}
|
||||
\label{eq:zrn}
|
||||
\begin{split}
|
||||
\var{zrn}\ & \tp ∀ n → n ≡ \var{zeroLeft}\ n \\
|
||||
\var{zrn}\ \var{zero} & \defeq \var{refl} \\
|
||||
\var{zrn}\ (\var{suc}\ n) & \defeq \var{cong}\ \var{suc}\ (\var{zrn}\ n)
|
||||
\end{split}
|
||||
\end{align}
|
||||
%
|
||||
The proof obligation that this satisfies the identity law of functors
|
||||
($\fmap\ \idFun \equiv \idFun$) thus becomes:
|
||||
%
|
||||
\begin{align*}
|
||||
\Hom(A, \idFun_{\bX}) = (g \mto \idFun \comp g) \equiv \idFun_{\Sets}
|
||||
\end{align*}
|
||||
%
|
||||
One needs functional extensionality to ``go under'' the function arrow and apply
|
||||
the (left) identity law of the underlying category to prove $\idFun \comp g
|
||||
\equiv g$ and thus close the goal.
|
||||
This shows that zero is a left neutral element as well, albeit only propositionally.
|
||||
Since equality is a transitive relation we have that $\forall n \to
|
||||
\var{zeroLeft}\ n \equiv \var{zeroRight}\ n$. Unfortunately we don't
|
||||
have $\var{zeroLeft} \equiv \var{zeroRight}$. There is no way to
|
||||
construct a proof asserting the obvious equivalence of
|
||||
$\var{zeroLeft}$ and $\var{zeroRight}$. Actually showing this is
|
||||
outside the scope of this text. Essentially it would involve giving a
|
||||
model for our type theory that validates all our axioms but where
|
||||
$\var{zeroLeft} \equiv \var{zeroRight}$ is not true. We cannot show
|
||||
that they are equal even though we can prove them equal for all
|
||||
points. For functions this is exactly the notion of equality that we
|
||||
are interested in: Functions are considered equal when they are equal
|
||||
for all inputs. This is called \nomenindex{pointwise equality}, where
|
||||
the \emph{points} of a function refer to its arguments.
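
The situation can be summarised in a small Cubical Agda sketch. The
import and the exact names ($\var{funExt}$, $\var{sym}$, $\var{cong}$)
are assumptions about what the prelude exports rather than verbatim from
the library:

module FunExtExample where
  -- Assumed exports; the names mirror the ones used in the text.
  open import Cat.Prelude using (_≡_ ; refl ; sym ; cong ; funExt)

  data ℕ : Set where
    zero : ℕ
    suc  : ℕ → ℕ

  _+_ : ℕ → ℕ → ℕ
  n + zero  = n
  n + suc m = suc (n + m)

  zeroLeft zeroRight : ℕ → ℕ
  zeroLeft  n = zero + n
  zeroRight n = n + zero

  -- Zero is neutral in the first argument only propositionally.
  zrn : ∀ n → n ≡ zeroLeft n
  zrn zero    = refl
  zrn (suc n) = cong suc (zrn n)

  -- funExt turns the pointwise proof into a path between the functions.
  zeroLeft≡zeroRight : zeroLeft ≡ zeroRight
  zeroLeft≡zeroRight = funExt λ n → sym (zrn n)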
|
||||
%
|
||||
\subsection{Equality of isomorphic types}
|
||||
%
|
||||
Let $\top$ denote the unit type -- a type with a single constructor. In the
|
||||
propositions-as-types interpretation of type theory $\top$ is the proposition
|
||||
that is always true. The type $A \x \top$ and $A$ has an element for each $a :
|
||||
A$. So in a sense they have the same shape (Greek; \nomen{isomorphic}). The
|
||||
second element of the pair does not add any ``interesting information''. It can
|
||||
be useful to identify such types. In fact, it is quite commonplace in
|
||||
mathematics. Say we look at a set $\{x \mid \phi\ x \land \psi\ x\}$ and somehow
|
||||
conclude that $\psi\ x \equiv \top$ for all $x$. A mathematician would
|
||||
immediately conclude $\{x \mid \phi\ x \land \psi\ x\} \equiv \{x \mid
|
||||
\phi\ x\}$ without thinking twice. Unfortunately such an identification can not
|
||||
Let $\top$ denote the unit type -- a type with a single constructor.
|
||||
In the propositions as types interpretation of type theory $\top$ is
|
||||
the proposition that is always true. The types $A \x \top$ and $A$ have
|
||||
an element for each $a \tp A$. So in a sense they have the same shape
|
||||
(Greek;
|
||||
\nomenindex{isomorphic}). The second element of the pair does not
|
||||
add any ``interesting information''. It can be useful to identify such
|
||||
types. In fact, it is quite commonplace in mathematics. Say we look at
|
||||
a set $\{x \mid \phi\ x \land \psi\ x\}$ and somehow conclude that
|
||||
$\psi\ x \equiv \top$ for all $x$. A mathematician would immediately
|
||||
conclude $\{x \mid \phi\ x \land \psi\ x\} \equiv \{x \mid \phi\ x\}$
|
||||
without thinking twice. Unfortunately such an identification cannot
|
||||
be performed in ITT.
|
||||
|
||||
More specifically what we are interested in is a way of identifying
|
||||
\nomen{equivalent} types. I will return to the definition of equivalence later
|
||||
in section \S\ref{sec:equiv}, but for now it is sufficient to think of an
|
||||
equivalence as a one-to-one correspondence. We write $A \simeq B$ to assert that
|
||||
$A$ and $B$ are equivalent types. The principle of univalence says that:
|
||||
\nomenindex{equivalent} types. I will return to the definition of
|
||||
equivalence later in section \S\ref{sec:equiv}, but for now it is
|
||||
sufficient to think of an equivalence as a one-to-one correspondence.
|
||||
We write $A \simeq B$ to assert that $A$ and $B$ are equivalent types.
|
||||
The principle of univalence says that:
|
||||
%
|
||||
$$\mathit{univalence} \tp (A \simeq B) \simeq (A \equiv B)$$
|
||||
%
|
||||
In particular this allows us to construct an equality from an equivalence
|
||||
($\mathit{ua} \tp (A \simeq B) \to (A \equiv B)$) and vice-versa.
|
||||
%
|
||||
$$\mathit{ua} \tp (A \simeq B) \to (A \equiv B)$$
|
||||
%
|
||||
and vice versa.
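
To connect this with the example of $A \x \top$ from above (a sketch;
the equivalence $e$ and the family $P$ are hypothetical placeholders):
%
\begin{align*}
  e                                & \tp (A \x \top) \simeq A \\
  \mathit{ua}\ e                   & \tp (A \x \top) \equiv A \\
  \var{subst}\ P\ (\mathit{ua}\ e) & \tp P\ (A \x \top) \to P\ A
\end{align*}
%
Any property $P$ established for $A \x \top$ can thus be transported to
$A$, which is precisely the informal identification performed in the
set-comprehension example.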
|
||||
|
||||
\section{Formalizing Category Theory}
|
||||
%
|
||||
The above examples serve to illustrate a limitation of ITT. One case where these
|
||||
limitations are particularly prohibitive is in the study of Category Theory. At
|
||||
a glance category theory can be described as ``the mathematical study of
|
||||
(abstract) algebras of functions'' (\cite{awodey-2006}). By that token
|
||||
functional extensionality is particularly useful for formulating Category
|
||||
Theory. In Category theory it is also common to identify isomorphic structures
|
||||
and univalence gives us a way to make this notion precise. In fact we can
|
||||
formulate this requirement within our formulation of categories by requiring the
|
||||
\emph{categories} themselves to be univalent as we shall see.
|
||||
The above examples serve to illustrate a limitation of ITT. One case
|
||||
where these limitations are particularly prohibitive is in the study
|
||||
of Category Theory. At a glance category theory can be described as
|
||||
``the mathematical study of (abstract) algebras of functions''
|
||||
(\cite{awodey-2006}). By that token functional extensionality is
|
||||
particularly useful for formulating Category Theory. In Category
|
||||
theory it is also commonplace to identify isomorphic structures and
|
||||
univalence gives us a way to make this notion precise. In fact we can
|
||||
formulate this requirement within our formulation of categories by
|
||||
requiring the \emph{categories} themselves to be univalent as we shall
|
||||
see in \S\ref{sec:univalence}.
|
||||
|
||||
\section{Context}
|
||||
\label{sec:context}
|
||||
%
|
||||
The idea of formalizing Category Theory in proof assistants is not new. There
|
||||
are a multitude of these available online. Just as a first reference see this
|
||||
question on Math Overflow: \cite{mo-formalizations}. Notably these
|
||||
implementations of category theory in Agda:
|
||||
are a multitude of such formalizations available online. Notably:
|
||||
%
|
||||
\begin{itemize}
|
||||
\item
|
||||
A formalization in Agda using the setoid approach:
|
||||
\url{https://github.com/copumpkin/categories}
|
||||
|
||||
A formalization in Agda using the setoid approach
|
||||
\item
|
||||
A formalization in Agda with univalence and functional
|
||||
extensionality as postulates:
|
||||
\url{https://github.com/pcapriotti/agda-categories}
|
||||
|
||||
A formalization in Agda with univalence and functional extensionality as
|
||||
postulates.
|
||||
\item
|
||||
A formalization in Coq in the homotopic setting:
|
||||
\url{https://github.com/HoTT/HoTT/tree/master/theories/Categories}
|
||||
|
||||
A formalization in Coq in the homotopic setting
|
||||
\item
|
||||
A formalization in \emph{CubicalTT} -- a language designed for
|
||||
cubical type theory. Formalizes many different things, but only a
|
||||
few concepts from category theory:
|
||||
\url{https://github.com/mortberg/cubicaltt}
|
||||
|
||||
A formalization in CubicalTT - a language designed for cubical-type-theory.
|
||||
Formalizes many different things, but only a few concepts from category
|
||||
theory.
|
||||
|
||||
\end{itemize}
|
||||
%
|
||||
The contribution of this thesis is to explore how working in a cubical setting
|
||||
will make it possible to prove more things and to reuse proofs, and to
|
||||
compare some aspects of this formalization with the existing ones.\TODO{How can
|
||||
I live up to this?}
|
||||
compare some aspects of this formalization with the existing ones.
|
||||
|
||||
There are alternative approaches to working in a cubical setting where one can
|
||||
still have univalence and functional extensionality. One option is to postulate
|
||||
these as axioms. This approach, however, has other shortcomings, e.g.; you lose
|
||||
\nomen{canonicity} (\TODO{Pageno!} \cite{huber-2016}). Canonicity means that any
|
||||
well-typed term evaluates to a \emph{canonical} form. For example for a closed
|
||||
term $e \tp \bN$ it will be the case that $e$ reduces to $n$ applications of
|
||||
$\mathit{suc}$ to $0$ for some $n$; $e = \mathit{suc}^n\ 0$. Without canonicity
|
||||
terms in the language can get ``stuck'' -- meaning that they do not reduce to a
|
||||
canonical form.
|
||||
There are alternative approaches to working in a cubical setting where
|
||||
one can still have univalence and functional extensionality. One
|
||||
option is to postulate these as axioms. This approach, however, has
|
||||
other shortcomings, e.g. you lose \nomenindex{canonicity}
|
||||
(\cite[p. 3]{huber-2016}).
|
||||
|
||||
Another approach is to use the \emph{setoid interpretation} of type theory
|
||||
(\cite{hofmann-1995,huber-2016}). With this approach one works with
|
||||
\nomen{extensional sets} $(X, \sim)$, that is a type $X \tp \MCU$ and an
|
||||
equivalence relation $\sim \tp X \to X \to \MCU$ on that type. Under the setoid
|
||||
interpretation the equivalence relation serves as a sort of ``local''
|
||||
propositional equality. This approach has other drawbacks; it does not satisfy
|
||||
all propositional equalities of type theory (\TODO{Citation needed}), is
|
||||
cumbersome to work with in practice (\cite[p. 4]{huber-2016}) and makes
|
||||
equational proofs less reusable since equational proofs $a \sim_{X} b$ are
|
||||
inherently `local' to the extensional set $(X , \sim)$.
|
||||
Another approach is to use the \emph{setoid interpretation} of type
|
||||
theory (\cite{hofmann-1995,huber-2016}). With this approach one works
|
||||
with \nomenindex{extensional sets} $(X, \sim)$. That is a type $X \tp
|
||||
\MCU$ and an equivalence relation $\sim\ \tp X \to X \to \MCU$ on that
|
||||
type. Under the setoid interpretation the equivalence relation serves
|
||||
as a sort of ``local'' propositional equality. Since the developer
|
||||
gets to pick this relation it is not a~priori a congruence
|
||||
relation. So this must be verified manually by the developer.
|
||||
Furthermore, functions between different setoids must be shown to be
|
||||
setoid homomorphisms, that is, they preserve the relation.
|
||||
|
||||
This approach has other drawbacks; it does not satisfy all
|
||||
propositional equalities of type theory a~priori. That is, the
|
||||
developer must manually show that e.g.\ the relation is a congruence.
|
||||
Equational proofs $a \sim_{X} b$ are in some sense `local' to the
|
||||
extensional set $(X , \sim)$. To e.g.\ prove that $x ∼ y → f\ x ∼
|
||||
f\ y$ for some function $f \tp A → B$ between two extensional sets $A$
|
||||
and $B$ it must be shown that $f$ is a setoid homomorphism. This
|
||||
makes it very cumbersome to work with in practice (\cite[p.
|
||||
4]{huber-2016}).
|
||||
|
||||
\section{Conventions}
|
||||
\TODO{Talk a bit about terminology. Find a good place to stuff this little
|
||||
section.}
|
||||
In the remainder of this paper I will use the term \nomenindex{Type}
|
||||
to describe -- well -- types. Thereby departing from the notation in
|
||||
Agda where the keyword \texttt{Set} refers to types. \nomenindex{Set}
|
||||
on the other hand shall refer to the homotopical notion of a set. I
|
||||
will also leave all universe levels implicit. This of course does not
|
||||
mean that a statement such as $\MCU \tp \MCU$ means that we have
|
||||
type-in-type but rather that the arguments to the universes are
|
||||
implicit.
|
||||
|
||||
In the remainder of this paper I will use the term \nomen{Type} to describe --
|
||||
well, types. Thereby diverging from the notation in Agda where the keyword
|
||||
\texttt{Set} refers to types. \nomen{Set} on the other hand shall refer to the
|
||||
homotopical notion of a set. I will also leave all universe levels implicit.
|
||||
And I use the term
|
||||
\nomenindex{arrow} to refer to morphisms in a category,
|
||||
whereas the terms
|
||||
\nomenindex{morphism},
|
||||
\nomenindex{map} or
|
||||
\nomenindex{function}
|
||||
shall be reserved for talking about type theoretic functions; i.e.
|
||||
functions in Agda.
|
||||
|
||||
And I use the term \nomen{arrow} to refer to morphisms in a category, whereas
|
||||
the terms morphism, map or function shall be reserved for talking about
|
||||
type-theoretic functions; i.e. functions in Agda.
|
||||
|
||||
$\defeq$ will be used for introducing definitions. $=$ will be used for
|
||||
judgmental equality and $\equiv$ will be used for propositional equality.
|
||||
As already noted, $\defeq$ will be used for introducing definitions, $=$
|
||||
will be used for judgmental equality, and $\equiv$ will be used for
|
||||
propositional equality.
|
||||
|
||||
All this is summarized in the following table:
|
||||
|
||||
%
|
||||
\begin{samepage}
|
||||
\begin{center}
|
||||
\begin{tabular}{ c c c }
|
||||
Name & Agda & Notation \\
|
||||
\hline
|
||||
\nomen{Type} & \texttt{Set} & $\Type$ \\
|
||||
\nomen{Set} & \texttt{Σ Set IsSet} & $\Set$ \\
|
||||
|
||||
\varindex{Type} & \texttt{Set} & $\Type$ \\
|
||||
|
||||
\varindex{Set} & \texttt{Σ Set IsSet} & $\Set$ \\
|
||||
Function, morphism, map & \texttt{A → B} & $A → B$ \\
|
||||
Dependent- ditto & \texttt{(a : A) → B} & $∏_{a \tp A} B$ \\
|
||||
\nomen{Arrow} & \texttt{Arrow A B} & $\Arrow\ A\ B$ \\
|
||||
\nomen{Object} & \texttt{C.Object} & $ℂ.Object$ \\
|
||||
|
||||
\varindex{Arrow} & \texttt{Arrow A B} & $\Arrow\ A\ B$ \\
|
||||
|
||||
\varindex{Object} & \texttt{C.Object} & $ℂ.Object$ \\
|
||||
Definition & \texttt{=} & $\defeq$ \\
|
||||
Judgmental equality & \null & $=$ \\
|
||||
Propositional equality & \null & $\equiv$
|
||||
\end{tabular}
|
||||
\end{center}
|
||||
\end{samepage}
|
||||
|
|
|
@ -9,15 +9,15 @@
|
|||
%% Alternatively:
|
||||
%% \newcommand{\defeq}{≔}
|
||||
\newcommand{\bN}{\mathbb{N}}
|
||||
\newcommand{\bC}{\mathbb{C}}
|
||||
\newcommand{\bX}{\mathbb{X}}
|
||||
\newcommand{\bC}{ℂ}
|
||||
\newcommand{\bD}{𝔻}
|
||||
\newcommand{\bX}{𝕏}
|
||||
% \newcommand{\to}{\rightarrow}
|
||||
%% \newcommand{\mto}{\mapsto}
|
||||
\newcommand{\mto}{\rightarrow}
|
||||
\newcommand{\UU}{\ensuremath{\mathcal{U}}\xspace}
|
||||
\let\type\UU
|
||||
\newcommand{\MCU}{\UU}
|
||||
\newcommand{\nomen}[1]{\emph{#1}}
|
||||
\newcommand{\todo}[1]{\textit{#1}}
|
||||
\newcommand{\comp}{\circ}
|
||||
\newcommand{\x}{\times}
|
||||
|
@ -33,7 +33,9 @@
|
|||
}
|
||||
\makeatother
|
||||
\newcommand{\var}[1]{\ensuremath{\mathit{#1}}}
|
||||
\newcommand{\varindex}[1]{\ensuremath{\mathit{#1}}\index{#1}}
|
||||
\newcommand{\varindex}[1]{\ensuremath{\var{#1}}\index{$\var{#1}$}}
|
||||
\newcommand{\nomen}[2]{\emph{#1}\index{#2}}
|
||||
\newcommand{\nomenindex}[1]{\nomen{#1}{#1}}
|
||||
|
||||
\newcommand{\Hom}{\varindex{Hom}}
|
||||
\newcommand{\fmap}{\varindex{fmap}}
|
||||
|
@ -71,13 +73,17 @@
|
|||
\newcommand\isequiv{\varindex{isequiv}}
|
||||
\newcommand\qinv{\varindex{qinv}}
|
||||
\newcommand\fiber{\varindex{fiber}}
|
||||
\newcommand\shuffle{\varindex{shuffle}}
|
||||
\newcommand\shufflef{\varindex{shuffle}}
|
||||
\newcommand\Univalent{\varindex{Univalent}}
|
||||
\newcommand\refl{\varindex{refl}}
|
||||
\newcommand\isoToId{\varindex{isoToId}}
|
||||
\newcommand\Isomorphism{\varindex{Isomorphism}}
|
||||
\newcommand\rrr{\ggg}
|
||||
\newcommand\fish{\mathrel{\wideoverbar{\rrr}}}
|
||||
%% \newcommand\fish{\mathbin{↣}}
|
||||
%% \newcommand\fish{\mathbin{⤅}}
|
||||
\newcommand\fish{\mathbin{⤇}}
|
||||
%% \newcommand\fish{\mathbin{⤜}}
|
||||
%% \newcommand\fish{\mathrel{\wideoverbar{\rrr}}}
|
||||
\newcommand\fst{\varindex{fst}}
|
||||
\newcommand\snd{\varindex{snd}}
|
||||
\newcommand\Path{\varindex{Path}}
|
||||
|
@ -86,9 +92,39 @@
|
|||
\newcommand*{\QED}{\hfill\ensuremath{\square}}%
|
||||
\newcommand\uexists{\exists!}
|
||||
\newcommand\Arrow{\varindex{Arrow}}
|
||||
\newcommand\embellish[1]{\widehat{#1}}
|
||||
\newcommand\nattrans[1]{\embellish{#1}}
|
||||
\newcommand\functor[1]{\embellish{#1}}
|
||||
\newcommand\NTsym{\varindex{NT}}
|
||||
\newcommand\NT[2]{\NTsym\ #1\ #2}
|
||||
\newcommand\Endo[1]{\varindex{Endo}\ #1}
|
||||
\newcommand\EndoR{\mathcal{R}}
|
||||
\newcommand\EndoR{\functor{\mathcal{R}}}
|
||||
\newcommand\omapR{\mathcal{R}}
|
||||
\newcommand\omapF{\mathcal{F}}
|
||||
\newcommand\omapG{\mathcal{G}}
|
||||
\newcommand\FunF{\functor{\omapF}}
|
||||
\newcommand\FunG{\functor{\omapG}}
|
||||
\newcommand\funExt{\varindex{funExt}}
|
||||
\newcommand{\suc}[1]{\mathit{suc}\ #1}
|
||||
\newcommand{\suc}[1]{\varindex{suc}\ #1}
|
||||
\newcommand{\trans}{\varindex{trans}}
|
||||
\newcommand{\toKleisli}{\varindex{toKleisli}}
|
||||
\newcommand{\toMonoidal}{\varindex{toMonoidal}}
|
||||
\newcommand\pairA{\mathcal{A}}
|
||||
\newcommand\pairB{\mathcal{B}}
|
||||
\newcommand{\joinNT}{\functor{\varindex{join}}}
|
||||
\newcommand{\pureNT}{\functor{\varindex{pure}}}
|
||||
\newcommand{\hrefsymb}[2]{\href{#1}{#2 \ExternalLink}}
|
||||
\newcommand{\sourcebasepath}{http://web.student.chalmers.se/\textasciitilde hanghj/cat/doc/html/}
|
||||
\newcommand{\docbasepath}{https://github.com/fredefox/cat/}
|
||||
\newcommand{\sourcelink}[1]{\hrefsymb
|
||||
{\sourcebasepath#1.html}
|
||||
{\texttt{#1}}
|
||||
}
|
||||
\newcommand{\gitlink}{\hrefsymb{\docbasepath}{\texttt{\docbasepath}}}
|
||||
\newcommand{\doclink}{\hrefsymb{\sourcebasepath}{\texttt{\sourcebasepath}}}
|
||||
\newcommand{\clll}{\mathrel{\bC.\mathord{\lll}}}
|
||||
\newcommand{\dlll}{\mathrel{\bD.\mathord{\lll}}}
|
||||
\newcommand\coe{\varindex{coe}}
|
||||
\newcommand\Monoidal{\varindex{Monoidal}}
|
||||
\newcommand\Kleisli{\varindex{Kleisli}}
|
||||
\newcommand\I{\mathds{I}}
|
||||
|
|
|
@ -1,4 +1,5 @@
|
|||
\documentclass[a4paper]{report}
|
||||
%% \documentclass[tightpage]{preview}
|
||||
%% \documentclass[compact,a4paper]{article}
|
||||
|
||||
\input{packages.tex}
|
||||
|
@ -49,6 +50,7 @@
|
|||
\myfrontmatter
|
||||
\maketitle
|
||||
\input{abstract.tex}
|
||||
\input{acknowledgements.tex}
|
||||
\tableofcontents
|
||||
\mainmatter
|
||||
%
|
||||
|
@ -65,7 +67,8 @@
|
|||
\begin{appendices}
|
||||
\setcounter{page}{1}
|
||||
\pagenumbering{roman}
|
||||
\input{sources.tex}
|
||||
\input{appendix/abstract-funext.tex}
|
||||
%% \input{sources.tex}
|
||||
%% \input{planning.tex}
|
||||
%% \input{halftime.tex}
|
||||
\end{appendices}
|
||||
|
|
|
@ -3,21 +3,27 @@
|
|||
\usepackage{natbib}
|
||||
\bibliographystyle{plain}
|
||||
|
||||
\usepackage{xcolor}
|
||||
%% \mode<report>{
|
||||
\usepackage[
|
||||
hidelinks,
|
||||
%% hidelinks,
|
||||
pdfusetitle,
|
||||
pdfsubject={category theory},
|
||||
pdfkeywords={type theory, homotopy theory, category theory, agda}]
|
||||
{hyperref}
|
||||
|
||||
%% }
|
||||
%% \definecolor{darkorange}{HTML}{ff8c00}
|
||||
%% \hypersetup{allbordercolors={darkorange}}
|
||||
\hypersetup{hidelinks}
|
||||
\usepackage{graphicx}
|
||||
%% \usepackage[active,tightpage]{preview}
|
||||
|
||||
\usepackage{parskip}
|
||||
\usepackage{multicol}
|
||||
\usepackage{amssymb,amsmath,amsthm,stmaryrd,mathrsfs,wasysym}
|
||||
\usepackage[toc,page]{appendix}
|
||||
\usepackage{xspace}
|
||||
\usepackage[a4paper,top=3cm,bottom=3cm]{geometry}
|
||||
\usepackage[paper=a4paper,top=3cm,bottom=3cm]{geometry}
|
||||
\usepackage{makeidx}
|
||||
\makeindex
|
||||
% \setlength{\parskip}{10pt}
|
||||
|
@ -78,15 +84,15 @@
|
|||
\newunicodechar{∨}{\textfallback{∨}}
|
||||
\newunicodechar{∧}{\textfallback{∧}}
|
||||
\newunicodechar{⊔}{\textfallback{⊔}}
|
||||
\newunicodechar{≊}{\textfallback{≊}}
|
||||
%% \newunicodechar{≊}{\textfallback{≊}}
|
||||
\newunicodechar{∈}{\textfallback{∈}}
|
||||
\newunicodechar{ℂ}{\textfallback{ℂ}}
|
||||
\newunicodechar{∘}{\textfallback{∘}}
|
||||
\newunicodechar{⟨}{\textfallback{⟨}}
|
||||
\newunicodechar{⟩}{\textfallback{⟩}}
|
||||
\newunicodechar{∎}{\textfallback{∎}}
|
||||
\newunicodechar{𝒜}{\textfallback{?}}
|
||||
\newunicodechar{ℬ}{\textfallback{?}}
|
||||
%% \newunicodechar{𝒜}{\textfallback{𝒜}}
|
||||
%% \newunicodechar{ℬ}{\textfallback{ℬ}}
|
||||
%% \newunicodechar{≊}{\textfallback{≊}}
|
||||
\makeatletter
|
||||
\newcommand*{\rom}[1]{\expandafter\@slowroman\romannumeral #1@}
|
||||
|
@ -113,3 +119,26 @@
|
|||
}
|
||||
|
||||
\makeatother
|
||||
\usepackage{xspace}
|
||||
\usepackage{tikz}
|
||||
\newcommand{\ExternalLink}{%
|
||||
\tikz[x=1.2ex, y=1.2ex, baseline=-0.05ex]{%
|
||||
\begin{scope}[x=1ex, y=1ex]
|
||||
\clip (-0.1,-0.1)
|
||||
--++ (-0, 1.2)
|
||||
--++ (0.6, 0)
|
||||
--++ (0, -0.6)
|
||||
--++ (0.6, 0)
|
||||
--++ (0, -1);
|
||||
\path[draw,
|
||||
line width = 0.5,
|
||||
rounded corners=0.5]
|
||||
(0,0) rectangle (1,1);
|
||||
\end{scope}
|
||||
\path[draw, line width = 0.5] (0.5, 0.5)
|
||||
-- (1, 1);
|
||||
\path[draw, line width = 0.5] (0.6, 1)
|
||||
-- (1, 1) -- (1, 0.6);
|
||||
}
|
||||
}
|
||||
\usepackage{ dsfont }
|
||||
|
|
653
doc/presentation.tex
Normal file
653
doc/presentation.tex
Normal file
|
@ -0,0 +1,653 @@
|
|||
\documentclass[a4paper,handout]{beamer}
|
||||
\usetheme{metropolis}
|
||||
\beamertemplatenavigationsymbolsempty
|
||||
%% \usecolortheme[named=seagull]{structure}
|
||||
|
||||
\input{packages.tex}
|
||||
\input{macros.tex}
|
||||
\title[Univalent Categories]{Univalent Categories\\ \footnotesize A formalization of category theory in Cubical Agda}
|
||||
\newcommand{\myname}{Frederik Hangh{\o}j Iversen}
|
||||
\author[\myname]{
|
||||
\myname\\
|
||||
\footnotesize Supervisors: Thierry Coquand, Andrea Vezzosi\\
|
||||
Examiner: Andreas Abel
|
||||
}
|
||||
\institute{Chalmers University of Technology}
|
||||
|
||||
\begin{document}
|
||||
\frame{\titlepage}
|
||||
|
||||
\begin{frame}
|
||||
\frametitle{Motivating example}
|
||||
\framesubtitle{Functional extensionality}
|
||||
Consider the functions
|
||||
\begin{align*}
|
||||
\var{zeroLeft} & ≜ \lambda (n \tp \bN) \mto (0 + n \tp \bN) \\
|
||||
\var{zeroRight} & ≜ \lambda (n \tp \bN) \mto (n + 0 \tp \bN)
|
||||
\end{align*}
|
||||
\pause
|
||||
We have
|
||||
%
|
||||
$$
|
||||
∏_{n \tp \bN} \var{zeroLeft}\ n ≡ \var{zeroRight}\ n
|
||||
$$
|
||||
%
|
||||
\pause
|
||||
But not
|
||||
%
|
||||
$$
|
||||
\var{zeroLeft} ≡ \var{zeroRight}
|
||||
$$
|
||||
%
|
||||
\pause
|
||||
We need
|
||||
%
|
||||
$$
|
||||
\funExt \tp ∏_{a \tp A} f\ a ≡ g\ a → f ≡ g
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Motivating example}
|
||||
\framesubtitle{Univalence}
|
||||
Consider the set
|
||||
$\{x \mid \phi\ x \land \psi\ x\}$
|
||||
\pause
|
||||
|
||||
If we show $∀ x . \psi\ x ≡ \top$
|
||||
then we want to conclude
|
||||
$\{x \mid \phi\ x \land \psi\ x\} ≡ \{x \mid \phi\ x\}$
|
||||
\pause
|
||||
|
||||
We need univalence:
|
||||
$$(A ≃ B) ≃ (A ≡ B)$$
|
||||
\pause
|
||||
%
|
||||
We will return to $≃$, but for now think of it as an
|
||||
isomorphism, so it induces maps:
|
||||
\begin{align*}
|
||||
\var{toPath} & \tp (A ≃ B) → (A ≡ B) \\
|
||||
\var{toEquiv} & \tp (A ≡ B) → (A ≃ B)
|
||||
\end{align*}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Paths}
|
||||
\framesubtitle{Definition}
|
||||
Heterogeneous paths
|
||||
\begin{equation*}
|
||||
\Path \tp (P \tp I → \MCU) → P\ 0 → P\ 1 → \MCU
|
||||
\end{equation*}
|
||||
\pause
|
||||
For $P \tp I → \MCU$, $A \tp \MCU$ and $a_0, a_1 \tp A$
|
||||
inhabitants of $\Path\ P\ a_0\ a_1$ are like functions
|
||||
%
|
||||
$$
|
||||
p \tp ∏_{i \tp I} P\ i
|
||||
$$
|
||||
%
|
||||
which satisfy $p\ 0 = a_0$ and $p\ 1 = a_1$.
|
||||
\pause
|
||||
|
||||
Homogeneous paths
|
||||
$$
|
||||
a_0 ≡ a_1 ≜ \Path\ (\var{const}\ A)\ a_0\ a_1
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Paths}
|
||||
\framesubtitle{Functional extenstionality}
|
||||
$$
|
||||
\funExt \tp ∏_{a \tp A} f\ a ≡ g\ a → f ≡ g
|
||||
$$
|
||||
\pause
|
||||
$$
|
||||
\funExt\ p ≜ λ i\ a → p\ a\ i
|
||||
$$
|
||||
\pause
|
||||
Given $p \tp ∏_{n \tp \bN} \var{zeroLeft}\ n ≡ \var{zeroRight}\ n$ (the pointwise proof from before):
$$
\funExt\ p
\tp
\var{zeroLeft} ≡ \var{zeroRight}
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Paths}
|
||||
\framesubtitle{Homotopy levels}
|
||||
\begin{align*}
|
||||
& \isContr && \tp \MCU → \MCU \\
|
||||
& \isContr\ A && ≜ ∑_{c \tp A} ∏_{a \tp A} a ≡ c
|
||||
\end{align*}
|
||||
\pause
|
||||
\begin{align*}
|
||||
& \isProp && \tp \MCU → \MCU \\
|
||||
& \isProp\ A && ≜ ∏_{a_0, a_1 \tp A} a_0 ≡ a_1
|
||||
\end{align*}
|
||||
\pause
|
||||
\begin{align*}
|
||||
& \isSet && \tp \MCU → \MCU \\
|
||||
& \isSet\ A && ≜ ∏_{a_0, a_1 \tp A} \isProp\ (a_0 ≡ a_1)
|
||||
\end{align*}
|
||||
\begin{align*}
|
||||
& \isGroupoid && \tp \MCU → \MCU \\
|
||||
& \isGroupoid\ A && ≜ ∏_{a_0, a_1 \tp A} \isSet\ (a_0 ≡ a_1)
|
||||
\end{align*}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Paths}
|
||||
\framesubtitle{A few lemmas}
|
||||
Let $D$ be a type-family:
|
||||
$$
|
||||
D \tp ∏_{b \tp A} ∏_{p \tp a ≡ b} \MCU
|
||||
$$
|
||||
%
|
||||
\pause
|
||||
And $d$ an inhabitant of $D$ at $\refl$:
|
||||
%
|
||||
$$
|
||||
d \tp D\ a\ \refl
|
||||
$$
|
||||
%
|
||||
\pause
|
||||
We then have the function:
|
||||
%
|
||||
$$
|
||||
\pathJ\ D\ d \tp ∏_{b \tp A} ∏_{p \tp a ≡ b} D\ b\ p
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Paths}
|
||||
\framesubtitle{A few lemmas}
|
||||
Given
|
||||
\begin{align*}
|
||||
A & \tp \MCU \\
|
||||
P & \tp A → \MCU \\
|
||||
\var{propP} & \tp ∏_{x \tp A} \isProp\ (P\ x) \\
|
||||
p & \tp a_0 ≡ a_1 \\
|
||||
p_0 & \tp P\ a_0 \\
|
||||
p_1 & \tp P\ a_1
|
||||
\end{align*}
|
||||
%
|
||||
We have
|
||||
$$
|
||||
\lemPropF\ \var{propP}\ p
|
||||
\tp
|
||||
\Path\ (\lambda\; i \mto P\ (p\ i))\ p_0\ p_1
|
||||
$$
|
||||
%
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Paths}
|
||||
\framesubtitle{A few lemmas}
|
||||
$∏$ preserves $\isProp$:
|
||||
$$
|
||||
\mathit{propPi}
|
||||
\tp
|
||||
\left(∏_{a \tp A} \isProp\ (P\ a)\right)
|
||||
→ \isProp\ \left(∏_{a \tp A} P\ a\right)
|
||||
$$
|
||||
\pause
|
||||
$∑$ preserves $\isProp$:
|
||||
$$
|
||||
\mathit{propSig} \tp \isProp\ A → \left(∏_{a \tp A} \isProp\ (P\ a)\right) → \isProp\ \left(∑_{a \tp A} P\ a\right)
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Pre categories}
|
||||
\framesubtitle{Definition}
|
||||
Data:
|
||||
\begin{align*}
|
||||
\Object & \tp \Type \\
|
||||
\Arrow & \tp \Object → \Object → \Type \\
|
||||
\identity & \tp \Arrow\ A\ A \\
|
||||
\lll & \tp \Arrow\ B\ C → \Arrow\ A\ B → \Arrow\ A\ C
|
||||
\end{align*}
|
||||
%
|
||||
\pause
|
||||
Laws:
|
||||
%
|
||||
$$
|
||||
h \lll (g \lll f) ≡ (h \lll g) \lll f
|
||||
$$
|
||||
$$
|
||||
(\identity \lll f ≡ f)
|
||||
×
|
||||
(f \lll \identity ≡ f)
|
||||
$$
|
||||
\pause
|
||||
1-categories:
|
||||
$$
|
||||
\isSet\ (\Arrow\ A\ B)
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Pre categories}
|
||||
\framesubtitle{Propositionality}
|
||||
$$
|
||||
\isProp\ \left( (\identity \comp f ≡ f) × (f \comp \identity ≡ f) \right)
|
||||
$$
|
||||
\pause
|
||||
\begin{align*}
|
||||
\isProp\ \IsPreCategory
|
||||
\end{align*}
|
||||
\pause
|
||||
\begin{align*}
|
||||
\var{isAssociative} & \tp \var{IsAssociative}\\
|
||||
\isIdentity & \tp \var{IsIdentity}\\
|
||||
\var{arrowsAreSets} & \tp \var{ArrowsAreSets}
|
||||
\end{align*}
|
||||
\pause
|
||||
\begin{align*}
|
||||
& \var{propIsAssociative} && a.\var{isAssociative}\
|
||||
&& b.\var{isAssociative} && i \\
|
||||
& \propIsIdentity && a.\isIdentity\
|
||||
&& b.\isIdentity && i \\
|
||||
& \var{propArrowsAreSets} && a.\var{arrowsAreSets}\
|
||||
&& b.\var{arrowsAreSets} && i
|
||||
\end{align*}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Categories}
|
||||
\framesubtitle{Univalence}
|
||||
\begin{align*}
|
||||
\var{IsIdentity} & ≜
|
||||
∏_{A\ B \tp \Object} ∏_{f \tp \Arrow\ A\ B} \phi\ f
|
||||
%% \\
|
||||
%% & \mathrel{\ } \identity \lll f ≡ f × f \lll \identity ≡ f
|
||||
\end{align*}
|
||||
where
|
||||
$$
|
||||
\phi\ f ≜ (\identity
|
||||
\lll f ≡ f)
|
||||
×
|
||||
( f \lll \identity ≡ f)
|
||||
$$
|
||||
\pause
|
||||
Let $\approxeq$ denote isomorphism of objects. We can then construct
|
||||
the identity isomorphism in any category:
|
||||
$$
|
||||
\identity , \identity , \var{isIdentity} \tp A \approxeq A
|
||||
$$
|
||||
\pause
|
||||
Likewise since paths are substitutive we can promote a path to an isomorphism:
|
||||
$$
|
||||
\idToIso \tp A ≡ B → A ≊ B
|
||||
$$
|
||||
\pause
|
||||
For a category to be univalent we require this to be an equivalence:
|
||||
%
|
||||
$$
|
||||
\isEquiv\ (A ≡ B)\ (A \approxeq B)\ \idToIso
|
||||
$$
|
||||
%
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Categories}
|
||||
\framesubtitle{Univalence, cont'd}
|
||||
$$\isEquiv\ (A ≡ B)\ (A \approxeq B)\ \idToIso$$
|
||||
\pause%
|
||||
$$(A ≡ B) ≃ (A \approxeq B)$$
|
||||
\pause%
|
||||
$$(A ≡ B) ≅ (A \approxeq B)$$
|
||||
\pause%
|
||||
Name the above maps:
|
||||
$$\idToIso \tp A ≡ B → A ≊ B$$
|
||||
%
|
||||
$$\isoToId \tp (A \approxeq B) → (A ≡ B)$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Categories}
|
||||
\framesubtitle{Propositionality}
|
||||
$$
|
||||
\isProp\ \IsCategory = ∏_{a, b \tp \IsCategory} a ≡ b
|
||||
$$
|
||||
\pause
|
||||
So, for
|
||||
$$
|
||||
a\ b \tp \IsCategory
|
||||
$$
|
||||
the proof obligation is the pair:
|
||||
%
|
||||
\begin{align*}
|
||||
p & \tp a.\isPreCategory ≡ b.\isPreCategory \\
|
||||
& \mathrel{\ } \Path\ (\lambda\; i → (p\ i).Univalent)\ a.\var{univalent}\ b.\var{univalent}
|
||||
\end{align*}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Categories}
|
||||
\framesubtitle{Propositionality, cont'd}
|
||||
First path given by:
|
||||
$$
|
||||
p
|
||||
≜
|
||||
\var{propIsPreCategory}\ a\ b
|
||||
\tp
|
||||
a.\isPreCategory ≡ b.\isPreCategory
|
||||
$$
|
||||
\pause
|
||||
Use $\lemPropF$ for the latter.
|
||||
\pause
|
||||
%
|
||||
Univalence is indexed by an identity proof. So $A ≜
|
||||
\var{IsIdentity}\ \var{identity}$ and $B ≜ \var{Univalent}$.
|
||||
\pause
|
||||
%
|
||||
$$
|
||||
\lemPropF\ \var{propUnivalent}\ p
|
||||
$$
|
||||
\end{frame}
|
||||
|
||||
\begin{frame}
|
||||
\frametitle{Categories}
|
||||
\framesubtitle{A theorem}
|
||||
%
|
||||
Let $(ι, \inv{ι}) \tp A \approxeq B$ be an isomorphism.
|
||||
%
|
||||
\pause
|
||||
%
|
||||
The isomorphism induces the path
|
||||
%
|
||||
$$
|
||||
p ≜ \idToIso\ (\iota, \inv{\iota}) \tp A ≡ B
|
||||
$$
|
||||
%
|
||||
\pause
|
||||
and consequently an arrow:
|
||||
%
|
||||
$$
|
||||
p_{\var{dom}} ≜ \congruence\ (λ x → \Arrow\ x\ X)\ p
|
||||
\tp
|
||||
\Arrow\ A\ X ≡ \Arrow\ B\ X
|
||||
$$
|
||||
%
|
||||
\pause
|
||||
The proposition is:
|
||||
%
|
||||
\begin{align}
|
||||
\label{eq:coeDom}
|
||||
\tag{$\var{coeDom}$}
|
||||
∏_{f \tp A → X}
|
||||
\var{coe}\ p_{\var{dom}}\ f ≡ f \lll \inv{\iota}
|
||||
\end{align}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Categories}
|
||||
\framesubtitle{A theorem, proof}
|
||||
\begin{align*}
|
||||
\var{coe}\ p_{\var{dom}}\ f
|
||||
& ≡ f \lll \inv{(\idToIso\ p)} && \text{By path-induction} \\
|
||||
& ≡ f \lll \inv{\iota}
|
||||
&& \text{$\idToIso$ and $\isoToId$ are inverses}\\
|
||||
\end{align*}
|
||||
\pause
|
||||
%
|
||||
Induction will be based at $A$. Let $\widetilde{B}$ and $\widetilde{p}
|
||||
\tp A ≡ \widetilde{B}$ be given.
|
||||
%
|
||||
\pause
|
||||
%
|
||||
Define the family:
|
||||
%
|
||||
$$
|
||||
D\ \widetilde{B}\ \widetilde{p} ≜
|
||||
\var{coe}\ \widetilde{p}_{\var{dom}}\ f
|
||||
≡
|
||||
f \lll \inv{(\idToIso\ \widetilde{p})}
|
||||
$$
|
||||
\pause
|
||||
%
|
||||
The base-case becomes:
|
||||
$$
|
||||
d \tp D\ A\ \refl =
|
||||
\var{coe}\ \refl_{\var{dom}}\ f ≡ f \lll \inv{(\idToIso\ \refl)}
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Categories}
|
||||
\framesubtitle{A theorem, proof, cont'd}
|
||||
$$
|
||||
d \tp
|
||||
\var{coe}\ \refl_{\var{dom}}\ f ≡ f \lll \inv{(\idToIso\ \refl)}
|
||||
$$
|
||||
\pause
|
||||
\begin{align*}
|
||||
\var{coe}\ \refl^*\ f
|
||||
& ≡ f
|
||||
&& \text{$\refl$ is a neutral element for $\var{coe}$}\\
|
||||
& ≡ f \lll \identity \\
|
||||
& ≡ f \lll \var{subst}\ \refl\ \identity
|
||||
&& \text{$\refl$ is a neutral element for $\var{subst}$}\\
|
||||
& ≡ f \lll \inv{(\idToIso\ \refl)}
|
||||
&& \text{By definition of $\idToIso$}\\
|
||||
\end{align*}
|
||||
\pause
|
||||
In conclusion, the theorem is inhabited by:
|
||||
$$
|
||||
\label{eq:pathJ-example}
|
||||
\pathJ\ D\ d\ B\ p
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Span category} \framesubtitle{Definition} Given a base
|
||||
category $\bC$ and two objects in this category $\pairA$ and $\pairB$
|
||||
we can construct the \nomenindex{span category}:
|
||||
%
|
||||
\pause
|
||||
Objects:
|
||||
$$
|
||||
∑_{X \tp Object} \Arrow\ X\ \pairA × \Arrow\ X\ \pairB
|
||||
$$
|
||||
\pause
|
||||
%
|
||||
Arrows between objects $A ,\ a_{\pairA} ,\ a_{\pairB}$ and
|
||||
$B ,\ b_{\pairA} ,\ b_{\pairB}$:
|
||||
%
|
||||
$$
|
||||
∑_{f \tp \Arrow\ A\ B}
|
||||
b_{\pairA} \lll f ≡ a_{\pairA} ×
|
||||
b_{\pairB} \lll f ≡ a_{\pairB}
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Span category}
|
||||
\framesubtitle{Univalence}
|
||||
\begin{align*}
|
||||
\label{eq:univ-0}
|
||||
(X , x_{𝒜} , x_{ℬ}) ≡ (Y , y_{𝒜} , y_{ℬ})
|
||||
\end{align*}
|
||||
\begin{align*}
|
||||
\label{eq:univ-1}
|
||||
\begin{split}
|
||||
p \tp & X ≡ Y \\
|
||||
& \Path\ (λ i → \Arrow\ (p\ i)\ 𝒜)\ x_{𝒜}\ y_{𝒜} \\
|
||||
& \Path\ (λ i → \Arrow\ (p\ i)\ ℬ)\ x_{ℬ}\ y_{ℬ}
|
||||
\end{split}
|
||||
\end{align*}
|
||||
\begin{align*}
|
||||
\begin{split}
|
||||
\var{iso} \tp & X \approxeq Y \\
|
||||
& \Path\ (λ i → \Arrow\ (\widetilde{p}\ i)\ 𝒜)\ x_{𝒜}\ y_{𝒜} \\
|
||||
& \Path\ (λ i → \Arrow\ (\widetilde{p}\ i)\ ℬ)\ x_{ℬ}\ y_{ℬ}
|
||||
\end{split}
|
||||
\end{align*}
|
||||
\begin{align*}
|
||||
(X , x_{𝒜} , x_{ℬ}) ≊ (Y , y_{𝒜} , y_{ℬ})
|
||||
\end{align*}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Span category}
|
||||
\framesubtitle{Univalence, proof}
|
||||
%
|
||||
\begin{align*}
|
||||
%% (f, \inv{f}, \var{inv}_f, \var{inv}_{\inv{f}})
|
||||
%% \tp
|
||||
(X, x_{𝒜}, x_{ℬ}) \approxeq (Y, y_{𝒜}, y_{ℬ})
|
||||
\to
|
||||
\begin{split}
|
||||
\var{iso} \tp & X \approxeq Y \\
|
||||
& \Path\ (λ i → \Arrow\ (\widetilde{p}\ i)\ 𝒜)\ x_{𝒜}\ y_{𝒜} \\
|
||||
& \Path\ (λ i → \Arrow\ (\widetilde{p}\ i)\ ℬ)\ x_{ℬ}\ y_{ℬ}
|
||||
\end{split}
|
||||
\end{align*}
|
||||
\pause
|
||||
%
|
||||
Let $(f, \inv{f}, \var{inv}_f, \var{inv}_{\inv{f}})$ be an inhabitant
|
||||
of the antecedent.\pause
|
||||
|
||||
Projecting out the first component gives us the isomorphism
|
||||
%
|
||||
$$
|
||||
(\fst\ f, \fst\ \inv{f}
|
||||
, \congruence\ \fst\ \var{inv}_f
|
||||
, \congruence\ \fst\ \var{inv}_{\inv{f}}
|
||||
)
|
||||
\tp X \approxeq Y
|
||||
$$
|
||||
\pause
|
||||
%
|
||||
This gives rise to the following paths:
|
||||
%
|
||||
\begin{align*}
|
||||
\begin{split}
|
||||
\widetilde{p} & \tp X ≡ Y \\
|
||||
\widetilde{p}_{𝒜} & \tp \Arrow\ X\ 𝒜 ≡ \Arrow\ Y\ 𝒜 \\
|
||||
\end{split}
|
||||
\end{align*}
|
||||
%
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Span category}
|
||||
\framesubtitle{Univalence, proof, cont'd}
|
||||
It remains to construct:
|
||||
%
|
||||
\begin{align*}
|
||||
\begin{split}
|
||||
\label{eq:product-paths}
|
||||
& \Path\ (λ i → \widetilde{p}_{𝒜}\ i)\ x_{𝒜}\ y_{𝒜}
|
||||
\end{split}
|
||||
\end{align*}
|
||||
\pause
|
||||
%
|
||||
This is achieved with the following lemma:
|
||||
%
|
||||
\begin{align*}
|
||||
∏_{q \tp A ≡ B} \var{coe}\ q\ x_{𝒜} ≡ y_{𝒜}
|
||||
→
|
||||
\Path\ (λ i → q\ i)\ x_{𝒜}\ y_{𝒜}
|
||||
\end{align*}
|
||||
%
|
||||
Which is used without proof.\pause
|
||||
|
||||
So the construction reduces to:
|
||||
%
|
||||
\begin{align*}
|
||||
\var{coe}\ \widetilde{p}_{𝒜}\ x_{𝒜} ≡ y_{𝒜}
|
||||
\end{align*}%
|
||||
\pause%
|
||||
This is proven with:
|
||||
%
|
||||
\begin{align*}
|
||||
\var{coe}\ \widetilde{p}_{𝒜}\ x_{𝒜}
|
||||
& ≡ x_{𝒜} \lll \fst\ \inv{f} && \text{\ref{eq:coeDom}} \\
|
||||
& ≡ y_{𝒜} && \text{Property of span category}
|
||||
\end{align*}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Propositionality of products}
|
||||
We have
|
||||
%
|
||||
$$
|
||||
\isProp\ \var{Terminal}
|
||||
$$\pause
|
||||
%
|
||||
We can show:
|
||||
\begin{align*}
|
||||
\var{Terminal} ≃ \var{Product}\ ℂ\ 𝒜\ ℬ
|
||||
\end{align*}
|
||||
\pause
|
||||
And since equivalences preserve homotopy levels we get:
|
||||
%
|
||||
$$
|
||||
\isProp\ \left(\var{Product}\ \bC\ 𝒜\ ℬ\right)
|
||||
$$
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Monads}
|
||||
\framesubtitle{Monoidal form}
|
||||
%
|
||||
\begin{align*}
|
||||
\EndoR & \tp \Endo ℂ \\
|
||||
\pureNT
|
||||
& \tp \NT{\EndoR^0}{\EndoR} \\
|
||||
\joinNT
|
||||
& \tp \NT{\EndoR^2}{\EndoR}
|
||||
\end{align*}
|
||||
\pause
|
||||
%
|
||||
Let $\fmap$ be the map on arrows of $\EndoR$. Likewise
|
||||
$\pure$ and $\join$ are the maps of the natural transformations
|
||||
$\pureNT$ and $\joinNT$ respectively.
|
||||
%
|
||||
\begin{align*}
|
||||
\join \lll \fmap\ \join
|
||||
& ≡ \join \lll \join \\
|
||||
\join \lll \pure\ & ≡ \identity \\
|
||||
\join \lll \fmap\ \pure & ≡ \identity
|
||||
\end{align*}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Monads}
|
||||
\framesubtitle{Kleisli form}
|
||||
%
|
||||
\begin{align*}
|
||||
\omapR & \tp \Object → \Object \\
|
||||
\pure & \tp % ∏_{X \tp Object}
|
||||
\Arrow\ X\ (\omapR\ X) \\
|
||||
\bind & \tp
|
||||
\Arrow\ X\ (\omapR\ Y)
|
||||
\to
|
||||
\Arrow\ (\omapR\ X)\ (\omapR\ Y)
|
||||
\end{align*}\pause
|
||||
%
|
||||
\begin{align*}
|
||||
\fish & \tp
|
||||
\Arrow\ A\ (\omapR\ B)
|
||||
→
|
||||
\Arrow\ B\ (\omapR\ C)
|
||||
→
|
||||
\Arrow\ A\ (\omapR\ C) \\
|
||||
f \fish g & ≜ f \rrr (\bind\ g)
|
||||
\end{align*}
|
||||
\pause
|
||||
%
|
||||
\begin{align*}
|
||||
\label{eq:monad-kleisli-laws-0}
|
||||
\bind\ \pure & ≡ \identity_{\omapR\ X} \\
|
||||
\label{eq:monad-kleisli-laws-1}
|
||||
\pure \fish f & ≡ f \\
|
||||
\label{eq:monad-kleisli-laws-2}
|
||||
(\bind\ f) \rrr (\bind\ g) & ≡ \bind\ (f \fish g)
|
||||
\end{align*}
|
||||
\end{frame}
|
||||
\begin{frame}
|
||||
\frametitle{Monads}
|
||||
\framesubtitle{Equivalence}
|
||||
In the monoidal formulation we can define $\bind$:
|
||||
%
|
||||
$$
|
||||
\bind\ f ≜ \join \lll \fmap\ f
|
||||
$$
|
||||
\pause
|
||||
%
|
||||
And likewise in the Kleisli formulation we can define $\join$:
|
||||
%
|
||||
$$
|
||||
\join ≜ \bind\ \identity
|
||||
$$
|
||||
\pause
|
||||
The laws are logically equivalent. So we get:
|
||||
%
|
||||
$$
|
||||
\var{Monoidal} ≃ \var{Kleisli}
|
||||
$$
|
||||
%
|
||||
\end{frame}
|
||||
\end{document}
|
95
doc/title.tex
Normal file
95
doc/title.tex
Normal file
|
@ -0,0 +1,95 @@
|
|||
%% FRONTMATTER
|
||||
\frontmatter
|
||||
%% \newgeometry{top=3cm, bottom=3cm,left=2.25 cm, right=2.25cm}
|
||||
\begingroup
|
||||
\thispagestyle{empty}
|
||||
{\Huge\thetitle}\\[.5cm]
|
||||
{\Large A formalization of category theory in Cubical Agda}\\[6cm]
|
||||
\begin{center}
|
||||
\includegraphics[width=\linewidth,keepaspectratio]{isomorphism.png}
|
||||
\end{center}
|
||||
% Cover text
|
||||
\vfill
|
||||
%% \renewcommand{\familydefault}{\sfdefault} \normalfont % Set cover page font
|
||||
{\Large\theauthor}\\[.5cm]
|
||||
Master's thesis in Computer Science
|
||||
\endgroup
|
||||
%% \end{titlepage}
|
||||
|
||||
|
||||
% BACK OF COVER PAGE (BLANK PAGE)
|
||||
\newpage
|
||||
%% \newgeometry{a4paper} % Temporarily change margins
|
||||
%% \restoregeometry
|
||||
\thispagestyle{empty}
|
||||
\null
|
||||
|
||||
%% \begin{titlepage}
|
||||
|
||||
% TITLE PAGE
|
||||
\newpage
|
||||
\thispagestyle{empty}
|
||||
\begin{center}
|
||||
\textsc{\LARGE Master's thesis \the\year}\\[4cm] % Report number is currently not in use
|
||||
\textbf{\LARGE \thetitle} \\[1cm]
|
||||
{\large \subtitle}\\[1cm]
|
||||
{\large \theauthor}
|
||||
|
||||
\vfill
|
||||
\centering
|
||||
\includegraphics[width=0.2\pdfpagewidth]{logo_eng.pdf}
|
||||
\vspace{5mm}
|
||||
|
||||
\textsc{Department of Computer Science and Engineering}\\
|
||||
\textsc{{\researchgroup}}\\
|
||||
%Name of research group (if applicable)\\
|
||||
\textsc{\institution} \\
|
||||
\textsc{Gothenburg, Sweden \the\year}\\
|
||||
\end{center}
|
||||
|
||||
|
||||
% IMPRINT PAGE (BACK OF TITLE PAGE)
|
||||
\newpage
|
||||
\thispagestyle{plain}
|
||||
\textit{\thetitle}\\
|
||||
\subtitle\\
|
||||
\copyright\ \the\year ~ \textsc{\theauthor}
|
||||
\vspace{4.5cm}
|
||||
|
||||
\setlength{\parskip}{0.5cm}
|
||||
\textbf{Author:}\\
|
||||
\theauthor\\
|
||||
\href{mailto:\authoremail}{\texttt{<\authoremail>}}
|
||||
|
||||
\textbf{Supervisor:}\\
|
||||
\supervisor\\
|
||||
\href{mailto:\supervisoremail}{\texttt{<\supervisoremail>}}\\
|
||||
\supervisordepartment
|
||||
|
||||
\textbf{Co-supervisor:}\\
|
||||
\cosupervisor\\
|
||||
\href{mailto:\cosupervisoremail>}{\texttt{<\cosupervisoremail>}}\\
|
||||
\cosupervisordepartment
|
||||
|
||||
\textbf{Examiner:}\\
|
||||
\examiner\\
|
||||
\href{mailto:\examineremail}{\texttt{<\examineremail>}}\\
|
||||
\examinerdepartment
|
||||
|
||||
\vfill
|
||||
Master's Thesis \the\year\\ % Report number currently not in use
|
||||
\department\\
|
||||
%Division of Division name\\
|
||||
%Name of research group (if applicable)\\
|
||||
\institution\\
|
||||
SE-412 96 Gothenburg\\
|
||||
Telephone +46 31 772 1000 \setlength{\parskip}{0.5cm}\\
|
||||
% Caption for cover page figure if used, possibly with reference to further information in the report
|
||||
%% Cover: Wind visualization constructed in Matlab showing a surface of constant wind speed along with streamlines of the flow. \setlength{\parskip}{0.5cm}
|
||||
%Printed by [Name of printing company]\\
|
||||
Gothenburg, Sweden \the\year
|
||||
|
||||
%% \restoregeometry
|
||||
%% \end{titlepage}
|
||||
|
||||
\tableofcontents
|
|
@ -1 +1 @@
|
|||
Subproject commit 55ad461aa4fc6cf22e97812b7ff8128b3c7a902c
|
||||
Subproject commit 209626953d56294e9bd3d8892eda43b844b0edf9
|
|
@ -10,10 +10,11 @@ open import Cat.Category.NaturalTransformation
|
|||
open import Cat.Category.Yoneda
|
||||
open import Cat.Category.Monoid
|
||||
open import Cat.Category.Monad
|
||||
open Cat.Category.Monad.Monoidal
|
||||
open Cat.Category.Monad.Kleisli
|
||||
open import Cat.Category.Monad.Monoidal
|
||||
open import Cat.Category.Monad.Kleisli
|
||||
open import Cat.Category.Monad.Voevodsky
|
||||
|
||||
open import Cat.Categories.Opposite
|
||||
open import Cat.Categories.Sets
|
||||
open import Cat.Categories.Cat
|
||||
open import Cat.Categories.Rel
|
||||
|
|
|
@ -5,6 +5,7 @@ open import Cat.Prelude
|
|||
open import Cat.Category
|
||||
open import Cat.Category.Functor
|
||||
open import Cat.Categories.Fam
|
||||
open import Cat.Categories.Opposite
|
||||
|
||||
module _ {ℓa ℓb : Level} where
|
||||
record CwF : Set (lsuc (ℓa ⊔ ℓb)) where
|
||||
|
|
|
@ -1,13 +1,14 @@
|
|||
{-# OPTIONS --allow-unsolved-metas --cubical --caching #-}
|
||||
module Cat.Categories.Fun where
|
||||
|
||||
|
||||
open import Cat.Prelude
|
||||
open import Cat.Equivalence
|
||||
|
||||
open import Cat.Category
|
||||
open import Cat.Category.Functor
|
||||
import Cat.Category.NaturalTransformation
|
||||
as NaturalTransformation
|
||||
open import Cat.Categories.Opposite
|
||||
|
||||
module Fun {ℓc ℓc' ℓd ℓd' : Level} (ℂ : Category ℓc ℓc') (𝔻 : Category ℓd ℓd') where
|
||||
open NaturalTransformation ℂ 𝔻 public hiding (module Properties)
|
||||
|
@ -52,14 +53,14 @@ module Fun {ℓc ℓc' ℓd ℓd' : Level} (ℂ : Category ℓc ℓc') (𝔻 : C
|
|||
lem : coe (pp {C}) 𝔻.identity ≡ f→g
|
||||
lem = trans (𝔻.9-1-9-right {b = Functor.omap F C} 𝔻.identity p*) 𝔻.rightIdentity
|
||||
|
||||
idToNatTrans : NaturalTransformation F G
|
||||
idToNatTrans = (λ C → coe pp 𝔻.identity) , λ f → begin
|
||||
coe pp 𝔻.identity 𝔻.<<< F.fmap f ≡⟨ cong (𝔻._<<< F.fmap f) lem ⟩
|
||||
-- Just need to show that f→g is a natural transformation
|
||||
-- I know that it has an inverse; g→f
|
||||
f→g 𝔻.<<< F.fmap f ≡⟨ {!lem!} ⟩
|
||||
G.fmap f 𝔻.<<< f→g ≡⟨ cong (G.fmap f 𝔻.<<<_) (sym lem) ⟩
|
||||
G.fmap f 𝔻.<<< coe pp 𝔻.identity ∎
|
||||
-- idToNatTrans : NaturalTransformation F G
|
||||
-- idToNatTrans = (λ C → coe pp 𝔻.identity) , λ f → begin
|
||||
-- coe pp 𝔻.identity 𝔻.<<< F.fmap f ≡⟨ cong (𝔻._<<< F.fmap f) lem ⟩
|
||||
-- -- Just need to show that f→g is a natural transformation
|
||||
-- -- I know that it has an inverse; g→f
|
||||
-- f→g 𝔻.<<< F.fmap f ≡⟨ {!lem!} ⟩
|
||||
-- G.fmap f 𝔻.<<< f→g ≡⟨ cong (G.fmap f 𝔻.<<<_) (sym lem) ⟩
|
||||
-- G.fmap f 𝔻.<<< coe pp 𝔻.identity ∎
|
||||
|
||||
module _ {A B : Functor ℂ 𝔻} where
|
||||
module A = Functor A
|
||||
|
@ -92,70 +93,70 @@ module Fun {ℓc ℓc' ℓd ℓd' : Level} (ℂ : Category ℓc ℓc') (𝔻 : C
|
|||
U : (F : ℂ.Object → 𝔻.Object) → Set _
|
||||
U F = {A B : ℂ.Object} → ℂ [ A , B ] → 𝔻 [ F A , F B ]
|
||||
|
||||
module _
|
||||
(omap : ℂ.Object → 𝔻.Object)
|
||||
(p : A.omap ≡ omap)
|
||||
where
|
||||
D : Set _
|
||||
D = ( fmap : U omap)
|
||||
→ ( let
|
||||
raw-B : RawFunctor ℂ 𝔻
|
||||
raw-B = record { omap = omap ; fmap = fmap }
|
||||
)
|
||||
→ (isF-B' : IsFunctor ℂ 𝔻 raw-B)
|
||||
→ ( let
|
||||
B' : Functor ℂ 𝔻
|
||||
B' = record { raw = raw-B ; isFunctor = isF-B' }
|
||||
)
|
||||
→ (iso' : A ≊ B') → PathP (λ i → U (p i)) A.fmap fmap
|
||||
-- D : Set _
|
||||
-- D = PathP (λ i → U (p i)) A.fmap fmap
|
||||
-- eeq : (λ f → A.fmap f) ≡ fmap
|
||||
-- eeq = funExtImp (λ A → funExtImp (λ B → funExt (λ f → isofmap {A} {B} f)))
|
||||
-- where
|
||||
-- module _ {X : ℂ.Object} {Y : ℂ.Object} (f : ℂ [ X , Y ]) where
|
||||
-- isofmap : A.fmap f ≡ fmap f
|
||||
-- isofmap = {!ap!}
|
||||
d : D A.omap refl
|
||||
d = res
|
||||
where
|
||||
module _
|
||||
( fmap : U A.omap )
|
||||
( let
|
||||
raw-B : RawFunctor ℂ 𝔻
|
||||
raw-B = record { omap = A.omap ; fmap = fmap }
|
||||
)
|
||||
( isF-B' : IsFunctor ℂ 𝔻 raw-B )
|
||||
( let
|
||||
B' : Functor ℂ 𝔻
|
||||
B' = record { raw = raw-B ; isFunctor = isF-B' }
|
||||
)
|
||||
( iso' : A ≊ B' )
|
||||
where
|
||||
module _ {X Y : ℂ.Object} (f : ℂ [ X , Y ]) where
|
||||
step : {!!} 𝔻.≊ {!!}
|
||||
step = {!!}
|
||||
resres : A.fmap {X} {Y} f ≡ fmap {X} {Y} f
|
||||
resres = {!!}
|
||||
res : PathP (λ i → U A.omap) A.fmap fmap
|
||||
res i {X} {Y} f = resres f i
|
||||
-- module _
|
||||
-- (omap : ℂ.Object → 𝔻.Object)
|
||||
-- (p : A.omap ≡ omap)
|
||||
-- where
|
||||
-- D : Set _
|
||||
-- D = ( fmap : U omap)
|
||||
-- → ( let
|
||||
-- raw-B : RawFunctor ℂ 𝔻
|
||||
-- raw-B = record { omap = omap ; fmap = fmap }
|
||||
-- )
|
||||
-- → (isF-B' : IsFunctor ℂ 𝔻 raw-B)
|
||||
-- → ( let
|
||||
-- B' : Functor ℂ 𝔻
|
||||
-- B' = record { raw = raw-B ; isFunctor = isF-B' }
|
||||
-- )
|
||||
-- → (iso' : A ≊ B') → PathP (λ i → U (p i)) A.fmap fmap
|
||||
-- -- D : Set _
|
||||
-- -- D = PathP (λ i → U (p i)) A.fmap fmap
|
||||
-- -- eeq : (λ f → A.fmap f) ≡ fmap
|
||||
-- -- eeq = funExtImp (λ A → funExtImp (λ B → funExt (λ f → isofmap {A} {B} f)))
|
||||
-- -- where
|
||||
-- -- module _ {X : ℂ.Object} {Y : ℂ.Object} (f : ℂ [ X , Y ]) where
|
||||
-- -- isofmap : A.fmap f ≡ fmap f
|
||||
-- -- isofmap = {!ap!}
|
||||
-- d : D A.omap refl
|
||||
-- d = res
|
||||
-- where
|
||||
-- module _
|
||||
-- ( fmap : U A.omap )
|
||||
-- ( let
|
||||
-- raw-B : RawFunctor ℂ 𝔻
|
||||
-- raw-B = record { omap = A.omap ; fmap = fmap }
|
||||
-- )
|
||||
-- ( isF-B' : IsFunctor ℂ 𝔻 raw-B )
|
||||
-- ( let
|
||||
-- B' : Functor ℂ 𝔻
|
||||
-- B' = record { raw = raw-B ; isFunctor = isF-B' }
|
||||
-- )
|
||||
-- ( iso' : A ≊ B' )
|
||||
-- where
|
||||
-- module _ {X Y : ℂ.Object} (f : ℂ [ X , Y ]) where
|
||||
-- step : {!!} 𝔻.≊ {!!}
|
||||
-- step = {!!}
|
||||
-- resres : A.fmap {X} {Y} f ≡ fmap {X} {Y} f
|
||||
-- resres = {!!}
|
||||
-- res : PathP (λ i → U A.omap) A.fmap fmap
|
||||
-- res i {X} {Y} f = resres f i
|
||||
|
||||
fmapEq : PathP (λ i → U (omapEq i)) A.fmap B.fmap
|
||||
fmapEq = pathJ D d B.omap omapEq B.fmap B.isFunctor iso
|
||||
-- fmapEq : PathP (λ i → U (omapEq i)) A.fmap B.fmap
|
||||
-- fmapEq = pathJ D d B.omap omapEq B.fmap B.isFunctor iso
|
||||
|
||||
rawEq : A.raw ≡ B.raw
|
||||
rawEq i = record { omap = omapEq i ; fmap = fmapEq i }
|
||||
-- rawEq : A.raw ≡ B.raw
|
||||
-- rawEq i = record { omap = omapEq i ; fmap = fmapEq i }
|
||||
|
||||
private
|
||||
f : (A ≡ B) → (A ≊ B)
|
||||
f p = idToNatTrans p , idToNatTrans (sym p) , NaturalTransformation≡ A A (funExt (λ C → {!!})) , {!!}
|
||||
g : (A ≊ B) → (A ≡ B)
|
||||
g = Functor≡ ∘ rawEq
|
||||
inv : AreInverses f g
|
||||
inv = {!funExt λ p → ?!} , {!!}
|
||||
|
||||
iso : (A ≡ B) ≅ (A ≊ B)
|
||||
iso = f , g , inv
|
||||
-- private
|
||||
-- f : (A ≡ B) → (A ≊ B)
|
||||
-- f p = idToNatTrans p , idToNatTrans (sym p) , NaturalTransformation≡ A A (funExt (λ C → {!!})) , {!!}
|
||||
-- g : (A ≊ B) → (A ≡ B)
|
||||
-- g = Functor≡ ∘ rawEq
|
||||
-- inv : AreInverses f g
|
||||
-- inv = {!funExt λ p → ?!} , {!!}
|
||||
postulate
|
||||
iso : (A ≡ B) ≅ (A ≊ B)
|
||||
-- iso = f , g , inv
|
||||
|
||||
univ : (A ≡ B) ≃ (A ≊ B)
|
||||
univ = fromIsomorphism _ _ iso
|
||||
|
|
96
src/Cat/Categories/Opposite.agda
Normal file
|
@ -0,0 +1,96 @@
|
|||
{-# OPTIONS --cubical #-}
|
||||
module Cat.Categories.Opposite where
|
||||
|
||||
open import Cat.Prelude
|
||||
open import Cat.Equivalence
|
||||
open import Cat.Category
|
||||
|
||||
-- | The opposite category
|
||||
--
|
||||
-- The opposite category is the category where the direction of the arrows is
|
||||
-- flipped.
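Purely as an informal aside (not part of the Agda development), the same arrow-flipping idea can be sketched in Haskell on top of Control.Category; the wrapper Op and its field runOp are names invented for this sketch:

import Prelude hiding (id, (.))
import Control.Category

-- Illustrative sketch: the opposite category keeps the objects and reverses
-- every arrow, so composition runs the other way around.
newtype Op cat a b = Op { runOp :: cat b a }

instance Category cat => Category (Op cat) where
  id          = Op id        -- identities are unchanged
  Op f . Op g = Op (g . f)   -- composition order is flipped

Taking the opposite twice only adds and removes the wrapper, which is the intuition behind oppositeIsInvolution at the bottom of this file.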
|
||||
module _ {ℓa ℓb : Level} where
|
||||
module _ (ℂ : Category ℓa ℓb) where
|
||||
private
|
||||
module _ where
|
||||
module ℂ = Category ℂ
|
||||
opRaw : RawCategory ℓa ℓb
|
||||
RawCategory.Object opRaw = ℂ.Object
|
||||
RawCategory.Arrow opRaw = flip ℂ.Arrow
|
||||
RawCategory.identity opRaw = ℂ.identity
|
||||
RawCategory._<<<_ opRaw = ℂ._>>>_
|
||||
|
||||
open RawCategory opRaw
|
||||
|
||||
isPreCategory : IsPreCategory opRaw
|
||||
IsPreCategory.isAssociative isPreCategory = sym ℂ.isAssociative
|
||||
IsPreCategory.isIdentity isPreCategory = swap ℂ.isIdentity
|
||||
IsPreCategory.arrowsAreSets isPreCategory = ℂ.arrowsAreSets
|
||||
|
||||
open IsPreCategory isPreCategory
|
||||
|
||||
module _ {A B : ℂ.Object} where
|
||||
open Σ (toIso _ _ (ℂ.univalent {A} {B}))
|
||||
renaming (fst to idToIso* ; snd to inv*)
|
||||
open AreInverses {f = ℂ.idToIso A B} {idToIso*} inv*
|
||||
|
||||
shuffle : A ≊ B → A ℂ.≊ B
|
||||
shuffle (f , g , inv) = g , f , inv
|
||||
|
||||
shuffle~ : A ℂ.≊ B → A ≊ B
|
||||
shuffle~ (f , g , inv) = g , f , inv
|
||||
|
||||
lem : (p : A ≡ B) → idToIso A B p ≡ shuffle~ (ℂ.idToIso A B p)
|
||||
lem p = isoEq refl
|
||||
|
||||
isoToId* : A ≊ B → A ≡ B
|
||||
isoToId* = idToIso* ∘ shuffle
|
||||
|
||||
inv : AreInverses (idToIso A B) isoToId*
|
||||
inv =
|
||||
( funExt (λ x → begin
|
||||
(isoToId* ∘ idToIso A B) x
|
||||
≡⟨⟩
|
||||
(idToIso* ∘ shuffle ∘ idToIso A B) x
|
||||
≡⟨ cong (λ φ → φ x) (cong (λ φ → idToIso* ∘ shuffle ∘ φ) (funExt lem)) ⟩
|
||||
(idToIso* ∘ shuffle ∘ shuffle~ ∘ ℂ.idToIso A B) x
|
||||
≡⟨⟩
|
||||
(idToIso* ∘ ℂ.idToIso A B) x
|
||||
≡⟨ (λ i → verso-recto i x) ⟩
|
||||
x ∎)
|
||||
, funExt (λ x → begin
|
||||
(idToIso A B ∘ idToIso* ∘ shuffle) x
|
||||
≡⟨ cong (λ φ → φ x) (cong (λ φ → φ ∘ idToIso* ∘ shuffle) (funExt lem)) ⟩
|
||||
(shuffle~ ∘ ℂ.idToIso A B ∘ idToIso* ∘ shuffle) x
|
||||
≡⟨ cong (λ φ → φ x) (cong (λ φ → shuffle~ ∘ φ ∘ shuffle) recto-verso) ⟩
|
||||
(shuffle~ ∘ shuffle) x
|
||||
≡⟨⟩
|
||||
x ∎)
|
||||
)
|
||||
|
||||
isCategory : IsCategory opRaw
|
||||
IsCategory.isPreCategory isCategory = isPreCategory
|
||||
IsCategory.univalent isCategory
|
||||
= univalenceFromIsomorphism (isoToId* , inv)
|
||||
|
||||
opposite : Category ℓa ℓb
|
||||
Category.raw opposite = opRaw
|
||||
Category.isCategory opposite = isCategory
|
||||
|
||||
-- As demonstrated here, a side-effect of having no-eta-equality on constructors
|
||||
-- means that we need to pick things apart to show that things are indeed
|
||||
-- definitionally equal. I.e., a thing that would normally be provable in one
|
||||
-- line now takes 13 lines! Admittedly it's a simple proof.
|
||||
module _ {ℂ : Category ℓa ℓb} where
|
||||
open Category ℂ
|
||||
private
|
||||
-- Since they really are definitionally equal we just need to pick apart
|
||||
-- the data-type.
|
||||
rawInv : Category.raw (opposite (opposite ℂ)) ≡ raw
|
||||
RawCategory.Object (rawInv _) = Object
|
||||
RawCategory.Arrow (rawInv _) = Arrow
|
||||
RawCategory.identity (rawInv _) = identity
|
||||
RawCategory._<<<_ (rawInv _) = _<<<_
|
||||
|
||||
oppositeIsInvolution : opposite (opposite ℂ) ≡ ℂ
|
||||
oppositeIsInvolution = Category≡ rawInv
|
|
@ -2,6 +2,7 @@
|
|||
module Cat.Categories.Rel where
|
||||
|
||||
open import Cat.Prelude hiding (Rel)
|
||||
open import Cat.Equivalence
|
||||
|
||||
open import Cat.Category
|
||||
|
||||
|
@ -61,15 +62,9 @@ module _ {A B : Set} {S : Subset (A × B)} (ab : A × B) where
|
|||
lem0 = (λ a'' a≡a'' → ∀ a''b∈S → (forwards ∘ backwards) (a'' , a≡a'' , a''b∈S) ≡ (a'' , a≡a'' , a''b∈S))
|
||||
lem1 = (λ z₁ → cong (\ z → a , refl , z) (pathJprop (\ y _ → y) z₁))
|
||||
|
||||
isequiv : isEquiv
|
||||
(Σ[ a' ∈ A ] (a , a') ∈ Diag A × (a' , b) ∈ S)
|
||||
((a , b) ∈ S)
|
||||
backwards
|
||||
isequiv y = gradLemma backwards forwards fwd-bwd bwd-fwd y
|
||||
|
||||
equi : (Σ[ a' ∈ A ] (a , a') ∈ Diag A × (a' , b) ∈ S)
|
||||
≃ (a , b) ∈ S
|
||||
equi = backwards , isequiv
|
||||
equi = fromIsomorphism _ _ (backwards , forwards , funExt bwd-fwd , funExt fwd-bwd)
|
||||
|
||||
ident-r : (Σ[ a' ∈ A ] (a , a') ∈ Diag A × (a' , b) ∈ S)
|
||||
≡ (a , b) ∈ S
|
||||
|
@ -95,15 +90,9 @@ module _ {A B : Set} {S : Subset (A × B)} (ab : A × B) where
|
|||
lem0 = (λ b'' b≡b'' → (ab''∈S : (a , b'') ∈ S) → (forwards ∘ backwards) (b'' , ab''∈S , sym b≡b'') ≡ (b'' , ab''∈S , sym b≡b''))
|
||||
lem1 = (λ ab''∈S → cong (λ φ → b , φ , refl) (pathJprop (λ y _ → y) ab''∈S))
|
||||
|
||||
isequiv : isEquiv
|
||||
(Σ[ b' ∈ B ] (a , b') ∈ S × (b' , b) ∈ Diag B)
|
||||
((a , b) ∈ S)
|
||||
backwards
|
||||
isequiv ab∈S = gradLemma backwards forwards bwd-fwd fwd-bwd ab∈S
|
||||
|
||||
equi : (Σ[ b' ∈ B ] (a , b') ∈ S × (b' , b) ∈ Diag B)
|
||||
≃ ab ∈ S
|
||||
equi = backwards , isequiv
|
||||
equi = fromIsomorphism _ _ (backwards , (forwards , funExt fwd-bwd , funExt bwd-fwd))
|
||||
|
||||
ident-l : (Σ[ b' ∈ B ] (a , b') ∈ S × (b' , b) ∈ Diag B)
|
||||
≡ ab ∈ S
|
||||
|
@ -133,15 +122,9 @@ module _ {A B C D : Set} {S : Subset (A × B)} {R : Subset (B × C)} {Q : Subset
|
|||
bwd-fwd : (x : Q⊕⟨R⊕S⟩) → (bwd ∘ fwd) x ≡ x
|
||||
bwd-fwd x = refl
|
||||
|
||||
isequiv : isEquiv
|
||||
(Σ[ c ∈ C ] (Σ[ b ∈ B ] (a , b) ∈ S × (b , c) ∈ R) × (c , d) ∈ Q)
|
||||
(Σ[ b ∈ B ] (a , b) ∈ S × (Σ[ c ∈ C ] (b , c) ∈ R × (c , d) ∈ Q))
|
||||
fwd
|
||||
isequiv = gradLemma fwd bwd fwd-bwd bwd-fwd
|
||||
|
||||
equi : (Σ[ c ∈ C ] (Σ[ b ∈ B ] (a , b) ∈ S × (b , c) ∈ R) × (c , d) ∈ Q)
|
||||
≃ (Σ[ b ∈ B ] (a , b) ∈ S × (Σ[ c ∈ C ] (b , c) ∈ R × (c , d) ∈ Q))
|
||||
equi = fwd , isequiv
|
||||
equi = fromIsomorphism _ _ (fwd , bwd , funExt bwd-fwd , funExt fwd-bwd)
|
||||
|
||||
-- isAssociativec : Q + (R + S) ≡ (Q + R) + S
|
||||
is-isAssociative : (Σ[ c ∈ C ] (Σ[ b ∈ B ] (a , b) ∈ S × (b , c) ∈ R) × (c , d) ∈ Q)
|
||||
|
|
|
@ -3,11 +3,11 @@
|
|||
module Cat.Categories.Sets where
|
||||
|
||||
open import Cat.Prelude as P
|
||||
|
||||
open import Cat.Equivalence
|
||||
open import Cat.Category
|
||||
open import Cat.Category.Functor
|
||||
open import Cat.Category.Product
|
||||
open import Cat.Equivalence
|
||||
open import Cat.Categories.Opposite
|
||||
|
||||
_⊙_ : {ℓa ℓb ℓc : Level} {A : Set ℓa} {B : Set ℓb} {C : Set ℓc} → (A ≃ B) → (B ≃ C) → A ≃ C
|
||||
eqA ⊙ eqB = Equivalence.compose eqA eqB
|
||||
|
|
170
src/Cat/Categories/Span.agda
Normal file
|
@ -0,0 +1,170 @@
|
|||
{-# OPTIONS --cubical --caching #-}
|
||||
module Cat.Categories.Span where
|
||||
|
||||
open import Cat.Prelude as P hiding (_×_ ; fst ; snd)
|
||||
open import Cat.Equivalence
|
||||
open import Cat.Category
|
||||
|
||||
module _ {ℓa ℓb : Level} (ℂ : Category ℓa ℓb)
|
||||
(let module ℂ = Category ℂ) (𝒜 ℬ : ℂ.Object) where
|
||||
|
||||
open P
|
||||
|
||||
private
|
||||
module _ where
|
||||
raw : RawCategory _ _
|
||||
raw = record
|
||||
{ Object = Σ[ X ∈ ℂ.Object ] ℂ.Arrow X 𝒜 × ℂ.Arrow X ℬ
|
||||
; Arrow = λ{ (A , a0 , a1) (B , b0 , b1)
|
||||
→ Σ[ f ∈ ℂ.Arrow A B ]
|
||||
ℂ [ b0 ∘ f ] ≡ a0
|
||||
× ℂ [ b1 ∘ f ] ≡ a1
|
||||
}
|
||||
; identity = λ{ {X , f , g} → ℂ.identity {X} , ℂ.rightIdentity , ℂ.rightIdentity}
|
||||
; _<<<_ = λ { {_ , a0 , a1} {_ , b0 , b1} {_ , c0 , c1} (f , f0 , f1) (g , g0 , g1)
|
||||
→ (f ℂ.<<< g)
|
||||
, (begin
|
||||
ℂ [ c0 ∘ ℂ [ f ∘ g ] ] ≡⟨ ℂ.isAssociative ⟩
|
||||
ℂ [ ℂ [ c0 ∘ f ] ∘ g ] ≡⟨ cong (λ φ → ℂ [ φ ∘ g ]) f0 ⟩
|
||||
ℂ [ b0 ∘ g ] ≡⟨ g0 ⟩
|
||||
a0 ∎
|
||||
)
|
||||
, (begin
|
||||
ℂ [ c1 ∘ ℂ [ f ∘ g ] ] ≡⟨ ℂ.isAssociative ⟩
|
||||
ℂ [ ℂ [ c1 ∘ f ] ∘ g ] ≡⟨ cong (λ φ → ℂ [ φ ∘ g ]) f1 ⟩
|
||||
ℂ [ b1 ∘ g ] ≡⟨ g1 ⟩
|
||||
a1 ∎
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
module _ where
|
||||
open RawCategory raw
|
||||
|
||||
propEqs : ∀ {X' : Object}{Y' : Object} (let X , xa , xb = X') (let Y , ya , yb = Y')
|
||||
→ (xy : ℂ.Arrow X Y) → isProp (ℂ [ ya ∘ xy ] ≡ xa × ℂ [ yb ∘ xy ] ≡ xb)
|
||||
propEqs xs = propSig (ℂ.arrowsAreSets _ _) (\ _ → ℂ.arrowsAreSets _ _)
|
||||
|
||||
arrowEq : {X Y : Object} {f g : Arrow X Y} → fst f ≡ fst g → f ≡ g
|
||||
arrowEq {X} {Y} {f} {g} p = λ i → p i , lemPropF propEqs p {snd f} {snd g} i
|
||||
|
||||
isAssociative : IsAssociative
|
||||
isAssociative {f = f , f0 , f1} {g , g0 , g1} {h , h0 , h1} = arrowEq ℂ.isAssociative
|
||||
|
||||
isIdentity : IsIdentity identity
|
||||
isIdentity {AA@(A , a0 , a1)} {BB@(B , b0 , b1)} {f , f0 , f1} = arrowEq ℂ.leftIdentity , arrowEq ℂ.rightIdentity
|
||||
|
||||
arrowsAreSets : ArrowsAreSets
|
||||
arrowsAreSets {X , x0 , x1} {Y , y0 , y1}
|
||||
= sigPresSet ℂ.arrowsAreSets λ a → propSet (propEqs _)
|
||||
|
||||
isPreCat : IsPreCategory raw
|
||||
IsPreCategory.isAssociative isPreCat = isAssociative
|
||||
IsPreCategory.isIdentity isPreCat = isIdentity
|
||||
IsPreCategory.arrowsAreSets isPreCat = arrowsAreSets
|
||||
|
||||
open IsPreCategory isPreCat
|
||||
|
||||
module _ {𝕏 𝕐 : Object} where
|
||||
open Σ 𝕏 renaming (fst to X ; snd to x)
|
||||
open Σ x renaming (fst to xa ; snd to xb)
|
||||
open Σ 𝕐 renaming (fst to Y ; snd to y)
|
||||
open Σ y renaming (fst to ya ; snd to yb)
|
||||
open import Cat.Equivalence using (composeIso) renaming (_≅_ to _≅_)
|
||||
|
||||
-- The proof will be a sequence of isomorphisms between the
|
||||
-- following 4 types:
|
||||
T0 = ((X , xa , xb) ≡ (Y , ya , yb))
|
||||
T1 = (Σ[ p ∈ (X ≡ Y) ] (PathP (λ i → ℂ.Arrow (p i) 𝒜) xa ya) × (PathP (λ i → ℂ.Arrow (p i) ℬ) xb yb))
|
||||
T2 = Σ (X ℂ.≊ Y) (λ iso
|
||||
→ let p = ℂ.isoToId iso
|
||||
in
|
||||
( PathP (λ i → ℂ.Arrow (p i) 𝒜) xa ya)
|
||||
× PathP (λ i → ℂ.Arrow (p i) ℬ) xb yb
|
||||
)
|
||||
T3 = ((X , xa , xb) ≊ (Y , ya , yb))
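The steps are glued together by ordinary composition of isomorphisms. As an informal aside, that glue can be sketched in Haskell (the names Iso, to and from are invented here; they merely mirror composeIso from Cat.Equivalence):

-- Illustrative sketch: an isomorphism is a pair of mutually inverse maps,
-- and isomorphisms compose componentwise.
data Iso a b = Iso { to :: a -> b, from :: b -> a }

composeIso :: Iso a b -> Iso b c -> Iso a c
composeIso (Iso f f') (Iso g g') = Iso (g . f) (f' . g')

idIso :: Iso a a
idIso = Iso id id

This is exactly the shape of the chain assembled below as iso = step0 ⊙ step1 ⊙ step2.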
|
||||
|
||||
step0 : T0 ≅ T1
|
||||
step0
|
||||
= (λ p → cong fst p , cong-d (fst ∘ snd) p , cong-d (snd ∘ snd) p)
|
||||
-- , (λ x → λ i → fst x i , (fst (snd x) i) , (snd (snd x) i))
|
||||
, (λ{ (p , q , r) → Σ≡ p λ i → q i , r i})
|
||||
, funExt (λ{ p → refl})
|
||||
, funExt (λ{ (p , q , r) → refl})
|
||||
|
||||
step1 : T1 ≅ T2
|
||||
step1
|
||||
= symIso
|
||||
(isoSigFst
|
||||
{A = (X ℂ.≊ Y)}
|
||||
{B = (X ≡ Y)}
|
||||
(ℂ.groupoidObject _ _)
|
||||
{Q = \ p → (PathP (λ i → ℂ.Arrow (p i) 𝒜) xa ya) × (PathP (λ i → ℂ.Arrow (p i) ℬ) xb yb)}
|
||||
ℂ.isoToId
|
||||
(symIso (_ , ℂ.asTypeIso {X} {Y}) .snd)
|
||||
)
|
||||
|
||||
step2 : T2 ≅ T3
|
||||
step2
|
||||
= ( λ{ (iso@(f , f~ , inv-f) , p , q)
|
||||
→ ( f , sym (ℂ.domain-twist-sym iso p) , sym (ℂ.domain-twist-sym iso q))
|
||||
, ( f~ , sym (ℂ.domain-twist iso p) , sym (ℂ.domain-twist iso q))
|
||||
, arrowEq (fst inv-f)
|
||||
, arrowEq (snd inv-f)
|
||||
}
|
||||
)
|
||||
, (λ{ (f , f~ , inv-f , inv-f~) →
|
||||
let
|
||||
iso : X ℂ.≊ Y
|
||||
iso = fst f , fst f~ , cong fst inv-f , cong fst inv-f~
|
||||
p : X ≡ Y
|
||||
p = ℂ.isoToId iso
|
||||
pA : ℂ.Arrow X 𝒜 ≡ ℂ.Arrow Y 𝒜
|
||||
pA = cong (λ x → ℂ.Arrow x 𝒜) p
|
||||
pB : ℂ.Arrow X ℬ ≡ ℂ.Arrow Y ℬ
|
||||
pB = cong (λ x → ℂ.Arrow x ℬ) p
|
||||
k0 = begin
|
||||
coe pB xb ≡⟨ ℂ.coe-dom iso ⟩
|
||||
xb ℂ.<<< fst f~ ≡⟨ snd (snd f~) ⟩
|
||||
yb ∎
|
||||
k1 = begin
|
||||
coe pA xa ≡⟨ ℂ.coe-dom iso ⟩
|
||||
xa ℂ.<<< fst f~ ≡⟨ fst (snd f~) ⟩
|
||||
ya ∎
|
||||
in iso , coe-lem-inv k1 , coe-lem-inv k0})
|
||||
, funExt (λ x → lemSig
|
||||
(λ x → propSig prop0 (λ _ → prop1))
|
||||
_ _
|
||||
(Σ≡ refl (ℂ.propIsomorphism _ _ _)))
|
||||
, funExt (λ{ (f , _) → lemSig propIsomorphism _ _ (Σ≡ refl (propEqs _ _ _))})
|
||||
where
|
||||
prop0 : ∀ {x} → isProp (PathP (λ i → ℂ.Arrow (ℂ.isoToId x i) 𝒜) xa ya)
|
||||
prop0 {x} = pathJ (λ y p → ∀ x → isProp (PathP (λ i → ℂ.Arrow (p i) 𝒜) xa x)) (λ x → ℂ.arrowsAreSets _ _) Y (ℂ.isoToId x) ya
|
||||
prop1 : ∀ {x} → isProp (PathP (λ i → ℂ.Arrow (ℂ.isoToId x i) ℬ) xb yb)
|
||||
prop1 {x} = pathJ (λ y p → ∀ x → isProp (PathP (λ i → ℂ.Arrow (p i) ℬ) xb x)) (λ x → ℂ.arrowsAreSets _ _) Y (ℂ.isoToId x) yb
|
||||
-- One thing to watch out for here is that the isomorphisms going forwards
|
||||
-- must compose to give idToIso
|
||||
iso
|
||||
: ((X , xa , xb) ≡ (Y , ya , yb))
|
||||
≅ ((X , xa , xb) ≊ (Y , ya , yb))
|
||||
iso = step0 ⊙ step1 ⊙ step2
|
||||
where
|
||||
infixl 5 _⊙_
|
||||
_⊙_ = composeIso
|
||||
equiv1
|
||||
: ((X , xa , xb) ≡ (Y , ya , yb))
|
||||
≃ ((X , xa , xb) ≊ (Y , ya , yb))
|
||||
equiv1 = _ , fromIso _ _ (snd iso)
|
||||
|
||||
univalent : Univalent
|
||||
univalent = univalenceFrom≃ equiv1
|
||||
|
||||
isCat : IsCategory raw
|
||||
IsCategory.isPreCategory isCat = isPreCat
|
||||
IsCategory.univalent isCat = univalent
|
||||
|
||||
span : Category _ _
|
||||
span = record
|
||||
{ raw = raw
|
||||
; isCategory = isCat
|
||||
}
|
|
@ -44,7 +44,7 @@ open Cat.Equivalence
|
|||
-- about these. The laws defined are the types of the propositions - not the
|
||||
-- witnesses to them!
|
||||
record RawCategory (ℓa ℓb : Level) : Set (lsuc (ℓa ⊔ ℓb)) where
|
||||
no-eta-equality
|
||||
-- no-eta-equality
|
||||
field
|
||||
Object : Set ℓa
|
||||
Arrow : Object → Object → Set ℓb
|
||||
|
@ -55,14 +55,7 @@ record RawCategory (ℓa ℓb : Level) : Set (lsuc (ℓa ⊔ ℓb)) where
|
|||
-- infixl 8 _>>>_
|
||||
infixl 10 _<<<_ _>>>_
|
||||
|
||||
-- | Operations on data
|
||||
|
||||
domain : {a b : Object} → Arrow a b → Object
|
||||
domain {a} _ = a
|
||||
|
||||
codomain : {a b : Object} → Arrow a b → Object
|
||||
codomain {b = b} _ = b
|
||||
|
||||
-- | Reverse arrow composition
|
||||
_>>>_ : {A B C : Object} → (Arrow A B) → (Arrow B C) → Arrow A C
|
||||
f >>> g = g <<< f
|
||||
|
||||
|
@ -631,95 +624,3 @@ module _ {ℓa ℓb : Level} (ℂ : Category ℓa ℓb) where
|
|||
|
||||
_[_∘_] : {A B C : Object} → (g : Arrow B C) → (f : Arrow A B) → Arrow A C
|
||||
_[_∘_] = _<<<_
|
||||
|
||||
-- | The opposite category
|
||||
--
|
||||
-- The opposite category is the category where the direction of the arrows is
|
||||
-- flipped.
|
||||
module Opposite {ℓa ℓb : Level} where
|
||||
module _ (ℂ : Category ℓa ℓb) where
|
||||
private
|
||||
module _ where
|
||||
module ℂ = Category ℂ
|
||||
opRaw : RawCategory ℓa ℓb
|
||||
RawCategory.Object opRaw = ℂ.Object
|
||||
RawCategory.Arrow opRaw = flip ℂ.Arrow
|
||||
RawCategory.identity opRaw = ℂ.identity
|
||||
RawCategory._<<<_ opRaw = ℂ._>>>_
|
||||
|
||||
open RawCategory opRaw
|
||||
|
||||
isPreCategory : IsPreCategory opRaw
|
||||
IsPreCategory.isAssociative isPreCategory = sym ℂ.isAssociative
|
||||
IsPreCategory.isIdentity isPreCategory = swap ℂ.isIdentity
|
||||
IsPreCategory.arrowsAreSets isPreCategory = ℂ.arrowsAreSets
|
||||
|
||||
open IsPreCategory isPreCategory
|
||||
|
||||
module _ {A B : ℂ.Object} where
|
||||
open Σ (toIso _ _ (ℂ.univalent {A} {B}))
|
||||
renaming (fst to idToIso* ; snd to inv*)
|
||||
open AreInverses {f = ℂ.idToIso A B} {idToIso*} inv*
|
||||
|
||||
shuffle : A ≊ B → A ℂ.≊ B
|
||||
shuffle (f , g , inv) = g , f , inv
|
||||
|
||||
shuffle~ : A ℂ.≊ B → A ≊ B
|
||||
shuffle~ (f , g , inv) = g , f , inv
|
||||
|
||||
lem : (p : A ≡ B) → idToIso A B p ≡ shuffle~ (ℂ.idToIso A B p)
|
||||
lem p = isoEq refl
|
||||
|
||||
isoToId* : A ≊ B → A ≡ B
|
||||
isoToId* = idToIso* ∘ shuffle
|
||||
|
||||
inv : AreInverses (idToIso A B) isoToId*
|
||||
inv =
|
||||
( funExt (λ x → begin
|
||||
(isoToId* ∘ idToIso A B) x
|
||||
≡⟨⟩
|
||||
(idToIso* ∘ shuffle ∘ idToIso A B) x
|
||||
≡⟨ cong (λ φ → φ x) (cong (λ φ → idToIso* ∘ shuffle ∘ φ) (funExt lem)) ⟩
|
||||
(idToIso* ∘ shuffle ∘ shuffle~ ∘ ℂ.idToIso A B) x
|
||||
≡⟨⟩
|
||||
(idToIso* ∘ ℂ.idToIso A B) x
|
||||
≡⟨ (λ i → verso-recto i x) ⟩
|
||||
x ∎)
|
||||
, funExt (λ x → begin
|
||||
(idToIso A B ∘ idToIso* ∘ shuffle) x
|
||||
≡⟨ cong (λ φ → φ x) (cong (λ φ → φ ∘ idToIso* ∘ shuffle) (funExt lem)) ⟩
|
||||
(shuffle~ ∘ ℂ.idToIso A B ∘ idToIso* ∘ shuffle) x
|
||||
≡⟨ cong (λ φ → φ x) (cong (λ φ → shuffle~ ∘ φ ∘ shuffle) recto-verso) ⟩
|
||||
(shuffle~ ∘ shuffle) x
|
||||
≡⟨⟩
|
||||
x ∎)
|
||||
)
|
||||
|
||||
isCategory : IsCategory opRaw
|
||||
IsCategory.isPreCategory isCategory = isPreCategory
|
||||
IsCategory.univalent isCategory
|
||||
= univalenceFromIsomorphism (isoToId* , inv)
|
||||
|
||||
opposite : Category ℓa ℓb
|
||||
Category.raw opposite = opRaw
|
||||
Category.isCategory opposite = isCategory
|
||||
|
||||
-- As demonstrated here, a side-effect of having no-eta-equality on constructors
|
||||
-- means that we need to pick things apart to show that things are indeed
|
||||
-- definitionally equal. I.e., a thing that would normally be provable in one
|
||||
-- line now takes 13 lines! Admittedly it's a simple proof.
|
||||
module _ {ℂ : Category ℓa ℓb} where
|
||||
open Category ℂ
|
||||
private
|
||||
-- Since they really are definitionally equal we just need to pick apart
|
||||
-- the data-type.
|
||||
rawInv : Category.raw (opposite (opposite ℂ)) ≡ raw
|
||||
RawCategory.Object (rawInv _) = Object
|
||||
RawCategory.Arrow (rawInv _) = Arrow
|
||||
RawCategory.identity (rawInv _) = identity
|
||||
RawCategory._<<<_ (rawInv _) = _<<<_
|
||||
|
||||
oppositeIsInvolution : opposite (opposite ℂ) ≡ ℂ
|
||||
oppositeIsInvolution = Category≡ rawInv
|
||||
|
||||
open Opposite public
|
||||
|
|
|
@ -28,186 +28,213 @@ import Cat.Category.Monad.Monoidal
|
|||
import Cat.Category.Monad.Kleisli
|
||||
open import Cat.Categories.Fun
|
||||
|
||||
module Monoidal = Cat.Category.Monad.Monoidal
|
||||
module Kleisli = Cat.Category.Monad.Kleisli
|
||||
|
||||
-- | The monoidal and Kleisli presentations of monads are equivalent.
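As an informal warm-up, the two presentations and the translations between them can be transliterated into Haskell; the record and function names below are invented for this sketch and only mirror the Agda definitions that follow (one direction takes bind f = join ∘ fmap f, the other recovers fmap f = bind (pure ∘ f) and join = bind id):

{-# LANGUAGE RankNTypes #-}

-- Illustrative sketch: monoidal (join-based) versus Kleisli (bind-based)
-- presentation of a monad, and the passage between them.
data Monoidal f = Monoidal
  { mFmap :: forall a b. (a -> b) -> f a -> f b
  , mPure :: forall a. a -> f a
  , mJoin :: forall a. f (f a) -> f a
  }

data Kleisli f = Kleisli
  { kPure :: forall a. a -> f a
  , kBind :: forall a b. (a -> f b) -> f a -> f b
  }

toKleisli :: Monoidal f -> Kleisli f
toKleisli m = Kleisli (mPure m) (\f -> mJoin m . mFmap m f)   -- bind f = join . fmap f

toMonoidal :: Kleisli f -> Monoidal f
toMonoidal k = Monoidal
  (\f -> kBind k (kPure k . f))   -- fmap f = bind (pure . f)
  (kPure k)
  (kBind k id)                    -- join = bind id

Round-tripping these translations gives back the monad one started from, which is what the toKleislieq and toMonoidaleq lemmas below establish, together with the fact that the translations preserve the monad laws.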
|
||||
module _ {ℓa ℓb : Level} (ℂ : Category ℓa ℓb) where
|
||||
open Cat.Category.NaturalTransformation ℂ ℂ using (NaturalTransformation ; propIsNatural)
|
||||
private
|
||||
module ℂ = Category ℂ
|
||||
open ℂ using (Object ; Arrow ; identity ; _<<<_ ; _>>>_)
|
||||
module M = Monoidal ℂ
|
||||
module K = Kleisli ℂ
|
||||
|
||||
module _ (m : M.RawMonad) where
|
||||
open M.RawMonad m
|
||||
module Monoidal = Cat.Category.Monad.Monoidal ℂ
|
||||
module Kleisli = Cat.Category.Monad.Kleisli ℂ
|
||||
|
||||
forthRaw : K.RawMonad
|
||||
K.RawMonad.omap forthRaw = Romap
|
||||
K.RawMonad.pure forthRaw = pureT _
|
||||
K.RawMonad.bind forthRaw = bind
|
||||
module _ (m : Monoidal.RawMonad) where
|
||||
open Monoidal.RawMonad m
|
||||
|
||||
module _ {raw : M.RawMonad} (m : M.IsMonad raw) where
|
||||
private
|
||||
module MI = M.IsMonad m
|
||||
forthIsMonad : K.IsMonad (forthRaw raw)
|
||||
K.IsMonad.isIdentity forthIsMonad = snd MI.isInverse
|
||||
K.IsMonad.isNatural forthIsMonad = MI.isNatural
|
||||
K.IsMonad.isDistributive forthIsMonad = MI.isDistributive
|
||||
toKleisliRaw : Kleisli.RawMonad
|
||||
Kleisli.RawMonad.omap toKleisliRaw = Romap
|
||||
Kleisli.RawMonad.pure toKleisliRaw = pure
|
||||
Kleisli.RawMonad.bind toKleisliRaw = bind
|
||||
|
||||
forth : M.Monad → K.Monad
|
||||
Kleisli.Monad.raw (forth m) = forthRaw (M.Monad.raw m)
|
||||
Kleisli.Monad.isMonad (forth m) = forthIsMonad (M.Monad.isMonad m)
|
||||
module _ {raw : Monoidal.RawMonad} (m : Monoidal.IsMonad raw) where
|
||||
open Monoidal.IsMonad m
|
||||
|
||||
module _ (m : K.Monad) where
|
||||
open K.Monad m
|
||||
open Kleisli.RawMonad (toKleisliRaw raw) using (_>=>_)
|
||||
toKleisliIsMonad : Kleisli.IsMonad (toKleisliRaw raw)
|
||||
Kleisli.IsMonad.isIdentity toKleisliIsMonad = begin
|
||||
bind pure ≡⟨⟩
|
||||
join <<< (fmap pure) ≡⟨ snd isInverse ⟩
|
||||
identity ∎
|
||||
Kleisli.IsMonad.isNatural toKleisliIsMonad f = begin
|
||||
pure >=> f ≡⟨⟩
|
||||
pure >>> bind f ≡⟨⟩
|
||||
bind f <<< pure ≡⟨⟩
|
||||
(join <<< fmap f) <<< pure ≡⟨ isNatural f ⟩
|
||||
f ∎
|
||||
Kleisli.IsMonad.isDistributive toKleisliIsMonad f g = begin
|
||||
bind g >>> bind f ≡⟨⟩
|
||||
(join <<< fmap f) <<< (join <<< fmap g) ≡⟨ isDistributive f g ⟩
|
||||
join <<< fmap (join <<< fmap f <<< g) ≡⟨⟩
|
||||
bind (g >=> f) ∎
|
||||
-- Kleisli.IsMonad.isDistributive toKleisliIsMonad = isDistributive
|
||||
|
||||
backRaw : M.RawMonad
|
||||
M.RawMonad.R backRaw = R
|
||||
M.RawMonad.pureNT backRaw = pureNT
|
||||
M.RawMonad.joinNT backRaw = joinNT
|
||||
toKleisli : Monoidal.Monad → Kleisli.Monad
|
||||
Kleisli.Monad.raw (toKleisli m) = toKleisliRaw (Monoidal.Monad.raw m)
|
||||
Kleisli.Monad.isMonad (toKleisli m) = toKleisliIsMonad (Monoidal.Monad.isMonad m)
|
||||
|
||||
private
|
||||
open M.RawMonad backRaw renaming
|
||||
( join to join*
|
||||
; pure to pure*
|
||||
; bind to bind*
|
||||
; fmap to fmap*
|
||||
)
|
||||
module R = Functor (M.RawMonad.R backRaw)
|
||||
module _ (m : Kleisli.Monad) where
|
||||
open Kleisli.Monad m
|
||||
|
||||
backIsMonad : M.IsMonad backRaw
|
||||
M.IsMonad.isAssociative backIsMonad = begin
|
||||
join* <<< R.fmap join* ≡⟨⟩
|
||||
toMonoidalRaw : Monoidal.RawMonad
|
||||
Monoidal.RawMonad.R toMonoidalRaw = R
|
||||
Monoidal.RawMonad.pureNT toMonoidalRaw = pureNT
|
||||
Monoidal.RawMonad.joinNT toMonoidalRaw = joinNT
|
||||
|
||||
open Monoidal.RawMonad toMonoidalRaw renaming
|
||||
( join to join*
|
||||
; pure to pure*
|
||||
; bind to bind*
|
||||
; fmap to fmap*
|
||||
) using ()
|
||||
|
||||
toMonoidalIsMonad : Monoidal.IsMonad toMonoidalRaw
|
||||
Monoidal.IsMonad.isAssociative toMonoidalIsMonad = begin
|
||||
join* <<< fmap join* ≡⟨⟩
|
||||
join <<< fmap join ≡⟨ isNaturalForeign ⟩
|
||||
join <<< join ∎
|
||||
M.IsMonad.isInverse backIsMonad {X} = inv-l , inv-r
|
||||
Monoidal.IsMonad.isInverse toMonoidalIsMonad {X} = inv-l , inv-r
|
||||
where
|
||||
inv-l = begin
|
||||
join <<< pure ≡⟨ fst isInverse ⟩
|
||||
identity ∎
|
||||
inv-r = begin
|
||||
joinT X <<< R.fmap (pureT X) ≡⟨⟩
|
||||
join* <<< fmap* pure* ≡⟨⟩
|
||||
join <<< fmap pure ≡⟨ snd isInverse ⟩
|
||||
identity ∎
|
||||
|
||||
back : K.Monad → M.Monad
|
||||
Monoidal.Monad.raw (back m) = backRaw m
|
||||
Monoidal.Monad.isMonad (back m) = backIsMonad m
|
||||
toMonoidal : Kleisli.Monad → Monoidal.Monad
|
||||
Monoidal.Monad.raw (toMonoidal m) = toMonoidalRaw m
|
||||
Monoidal.Monad.isMonad (toMonoidal m) = toMonoidalIsMonad m
|
||||
|
||||
module _ (m : K.Monad) where
|
||||
module _ (m : Kleisli.Monad) where
|
||||
private
|
||||
open K.Monad m
|
||||
open Kleisli.Monad m
|
||||
bindEq : ∀ {X Y}
|
||||
→ K.RawMonad.bind (forthRaw (backRaw m)) {X} {Y}
|
||||
≡ K.RawMonad.bind (K.Monad.raw m)
|
||||
bindEq {X} {Y} = begin
|
||||
K.RawMonad.bind (forthRaw (backRaw m)) ≡⟨⟩
|
||||
(λ f → join <<< fmap f) ≡⟨⟩
|
||||
(λ f → bind (f >>> pure) >>> bind identity) ≡⟨ funExt lem ⟩
|
||||
(λ f → bind f) ≡⟨⟩
|
||||
bind ∎
|
||||
→ Kleisli.RawMonad.bind (toKleisliRaw (toMonoidalRaw m)) {X} {Y}
|
||||
≡ bind
|
||||
bindEq {X} {Y} = funExt lem
|
||||
where
|
||||
lem : (f : Arrow X (omap Y))
|
||||
→ bind (f >>> pure) >>> bind identity
|
||||
≡ bind f
|
||||
lem f = begin
|
||||
join <<< fmap f
|
||||
≡⟨⟩
|
||||
bind (f >>> pure) >>> bind identity
|
||||
≡⟨ isDistributive _ _ ⟩
|
||||
bind ((f >>> pure) >=> identity)
|
||||
≡⟨⟩
|
||||
bind ((f >>> pure) >>> bind identity)
|
||||
≡⟨ cong bind ℂ.isAssociative ⟩
|
||||
bind (f >>> (pure >>> bind identity))
|
||||
≡⟨⟩
|
||||
bind (f >>> (pure >=> identity))
|
||||
≡⟨ cong (λ φ → bind (f >>> φ)) (isNatural _) ⟩
|
||||
bind (f >>> identity)
|
||||
≡⟨ cong bind ℂ.leftIdentity ⟩
|
||||
bind f ∎
|
||||
|
||||
forthRawEq : forthRaw (backRaw m) ≡ K.Monad.raw m
|
||||
K.RawMonad.omap (forthRawEq _) = omap
|
||||
K.RawMonad.pure (forthRawEq _) = pure
|
||||
K.RawMonad.bind (forthRawEq i) = bindEq i
|
||||
toKleisliRawEq : toKleisliRaw (toMonoidalRaw m) ≡ Kleisli.Monad.raw m
|
||||
Kleisli.RawMonad.omap (toKleisliRawEq i) = (begin
|
||||
Kleisli.RawMonad.omap (toKleisliRaw (toMonoidalRaw m)) ≡⟨⟩
|
||||
Monoidal.RawMonad.Romap (toMonoidalRaw m) ≡⟨⟩
|
||||
omap ∎
|
||||
) i
|
||||
Kleisli.RawMonad.pure (toKleisliRawEq i) = (begin
|
||||
Kleisli.RawMonad.pure (toKleisliRaw (toMonoidalRaw m)) ≡⟨⟩
|
||||
Monoidal.RawMonad.pure (toMonoidalRaw m) ≡⟨⟩
|
||||
pure ∎
|
||||
) i
|
||||
Kleisli.RawMonad.bind (toKleisliRawEq i) = bindEq i
|
||||
|
||||
fortheq : (m : K.Monad) → forth (back m) ≡ m
|
||||
fortheq m = K.Monad≡ (forthRawEq m)
|
||||
toKleislieq : (m : Kleisli.Monad) → toKleisli (toMonoidal m) ≡ m
|
||||
toKleislieq m = Kleisli.Monad≡ (toKleisliRawEq m)
|
||||
|
||||
module _ (m : M.Monad) where
|
||||
module _ (m : Monoidal.Monad) where
|
||||
private
|
||||
open M.Monad m
|
||||
module KM = K.Monad (forth m)
|
||||
open Monoidal.Monad m
|
||||
-- module KM = Kleisli.Monad (toKleisli m)
|
||||
open Kleisli.Monad (toKleisli m) renaming
|
||||
( bind to bind* ; omap to omap* ; join to join*
|
||||
; fmap to fmap* ; pure to pure* ; R to R*)
|
||||
using ()
|
||||
module R = Functor R
|
||||
omapEq : KM.omap ≡ Romap
|
||||
omapEq : omap* ≡ Romap
|
||||
omapEq = refl
|
||||
|
||||
bindEq : ∀ {X Y} {f : Arrow X (Romap Y)} → KM.bind f ≡ bind f
|
||||
bindEq : ∀ {X Y} {f : Arrow X (Romap Y)} → bind* f ≡ bind f
|
||||
bindEq {X} {Y} {f} = begin
|
||||
KM.bind f ≡⟨⟩
|
||||
joinT Y <<< fmap f ≡⟨⟩
|
||||
bind f ∎
|
||||
bind* f ≡⟨⟩
|
||||
join <<< fmap f ≡⟨⟩
|
||||
bind f ∎
|
||||
|
||||
joinEq : ∀ {X} → KM.join ≡ joinT X
|
||||
joinEq : ∀ {X} → join* ≡ joinT X
|
||||
joinEq {X} = begin
|
||||
KM.join ≡⟨⟩
|
||||
KM.bind identity ≡⟨⟩
|
||||
bind identity ≡⟨⟩
|
||||
joinT X <<< fmap identity ≡⟨ cong (λ φ → _ <<< φ) R.isIdentity ⟩
|
||||
joinT X <<< identity ≡⟨ ℂ.rightIdentity ⟩
|
||||
joinT X ∎
|
||||
join* ≡⟨⟩
|
||||
bind* identity ≡⟨⟩
|
||||
bind identity ≡⟨⟩
|
||||
join <<< fmap identity ≡⟨ cong (λ φ → _ <<< φ) R.isIdentity ⟩
|
||||
join <<< identity ≡⟨ ℂ.rightIdentity ⟩
|
||||
join ∎
|
||||
|
||||
fmapEq : ∀ {A B} → KM.fmap {A} {B} ≡ fmap
|
||||
fmapEq : ∀ {A B} → fmap* {A} {B} ≡ fmap
|
||||
fmapEq {A} {B} = funExt (λ f → begin
|
||||
KM.fmap f ≡⟨⟩
|
||||
KM.bind (f >>> KM.pure) ≡⟨⟩
|
||||
bind (f >>> pureT _) ≡⟨⟩
|
||||
fmap (f >>> pureT B) >>> joinT B ≡⟨⟩
|
||||
fmap (f >>> pureT B) >>> joinT B ≡⟨ cong (λ φ → φ >>> joinT B) R.isDistributive ⟩
|
||||
fmap f >>> fmap (pureT B) >>> joinT B ≡⟨ ℂ.isAssociative ⟩
|
||||
joinT B <<< fmap (pureT B) <<< fmap f ≡⟨ cong (λ φ → φ <<< fmap f) (snd isInverse) ⟩
|
||||
identity <<< fmap f ≡⟨ ℂ.leftIdentity ⟩
|
||||
fmap f ∎
|
||||
fmap* f ≡⟨⟩
|
||||
bind* (f >>> pure*) ≡⟨⟩
|
||||
bind (f >>> pure) ≡⟨⟩
|
||||
fmap (f >>> pure) >>> join ≡⟨⟩
|
||||
fmap (f >>> pure) >>> join ≡⟨ cong (λ φ → φ >>> joinT B) R.isDistributive ⟩
|
||||
fmap f >>> fmap pure >>> join ≡⟨ ℂ.isAssociative ⟩
|
||||
join <<< fmap pure <<< fmap f ≡⟨ cong (λ φ → φ <<< fmap f) (snd isInverse) ⟩
|
||||
identity <<< fmap f ≡⟨ ℂ.leftIdentity ⟩
|
||||
fmap f ∎
|
||||
)
|
||||
|
||||
rawEq : Functor.raw KM.R ≡ Functor.raw R
|
||||
rawEq : Functor.raw R* ≡ Functor.raw R
|
||||
RawFunctor.omap (rawEq i) = omapEq i
|
||||
RawFunctor.fmap (rawEq i) = fmapEq i
|
||||
|
||||
Req : M.RawMonad.R (backRaw (forth m)) ≡ R
|
||||
Req : Monoidal.RawMonad.R (toMonoidalRaw (toKleisli m)) ≡ R
|
||||
Req = Functor≡ rawEq
|
||||
|
||||
pureTEq : M.RawMonad.pureT (backRaw (forth m)) ≡ pureT
|
||||
pureTEq = funExt (λ X → refl)
|
||||
pureTEq : Monoidal.RawMonad.pureT (toMonoidalRaw (toKleisli m)) ≡ pureT
|
||||
pureTEq = refl
|
||||
|
||||
pureNTEq : (λ i → NaturalTransformation Functors.identity (Req i))
|
||||
[ M.RawMonad.pureNT (backRaw (forth m)) ≡ pureNT ]
|
||||
[ Monoidal.RawMonad.pureNT (toMonoidalRaw (toKleisli m)) ≡ pureNT ]
|
||||
pureNTEq = lemSigP (λ i → propIsNatural Functors.identity (Req i)) _ _ pureTEq
|
||||
|
||||
joinTEq : M.RawMonad.joinT (backRaw (forth m)) ≡ joinT
|
||||
joinTEq : Monoidal.RawMonad.joinT (toMonoidalRaw (toKleisli m)) ≡ joinT
|
||||
joinTEq = funExt (λ X → begin
|
||||
M.RawMonad.joinT (backRaw (forth m)) X ≡⟨⟩
|
||||
KM.join ≡⟨⟩
|
||||
joinT X <<< fmap identity ≡⟨ cong (λ φ → joinT X <<< φ) R.isIdentity ⟩
|
||||
joinT X <<< identity ≡⟨ ℂ.rightIdentity ⟩
|
||||
joinT X ∎)
|
||||
Monoidal.RawMonad.joinT (toMonoidalRaw (toKleisli m)) X ≡⟨⟩
|
||||
join* ≡⟨⟩
|
||||
join <<< fmap identity ≡⟨ cong (λ φ → join <<< φ) R.isIdentity ⟩
|
||||
join <<< identity ≡⟨ ℂ.rightIdentity ⟩
|
||||
join ∎)
|
||||
|
||||
joinNTEq : (λ i → NaturalTransformation F[ Req i ∘ Req i ] (Req i))
|
||||
[ M.RawMonad.joinNT (backRaw (forth m)) ≡ joinNT ]
|
||||
[ Monoidal.RawMonad.joinNT (toMonoidalRaw (toKleisli m)) ≡ joinNT ]
|
||||
joinNTEq = lemSigP (λ i → propIsNatural F[ Req i ∘ Req i ] (Req i)) _ _ joinTEq
|
||||
|
||||
backRawEq : backRaw (forth m) ≡ M.Monad.raw m
|
||||
M.RawMonad.R (backRawEq i) = Req i
|
||||
M.RawMonad.pureNT (backRawEq i) = pureNTEq i
|
||||
M.RawMonad.joinNT (backRawEq i) = joinNTEq i
|
||||
toMonoidalRawEq : toMonoidalRaw (toKleisli m) ≡ Monoidal.Monad.raw m
|
||||
Monoidal.RawMonad.R (toMonoidalRawEq i) = Req i
|
||||
Monoidal.RawMonad.pureNT (toMonoidalRawEq i) = pureNTEq i
|
||||
Monoidal.RawMonad.joinNT (toMonoidalRawEq i) = joinNTEq i
|
||||
|
||||
backeq : (m : M.Monad) → back (forth m) ≡ m
|
||||
backeq m = M.Monad≡ (backRawEq m)
|
||||
|
||||
eqv : isEquiv M.Monad K.Monad forth
|
||||
eqv = gradLemma forth back fortheq backeq
|
||||
toMonoidaleq : (m : Monoidal.Monad) → toMonoidal (toKleisli m) ≡ m
|
||||
toMonoidaleq m = Monoidal.Monad≡ (toMonoidalRawEq m)
|
||||
|
||||
open import Cat.Equivalence
|
||||
|
||||
Monoidal≊Kleisli : M.Monad ≅ K.Monad
|
||||
Monoidal≊Kleisli = forth , back , funExt backeq , funExt fortheq
|
||||
Monoidal≊Kleisli : Monoidal.Monad ≅ Kleisli.Monad
|
||||
Monoidal≊Kleisli = toKleisli , toMonoidal , funExt toMonoidaleq , funExt toKleislieq
|
||||
|
||||
Monoidal≡Kleisli : M.Monad ≡ K.Monad
|
||||
Monoidal≡Kleisli : Monoidal.Monad ≡ Kleisli.Monad
|
||||
Monoidal≡Kleisli = isoToPath Monoidal≊Kleisli
|
||||
|
||||
grpdKleisli : isGrpd Kleisli.Monad
|
||||
grpdKleisli = Kleisli.grpdMonad
|
||||
|
||||
grpdMonoidal : isGrpd Monoidal.Monad
|
||||
grpdMonoidal = subst {P = isGrpd}
|
||||
(sym Monoidal≡Kleisli) grpdKleisli
|
||||
|
|
|
@ -5,6 +5,7 @@ The Kleisli formulation of monads
|
|||
open import Agda.Primitive
|
||||
|
||||
open import Cat.Prelude
|
||||
open import Cat.Equivalence
|
||||
|
||||
open import Cat.Category
|
||||
open import Cat.Category.Functor as F
|
||||
|
@ -230,6 +231,7 @@ record IsMonad (raw : RawMonad) : Set ℓ where
|
|||
m ∎
|
||||
|
||||
record Monad : Set ℓ where
|
||||
no-eta-equality
|
||||
field
|
||||
raw : RawMonad
|
||||
isMonad : IsMonad raw
|
||||
|
@ -264,3 +266,82 @@ module _ {m n : Monad} (eq : Monad.raw m ≡ Monad.raw n) where
|
|||
Monad≡ : m ≡ n
|
||||
Monad.raw (Monad≡ i) = eq i
|
||||
Monad.isMonad (Monad≡ i) = eqIsMonad i
|
||||
|
||||
module _ where
|
||||
private
|
||||
module _ (x y : RawMonad) (p q : x ≡ y) (a b : p ≡ q) where
|
||||
eq0-helper : isGrpd (Object → Object)
|
||||
eq0-helper = grpdPi (λ a → ℂ.groupoidObject)
|
||||
|
||||
eq0 : cong (cong RawMonad.omap) a ≡ cong (cong RawMonad.omap) b
|
||||
eq0 = eq0-helper
|
||||
(RawMonad.omap x) (RawMonad.omap y)
|
||||
(cong RawMonad.omap p) (cong RawMonad.omap q)
|
||||
(cong (cong RawMonad.omap) a) (cong (cong RawMonad.omap) b)
|
||||
|
||||
eq1-helper : (omap : Object → Object) → isGrpd ({X : Object} → ℂ [ X , omap X ])
|
||||
eq1-helper f = grpdPiImpl (setGrpd ℂ.arrowsAreSets)
|
||||
|
||||
postulate
|
||||
eq1 : PathP (λ i → PathP
|
||||
(λ j →
|
||||
PathP (λ k → {X : Object} → ℂ [ X , eq0 i j k X ])
|
||||
(RawMonad.pure x) (RawMonad.pure y))
|
||||
(λ i → RawMonad.pure (p i)) (λ i → RawMonad.pure (q i)))
|
||||
(cong-d (cong-d RawMonad.pure) a) (cong-d (cong-d RawMonad.pure) b)
|
||||
|
||||
|
||||
RawMonad' : Set _
|
||||
RawMonad' = Σ (Object → Object) (λ omap
|
||||
→ ({X : Object} → ℂ [ X , omap X ])
|
||||
× ({X Y : Object} → ℂ [ X , omap Y ] → ℂ [ omap X , omap Y ])
|
||||
)
|
||||
grpdRawMonad' : isGrpd RawMonad'
|
||||
grpdRawMonad' = grpdSig (grpdPi (λ _ → ℂ.groupoidObject)) λ _ → grpdSig (grpdPiImpl (setGrpd ℂ.arrowsAreSets)) (λ _ → grpdPiImpl (grpdPiImpl (grpdPi (λ _ → setGrpd ℂ.arrowsAreSets))))
|
||||
toRawMonad : RawMonad' → RawMonad
|
||||
RawMonad.omap (toRawMonad (a , b , c)) = a
|
||||
RawMonad.pure (toRawMonad (a , b , c)) = b
|
||||
RawMonad.bind (toRawMonad (a , b , c)) = c
|
||||
|
||||
IsMonad' : RawMonad' → Set _
|
||||
IsMonad' raw = M.IsIdentity × M.IsNatural × M.IsDistributive
|
||||
where
|
||||
module M = RawMonad (toRawMonad raw)
|
||||
|
||||
grpdIsMonad' : (m : RawMonad') → isGrpd (IsMonad' m)
|
||||
grpdIsMonad' m = grpdSig (propGrpd (propIsIdentity (toRawMonad m)))
|
||||
λ _ → grpdSig (propGrpd (propIsNatural (toRawMonad m)))
|
||||
λ _ → propGrpd (propIsDistributive (toRawMonad m))
|
||||
|
||||
Monad' = Σ RawMonad' IsMonad'
|
||||
grpdMonad' = grpdSig grpdRawMonad' grpdIsMonad'
|
||||
|
||||
toMonad : Monad' → Monad
|
||||
Monad.raw (toMonad x) = toRawMonad (fst x)
|
||||
isIdentity (Monad.isMonad (toMonad x)) = fst (snd x)
|
||||
isNatural (Monad.isMonad (toMonad x)) = fst (snd (snd x))
|
||||
isDistributive (Monad.isMonad (toMonad x)) = snd (snd (snd x))
|
||||
|
||||
fromMonad : Monad → Monad'
|
||||
fromMonad m = (M.omap , M.pure , M.bind)
|
||||
, M.isIdentity , M.isNatural , M.isDistributive
|
||||
where
|
||||
module M = Monad m
|
||||
|
||||
e : Monad' ≃ Monad
|
||||
e = fromIsomorphism _ _ (toMonad , fromMonad , (funExt λ _ → refl) , funExt eta-refl)
|
||||
where
|
||||
-- Monads don't have eta-equality
|
||||
eta-refl : (x : Monad) → toMonad (fromMonad x) ≡ x
|
||||
eta-refl =
|
||||
(λ x → λ
|
||||
{ i .Monad.raw → Monad.raw x
|
||||
; i .Monad.isMonad → Monad.isMonad x}
|
||||
)
|
||||
|
||||
grpdMonad : isGrpd Monad
|
||||
grpdMonad = equivPreservesNType
|
||||
{n = (S (S (S ⟨-2⟩)))}
|
||||
e grpdMonad'
|
||||
where
|
||||
open import Cubical.NType
|
||||
|
|
|
@ -18,7 +18,7 @@ private
|
|||
|
||||
open Category ℂ using (Object ; Arrow ; identity ; _<<<_)
|
||||
open import Cat.Category.NaturalTransformation ℂ ℂ
|
||||
using (NaturalTransformation ; Transformation ; Natural)
|
||||
using (NaturalTransformation ; Transformation ; Natural ; NaturalTransformation≡)
|
||||
|
||||
record RawMonad : Set ℓ where
|
||||
field
|
||||
|
@ -78,15 +78,39 @@ record IsMonad (raw : RawMonad) : Set ℓ where
|
|||
|
||||
isNatural : IsNatural
|
||||
isNatural {X} {Y} f = begin
|
||||
joinT Y <<< R.fmap f <<< pureT X ≡⟨ sym ℂ.isAssociative ⟩
|
||||
joinT Y <<< (R.fmap f <<< pureT X) ≡⟨ cong (λ φ → joinT Y <<< φ) (sym (pureN f)) ⟩
|
||||
joinT Y <<< (pureT (R.omap Y) <<< f) ≡⟨ ℂ.isAssociative ⟩
|
||||
joinT Y <<< pureT (R.omap Y) <<< f ≡⟨ cong (λ φ → φ <<< f) (fst isInverse) ⟩
|
||||
identity <<< f ≡⟨ ℂ.leftIdentity ⟩
|
||||
f ∎
|
||||
join <<< fmap f <<< pure ≡⟨ sym ℂ.isAssociative ⟩
|
||||
join <<< (fmap f <<< pure) ≡⟨ cong (λ φ → join <<< φ) (sym (pureN f)) ⟩
|
||||
join <<< (pure <<< f) ≡⟨ ℂ.isAssociative ⟩
|
||||
join <<< pure <<< f ≡⟨ cong (λ φ → φ <<< f) (fst isInverse) ⟩
|
||||
identity <<< f ≡⟨ ℂ.leftIdentity ⟩
|
||||
f ∎
|
||||
|
||||
isDistributive : IsDistributive
|
||||
isDistributive {X} {Y} {Z} g f = sym aux
|
||||
isDistributive {X} {Y} {Z} g f = begin
|
||||
join <<< fmap g <<< (join <<< fmap f)
|
||||
≡⟨ Category.isAssociative ℂ ⟩
|
||||
join <<< fmap g <<< join <<< fmap f
|
||||
≡⟨ cong (_<<< fmap f) (sym ℂ.isAssociative) ⟩
|
||||
(join <<< (fmap g <<< join)) <<< fmap f
|
||||
≡⟨ cong (λ φ → φ <<< fmap f) (cong (_<<<_ join) (sym (joinN g))) ⟩
|
||||
(join <<< (join <<< R².fmap g)) <<< fmap f
|
||||
≡⟨ cong (_<<< fmap f) ℂ.isAssociative ⟩
|
||||
((join <<< join) <<< R².fmap g) <<< fmap f
|
||||
≡⟨⟩
|
||||
join <<< join <<< R².fmap g <<< fmap f
|
||||
≡⟨ sym ℂ.isAssociative ⟩
|
||||
(join <<< join) <<< (R².fmap g <<< fmap f)
|
||||
≡⟨ cong (λ φ → φ <<< (R².fmap g <<< fmap f)) (sym isAssociative) ⟩
|
||||
(join <<< fmap join) <<< (R².fmap g <<< fmap f)
|
||||
≡⟨ sym ℂ.isAssociative ⟩
|
||||
join <<< (fmap join <<< (R².fmap g <<< fmap f))
|
||||
≡⟨ cong (_<<<_ join) ℂ.isAssociative ⟩
|
||||
join <<< (fmap join <<< R².fmap g <<< fmap f)
|
||||
≡⟨⟩
|
||||
join <<< (fmap join <<< fmap (fmap g) <<< fmap f)
|
||||
≡⟨ cong (λ φ → join <<< φ) (sym distrib3) ⟩
|
||||
join <<< fmap (join <<< fmap g <<< f)
|
||||
∎
|
||||
where
|
||||
module R² = Functor F[ R ∘ R ]
|
||||
distrib3 : ∀ {A B C D} {a : Arrow C D} {b : Arrow B C} {c : Arrow A B}
|
||||
|
@ -96,33 +120,9 @@ record IsMonad (raw : RawMonad) : Set ℓ where
|
|||
R.fmap (a <<< b <<< c) ≡⟨ R.isDistributive ⟩
|
||||
R.fmap (a <<< b) <<< R.fmap c ≡⟨ cong (_<<< _) R.isDistributive ⟩
|
||||
R.fmap a <<< R.fmap b <<< R.fmap c ∎
|
||||
aux = begin
|
||||
joinT Z <<< R.fmap (joinT Z <<< R.fmap g <<< f)
|
||||
≡⟨ cong (λ φ → joinT Z <<< φ) distrib3 ⟩
|
||||
joinT Z <<< (R.fmap (joinT Z) <<< R.fmap (R.fmap g) <<< R.fmap f)
|
||||
≡⟨⟩
|
||||
joinT Z <<< (R.fmap (joinT Z) <<< R².fmap g <<< R.fmap f)
|
||||
≡⟨ cong (_<<<_ (joinT Z)) (sym ℂ.isAssociative) ⟩
|
||||
joinT Z <<< (R.fmap (joinT Z) <<< (R².fmap g <<< R.fmap f))
|
||||
≡⟨ ℂ.isAssociative ⟩
|
||||
(joinT Z <<< R.fmap (joinT Z)) <<< (R².fmap g <<< R.fmap f)
|
||||
≡⟨ cong (λ φ → φ <<< (R².fmap g <<< R.fmap f)) isAssociative ⟩
|
||||
(joinT Z <<< joinT (R.omap Z)) <<< (R².fmap g <<< R.fmap f)
|
||||
≡⟨ ℂ.isAssociative ⟩
|
||||
joinT Z <<< joinT (R.omap Z) <<< R².fmap g <<< R.fmap f
|
||||
≡⟨⟩
|
||||
((joinT Z <<< joinT (R.omap Z)) <<< R².fmap g) <<< R.fmap f
|
||||
≡⟨ cong (_<<< R.fmap f) (sym ℂ.isAssociative) ⟩
|
||||
(joinT Z <<< (joinT (R.omap Z) <<< R².fmap g)) <<< R.fmap f
|
||||
≡⟨ cong (λ φ → φ <<< R.fmap f) (cong (_<<<_ (joinT Z)) (joinN g)) ⟩
|
||||
(joinT Z <<< (R.fmap g <<< joinT Y)) <<< R.fmap f
|
||||
≡⟨ cong (_<<< R.fmap f) ℂ.isAssociative ⟩
|
||||
joinT Z <<< R.fmap g <<< joinT Y <<< R.fmap f
|
||||
≡⟨ sym (Category.isAssociative ℂ) ⟩
|
||||
joinT Z <<< R.fmap g <<< (joinT Y <<< R.fmap f)
|
||||
∎
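As an informal sanity check of the distributive law proved above, join <<< fmap g <<< (join <<< fmap f) ≡ join <<< fmap (join <<< fmap g <<< f), the corresponding equation can be evaluated in Haskell for the list monad; this is only an illustration and not part of the library:

-- Illustrative sketch: the distributive law, checked for the list monad on
-- one concrete input.
import Control.Monad (join)

f :: Int -> [Int]
f x = [x, x + 1]

g :: Int -> [Int]
g x = [x * 10]

lhs, rhs :: [Int] -> [Int]
lhs = join . fmap g . (join . fmap f)
rhs = join . fmap (join . fmap g . f)

main :: IO ()
main = print (lhs [1, 2, 3] == rhs [1, 2, 3])   -- prints True

Both sides evaluate to the same list, as the law predicts for any lawful monad.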
|
||||
|
||||
record Monad : Set ℓ where
|
||||
no-eta-equality
|
||||
field
|
||||
raw : RawMonad
|
||||
isMonad : IsMonad raw
|
||||
|
|
|
@ -1,15 +1,18 @@
|
|||
{-
|
||||
This module provides construction 2.3 in [voe]
|
||||
-}
|
||||
{-# OPTIONS --cubical --caching #-}
|
||||
{-# OPTIONS --cubical #-}
|
||||
module Cat.Category.Monad.Voevodsky where
|
||||
|
||||
open import Cat.Prelude
|
||||
open import Cat.Equivalence
|
||||
|
||||
open import Cat.Category
|
||||
open import Cat.Category.Functor as F
|
||||
import Cat.Category.NaturalTransformation
|
||||
open import Cat.Category.Monad
|
||||
import Cat.Category.Monad.Monoidal as Monoidal
|
||||
import Cat.Category.Monad.Kleisli as Kleisli
|
||||
open import Cat.Categories.Fun
|
||||
open import Cat.Equivalence
|
||||
|
||||
|
@ -24,6 +27,7 @@ module voe {ℓa ℓb : Level} (ℂ : Category ℓa ℓb) where
|
|||
|
||||
module §2-3 (omap : Object → Object) (pure : {X : Object} → Arrow X (omap X)) where
|
||||
record §1 : Set ℓ where
|
||||
no-eta-equality
|
||||
open M
|
||||
|
||||
field
|
||||
|
@ -74,12 +78,11 @@ module voe {ℓa ℓb : Level} (ℂ : Category ℓa ℓb) where
|
|||
isMonad : IsMonad rawMnd
|
||||
|
||||
toMonad : Monad
|
||||
toMonad = record
|
||||
{ raw = rawMnd
|
||||
; isMonad = isMonad
|
||||
}
|
||||
toMonad .Monad.raw = rawMnd
|
||||
toMonad .Monad.isMonad = isMonad
|
||||
|
||||
record §2 : Set ℓ where
|
||||
no-eta-equality
|
||||
open K
|
||||
|
||||
field
|
||||
|
@ -96,28 +99,24 @@ module voe {ℓa ℓb : Level} (ℂ : Category ℓa ℓb) where
|
|||
isMonad : IsMonad rawMnd
|
||||
|
||||
toMonad : Monad
|
||||
toMonad = record
|
||||
{ raw = rawMnd
|
||||
; isMonad = isMonad
|
||||
}
|
||||
toMonad .Monad.raw = rawMnd
|
||||
toMonad .Monad.isMonad = isMonad
|
||||
|
||||
§1-fromMonad : (m : M.Monad) → §2-3.§1 (M.Monad.Romap m) (λ {X} → M.Monad.pureT m X)
|
||||
§1-fromMonad m = record
|
||||
{ fmap = Functor.fmap R
|
||||
; RisFunctor = Functor.isFunctor R
|
||||
; pureN = pureN
|
||||
; join = λ {X} → joinT X
|
||||
; joinN = joinN
|
||||
; isMonad = M.Monad.isMonad m
|
||||
}
|
||||
where
|
||||
module _ (m : M.Monad) where
|
||||
open M.Monad m
|
||||
|
||||
§1-fromMonad : §2-3.§1 (M.Monad.Romap m) (λ {X} → M.Monad.pureT m X)
|
||||
§1-fromMonad .§2-3.§1.fmap = Functor.fmap R
|
||||
§1-fromMonad .§2-3.§1.RisFunctor = Functor.isFunctor R
|
||||
§1-fromMonad .§2-3.§1.pureN = pureN
|
||||
§1-fromMonad .§2-3.§1.join {X} = joinT X
|
||||
§1-fromMonad .§2-3.§1.joinN = joinN
|
||||
§1-fromMonad .§2-3.§1.isMonad = M.Monad.isMonad m
|
||||
|
||||
|
||||
§2-fromMonad : (m : K.Monad) → §2-3.§2 (K.Monad.omap m) (K.Monad.pure m)
|
||||
§2-fromMonad m = record
|
||||
{ bind = K.Monad.bind m
|
||||
; isMonad = K.Monad.isMonad m
|
||||
}
|
||||
§2-fromMonad m .§2-3.§2.bind = K.Monad.bind m
|
||||
§2-fromMonad m .§2-3.§2.isMonad = K.Monad.isMonad m
|
||||
|
||||
-- | In the following we seek to transform the equivalence `Monoidal≃Kleisli`
|
||||
-- | to talk about Voevodsky's construction.
|
||||
|
@ -145,67 +144,103 @@ module voe {ℓa ℓb : Level} (ℂ : Category ℓa ℓb) where
|
|||
|
||||
forthEq : ∀ m → (forth ∘ back) m ≡ m
|
||||
forthEq m = begin
|
||||
(forth ∘ back) m ≡⟨⟩
|
||||
-- In full gory detail:
|
||||
( §2-fromMonad
|
||||
∘ Monoidal→Kleisli
|
||||
∘ §2-3.§1.toMonad
|
||||
∘ §1-fromMonad
|
||||
∘ Kleisli→Monoidal
|
||||
∘ §2-3.§2.toMonad
|
||||
) m ≡⟨⟩ -- fromMonad and toMonad are inverses
|
||||
( §2-fromMonad
|
||||
∘ Monoidal→Kleisli
|
||||
∘ Kleisli→Monoidal
|
||||
∘ §2-3.§2.toMonad
|
||||
) m ≡⟨ cong (λ φ → φ m) t ⟩
|
||||
-- Monoidal→Kleisli and Kleisli→Monoidal are inverses
|
||||
-- I should be able to prove this using congruence and `lem` below.
|
||||
( §2-fromMonad
|
||||
∘ §2-3.§2.toMonad
|
||||
) m ≡⟨⟩
|
||||
( §2-fromMonad
|
||||
∘ §2-3.§2.toMonad
|
||||
) m ≡⟨⟩ -- fromMonad and toMonad are inverses
|
||||
m ∎
|
||||
§2-fromMonad
|
||||
(Monoidal→Kleisli
|
||||
(§2-3.§1.toMonad
|
||||
(§1-fromMonad (Kleisli→Monoidal (§2-3.§2.toMonad m)))))
|
||||
≡⟨ cong-d (§2-fromMonad ∘ Monoidal→Kleisli) (lemmaz (Kleisli→Monoidal (§2-3.§2.toMonad m))) ⟩
|
||||
§2-fromMonad
|
||||
((Monoidal→Kleisli ∘ Kleisli→Monoidal)
|
||||
(§2-3.§2.toMonad m))
|
||||
-- Below is the fully normalized goal and context with
|
||||
-- `funExt` made abstract.
|
||||
--
|
||||
-- Goal: PathP (λ _ → §2-3.§2 omap (λ {z} → pure))
|
||||
-- (§2-fromMonad
|
||||
-- (.Cat.Category.Monad.toKleisli ℂ
|
||||
-- (.Cat.Category.Monad.toMonoidal ℂ (§2-3.§2.toMonad m))))
|
||||
-- (§2-fromMonad (§2-3.§2.toMonad m))
|
||||
-- Have: PathP
|
||||
-- (λ i →
|
||||
-- §2-3.§2 K.IsMonad.omap
|
||||
-- (K.RawMonad.pure
|
||||
-- (K.Monad.raw
|
||||
-- (funExt (λ m₁ → K.Monad≡ (.Cat.Category.Monad.toKleisliRawEq ℂ m₁))
|
||||
-- i (§2-3.§2.toMonad m)))))
|
||||
-- (§2-fromMonad
|
||||
-- (.Cat.Category.Monad.toKleisli ℂ
|
||||
-- (.Cat.Category.Monad.toMonoidal ℂ (§2-3.§2.toMonad m))))
|
||||
-- (§2-fromMonad (§2-3.§2.toMonad m))
|
||||
≡⟨ ( cong-d {x = Monoidal→Kleisli ∘ Kleisli→Monoidal} {y = idFun K.Monad} (\ φ → §2-fromMonad (φ (§2-3.§2.toMonad m))) re-ve) ⟩
|
||||
(§2-fromMonad ∘ §2-3.§2.toMonad) m
|
||||
≡⟨ lemma ⟩
|
||||
m ∎
|
||||
where
|
||||
t' : ((Monoidal→Kleisli ∘ Kleisli→Monoidal) ∘ §2-3.§2.toMonad {omap} {pure})
|
||||
≡ §2-3.§2.toMonad
|
||||
t' = cong (\ φ → φ ∘ §2-3.§2.toMonad) re-ve
|
||||
t : (§2-fromMonad ∘ (Monoidal→Kleisli ∘ Kleisli→Monoidal) ∘ §2-3.§2.toMonad {omap} {pure})
|
||||
≡ (§2-fromMonad ∘ §2-3.§2.toMonad)
|
||||
t = cong-d (\ f → §2-fromMonad ∘ f) t'
|
||||
u : (§2-fromMonad ∘ (Monoidal→Kleisli ∘ Kleisli→Monoidal) ∘ §2-3.§2.toMonad) m
|
||||
≡ (§2-fromMonad ∘ §2-3.§2.toMonad) m
|
||||
u = cong (\ φ → φ m) t
|
||||
lemma : (§2-fromMonad ∘ §2-3.§2.toMonad) m ≡ m
|
||||
§2-3.§2.bind (lemma i) = §2-3.§2.bind m
|
||||
§2-3.§2.isMonad (lemma i) = §2-3.§2.isMonad m
|
||||
lemmaz : ∀ m → §2-3.§1.toMonad (§1-fromMonad m) ≡ m
|
||||
M.Monad.raw (lemmaz m i) = M.Monad.raw m
|
||||
M.Monad.isMonad (lemmaz m i) = M.Monad.isMonad m
|
||||
|
||||
backEq : ∀ m → (back ∘ forth) m ≡ m
|
||||
backEq m = begin
|
||||
(back ∘ forth) m ≡⟨⟩
|
||||
( §1-fromMonad
|
||||
∘ Kleisli→Monoidal
|
||||
∘ §2-3.§2.toMonad
|
||||
∘ §2-fromMonad
|
||||
∘ Monoidal→Kleisli
|
||||
∘ §2-3.§1.toMonad
|
||||
) m ≡⟨⟩ -- fromMonad and toMonad are inverses
|
||||
( §1-fromMonad
|
||||
∘ Kleisli→Monoidal
|
||||
∘ Monoidal→Kleisli
|
||||
∘ §2-3.§1.toMonad
|
||||
) m ≡⟨ cong (λ φ → φ m) t ⟩ -- Monoidal→Kleisli and Kleisli→Monoidal are inverses
|
||||
( §1-fromMonad
|
||||
∘ §2-3.§1.toMonad
|
||||
) m ≡⟨⟩ -- fromMonad and toMonad are inverses
|
||||
§1-fromMonad
|
||||
(Kleisli→Monoidal
|
||||
(§2-3.§2.toMonad
|
||||
(§2-fromMonad (Monoidal→Kleisli (§2-3.§1.toMonad m)))))
|
||||
≡⟨ cong-d (§1-fromMonad ∘ Kleisli→Monoidal) (lemma (Monoidal→Kleisli (§2-3.§1.toMonad m))) ⟩
|
||||
§1-fromMonad
|
||||
((Kleisli→Monoidal ∘ Monoidal→Kleisli)
|
||||
(§2-3.§1.toMonad m))
|
||||
-- Below is the fully normalized `agda2-goal-and-context`
|
||||
-- with `funExt` made abstract.
|
||||
--
|
||||
-- Goal: PathP (λ _ → §2-3.§1 omap (λ {X} → pure))
|
||||
-- (§1-fromMonad
|
||||
-- (.Cat.Category.Monad.toMonoidal ℂ
|
||||
-- (.Cat.Category.Monad.toKleisli ℂ (§2-3.§1.toMonad m))))
|
||||
-- (§1-fromMonad (§2-3.§1.toMonad m))
|
||||
-- Have: PathP
|
||||
-- (λ i →
|
||||
-- §2-3.§1
|
||||
-- (RawFunctor.omap
|
||||
-- (Functor.raw
|
||||
-- (M.RawMonad.R
|
||||
-- (M.Monad.raw
|
||||
-- (funExt
|
||||
-- (λ m₁ → M.Monad≡ (.Cat.Category.Monad.toMonoidalRawEq ℂ m₁)) i
|
||||
-- (§2-3.§1.toMonad m))))))
|
||||
-- (λ {X} →
|
||||
-- fst
|
||||
-- (M.RawMonad.pureNT
|
||||
-- (M.Monad.raw
|
||||
-- (funExt
|
||||
-- (λ m₁ → M.Monad≡ (.Cat.Category.Monad.toMonoidalRawEq ℂ m₁)) i
|
||||
-- (§2-3.§1.toMonad m))))
|
||||
-- X))
|
||||
-- (§1-fromMonad
|
||||
-- (.Cat.Category.Monad.toMonoidal ℂ
|
||||
-- (.Cat.Category.Monad.toKleisli ℂ (§2-3.§1.toMonad m))))
|
||||
-- (§1-fromMonad (§2-3.§1.toMonad m))
|
||||
≡⟨ (cong-d (\ φ → §1-fromMonad (φ (§2-3.§1.toMonad m))) ve-re) ⟩
|
||||
§1-fromMonad (§2-3.§1.toMonad m)
|
||||
≡⟨ lemmaz ⟩
|
||||
m ∎
|
||||
where
|
||||
t : §1-fromMonad ∘ Kleisli→Monoidal ∘ Monoidal→Kleisli ∘ §2-3.§1.toMonad
|
||||
≡ §1-fromMonad ∘ §2-3.§1.toMonad
|
||||
-- Why does `re-ve` not satisfy this goal?
|
||||
t i m = §1-fromMonad (ve-re i (§2-3.§1.toMonad m))
|
||||
|
||||
voe-isEquiv : isEquiv (§2-3.§1 omap pure) (§2-3.§2 omap pure) forth
|
||||
voe-isEquiv = gradLemma forth back forthEq backEq
|
||||
where
|
||||
lemmaz : §1-fromMonad (§2-3.§1.toMonad m) ≡ m
|
||||
§2-3.§1.fmap (lemmaz i) = §2-3.§1.fmap m
|
||||
§2-3.§1.join (lemmaz i) = §2-3.§1.join m
|
||||
§2-3.§1.RisFunctor (lemmaz i) = §2-3.§1.RisFunctor m
|
||||
§2-3.§1.pureN (lemmaz i) = §2-3.§1.pureN m
|
||||
§2-3.§1.joinN (lemmaz i) = §2-3.§1.joinN m
|
||||
§2-3.§1.isMonad (lemmaz i) = §2-3.§1.isMonad m
|
||||
lemma : ∀ m → §2-3.§2.toMonad (§2-fromMonad m) ≡ m
|
||||
K.Monad.raw (lemma m i) = K.Monad.raw m
|
||||
K.Monad.isMonad (lemma m i) = K.Monad.isMonad m
|
||||
|
||||
equiv-2-3 : §2-3.§1 omap pure ≃ §2-3.§2 omap pure
|
||||
equiv-2-3 = forth , voe-isEquiv
|
||||
equiv-2-3 = fromIsomorphism _ _
|
||||
( forth , back
|
||||
, funExt backEq , funExt forthEq
|
||||
)
|
||||
|
|
|
@ -11,7 +11,7 @@ module _ {ℓa ℓb : Level} (ℂ : Category ℓa ℓb) where
|
|||
|
||||
module _ (A B : Object) where
|
||||
record RawProduct : Set (ℓa ⊔ ℓb) where
|
||||
no-eta-equality
|
||||
-- no-eta-equality
|
||||
field
|
||||
object : Object
|
||||
fst : ℂ [ object , A ]
|
||||
|
@ -55,286 +55,122 @@ module _ {ℓa ℓb : Level} (ℂ : Category ℓa ℓb) where
|
|||
× ℂ [ g ∘ snd ]
|
||||
]
|
||||
|
||||
module _ {ℓa ℓb : Level} {ℂ : Category ℓa ℓb} {A B : Category.Object ℂ} where
|
||||
module _ {ℓa ℓb : Level} {ℂ : Category ℓa ℓb}
|
||||
(let module ℂ = Category ℂ) {𝒜 ℬ : ℂ.Object} where
|
||||
private
|
||||
open Category ℂ
|
||||
module _ (raw : RawProduct ℂ A B) where
|
||||
module _ (x y : IsProduct ℂ A B raw) where
|
||||
private
|
||||
module x = IsProduct x
|
||||
module y = IsProduct y
|
||||
module _ (raw : RawProduct ℂ 𝒜 ℬ) where
|
||||
private
|
||||
open Category ℂ hiding (raw)
|
||||
module _ (x y : IsProduct ℂ 𝒜 ℬ raw) where
|
||||
private
|
||||
module x = IsProduct x
|
||||
module y = IsProduct y
|
||||
|
||||
module _ {X : Object} (f : ℂ [ X , A ]) (g : ℂ [ X , B ]) where
|
||||
module _ (f×g : Arrow X y.object) where
|
||||
help : isProp (∀{y} → (ℂ [ y.fst ∘ y ] ≡ f) P.× (ℂ [ y.snd ∘ y ] ≡ g) → f×g ≡ y)
|
||||
help = propPiImpl (λ _ → propPi (λ _ → arrowsAreSets _ _))
|
||||
module _ {X : Object} (f : ℂ [ X , 𝒜 ]) (g : ℂ [ X , ℬ ]) where
|
||||
module _ (f×g : Arrow X y.object) where
|
||||
help : isProp (∀{y} → (ℂ [ y.fst ∘ y ] ≡ f) P.× (ℂ [ y.snd ∘ y ] ≡ g) → f×g ≡ y)
|
||||
help = propPiImpl (λ _ → propPi (λ _ → arrowsAreSets _ _))
|
||||
|
||||
res = ∃-unique (x.ump f g) (y.ump f g)
|
||||
res = ∃-unique (x.ump f g) (y.ump f g)
|
||||
|
||||
prodAux : x.ump f g ≡ y.ump f g
|
||||
prodAux = lemSig ((λ f×g → propSig (propSig (arrowsAreSets _ _) λ _ → arrowsAreSets _ _) (λ _ → help f×g))) _ _ res
|
||||
prodAux : x.ump f g ≡ y.ump f g
|
||||
prodAux = lemSig ((λ f×g → propSig (propSig (arrowsAreSets _ _) λ _ → arrowsAreSets _ _) (λ _ → help f×g))) _ _ res
|
||||
|
||||
propIsProduct' : x ≡ y
|
||||
propIsProduct' i = record { ump = λ f g → prodAux f g i }
|
||||
propIsProduct' : x ≡ y
|
||||
propIsProduct' i = record { ump = λ f g → prodAux f g i }
|
||||
|
||||
propIsProduct : isProp (IsProduct ℂ A B raw)
|
||||
propIsProduct : isProp (IsProduct ℂ 𝒜 ℬ raw)
|
||||
propIsProduct = propIsProduct'
|
||||
|
||||
Product≡ : {x y : Product ℂ A B} → (Product.raw x ≡ Product.raw y) → x ≡ y
|
||||
Product≡ {x} {y} p i = record { raw = p i ; isProduct = q i }
|
||||
where
|
||||
q : (λ i → IsProduct ℂ A B (p i)) [ Product.isProduct x ≡ Product.isProduct y ]
|
||||
q = lemPropF propIsProduct p
|
||||
Product≡ : {x y : Product ℂ 𝒜 ℬ} → (Product.raw x ≡ Product.raw y) → x ≡ y
|
||||
Product≡ {x} {y} p i = record { raw = p i ; isProduct = q i }
|
||||
where
|
||||
q : (λ i → IsProduct ℂ 𝒜 ℬ (p i)) [ Product.isProduct x ≡ Product.isProduct y ]
|
||||
q = lemPropF propIsProduct p
|
||||
|
||||
module Try0 {ℓa ℓb : Level} {ℂ : Category ℓa ℓb}
|
||||
(let module ℂ = Category ℂ) {𝒜 ℬ : ℂ.Object} where
|
||||
open P
|
||||
open import Cat.Categories.Span
|
||||
|
||||
open P
|
||||
open Category (span ℂ 𝒜 ℬ)
|
||||
|
||||
module _ where
|
||||
raw : RawCategory _ _
|
||||
raw = record
|
||||
{ Object = Σ[ X ∈ ℂ.Object ] ℂ.Arrow X 𝒜 × ℂ.Arrow X ℬ
|
||||
; Arrow = λ{ (A , a0 , a1) (B , b0 , b1)
|
||||
→ Σ[ f ∈ ℂ.Arrow A B ]
|
||||
ℂ [ b0 ∘ f ] ≡ a0
|
||||
× ℂ [ b1 ∘ f ] ≡ a1
|
||||
}
|
||||
; identity = λ{ {X , f , g} → ℂ.identity {X} , ℂ.rightIdentity , ℂ.rightIdentity}
|
||||
; _<<<_ = λ { {_ , a0 , a1} {_ , b0 , b1} {_ , c0 , c1} (f , f0 , f1) (g , g0 , g1)
|
||||
→ (f ℂ.<<< g)
|
||||
, (begin
|
||||
ℂ [ c0 ∘ ℂ [ f ∘ g ] ] ≡⟨ ℂ.isAssociative ⟩
|
||||
ℂ [ ℂ [ c0 ∘ f ] ∘ g ] ≡⟨ cong (λ φ → ℂ [ φ ∘ g ]) f0 ⟩
|
||||
ℂ [ b0 ∘ g ] ≡⟨ g0 ⟩
|
||||
a0 ∎
|
||||
)
|
||||
, (begin
|
||||
ℂ [ c1 ∘ ℂ [ f ∘ g ] ] ≡⟨ ℂ.isAssociative ⟩
|
||||
ℂ [ ℂ [ c1 ∘ f ] ∘ g ] ≡⟨ cong (λ φ → ℂ [ φ ∘ g ]) f1 ⟩
|
||||
ℂ [ b1 ∘ g ] ≡⟨ g1 ⟩
|
||||
a1 ∎
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
module _ where
|
||||
open RawCategory raw
|
||||
|
||||
propEqs : ∀ {X' : Object}{Y' : Object} (let X , xa , xb = X') (let Y , ya , yb = Y')
|
||||
→ (xy : ℂ.Arrow X Y) → isProp (ℂ [ ya ∘ xy ] ≡ xa × ℂ [ yb ∘ xy ] ≡ xb)
|
||||
propEqs xs = propSig (ℂ.arrowsAreSets _ _) (\ _ → ℂ.arrowsAreSets _ _)
|
||||
|
||||
arrowEq : {X Y : Object} {f g : Arrow X Y} → fst f ≡ fst g → f ≡ g
|
||||
arrowEq {X} {Y} {f} {g} p = λ i → p i , lemPropF propEqs p {snd f} {snd g} i
|
||||
|
||||
private
|
||||
isAssociative : IsAssociative
|
||||
isAssociative {f = f , f0 , f1} {g , g0 , g1} {h , h0 , h1} = arrowEq ℂ.isAssociative
|
||||
|
||||
isIdentity : IsIdentity identity
|
||||
isIdentity {AA@(A , a0 , a1)} {BB@(B , b0 , b1)} {f , f0 , f1} = arrowEq ℂ.leftIdentity , arrowEq ℂ.rightIdentity
|
||||
|
||||
arrowsAreSets : ArrowsAreSets
|
||||
arrowsAreSets {X , x0 , x1} {Y , y0 , y1}
|
||||
= sigPresSet ℂ.arrowsAreSets λ a → propSet (propEqs _)
|
||||
|
||||
isPreCat : IsPreCategory raw
|
||||
IsPreCategory.isAssociative isPreCat = isAssociative
|
||||
IsPreCategory.isIdentity isPreCat = isIdentity
|
||||
IsPreCategory.arrowsAreSets isPreCat = arrowsAreSets
|
||||
|
||||
open IsPreCategory isPreCat
|
||||
|
||||
module _ {𝕏 𝕐 : Object} where
|
||||
open Σ 𝕏 renaming (fst to X ; snd to x)
|
||||
open Σ x renaming (fst to xa ; snd to xb)
|
||||
open Σ 𝕐 renaming (fst to Y ; snd to y)
|
||||
open Σ y renaming (fst to ya ; snd to yb)
|
||||
open import Cat.Equivalence using (composeIso) renaming (_≅_ to _≅_)
|
||||
step0
|
||||
: ((X , xa , xb) ≡ (Y , ya , yb))
|
||||
≅ (Σ[ p ∈ (X ≡ Y) ] (PathP (λ i → ℂ.Arrow (p i) 𝒜) xa ya) × (PathP (λ i → ℂ.Arrow (p i) ℬ) xb yb))
|
||||
step0
|
||||
= (λ p → cong fst p , cong-d (fst ∘ snd) p , cong-d (snd ∘ snd) p)
|
||||
-- , (λ x → λ i → fst x i , (fst (snd x) i) , (snd (snd x) i))
|
||||
, (λ{ (p , q , r) → Σ≡ p λ i → q i , r i})
|
||||
, funExt (λ{ p → refl})
|
||||
, funExt (λ{ (p , q , r) → refl})
|
||||
|
||||
step1
|
||||
: (Σ[ p ∈ (X ≡ Y) ] (PathP (λ i → ℂ.Arrow (p i) 𝒜) xa ya) × (PathP (λ i → ℂ.Arrow (p i) ℬ) xb yb))
|
||||
≅ Σ (X ℂ.≊ Y) (λ iso
|
||||
→ let p = ℂ.isoToId iso
|
||||
in
|
||||
( PathP (λ i → ℂ.Arrow (p i) 𝒜) xa ya)
|
||||
× PathP (λ i → ℂ.Arrow (p i) ℬ) xb yb
|
||||
)
|
||||
step1
|
||||
= symIso
|
||||
(isoSigFst
|
||||
{A = (X ℂ.≊ Y)}
|
||||
{B = (X ≡ Y)}
|
||||
(ℂ.groupoidObject _ _)
|
||||
{Q = \ p → (PathP (λ i → ℂ.Arrow (p i) 𝒜) xa ya) × (PathP (λ i → ℂ.Arrow (p i) ℬ) xb yb)}
|
||||
ℂ.isoToId
|
||||
(symIso (_ , ℂ.asTypeIso {X} {Y}) .snd)
|
||||
)
|
||||
|
||||
      step2
        : Σ (X ℂ.≊ Y) (λ iso
          → let p = ℂ.isoToId iso
            in
              ( PathP (λ i → ℂ.Arrow (p i) 𝒜) xa ya)
              × PathP (λ i → ℂ.Arrow (p i) ℬ) xb yb
          )
        ≅ ((X , xa , xb) ≊ (Y , ya , yb))
      step2
        = ( λ{ (iso@(f , f~ , inv-f) , p , q)
          → ( f , sym (ℂ.domain-twist-sym iso p) , sym (ℂ.domain-twist-sym iso q))
          , ( f~ , sym (ℂ.domain-twist iso p) , sym (ℂ.domain-twist iso q))
          , arrowEq (fst inv-f)
          , arrowEq (snd inv-f)
          }
        )
        , (λ{ (f , f~ , inv-f , inv-f~) →
          let
            iso : X ℂ.≊ Y
            iso = fst f , fst f~ , cong fst inv-f , cong fst inv-f~
            p : X ≡ Y
            p = ℂ.isoToId iso
            pA : ℂ.Arrow X 𝒜 ≡ ℂ.Arrow Y 𝒜
            pA = cong (λ x → ℂ.Arrow x 𝒜) p
            pB : ℂ.Arrow X ℬ ≡ ℂ.Arrow Y ℬ
            pB = cong (λ x → ℂ.Arrow x ℬ) p
            k0 = begin
              coe pB xb       ≡⟨ ℂ.coe-dom iso ⟩
              xb ℂ.<<< fst f~ ≡⟨ snd (snd f~) ⟩
              yb ∎
            k1 = begin
              coe pA xa       ≡⟨ ℂ.coe-dom iso ⟩
              xa ℂ.<<< fst f~ ≡⟨ fst (snd f~) ⟩
              ya ∎
          in iso , coe-lem-inv k1 , coe-lem-inv k0})
        , funExt (λ x → lemSig
          (λ x → propSig prop0 (λ _ → prop1))
          _ _
          (Σ≡ refl (ℂ.propIsomorphism _ _ _)))
        , funExt (λ{ (f , _) → lemSig propIsomorphism _ _ (Σ≡ refl (propEqs _ _ _))})
        where
        prop0 : ∀ {x} → isProp (PathP (λ i → ℂ.Arrow (ℂ.isoToId x i) 𝒜) xa ya)
        prop0 {x} = pathJ (λ y p → ∀ x → isProp (PathP (λ i → ℂ.Arrow (p i) 𝒜) xa x)) (λ x → ℂ.arrowsAreSets _ _) Y (ℂ.isoToId x) ya
        prop1 : ∀ {x} → isProp (PathP (λ i → ℂ.Arrow (ℂ.isoToId x i) ℬ) xb yb)
        prop1 {x} = pathJ (λ y p → ∀ x → isProp (PathP (λ i → ℂ.Arrow (p i) ℬ) xb x)) (λ x → ℂ.arrowsAreSets _ _) Y (ℂ.isoToId x) yb
      -- One thing to watch out for here is that the isomorphisms going
      -- forwards must compose to give idToIso.
      iso
        : ((X , xa , xb) ≡ (Y , ya , yb))
        ≅ ((X , xa , xb) ≊ (Y , ya , yb))
      iso = step0 ⊙ step1 ⊙ step2
        where
        infixl 5 _⊙_
        _⊙_ = composeIso
      equiv1
        : ((X , xa , xb) ≡ (Y , ya , yb))
        ≃ ((X , xa , xb) ≊ (Y , ya , yb))
      equiv1 = _ , fromIso _ _ (snd iso)

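    -- `equiv1` identifies the path space of two spans with the isomorphisms
    -- between them; `univalenceFrom≃` turns that equivalence into univalence
    -- of the span category.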
    univalent : Univalent
    univalent = univalenceFrom≃ equiv1

    isCat : IsCategory raw
    IsCategory.isPreCategory isCat = isPreCat
    IsCategory.univalent isCat = univalent

  cat : Category _ _
  cat = record
    { raw = raw
    ; isCategory = isCat
    }

  open Category cat

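  -- A terminal object in the span category is exactly a product of 𝒜 and ℬ
  -- in ℂ; `lemma` records this as an equivalence of types.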
  lemma : Terminal ≃ Product ℂ 𝒜 ℬ
  lemma = fromIsomorphism Terminal (Product ℂ 𝒜 ℬ) (f , g , inv)
    -- C-x 8 RET MATHEMATICAL BOLD SCRIPT CAPITAL A
    -- 𝒜
    where
    f : Terminal → Product ℂ 𝒜 ℬ
    f ((X , x0 , x1) , uniq) = p
      where
      rawP : RawProduct ℂ 𝒜 ℬ
      rawP = record
        { object = X
        ; fst = x0
        ; snd = x1
        }
      -- open RawProduct rawP renaming (fst to x0 ; snd to x1)
      module _ {Y : ℂ.Object} (p0 : ℂ [ Y , 𝒜 ]) (p1 : ℂ [ Y , ℬ ]) where
        uy : isContr (Arrow (Y , p0 , p1) (X , x0 , x1))
        uy = uniq {Y , p0 , p1}
        open Σ uy renaming (fst to Y→X ; snd to contractible)
        open Σ Y→X renaming (fst to p0×p1 ; snd to cond)
        ump : ∃![ f×g ] (ℂ [ x0 ∘ f×g ] ≡ p0 P.× ℂ [ x1 ∘ f×g ] ≡ p1)
        ump = p0×p1 , cond , λ {f} cond-f → cong fst (contractible (f , cond-f))
      isP : IsProduct ℂ 𝒜 ℬ rawP
      isP = record { ump = ump }
      p : Product ℂ 𝒜 ℬ
      p = record
        { raw = rawP
        ; isProduct = isP
        }
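    -- The other direction: a product of 𝒜 and ℬ gives a terminal object of
    -- the span category, the universal property providing the unique arrow
    -- into it.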
    g : Product ℂ 𝒜 ℬ → Terminal
    g p = 𝒳 , t
      where
      open Product p renaming (object to X ; fst to x₀ ; snd to x₁) using ()
      module p = Product p
      module isp = IsProduct p.isProduct
      𝒳 : Object
      𝒳 = X , x₀ , x₁
      module _ {𝒴 : Object} where
        open Σ 𝒴 renaming (fst to Y)
        open Σ (snd 𝒴) renaming (fst to y₀ ; snd to y₁)
        ump = p.ump y₀ y₁
        open Σ ump renaming (fst to f')
        open Σ (snd ump) renaming (fst to f'-cond)
        𝒻 : Arrow 𝒴 𝒳
        𝒻 = f' , f'-cond
        contractible : (f : Arrow 𝒴 𝒳) → 𝒻 ≡ f
        contractible ff@(f , f-cond) = res
          where
          k : f' ≡ f
          k = snd (snd ump) f-cond
          prp : (a : ℂ.Arrow Y X) → isProp
            ( (ℂ [ x₀ ∘ a ] ≡ y₀)
            × (ℂ [ x₁ ∘ a ] ≡ y₁)
            )
          prp f f0 f1 = Σ≡
            (ℂ.arrowsAreSets _ _ (fst f0) (fst f1))
            (ℂ.arrowsAreSets _ _ (snd f0) (snd f1))
          h :
            ( λ i
              → ℂ [ x₀ ∘ k i ] ≡ y₀
              × ℂ [ x₁ ∘ k i ] ≡ y₁
            ) [ f'-cond ≡ f-cond ]
          h = lemPropF prp k
          res : (f' , f'-cond) ≡ (f , f-cond)
          res = Σ≡ k h
      t : IsTerminal 𝒳
      t {𝒴} = 𝒻 , contractible
    ve-re : ∀ x → g (f x) ≡ x
    ve-re x = Propositionality.propTerminal _ _
    re-ve : ∀ p → f (g p) ≡ p
    re-ve p = Product≡ e
      where
      module p = Product p
      -- RawProduct does not have eta-equality.
      e : Product.raw (f (g p)) ≡ Product.raw p
      RawProduct.object (e i) = p.object
      RawProduct.fst (e i) = p.fst
      RawProduct.snd (e i) = p.snd
    inv : AreInverses f g
    inv = funExt ve-re , funExt re-ve
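  -- Terminal objects form a proposition and equivalences preserve h-levels,
  -- so products of 𝒜 and ℬ in ℂ form a proposition as well.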
  propProduct : isProp (Product ℂ 𝒜 ℬ)
  propProduct = equivPreservesNType {n = ⟨-1⟩} lemma Propositionality.propTerminal

@@ -348,10 +184,7 @@ module _ {ℓa ℓb : Level} {ℂ : Category ℓa ℓb} {A B : Category.Object
    module y = HasProducts y

    productEq : x.product ≡ y.product
    productEq = funExt λ A → funExt λ B → propProduct _ _

  propHasProducts : isProp (HasProducts ℂ)
  propHasProducts x y i = record { product = productEq x y i }

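-- `fmap≡` is just `cong`, stated in terms of `Path`.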
fmap≡ : {A : Set} {a0 a1 : A} {B : Set} → (f : A → B) → Path a0 a1 → Path (f a0) (f a1)
fmap≡ = cong

@@ -9,9 +9,9 @@ open import Cat.Category.Functor
open import Cat.Category.NaturalTransformation
  renaming (module Properties to F)
  using ()

open import Cat.Categories.Fun using (module Fun)
open import Cat.Categories.Opposite
open import Cat.Categories.Sets hiding (presheaf)

-- There is no (small) category of categories. So we won't use _⇑_ from
-- `HasExponential`.

@@ -105,3 +105,38 @@ module _ {ℓ : Level} {A : Set ℓ} where
  ntypeCumulative : ∀ {n m} → n ≤′ m → HasLevel ⟨ n ⟩₋₂ A → HasLevel ⟨ m ⟩₋₂ A
  ntypeCumulative {m} ≤′-refl lvl = lvl
  ntypeCumulative {n} {suc m} (≤′-step le) lvl = HasLevel+1 ⟨ m ⟩₋₂ (ntypeCumulative le lvl)

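  -- Π-types preserve groupoids: a dependent function type into a family of
  -- groupoids is itself a groupoid (groupoids live at level `S (S (S ⟨-2⟩))`).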
  grpdPi : {ℓb : Level} {B : A → Set ℓb}
    → ((a : A) → isGrpd (B a)) → isGrpd ((a : A) → (B a))
  grpdPi = piPresNType (S (S (S ⟨-2⟩)))

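  -- The same holds for implicit Π-types: `{a : A} → B a` is equivalent to
  -- `(a : A) → B a` (shown with `gradLemma`), and equivalences preserve
  -- n-types, so the groupoid structure transfers.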
  grpdPiImpl : {ℓb : Level} {B : A → Set ℓb}
    → ({a : A} → isGrpd (B a)) → isGrpd ({a : A} → (B a))
  grpdPiImpl {B = B} g = equivPreservesNType {A = Expl} {B = Impl} {n = one} e (grpdPi (λ a → g))
    where
    one = (S (S (S ⟨-2⟩)))
    t : ({a : A} → HasLevel one (B a))
    t = g
    Impl = {a : A} → B a
    Expl = (a : A) → B a
    expl : Impl → Expl
    expl f a = f {a}
    impl : Expl → Impl
    impl f {a} = f a
    e : Expl ≃ Impl
    e = impl , (gradLemma impl expl (λ f → refl) (λ f → refl))

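  -- n-types are cumulative, so in particular sets and propositions are
  -- groupoids.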
  setGrpd : isSet A → isGrpd A
  setGrpd = ntypeCumulative
    {suc (suc zero)} {suc (suc (suc zero))}
    (≤′-step ≤′-refl)

  propGrpd : isProp A → isGrpd A
  propGrpd = ntypeCumulative
    {suc zero} {suc (suc (suc zero))}
    (≤′-step (≤′-step ≤′-refl))

module _ {ℓa ℓb : Level} {A : Set ℓa} {B : A → Set ℓb} where
  open TLevel

  grpdSig : isGrpd A → (∀ a → isGrpd (B a)) → isGrpd (Σ A B)
  grpdSig = sigPresNType {n = S (S (S ⟨-2⟩))}