Compare commits

...

70 Commits
1.5.0 ... dev

Author SHA1 Message Date
Frederik Hanghøj Iversen e7f5648607 Disable cube 2018-10-30 14:37:15 +01:00
Frederik Hanghøj Iversen 95a8e82d40 Update agda-stdlib to newer version 2018-10-30 14:28:09 +01:00
Frederik Hanghøj Iversen 8f15001a93 Use smaller symbols for arrows in presentation 2018-10-29 12:39:12 +01:00
Frederik Hanghøj Iversen c5020a0d87 Add Chalmers logo to front page 2018-09-11 15:16:14 +02:00
Frederik Hanghøj Iversen d7a07da07b Update to most recent version of stdlib 2018-08-04 13:46:59 +02:00
Frederik Hanghøj Iversen 7836367f4a Use HEAD version of cubical and stdlib 2018-07-19 20:22:17 +02:00
Andrea Vezzosi e16a4b8189 isEquiv is now a record 2018-07-18 16:49:01 +02:00
Frederik Hanghøj Iversen 9ee05e1a36 Universally quantify test object in epi- mono- morphism
Closes #26
2018-07-17 17:03:23 +02:00
Frederik Hanghøj Iversen 188bba6c8d Old unstaged changes
I hope these are mostly non dangerous.  Looks like it's mainly some reformatting.
2018-07-17 16:51:16 +02:00
Frederik Hanghøj Iversen 6f275247dd Final presentation 2018-06-07 15:20:14 +02:00
Frederik Hanghøj Iversen 5a748c57f0 Update RAEDME 2018-05-31 02:12:45 +02:00
Frederik Hanghøj Iversen c784723184 Insert link to pre-compiled pdf 2018-05-31 02:09:46 +02:00
Frederik Hanghøj Iversen 93871358a3 Update README 2018-05-31 02:05:56 +02:00
Frederik Hanghøj Iversen 66e5f46912 Untabify readme 2018-05-31 02:00:29 +02:00
Frederik Hanghøj Iversen 9497a650ea Test with master Agda version 2.6.0-d3efe64 2018-05-31 01:56:44 +02:00
Frederik Hanghøj Iversen 33f7e2ebbb Merge branch 'dev' 2018-05-31 01:07:31 +02:00
Frederik Hanghøj Iversen 27ae920634 Fixup some missing files 2018-05-31 01:07:05 +02:00
Frederik Hanghøj Iversen 08046d05dc Update makefile 2018-05-29 15:35:20 +02:00
Frederik Hanghøj Iversen 98cad057d5 Merge branch 'dev' 2018-05-29 15:26:38 +02:00
Frederik Hanghøj Iversen 49f1262b2c Update change log 2018-05-29 15:24:18 +02:00
Frederik Hanghøj Iversen 6d362af88e Add type-synonym 2018-05-29 15:14:46 +02:00
Frederik Hanghøj Iversen 392d656709 Move included graphics 2018-05-29 15:14:27 +02:00
Frederik Hanghøj Iversen 37a675a84f Final touch-up on report and acknowledgments 2018-05-29 15:09:38 +02:00
Frederik Hanghøj Iversen b992d5a7f2 Use unicode symbols 2018-05-28 17:44:23 +02:00
Frederik Hanghøj Iversen 1f750e2275 Erratta 2018-05-28 17:32:56 +02:00
Frederik Hanghøj Iversen 636b5f3e27 Makefile uses included libraries 2018-05-26 01:15:57 +02:00
Frederik Hanghøj Iversen 326951d826 Put in brackets for readability 2018-05-24 15:57:30 +02:00
Frederik Hanghøj Iversen 2d0dfab12a Remove some TODO-notes, add section on motifs 2018-05-23 18:28:27 +02:00
Frederik Hanghøj Iversen fc7e504359 Correctly terminate appendix section 2018-05-23 17:49:54 +02:00
Frederik Hanghøj Iversen 251fcf1966 Add backlog based on comments from Andrea, implement some of them 2018-05-23 17:34:50 +02:00
Frederik Hanghøj Iversen 879f5bab52 Use `fromIsomorphism` globally 2018-05-22 18:01:03 +02:00
Frederik Hanghøj Iversen 9848fac672 Provide grpdSig 2018-05-22 16:31:26 +02:00
Frederik Hanghøj Iversen cb0117819b Monoidal monads are also groupoids 2018-05-22 16:28:23 +02:00
Frederik Hanghøj Iversen b116247702 Kleisli monads are groupoids 2018-05-22 16:18:22 +02:00
Frederik Hanghøj Iversen e7f40eed8a Scaffolding for proving groupoid for monads 2018-05-22 15:40:30 +02:00
Frederik Hanghøj Iversen 1f2b105f9d Provide grpdPiImpl 2018-05-22 14:43:21 +02:00
Frederik Hanghøj Iversen 1683178f1c Ignore index-files 2018-05-22 13:48:30 +02:00
Frederik Hanghøj Iversen 01159930de Add section on functors and natural transformations
Also do not use ugly overbar
2018-05-22 13:45:52 +02:00
Frederik Hanghøj Iversen 2fce963072 TENTATIVE COMMIT 2018-05-18 13:14:41 +02:00
Frederik Hanghøj Iversen d4dc125fb0 Merge remote-tracking branch 'Saizan/benchmark' into dev 2018-05-16 11:38:12 +02:00
Frederik Hanghøj Iversen 1c0b0d9db2 Small changes 2018-05-16 11:36:26 +02:00
Frederik Hanghøj Iversen 4073d70189 Add note about constructive intepretation of univalence 2018-05-16 11:03:34 +02:00
Frederik Hanghøj Iversen be88602d24 Fix spacing after 'e.g.' and 'i.e.' 2018-05-16 11:01:07 +02:00
Andrea Vezzosi 47c881ba2a Voe: gone back to equational reasoning, as it's fairly cheap now. 2018-05-16 10:50:56 +02:00
Andrea Vezzosi 9f7a13b5da no-eta-equality for monads speeds up Voevodsky 2018-05-16 10:41:41 +02:00
Frederik Hanghøj Iversen d33c814e78 Add introduction 2018-05-15 18:36:33 +02:00
Andrea Vezzosi c75a1d5d8b commented out code with holes 2018-05-15 17:21:57 +02:00
Frederik Hanghøj Iversen 8a0ea9f4a5 Use darkorange for all bordercolors 2018-05-15 17:11:01 +02:00
Frederik Hanghøj Iversen 21363dbb78 Move opposite- and span- category to own modules 2018-05-15 16:38:07 +02:00
Frederik Hanghøj Iversen aced19e990 Various changes proposed by Andreas 2018-05-15 16:08:29 +02:00
Frederik Hanghøj Iversen 4d73514ab5 Use long name 2018-05-14 11:50:14 +02:00
Frederik Hanghøj Iversen 30cf0bb765 Add additional example of pathJ 2018-05-11 13:09:03 +02:00
Frederik Hanghøj Iversen 058f3c15a8 Provide example of using pathJ 2018-05-10 15:29:56 +02:00
Frederik Hanghøj Iversen 513d91ae4f Merge branch 'dev' 2018-05-10 14:29:41 +02:00
Frederik Hanghøj Iversen 3574ebc323 Add abstract 2018-05-10 14:28:54 +02:00
Frederik Hanghøj Iversen 3d618c001b Layout headache 2018-05-10 14:10:11 +02:00
Frederik Hanghøj Iversen ac6e838e48 Move package config to packages.tex 2018-05-10 13:11:07 +02:00
Frederik Hanghøj Iversen 00df3ccb45 Add conclusion 2018-05-10 12:40:39 +02:00
Frederik Hanghøj Iversen 258aa3d0e4 Write conclusion 2018-05-10 12:19:44 +02:00
Frederik Hanghøj Iversen 616d85351a Remove all my beloved contractions :( 2018-05-09 18:47:12 +02:00
Frederik Hanghøj Iversen 34798632f2 Fix its it's mistakes 2018-05-09 18:34:05 +02:00
Frederik Hanghøj Iversen bc8309c0cd Only index some things, change crossref to citation 2018-05-09 18:29:44 +02:00
Frederik Hanghøj Iversen d8f1aabed5 Expand gitignore 2018-05-09 18:25:01 +02:00
Frederik Hanghøj Iversen 9842c66eea Use macros extensively 2018-05-09 18:24:07 +02:00
Frederik Hanghøj Iversen 179570edf0 Changes based on Pierre's suggestions 2018-05-09 18:13:36 +02:00
Frederik Hanghøj Iversen 2b0dfe4984 Change titlepage 2018-05-09 12:33:58 +02:00
Frederik Hanghøj Iversen 2e11a2390f Shorten link 2018-05-08 23:05:45 +02:00
Frederik Hanghøj Iversen ae9a0b3de7 Encourage people to read my report 2018-05-08 22:46:17 +02:00
Frederik Hanghøj Iversen 655c12720a Remove obsolete comment 2018-05-08 18:49:17 +02:00
Frederik Hanghøj Iversen a5571de191 Document version numbers 2018-05-08 18:45:36 +02:00
53 changed files with 3715 additions and 1899 deletions

1
.gitignore vendored Normal file
View File

@ -0,0 +1 @@
html/

2
.gitmodules vendored
View File

@ -1,6 +1,6 @@
[submodule "libs/cubical"]
path = libs/cubical
url = git@github.com:fredefox/cubical.git
url = git@github.com:Saizan/cubical-demo.git
[submodule "libs/agda-stdlib"]
path = libs/agda-stdlib
url = git@github.com:agda/agda-stdlib.git

View File

@ -1,6 +1,13 @@
Change log
=========
Version 1.6.0
-------------
This version mainly contains changes to the report.
This is the version I submit for my MSc.
Version 1.5.0
-------------
Prove postulates in `Cat.Wishlist`:

View File

@ -1,5 +1,13 @@
build: src/**.agda
agda src/Cat.agda
agda --library-file ./libraries src/Cat.agda
clean:
find src -name "*.agdai" -type f -delete
html: src/**.agda
agda --html src/Cat.agda
upload: html
scp -r html/ remote11.chalmers.se:www/cat/doc/
.PHONY: upload clean

View File

@ -1,38 +1,52 @@
Description
===========
This project aims to formalize some parts of category theory using cubical agda
— an extension to agda permitting univalence. To learn more about this
[readthedocs](https://agda.readthedocs.io/en/latest/language/cubical.html).
— an extension to agda permitting univalence. To learn more about this
[read the docs](https://agda.readthedocs.io/en/latest/language/cubical.html).
This project draws a lot of inspiration from [the
HoTT-book](https://homotopytypetheory.org/book/).
Installation
============
If you want more information about this project, then you're in luck.
This is my master's thesis. Go ahead and read it
[here](http://web.student.chalmers.se/~hanghj/papers/univalent-categories.pdf)
or alternatively build it yourself like so:
cd doc/
make
Dependencies
------------
To succesfully compile the following is needed:
============
To successfully compile the following is needed:
* The Agda release candidate 2.5.4[^1]
* The experimental branch of [Agda Standard Library](https://github.com/agda/agda-stdlib)
* The master branch of Agda.
* [Agda Standard Library](https://github.com/agda/agda-stdlib)
* [Cubical](https://github.com/Saizan/cubical-demo/)
[^1]: At least version >= [`707ce6042b6a3bdb26521f3fe8dfe5d8a8470a43`](https://github.com/agda/agda/commit/707ce6042b6a3bdb26521f3fe8dfe5d8a8470a43)
Has been tested with:
It's important to have the right versions of these - but which one is the right
one is in constant flux. It's most likely the newest one.
* Agda version 2.6.0-d3efe64
I've used git submodules to manage dependencies. Unfortunately Agda does not
allow specifying libraries to be used only as local dependencies. So the
submodules are mostly used for documentation.
Building
========
You can build the library with
You can let Agda know about these libraries by appending them to your global
libraries file like so: (NB!: There is a good reason this is not in a
makefile. So please verify that you know what you are doing; you probably
already have standard-library in your libraries)
git submodule update --init
make
AGDA_LIB=~/.agda
readlink -f libs/*/*.agda-lib | tee -a $AGDA_LIB/libraries
The Makefile takes care of using the right dependencies.
Unfortunately I have not found a way to automatically inform
`agda-mode` that it should use these dependencies. So what you can do
instead is to copy these libraries to a global location and then add
them system-wide:
Anyways, assuming you have this set up you should be good to go.
mkdir -p ~/.agda/libs
cd ~/.agda/libs
git clone $CAT/libs/std-lib
git clone $CAT/libs/cubical
echo << EOF | tee -a ~/.agda/libraries
$HOME/.agda/libs/agda-stdlib/standard-library.agda-lib
$HOME/.agda/libs/cubical/cubical.agda-lib
EOF
Or you could symlink them as well if you want.

6
doc/.gitignore vendored
View File

@ -4,6 +4,12 @@
*.log
*.out
*.pdf
!assets/**
*.bbl
*.blg
*.toc
*.idx
*.ilg
*.ind
*.nav
*.snm

View File

@ -1,7 +1,95 @@
Presentation
====
Find one clear goal.
Remember crowd-control.
Leave out:
lemPropF
Outline
-------
Introduction
A formalization of Category Theory in cubical Agda.
Cubical Agda: A constructive interpretation of functional
extensionality and univalence
Talk about structure of library:
===
What can I say about reusability?
Misc
====
Meeting with Andrea May 18th
============================
App. 2 in HoTT gives typing rule for pathJ including a computational
rule for it.
If you have this computational rule definitionally, then you wouldn't
need to use `pathJprop`.
In discussion-section I mention HITs. I should remove this or come up
with a more elaborate example of something you could do, e.g.
something with pushouts in the category of sets.
The type Prop is a type where terms are *judgmentally* equal not just
propositionally so.
Maybe mention that Andreas Källberg is working on proving the
initiality conjecture.
Intensional Type Theory (ITT): Judgmental equality is decidable
Extensional Type Theory (ETT): Reflection is enough to make judgmental
equality undecidable.
Reflection : a ≡ b → a = b
ITT does not have reflections.
HoTT ~ ITT + axiomatized univalence
Agda ~ ITT + K-rule
Coq ~ ITT (no K-rule)
Cubical Agda ~ ITT + Path + Glue
Prop is impredicative in Coq (whatever that means)
Prop ≠ hProp
Comments about abstract
-----
Pattern matching for paths (?)
Intro
-----
Main feature of judgmental equality is the conversion rule.
Conor explained: K + eliminators ≡ pat. matching
Explain judgmental equality independently of type-checking
Soundness for equality means that if `x = y` then `x` and `y` must be
equal according to the theory/model.
Decidability of `=` is a necessary condition for typechecking to be
decidable.
Canonicity is a nice-to-have, though without canonicity terms can get
stuck. If we postulate results about judgmental equality, e.g. funext,
then we can construct a term of type natural number that is not a
numeral. Therefore stating canonicity with natural numbers:
∀ t . ⊢ t : N , ∃ n : N . ⊢ t = sⁿ 0 : N
is a sufficient condition to get a well-behaved equality.
Eta-equality for RawFunctor means that the associative law for
functors holds definitionally.
The computational property of funExt is only relevant in two places in my
whole formalization. Univalence and gradLemma do not influence any
proofs.

View File

@ -48,3 +48,6 @@ cleanall:
clean:
latexmk -c
read: all
xdg-open $(PROJNAME).pdf

23
doc/abstract.tex Normal file
View File

@ -0,0 +1,23 @@
\chapter*{Abstract}
The usual notion of propositional equality in intensional type-theory
is restrictive. For instance it does not admit functional
extensionality nor univalence. This poses a severe limitation on both
what is \emph{provable} and the \emph{re-usability} of proofs. Recent
developments have, however, resulted in cubical type theory, which
permits a constructive proof of univalence. The programming language
Agda has been extended with capabilities for working in such a cubical
setting. This thesis will explore the usefulness of this extension in
the context of category theory.
The thesis will motivate the need for univalence and explain why
propositional equality in cubical Agda is more expressive than in
standard Agda. Alternative approaches to Cubical Agda will be
presented and their pros and cons will be explained. As an example of
the application of univalence, two formulations of monads will be
presented: Namely monads in the monoidal form and monads in the
Kleisli form. Using univalence, it will be shown how these are equal.
Finally the thesis will explain the challenges that a developer will
face when working with cubical Agda and give some techniques to
overcome these difficulties. It will suggest how further work can
help alleviate some of these challenges.

13
doc/acknowledgements.tex Normal file
View File

@ -0,0 +1,13 @@
\chapter*{Acknowledgements}
I would like to thank my supervisor Thierry Coquand for giving me a
chance to work on this interesting topic. I would also like to thank
Andrea Vezzosi for some very long and very insightful meetings during
the project. It is fascinating and almost uncanny how quickly Andrea
can conjure up various proofs. I also want to recognize the support
of Knud Højgaards Fond who graciously sponsored me with a 20,000 DKK
scholarship which helped finance the two years I have spent studying
abroad. I would also like to give a warm thanks to my fellow students
Pierre~Kraft and Nachiappan~Valliappan who have made the time spent
working on the thesis much more enjoyable. Lastly I would like to
give a special thanks to Valentina~Méndez who has been a great moral
support throughout the whole process.

View File

@ -0,0 +1,74 @@
\chapter{Non-reducing functional extensionality}
\label{app:abstract-funext}
In two places in my formalization the computational behaviour of
functional extensionality was used. The reduction behaviour can be
disabled by marking functional extensionality as abstract. Below, the
fully normalized goal and context with functional extensionality
marked as abstract are shown. The excerpts are from the module
%
\begin{center}
\sourcelink{Cat.Category.Monad.Voevodsky}
\end{center}
%
where this is also written as a comment next to the proofs. When
functional extensionality is not abstract the goal and current value
are the same. It is of course necessary to show the fully normalized
goal and context, as otherwise the reduction behaviour is not forced.
\subsubsection*{First goal}
Goal:
\begin{verbatim}
PathP (λ _ → §2-3.§2 omap (λ {z} → pure))
(§2-fromMonad
(.Cat.Category.Monad.toKleisli
(.Cat.Category.Monad.toMonoidal (§2-3.§2.toMonad m))))
(§2-fromMonad (§2-3.§2.toMonad m))
\end{verbatim}
Have:
\begin{verbatim}
PathP
(λ i →
§2-3.§2 K.IsMonad.omap
(K.RawMonad.pure
(K.Monad.raw
(funExt (λ m₁ → K.Monad≡ (.Cat.Category.Monad.toKleisliRawEq m₁))
i (§2-3.§2.toMonad m)))))
(§2-fromMonad
(.Cat.Category.Monad.toKleisli
(.Cat.Category.Monad.toMonoidal (§2-3.§2.toMonad m))))
(§2-fromMonad (§2-3.§2.toMonad m))
\end{verbatim}
\subsubsection*{Second goal}
Goal:
\begin{verbatim}
PathP (λ _ → §2-3.§1 omap (λ {X} → pure))
(§1-fromMonad
(.Cat.Category.Monad.toMonoidal
(.Cat.Category.Monad.toKleisli (§2-3.§1.toMonad m))))
(§1-fromMonad (§2-3.§1.toMonad m))
\end{verbatim}
Have:
\begin{verbatim}
PathP
(λ i →
§2-3.§1
(RawFunctor.omap
(Functor.raw
(M.RawMonad.R
(M.Monad.raw
(funExt
(λ m₁ → M.Monad≡ (.Cat.Category.Monad.toMonoidalRawEq m₁)) i
(§2-3.§1.toMonad m))))))
{X}
fst
(M.RawMonad.pureNT
(M.Monad.raw
(funExt
(λ m₁ → M.Monad≡ (.Cat.Category.Monad.toMonoidalRawEq m₁)) i
(§2-3.§1.toMonad m))))
X))
(§1-fromMonad
(.Cat.Category.Monad.toMonoidal
(.Cat.Category.Monad.toKleisli (§2-3.§1.toMonad m))))
(§1-fromMonad (§2-3.§1.toMonad m))
\end{verbatim}

Binary file not shown.

Binary file not shown.

Binary file not shown.

BIN
doc/assets/isomorphism.pdf Normal file

Binary file not shown.

BIN
doc/assets/isomorphism.png Normal file

Binary file not shown.


BIN
doc/assets/logo_ch_gu.pdf Normal file

Binary file not shown.

BIN
doc/assets/logo_eng.pdf Normal file

Binary file not shown.

BIN
doc/assets/logo_swe.pdf Normal file

Binary file not shown.

View File

@ -40,13 +40,17 @@
\newcommand*{\subtitle}[1]{\gdef\@subtitle{#1}}
%% FRONTMATTER
\newcommand*{\myfrontmatter}{%
\newgeometry{top=3cm, bottom=3cm,left=2.25 cm, right=2.25cm}
%% \newgeometry{top=3cm, bottom=3cm,left=2.25 cm, right=2.25cm}
\AddToShipoutPicture*{\backgroundpic{-4}{56.7}{assets/frontpage_gu_eng.pdf}}
\ClearShipoutPicture
\addtolength{\voffset}{2cm}
\begingroup
\thispagestyle{empty}
{\Huge\@title}\\[.5cm]
{\Large A formalization of category theory in Cubical Agda}\\[2.5cm]
{\Large A formalization of category theory in Cubical Agda}\\[6cm]
\begin{center}
\includegraphics[width=\linewidth,keepaspectratio]{isomorphism.png}
\includegraphics[width=\linewidth,keepaspectratio]{assets/isomorphism.pdf}
%% \includepdf{isomorphism.pdf}
\end{center}
% Cover text
\vfill
@ -60,13 +64,13 @@ Master's thesis in Computer Science
% BACK OF COVER PAGE (BLANK PAGE)
\newpage
%% \newgeometry{a4paper} % Temporarily change margins
%% \restoregeometry
\restoregeometry
\thispagestyle{empty}
\null
}
\renewcommand*{\maketitle}{%
\begin{titlepage}
%% \begin{titlepage}
% TITLE PAGE
\newpage
@ -79,7 +83,7 @@ Master's thesis in Computer Science
\vfill
\centering
\includegraphics[width=0.2\pdfpagewidth]{logo_eng.pdf}
\includegraphics[width=0.2\pdfpagewidth]{assets/logo_eng.pdf}
\vspace{5mm}
\textsc{Department of Computer Science and Engineering}\\
@ -131,5 +135,6 @@ Telephone +46 31 772 1000 \setlength{\parskip}{0.5cm}\\
%Printed by [Name of printing company]\\
Gothenburg, Sweden \the\year
\restoregeometry
\end{titlepage}}
%% \restoregeometry
%% \end{titlepage}
}

View File

@ -1,3 +1,63 @@
\chapter{Conclusion}
This thesis highlighted some issues with the standard inductive
definition of propositional equality used in Agda. Functional
extensionality and univalence are examples of two propositions not
admissible in Intensional Type Theory (ITT). This has a big impact on
what is provable and the reusability of proofs. This issue is
overcome with an extension to Agda's type system called Cubical Agda.
With Cubical Agda both functional extensionality and univalence are
admissible. Cubical Agda is more expressive, but there are certain
issues that arise that are not present in standard Agda. For one
thing Agda enjoys Uniqueness of Identity Proofs (UIP) though a flag
exists to turn this off. This feature is not present in Cubical Agda.
Rather than having unique identity proofs cubical Agda gives rise to a
hierarchy of types with increasing \nomen{homotopical
structure}{homotopy levels}. It turns out to be useful to build the
formalization with this hierarchy in mind as it can simplify proofs
considerably. Another issue one must overcome in Cubical Agda is when
a type has a field whose type depends on a previous field. In this
case paths between such types will be heterogeneous paths. In
practice it turns out to be considerably more difficult to work with
heterogeneous paths than with homogeneous paths. This thesis
demonstrated the application of some techniques to overcome these
difficulties, such as based path induction.
\TODO{\ldots}
This thesis formalizes some of the core concepts from category theory
including: categories, functors, products, exponentials, Cartesian
closed categories, natural transformations, the Yoneda embedding,
monads and more. Category theory is an interesting case study for the
application of cubical Agda for two reasons in particular. One reason
is because category theory is the study of abstract algebra of
functions, meaning that functional extensionality is particularly
relevant. Another reason is that in category theory it is commonplace
to identify isomorphic structures. Univalence allows for making this
notion precise. This thesis also demonstrated another technique that
is common in category theory; namely to define categories to prove
properties of other structures. Specifically a category was defined
to demonstrate that any two product objects in a category are
isomorphic. Furthermore the thesis showed two formulations of monads
and proved that they indeed are equivalent: namely monads in the
monoidal and the Kleisli form. The monoidal formulation is more
typical of category-theoretic presentations and the Kleisli
formulation will be more familiar to functional programmers. It would
have been very difficult to make a similar proof with setoids, and
such a proof would be very difficult to read. In the formalization we
also saw how paths can be used to extract functions. A path between
two types induces an isomorphism between the two types. This e.g.\
permits developers to write a monad instance for a given type using
the Kleisli formulation. By transporting along the path between the
monoidal and Kleisli formulations one can reuse all the operations and
results shown for monoidal monads in the context of Kleisli monads.
%%
%% problem with inductive type
%% overcome with cubical
%% the path type
%% homotopy levels
%% depdendent paths
%%
%% category theory
%% algebra of functions ~ funExt
%% identify isomorphic types ~ univalence
%% using categories to prove properties
%% computational properties
%% reusability, compositional

View File

@ -1,35 +1,37 @@
\chapter{Cubical Agda}
\section{Propositional equality}
In Agda judgmental equality is a feature of the type-system. It's a property of
types that can be checked by computational means. In the example from the
introduction $n + 0$ can be judged to be equal to $n$ simply by expanding the
definition of $+$.
Judgmental equality in Agda is a feature of the type system. It is
something that can be checked automatically by the type checker: In
the example from the introduction $n + 0$ can be judged to be equal to
$n$ simply by expanding the definition of $+$.
Propositional equality on the other hand is defined within the language itself.
Cubical Agda extends the underlying type system (\TODO{Cite someone smarter than
me with a good resource on this}) but introduces a data-type within the
languages.
On the other hand, propositional equality is something defined within
the language itself. Propositional equality cannot be derived
automatically. The normal definition of propositional equality is an
inductive data type. Cubical Agda discards this type in favor of some
new primitives.
Exceprts of the source code relevant to this section can be found in appendix
\ref{sec:app-cubical}.
Most of the source code related to this section is implemented in
\cite{cubical-demo}; it can be browsed as hyperlinked and
syntax-highlighted HTML online. The links can be found in the
beginning of chapter \ref{ch:implementation}.
\subsection{The equality type}
The usual notion of judgmental equality says that given a type $A \tp \MCU$ and
two points of $A$; $a_0, a_1 \tp A$ we can form the type:
The usual notion of propositional equality says that given a type $A \tp
\MCU$ and two points hereof $a_0, a_1 \tp A$ we can form the type:
%
\begin{align}
a_0 \equiv a_1 \tp \MCU
\end{align}
%
In Agda this is defined as an inductive data-type with the single constructor:
In Agda this is defined as an inductive data type with the single
constructor $\refl$ that for any $a \tp A$ gives:
%
\begin{align}
\refl \tp a \equiv a
\end{align}
%
For any $a \tp A$.
There also exist a related notion of \emph{heterogeneous} equality where allows
There also exist a related notion of \emph{heterogeneous} equality which allows
for equating points of different types. In this case given two types $A, B \tp
\MCU$ and two points $a \tp A$, $b \tp B$ we can construct the type:
%
@ -37,7 +39,8 @@ for equating points of different types. In this case given two types $A, B \tp
a \cong b \tp \MCU
\end{align}
%
This is likewise defined as an inductive data-type with a single constructors:
This likewise has the single constructor $\refl$ that for any $a \tp
A$ gives:
%
\begin{align}
\refl \tp a \cong a
@ -47,89 +50,131 @@ In Cubical Agda these two notions are paralleled with homogeneous- and
heterogeneous paths respectively.
%
\subsection{The path type}
In Cubical Agda judgmental equality is encapsulated with the type:
Propositional equality in Cubical Agda is encapsulated with the type:
%
\begin{equation}
\Path \tp (P \tp \I\MCU) → P\ 0 → P\ 1 → \MCU
\end{equation}
%
The special type $\I$ is called the index set. The index set can be
thought of simply as the interval on the real numbers from $0$ to $1$
(both inclusive). The family $P$ over $\I$ will be referred to as the
\nomenindex{path space} given some path $p \tp \Path\ P\ a\ b$. By
that token $P\ 0$ corresponds to the type at the left endpoint of $p$.
Likewise $P\ 1$ is the type at the right endpoint. The type is called
$\Path$ because the idea has roots in homotopy theory. The intuition
is that $\Path$ describes\linebreak[1] paths in $\MCU$. I.e.\ paths
between types. For a path $p$ the expression $p\ i$ can be thought of
as a \emph{point} on this path. The index $i$ describes how far along
the path one has moved. An inhabitant of $\Path\ P\ a_0\ a_1$ is a
(dependent) function from the index set to the path space:
%
$$
\Path \tp (P \tp I → \MCU) → P\ 0 → P\ 1 → \MCU
p \tp \prod_{i \tp \I} P\ i
$$
%
$I$ is a special data-type (\TODO{that also has special computational properties
AFAIK}) called the index set. $I$ can be thought of simply as the interval on
the real numbers from $0$ to $1$. $P$ is a family of types over the index set
$I$. I will sometimes refer to $P$ as the ``path-space'' of some path $p \tp
\Path\ P\ a\ b$. By this token $P\ 0$ then corresponds to the type at the
left-endpoint and $P\ 1$ as the type at the right-endpoint. The type is called
$\Path$ because it is connected with paths in homotopy theory. The intuition
behind this is that $\Path$ describes paths in $\MCU$ -- i.e. between types. For
a path $p$ for the point $p\ i$ the index $i$ describes how far along the path
one has moved. An inhabitant of $\Path\ P\ a_0\ a_1$ is a (dependent-)
function, $p$, from the index-space to the path-space:
%
$$
p \tp I \to P\ i
$$
%
Which must satisfy being judgmentally equal to $a_0$ (respectively $a_1$) at the
endpoints. I.e.:
This function must satisfy being judgmentally equal to $a_0$ at the
left endpoint and equal to $a_1$ at the other end. I.e.:
%
\begin{align*}
p\ 0 & = a_0 \\
p\ 1 & = a_1
\end{align*}
%
The notion of ``homogeneous equalities'' can be recovered by not letting the
path-space $P$ depend on it's argument:
The notion of \nomenindex{homogeneous equalities} is recovered when $P$ does not
depend on its argument. That is for $A \tp \MCU$ and $a_0, a_1 \tp A$ the
homogenous equality between $a_0$ and $a_1$ is the type:
%
$$
a_0 \equiv a_1 \defeq \Path\ (\lambda i \to A)\ a_0\ a_1
a_0 \equiv a_1 \defeq \Path\ (\lambda\;i \to A)\ a_0\ a_1
$$
%
For $A \tp \MCU$, $a_0, a_1 \tp A$. I will generally prefer to use the notation
$a_0 \equiv a_1$ when talking about non-dependent paths and use the notation
$\Path\ (\lambda i \to A)\ a_0\ a_1$ when the path-space is of particular
interest.
I will generally prefer to use the notation $a \equiv b$ when talking
about non-dependent paths and use the notation $\Path\ (\lambda\; i
\to P\ i)\ a\ b$ when the path space is of particular interest.
With this definition we can also recover reflexivity. That is, for any $A \tp
\MCU$ and $a \tp A$:
With this definition we can recover reflexivity. That is, for any $A
\tp \MCU$ and $a \tp A$:
%
\begin{equation}
\begin{aligned}
\refl & \tp \Path (\lambda i \to A)\ a\ a \\
\refl & \defeq \lambda i \to a
\refl & \tp a \equiv a \\
\refl & \defeq \lambda\; i \to a
\end{aligned}
\end{equation}
%
Or, in other terms; reflexivity is the path in $A$ that is $a$ at the left
endpoint as well as at the right endpoint. It is inhabited by the path which
stays constantly at $a$ at any index $i$.
Here the path space is $P \defeq \lambda\; i \to A$ and it satisfies
$P\ i = A$ definitionally. So to inhabit it is to give a path $\I \to
A$ that is judgmentally $a$ at both endpoints. This is satisfied by
the constant path; i.e.\ the path that is constantly $a$ at any index
$i \tp \I$.
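To make this concrete, the following is a minimal sketch of the
constant path in Cubical Agda. It re-derives the library's $\refl$ and
assumes only the primitive path type and the interval $\I$ as
described above (the exact module providing these depends on the
version of the cubical library):
%
\begin{verbatim}
-- Minimal sketch; _≡_ and the interval I are the path primitives
-- described in the text, provided by the cubical library.
refl : {A : Set} {a : A} → a ≡ a
refl {a = a} = λ i → a   -- constantly a at every index i
\end{verbatim}
%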
Paths have some other important properties, but they are not the focus of this
thesis. \TODO{Refer the reader somewhere for more info.}
It is also surprisingly easy to show functional extensionality.
Functional extensionality is the proposition that given a type $A \tp
\MCU$, a family of types $B \tp A \to \MCU$ and functions $f, g \tp
\prod_{a \tp A} B\ a$ gives:
%
\begin{equation}
\label{eq:funExt}
\funExt \tp \left(\prod_{a \tp A} f\ a \equiv g\ a \right) \to f \equiv g
\end{equation}
%
%% p = λ\; i a → p a i
So given $η \tp \prod_{a \tp A} f\ a \equiv g\ a$ we must give a path
$f \equiv g$. That is a function $\I \to \prod_{a \tp A} B\ a$. So let
$i \tp \I$ be given. We must now give an expression $\phi \tp
\prod_{a \tp A} B\ a$ that is judgmentally equal to $f$ when $i = 0$
and to $g$ when $i = 1$. This necessitates that the expression be a
lambda abstraction, so let $a \tp A$ be given. We can now apply $η$ to
$a$ and get the path $η\ a \tp f\ a \equiv g\ a$, whose point $η\ a\ i
\tp B\ a$ exactly satisfies the conditions for $\phi\ a$. In
conclusion \ref{eq:funExt} is
inhabited by the term:
%
\begin{equation*}
\funExt\ η \defeq λ\; i\ a → η\ a\ i
\end{equation*}
%
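As a sanity check, the reasoning above can be transcribed almost
verbatim into Cubical Agda. The following sketch assumes the path
primitives from the cubical library; the name \texttt{funExt} matches
the mathematical presentation but is not necessarily the library's
exact identifier:
%
\begin{verbatim}
-- Sketch of functional extensionality: η a : f a ≡ g a is a path,
-- so η a i : B a, and swapping the arguments i and a gives f ≡ g.
funExt : {A : Set} {B : A → Set} {f g : (a : A) → B a}
       → ((a : A) → f a ≡ g a) → f ≡ g
funExt η = λ i a → η a i
\end{verbatim}
%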
With $\funExt$ in place we can now construct a path between
$\var{zeroLeft}$ and $\var{zeroRight}$ -- the functions defined in the
introduction \S\ref{sec:functional-extensionality}:
%
\begin{align*}
p & \tp \var{zeroLeft} \equiv \var{zeroRight} \\
p & \defeq \funExt\ \var{zrn}
\end{align*}
%
Here $\var{zrn}$ is the proof from \ref{eq:zrn}.
%
\section{Homotopy levels}
In ITT all equality proofs are identical (in a closed context). This means that,
in some sense, any two inhabitants of $a \equiv b$ are ``equally good'' -- they
don't have any interesting structure. This is referred to as uniqueness of
identity proofs. Unfortunately this is orthogonal to univalence that only makes
sense in the absence of UIP.
In homotopy type theory we have a hierarchy of types based on their ``internal
structure''. At the bottom of this hierarchy we have the set of contractible
types:
In ITT all equality proofs are identical (in a closed context). This
means that, in some sense, any two inhabitants of $a \equiv b$ are
``equally good''. They do not have any interesting structure. This is
referred to as Uniqueness of Identity Proofs (UIP). Unfortunately it
is not possible to have a type theory with both univalence and UIP.
Instead in cubical Agda we have a hierarchy of types with an
increasing amount of homotopic structure. At the bottom of this
hierarchy is the set of contractible types:
%
\begin{equation}
\begin{aligned}
%% \begin{split}
& \isContr && \tp \MCU \to \MCU \\
& \isContr && \tp \MCU \to \MCU \\
& \isContr\ A && \defeq \sum_{c \tp A} \prod_{a \tp A} a \equiv c
%% \end{split}
\end{aligned}
\end{equation}
%
The first component of $\isContr\ A$ is called ``the center of contraction''.
Under the propositions-as-types interpretation of type-theory $\isContr\ A$ can
be thought of as ``the true proposition $A$''. It is a theorem that if a type is
contractible, then it is isomorphic to the unit-type $\top$.
Under the propositions-as-types interpretation of type theory $\isContr\ A$ can
be thought of as ``the true proposition $A$''. And indeed $\top$ is
contractible:
%
\begin{equation*}
(\var{tt} , \lambda\; x \to \refl) \tp \isContr\ \top
\end{equation*}
%
It is a theorem that if a type is contractible, then it is isomorphic to the
unit-type.
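In Agda notation the definition and the witness for the unit type can
be sketched as follows. This assumes a unit type \texttt{⊤} with
constructor \texttt{tt} (with η-equality) as well as Σ-types and
$\refl$ from the libraries, and may differ from the formalization's
exact phrasing:
%
\begin{verbatim}
isContr : Set → Set
isContr A = Σ A (λ c → (a : A) → a ≡ c)

-- ⊤ is contractible: tt is the centre of contraction, and every
-- a : ⊤ is judgmentally tt, so the constant path suffices.
⊤-isContr : isContr ⊤
⊤-isContr = tt , (λ a → refl)
\end{verbatim}
%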
The next step in the hierarchy is the set of mere propositions:
%
@ -140,15 +185,23 @@ The next step in the hierarchy is the set of mere propositions:
\end{aligned}
\end{equation}
%
$\isProp\ A$ can be thought of as the set of true and false propositions. It is
a result that if a mere proposition $A$ is inhabited, then so is it
contractible. If it is not inhabited it is equivalent to the empty-type (or
false proposition).\TODO{Cite!!}
One can think of $\isProp\ A$ as the set of true and false propositions. And
indeed both $\top$ and $\bot$ are propositions:
%
\begin{align*}
(λ\; \var{tt}\ \var{tt} → \refl) & \tp \isProp\ \top \\
(λ\; \varnothing\ \varnothing) & \tp \isProp\ \bot
\end{align*}
%
The term $\varnothing$ is used here to denote an impossible pattern. It is a
theorem that if a mere proposition $A$ is inhabited, then so is it contractible.
If it is not inhabited it is equivalent to the empty-type (or false
proposition).
I will refer to a type $A \tp \MCU$ as a \emph{mere} proposition if I want to
I will refer to a type $A \tp \MCU$ as a \emph{mere proposition} if I want to
stress that we have $\isProp\ A$.
Then comes the set of homotopical sets:
The next step in the hierarchy is the set of homotopical sets:
%
\begin{equation}
\begin{aligned}
@ -157,12 +210,16 @@ Then comes the set of homotopical sets:
\end{aligned}
\end{equation}
%
At this point it should be noted that the term ``set'' is somewhat conflated;
there is the notion of sets from set-theory, in Agda types are denoted
\texttt{Set}. I will use it consistently to refer to a type $A$ as a set exactly
if $\isSet\ A$ is inhabited.
I will not give an example of a set at this point. It turns out that
proving e.g.\ $\isSet\ \bN$ directly is not so straightforward (see
\cite[\S3.1.4]{hott-2013}). Hedberg's theorem states that any type
with decidable equality is a set. There will be examples of sets later
in this report. At this point it should be noted that the term ``set''
is somewhat conflated; there is the notion of sets from set theory.
In Agda types are denoted \texttt{Set}. I will consistently use the
term to refer to a type $A$ as a set exactly if $\isSet\ A$ is
inhabited.
The next step in the hierarchy is, as the reader might've guessed, the type:
As the reader may have guessed the next step in the hierarchy is the type:
%
\begin{equation}
\begin{aligned}
@ -171,10 +228,12 @@ The next step in the hierarchy is, as the reader might've guessed, the type:
\end{aligned}
\end{equation}
%
And so it continues. In fact we can generalize this family of types by indexing
them with a natural number. For historical reasons, though, the bottom of the
hierarchy, the contractible types, is said to be a \nomen{-2-type}, propositions
are \nomen{-1-types}, (homotopical) sets are \nomen{0-types} and so on\ldots
So it continues. In fact we can generalize this family of types by
indexing them with a natural number. For historical reasons, though,
the bottom of the hierarchy, the contractible types, is said to be a
\nomen{-2-type}{homotopy levels}, propositions are
\nomen{-1-types}{homotopy levels}, (homotopical) sets are
\nomen{0-types}{homotopy levels} and so on\ldots
Just as with paths, homotopical sets are not at the center of focus for this
thesis. But I mention here some properties that will be relevant for this
@ -183,91 +242,174 @@ exposition:
Proposition: Homotopy levels are cumulative. That is, if $A \tp \MCU$ has
homotopy level $n$ then it also has homotopy level $n + 1$.
Let $\left\Vert A \right\Vert = n$ denote that the level of $A$ is $n$.
Proposition: For any homotopic level $n$ this is a mere proposition.
For any level $n$ it is the case that to be of level $n$ is a mere proposition.
%
\section{A few lemmas}
Rather than getting into the nitty-gritty details of Agda I venture to take a
more ``combinator-based'' approach. That is, I will use theorems about paths
already that have already been formalized. Specifically the results come from
the Agda library \texttt{cubical} (\TODO{Cite}). I have used a handful of
results from this library as well as contributed a few lemmas myself.\footnote{The module \texttt{Cat.Prelude} lists the upstream dependencies. As well my contribution to \texttt{cubical} can be found in the git logs \TODO{Cite}.}
Rather than getting into the nitty-gritty details of Agda I venture to
take a more ``combinator-based'' approach. That is I will use
theorems about paths that have already been formalized.
Specifically the results come from the Agda library \texttt{cubical}
(\cite{cubical-demo}). I have used a handful of results from this
library as well as contributed a few lemmas myself%
\footnote{The module \texttt{Cat.Prelude} lists the upstream
dependencies. My contribution to \texttt{cubical} can as well be
found in the git logs which are available at
\hrefsymb{https://github.com/Saizan/cubical-demo}{\texttt{https://github.com/Saizan/cubical-demo}}.
}.
These theorems are all purely related to homotopy theory and cubical Agda and as
such not specific to the formalization of Category Theory. I will present a few
of these theorems here, as they will be used later in chapter
\ref{ch:implementation} throughout.
These theorems are all purely related to homotopy type theory and as
such not specific to the formalization of Category Theory. I will
present a few of these theorems here as they will be used throughout
chapter \ref{ch:implementation}. They should also give the reader some
intuition about the path type.
\subsection{Path induction}
\label{sec:pathJ}
The induction principle for paths intuitively gives us a way to reason about a
type-family indexed by a path by only considering if said path is $\refl$ (the
``base-case''). For \emph{based path induction}, that equality is \emph{based}
at some element $a \tp A$.
The induction principle for paths intuitively gives us a way to reason
about a type family indexed by a path by only considering if said path
is $\refl$ (the \nomen{base case}{path induction}). For \emph{based
path induction}, that equality is \emph{based} at some element $a
\tp A$.
Let a type $A \tp \MCU$ and an element of the type $a \tp A$ be given. $a$ is said to be the base of the induction. Given a family of types:
\pagebreak[3]
\begin{samepage}
Let a type $A \tp \MCU$ and an element of the type $a \tp A$ be
given. $a$ is said to be the base of the induction.\linebreak[3] Given
a family of types:
%
$$
P \tp \prod_{a' \tp A} \prod_{p \tp a ≡ a'} \MCU
D \tp \prod_{b \tp A} \prod_{p \tp a ≡ b} \MCU
$$
%
And an inhabitant of $P$ at $\refl$:
and an inhabitant of $D$ at $\refl$:
%
$$
p \tp P\ a\ \refl
d \tp D\ a\ \refl
$$
%
We have the function:
%
$$
\pathJ\ P\ p \tp \prod_{a' \tp A} \prod_{p \tp a ≡ a'} P\ a\ p
$$
\begin{equation}
\pathJ\ D\ d \tp \prod_{b \tp A} \prod_{p \tp a ≡ b} D\ b\ p
\end{equation}
\end{samepage}%
A simple application of $\pathJ$ is for proving that $\var{sym}$ is an
involution. Namely for any set $A \tp \MCU$, points $a, b \tp A$ and a path
between them $p \tp a \equiv b$:
%
\begin{equation}
\label{eq:sym-invol}
\var{sym}\ (\var{sym}\ p) ≡ p
\end{equation}
%
The proof will be by induction on $p$ and will be based at $a$. That
is $D$ will be the family:
%
\begin{align*}
D & \tp \prod_{b' \tp A} \prod_{p \tp a ≡ b'} \MCU \\
D\ b'\ p' & \defeq \var{sym}\ (\var{sym}\ p') ≡ p'
\end{align*}
%
The base case will then be:
%
\begin{align*}
d & \tp \var{sym}\ (\var{sym}\ \refl) ≡ \refl \\
d & \defeq \refl
\end{align*}
%
The reason $\refl$ proves this is that $\var{sym}\ \refl = \refl$ holds
definitionally. In summary \ref{eq:sym-invol} is inhabited by the term:
%
\begin{align*}
\pathJ\ D\ d\ b\ p
\tp
\var{sym}\ (\var{sym}\ p) ≡ p
\end{align*}
%
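The proof sketched above can be transcribed into Agda roughly as
follows. Here \texttt{pathJ} is assumed to be in scope with the
signature described at the start of this subsection, and \texttt{sym}
is the usual path reversal; the names follow the cubical library but
the exact signatures may differ:
%
\begin{verbatim}
sym : {A : Set} {a b : A} → a ≡ b → b ≡ a
sym p = λ i → p (~ i)

symInvolutive : {A : Set} {a b : A} (p : a ≡ b)
              → sym (sym p) ≡ p
symInvolutive {A} {a} {b} p = pathJ D d b p
  where
  -- The family we do induction over, based at a.
  D : (b' : A) → a ≡ b' → Set
  D b' p' = sym (sym p') ≡ p'
  -- Base case: sym (sym refl) is definitionally refl.
  d : D a refl
  d = refl
\end{verbatim}
%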
Another application of $\pathJ$ is for proving associativity of $\trans$. That
is, given a type $A \tp \MCU$, elements of $A$, $a, b, c, d \tp A$ and paths
between them $p \tp a \equiv b$, $q \tp b \equiv c$ and $r \tp c \equiv d$ we
have the following:
%
\begin{equation}
\label{eq:cum-trans}
\trans\ p\ (\trans\ q\ r) ≡ \trans\ (\trans\ p\ q)\ r
\end{equation}
%
In this case the induction will be based at $c$ (the left-endpoint of $r$) and
over the family:
%
\begin{align*}
T & \tp \prod_{d' \tp A} \prod_{r' \tp c ≡ d'} \MCU \\
T\ d'\ r' & \defeq \trans\ p\ (\trans\ q\ r') ≡ \trans\ (\trans\ p\ q)\ r'
\end{align*}
%
The base case is proven with $t$ which is defined as:
%
\begin{align*}
\trans\ p\ (\trans\ q\ \refl) & ≡ \trans\ p\ q \\
& ≡ \trans\ (\trans\ p\ q)\ \refl
\end{align*}
%
Here we have used the proposition $\trans\ p\ \refl \equiv p$ without proof. In
conclusion \ref{eq:cum-trans} is inhabited by the term:
%
\begin{align*}
\pathJ\ T\ t\ d\ r
\end{align*}
%
We shall see another application of path induction in \ref{eq:pathJ-example}.
\subsection{Paths over propositions}
\label{sec:lemPropF}
Another very useful combinator is $\lemPropF$:
To `promote' this to a dependent path we can use another useful combinator;
$\lemPropF$. Given a type $A \tp \MCU$ and a type family on $A$; $P \tp A \to
\MCU$. Let $\var{propP} \tp \prod_{x \tp A} \isProp\ (P\ x)$ be the proof that
$P$ is a mere proposition for all elements of $A$. Furthermore say we have a
path between some two elements in $A$; $p \tp a_0 \equiv a_1$ then we can built
a heterogeneous path between any two elements of $p_0 \tp P\ a_0$ and $p_1 \tp
P\ a_1$:
Another very useful combinator is $\lemPropF$: Given a type $A \tp
\MCU$ and a type family on $A$; $D \tp A \to \MCU$. Let $\var{propD}
\tp \prod_{x \tp A} \isProp\ (D\ x)$ be the proof that $D$ is a mere
proposition for all elements of $A$. Furthermore say we have a path
between some two elements in $A$; $p \tp a_0 \equiv a_1$ then we can
built a heterogeneous path between any two elements of $d_0 \tp
D\ a_0$ and $d_1 \tp D\ a_1$.
%
$$
\lemPropF\ \var{propP}\ p \defeq \Path\ (\lambda\; i \mto P\ (p\ i))\ p_0\ p_1
\lemPropF\ \var{propD}\ p \tp \Path\ (\lambda\; i \mto D\ (p\ i))\ d_0\ d_1
$$
%
This is quite a mouthful. So let me try to show how this is a very general and
useful result.
Note that $d_0$ and $d_1$, though points of the same family, have
different types. This is quite a mouthful, so let me try to show how
this is a very general and useful result.
Often when proving equalities between elements of some dependent types
$\lemPropF$ can be used to boil this complexity down to showing that the
dependent parts of the type are mere propositions. For instance, saw we have a type:
$\lemPropF$ can be used to boil this complexity down to showing that
the dependent parts of the type are mere propositions. For instance
say we have a type:
%
$$
T \defeq \sum_{a \tp A} P\ a
T \defeq \sum_{a \tp A} D\ a
$$
%
For some proposition $P \tp A \to \MCU$. If we want to prove $t_0 \equiv t_1$
for two elements $t_0, t_1 \tp T$ then this will be a pair of paths:
for some proposition $D \tp A \to \MCU$. That is we have $\var{propD}
\tp \prod_{a \tp A} \isProp\ (D\ a)$. If we want to prove $t_0 \equiv
t_1$ for two elements $t_0, t_1 \tp T$ then this will be a pair of
paths:
%
%
\begin{align*}
p \tp & \fst\ t_0 \equiv \fst\ t_1 \\
& \Path\ (\lambda i \to P\ (p\ i))\ \snd\ t_0 \equiv \snd\ t_1
& \Path\ (\lambda\; i \to D\ (p\ i))\ (\snd\ t_0)\ (\snd\ t_1)
\end{align*}
%
Here $\lemPropF$ directly allow us to prove the latter of these:
Here $\lemPropF$ directly allows us to prove the latter of these given
that we have already provided $p$.
%
$$
\lemPropF\ \var{propP}\ p
\tp \Path\ (\lambda i \to P\ (p\ i))\ \snd\ t_0 \equiv \snd\ t_1
\lemPropF\ \var{propD}\ p
\tp \Path\ (\lambda\; i \to D\ (p\ i))\ (\snd\ t_0)\ (\snd\ t_1)
$$
%
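The following sketch shows how this looks in Agda for the Σ-type
example. The signature for \texttt{lemPropF} is only assumed here (via
a postulate) and may differ from the library's actual definition;
\texttt{Σ}, \texttt{fst}, \texttt{snd} (with η-equality) and
\texttt{PathP} are assumed to be in scope from the libraries:
%
\begin{verbatim}
isProp : Set → Set
isProp A = (a b : A) → a ≡ b

postulate
  -- Assumed signature for lemPropF; the library version may order
  -- its arguments differently.
  lemPropF : {A : Set} {D : A → Set}
           → ((x : A) → isProp (D x))
           → {a0 a1 : A} (p : a0 ≡ a1)
           → {d0 : D a0} {d1 : D a1}
           → PathP (λ i → D (p i)) d0 d1

-- A path between the first components together with (via lemPropF)
-- a dependent path between the second components gives a path of
-- pairs, since the second component is a mere proposition.
pairPath : {A : Set} {D : A → Set}
         → ((x : A) → isProp (D x))
         → (t0 t1 : Σ A D)
         → fst t0 ≡ fst t1
         → t0 ≡ t1
pairPath propD t0 t1 p =
  λ i → p i , lemPropF propD p {d0 = snd t0} {d1 = snd t1} i
\end{verbatim}
%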
\subsection{Functions over propositions}
\label{sec:propPi}
\label{sec:propPi}%
$\prod$-types preserve propositionality when the co-domain is always a
proposition.
%
@ -277,9 +419,9 @@ $$
\subsection{Pairs over propositions}
\label{sec:propSig}
%
$\sum$-types preserve propositionality whenever it's first component is a
proposition, and it's second component is a proposition for all points of in the
left type.
$\sum$-types preserve propositionality whenever its first component is
a proposition and its second component is a proposition for all
points of the left type.
%
$$
\mathit{propSig} \tp \isProp\ A \to \left(\prod_{a \tp A} \isProp\ (P\ a)\right) \to \isProp\ \left(\sum_{a \tp A} P\ a\right)
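To illustrate how such a combinator is proven, here is a sketch of
$\mathit{propPi}$ in Cubical Agda; the definition of \texttt{isProp}
and the path primitives are assumed as above, and the name is not
necessarily the library's exact identifier:
%
\begin{verbatim}
isProp : Set → Set
isProp A = (a b : A) → a ≡ b

-- A pointwise proof that the codomain is a proposition yields a
-- proof that the Π-type is a proposition.
propPi : {A : Set} {P : A → Set}
       → ((a : A) → isProp (P a))
       → isProp ((a : A) → P a)
propPi pP f g = λ i a → pP a (f a) (g a) i
\end{verbatim}
%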

View File

@ -1,74 +1,139 @@
\chapter{Perspectives}
\section{Discussion}
In the previous chapter the practical aspects of proving things in Cubical Agda
were highlighted. I also demonstrated the usefulness of separating ``laws'' from
``data''. One of the reasons for this is that dependencies within types can lead
to very complicated goals. One technique for alleviating this was to prove that
certain types are mere propositions.
In the previous chapter the practical aspects of proving things in
Cubical Agda were highlighted. I also demonstrated the usefulness of
separating ``laws'' from ``data''. One of the reasons for this is that
dependencies within types can lead to very complicated goals. One
technique for alleviating this was to prove that certain types are
mere propositions.
\subsection{Computational properties}
Another aspect (\TODO{That I actually didn't highlight very well in the previous
chapter}) is the computational nature of paths. Say we have formalized this
common result about monads:
The new contribution of cubical Agda is that it has a constructive
proof of functional extensionality\index{functional extensionality}
and univalence\index{univalence}. This means in particular that the
type checker can reduce terms defined with these theorems. One
interesting question is how much this has influenced the development.
In particular, having a functional extensionality that ``computes''
should simplify some proofs.
\TODO{Some equation\ldots}
I have tested this by using a feature of Agda where one can mark
certain bindings as being \emph{abstract}. This means that the
type-checker will not try to reduce that term further during type
checking. I tried making univalence and functional extensionality
abstract. It turns out that the conversion behaviour of univalence is
not used anywhere. For functional extensionality there are two places
in the whole solution where the reduction behaviour is used to
simplify some proofs. This is in showing that the maps between the
two formulations of monads are inverses. See the notes in this
module:
%
\begin{center}
\sourcelink{Cat.Category.Monad.Voevodsky}
\end{center}
%
By transporting this to the Kleisli formulation we get a result that we can use
to compute with. This is particularly useful because the Kleisli formulation
will be more familiar to programmers e.g. those coming from a background in
Haskell. Whereas the theory usually talks about monoidal monads.
I will not reproduce it in full here as the type is quite involved.
Instead I have put this in a source listing in
\ref{app:abstract-funext}. The method used to find in which places the
computational behaviour of these proofs is needed has the caveat of
only working for places that directly or transitively use these two
proofs. Fortunately though the code is structured in such a way that
this is the case. In conclusion the way I have structured these proofs
means that the computational behaviour of functional extensionality
and univalence has not been so relevant.
\TODO{Mention that with postulates we cannot do this}
Barring this the computational behaviour of paths can still be useful.
E.g.\ if a programmer wants to reuse functions that operate on
monoidal monads to work with a monad in the Kleisli form that the
programmer has specified. To make this idea concrete, say we are
given some function $f \tp \Kleisli \to T$. Having a path $p
\tp \Monoidal \equiv \Kleisli$ induces a map $\coe\ p \tp \Monoidal
\to \Kleisli$. We can compose $f$ with this map to get $f \comp
\coe\ p \tp \Monoidal \to T$. Of course, since that map was
constructed with an isomorphism, these maps already exist and could be
used directly. So this is arguably only interesting when one also
wants to prove properties of applying such functions.
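For concreteness, $\coe$ can be sketched in Cubical Agda using the
primitive \texttt{transp} operation; this is essentially the
coercion/transport function of the cubical library, though the exact
name and phrasing there may differ:
%
\begin{verbatim}
-- Coerce along a path between types: a path A ≡ B in the universe
-- induces a function A → B (transp with φ = i0 places no
-- constancy constraints on the line of types).
coe : {A B : Set} → A ≡ B → A → B
coe p a = transp (λ i → p i) i0 a
\end{verbatim}
%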
\subsection{Reusability of proofs}
The previous example also illustrate how univalence unifies two otherwise
disparate areas: The category-theoretic study of monads; and monads as in
functional programming. Univalence thus allows one to reuse proofs. You could
say that univalence gives the developer two proofs for the price of one.
The previous example illustrates how univalence unifies two otherwise
disparate areas: the category-theoretic study of monads and monads as
in functional programming. Univalence thus allows one to reuse proofs.
You could say that univalence gives the developer two proofs for the
price of one. As an illustration of this I proved that monads are
groupoids. I initially proved this for the Kleisli
formulation\footnote{Actually doing this directly turned out to be
tricky as well, so I defined an equivalent formulation which was not
formulated with a record, but purely with $\sum$-types.}. Since the
two formulations are equal under univalence, substitution directly
gives us that this also holds for the monoidal formulation. This of
course generalizes to any family $P \tp 𝒰 → 𝒰$ where $P$ is inhabited
at either formulation (i.e.\ either $P\ \Monoidal$ or $P\ \Kleisli$
holds).
The introduction (section \ref{sec:context}) mentioned an often
employed-technique for enabling extensional equalities is to use the
setoid-interpretation. Nowhere in this formalization has this been necessary,
$\Path$ has been used globally in the project as propositional equality. One
interesting place where this becomes apparent is in interfacing with the Agda
standard library. Multiple definitions in the Agda standard library have been
designed with the setoid-interpretation in mind. E.g. the notion of ``unique
existential'' is indexed by a relation that should play the role of
propositional equality. Likewise for equivalence relations, they are indexed,
not only by the actual equivalence relation, but also by another relation that
serve as propositional equality.
%% Unfortunately we cannot use the definition of equivalences found in the
%% standard library to do equational reasoning directly. The reason for this is
%% that the equivalence relation defined there must be a homogenous relation,
%% but paths are heterogeneous relations.
The introduction (section \S\ref{sec:context}) mentioned that a
typical way of getting access to functional extensionality is to work
with setoids. Nowhere in this formalization has this been necessary,
$\Path$ has been used globally in the project for propositional
equality. One interesting place where this becomes apparent is in
interfacing with the Agda standard library. Multiple definitions in
the Agda standard library have been designed with the
setoid-interpretation in mind. E.g.\ the notion of \emph{unique
existential} is indexed by a relation that should play the role of
propositional equality. Equivalence relations are likewise indexed,
not only by the actual equivalence relation but also by another
relation that serves as propositional equality.
%% Unfortunately we cannot use the definition of equivalences found in
%% the standard library to do equational reasoning directly. The
%% reason for this is that the equivalence relation defined there must
%% be a homogenous relation, but paths are heterogeneous relations.
In the formalization at present a significant amount of energy has been put
towards proving things that would not have been needed in classical Agda. The
proofs that some given type is a proposition were provided as a strategy to
simplify some otherwise very complicated proofs (e.g.
\ref{eq:proof-prop-IsPreCategory} and \label{eq:productPath}). Often these
proofs would not be this complicated. If the J-rule holds definitionally the
proof-assistant can help simplify these goals considerably. The lack of the
J-rule has a significant impact on the complexity of these kinds of proofs.
In the formalization at present a significant amount of energy has
been put towards proving things that would not have been needed in
classical Agda. The proofs that some given type is a proposition were
provided as a strategy to simplify some otherwise very complicated
proofs (e.g.\ \ref{eq:proof-prop-IsPreCategory}
and \ref{eq:productPath}). Often these proofs would not be this
complicated. If the J-rule holds definitionally the proof-assistant
can help simplify these goals considerably. The lack of the J-rule has
a significant impact on the complexity of these kinds of proofs.
\TODO{Universe levels.}
\subsection{Motifs}
An oft-used technique in this development is using based path
induction to prove certain properties. One particular challenge that
arises when doing so is that Agda is not able to automatically infer
the family that one wants to do induction over. For instance in the
proof $\var{sym}\ (\var{sym}\ p) ≡ p$ from \ref{eq:sym-invol} the
family that we chose to do induction over was $D\ b'\ p' \defeq
\var{sym}\ (\var{sym}\ p') ≡ p'$. However, if one interactively tries
to fill in this hole, all the information that Agda can provide is that
one must provide an element of $𝒰$. Agda could be more helpful in this
context, perhaps even infer this family in some situations. In this
very simple example this is of course not a big problem, but there are
examples in the source code where this gets more involved.
\section{Future work}
\subsection{Agda \texttt{Prop}}
Jesper Cockx' work extending the universe-level-laws for Agda and the
\texttt{Prop}-type.
\subsection{Compiling Cubical Agda}
\label{sec:compiling-cubical-agda}
Compilation of program written in Cubical Agda is currently not supported. One
issue here is that the backends does not provide an implementation for the
cubical primitives (such as the path-type). This means that even though the
path-type gives us a computational interpretation of functional extensionality,
univalence, transport, etc., we do not have a way of actually using this to
compile our programs that use these primitives. It would be interesting to see
practical applications of this. The path between monads that this library
exposes could provide one particularly interesting case-study.
Compilation of programs written in Cubical Agda is currently not
supported. One issue here is that the backends do not provide an
implementation of the cubical primitives (such as the path type).
This means that even though the path-type gives us a computational
interpretation of functional extensionality, univalence, transport,
etc., we do not have a way of actually using this to compile our
programs that use these primitives. It would be interesting to see
practical applications of this.
\subsection{Higher inductive types}
This library has not explored the usefulness of higher inductive types in the
context of Category Theory.
\subsection{Proving laws of programs}
Another interesting thing would be to use the Kleisli formulation of
monads to prove properties of functional programs. The existence of
univalence will make it possible to re-use proofs stated in terms of
the monoidal formulation in this setting.
%% \subsection{Higher inductive types}
%% This library has not explored the usefulness of higher inductive types
%% in the context of Category Theory.
\subsection{Initiality conjecture}
A fellow student at Chalmers, Andreas Källberg, is currently working
on proving the initiality conjecture. He will be using this library
to do so.

View File

@ -0,0 +1,9 @@
App. 2 in HoTT gives typing rule for pathJ including a computational
rule for it.
If you have this computational rule definitionally, then you wouldn't
need to use `pathJprop`.
In discussion-section I mention HITs. I should remove this or come up
with a more elaborate example of something you could do, e.g.
something with pushouts in the category of sets.

View File

@ -3,7 +3,7 @@ I've written this as an appendix because 1) the aim of the thesis changed
drastically from the planning report/proposal 2) partly I'm not sure how to
structure my thesis.
My work so far has very much focused on the formalization, i.e. coding. It's
My work so far has very much focused on the formalization, i.e.\ coding. It's
unclear to me at this point what I should have in the final report. Here I will
describe what I have managed to formalize so far and what outstanding challenges
I'm facing.
@ -57,8 +57,8 @@ composition and identity, laws; preservation of identity and composition) plus
the extra condition that it is univalent - namely that you can get an equality
of two objects from an isomorphism.
I make no distinction between a pre-category and a real category (as in the
[HoTT]-sense). A pre-category in my implementation would be a category sans the
I make no distinction between a pre category and a real category (as in the
[HoTT]-sense). A pre category in my implementation would be a category sans the
witness to univalence.
I also prove that being a category is a proposition. This gives rise to an

File diff suppressed because it is too large Load Diff

View File

@ -1,205 +1,266 @@
\chapter{Introduction}
Functional extensionality and univalence are not expressible in
\nomen{Intensional Martin-Löf Type Theory} (ITT). This poses a severe limitation
on both i.\ what is \emph{provable} and ii.\ the \emph{re-usability} of proofs.
Recent developments have, however, resulted in \nomen{Cubical Type Theory} (CTT)
which permits a constructive proof of these two important notions.
This thesis is a case study in the application of cubical Agda to the
formalization of category theory. At the center of this is the notion
of \nomenindex{equality}. There are two pervasive notions of equality
in type theory: \nomenindex{judgmental equality} and
\nomenindex{propositional equality}. Judgmental equality is a property
of the type system. Propositional equality on the other hand is
usually defined \emph{within} the system. When introducing
definitions this report will use the symbol $\defeq$. Judgmental
equalities will be denoted with $=$ and for propositional equalities
the notation $\equiv$ is used.
Furthermore an extension has been implemented for the proof assistant Agda
(\cite{agda}, \cite{cubical-agda}) that allows us to work in such a ``cubical
setting''. This thesis will explore the usefulness of this extension in the
context of category theory.
The rules of judgmental equality are related to $β$- and
$η$-reduction, which give a notion of computation in a given type
theory.
%
There are some properties that one usually wants judgmental equality to
satisfy. It must be \nomenindex{sound}, enjoy \nomenindex{canonicity}
and be a \nomenindex{congruence relation}. Soundness means that things
judged to be equal are equal with respect to the \nomenindex{model}
of the theory or the \emph{meta theory}. It must be a congruence
relation, because otherwise the relation would not adhere to
our notion of equality; e.g.\ one would not be able to conclude
$f\ x \equiv f\ y$ from $x \equiv y$. Canonicity means
that any well typed term evaluates to a \emph{canonical} form. For
example, for a closed term $e \tp \bN$, it will be the case that $e$
reduces to $n$ applications of $\mathit{suc}$ to $0$ for some $n$;
i.e.\ $e = \mathit{suc}^n\ 0$. Without canonicity terms in the
language can get ``stuck'', meaning that they do not reduce to a
canonical form.
For a system to work as a programming language it is necessary for
judgmental equality to be \nomenindex{decidable}. Being decidable
simply means that an algorithm exists to decide whether two terms
are equal. For any practical implementation, the decision procedure must
also be efficiently computable.
For propositional equality the decidability requirement is relaxed. It
is not in general possible to decide the correctness of logical
propositions (cf.\ Hilbert's \emph{Entscheidungsproblem}).
There are two flavors of type-theory. \emph{Intensional-} and
\emph{extensional-} type theory (ITT and ETT respectively). Identity
types in extensional type theory are required to be
\nomen{propositions}{proposition}. That is, a type with at most one
inhabitant. In extensional type theory the principle of reflection
%
$$a ≡ b → a = b$$
%
is enough to make type checking undecidable. This report focuses on
Agda, which at a glance can be thought of as a version of intensional
type theory. Pattern-matching in regular Agda lets one prove
\nomenindex{Uniqueness of Identity Proofs} (UIP). UIP states that any
two identity proofs are propositionally identical.
The usual notion of propositional equality in ITT is quite
restrictive. In the next section a few motivating examples will be
presented that highlight this. There exist techniques to circumvent these
problems, as we shall see. This thesis will explore an extension to
Agda that redefines the notion of propositional equality and as such
is an alternative to these other techniques. The extension is called
cubical Agda. Cubical Agda drops UIP, since UIP is incompatible with
univalence; in return one gains both
\nomenindex{functional extensionality} and \nomenindex{univalence}.
What makes cubical Agda particularly interesting is that it gives a
\emph{constructive} interpretation of univalence. What all this means
will be elaborated in the following sections.
%
\section{Motivating examples}
%
In the following two sections I present two examples that illustrate some
limitations inherent in ITT and -- by extension -- Agda.
In the following two sections I present two examples that illustrate
some limitations inherent in ITT and, by extension, Agda.
%
\subsection{Functional extensionality}
\label{sec:functional-extensionality}%
Consider the functions:
%
\begin{multicols}{2}
\noindent
\begin{equation*}
f \defeq (n \tp \bN) \mapsto (0 + n \tp \bN)
\end{equation*}
\begin{equation*}
g \defeq (n \tp \bN) \mapsto (n + 0 \tp \bN)
\end{equation*}
\end{multicols}
\begin{align*}%
\var{zeroLeft} & \defeq λ\; (n \tp \bN) \to (0 + n \tp \bN) \\
\var{zeroRight} & \defeq λ\; (n \tp \bN) \to (n + 0 \tp \bN)
\end{align*}%
%
$n + 0$ is \nomen{definitionally} equal to $n$, which we write as $n + 0 = n$.
This is also called \nomen{judgmental} equality. We call it definitional
equality because the \emph{equality} arises from the \emph{definition} of $+$
which is:
The term $n + 0$ is \nomenindex{definitionally} equal to $n$, which we
write as $n + 0 = n$. This is also called \nomenindex{judgmental
equality}. We call it definitional equality because the
\emph{equality} arises from the \emph{definition} of $+$, which is:
%
\newcommand{\suc}[1]{\mathit{suc}\ #1}
\begin{align*}
+ & \tp \bN \to \bN \to \bN \\
n + 0 & \defeq n \\
+ & \tp \bN \to \bN \to \bN \\
n + 0 & \defeq n \\
n + (\suc{m}) & \defeq \suc{(n + m)}
\end{align*}
%
Note that $0 + n$ is \emph{not} definitionally equal to $n$. $0 + n$ is in
normal form. I.e.; there is no rule for $+$ whose left-hand-side matches this
expression. We \emph{do}, however, have that they are \nomen{propositionally}
equal, which we write as $n + 0 \equiv n$. Propositional equality means that
there is a proof that exhibits this relation. Since equality is a transitive
relation we have that $n + 0 \equiv 0 + n$.
Unfortunately we don't have $f \equiv g$.\footnote{Actually showing this is
outside the scope of this text. Essentially it would involve giving a model
for our type theory that validates all our axioms but where $f \equiv g$ is
not true.} There is no way to construct a proof asserting the obvious
equivalence of $f$ and $g$ -- even though we can prove them equal for all
points. This is exactly the notion of equality of functions that we are
interested in; that they are equal for all inputs. We call this
\nomen{point-wise equality}, where the \emph{points} of a function refers
to it's arguments.
In the context of category theory functional extensionality is e.g. needed to
show that representable functors are indeed functors. The representable functor
for a category $\bC$ and a fixed object in $A \in \bC$ is defined to be:
Note that $0 + n$ is \emph{not} definitionally equal to $n$. This is
because $0 + n$ is in normal form. I.e.\ there is no rule for $+$
whose left hand side matches this expression. We do, however, have that
they are \nomen{propositionally}{propositional equality} equal, which
we write as $n \equiv n + 0$. Propositional equality means that there
is a proof that exhibits this relation. We can do induction over $n$
to prove this:
%
\begin{align*}
\fmap \defeq X \mapsto \Hom_{\bC}(A, X)
\end{align*}
\begin{align}
\label{eq:zrn}
\begin{split}
\var{zrn}\ & \tp ∀ n → n ≡ \var{zeroRight}\ n \\
\var{zrn}\ \var{zero} & \defeq \var{refl} \\
\var{zrn}\ (\var{suc}\ n) & \defeq \var{cong}\ \var{suc}\ (\var{zrn}\ n)
\end{split}
\end{align}
%
The proof obligation that this satisfies the identity law of functors
($\fmap\ \idFun \equiv \idFun$) thus becomes:
%
\begin{align*}
\Hom(A, \idFun_{\bX}) = (g \mapsto \idFun \comp g) \equiv \idFun_{\Sets}
\end{align*}
%
One needs functional extensionality to ``go under'' the function arrow and apply
the (left) identity law of the underlying category to prove $\idFun \comp g
\equiv g$ and thus close the goal.
This shows that zero is a right neutral element (hence the name
$\var{zrn}$). An analogous induction shows that zero is also a left
neutral element, i.e.\ $\forall n \to \var{zeroLeft}\ n \equiv n$, and
since equality is a transitive relation we then have
$\forall n \to \var{zeroLeft}\ n \equiv \var{zeroRight}\ n$.
Unfortunately we don't have $\var{zeroLeft} \equiv \var{zeroRight}$.
There is no way to construct a proof asserting the obvious equivalence
of $\var{zeroLeft}$ and $\var{zeroRight}$. Actually showing this is
outside the scope of this text. It would essentially involve giving a
model for our type theory that validates all our axioms but where
$\var{zeroLeft} \equiv \var{zeroRight}$ is not true. We cannot show
that they are equal even though we can prove them equal for all
points. This is exactly the notion of equality that we are interested
in for functions: Functions are considered equal when they are equal
for all inputs. This is called \nomenindex{pointwise equality} where
\emph{points} of a function refer to its arguments.
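For contrast, once the path type of Cubical Agda is available this
gap closes, since functional extensionality becomes provable. A
minimal sketch (assuming the $\funExt$ of the cubical library and a
proof $\var{zln}$, by the same induction as \ref{eq:zrn}, that zero is
also left neutral):
%
\begin{Verbatim}
zln : ∀ n → zeroLeft n ≡ n
zln zero    = refl
zln (suc n) = cong suc (zln n)

pointwise : ∀ n → zeroLeft n ≡ zeroRight n
pointwise n = trans (zln n) (zrn n)

-- funExt is definable from the path type
zeroLeft≡zeroRight : zeroLeft ≡ zeroRight
zeroLeft≡zeroRight = funExt pointwise
\end{Verbatim}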
%
\subsection{Equality of isomorphic types}
%
Let $\top$ denote the unit type -- a type with a single constructor. In the
propositions-as-types interpretation of type theory $\top$ is the proposition
that is always true. The type $A \x \top$ and $A$ has an element for each $a :
A$. So in a sense they have the same shape (Greek; \nomen{isomorphic}). The
second element of the pair does not add any ``interesting information''. It can
be useful to identify such types. In fact, it is quite commonplace in
mathematics. Say we look at a set $\{x \mid \phi\ x \land \psi\ x\}$ and somehow
conclude that $\psi\ x \equiv \top$ for all $x$. A mathematician would
Let $\top$ denote the unit type -- a type with a single constructor.
In the propositions as types interpretation of type theory $\top$ is
the proposition that is always true. The types $A \x \top$ and $A$ both have
an element for each $a \tp A$. So in a sense they have the same shape
(Greek; \nomenindex{isomorphic}). The second element of the pair does
not add any ``interesting information''. It can be useful to identify
such types. In fact it is quite commonplace in mathematics. Say we
look at a set $\{x \mid \phi\ x \land \psi\ x\}$ and somehow conclude
that $\psi\ x \equiv \top$ for all $x$. A mathematician would
immediately conclude $\{x \mid \phi\ x \land \psi\ x\} \equiv \{x \mid
\phi\ x\}$ without thinking twice. Unfortunately such an identification can not
be performed in ITT.
\phi\ x\}$ without thinking twice. Unfortunately such an
identification can not be performed in ITT.
More specifically what we are interested in is a way of identifying
\nomen{equivalent} types. I will return to the definition of equivalence later
in section \ref{sec:equiv}, but for now it is sufficient to think of an
equivalence as a one-to-one correspondence. We write $A \simeq B$ to assert that
$A$ and $B$ are equivalent types. The principle of univalence says that:
\nomenindex{equivalent} types. I will return to the definition of
equivalence later in \S\ref{sec:equiv}, but for now it is
sufficient to think of an equivalence as a one-to-one correspondence.
We write $A \simeq B$ to assert that $A$ and $B$ are equivalent types.
The principle of univalence says that:
%
$$\mathit{univalence} \tp (A \simeq B) \simeq (A \equiv B)$$
%
In particular this allows us to construct an equality from an equivalence
%
$$\mathit{ua} \tp (A \simeq B) \to (A \equiv B)$$
%
and vice versa.
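Returning to the example from the previous section, a sketch of how
this is used: given an equivalence $e \tp (A \x \top) \simeq A$
(projection in one direction, pairing with the canonical inhabitant of
$\top$ in the other), univalence provides exactly the identification
the mathematician performs without thinking twice:
%
$$\mathit{ua}\ e \tp (A \x \top) \equiv A$$
%
Along this path any property proven about $A$ can be transported to
$A \x \top$ and vice versa.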
\section{Formalizing Category Theory}
%
The above examples serve to illustrate a limitation of ITT. One case where these
limitations are particularly prohibitive is in the study of Category Theory. At
a glance category theory can be described as ``the mathematical study of
(abstract) algebras of functions'' (\cite{awodey-2006}). By that token
functional extensionality is particularly useful for formulating Category
Theory. In Category theory it is also common to identify isomorphic structures
and univalence gives us a way to make this notion precise. In fact we can
formulate this requirement within our formulation of categories by requiring the
\emph{categories} themselves to be univalent as we shall see.
The above examples serve to illustrate a limitation of ITT. One case
where these limitations are particularly prohibitive is in the study
of Category Theory. At a glance category theory can be described as
``the mathematical study of (abstract) algebras of functions''
(\cite{awodey-2006}). By that token functional extensionality is
particularly useful for formulating Category Theory. In Category
theory it is also commonplace to identify isomorphic structures.
Univalence gives us a way to make this notion precise. In fact we can
formulate this requirement within our formulation of categories by
requiring the \emph{categories} themselves to be univalent, as we shall
see in \S\ref{sec:univalence}.
\section{Context}
\label{sec:context}
%
The idea of formalizing Category Theory in proof assistants is not new. There
are a multitude of these available online. Just as a first reference see this
question on Math Overflow: \cite{mo-formalizations}. Notably these
implementations of category theory in Agda:
The idea of formalizing Category Theory in proof assistants is not new. There
are a multitude of these available online. Notably:
%
\begin{itemize}
\item
A formalization in Agda using the setoid approach:
\url{https://github.com/copumpkin/categories}
A formalization in Agda using the setoid approach
\item
A formalization in Agda with univalence and functional
extensionality as postulates:
\url{https://github.com/pcapriotti/agda-categories}
A formalization in Agda with univalence and functional extensionality as
postulates.
\item
A formalization in Coq in the homotopic setting:
\url{https://github.com/HoTT/HoTT/tree/master/theories/Categories}
A formalization in Coq in the homotopic setting
\item
A formalization in \emph{CubicalTT} -- a language designed for
cubical type theory. Formalizes many different things, but only a
few concepts from category theory:
\url{https://github.com/mortberg/cubicaltt}
A formalization in CubicalTT -- a language designed for cubical type theory.
It formalizes many different things, but only a few concepts from category
theory.
\end{itemize}
%
The contribution of this thesis is to explore how working in a cubical setting
will make it possible to prove more things and to reuse proofs and to try and
compare some aspects of this formalization with the existing ones.\TODO{How can
I live up to this?}
The contribution of this thesis is to explore how working in a cubical
setting will make it possible to prove more things, to reuse proofs
and to compare some aspects of this formalization with the existing
ones.
There are alternative approaches to working in a cubical setting where one can
still have univalence and functional extensionality. One option is to postulate
these as axioms. This approach, however, has other shortcomings, e.g.; you lose
\nomen{canonicity} (\TODO{Pageno!} \cite{huber-2016}). Canonicity means that any
well-typed term evaluates to a \emph{canonical} form. For example for a closed
term $e \tp \bN$ it will be the case that $e$ reduces to $n$ applications of
$\mathit{suc}$ to $0$ for some $n$; $e = \mathit{suc}^n\ 0$. Without canonicity
terms in the language can get ``stuck'' -- meaning that they do not reduce to a
canonical form.
There are alternative approaches to working in a cubical setting where
one can still have univalence and functional extensionality. One
option is to postulate these as axioms. This approach, however, has
other shortcomings, e.g.\ you lose \nomenindex{canonicity}
(\cite[p.\ 3]{huber-2016}).
Another approach is to use the \emph{setoid interpretation} of type theory
(\cite{hofmann-1995,huber-2016}). With this approach one works with
\nomen{extensional sets} $(X, \sim)$, that is a type $X \tp \MCU$ and an
equivalence relation $\sim \tp X \to X \to \MCU$ on that type. Under the setoid
interpretation the equivalence relation serve as a sort of ``local''
propositional equality. This approach has other drawbacks; it does not satisfy
all propositional equalities of type theory (\TODO{Citation needed}), is
cumbersome to work with in practice (\cite[p. 4]{huber-2016}) and makes
equational proofs less reusable since equational proofs $a \sim_{X} b$ are
inherently `local' to the extensional set $(X , \sim)$.
Another approach is to use the \emph{setoid interpretation} of type
theory (\cite{hofmann-1995,huber-2016}). With this approach one works
with \nomenindex{extensional sets} $(X, \sim)$. That is a type $X \tp
\MCU$ and an equivalence relation $\sim\ \tp X \to X \to \MCU$ on that
type. Under the setoid interpretation the equivalence relation serve
as a sort of ``local'' propositional equality. Since the developer
gets to pick this relation, it is not a~priori a congruence
relation. It must be manually verified by the developer. Furthermore,
functions between different setoids must be shown to be setoid
homomorphisms, that is, they must preserve the relation.
This approach has other drawbacks: It does not satisfy all
propositional equalities of type theory a~priori. That is, the
developer must manually show that e.g.\ the relation is a congruence.
Equational proofs $a \sim_{X} b$ are in some sense `local' to the
extensional set $(X , \sim)$. To e.g.\ prove that $x \sim y → f\ x \sim f\ y$
for some function $f \tp A → B$ between two extensional sets $A$
and $B$ it must be shown that $f$ is a setoid homomorphism. This
makes it very cumbersome to work with in practice
(\cite[p.~4]{huber-2016}).
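To make the overhead concrete, here is a minimal sketch of the data
the developer must maintain under the setoid interpretation
(illustrative names only, not those of any particular library):
%
\begin{Verbatim}
open import Agda.Primitive using (Level; lsuc)

record Setoid (ℓ : Level) : Set (lsuc ℓ) where
  field
    Carrier : Set ℓ
    _∼_     : Carrier → Carrier → Set ℓ
    -- the developer must verify that ∼ is an equivalence relation
    reflexive  : ∀ {x} → x ∼ x
    symmetric  : ∀ {x y} → x ∼ y → y ∼ x
    transitive : ∀ {x y z} → x ∼ y → y ∼ z → x ∼ z

-- Functions come packaged with a proof that they respect the
-- relations of their source and target setoids.
record SetoidMap {ℓ} (X Y : Setoid ℓ) : Set ℓ where
  field
    fun      : Setoid.Carrier X → Setoid.Carrier Y
    respects : ∀ {a b} → Setoid._∼_ X a b
             → Setoid._∼_ Y (fun a) (fun b)
\end{Verbatim}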
\section{Conventions}
\TODO{Talk a bit about terminology. Find a good place to stuff this little
section.}
In the remainder of this thesis I will use the term \nomenindex{Type}
to describe -- well -- types; thereby departing from the notation in
Agda where the keyword \texttt{Set} refers to types.
\nomenindex{Set}, on the other hand, shall refer to the homotopical
notion of a set. I will also leave all universe levels implicit. This
of course does not mean that a statement such as $\MCU \tp \MCU$ means
that we have type-in-type but rather that the arguments to the
universes are implicit.
In the remainder of this paper I will use the term \nomen{Type} to describe --
well, types. Thereby diverging from the notation in Agda where the keyword
\texttt{Set} refers to types. \nomen{Set} on the other hand shall refer to the
homotopical notion of a set. I will also leave all universe levels implicit.
I use the term \nomenindex{arrow} to refer to morphisms in a category,
whereas the terms \nomenindex{morphism}, \nomenindex{map} or
\nomenindex{function} shall be reserved for talking about type
theoretic functions; i.e.\ functions in Agda.
And I use the term \nomen{arrow} to refer to morphisms in a category, whereas
the terms morphism, map or function shall be reserved for talking about
type-theoretic functions; i.e. functions in Agda.
$\defeq$ will be used for introducing definitions. $=$ will be used to for
judgmental equality and $\equiv$ will be used for propositional equality.
As already noted, $\defeq$ will be used for introducing definitions, $=$
will be used for judgmental equality and $\equiv$ will be used for
propositional equality.
All this is summarized in the following table:
%
\begin{samepage}
\begin{center}
\begin{tabular}{ c c c }
Name & Agda & Notation \\
\hline
\nomen{Type} & \texttt{Set} & $\Type$ \\
\nomen{Set} & \texttt{Σ Set IsSet} & $\Set$ \\
\varindex{Type} & \texttt{Set} & $\Type$ \\
\varindex{Set} & \texttt{Σ Set IsSet} & $\Set$ \\
Function, morphism, map & \texttt{A → B} & $A → B$ \\
Dependent- ditto & \texttt{(a : A) → B} & $_{a \tp A} B$ \\
\nomen{Arrow} & \texttt{Arrow A B} & $\Arrow\ A\ B$ \\
\nomen{Object} & \texttt{C.Object} & $ℂ.Object$ \\
\varindex{Arrow} & \texttt{Arrow A B} & $\Arrow\ A\ B$ \\
\varindex{Object} & \texttt{C.Object} & $ℂ.Object$ \\
Definition & \texttt{=} & $\defeq$ \\
Judgmental equality & \null & $=$ \\
Propositional equality & \null & $\equiv$
\end{tabular}
\end{center}
\end{samepage}

Binary file not shown.


View File

@ -9,14 +9,15 @@
%% Alternatively:
%% \newcommand{\defeq}{}
\newcommand{\bN}{\mathbb{N}}
\newcommand{\bC}{\mathbb{C}}
\newcommand{\bX}{\mathbb{X}}
\newcommand{\bC}{ℂ}
\newcommand{\bD}{𝔻}
\newcommand{\bX}{𝕏}
% \newcommand{\to}{\rightarrow}
\newcommand{\mto}{\mapsto}
%% \newcommand{\mto}{\mapsto}
\newcommand{\mto}{\rightarrow}
\newcommand{\UU}{\ensuremath{\mathcal{U}}\xspace}
\let\type\UU
\newcommand{\MCU}{\UU}
\newcommand{\nomen}[1]{\emph{#1}}
\newcommand{\todo}[1]{\textit{#1}}
\newcommand{\comp}{\circ}
\newcommand{\x}{\times}
@ -32,57 +33,115 @@
}
\makeatother
\newcommand{\var}[1]{\ensuremath{\mathit{#1}}}
\newcommand{\Hom}{\var{Hom}}
\newcommand{\fmap}{\var{fmap}}
\newcommand{\bind}{\var{bind}}
\newcommand{\join}{\var{join}}
\newcommand{\omap}{\var{omap}}
\newcommand{\pure}{\var{pure}}
\newcommand{\idFun}{\var{id}}
\newcommand{\Sets}{\var{Sets}}
\newcommand{\Set}{\var{Set}}
\newcommand{\hSet}{\var{hSet}}
\newcommand{\id}{\var{id}}
\newcommand{\isEquiv}{\var{isEquiv}}
\newcommand{\idToIso}{\var{idToIso}}
\newcommand{\isSet}{\var{isSet}}
\newcommand{\isContr}{\var{isContr}}
\newcommand{\isGroupoid}{\var{isGroupoid}}
\newcommand{\pathJ}{\var{pathJ}}
\newcommand\Object{\var{Object}}
\newcommand\Functor{\var{Functor}}
\newcommand\isProp{\var{isProp}}
\newcommand\propPi{\var{propPi}}
\newcommand\propSig{\var{propSig}}
\newcommand\PreCategory{\var{PreCategory}}
\newcommand\IsPreCategory{\var{IsPreCategory}}
\newcommand\isIdentity{\var{isIdentity}}
\newcommand\propIsIdentity{\var{propIsIdentity}}
\newcommand\IsCategory{\var{IsCategory}}
\newcommand\Gl{\var{\lambda}}
\newcommand\lemPropF{\var{lemPropF}}
\newcommand\isPreCategory{\var{isPreCategory}}
\newcommand\congruence{\var{cong}}
\newcommand\identity{\var{identity}}
\newcommand\isequiv{\var{isequiv}}
\newcommand\qinv{\var{qinv}}
\newcommand\fiber{\var{fiber}}
\newcommand\shuffle{\var{shuffle}}
\newcommand\Univalent{\var{Univalent}}
\newcommand\refl{\var{refl}}
\newcommand\isoToId{\var{isoToId}}
\newcommand\Isomorphism{\var{Isomorphism}}
\newcommand{\varindex}[1]{\ensuremath{\var{#1}}\index{$\var{#1}$}}
\newcommand{\nomen}[2]{\emph{#1}\index{#2}}
\newcommand{\nomenindex}[1]{\nomen{#1}{#1}}
\newcommand{\Hom}{\varindex{Hom}}
\newcommand{\fmap}{\varindex{fmap}}
\newcommand{\bind}{\varindex{bind}}
\newcommand{\join}{\varindex{join}}
\newcommand{\omap}{\varindex{omap}}
\newcommand{\pure}{\varindex{pure}}
\newcommand{\idFun}{\varindex{id}}
\newcommand{\Sets}{\varindex{Sets}}
\newcommand{\Set}{\varindex{Set}}
\newcommand{\hSet}{\varindex{hSet}}
\newcommand{\id}{\varindex{id}}
\newcommand{\isEquiv}{\varindex{isEquiv}}
\newcommand{\idToIso}{\varindex{idToIso}}
\newcommand{\idIso}{\varindex{idIso}}
\newcommand{\isSet}{\varindex{isSet}}
\newcommand{\isContr}{\varindex{isContr}}
\newcommand{\isGroupoid}{\varindex{isGroupoid}}
\newcommand{\pathJ}{\varindex{pathJ}}
\newcommand\Object{\varindex{Object}}
\newcommand\Functor{\varindex{Functor}}
\newcommand\isProp{\varindex{isProp}}
\newcommand\propPi{\varindex{propPi}}
\newcommand\propSig{\varindex{propSig}}
\newcommand\PreCategory{\varindex{PreCategory}}
\newcommand\IsPreCategory{\varindex{IsPreCategory}}
\newcommand\isIdentity{\varindex{isIdentity}}
\newcommand\propIsIdentity{\varindex{propIsIdentity}}
\newcommand\IsCategory{\varindex{IsCategory}}
\newcommand\Gl{\varindex{\lambda}}
\newcommand\lemPropF{\varindex{lemPropF}}
\newcommand\isPreCategory{\varindex{isPreCategory}}
\newcommand\congruence{\varindex{cong}}
\newcommand\identity{\varindex{identity}}
\newcommand\isequiv{\varindex{isequiv}}
\newcommand\qinv{\varindex{qinv}}
\newcommand\fiber{\varindex{fiber}}
\newcommand\shufflef{\varindex{shuffle}}
\newcommand\Univalent{\varindex{Univalent}}
\newcommand\refl{\varindex{refl}}
\newcommand\isoToId{\varindex{isoToId}}
\newcommand\Isomorphism{\varindex{Isomorphism}}
\newcommand\rrr{\ggg}
\newcommand\fish{\mathrel{\wideoverbar{\rrr}}}
\newcommand\fst{\var{fst}}
\newcommand\snd{\var{snd}}
\newcommand\Path{\var{Path}}
\newcommand\Category{\var{Category}}
%% \newcommand\fish{\mathbin{}}
%% \newcommand\fish{\mathbin{}}
\newcommand\fish{\mathbin{}}
%% \newcommand\fish{\mathbin{}}
%% \newcommand\fish{\mathrel{\wideoverbar{\rrr}}}
\newcommand\fst{\varindex{fst}}
\newcommand\snd{\varindex{snd}}
\newcommand\Path{\varindex{Path}}
\newcommand\Category{\varindex{Category}}
\newcommand\TODO[1]{TODO: \emph{#1}}
\newcommand*{\QED}{\hfill\ensuremath{\square}}%
\newcommand\uexists{\exists!}
\newcommand\Arrow{\var{Arrow}}
\newcommand\NTsym{\var{NT}}
\newcommand\Arrow{\varindex{Arrow}}
\newcommand\embellish[1]{\widehat{#1}}
\newcommand\nattrans[1]{\embellish{#1}}
\newcommand\functor[1]{\embellish{#1}}
\newcommand\NTsym{\varindex{NT}}
\newcommand\NT[2]{\NTsym\ #1\ #2}
\newcommand\Endo[1]{\var{Endo}\ #1}
\newcommand\EndoR{\mathcal{R}}
\newcommand\Endo[1]{\varindex{Endo}\ #1}
\newcommand\EndoR{\functor{\mathcal{R}}}
\newcommand\omapR{\mathcal{R}}
\newcommand\omapF{\mathcal{F}}
\newcommand\omapG{\mathcal{G}}
\newcommand\FunF{\functor{\omapF}}
\newcommand\FunG{\functor{\omapG}}
\newcommand\funExt{\varindex{funExt}}
\newcommand{\suc}[1]{\varindex{suc}\ #1}
\newcommand{\trans}{\varindex{trans}}
\newcommand{\toKleisli}{\varindex{toKleisli}}
\newcommand{\toMonoidal}{\varindex{toMonoidal}}
\newcommand\pairA{\mathcal{A}}
\newcommand\pairB{\mathcal{B}}
\newcommand{\joinNT}{\functor{\varindex{join}}}
\newcommand{\pureNT}{\functor{\varindex{pure}}}
\newcommand{\hrefsymb}[2]{\href{#1}{#2 \ExternalLink}}
\newcommand{\sourcebasepath}{http://web.student.chalmers.se/\textasciitilde hanghj/cat/doc/html/}
\newcommand{\docbasepath}{https://github.com/fredefox/cat/}
\newcommand{\sourcelink}[1]{\hrefsymb
{\sourcebasepath#1.html}
{\texttt{#1}}
}
\newcommand{\gitlink}{\hrefsymb{\docbasepath}{\texttt{\docbasepath}}}
\newcommand{\doclink}{\hrefsymb{\sourcebasepath}{\texttt{\sourcebasepath}}}
\newcommand{\clll}{\mathrel{\bC.\mathord{\lll}}}
\newcommand{\dlll}{\mathrel{\bD.\mathord{\lll}}}
\newcommand\coe{\varindex{coe}}
\newcommand\Monoidal{\varindex{Monoidal}}
\newcommand\Kleisli{\varindex{Kleisli}}
\newcommand\I{\mathds{I}}
\makeatletter
\DeclareRobustCommand\bigop[1]{%
\mathop{\vphantom{\sum}\mathpalette\bigop@{#1}}\slimits@
}
\newcommand{\bigop@}[2]{%
\vcenter{%
\sbox\z@{$#1\sum$}%
\hbox{\resizebox{\ifx#1\displaystyle.7\fi\dimexpr\ht\z@+\dp\z@}{!}{$\m@th#2$}}%
}%
}
\makeatother
\renewcommand{\llll}{\mathbin{\bigop{\lll}}}
\renewcommand{\rrrr}{\mathbin{\bigop{\rrr}}}
%% \newcommand{\llll}{lll}
%% \newcommand{\rrrr}{rrr}

View File

@ -1,4 +1,5 @@
\documentclass[a4paper]{report}
%% \documentclass[tightpage]{preview}
%% \documentclass[compact,a4paper]{article}
\input{packages.tex}
@ -29,7 +30,7 @@
\usepackage{chalmerstitle}
\subtitle{A formalization of category theory in Cubical Agda}
\authoremail{hanghj@student.chalmers.se}
\newcommand{\chalmers}{Chalmers University of Technology}
\newcommand{\chalmers}{Chalmers University of Technology and University of Gothenburg}
\supervisor{Thierry Coquand}
\supervisoremail{coquand@chalmers.se}
\supervisordepartment{\chalmers}
@ -42,19 +43,16 @@
\institution{\chalmers}
\department{Department of Computer Science and Engineering}
\researchgroup{Programming Logic Group}
\bibliographystyle{plain}
%% \newtheorem{prop}{Proposition}
\makeatletter
\newcommand*{\rom}[1]{\expandafter\@slowroman\romannumeral #1@}
\makeatother
\begin{document}
\frontmatter
\myfrontmatter
\pagenumbering{roman}
\maketitle
\addtocontents{toc}{\protect\thispagestyle{empty}}
\input{abstract.tex}
\input{acknowledgements.tex}
\tableofcontents
\pagenumbering{arabic}
\mainmatter
%
\input{introduction.tex}
\input{cubical.tex}
@ -69,8 +67,10 @@
\begin{appendices}
\setcounter{page}{1}
\pagenumbering{roman}
\input{sources.tex}
\input{appendix/abstract-funext.tex}
%% \input{sources.tex}
%% \input{planning.tex}
%% \input{halftime.tex}
\end{appendices}
\printindex
\end{document}

View File

@ -1,22 +1,31 @@
\usepackage[utf8]{inputenc}
\usepackage{natbib}
\bibliographystyle{plain}
\usepackage{xcolor}
%% \mode<report>{
\usepackage[
hidelinks,
%% hidelinks,
pdfusetitle,
pdfsubject={category theory},
pdfkeywords={type theory, homotopy theory, category theory, agda}]
{hyperref}
%% }
%% \definecolor{darkorange}{HTML}{ff8c00}
%% \hypersetup{allbordercolors={darkorange}}
\hypersetup{hidelinks}
\usepackage{graphicx}
%% \usepackage[active,tightpage]{preview}
\usepackage{parskip}
\usepackage{multicol}
\usepackage{amssymb,amsmath,amsthm,stmaryrd,mathrsfs,wasysym}
\usepackage[toc,page]{appendix}
\usepackage{xspace}
\usepackage[a4paper]{geometry}
\usepackage[paper=a4paper,top=3cm,bottom=3cm]{geometry}
\usepackage{makeidx}
\makeindex
% \setlength{\parskip}{10pt}
% \usepackage{tikz}
@ -75,13 +84,68 @@
\newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
%% \newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
\newunicodechar{}{\textfallback{}}
\newunicodechar{𝒜}{\textfallback{?}}
\newunicodechar{}{\textfallback{?}}
%% \newunicodechar{𝒜}{\textfallback{𝒜}}
%% \newunicodechar{}{\textfallback{}}
%% \newunicodechar{}{\textfallback{}}
\makeatletter
\newcommand*{\rom}[1]{\expandafter\@slowroman\romannumeral #1@}
\makeatother
\makeatletter
\newcommand\frontmatter{%
\cleardoublepage
%\@mainmatterfalse
\pagenumbering{roman}}
\newcommand\mainmatter{%
\cleardoublepage
% \@mainmattertrue
\pagenumbering{arabic}}
\newcommand\backmatter{%
\if@openright
\cleardoublepage
\else
\clearpage
\fi
% \@mainmatterfalse
}
\makeatother
\usepackage{xspace}
\usepackage{tikz}
\newcommand{\ExternalLink}{%
\tikz[x=1.2ex, y=1.2ex, baseline=-0.05ex]{%
\begin{scope}[x=1ex, y=1ex]
\clip (-0.1,-0.1)
--++ (-0, 1.2)
--++ (0.6, 0)
--++ (0, -0.6)
--++ (0.6, 0)
--++ (0, -1);
\path[draw,
line width = 0.5,
rounded corners=0.5]
(0,0) rectangle (1,1);
\end{scope}
\path[draw, line width = 0.5] (0.5, 0.5)
-- (1, 1);
\path[draw, line width = 0.5] (0.6, 1)
-- (1, 1) -- (1, 0.6);
}
}
\usepackage{ dsfont }
\usepackage{eso-pic}
\newcommand{\backgroundpic}[3]{
\put(#1,#2){
\parbox[b][\paperheight]{\paperwidth}{
\centering
\includegraphics[width=\paperwidth,height=\paperheight,keepaspectratio]{#3}}}}

492
doc/presentation.tex Normal file
View File

@ -0,0 +1,492 @@
\documentclass[a4paper]{beamer}
%% \documentclass[a4paper,handout]{beamer}
%% \usecolortheme[named=seagull]{structure}
\input{packages.tex}
\input{macros.tex}
\title{Univalent Categories}
\subtitle{A formalization of category theory in Cubical Agda}
\newcommand{\myname}{Frederik Hangh{\o}j Iversen}
\author[\myname]{
\myname\\
\footnotesize Supervisors: Thierry Coquand, Andrea Vezzosi\\
Examiner: Andreas Abel
}
\institute{Chalmers University of Technology}
\begin{document}
\frame{\titlepage}
\begin{frame}
\frametitle{Introduction}
Category Theory: The study of abstract functions. Slogan: ``It's the
arrows that matter''\pause
Objects are equal ``up to isomorphism''. Univalence makes this notion
precise.\pause
Agda does not permit proofs of univalence. Cubical Agda admits
this.\pause
Goal: Construct a category whose terminal objects are (equivalent to)
products. Use this to conclude that products are propositions, not a
structure on a category.
\end{frame}
\begin{frame}
\frametitle{Outline}
The path type
Definition of a (pre-) category
1-categories
Univalent (proper) categories
The category of spans
\end{frame}
\section{Paths}
\begin{frame}
\frametitle{Paths}
\framesubtitle{Definition}
Heterogeneous paths
\begin{equation*}
\Path \tp (P \tp \I → \MCU) → P\ 0 → P\ 1 → \MCU
\end{equation*}
\pause
For $P \tp \I → \MCU$ and $a_0 \tp P\ 0$, $a_1 \tp P\ 1$
inhabitants of $\Path\ P\ a_0\ a_1$ are like functions
%
$$
p \tp ∏_{i \tp \I} P\ i
$$
%
which satisfy $p\ 0 = a_0$ and $p\ 1 = a_1$
\pause
Homogenous paths
$$
a_0 ≡ a_1 ≜ \Path\ (\var{const}\ A)\ a_0\ a_1
$$
\end{frame}
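As a concrete illustration of paths being function-like, functional
extensionality is definable directly from the path type. A sketch of
the non-dependent case (the library's version is stated for dependent
functions and arbitrary universe levels):

funExt : {A B : Set} {f g : A → B}
       → ((x : A) → f x ≡ g x) → f ≡ g
funExt p i x = p x i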
\begin{frame}
\frametitle{Pre categories}
\framesubtitle{Definition}
Data:
\begin{align*}
\Object & \tp \Type \\
\Arrow & \tp \Object → \Object → \Type \\
\identity & \tp \Arrow\ A\ A \\
\llll & \tp \Arrow\ B\ C → \Arrow\ A\ B → \Arrow\ A\ C
\end{align*}
%
\pause
Laws:
%
\begin{align*}
\var{isAssociative} & \tp
h \llll (g \llll f) ≡ (h \llll g) \llll f \\
\var{isIdentity} & \tp
(\identity \llll f ≡ f)
×
(f \llll \identity ≡ f)
\end{align*}
\end{frame}
\begin{frame}
\frametitle{Pre categories}
\framesubtitle{1-categories}
Cubical Agda does not admit \emph{Uniqueness of Identity Proofs}
(UIP). Rather there is a hierarchy of \emph{Homotopy Types}:
Contractible types, mere propositions, sets, groupoids, \dots
\pause
1-categories:
$$
\isSet\ (\Arrow\ A\ B)
$$
\pause
\begin{align*}
\isSet & \tp \MCU → \MCU \\
\isSet\ A & ≜ ∏_{a_0, a_1 \tp A} \isProp\ (a_0 ≡ a_1)
\end{align*}
\end{frame}
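Spelled out in Agda, the first levels of the hierarchy look as
follows (a sketch; the library states these for arbitrary universe
levels):

isProp : Set → Set
isProp A = (x y : A) → x ≡ y

isSet : Set → Set
isSet A = (x y : A) → isProp (x ≡ y)

isGroupoid : Set → Set
isGroupoid A = (x y : A) → isSet (x ≡ y)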
\begin{frame}
\frametitle{Outline}
The path type \ensuremath{\checkmark}
Definition of a (pre-) category \ensuremath{\checkmark}
1-categories \ensuremath{\checkmark}
Univalent (proper) categories
The category of spans
\end{frame}
\begin{frame}
\frametitle{Categories}
\framesubtitle{Univalence}
Let $\approxeq$ denote isomorphism of objects. We can then construct
the identity isomorphism in any category:
$$
(\identity , \identity , \var{isIdentity}) \tp A \approxeq A
$$
\pause
Likewise since paths are substitutive we can promote a path to an isomorphism:
$$
\idToIso \tp A ≡ B → A ≊ B
$$
\pause
For a category to be univalent we require this to be an equivalence:
%
$$
\isEquiv\ (A ≡ B)\ (A \approxeq B)\ \idToIso
$$
%
\end{frame}
\begin{frame}
\frametitle{Categories}
\framesubtitle{Univalence, cont'd}
$$\isEquiv\ (A ≡ B)\ (A \approxeq B)\ \idToIso$$
\pause%
$$(A ≡ B)(A \approxeq B)$$
\pause%
$$(A ≡ B)(A \approxeq B)$$
\pause%
Name the inverse of $\idToIso$:
$$\isoToId \tp (A \approxeq B) → (A ≡ B)$$
\end{frame}
\begin{frame}
\frametitle{Propositionality of products}
Construct a category for which it is the case that the terminal
objects are equivalent to products:
\begin{align*}
\var{Terminal} ≃ \var{Product}\ ℂ\ 𝒜\ ℬ
\end{align*}
\pause
And since equivalences preserve homotopy levels we get:
%
$$
\isProp\ \left(\var{Product}\ \bC\ 𝒜\ ℬ\right)
$$
\end{frame}
\begin{frame}
\frametitle{Categories}
\framesubtitle{A theorem}
%
Let the isomorphism $(ι, \inv{ι}, \var{inv}) \tp A \approxeq B$.
%
\pause
%
The isomorphism induces the path
%
$$
p ≜ \isoToId\ (\iota, \inv{\iota}, \var{inv}) \tp A ≡ B
$$
%
\pause
and consequently a path on arrows:
%
$$
p_{\var{dom}}\congruence\ (λ x → \Arrow\ x\ X)\ p
\tp
\Arrow\ A\ X ≡ \Arrow\ B\ X
$$
%
\pause
The proposition is:
%
\begin{align}
\label{eq:coeDom}
\tag{$\var{coeDom}$}
∏_{f \tp A → X}
\var{coe}\ p_{\var{dom}}\ f ≡ f \llll \inv{\iota}
\end{align}
\end{frame}
\begin{frame}
\frametitle{Categories}
\framesubtitle{A theorem, proof}
\begin{align*}
\var{coe}\ p_{\var{dom}}\ f
& ≡ f \llll (\idToIso\ p)_1 && \text{By path-induction} \\
& ≡ f \llll \inv{\iota}
&& \text{$\idToIso$ and $\isoToId$ are inverses}\\
\end{align*}
\pause
%
Induction will be based at $A$. Let $\widetilde{B}$ and $\widetilde{p}
\tp A ≡ \widetilde{B}$ be given.
%
\pause
%
Define the family:
%
$$
D\ \widetilde{B}\ \widetilde{p} ≜
\var{coe}\ \widetilde{p}_{\var{dom}}\ f ≡
f \llll \inv{(\idToIso\ \widetilde{p})}
$$
\pause
%
The base-case becomes:
$$
d \tp D\ A\ \refl =
\left(\var{coe}\ \refl_{\var{dom}}\ f ≡ f \llll \inv{(\idToIso\ \refl)}\right)
$$
\end{frame}
\begin{frame}
\frametitle{Categories}
\framesubtitle{A theorem, proof, cont'd}
$$
d \tp
\var{coe}\ \refl_{\var{dom}}\ f ≡ f \llll \inv{(\idToIso\ \refl)}
$$
\pause
\begin{align*}
\var{coe}\ \refl_{\var{dom}}\ f
& =
\var{coe}\ \refl\ f \\
& ≡ f
&& \text{neutral element for $\var{coe}$}\\
& ≡ f \llll \identity \\
& ≡ f \llll \var{subst}\ \refl\ \identity
&& \text{neutral element for $\var{subst}$}\\
& ≡ f \llll \inv{(\idToIso\ \refl)}
&& \text{By definition of $\idToIso$}\\
\end{align*}
\pause
In conclusion, the theorem is inhabited by:
$$
\var{pathInd}\ D\ d\ B\ p
$$
\end{frame}
\begin{frame}
\frametitle{Span category} \framesubtitle{Definition} Given a base
category $\bC$ and two objects in this category $\pairA$ and $\pairB$
we can construct the \nomenindex{span category}:
%
\pause
Objects:
$$
∑_{X \tp Object} (\Arrow\ X\ \pairA) × (\Arrow\ X\ \pairB)
$$
\pause
%
Arrows between objects $(A , a_{\pairA} , a_{\pairB})$ and
$(B , b_{\pairA} , b_{\pairB})$:
%
$$
∑_{f \tp \Arrow\ A\ B}
(b_{\pairA} \llll f ≡ a_{\pairA}) ×
(b_{\pairB} \llll f ≡ a_{\pairB})
$$
\end{frame}
\begin{frame}
\frametitle{Span category}
\framesubtitle{Univalence}
\begin{align*}
(X , x_{𝒜} , x_{ℬ}) ≡ (Y , y_{𝒜} , y_{ℬ})
\end{align*}
\begin{align*}
\begin{split}
p \tp & X ≡ Y \\
& \Path\ (λ i → \Arrow\ (p\ i)\ 𝒜)\ x_{𝒜}\ y_{𝒜} \\
& \Path\ (λ i → \Arrow\ (p\ i)\ ℬ)\ x_{ℬ}\ y_{ℬ}
\end{split}
\end{align*}
\begin{align*}
\begin{split}
\var{iso} \tp & X \approxeq Y \\
& \Path\ (λ i → \Arrow\ (\widetilde{p}\ i)\ 𝒜)\ x_{𝒜}\ y_{𝒜} \\
& \Path\ (λ i → \Arrow\ (\widetilde{p}\ i)\ ℬ)\ x_{ℬ}\ y_{ℬ}
\end{split}
\end{align*}
\begin{align*}
(X , x_{𝒜} , x_{ℬ}) ≊ (Y , y_{𝒜} , y_{ℬ})
\end{align*}
\end{frame}
\begin{frame}
\frametitle{Span category}
\framesubtitle{Univalence, proof}
%
\begin{align*}
%% (f, \inv{f}, \var{inv}_f, \var{inv}_{\inv{f}})
%% \tp
(X, x_{𝒜}, x_{ℬ}) \approxeq (Y, y_{𝒜}, y_{ℬ})
\to
\begin{split}
\var{iso} \tp & X \approxeq Y \\
& \Path\ (λ i → \Arrow\ (\widetilde{p}\ i)\ 𝒜)\ x_{𝒜}\ y_{𝒜} \\
& \Path\ (λ i → \Arrow\ (\widetilde{p}\ i)\ ℬ)\ x_{ℬ}\ y_{ℬ}
\end{split}
\end{align*}
\pause
%
Let $(f, \inv{f}, \var{inv}_f, \var{inv}_{\inv{f}})$ be an inhabitant
of the antecedent.\pause
Projecting out the first component gives us the isomorphism
%
$$
(\fst\ f, \fst\ \inv{f}
, \congruence\ \fst\ \var{inv}_f
, \congruence\ \fst\ \var{inv}_{\inv{f}}
)
\tp X \approxeq Y
$$
\pause
%
This gives rise to the following paths:
%
\begin{align*}
\begin{split}
\widetilde{p} & \tp X ≡ Y \\
\widetilde{p}_{𝒜} & \tp \Arrow\ X\ 𝒜 ≡ \Arrow\ Y\ 𝒜 \\
\end{split}
\end{align*}
%
\end{frame}
\begin{frame}
\frametitle{Span category}
\framesubtitle{Univalence, proof, cont'd}
It remains to construct:
%
\begin{align*}
\begin{split}
& \Path\ (λ i → \widetilde{p}_{𝒜}\ i)\ x_{𝒜}\ y_{𝒜}
\end{split}
\end{align*}
\pause
%
This is achieved with the following lemma:
%
\begin{align*}
∏_{q \tp A ≡ B} \var{coe}\ q\ x_{𝒜} ≡ y_{𝒜}
→ \Path\ (λ i → q\ i)\ x_{𝒜}\ y_{𝒜}
\end{align*}
%
Which is used without proof.\pause
So the construction reduces to:
%
\begin{align*}
\var{coe}\ \widetilde{p}_{𝒜}\ x_{𝒜} ≡ y_{𝒜}
\end{align*}%
\pause%
This is proven with:
%
\begin{align*}
\var{coe}\ \widetilde{p}_{𝒜}\ x_{𝒜}
& ≡ x_{𝒜} \llll \fst\ \inv{f} && \text{\ref{eq:coeDom}} \\
& ≡ y_{𝒜} && \text{Property of span category}
\end{align*}
\end{frame}
\begin{frame}
\frametitle{Propositionality of products}
We have
%
$$
\isProp\ \var{Terminal}
$$\pause
%
We can show:
\begin{align*}
\var{Terminal} ≃ \var{Product}\ ℂ\ 𝒜\ ℬ
\end{align*}
\pause
And since equivalences preserve homotopy levels we get:
%
$$
\isProp\ \left(\var{Product}\ \bC\ 𝒜\ ℬ\right)
$$
\end{frame}
\begin{frame}
\frametitle{Monads}
\framesubtitle{Monoidal form}
%
\begin{align*}
\EndoR & \tp \Functor\ ℂ\ ℂ \\
\pureNT
& \tp \NT{\widehat{\identity}}{\EndoR} \\
\joinNT
& \tp \NT{(\EndoR \oplus \EndoR)}{\EndoR}
\end{align*}
\pause
%
Let $\fmap$ be the map on arrows of $\EndoR$.
%
\begin{align*}
\join \llll \fmap\ \join
& ≡ \join \llll \join \\
\join \llll \pure & ≡ \identity \\
\join \llll \fmap\ \pure & ≡ \identity
\end{align*}
\end{frame}
\begin{frame}
\frametitle{Monads}
\framesubtitle{Kleisli form}
%
\begin{align*}
\omapR & \tp \Object → \Object \\
\pure & \tp %_{X \tp Object}
\Arrow\ X\ (\omapR\ X) \\
\bind & \tp
\Arrow\ X\ (\omapR\ Y)
\to
\Arrow\ (\omapR\ X)\ (\omapR\ Y)
\end{align*}\pause
%
\begin{align*}
\fish & \tp
\Arrow\ A\ (\omapR\ B)
→ \Arrow\ B\ (\omapR\ C)
→ \Arrow\ A\ (\omapR\ C) \\
f \fish g & ≜ f \rrrr (\bind\ g)
\end{align*}
\pause
%
\begin{align*}
\bind\ \pure & ≡ \identity_{\omapR\ X} \\
\pure \fish f & ≡ f \\
(\bind\ f) \rrrr (\bind\ g) & ≡ \bind\ (f \fish g)
\end{align*}
\end{frame}
\begin{frame}
\frametitle{Monads}
\framesubtitle{Equivalence}
In the monoidal formulation we can define $\bind$:
%
$$
\bind\ f ≜ \join \llll \fmap\ f
$$
\pause
%
And likewise in the Kleisli formulation we can define $\join$:
%
$$
\join ≜ \bind\ \identity
$$
\pause
The laws are logically equivalent. Since, for propositions, logical
equivalence is enough to give an equivalence of types we get:
%
$$
\var{Monoidal} ≃ \var{Kleisli}
$$
%
\end{frame}
\end{document}

View File

@ -1,15 +1,15 @@
\chapter{Source code}
\chapter{Source code excerpts}
\label{ch:app-sources}
In the following a few of the definitions mentioned in the report are included.
The following is \emph{not} a verbatim copy of individual modules from the
implementation. Rather, this is a redacted and pre-formatted version designed for
presentation purposes. It's provided here as a convenience. The actual sources
are the only authoritative source. If something is not clear, please refer to
those.
\clearpage
\section{Cubical}
\label{sec:app-cubical}
\begin{figure}[h]
\label{fig:path}
\begin{Verbatim}
postulate
PathP : ∀ {} (A : I → Set ) → A i0 → A i1 → Set
@ -23,6 +23,7 @@ module _ {} {A : Set } where
\end{Verbatim}
\caption{Excerpt from the cubical library. Definition of the path-type.}
\end{figure}
\clearpage
%
\begin{figure}[h]
\begin{Verbatim}

95
doc/title.tex Normal file
View File

@ -0,0 +1,95 @@
%% FRONTMATTER
\frontmatter
%% \newgeometry{top=3cm, bottom=3cm,left=2.25 cm, right=2.25cm}
\begingroup
\thispagestyle{empty}
{\Huge\thetitle}\\[.5cm]
{\Large A formalization of category theory in Cubical Agda}\\[6cm]
\begin{center}
\includegraphics[width=\linewidth,keepaspectratio]{isomorphism.png}
\end{center}
% Cover text
\vfill
%% \renewcommand{\familydefault}{\sfdefault} \normalfont % Set cover page font
{\Large\theauthor}\\[.5cm]
Master's thesis in Computer Science
\endgroup
%% \end{titlepage}
% BACK OF COVER PAGE (BLANK PAGE)
\newpage
%% \newgeometry{a4paper} % Temporarily change margins
%% \restoregeometry
\thispagestyle{empty}
\null
%% \begin{titlepage}
% TITLE PAGE
\newpage
\thispagestyle{empty}
\begin{center}
\textsc{\LARGE Master's thesis \the\year}\\[4cm] % Report number is currently not in use
\textbf{\LARGE \thetitle} \\[1cm]
{\large \subtitle}\\[1cm]
{\large \theauthor}
\vfill
\centering
\includegraphics[width=0.2\pdfpagewidth]{assets/logo_eng.pdf}
\vspace{5mm}
\textsc{Department of Computer Science and Engineering}\\
\textsc{{\researchgroup}}\\
%Name of research group (if applicable)\\
\textsc{\institution} \\
\textsc{Gothenburg, Sweden \the\year}\\
\end{center}
% IMPRINT PAGE (BACK OF TITLE PAGE)
\newpage
\thispagestyle{plain}
\textit{\thetitle}\\
\subtitle\\
\copyright\ \the\year ~ \textsc{\theauthor}
\vspace{4.5cm}
\setlength{\parskip}{0.5cm}
\textbf{Author:}\\
\theauthor\\
\href{mailto:\authoremail>}{\texttt{<\authoremail>}}
\textbf{Supervisor:}\\
\supervisor\\
\href{mailto:\supervisoremail>}{\texttt{<\supervisoremail>}}\\
\supervisordepartment
\textbf{Co-supervisor:}\\
\cosupervisor\\
\href{mailto:\cosupervisoremail>}{\texttt{<\cosupervisoremail>}}\\
\cosupervisordepartment
\textbf{Examiner:}\\
\examiner\\
\href{mailto:\examineremail>}{\texttt{<\examineremail>}}\\
\examinerdepartment
\vfill
Master's Thesis \the\year\\ % Report number currently not in use
\department\\
%Division of Division name\\
%Name of research group (if applicable)\\
\institution\\
SE-412 96 Gothenburg\\
Telephone +46 31 772 1000 \setlength{\parskip}{0.5cm}\\
% Caption for cover page figure if used, possibly with reference to further information in the report
%% Cover: Wind visualization constructed in Matlab showing a surface of constant wind speed along with streamlines of the flow. \setlength{\parskip}{0.5cm}
%Printed by [Name of printing company]\\
Gothenburg, Sweden \the\year
%% \restoregeometry
%% \end{titlepage}
\tableofcontents

@ -1 +1 @@
Subproject commit 4493cf249a1648be2ad365fe94ece337bfbcb5d9
Subproject commit ac331fc38ca05f85dfebc57eb1259ba2ea0e50d5

@ -1 +1 @@
Subproject commit 4e5d43a9c75286b3a8750567d75a930674d7720d
Subproject commit b112c292ded61b02fa32a1b65cac77314a1e9698

View File

@ -10,14 +10,15 @@ open import Cat.Category.NaturalTransformation
open import Cat.Category.Yoneda
open import Cat.Category.Monoid
open import Cat.Category.Monad
open Cat.Category.Monad.Monoidal
open Cat.Category.Monad.Kleisli
open import Cat.Category.Monad.Monoidal
open import Cat.Category.Monad.Kleisli
open import Cat.Category.Monad.Voevodsky
open import Cat.Categories.Opposite
open import Cat.Categories.Sets
open import Cat.Categories.Cat
open import Cat.Categories.Rel
open import Cat.Categories.Free
open import Cat.Categories.Fun
open import Cat.Categories.Cube
-- open import Cat.Categories.Cube
open import Cat.Categories.CwF

View File

@ -54,9 +54,9 @@ module _ { ' : Level} (Ns : Set ) where
case (f i) of λ
{ (inj₁ (fi , _)) case (f j) of λ
{ (inj₁ (fj , _)) fi fj i j
; (inj₂ _) Lift
; (inj₂ _) Lift _
}
; (inj₂ _) Lift
; (inj₂ _) Lift _
}
Hom = Σ Hom' rules

View File

@ -5,6 +5,7 @@ open import Cat.Prelude
open import Cat.Category
open import Cat.Category.Functor
open import Cat.Categories.Fam
open import Cat.Categories.Opposite
module _ {a b : Level} where
record CwF : Set (lsuc (a b)) where

View File

@ -1,13 +1,14 @@
{-# OPTIONS --allow-unsolved-metas --cubical --caching #-}
module Cat.Categories.Fun where
open import Cat.Prelude
open import Cat.Equivalence
open import Cat.Category
open import Cat.Category.Functor
import Cat.Category.NaturalTransformation
as NaturalTransformation
open import Cat.Categories.Opposite
module Fun {c c' d d' : Level} ( : Category c c') (𝔻 : Category d d') where
open NaturalTransformation 𝔻 public hiding (module Properties)
@ -52,14 +53,14 @@ module Fun {c c' d d' : Level} ( : Category c c') (𝔻 : C
lem : coe (pp {C}) 𝔻.identity f→g
lem = trans (𝔻.9-1-9-right {b = Functor.omap F C} 𝔻.identity p*) 𝔻.rightIdentity
idToNatTrans : NaturalTransformation F G
idToNatTrans = (λ C coe pp 𝔻.identity) , λ f begin
coe pp 𝔻.identity 𝔻.<<< F.fmap f ≡⟨ cong (𝔻._<<< F.fmap f) lem
-- Just need to show that f→g is a natural transformation
-- I know that it has an inverse; g→f
f→g 𝔻.<<< F.fmap f ≡⟨ {!lem!}
G.fmap f 𝔻.<<< f→g ≡⟨ cong (G.fmap f 𝔻.<<<_) (sym lem)
G.fmap f 𝔻.<<< coe pp 𝔻.identity
-- idToNatTrans : NaturalTransformation F G
-- idToNatTrans = (λ C → coe pp 𝔻.identity) , λ f → begin
-- coe pp 𝔻.identity 𝔻.<<< F.fmap f ≡⟨ cong (𝔻._<<< F.fmap f) lem ⟩
-- -- Just need to show that f→g is a natural transformation
-- -- I know that it has an inverse; g→f
-- f→g 𝔻.<<< F.fmap f ≡⟨ {!lem!} ⟩
-- G.fmap f 𝔻.<<< f→g ≡⟨ cong (G.fmap f 𝔻.<<<_) (sym lem) ⟩
-- G.fmap f 𝔻.<<< coe pp 𝔻.identity ∎
module _ {A B : Functor 𝔻} where
module A = Functor A
@ -92,70 +93,70 @@ module Fun {c c' d d' : Level} ( : Category c c') (𝔻 : C
U : (F : .Object 𝔻.Object) Set _
U F = {A B : .Object} [ A , B ] 𝔻 [ F A , F B ]
module _
(omap : .Object 𝔻.Object)
(p : A.omap omap)
where
D : Set _
D = ( fmap : U omap)
( let
raw-B : RawFunctor 𝔻
raw-B = record { omap = omap ; fmap = fmap }
)
(isF-B' : IsFunctor 𝔻 raw-B)
( let
B' : Functor 𝔻
B' = record { raw = raw-B ; isFunctor = isF-B' }
)
(iso' : A B') PathP (λ i U (p i)) A.fmap fmap
-- D : Set _
-- D = PathP (λ i → U (p i)) A.fmap fmap
-- eeq : (λ f → A.fmap f) ≡ fmap
-- eeq = funExtImp (λ A → funExtImp (λ B → funExt (λ f → isofmap {A} {B} f)))
-- where
-- module _ {X : .Object} {Y : .Object} (f : [ X , Y ]) where
-- isofmap : A.fmap f ≡ fmap f
-- isofmap = {!ap!}
d : D A.omap refl
d = res
where
module _
( fmap : U A.omap )
( let
raw-B : RawFunctor 𝔻
raw-B = record { omap = A.omap ; fmap = fmap }
)
( isF-B' : IsFunctor 𝔻 raw-B )
( let
B' : Functor 𝔻
B' = record { raw = raw-B ; isFunctor = isF-B' }
)
( iso' : A B' )
where
module _ {X Y : .Object} (f : [ X , Y ]) where
step : {!!} 𝔻.≊ {!!}
step = {!!}
resres : A.fmap {X} {Y} f fmap {X} {Y} f
resres = {!!}
res : PathP (λ i U A.omap) A.fmap fmap
res i {X} {Y} f = resres f i
-- module _
-- (omap : .Object → 𝔻.Object)
-- (p : A.omap ≡ omap)
-- where
-- D : Set _
-- D = ( fmap : U omap)
-- → ( let
-- raw-B : RawFunctor 𝔻
-- raw-B = record { omap = omap ; fmap = fmap }
-- )
-- → (isF-B' : IsFunctor 𝔻 raw-B)
-- → ( let
-- B' : Functor 𝔻
-- B' = record { raw = raw-B ; isFunctor = isF-B' }
-- )
-- → (iso' : A ≊ B') → PathP (λ i → U (p i)) A.fmap fmap
-- -- D : Set _
-- -- D = PathP (λ i → U (p i)) A.fmap fmap
-- -- eeq : (λ f → A.fmap f) ≡ fmap
-- -- eeq = funExtImp (λ A → funExtImp (λ B → funExt (λ f → isofmap {A} {B} f)))
-- -- where
-- -- module _ {X : .Object} {Y : .Object} (f : [ X , Y ]) where
-- -- isofmap : A.fmap f ≡ fmap f
-- -- isofmap = {!ap!}
-- d : D A.omap refl
-- d = res
-- where
-- module _
-- ( fmap : U A.omap )
-- ( let
-- raw-B : RawFunctor 𝔻
-- raw-B = record { omap = A.omap ; fmap = fmap }
-- )
-- ( isF-B' : IsFunctor 𝔻 raw-B )
-- ( let
-- B' : Functor 𝔻
-- B' = record { raw = raw-B ; isFunctor = isF-B' }
-- )
-- ( iso' : A ≊ B' )
-- where
-- module _ {X Y : .Object} (f : [ X , Y ]) where
-- step : {!!} 𝔻.≊ {!!}
-- step = {!!}
-- resres : A.fmap {X} {Y} f ≡ fmap {X} {Y} f
-- resres = {!!}
-- res : PathP (λ i → U A.omap) A.fmap fmap
-- res i {X} {Y} f = resres f i
fmapEq : PathP (λ i U (omapEq i)) A.fmap B.fmap
fmapEq = pathJ D d B.omap omapEq B.fmap B.isFunctor iso
-- fmapEq : PathP (λ i → U (omapEq i)) A.fmap B.fmap
-- fmapEq = pathJ D d B.omap omapEq B.fmap B.isFunctor iso
rawEq : A.raw B.raw
rawEq i = record { omap = omapEq i ; fmap = fmapEq i }
-- rawEq : A.raw ≡ B.raw
-- rawEq i = record { omap = omapEq i ; fmap = fmapEq i }
private
f : (A B) (A B)
f p = idToNatTrans p , idToNatTrans (sym p) , NaturalTransformation≡ A A (funExt (λ C {!!})) , {!!}
g : (A B) (A B)
g = Functor≡ rawEq
inv : AreInverses f g
inv = {!funExt λ p → ?!} , {!!}
iso : (A B) (A B)
iso = f , g , inv
-- private
-- f : (A ≡ B) → (A ≊ B)
-- f p = idToNatTrans p , idToNatTrans (sym p) , NaturalTransformation≡ A A (funExt (λ C → {!!})) , {!!}
-- g : (A ≊ B) → (A ≡ B)
-- g = Functor≡ ∘ rawEq
-- inv : AreInverses f g
-- inv = {!funExt λ p → ?!} , {!!}
postulate
iso : (A B) (A B)
-- iso = f , g , inv
univ : (A B) (A B)
univ = fromIsomorphism _ _ iso

View File

@ -0,0 +1,96 @@
{-# OPTIONS --cubical #-}
module Cat.Categories.Opposite where
open import Cat.Prelude
open import Cat.Equivalence
open import Cat.Category
-- | The opposite category
--
-- The opposite category is the category where the direction of the arrows are
-- flipped.
module _ {a b : Level} where
module _ ( : Category a b) where
private
module _ where
module = Category
opRaw : RawCategory a b
RawCategory.Object opRaw = .Object
RawCategory.Arrow opRaw = flip .Arrow
RawCategory.identity opRaw = .identity
RawCategory._<<<_ opRaw = ._>>>_
open RawCategory opRaw
isPreCategory : IsPreCategory opRaw
IsPreCategory.isAssociative isPreCategory = sym .isAssociative
IsPreCategory.isIdentity isPreCategory = swap .isIdentity
IsPreCategory.arrowsAreSets isPreCategory = .arrowsAreSets
open IsPreCategory isPreCategory
module _ {A B : .Object} where
open Σ (toIso _ _ (.univalent {A} {B}))
renaming (fst to idToIso* ; snd to inv*)
open AreInverses {f = .idToIso A B} {idToIso*} inv*
shuffle : A B A .≊ B
shuffle (f , g , inv) = g , f , inv
shuffle~ : A .≊ B A B
shuffle~ (f , g , inv) = g , f , inv
lem : (p : A B) idToIso A B p shuffle~ (.idToIso A B p)
lem p = isoEq refl
isoToId* : A B A B
isoToId* = idToIso* shuffle
inv : AreInverses (idToIso A B) isoToId*
inv =
( funExt (λ x begin
(isoToId* idToIso A B) x
≡⟨⟩
(idToIso* shuffle idToIso A B) x
≡⟨ cong (λ φ φ x) (cong (λ φ idToIso* shuffle φ) (funExt lem))
(idToIso* shuffle shuffle~ .idToIso A B) x
≡⟨⟩
(idToIso* .idToIso A B) x
≡⟨ (λ i verso-recto i x)
x )
, funExt (λ x begin
(idToIso A B idToIso* shuffle) x
≡⟨ cong (λ φ φ x) (cong (λ φ φ idToIso* shuffle) (funExt lem))
(shuffle~ .idToIso A B idToIso* shuffle) x
≡⟨ cong (λ φ φ x) (cong (λ φ shuffle~ φ shuffle) recto-verso)
(shuffle~ shuffle) x
≡⟨⟩
x )
)
isCategory : IsCategory opRaw
IsCategory.isPreCategory isCategory = isPreCategory
IsCategory.univalent isCategory
= univalenceFromIsomorphism (isoToId* , inv)
opposite : Category a b
Category.raw opposite = opRaw
Category.isCategory opposite = isCategory
-- As demonstrated here a side-effect of having no-eta-equality on constructors
-- means that we need to pick things apart to show that things are indeed
-- definitionally equal. I.e; a thing that would normally be provable in one
-- line now takes 13!! Admittedly it's a simple proof.
module _ { : Category a b} where
open Category
private
-- Since they really are definitionally equal we just need to pick apart
-- the data-type.
rawInv : Category.raw (opposite (opposite )) raw
RawCategory.Object (rawInv _) = Object
RawCategory.Arrow (rawInv _) = Arrow
RawCategory.identity (rawInv _) = identity
RawCategory._<<<_ (rawInv _) = _<<<_
oppositeIsInvolution : opposite (opposite )
oppositeIsInvolution = Category≡ rawInv

View File

@ -2,6 +2,7 @@
module Cat.Categories.Rel where
open import Cat.Prelude hiding (Rel)
open import Cat.Equivalence
open import Cat.Category
@ -61,15 +62,9 @@ module _ {A B : Set} {S : Subset (A × B)} (ab : A × B) where
lem0 = (λ a'' a≡a'' a''b∈S (forwards backwards) (a'' , a≡a'' , a''b∈S) (a'' , a≡a'' , a''b∈S))
lem1 = (λ z₁ cong (\ z a , refl , z) (pathJprop (\ y _ y) z₁))
isequiv : isEquiv
(Σ[ a' A ] (a , a') Diag A × (a' , b) S)
((a , b) S)
backwards
isequiv y = gradLemma backwards forwards fwd-bwd bwd-fwd y
equi : (Σ[ a' A ] (a , a') Diag A × (a' , b) S)
(a , b) S
equi = backwards , isequiv
equi = fromIsomorphism _ _ (backwards , forwards , funExt bwd-fwd , funExt fwd-bwd)
ident-r : (Σ[ a' A ] (a , a') Diag A × (a' , b) S)
(a , b) S
@ -95,15 +90,9 @@ module _ {A B : Set} {S : Subset (A × B)} (ab : A × B) where
lem0 = (λ b'' b≡b'' (ab''∈S : (a , b'') S) (forwards backwards) (b'' , ab''∈S , sym b≡b'') (b'' , ab''∈S , sym b≡b''))
lem1 = (λ ab''∈S cong (λ φ b , φ , refl) (pathJprop (λ y _ y) ab''∈S))
isequiv : isEquiv
(Σ[ b' B ] (a , b') S × (b' , b) Diag B)
((a , b) S)
backwards
isequiv ab∈S = gradLemma backwards forwards bwd-fwd fwd-bwd ab∈S
equi : (Σ[ b' B ] (a , b') S × (b' , b) Diag B)
ab S
equi = backwards , isequiv
equi = fromIsomorphism _ _ (backwards , (forwards , funExt fwd-bwd , funExt bwd-fwd))
ident-l : (Σ[ b' B ] (a , b') S × (b' , b) Diag B)
ab S
@ -133,15 +122,9 @@ module _ {A B C D : Set} {S : Subset (A × B)} {R : Subset (B × C)} {Q : Subset
bwd-fwd : (x : Q⊕⟨R⊕S⟩) (bwd fwd) x x
bwd-fwd x = refl
isequiv : isEquiv
(Σ[ c C ] (Σ[ b B ] (a , b) S × (b , c) R) × (c , d) Q)
(Σ[ b B ] (a , b) S × (Σ[ c C ] (b , c) R × (c , d) Q))
fwd
isequiv = gradLemma fwd bwd fwd-bwd bwd-fwd
equi : (Σ[ c C ] (Σ[ b B ] (a , b) S × (b , c) R) × (c , d) Q)
(Σ[ b B ] (a , b) S × (Σ[ c C ] (b , c) R × (c , d) Q))
equi = fwd , isequiv
equi = fromIsomorphism _ _ (fwd , bwd , funExt bwd-fwd , funExt fwd-bwd)
-- isAssociativec : Q + (R + S) ≡ (Q + R) + S
is-isAssociative : (Σ[ c C ] (Σ[ b B ] (a , b) S × (b , c) R) × (c , d) Q)

View File

@ -3,11 +3,11 @@
module Cat.Categories.Sets where
open import Cat.Prelude as P
open import Cat.Equivalence
open import Cat.Category
open import Cat.Category.Functor
open import Cat.Category.Product
open import Cat.Equivalence
open import Cat.Categories.Opposite
_⊙_ : {a b c : Level} {A : Set a} {B : Set b} {C : Set c} (A B) (B C) A C
eqA eqB = Equivalence.compose eqA eqB


@ -0,0 +1,170 @@
{-# OPTIONS --cubical --caching #-}
module Cat.Categories.Span where
open import Cat.Prelude as P hiding (_×_ ; fst ; snd)
open import Cat.Equivalence
open import Cat.Category
module _ {a b : Level} ( : Category a b)
(let module = Category ) (𝒜 : .Object) where
open P
private
module _ where
raw : RawCategory _ _
raw = record
{ Object = Σ[ X .Object ] .Arrow X 𝒜 × .Arrow X
; Arrow = λ{ (A , a0 , a1) (B , b0 , b1)
Σ[ f .Arrow A B ]
[ b0 f ] a0
× [ b1 f ] a1
}
; identity = λ{ {X , f , g} .identity {X} , .rightIdentity , .rightIdentity}
; _<<<_ = λ { {_ , a0 , a1} {_ , b0 , b1} {_ , c0 , c1} (f , f0 , f1) (g , g0 , g1)
(f .<<< g)
, (begin
[ c0 [ f g ] ] ≡⟨ .isAssociative
[ [ c0 f ] g ] ≡⟨ cong (λ φ [ φ g ]) f0
[ b0 g ] ≡⟨ g0
a0
)
, (begin
[ c1 [ f g ] ] ≡⟨ .isAssociative
[ [ c1 f ] g ] ≡⟨ cong (λ φ [ φ g ]) f1
[ b1 g ] ≡⟨ g1
a1
)
}
}
module _ where
open RawCategory raw
propEqs : {X' : Object}{Y' : Object} (let X , xa , xb = X') (let Y , ya , yb = Y')
(xy : .Arrow X Y) isProp ( [ ya xy ] xa × [ yb xy ] xb)
propEqs xs = propSig (.arrowsAreSets _ _) (\ _ .arrowsAreSets _ _)
arrowEq : {X Y : Object} {f g : Arrow X Y} fst f fst g f g
arrowEq {X} {Y} {f} {g} p = λ i p i , lemPropF propEqs p {snd f} {snd g} i
isAssociative : IsAssociative
isAssociative {f = f , f0 , f1} {g , g0 , g1} {h , h0 , h1} = arrowEq .isAssociative
isIdentity : IsIdentity identity
isIdentity {AA@(A , a0 , a1)} {BB@(B , b0 , b1)} {f , f0 , f1} = arrowEq .leftIdentity , arrowEq .rightIdentity
arrowsAreSets : ArrowsAreSets
arrowsAreSets {X , x0 , x1} {Y , y0 , y1}
= sigPresSet .arrowsAreSets λ a propSet (propEqs _)
isPreCat : IsPreCategory raw
IsPreCategory.isAssociative isPreCat = isAssociative
IsPreCategory.isIdentity isPreCat = isIdentity
IsPreCategory.arrowsAreSets isPreCat = arrowsAreSets
open IsPreCategory isPreCat
module _ {𝕏 𝕐 : Object} where
open Σ 𝕏 renaming (fst to X ; snd to x)
open Σ x renaming (fst to xa ; snd to xb)
open Σ 𝕐 renaming (fst to Y ; snd to y)
open Σ y renaming (fst to ya ; snd to yb)
open import Cat.Equivalence using (composeIso) renaming (_≅_ to _≅_)
-- The proof will be a sequence of isomorphisms between the
-- following 4 types:
T0 = ((X , xa , xb) (Y , ya , yb))
T1 = (Σ[ p (X Y) ] (PathP (λ i .Arrow (p i) 𝒜) xa ya) × (PathP (λ i .Arrow (p i) ) xb yb))
T2 = Σ (X .≊ Y) (λ iso
let p = .isoToId iso
in
( PathP (λ i .Arrow (p i) 𝒜) xa ya)
× PathP (λ i .Arrow (p i) ) xb yb
)
T3 = ((X , xa , xb) (Y , ya , yb))
step0 : T0 T1
step0
= (λ p cong fst p , cong-d (fst snd) p , cong-d (snd snd) p)
-- , (λ x → λ i → fst x i , (fst (snd x) i) , (snd (snd x) i))
, (λ{ (p , q , r) Σ≡ p λ i q i , r i})
, funExt (λ{ p refl})
, funExt (λ{ (p , q , r) refl})
step1 : T1 T2
step1
= symIso
(isoSigFst
{A = (X .≊ Y)}
{B = (X Y)}
(.groupoidObject _ _)
{Q = \ p (PathP (λ i .Arrow (p i) 𝒜) xa ya) × (PathP (λ i .Arrow (p i) ) xb yb)}
.isoToId
(symIso (_ , .asTypeIso {X} {Y}) .snd)
)
step2 : T2 T3
step2
= ( λ{ (iso@(f , f~ , inv-f) , p , q)
( f , sym (.domain-twist-sym iso p) , sym (.domain-twist-sym iso q))
, ( f~ , sym (.domain-twist iso p) , sym (.domain-twist iso q))
, arrowEq (fst inv-f)
, arrowEq (snd inv-f)
}
)
, (λ{ (f , f~ , inv-f , inv-f~)
let
iso : X .≊ Y
iso = fst f , fst f~ , cong fst inv-f , cong fst inv-f~
p : X Y
p = .isoToId iso
pA : .Arrow X 𝒜 .Arrow Y 𝒜
pA = cong (λ x .Arrow x 𝒜) p
pB : .Arrow X .Arrow Y
pB = cong (λ x .Arrow x ) p
k0 = begin
coe pB xb ≡⟨ .coe-dom iso
xb .<<< fst f~ ≡⟨ snd (snd f~)
yb
k1 = begin
coe pA xa ≡⟨ .coe-dom iso
xa .<<< fst f~ ≡⟨ fst (snd f~)
ya
in iso , coe-lem-inv k1 , coe-lem-inv k0})
, funExt (λ x lemSig
(λ x propSig prop0 (λ _ prop1))
_ _
(Σ≡ refl (.propIsomorphism _ _ _)))
, funExt (λ{ (f , _) lemSig propIsomorphism _ _ (Σ≡ refl (propEqs _ _ _))})
where
prop0 : {x} isProp (PathP (λ i .Arrow (.isoToId x i) 𝒜) xa ya)
prop0 {x} = pathJ (λ y p x isProp (PathP (λ i .Arrow (p i) 𝒜) xa x)) (λ x .arrowsAreSets _ _) Y (.isoToId x) ya
prop1 : {x} isProp (PathP (λ i .Arrow (.isoToId x i) ) xb yb)
prop1 {x} = pathJ (λ y p x isProp (PathP (λ i .Arrow (p i) ) xb x)) (λ x .arrowsAreSets _ _) Y (.isoToId x) yb
-- One thing to watch out for here is that the isomorphisms going forwards
-- must compose to give idToIso
iso
: ((X , xa , xb) (Y , ya , yb))
((X , xa , xb) (Y , ya , yb))
iso = step0 step1 step2
where
infixl 5 _⊙_
_⊙_ = composeIso
equiv1
: ((X , xa , xb) (Y , ya , yb))
((X , xa , xb) (Y , ya , yb))
equiv1 = _ , fromIso _ _ (snd iso)
univalent : Univalent
univalent = univalenceFrom≃ equiv1
isCat : IsCategory raw
IsCategory.isPreCategory isCat = isPreCat
IsCategory.univalent isCat = univalent
span : Category _ _
span = record
{ raw = raw
; isCategory = isCat
}


@ -44,7 +44,7 @@ open Cat.Equivalence
-- about these. The laws defined are the types (the propositions) - not the
-- witnesses to them!
record RawCategory (a b : Level) : Set (lsuc (a b)) where
no-eta-equality
-- no-eta-equality
field
Object : Set a
Arrow : Object Object Set b
@ -55,14 +55,7 @@ record RawCategory (a b : Level) : Set (lsuc (a ⊔ b)) where
-- infixl 8 _>>>_
infixl 10 _<<<_ _>>>_
-- | Operations on data
domain : {a b : Object} Arrow a b Object
domain {a} _ = a
codomain : {a b : Object} Arrow a b Object
codomain {b = b} _ = b
-- | Reverse arrow composition
_>>>_ : {A B C : Object} (Arrow A B) (Arrow B C) Arrow A C
f >>> g = g <<< f
@ -91,11 +84,11 @@ record RawCategory (a b : Level) : Set (lsuc (a ⊔ b)) where
_≊_ A B = Σ[ f Arrow A B ] (Isomorphism f)
module _ {A B : Object} where
Epimorphism : {X : Object } (f : Arrow A B) Set b
Epimorphism {X} f = (g₀ g₁ : Arrow B X) g₀ <<< f g₁ <<< f g₀ g₁
Epimorphism : (f : Arrow A B) Set _
Epimorphism f = {X} (g₀ g₁ : Arrow B X) g₀ <<< f g₁ <<< f g₀ g₁
Monomorphism : {X : Object} (f : Arrow A B) Set b
Monomorphism {X} f = (g₀ g₁ : Arrow X A) f <<< g₀ f <<< g₁ g₀ g₁
Monomorphism : (f : Arrow A B) Set _
Monomorphism f = {X} (g₀ g₁ : Arrow X A) f <<< g₀ f <<< g₁ g₀ g₁
IsInitial : Object Set (a b)
IsInitial I = {X : Object} isContr (Arrow I X)
@ -119,7 +112,7 @@ record RawCategory (a b : Level) : Set (lsuc (a ⊔ b)) where
--
-- [HoTT §9.1.4]
idToIso : (A B : Object) A B A B
idToIso A B eq = subst eq (idIso A)
idToIso A B eq = subst {P = λ X A X} eq (idIso A)
Univalent : Set (a b)
Univalent = {A B : Object} isEquiv (A B) (A B) (idToIso A B)
@ -159,7 +152,7 @@ record RawCategory (a b : Level) : Set (lsuc (a ⊔ b)) where
univalenceFrom≅ x = univalenceFrom≃ $ fromIsomorphism _ _ x
propUnivalent : isProp Univalent
propUnivalent a b i = propPi (λ iso propIsContr) a b i
propUnivalent a b i .equiv-proof = propPi (λ iso propIsContr) (a .equiv-proof) (b .equiv-proof) i
module _ {a b : Level} ( : RawCategory a b) where
record IsPreCategory : Set (lsuc (a b)) where
@ -182,7 +175,7 @@ module _ {a b : Level} ( : RawCategory a b) where
-- | Relation between iso- epi- and mono- morphisms.
module _ {A B : Object} {X : Object} (f : Arrow A B) where
iso→epi : Isomorphism f Epimorphism {X = X} f
iso→epi : Isomorphism f Epimorphism f
iso→epi (f- , left-inv , right-inv) g₀ g₁ eq = begin
g₀ ≡⟨ sym rightIdentity
g₀ <<< identity ≡⟨ cong (_<<<_ g₀) (sym right-inv)
@ -193,7 +186,7 @@ module _ {a b : Level} ( : RawCategory a b) where
g₁ <<< identity ≡⟨ rightIdentity
g₁
iso→mono : Isomorphism f Monomorphism {X = X} f
iso→mono : Isomorphism f Monomorphism f
iso→mono (f- , left-inv , right-inv) g₀ g₁ eq =
begin
g₀ ≡⟨ sym leftIdentity
@ -205,7 +198,7 @@ module _ {a b : Level} ( : RawCategory a b) where
identity <<< g₁ ≡⟨ leftIdentity
g₁
iso→epi×mono : Isomorphism f Epimorphism {X = X} f × Monomorphism {X = X} f
iso→epi×mono : Isomorphism f Epimorphism f × Monomorphism f
iso→epi×mono iso = iso→epi iso , iso→mono iso
propIsAssociative : isProp IsAssociative
@ -566,7 +559,6 @@ module _ {a b : Level} ( : RawCategory a b) where
-- projections of `IsCategory` - Here I arbitrarily chose to use this
-- result from `x : IsCategory C`. I don't know what (if any) adverse
-- effects this may have.
module Prop = X.Propositionality
isIdentity= : (λ _ IsIdentity identity) [ X.isIdentity Y.isIdentity ]
isIdentity= = X.propIsIdentity X.isIdentity Y.isIdentity
@ -631,95 +623,3 @@ module _ {a b : Level} ( : Category a b) where
_[_∘_] : {A B C : Object} (g : Arrow B C) (f : Arrow A B) Arrow A C
_[_∘_] = _<<<_
-- | The opposite category
--
-- The opposite category is the category where the direction of the arrows are
-- flipped.
module Opposite {a b : Level} where
module _ ( : Category a b) where
private
module _ where
module = Category
opRaw : RawCategory a b
RawCategory.Object opRaw = .Object
RawCategory.Arrow opRaw = flip .Arrow
RawCategory.identity opRaw = .identity
RawCategory._<<<_ opRaw = ._>>>_
open RawCategory opRaw
isPreCategory : IsPreCategory opRaw
IsPreCategory.isAssociative isPreCategory = sym .isAssociative
IsPreCategory.isIdentity isPreCategory = swap .isIdentity
IsPreCategory.arrowsAreSets isPreCategory = .arrowsAreSets
open IsPreCategory isPreCategory
module _ {A B : .Object} where
open Σ (toIso _ _ (.univalent {A} {B}))
renaming (fst to idToIso* ; snd to inv*)
open AreInverses {f = .idToIso A B} {idToIso*} inv*
shuffle : A B A .≊ B
shuffle (f , g , inv) = g , f , inv
shuffle~ : A .≊ B A B
shuffle~ (f , g , inv) = g , f , inv
lem : (p : A B) idToIso A B p shuffle~ (.idToIso A B p)
lem p = isoEq refl
isoToId* : A B A B
isoToId* = idToIso* shuffle
inv : AreInverses (idToIso A B) isoToId*
inv =
( funExt (λ x begin
(isoToId* idToIso A B) x
≡⟨⟩
(idToIso* shuffle idToIso A B) x
≡⟨ cong (λ φ φ x) (cong (λ φ idToIso* shuffle φ) (funExt lem))
(idToIso* shuffle shuffle~ .idToIso A B) x
≡⟨⟩
(idToIso* .idToIso A B) x
≡⟨ (λ i verso-recto i x)
x )
, funExt (λ x begin
(idToIso A B idToIso* shuffle) x
≡⟨ cong (λ φ φ x) (cong (λ φ φ idToIso* shuffle) (funExt lem))
(shuffle~ .idToIso A B idToIso* shuffle) x
≡⟨ cong (λ φ φ x) (cong (λ φ shuffle~ φ shuffle) recto-verso)
(shuffle~ shuffle) x
≡⟨⟩
x )
)
isCategory : IsCategory opRaw
IsCategory.isPreCategory isCategory = isPreCategory
IsCategory.univalent isCategory
= univalenceFromIsomorphism (isoToId* , inv)
opposite : Category a b
Category.raw opposite = opRaw
Category.isCategory opposite = isCategory
-- As demonstrated here, a side-effect of having no-eta-equality on constructors
-- is that we need to pick things apart to show that they are indeed
-- definitionally equal. I.e. a thing that would normally be provable in one
-- line now takes 13! Admittedly it's a simple proof.
module _ { : Category a b} where
open Category
private
-- Since they really are definitionally equal we just need to pick apart
-- the data-type.
rawInv : Category.raw (opposite (opposite )) raw
RawCategory.Object (rawInv _) = Object
RawCategory.Arrow (rawInv _) = Arrow
RawCategory.identity (rawInv _) = identity
RawCategory._<<<_ (rawInv _) = _<<<_
oppositeIsInvolution : opposite (opposite )
oppositeIsInvolution = Category≡ rawInv
open Opposite public


@ -28,186 +28,213 @@ import Cat.Category.Monad.Monoidal
import Cat.Category.Monad.Kleisli
open import Cat.Categories.Fun
module Monoidal = Cat.Category.Monad.Monoidal
module Kleisli = Cat.Category.Monad.Kleisli
-- | The monoidal and Kleisli presentations of monads are equivalent.
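-- As a rough, standalone illustration of the two presentations being related
-- (specialised to `Set`, with all laws omitted; the `MonoidalSketch` and
-- `KleisliSketch` names are hypothetical and not part of this library): the
-- monoidal side carries `fmap`/`pure`/`join`, the Kleisli side carries
-- `pure`/`bind`, and each determines the other.
record MonoidalSketch (T : Set → Set) : Set₁ where
  field
    fmap : {A B : Set} → (A → B) → T A → T B
    pure : {A : Set} → A → T A
    join : {A : Set} → T (T A) → T A

  -- `bind` is derived from `join` and `fmap`.
  bind : {A B : Set} → (A → T B) → T A → T B
  bind f x = join (fmap f x)

record KleisliSketch (T : Set → Set) : Set₁ where
  field
    pure : {A : Set} → A → T A
    bind : {A B : Set} → (A → T B) → T A → T B

  -- `fmap` and `join` are derived from `bind` and `pure`.
  fmap : {A B : Set} → (A → B) → T A → T B
  fmap f = bind (λ a → pure (f a))

  join : {A : Set} → T (T A) → T A
  join = bind (λ x → x)
-- The proof below shows that, once the monad laws are taken into account,
-- these two translations are inverse to each other.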
module _ {a b : Level} ( : Category a b) where
open Cat.Category.NaturalTransformation using (NaturalTransformation ; propIsNatural)
private
module = Category
open using (Object ; Arrow ; identity ; _<<<_ ; _>>>_)
module M = Monoidal
module K = Kleisli
module _ (m : M.RawMonad) where
open M.RawMonad m
module Monoidal = Cat.Category.Monad.Monoidal
module Kleisli = Cat.Category.Monad.Kleisli
forthRaw : K.RawMonad
K.RawMonad.omap forthRaw = Romap
K.RawMonad.pure forthRaw = pureT _
K.RawMonad.bind forthRaw = bind
module _ (m : Monoidal.RawMonad) where
open Monoidal.RawMonad m
module _ {raw : M.RawMonad} (m : M.IsMonad raw) where
private
module MI = M.IsMonad m
forthIsMonad : K.IsMonad (forthRaw raw)
K.IsMonad.isIdentity forthIsMonad = snd MI.isInverse
K.IsMonad.isNatural forthIsMonad = MI.isNatural
K.IsMonad.isDistributive forthIsMonad = MI.isDistributive
toKleisliRaw : Kleisli.RawMonad
Kleisli.RawMonad.omap toKleisliRaw = Romap
Kleisli.RawMonad.pure toKleisliRaw = pure
Kleisli.RawMonad.bind toKleisliRaw = bind
forth : M.Monad K.Monad
Kleisli.Monad.raw (forth m) = forthRaw (M.Monad.raw m)
Kleisli.Monad.isMonad (forth m) = forthIsMonad (M.Monad.isMonad m)
module _ {raw : Monoidal.RawMonad} (m : Monoidal.IsMonad raw) where
open Monoidal.IsMonad m
module _ (m : K.Monad) where
open K.Monad m
open Kleisli.RawMonad (toKleisliRaw raw) using (_>=>_)
toKleisliIsMonad : Kleisli.IsMonad (toKleisliRaw raw)
Kleisli.IsMonad.isIdentity toKleisliIsMonad = begin
bind pure ≡⟨⟩
join <<< (fmap pure) ≡⟨ snd isInverse
identity
Kleisli.IsMonad.isNatural toKleisliIsMonad f = begin
pure >=> f ≡⟨⟩
pure >>> bind f ≡⟨⟩
bind f <<< pure ≡⟨⟩
(join <<< fmap f) <<< pure ≡⟨ isNatural f
f
Kleisli.IsMonad.isDistributive toKleisliIsMonad f g = begin
bind g >>> bind f ≡⟨⟩
(join <<< fmap f) <<< (join <<< fmap g) ≡⟨ isDistributive f g
join <<< fmap (join <<< fmap f <<< g) ≡⟨⟩
bind (g >=> f)
-- Kleisli.IsMonad.isDistributive toKleisliIsMonad = isDistributive
backRaw : M.RawMonad
M.RawMonad.R backRaw = R
M.RawMonad.pureNT backRaw = pureNT
M.RawMonad.joinNT backRaw = joinNT
toKleisli : Monoidal.Monad Kleisli.Monad
Kleisli.Monad.raw (toKleisli m) = toKleisliRaw (Monoidal.Monad.raw m)
Kleisli.Monad.isMonad (toKleisli m) = toKleisliIsMonad (Monoidal.Monad.isMonad m)
private
open M.RawMonad backRaw renaming
( join to join*
; pure to pure*
; bind to bind*
; fmap to fmap*
)
module R = Functor (M.RawMonad.R backRaw)
module _ (m : Kleisli.Monad) where
open Kleisli.Monad m
backIsMonad : M.IsMonad backRaw
M.IsMonad.isAssociative backIsMonad = begin
join* <<< R.fmap join* ≡⟨⟩
toMonoidalRaw : Monoidal.RawMonad
Monoidal.RawMonad.R toMonoidalRaw = R
Monoidal.RawMonad.pureNT toMonoidalRaw = pureNT
Monoidal.RawMonad.joinNT toMonoidalRaw = joinNT
open Monoidal.RawMonad toMonoidalRaw renaming
( join to join*
; pure to pure*
; bind to bind*
; fmap to fmap*
) using ()
toMonoidalIsMonad : Monoidal.IsMonad toMonoidalRaw
Monoidal.IsMonad.isAssociative toMonoidalIsMonad = begin
join* <<< fmap join* ≡⟨⟩
join <<< fmap join ≡⟨ isNaturalForeign
join <<< join
M.IsMonad.isInverse backIsMonad {X} = inv-l , inv-r
Monoidal.IsMonad.isInverse toMonoidalIsMonad {X} = inv-l , inv-r
where
inv-l = begin
join <<< pure ≡⟨ fst isInverse
identity
inv-r = begin
joinT X <<< R.fmap (pureT X) ≡⟨⟩
join* <<< fmap* pure* ≡⟨⟩
join <<< fmap pure ≡⟨ snd isInverse
identity
back : K.Monad M.Monad
Monoidal.Monad.raw (back m) = backRaw m
Monoidal.Monad.isMonad (back m) = backIsMonad m
toMonoidal : Kleisli.Monad Monoidal.Monad
Monoidal.Monad.raw (toMonoidal m) = toMonoidalRaw m
Monoidal.Monad.isMonad (toMonoidal m) = toMonoidalIsMonad m
module _ (m : K.Monad) where
module _ (m : Kleisli.Monad) where
private
open K.Monad m
open Kleisli.Monad m
bindEq : {X Y}
K.RawMonad.bind (forthRaw (backRaw m)) {X} {Y}
K.RawMonad.bind (K.Monad.raw m)
bindEq {X} {Y} = begin
K.RawMonad.bind (forthRaw (backRaw m)) ≡⟨⟩
(λ f join <<< fmap f) ≡⟨⟩
(λ f bind (f >>> pure) >>> bind identity) ≡⟨ funExt lem
(λ f bind f) ≡⟨⟩
bind
Kleisli.RawMonad.bind (toKleisliRaw (toMonoidalRaw m)) {X} {Y}
bind
bindEq {X} {Y} = funExt lem
where
lem : (f : Arrow X (omap Y))
bind (f >>> pure) >>> bind identity
bind f
lem f = begin
join <<< fmap f
≡⟨⟩
bind (f >>> pure) >>> bind identity
≡⟨ isDistributive _ _
bind ((f >>> pure) >=> identity)
≡⟨⟩
bind ((f >>> pure) >>> bind identity)
≡⟨ cong bind .isAssociative
bind (f >>> (pure >>> bind identity))
≡⟨⟩
bind (f >>> (pure >=> identity))
≡⟨ cong (λ φ bind (f >>> φ)) (isNatural _)
bind (f >>> identity)
≡⟨ cong bind .leftIdentity
bind f
forthRawEq : forthRaw (backRaw m) K.Monad.raw m
K.RawMonad.omap (forthRawEq _) = omap
K.RawMonad.pure (forthRawEq _) = pure
K.RawMonad.bind (forthRawEq i) = bindEq i
toKleisliRawEq : toKleisliRaw (toMonoidalRaw m) Kleisli.Monad.raw m
Kleisli.RawMonad.omap (toKleisliRawEq i) = (begin
Kleisli.RawMonad.omap (toKleisliRaw (toMonoidalRaw m)) ≡⟨⟩
Monoidal.RawMonad.Romap (toMonoidalRaw m) ≡⟨⟩
omap
) i
Kleisli.RawMonad.pure (toKleisliRawEq i) = (begin
Kleisli.RawMonad.pure (toKleisliRaw (toMonoidalRaw m)) ≡⟨⟩
Monoidal.RawMonad.pure (toMonoidalRaw m) ≡⟨⟩
pure
) i
Kleisli.RawMonad.bind (toKleisliRawEq i) = bindEq i
fortheq : (m : K.Monad) forth (back m) m
fortheq m = K.Monad≡ (forthRawEq m)
toKleislieq : (m : Kleisli.Monad) toKleisli (toMonoidal m) m
toKleislieq m = Kleisli.Monad≡ (toKleisliRawEq m)
module _ (m : M.Monad) where
module _ (m : Monoidal.Monad) where
private
open M.Monad m
module KM = K.Monad (forth m)
open Monoidal.Monad m
-- module KM = Kleisli.Monad (toKleisli m)
open Kleisli.Monad (toKleisli m) renaming
( bind to bind* ; omap to omap* ; join to join*
; fmap to fmap* ; pure to pure* ; R to R*)
using ()
module R = Functor R
omapEq : KM.omap Romap
omapEq : omap* Romap
omapEq = refl
bindEq : {X Y} {f : Arrow X (Romap Y)} KM.bind f bind f
bindEq : {X Y} {f : Arrow X (Romap Y)} bind* f bind f
bindEq {X} {Y} {f} = begin
KM.bind f ≡⟨⟩
joinT Y <<< fmap f ≡⟨⟩
bind f
bind* f ≡⟨⟩
join <<< fmap f ≡⟨⟩
bind f
joinEq : {X} KM.join joinT X
joinEq : {X} join* joinT X
joinEq {X} = begin
KM.join ≡⟨⟩
KM.bind identity ≡⟨⟩
bind identity ≡⟨⟩
joinT X <<< fmap identity ≡⟨ cong (λ φ _ <<< φ) R.isIdentity
joinT X <<< identity ≡⟨ .rightIdentity
joinT X
join* ≡⟨⟩
bind* identity ≡⟨⟩
bind identity ≡⟨⟩
join <<< fmap identity ≡⟨ cong (λ φ _ <<< φ) R.isIdentity
join <<< identity ≡⟨ .rightIdentity
join
fmapEq : {A B} KM.fmap {A} {B} fmap
fmapEq : {A B} fmap* {A} {B} fmap
fmapEq {A} {B} = funExt (λ f begin
KM.fmap f ≡⟨⟩
KM.bind (f >>> KM.pure) ≡⟨⟩
bind (f >>> pureT _) ≡⟨⟩
fmap (f >>> pureT B) >>> joinT B ≡⟨⟩
fmap (f >>> pureT B) >>> joinT B ≡⟨ cong (λ φ φ >>> joinT B) R.isDistributive
fmap f >>> fmap (pureT B) >>> joinT B ≡⟨ .isAssociative
joinT B <<< fmap (pureT B) <<< fmap f ≡⟨ cong (λ φ φ <<< fmap f) (snd isInverse)
identity <<< fmap f ≡⟨ .leftIdentity
fmap f
fmap* f ≡⟨⟩
bind* (f >>> pure*) ≡⟨⟩
bind (f >>> pure) ≡⟨⟩
fmap (f >>> pure) >>> join ≡⟨⟩
fmap (f >>> pure) >>> join ≡⟨ cong (λ φ φ >>> joinT B) R.isDistributive
fmap f >>> fmap pure >>> join ≡⟨ .isAssociative
join <<< fmap pure <<< fmap f ≡⟨ cong (λ φ φ <<< fmap f) (snd isInverse)
identity <<< fmap f ≡⟨ .leftIdentity
fmap f
)
rawEq : Functor.raw KM.R Functor.raw R
rawEq : Functor.raw R* Functor.raw R
RawFunctor.omap (rawEq i) = omapEq i
RawFunctor.fmap (rawEq i) = fmapEq i
Req : M.RawMonad.R (backRaw (forth m)) R
Req : Monoidal.RawMonad.R (toMonoidalRaw (toKleisli m)) R
Req = Functor≡ rawEq
pureTEq : M.RawMonad.pureT (backRaw (forth m)) pureT
pureTEq = funExt (λ X refl)
pureTEq : Monoidal.RawMonad.pureT (toMonoidalRaw (toKleisli m)) pureT
pureTEq = refl
pureNTEq : (λ i NaturalTransformation Functors.identity (Req i))
[ M.RawMonad.pureNT (backRaw (forth m)) pureNT ]
[ Monoidal.RawMonad.pureNT (toMonoidalRaw (toKleisli m)) pureNT ]
pureNTEq = lemSigP (λ i propIsNatural Functors.identity (Req i)) _ _ pureTEq
joinTEq : M.RawMonad.joinT (backRaw (forth m)) joinT
joinTEq : Monoidal.RawMonad.joinT (toMonoidalRaw (toKleisli m)) joinT
joinTEq = funExt (λ X begin
M.RawMonad.joinT (backRaw (forth m)) X ≡⟨⟩
KM.join ≡⟨⟩
joinT X <<< fmap identity ≡⟨ cong (λ φ joinT X <<< φ) R.isIdentity
joinT X <<< identity ≡⟨ .rightIdentity
joinT X )
Monoidal.RawMonad.joinT (toMonoidalRaw (toKleisli m)) X ≡⟨⟩
join* ≡⟨⟩
join <<< fmap identity ≡⟨ cong (λ φ join <<< φ) R.isIdentity
join <<< identity ≡⟨ .rightIdentity
join )
joinNTEq : (λ i NaturalTransformation F[ Req i Req i ] (Req i))
[ M.RawMonad.joinNT (backRaw (forth m)) joinNT ]
[ Monoidal.RawMonad.joinNT (toMonoidalRaw (toKleisli m)) joinNT ]
joinNTEq = lemSigP (λ i propIsNatural F[ Req i Req i ] (Req i)) _ _ joinTEq
backRawEq : backRaw (forth m) M.Monad.raw m
M.RawMonad.R (backRawEq i) = Req i
M.RawMonad.pureNT (backRawEq i) = pureNTEq i
M.RawMonad.joinNT (backRawEq i) = joinNTEq i
toMonoidalRawEq : toMonoidalRaw (toKleisli m) Monoidal.Monad.raw m
Monoidal.RawMonad.R (toMonoidalRawEq i) = Req i
Monoidal.RawMonad.pureNT (toMonoidalRawEq i) = pureNTEq i
Monoidal.RawMonad.joinNT (toMonoidalRawEq i) = joinNTEq i
backeq : (m : M.Monad) back (forth m) m
backeq m = M.Monad≡ (backRawEq m)
eqv : isEquiv M.Monad K.Monad forth
eqv = gradLemma forth back fortheq backeq
toMonoidaleq : (m : Monoidal.Monad) toMonoidal (toKleisli m) m
toMonoidaleq m = Monoidal.Monad≡ (toMonoidalRawEq m)
open import Cat.Equivalence
Monoidal≊Kleisli : M.Monad K.Monad
Monoidal≊Kleisli = forth , back , funExt backeq , funExt fortheq
Monoidal≊Kleisli : Monoidal.Monad Kleisli.Monad
Monoidal≊Kleisli = toKleisli , toMonoidal , funExt toMonoidaleq , funExt toKleislieq
Monoidal≡Kleisli : M.Monad K.Monad
Monoidal≡Kleisli : Monoidal.Monad Kleisli.Monad
Monoidal≡Kleisli = isoToPath Monoidal≊Kleisli
grpdKleisli : isGrpd Kleisli.Monad
grpdKleisli = Kleisli.grpdMonad
grpdMonoidal : isGrpd Monoidal.Monad
grpdMonoidal = subst {P = isGrpd}
(sym Monoidal≡Kleisli) grpdKleisli


@ -5,6 +5,7 @@ The Kleisli formulation of monads
open import Agda.Primitive
open import Cat.Prelude
open import Cat.Equivalence
open import Cat.Category
open import Cat.Category.Functor as F
@ -230,6 +231,7 @@ record IsMonad (raw : RawMonad) : Set where
m
record Monad : Set where
no-eta-equality
field
raw : RawMonad
isMonad : IsMonad raw
@ -264,3 +266,82 @@ module _ {m n : Monad} (eq : Monad.raw m ≡ Monad.raw n) where
Monad≡ : m n
Monad.raw (Monad≡ i) = eq i
Monad.isMonad (Monad≡ i) = eqIsMonad i
module _ where
private
module _ (x y : RawMonad) (p q : x y) (a b : p q) where
eq0-helper : isGrpd (Object Object)
eq0-helper = grpdPi (λ a .groupoidObject)
eq0 : cong (cong RawMonad.omap) a cong (cong RawMonad.omap) b
eq0 = eq0-helper
(RawMonad.omap x) (RawMonad.omap y)
(cong RawMonad.omap p) (cong RawMonad.omap q)
(cong (cong RawMonad.omap) a) (cong (cong RawMonad.omap) b)
eq1-helper : (omap : Object Object) isGrpd ({X : Object} [ X , omap X ])
eq1-helper f = grpdPiImpl (setGrpd .arrowsAreSets)
postulate
eq1 : PathP (λ i PathP
(λ j
PathP (λ k {X : Object} [ X , eq0 i j k X ])
(RawMonad.pure x) (RawMonad.pure y))
(λ i RawMonad.pure (p i)) (λ i RawMonad.pure (q i)))
(cong-d (cong-d RawMonad.pure) a) (cong-d (cong-d RawMonad.pure) b)
RawMonad' : Set _
RawMonad' = Σ (Object Object) (λ omap
({X : Object} [ X , omap X ])
× ({X Y : Object} [ X , omap Y ] [ omap X , omap Y ])
)
grpdRawMonad' : isGrpd RawMonad'
grpdRawMonad' = grpdSig (grpdPi (λ _ .groupoidObject)) λ _ grpdSig (grpdPiImpl (setGrpd .arrowsAreSets)) (λ _ grpdPiImpl (grpdPiImpl (grpdPi (λ _ setGrpd .arrowsAreSets))))
toRawMonad : RawMonad' RawMonad
RawMonad.omap (toRawMonad (a , b , c)) = a
RawMonad.pure (toRawMonad (a , b , c)) = b
RawMonad.bind (toRawMonad (a , b , c)) = c
IsMonad' : RawMonad' Set _
IsMonad' raw = M.IsIdentity × M.IsNatural × M.IsDistributive
where
module M = RawMonad (toRawMonad raw)
grpdIsMonad' : (m : RawMonad') isGrpd (IsMonad' m)
grpdIsMonad' m = grpdSig (propGrpd (propIsIdentity (toRawMonad m)))
λ _ grpdSig (propGrpd (propIsNatural (toRawMonad m)))
λ _ propGrpd (propIsDistributive (toRawMonad m))
Monad' = Σ RawMonad' IsMonad'
grpdMonad' = grpdSig grpdRawMonad' grpdIsMonad'
toMonad : Monad' Monad
Monad.raw (toMonad x) = toRawMonad (fst x)
isIdentity (Monad.isMonad (toMonad x)) = fst (snd x)
isNatural (Monad.isMonad (toMonad x)) = fst (snd (snd x))
isDistributive (Monad.isMonad (toMonad x)) = snd (snd (snd x))
fromMonad : Monad Monad'
fromMonad m = (M.omap , M.pure , M.bind)
, M.isIdentity , M.isNatural , M.isDistributive
where
module M = Monad m
e : Monad' Monad
e = fromIsomorphism _ _ (toMonad , fromMonad , (funExt λ _ refl) , funExt eta-refl)
where
-- Monads don't have eta-equality
eta-refl : (x : Monad) toMonad (fromMonad x) x
eta-refl =
(λ x λ
{ i .Monad.raw Monad.raw x
; i .Monad.isMonad Monad.isMonad x}
)
grpdMonad : isGrpd Monad
grpdMonad = equivPreservesNType
{n = (S (S (S ⟨-2⟩)))}
e grpdMonad'
where
open import Cubical.NType


@ -18,7 +18,7 @@ private
open Category using (Object ; Arrow ; identity ; _<<<_)
open import Cat.Category.NaturalTransformation
using (NaturalTransformation ; Transformation ; Natural)
using (NaturalTransformation ; Transformation ; Natural ; NaturalTransformation≡)
record RawMonad : Set where
field
@ -78,15 +78,39 @@ record IsMonad (raw : RawMonad) : Set where
isNatural : IsNatural
isNatural {X} {Y} f = begin
joinT Y <<< R.fmap f <<< pureT X ≡⟨ sym .isAssociative
joinT Y <<< (R.fmap f <<< pureT X) ≡⟨ cong (λ φ joinT Y <<< φ) (sym (pureN f))
joinT Y <<< (pureT (R.omap Y) <<< f) ≡⟨ .isAssociative
joinT Y <<< pureT (R.omap Y) <<< f ≡⟨ cong (λ φ φ <<< f) (fst isInverse)
identity <<< f ≡⟨ .leftIdentity
f
join <<< fmap f <<< pure ≡⟨ sym .isAssociative
join <<< (fmap f <<< pure) ≡⟨ cong (λ φ join <<< φ) (sym (pureN f))
join <<< (pure <<< f) ≡⟨ .isAssociative
join <<< pure <<< f ≡⟨ cong (λ φ φ <<< f) (fst isInverse)
identity <<< f ≡⟨ .leftIdentity
f
isDistributive : IsDistributive
isDistributive {X} {Y} {Z} g f = sym aux
isDistributive {X} {Y} {Z} g f = begin
join <<< fmap g <<< (join <<< fmap f)
≡⟨ Category.isAssociative
join <<< fmap g <<< join <<< fmap f
≡⟨ cong (_<<< fmap f) (sym .isAssociative)
(join <<< (fmap g <<< join)) <<< fmap f
≡⟨ cong (λ φ φ <<< fmap f) (cong (_<<<_ join) (sym (joinN g)))
(join <<< (join <<< R².fmap g)) <<< fmap f
≡⟨ cong (_<<< fmap f) .isAssociative
((join <<< join) <<< R².fmap g) <<< fmap f
≡⟨⟩
join <<< join <<< R².fmap g <<< fmap f
≡⟨ sym .isAssociative
(join <<< join) <<< (R².fmap g <<< fmap f)
≡⟨ cong (λ φ φ <<< (R².fmap g <<< fmap f)) (sym isAssociative)
(join <<< fmap join) <<< (R².fmap g <<< fmap f)
≡⟨ sym .isAssociative
join <<< (fmap join <<< (R².fmap g <<< fmap f))
≡⟨ cong (_<<<_ join) .isAssociative
join <<< (fmap join <<< R².fmap g <<< fmap f)
≡⟨⟩
join <<< (fmap join <<< fmap (fmap g) <<< fmap f)
≡⟨ cong (λ φ join <<< φ) (sym distrib3)
join <<< fmap (join <<< fmap g <<< f)
where
module R² = Functor F[ R R ]
distrib3 : {A B C D} {a : Arrow C D} {b : Arrow B C} {c : Arrow A B}
@ -96,33 +120,9 @@ record IsMonad (raw : RawMonad) : Set where
R.fmap (a <<< b <<< c) ≡⟨ R.isDistributive
R.fmap (a <<< b) <<< R.fmap c ≡⟨ cong (_<<< _) R.isDistributive
R.fmap a <<< R.fmap b <<< R.fmap c
aux = begin
joinT Z <<< R.fmap (joinT Z <<< R.fmap g <<< f)
≡⟨ cong (λ φ joinT Z <<< φ) distrib3
joinT Z <<< (R.fmap (joinT Z) <<< R.fmap (R.fmap g) <<< R.fmap f)
≡⟨⟩
joinT Z <<< (R.fmap (joinT Z) <<< R².fmap g <<< R.fmap f)
≡⟨ cong (_<<<_ (joinT Z)) (sym .isAssociative)
joinT Z <<< (R.fmap (joinT Z) <<< (R².fmap g <<< R.fmap f))
≡⟨ .isAssociative
(joinT Z <<< R.fmap (joinT Z)) <<< (R².fmap g <<< R.fmap f)
≡⟨ cong (λ φ φ <<< (R².fmap g <<< R.fmap f)) isAssociative
(joinT Z <<< joinT (R.omap Z)) <<< (R².fmap g <<< R.fmap f)
≡⟨ .isAssociative
joinT Z <<< joinT (R.omap Z) <<< R².fmap g <<< R.fmap f
≡⟨⟩
((joinT Z <<< joinT (R.omap Z)) <<< R².fmap g) <<< R.fmap f
≡⟨ cong (_<<< R.fmap f) (sym .isAssociative)
(joinT Z <<< (joinT (R.omap Z) <<< R².fmap g)) <<< R.fmap f
≡⟨ cong (λ φ φ <<< R.fmap f) (cong (_<<<_ (joinT Z)) (joinN g))
(joinT Z <<< (R.fmap g <<< joinT Y)) <<< R.fmap f
≡⟨ cong (_<<< R.fmap f) .isAssociative
joinT Z <<< R.fmap g <<< joinT Y <<< R.fmap f
≡⟨ sym (Category.isAssociative )
joinT Z <<< R.fmap g <<< (joinT Y <<< R.fmap f)
record Monad : Set where
no-eta-equality
field
raw : RawMonad
isMonad : IsMonad raw


@ -1,15 +1,18 @@
{-
This module provides construction 2.3 in [voe]
-}
{-# OPTIONS --cubical --caching #-}
{-# OPTIONS --cubical #-}
module Cat.Category.Monad.Voevodsky where
open import Cat.Prelude
open import Cat.Equivalence
open import Cat.Category
open import Cat.Category.Functor as F
import Cat.Category.NaturalTransformation
open import Cat.Category.Monad
import Cat.Category.Monad.Monoidal as Monoidal
import Cat.Category.Monad.Kleisli as Kleisli
open import Cat.Categories.Fun
open import Cat.Equivalence
@ -24,6 +27,7 @@ module voe {a b : Level} ( : Category a b) where
module §2-3 (omap : Object Object) (pure : {X : Object} Arrow X (omap X)) where
record §1 : Set where
no-eta-equality
open M
field
@ -74,12 +78,11 @@ module voe {a b : Level} ( : Category a b) where
isMonad : IsMonad rawMnd
toMonad : Monad
toMonad = record
{ raw = rawMnd
; isMonad = isMonad
}
toMonad .Monad.raw = rawMnd
toMonad .Monad.isMonad = isMonad
record §2 : Set where
no-eta-equality
open K
field
@ -96,28 +99,24 @@ module voe {a b : Level} ( : Category a b) where
isMonad : IsMonad rawMnd
toMonad : Monad
toMonad = record
{ raw = rawMnd
; isMonad = isMonad
}
toMonad .Monad.raw = rawMnd
toMonad .Monad.isMonad = isMonad
§1-fromMonad : (m : M.Monad) §2-3.§1 (M.Monad.Romap m) (λ {X} M.Monad.pureT m X)
§1-fromMonad m = record
{ fmap = Functor.fmap R
; RisFunctor = Functor.isFunctor R
; pureN = pureN
; join = λ {X} joinT X
; joinN = joinN
; isMonad = M.Monad.isMonad m
}
where
module _ (m : M.Monad) where
open M.Monad m
§1-fromMonad : §2-3.§1 (M.Monad.Romap m) (λ {X} M.Monad.pureT m X)
§1-fromMonad .§2-3.§1.fmap = Functor.fmap R
§1-fromMonad .§2-3.§1.RisFunctor = Functor.isFunctor R
§1-fromMonad .§2-3.§1.pureN = pureN
§1-fromMonad .§2-3.§1.join {X} = joinT X
§1-fromMonad .§2-3.§1.joinN = joinN
§1-fromMonad .§2-3.§1.isMonad = M.Monad.isMonad m
§2-fromMonad : (m : K.Monad) §2-3.§2 (K.Monad.omap m) (K.Monad.pure m)
§2-fromMonad m = record
{ bind = K.Monad.bind m
; isMonad = K.Monad.isMonad m
}
§2-fromMonad m .§2-3.§2.bind = K.Monad.bind m
§2-fromMonad m .§2-3.§2.isMonad = K.Monad.isMonad m
-- | In the following we seek to transform the equivalence `Monoidal≃Kleisli`
-- | to talk about Voevodsky's construction.
@ -145,67 +144,103 @@ module voe {a b : Level} ( : Category a b) where
forthEq : m (forth back) m m
forthEq m = begin
(forth back) m ≡⟨⟩
-- In full gory detail:
( §2-fromMonad
Monoidal→Kleisli
§2-3.§1.toMonad
§1-fromMonad
Kleisli→Monoidal
§2-3.§2.toMonad
) m ≡⟨⟩ -- fromMonad and toMonad are inverses
( §2-fromMonad
Monoidal→Kleisli
Kleisli→Monoidal
§2-3.§2.toMonad
) m ≡⟨ cong (λ φ φ m) t
-- Monoidal→Kleisli and Kleisli→Monoidal are inverses
-- I should be able to prove this using congruence and `lem` below.
( §2-fromMonad
§2-3.§2.toMonad
) m ≡⟨⟩
( §2-fromMonad
§2-3.§2.toMonad
) m ≡⟨⟩ -- fromMonad and toMonad are inverses
m
§2-fromMonad
(Monoidal→Kleisli
(§2-3.§1.toMonad
(§1-fromMonad (Kleisli→Monoidal (§2-3.§2.toMonad m)))))
≡⟨ cong-d (§2-fromMonad Monoidal→Kleisli) (lemmaz (Kleisli→Monoidal (§2-3.§2.toMonad m)))
§2-fromMonad
((Monoidal→Kleisli Kleisli→Monoidal)
(§2-3.§2.toMonad m))
-- Below is the fully normalized goal and context with
-- `funExt` made abstract.
--
-- Goal: PathP (λ _ → §2-3.§2 omap (λ {z} → pure))
-- (§2-fromMonad
-- (.Cat.Category.Monad.toKleisli
-- (.Cat.Category.Monad.toMonoidal (§2-3.§2.toMonad m))))
-- (§2-fromMonad (§2-3.§2.toMonad m))
-- Have: PathP
-- (λ i →
-- §2-3.§2 K.IsMonad.omap
-- (K.RawMonad.pure
-- (K.Monad.raw
-- (funExt (λ m₁ → K.Monad≡ (.Cat.Category.Monad.toKleisliRawEq m₁))
-- i (§2-3.§2.toMonad m)))))
-- (§2-fromMonad
-- (.Cat.Category.Monad.toKleisli
-- (.Cat.Category.Monad.toMonoidal (§2-3.§2.toMonad m))))
-- (§2-fromMonad (§2-3.§2.toMonad m))
≡⟨ ( cong-d {x = Monoidal→Kleisli Kleisli→Monoidal} {y = idFun K.Monad} (\ φ §2-fromMonad (φ (§2-3.§2.toMonad m))) re-ve)
(§2-fromMonad §2-3.§2.toMonad) m
≡⟨ lemma
m
where
t' : ((Monoidal→Kleisli Kleisli→Monoidal) §2-3.§2.toMonad {omap} {pure})
§2-3.§2.toMonad
t' = cong (\ φ φ §2-3.§2.toMonad) re-ve
t : (§2-fromMonad (Monoidal→Kleisli Kleisli→Monoidal) §2-3.§2.toMonad {omap} {pure})
(§2-fromMonad §2-3.§2.toMonad)
t = cong-d (\ f §2-fromMonad f) t'
u : (§2-fromMonad (Monoidal→Kleisli Kleisli→Monoidal) §2-3.§2.toMonad) m
(§2-fromMonad §2-3.§2.toMonad) m
u = cong (\ φ φ m) t
lemma : (§2-fromMonad §2-3.§2.toMonad) m m
§2-3.§2.bind (lemma i) = §2-3.§2.bind m
§2-3.§2.isMonad (lemma i) = §2-3.§2.isMonad m
lemmaz : m §2-3.§1.toMonad (§1-fromMonad m) m
M.Monad.raw (lemmaz m i) = M.Monad.raw m
M.Monad.isMonad (lemmaz m i) = M.Monad.isMonad m
backEq : m (back forth) m m
backEq m = begin
(back forth) m ≡⟨⟩
( §1-fromMonad
Kleisli→Monoidal
§2-3.§2.toMonad
§2-fromMonad
Monoidal→Kleisli
§2-3.§1.toMonad
) m ≡⟨⟩ -- fromMonad and toMonad are inverses
( §1-fromMonad
Kleisli→Monoidal
Monoidal→Kleisli
§2-3.§1.toMonad
) m ≡⟨ cong (λ φ φ m) t -- Monoidal→Kleisli and Kleisli→Monoidal are inverses
( §1-fromMonad
§2-3.§1.toMonad
) m ≡⟨⟩ -- fromMonad and toMonad are inverses
§1-fromMonad
(Kleisli→Monoidal
(§2-3.§2.toMonad
(§2-fromMonad (Monoidal→Kleisli (§2-3.§1.toMonad m)))))
≡⟨ cong-d (§1-fromMonad Kleisli→Monoidal) (lemma (Monoidal→Kleisli (§2-3.§1.toMonad m)))
§1-fromMonad
((Kleisli→Monoidal Monoidal→Kleisli)
(§2-3.§1.toMonad m))
-- Below is the fully normalized `agda2-goal-and-context`
-- with `funExt` made abstract.
--
-- Goal: PathP (λ _ → §2-3.§1 omap (λ {X} → pure))
-- (§1-fromMonad
-- (.Cat.Category.Monad.toMonoidal
-- (.Cat.Category.Monad.toKleisli (§2-3.§1.toMonad m))))
-- (§1-fromMonad (§2-3.§1.toMonad m))
-- Have: PathP
-- (λ i →
-- §2-3.§1
-- (RawFunctor.omap
-- (Functor.raw
-- (M.RawMonad.R
-- (M.Monad.raw
-- (funExt
-- (λ m₁ → M.Monad≡ (.Cat.Category.Monad.toMonoidalRawEq m₁)) i
-- (§2-3.§1.toMonad m))))))
-- (λ {X} →
-- fst
-- (M.RawMonad.pureNT
-- (M.Monad.raw
-- (funExt
-- (λ m₁ → M.Monad≡ (.Cat.Category.Monad.toMonoidalRawEq m₁)) i
-- (§2-3.§1.toMonad m))))
-- X))
-- (§1-fromMonad
-- (.Cat.Category.Monad.toMonoidal
-- (.Cat.Category.Monad.toKleisli (§2-3.§1.toMonad m))))
-- (§1-fromMonad (§2-3.§1.toMonad m))
≡⟨ (cong-d (\ φ §1-fromMonad (φ (§2-3.§1.toMonad m))) ve-re)
§1-fromMonad (§2-3.§1.toMonad m)
≡⟨ lemmaz
m
where
t : §1-fromMonad Kleisli→Monoidal Monoidal→Kleisli §2-3.§1.toMonad
§1-fromMonad §2-3.§1.toMonad
-- Why does `re-ve` not satisfy this goal?
t i m = §1-fromMonad (ve-re i (§2-3.§1.toMonad m))
voe-isEquiv : isEquiv (§2-3.§1 omap pure) (§2-3.§2 omap pure) forth
voe-isEquiv = gradLemma forth back forthEq backEq
where
lemmaz : §1-fromMonad (§2-3.§1.toMonad m) m
§2-3.§1.fmap (lemmaz i) = §2-3.§1.fmap m
§2-3.§1.join (lemmaz i) = §2-3.§1.join m
§2-3.§1.RisFunctor (lemmaz i) = §2-3.§1.RisFunctor m
§2-3.§1.pureN (lemmaz i) = §2-3.§1.pureN m
§2-3.§1.joinN (lemmaz i) = §2-3.§1.joinN m
§2-3.§1.isMonad (lemmaz i) = §2-3.§1.isMonad m
lemma : m §2-3.§2.toMonad (§2-fromMonad m) m
K.Monad.raw (lemma m i) = K.Monad.raw m
K.Monad.isMonad (lemma m i) = K.Monad.isMonad m
equiv-2-3 : §2-3.§1 omap pure §2-3.§2 omap pure
equiv-2-3 = forth , voe-isEquiv
equiv-2-3 = fromIsomorphism _ _
( forth , back
, funExt backEq , funExt forthEq
)


@ -11,7 +11,7 @@ module _ {a b : Level} ( : Category a b) where
module _ (A B : Object) where
record RawProduct : Set (a b) where
no-eta-equality
-- no-eta-equality
field
object : Object
fst : [ object , A ]
@ -55,286 +55,122 @@ module _ {a b : Level} ( : Category a b) where
× [ g snd ]
]
module _ {a b : Level} { : Category a b} {A B : Category.Object } where
module _ {a b : Level} { : Category a b}
(let module = Category ) {𝒜 : .Object} where
private
open Category
module _ (raw : RawProduct A B) where
module _ (x y : IsProduct A B raw) where
private
module x = IsProduct x
module y = IsProduct y
module _ (raw : RawProduct 𝒜 ) where
private
open Category hiding (raw)
module _ (x y : IsProduct 𝒜 raw) where
private
module x = IsProduct x
module y = IsProduct y
module _ {X : Object} (f : [ X , A ]) (g : [ X , B ]) where
module _ (f×g : Arrow X y.object) where
help : isProp ({y} ( [ y.fst y ] f) P.× ( [ y.snd y ] g) f×g y)
help = propPiImpl (λ _ propPi (λ _ arrowsAreSets _ _))
module _ {X : Object} (f : [ X , 𝒜 ]) (g : [ X , ]) where
module _ (f×g : Arrow X y.object) where
help : isProp ({y} ( [ y.fst y ] f) P.× ( [ y.snd y ] g) f×g y)
help = propPiImpl (λ _ propPi (λ _ arrowsAreSets _ _))
res = ∃-unique (x.ump f g) (y.ump f g)
res = ∃-unique (x.ump f g) (y.ump f g)
prodAux : x.ump f g y.ump f g
prodAux = lemSig ((λ f×g propSig (propSig (arrowsAreSets _ _) λ _ arrowsAreSets _ _) (λ _ help f×g))) _ _ res
prodAux : x.ump f g y.ump f g
prodAux = lemSig ((λ f×g propSig (propSig (arrowsAreSets _ _) λ _ arrowsAreSets _ _) (λ _ help f×g))) _ _ res
propIsProduct' : x y
propIsProduct' i = record { ump = λ f g prodAux f g i }
propIsProduct' : x y
propIsProduct' i = record { ump = λ f g prodAux f g i }
propIsProduct : isProp (IsProduct A B raw)
propIsProduct : isProp (IsProduct 𝒜 raw)
propIsProduct = propIsProduct'
Product≡ : {x y : Product A B} (Product.raw x Product.raw y) x y
Product≡ {x} {y} p i = record { raw = p i ; isProduct = q i }
where
q : (λ i IsProduct A B (p i)) [ Product.isProduct x Product.isProduct y ]
q = lemPropF propIsProduct p
Product≡ : {x y : Product 𝒜 } (Product.raw x Product.raw y) x y
Product≡ {x} {y} p i = record { raw = p i ; isProduct = q i }
where
q : (λ i IsProduct 𝒜 (p i)) [ Product.isProduct x Product.isProduct y ]
q = lemPropF propIsProduct p
module Try0 {a b : Level} { : Category a b}
(let module = Category ) {𝒜 : .Object} where
open P
open import Cat.Categories.Span
open P
open Category (span 𝒜 )
module _ where
raw : RawCategory _ _
raw = record
{ Object = Σ[ X .Object ] .Arrow X 𝒜 × .Arrow X
; Arrow = λ{ (A , a0 , a1) (B , b0 , b1)
Σ[ f .Arrow A B ]
[ b0 f ] a0
× [ b1 f ] a1
}
; identity = λ{ {X , f , g} .identity {X} , .rightIdentity , .rightIdentity}
; _<<<_ = λ { {_ , a0 , a1} {_ , b0 , b1} {_ , c0 , c1} (f , f0 , f1) (g , g0 , g1)
(f .<<< g)
, (begin
[ c0 [ f g ] ] ≡⟨ .isAssociative
[ [ c0 f ] g ] ≡⟨ cong (λ φ [ φ g ]) f0
[ b0 g ] ≡⟨ g0
a0
)
, (begin
[ c1 [ f g ] ] ≡⟨ .isAssociative
[ [ c1 f ] g ] ≡⟨ cong (λ φ [ φ g ]) f1
[ b1 g ] ≡⟨ g1
a1
)
}
}
module _ where
open RawCategory raw
propEqs : {X' : Object}{Y' : Object} (let X , xa , xb = X') (let Y , ya , yb = Y')
(xy : .Arrow X Y) isProp ( [ ya xy ] xa × [ yb xy ] xb)
propEqs xs = propSig (.arrowsAreSets _ _) (\ _ .arrowsAreSets _ _)
arrowEq : {X Y : Object} {f g : Arrow X Y} fst f fst g f g
arrowEq {X} {Y} {f} {g} p = λ i p i , lemPropF propEqs p {snd f} {snd g} i
private
isAssociative : IsAssociative
isAssociative {f = f , f0 , f1} {g , g0 , g1} {h , h0 , h1} = arrowEq .isAssociative
isIdentity : IsIdentity identity
isIdentity {AA@(A , a0 , a1)} {BB@(B , b0 , b1)} {f , f0 , f1} = arrowEq .leftIdentity , arrowEq .rightIdentity
arrowsAreSets : ArrowsAreSets
arrowsAreSets {X , x0 , x1} {Y , y0 , y1}
= sigPresSet .arrowsAreSets λ a propSet (propEqs _)
isPreCat : IsPreCategory raw
IsPreCategory.isAssociative isPreCat = isAssociative
IsPreCategory.isIdentity isPreCat = isIdentity
IsPreCategory.arrowsAreSets isPreCat = arrowsAreSets
open IsPreCategory isPreCat
module _ {𝕏 𝕐 : Object} where
open Σ 𝕏 renaming (fst to X ; snd to x)
open Σ x renaming (fst to xa ; snd to xb)
open Σ 𝕐 renaming (fst to Y ; snd to y)
open Σ y renaming (fst to ya ; snd to yb)
open import Cat.Equivalence using (composeIso) renaming (_≅_ to _≅_)
step0
: ((X , xa , xb) (Y , ya , yb))
(Σ[ p (X Y) ] (PathP (λ i .Arrow (p i) 𝒜) xa ya) × (PathP (λ i .Arrow (p i) ) xb yb))
step0
= (λ p cong fst p , cong-d (fst snd) p , cong-d (snd snd) p)
-- , (λ x → λ i → fst x i , (fst (snd x) i) , (snd (snd x) i))
, (λ{ (p , q , r) Σ≡ p λ i q i , r i})
, funExt (λ{ p refl})
, funExt (λ{ (p , q , r) refl})
step1
: (Σ[ p (X Y) ] (PathP (λ i .Arrow (p i) 𝒜) xa ya) × (PathP (λ i .Arrow (p i) ) xb yb))
Σ (X .≊ Y) (λ iso
let p = .isoToId iso
in
( PathP (λ i .Arrow (p i) 𝒜) xa ya)
× PathP (λ i .Arrow (p i) ) xb yb
)
step1
= symIso
(isoSigFst
{A = (X .≊ Y)}
{B = (X Y)}
(.groupoidObject _ _)
{Q = \ p (PathP (λ i .Arrow (p i) 𝒜) xa ya) × (PathP (λ i .Arrow (p i) ) xb yb)}
.isoToId
(symIso (_ , .asTypeIso {X} {Y}) .snd)
)
step2
: Σ (X .≊ Y) (λ iso
let p = .isoToId iso
in
( PathP (λ i .Arrow (p i) 𝒜) xa ya)
× PathP (λ i .Arrow (p i) ) xb yb
)
((X , xa , xb) (Y , ya , yb))
step2
= ( λ{ (iso@(f , f~ , inv-f) , p , q)
( f , sym (.domain-twist-sym iso p) , sym (.domain-twist-sym iso q))
, ( f~ , sym (.domain-twist iso p) , sym (.domain-twist iso q))
, arrowEq (fst inv-f)
, arrowEq (snd inv-f)
}
)
, (λ{ (f , f~ , inv-f , inv-f~)
let
iso : X .≊ Y
iso = fst f , fst f~ , cong fst inv-f , cong fst inv-f~
p : X Y
p = .isoToId iso
pA : .Arrow X 𝒜 .Arrow Y 𝒜
pA = cong (λ x .Arrow x 𝒜) p
pB : .Arrow X .Arrow Y
pB = cong (λ x .Arrow x ) p
k0 = begin
coe pB xb ≡⟨ .coe-dom iso
xb .<<< fst f~ ≡⟨ snd (snd f~)
yb
k1 = begin
coe pA xa ≡⟨ .coe-dom iso
xa .<<< fst f~ ≡⟨ fst (snd f~)
ya
in iso , coe-lem-inv k1 , coe-lem-inv k0})
, funExt (λ x lemSig
(λ x propSig prop0 (λ _ prop1))
_ _
(Σ≡ refl (.propIsomorphism _ _ _)))
, funExt (λ{ (f , _) lemSig propIsomorphism _ _ (Σ≡ refl (propEqs _ _ _))})
where
prop0 : {x} isProp (PathP (λ i .Arrow (.isoToId x i) 𝒜) xa ya)
prop0 {x} = pathJ (λ y p x isProp (PathP (λ i .Arrow (p i) 𝒜) xa x)) (λ x .arrowsAreSets _ _) Y (.isoToId x) ya
prop1 : {x} isProp (PathP (λ i .Arrow (.isoToId x i) ) xb yb)
prop1 {x} = pathJ (λ y p x isProp (PathP (λ i .Arrow (p i) ) xb x)) (λ x .arrowsAreSets _ _) Y (.isoToId x) yb
-- One thing to watch out for here is that the isomorphisms going forwards
-- must compose to give idToIso
iso
: ((X , xa , xb) (Y , ya , yb))
((X , xa , xb) (Y , ya , yb))
iso = step0 step1 step2
lemma : Terminal Product 𝒜
lemma = fromIsomorphism Terminal (Product 𝒜 ) (f , g , inv)
where
f : Terminal Product 𝒜
f ((X , x0 , x1) , uniq) = p
where
infixl 5 _⊙_
_⊙_ = composeIso
equiv1
: ((X , xa , xb) (Y , ya , yb))
((X , xa , xb) (Y , ya , yb))
equiv1 = _ , fromIso _ _ (snd iso)
univalent : Univalent
univalent = univalenceFrom≃ equiv1
isCat : IsCategory raw
IsCategory.isPreCategory isCat = isPreCat
IsCategory.univalent isCat = univalent
cat : Category _ _
cat = record
{ raw = raw
; isCategory = isCat
}
open Category cat
lemma : Terminal Product 𝒜
lemma = fromIsomorphism Terminal (Product 𝒜 ) (f , g , inv)
-- C-x 8 RET MATHEMATICAL BOLD SCRIPT CAPITAL A
-- 𝒜
where
f : Terminal Product 𝒜
f ((X , x0 , x1) , uniq) = p
where
rawP : RawProduct 𝒜
rawP = record
{ object = X
; fst = x0
; snd = x1
}
-- open RawProduct rawP renaming (fst to x0 ; snd to x1)
module _ {Y : .Object} (p0 : [ Y , 𝒜 ]) (p1 : [ Y , ]) where
uy : isContr (Arrow (Y , p0 , p1) (X , x0 , x1))
uy = uniq {Y , p0 , p1}
open Σ uy renaming (fst to Y→X ; snd to contractible)
open Σ Y→X renaming (fst to p0×p1 ; snd to cond)
ump : ∃![ f×g ] ( [ x0 f×g ] p0 P.× [ x1 f×g ] p1)
ump = p0×p1 , cond , λ {f} cond-f cong fst (contractible (f , cond-f))
isP : IsProduct 𝒜 rawP
isP = record { ump = ump }
p : Product 𝒜
p = record
{ raw = rawP
; isProduct = isP
}
g : Product 𝒜 Terminal
g p = 𝒳 , t
where
open Product p renaming (object to X ; fst to x₀ ; snd to x₁) using ()
module p = Product p
module isp = IsProduct p.isProduct
𝒳 : Object
𝒳 = X , x₀ , x₁
module _ {𝒴 : Object} where
open Σ 𝒴 renaming (fst to Y)
open Σ (snd 𝒴) renaming (fst to y₀ ; snd to y₁)
ump = p.ump y₀ y₁
open Σ ump renaming (fst to f')
open Σ (snd ump) renaming (fst to f'-cond)
𝒻 : Arrow 𝒴 𝒳
𝒻 = f' , f'-cond
contractible : (f : Arrow 𝒴 𝒳) 𝒻 f
contractible ff@(f , f-cond) = res
where
k : f' f
k = snd (snd ump) f-cond
prp : (a : .Arrow Y X) isProp
( ( [ x₀ a ] y₀)
× ( [ x₁ a ] y₁)
)
prp f f0 f1 = Σ≡
(.arrowsAreSets _ _ (fst f0) (fst f1))
(.arrowsAreSets _ _ (snd f0) (snd f1))
h :
( λ i
[ x₀ k i ] y₀
× [ x₁ k i ] y₁
) [ f'-cond f-cond ]
h = lemPropF prp k
res : (f' , f'-cond) (f , f-cond)
res = Σ≡ k h
t : IsTerminal 𝒳
t {𝒴} = 𝒻 , contractible
ve-re : x g (f x) x
ve-re x = Propositionality.propTerminal _ _
re-ve : p f (g p) p
re-ve p = Product≡ e
where
module p = Product p
-- RawProduct does not have eta-equality.
e : Product.raw (f (g p)) Product.raw p
RawProduct.object (e i) = p.object
RawProduct.fst (e i) = p.fst
RawProduct.snd (e i) = p.snd
inv : AreInverses f g
inv = funExt ve-re , funExt re-ve
rawP : RawProduct 𝒜
rawP = record
{ object = X
; fst = x0
; snd = x1
}
-- open RawProduct rawP renaming (fst to x0 ; snd to x1)
module _ {Y : .Object} (p0 : [ Y , 𝒜 ]) (p1 : [ Y , ]) where
uy : isContr (Arrow (Y , p0 , p1) (X , x0 , x1))
uy = uniq {Y , p0 , p1}
open Σ uy renaming (fst to Y→X ; snd to contractible)
open Σ Y→X renaming (fst to p0×p1 ; snd to cond)
ump : ∃![ f×g ] ( [ x0 f×g ] p0 P.× [ x1 f×g ] p1)
ump = p0×p1 , cond , λ {f} cond-f cong fst (contractible (f , cond-f))
isP : IsProduct 𝒜 rawP
isP = record { ump = ump }
p : Product 𝒜
p = record
{ raw = rawP
; isProduct = isP
}
g : Product 𝒜 Terminal
g p = 𝒳 , t
where
open Product p renaming (object to X ; fst to x₀ ; snd to x₁) using ()
module p = Product p
module isp = IsProduct p.isProduct
𝒳 : Object
𝒳 = X , x₀ , x₁
module _ {𝒴 : Object} where
open Σ 𝒴 renaming (fst to Y)
open Σ (snd 𝒴) renaming (fst to y₀ ; snd to y₁)
ump = p.ump y₀ y₁
open Σ ump renaming (fst to f')
open Σ (snd ump) renaming (fst to f'-cond)
𝒻 : Arrow 𝒴 𝒳
𝒻 = f' , f'-cond
contractible : (f : Arrow 𝒴 𝒳) 𝒻 f
contractible ff@(f , f-cond) = res
where
k : f' f
k = snd (snd ump) f-cond
prp : (a : .Arrow Y X) isProp
( ( [ x₀ a ] y₀)
× ( [ x₁ a ] y₁)
)
prp f f0 f1 = Σ≡
(.arrowsAreSets _ _ (fst f0) (fst f1))
(.arrowsAreSets _ _ (snd f0) (snd f1))
h :
( λ i
[ x₀ k i ] y₀
× [ x₁ k i ] y₁
) [ f'-cond f-cond ]
h = lemPropF prp k
res : (f' , f'-cond) (f , f-cond)
res = Σ≡ k h
t : IsTerminal 𝒳
t {𝒴} = 𝒻 , contractible
ve-re : x g (f x) x
ve-re x = Propositionality.propTerminal _ _
re-ve : p f (g p) p
re-ve p = Product≡ e
where
module p = Product p
-- RawProduct does not have eta-equality.
e : Product.raw (f (g p)) Product.raw p
RawProduct.object (e i) = p.object
RawProduct.fst (e i) = p.fst
RawProduct.snd (e i) = p.snd
inv : AreInverses f g
inv = funExt ve-re , funExt re-ve
propProduct : isProp (Product 𝒜 )
propProduct = equivPreservesNType {n = ⟨-1⟩} lemma Propositionality.propTerminal
@ -348,10 +184,7 @@ module _ {a b : Level} { : Category a b} {A B : Category.Object
module y = HasProducts y
productEq : x.product y.product
productEq = funExt λ A funExt λ B Try0.propProduct _ _
productEq = funExt λ A funExt λ B propProduct _ _
propHasProducts : isProp (HasProducts )
propHasProducts x y i = record { product = productEq x y i }
fmap≡ : {A : Set} {a0 a1 : A} {B : Set} (f : A B) Path a0 a1 Path (f a0) (f a1)
fmap≡ = cong


@ -9,9 +9,9 @@ open import Cat.Category.Functor
open import Cat.Category.NaturalTransformation
renaming (module Properties to F)
using ()
open import Cat.Categories.Fun using (module Fun)
open import Cat.Categories.Opposite
open import Cat.Categories.Sets hiding (presheaf)
open import Cat.Categories.Fun using (module Fun)
-- There is no (small) category of categories. So we won't use _⇑_ from
-- `HasExponential`


@ -184,7 +184,7 @@ module _ {a b : Level} (A : Set a) (B : Set b) where
private
module _ {obverse : A B} (e : isEquiv A B obverse) where
inverse : B A
inverse b = fst (fst (e b))
inverse b = fst (fst (e .equiv-proof b))
reverse : B A
reverse = inverse
@ -198,7 +198,7 @@ module _ {a b : Level} (A : Set a) (B : Set b) where
b
where
μ : (b : B) b obverse (inverse b)
μ b = snd (fst (e b))
μ b = snd (fst (e .equiv-proof b))
verso-recto : a (inverse obverse) a a
verso-recto a = begin
(inverse obverse) a ≡⟨ sym h
@ -206,7 +206,7 @@ module _ {a b : Level} (A : Set a) (B : Set b) where
a
where
c : isContr (fiber obverse (obverse a))
c = e (obverse a)
c = e .equiv-proof (obverse a)
fbr : fiber obverse (obverse a)
fbr = fst c
a' : A


@ -44,8 +44,8 @@ module _ ( : Level) where
hSet : Set (lsuc )
hSet = ⟨0⟩ -type
Prop : Set (lsuc )
Prop = ⟨-1⟩ -type
hProp : Set (lsuc )
hProp = ⟨-1⟩ -type
-----------------
-- * Utilities --
@ -105,3 +105,38 @@ module _ { : Level} {A : Set } where
ntypeCumulative : {n m} n ≤′ m HasLevel n ⟩₋₂ A HasLevel m ⟩₋₂ A
ntypeCumulative {m} ≤′-refl lvl = lvl
ntypeCumulative {n} {suc m} (≤′-step le) lvl = HasLevel+1 m ⟩₋₂ (ntypeCumulative le lvl)
grpdPi : {b : Level} {B : A Set b}
((a : A) isGrpd (B a)) isGrpd ((a : A) (B a))
grpdPi = piPresNType (S (S (S ⟨-2⟩)))
grpdPiImpl : {b : Level} {B : A Set b}
({a : A} isGrpd (B a)) isGrpd ({a : A} (B a))
grpdPiImpl {B = B} g = equivPreservesNType {A = Expl} {B = Impl} {n = one} e (grpdPi (λ a g))
where
one = (S (S (S ⟨-2⟩)))
t : ({a : A} HasLevel one (B a))
t = g
Impl = {a : A} B a
Expl = (a : A) B a
expl : Impl Expl
expl f a = f {a}
impl : Expl Impl
impl f {a} = f a
e : Expl Impl
e = impl , (gradLemma impl expl (λ f refl) (λ f refl))
setGrpd : isSet A isGrpd A
setGrpd = ntypeCumulative
{suc (suc zero)} {suc (suc (suc zero))}
(≤′-step ≤′-refl)
propGrpd : isProp A isGrpd A
propGrpd = ntypeCumulative
{suc zero} {suc (suc (suc zero))}
(≤′-step (≤′-step ≤′-refl))
module _ {a b : Level} {A : Set a} {B : A Set b} where
open TLevel
grpdSig : isGrpd A ( a isGrpd (B a)) isGrpd (Σ A B)
grpdSig = sigPresNType {n = S (S (S ⟨-2⟩))}