The GradientBasedLearningForCAP package is a computational tool for categorical machine learning within the CAP (Categories, Algorithms, Programming) framework. It provides a categorical foundation for neural networks by modelling them as parametrised morphisms and performing computation in the category of smooth maps. The system supports symbolic expressions and automatic differentiation via the lens pattern, enabling the bidirectional data flow required for backpropagation. Included examples demonstrate practical applications such as finding a local minimum and training models for binary classification, multi-class classification, and linear regression, using various loss functions and optimizers including gradient descent and Adam. This implementation is based on the paper \href{https://arxiv.org/abs/2404.00408}{Deep Learning with Parametric Lenses}.
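For orientation, the lens pattern can be summarised as follows; the notation here is informal and follows the cited paper rather than the package's own interface. A smooth map \(f \colon \mathbb{R}^m \to \mathbb{R}^n\) gives rise to a lens whose forward part is \(f\) itself and whose backward part multiplies an incoming gradient by the transposed Jacobian, \(\mathrm{put}_f(x, \delta y) = J_f(x)^{T} \delta y\). Composing lenses composes the forward parts in order and the backward parts in reverse order, which is precisely the chain rule of reverse-mode differentiation. Adding a parameter space \(\mathbb{R}^p\) to the domain yields the parametric lenses that model trainable layers, with the parameter component of the backward pass supplying the gradient used to update the parameters.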
The package implements the following main concepts:
Examples: Worked examples of creating and training neural networks and of computing local minima.
Expressions: A symbolic expression system for representing mathematical formulas.
Skeletal Category of Smooth Maps: A category where objects are Euclidean spaces \(\mathbb{R}^n\) and morphisms are smooth maps with their Jacobian matrices.
Category of Parametrised Morphisms: A category that represents morphisms with learnable parameters, used to model neural network layers.
Neural Networks: High-level operations for constructing and training neural networks.
Category of Lenses: A category that models bidirectional data flow, essential for backpropagation in neural networks; a short illustrative sketch follows this list.
Fitting Parameters: Explains how to learn the parameters of a parametrised morphism in order to minimize its value.
CAP Operation: The new categorical operations needed by this package.
Tools: A few GAP operations and helper functions.
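To make the bidirectional data flow of lenses concrete, here is a minimal sketch in plain GAP. The record components get and put and the helper ComposeLenses are chosen for illustration only and are not the constructors provided by this package: the forward map is run left to right, while gradients are propagated right to left through the same chain, which is exactly reverse-mode differentiation (backpropagation).

gap> # compose two lenses: get runs forwards, put runs backwards
gap> ComposeLenses := function( l1, l2 )
>      return rec(
>        get := x -> l2.get( l1.get( x ) ),
>        put := function( x, dy )
>                 return l1.put( x, l2.put( l1.get( x ), dy ) );
>               end );
>    end;;
gap> # the smooth map x -> x^2 as a lens: put multiplies by the 1x1 Jacobian 2x
gap> Square := rec( get := x -> x^2,
>                   put := function( x, dy ) return 2 * x * dy; end );;
gap> Quartic := ComposeLenses( Square, Square );;   # the map x -> x^4
gap> Quartic.get( 3 );
81
gap> Quartic.put( 3, 1 );   # recovers the derivative 4 * 3^3
108

In the package itself, the analogous role is played by the Category of Lenses over the Skeletal Category of Smooth Maps described above, with the Jacobians recorded alongside each smooth map supplying the backward passes.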