# Developer's Guide

## Setup dev environment

For development purposes, e.g. if you would like to make contributions, follow these steps:

**With `uv`**

- Install `uv`, e.g. `pip install --upgrade uv`
- Then clone this repository and install the development dependencies:
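A minimal sketch of those two steps, assuming the same clone URL as in the pip instructions below and the `uv sync --all-extras` invocation used for the docs build:

```bash
# clone the repository and install all (development) dependencies
git clone git@github.com:aleximmer/Laplace.git
cd Laplace
uv sync --all-extras
```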
`laplace-torch` is now available in editable mode, e.g. you can run:
```bash
uv run python examples/regression_example.py
# Or, equivalently:
source .venv/bin/activate
python examples/regression_example.py
```
**With `pip`**
```bash
git clone git@github.com:aleximmer/Laplace.git
# Recommended to create a virtualenv before the following step
pip install -e ".[dev]"
# Run as usual, e.g.
python examples/regression_example.py
```
## Contributing
Pull requests are very welcome. Please follow these guidelines:
- Follow the development setup.
- Use `ruff` as autoformatter. Please refer to the repository's makefile and run it via `make ruff`. Please note that the order of `ruff check --fix` and `ruff format` is important (see the sketch after this list)!
- Also use `ruff` as linter. Please manually fix all linting errors/warnings before opening a pull request.
- Fully document your changes in the form of Python docstrings, typehinting, and (if applicable) code/markdown examples in the `./examples` subdirectory.
- See `docs/api_reference/*.md` on how to include a newly added class in the docs.
- Provide as many test cases as possible. Make sure all test cases pass.
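As a rough sketch of the formatting step above, the two `ruff` commands in the required order (see the repository's makefile for the authoritative recipe):

```bash
# fix lint errors first, then apply the formatter -- the order matters
ruff check --fix .
ruff format .
```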
Issues, bug reports, and ideas are also very welcome!
## Documentation

The documentation is available online or can be generated and/or viewed locally:
**With `uv`**
```bash
# assuming the repository was cloned
uv sync --all-extras
# serve the docs locally
uv run mkdocs serve
```
**With `pip`**
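An equivalent sketch with pip, assuming the `dev` extra pulls in the mkdocs toolchain:

```bash
# assuming the repository was cloned
pip install -e ".[dev]"
# serve the docs locally
mkdocs serve
```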
## Publishing the `laplace-torch` package to PyPI

With `uv`, this is done as described in https://docs.astral.sh/uv/guides/publish/.

If you want to make your life much easier, you can use `pdm`:
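A sketch of both routes, assuming PyPI credentials are already configured (the linked `uv` guide covers the details):

```bash
# with uv: build the sdist/wheel, then upload
uv build
uv publish

# with pdm: build and upload in one step
pdm publish
```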
## Structure

The `laplace` package consists of two main components:
- The subclasses of `laplace.BaseLaplace` that implement different sparsity structures: different subsets of weights (`'all'`, `'subnetwork'`, and `'last_layer'`) and different structures of the Hessian approximation (`'full'`, `'kron'`, `'lowrank'`, `'diag'`, and `'gp'`). This results in ten currently available options: `laplace.FullLaplace`, `laplace.KronLaplace`, `laplace.DiagLaplace`, `laplace.FunctionalLaplace`, the corresponding last-layer variations `laplace.FullLLLaplace`, `laplace.KronLLLaplace`, `laplace.DiagLLLaplace`, and `laplace.FunctionalLLLaplace` (which are all subclasses of `laplace.LLLaplace`), `laplace.SubnetLaplace` (which only supports `'full'` and `'diag'` Hessian approximations), and `laplace.LowRankLaplace` (which only supports inference over `'all'` weights). All of these can be conveniently accessed via the `laplace.Laplace` function (see the sketch after this list).
- The backends in `laplace.curvature` which provide access to Hessian approximations of the corresponding sparsity structures, for example, the diagonal GGN.
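For instance, a minimal sketch of the `laplace.Laplace` entry point (the toy model is a placeholder; the keyword values map onto the options above):

```python
import torch.nn as nn

from laplace import Laplace

# placeholder model for illustration
model = nn.Sequential(nn.Linear(2, 50), nn.Tanh(), nn.Linear(50, 1))

# last-layer weights with a Kronecker-factored Hessian approximation,
# i.e. the combination implemented by laplace.KronLLLaplace
la = Laplace(
    model,
    likelihood="regression",
    subset_of_weights="last_layer",
    hessian_structure="kron",
)
```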
Additionally, the package provides utilities for decomposing a neural network into a feature extractor and a last layer for `LLLaplace` subclasses (`laplace.utils.feature_extractor`) and for effectively dealing with Kronecker factors (`laplace.utils.matrix`).
Finally, the package implements several options to select/specify a subnetwork for `SubnetLaplace` (as subclasses of `laplace.utils.subnetmask.SubnetMask`). Automatic subnetwork selection strategies include: uniformly at random (`laplace.utils.subnetmask.RandomSubnetMask`), by largest parameter magnitudes (`LargestMagnitudeSubnetMask`), and by largest marginal parameter variances (`LargestVarianceDiagLaplaceSubnetMask` and `LargestVarianceSWAGSubnetMask`). In addition to that, subnetworks can also be specified manually, by listing the names of either the model parameters (`ParamNameSubnetMask`) or modules (`ModuleNameSubnetMask`) to perform Laplace inference over, as sketched below.
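As an illustration of the manual route, a hedged sketch using `ModuleNameSubnetMask` (the model and module name are placeholders; the pattern follows the package's subnetwork example):

```python
import torch.nn as nn

from laplace import Laplace
from laplace.utils import ModuleNameSubnetMask

# placeholder model; "2" names its last linear layer
model = nn.Sequential(nn.Linear(2, 50), nn.Tanh(), nn.Linear(50, 1))

# select all weights of the named module(s) for Laplace inference
subnetwork_mask = ModuleNameSubnetMask(model, module_names=["2"])
subnetwork_mask.select()
subnetwork_indices = subnetwork_mask.indices

la = Laplace(
    model,
    likelihood="regression",
    subset_of_weights="subnetwork",
    hessian_structure="full",
    subnetwork_indices=subnetwork_indices,
)
```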
## Extendability

To extend the `laplace` package, new `BaseLaplace` subclasses can be designed, for example, Laplace with a block-diagonal Hessian structure. One can also implement custom subnetwork selection strategies as new subclasses of `SubnetMask`, as sketched below.
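A hedged sketch of such a custom strategy, mirroring the built-in `LargestMagnitudeSubnetMask` (the `ScoreBasedSubnetMask` base class, its `compute_param_scores` hook, and the `parameter_vector` attribute are assumptions based on that built-in; check `laplace/utils/subnetmask.py` for the exact contract):

```python
from laplace.utils.subnetmask import ScoreBasedSubnetMask


class SmallestMagnitudeSubnetMask(ScoreBasedSubnetMask):
    """Hypothetical mask keeping the n_params_subnet smallest-magnitude weights."""

    def compute_param_scores(self, train_loader):
        # negate the magnitudes so that the top-scoring parameters
        # are the ones with the smallest absolute value
        return -self.parameter_vector.abs()
```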
Alternatively, extending or integrating backends (subclasses of `curvature.curvature`) allows one to provide different Hessian approximations to the Laplace approximations. For example, currently the `curvature.CurvlinopsInterface` based on Curvlinops and the native `torch.func` (previously known as `functorch`), `curvature.BackPackInterface` based on BackPACK, and `curvature.AsdlInterface` based on ASDL are available.