Full API summary for Rockpool
Package structure summary
A machine learning library for training and deploying spiking neural network applications.
Base classes
See also: Getting started with Rockpool and Working with time series data.
The native Python / numpy …
The Rockpool base class for all …
Attribute types
Represent a module parameter
Represent a module state
Represent a module simulation parameter
Identify an initialisation argument as a constant (non-trainable) parameter
Alternative base classes
Base class for …
Base class for modules that are compatible with both Torch and Rockpool
Combinator modules
Assemble modules into a feed-forward stack, with linear weights in between
Build a sequential stack of modules by connecting them end-to-end
Build a residual block over a sequential stack of modules
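The three combinators above can be sketched in plain Python. This is an illustrative sketch of the composition pattern only, not the Rockpool API; the function names `sequential` and `residual` are stand-ins for the actual combinator classes.

```python
# Illustrative sketch (plain Python, not the Rockpool API): how a
# sequential stack and a residual block compose module functions.

def sequential(*modules):
    """Connect modules end-to-end: the output of each feeds the next."""
    def net(x):
        for m in modules:
            x = m(x)
        return x
    return net

def residual(*modules):
    """Wrap a sequential stack with a skip connection: y = x + stack(x)."""
    stack = sequential(*modules)
    return lambda x: x + stack(x)

# Toy "modules": simple scalar functions standing in for weighted layers
double = lambda x: 2 * x
inc = lambda x: x + 1

net = sequential(double, inc)   # x -> 2x + 1
res = residual(double)          # x -> x + 2x = 3x

print(net(3))  # 7
print(res(3))  # 9
```

The residual combinator requires the stack to preserve the input shape, so that the skip connection can be added to the stack output.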
Time series classes
See also: Working with time series data.
Base class to represent a continuous or event-based time series.
Represents a continuously-sampled time series.
Represents a discrete time series, composed of binary events (present or absent).
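The two concrete flavours can be sketched in plain Python. This is an illustrative sketch of the two representations, not Rockpool's time-series classes; the class names `ContinuousSeries` and `EventSeries` are hypothetical.

```python
# Illustrative sketch (plain Python): a continuous series pairs sample
# times with values; an event series stores (time, channel) pairs.
from bisect import bisect_right

class ContinuousSeries:
    """Continuously-sampled series with zero-order-hold interpolation."""
    def __init__(self, times, samples):
        self.times, self.samples = list(times), list(samples)

    def __call__(self, t):
        # Return the most recent sample at or before time t
        i = bisect_right(self.times, t) - 1
        return self.samples[max(i, 0)]

class EventSeries:
    """Discrete series of events: each event is a (time, channel) pair."""
    def __init__(self, events):
        self.events = sorted(events)

    def between(self, t_start, t_stop):
        """Return events falling in the half-open interval [t_start, t_stop)."""
        return [(t, ch) for (t, ch) in self.events if t_start <= t < t_stop]

ts = ContinuousSeries([0.0, 0.1, 0.2], [1.0, 2.0, 3.0])
ev = EventSeries([(0.05, 0), (0.15, 1), (0.25, 0)])
print(ts(0.15))              # 2.0
print(ev.between(0.0, 0.2))  # [(0.05, 0), (0.15, 1)]
```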
Module subclasses
Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules
Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules, with a Jax backend
Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules, with a Torch backend
A leaky integrate-and-fire spiking neuron model
A leaky integrate-and-fire spiking neuron model, with a Jax backend
A leaky integrate-and-fire spiking neuron model with a Torch backend
A leaky integrate-and-fire spiking neuron model with adaptive hyperpolarisation, with a Torch backend
A leaky integrate-and-fire spiking neuron model
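The dynamics shared by these leaky integrate-and-fire variants can be sketched with a forward-Euler step. This is an illustrative sketch of the neuron model, not Rockpool's implementation; the parameter names and values are assumptions for the example.

```python
# Illustrative sketch (plain Python): a leaky integrate-and-fire neuron.
# The membrane potential v leaks towards zero with time constant tau,
# integrates input current, and emits a spike (with reset) on crossing
# the threshold.

def lif_step(v, i_in, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """One forward-Euler step of an LIF neuron. Returns (spike, new_v)."""
    v = v + dt * (-v / tau + i_in)
    if v >= v_thresh:
        return 1, v_reset
    return 0, v

# Drive a single neuron with constant input and collect spikes
v, spikes = 0.0, []
for _ in range(100):
    s, v = lif_step(v, i_in=100.0)
    spikes.append(s)
print(sum(spikes))  # the neuron spikes several times over 100 steps
```

The backends listed above differ in how this update is vectorised and differentiated (numpy, Jax, Torch), not in the underlying dynamics.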
Feedforward layer that converts each analogue input channel to one spiking up and one spiking down channel.
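The up/down conversion idea can be sketched as a delta modulator. This is an illustrative sketch of the encoding principle, not Rockpool's converter; the function name `updown_encode` and the `step` parameter are assumptions for the example.

```python
# Illustrative sketch (plain Python): delta-modulation up/down encoding.
# A reference tracks the input; when the signal rises by more than `step`
# above the reference an "up" event is emitted, and when it falls by more
# than `step` a "down" event is emitted.

def updown_encode(signal, step=0.5):
    """Encode an analogue signal as two event channels (up, down)."""
    ref, events = signal[0], []
    for t, x in enumerate(signal):
        while x - ref >= step:
            ref += step
            events.append((t, "up"))
        while ref - x >= step:
            ref -= step
            events.append((t, "down"))
    return events

sig = [0.0, 0.6, 1.3, 1.0, 0.2]
print(updown_encode(sig))  # [(1, 'up'), (2, 'up'), (4, 'down')]
```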
Encapsulates a linear weight matrix
Encapsulates a linear weight matrix, with a Jax backend
Applies a linear transformation to the incoming data: \(y = xA + b\)
Wrap a callable function as an instantaneous Rockpool module
Wrap a callable function as an instantaneous Rockpool module, with a Jax backend
Wrap a callable function as an instantaneous Rockpool module, with a Torch backend
Exponential synapse module
Exponential synapse module with a Jax backend
Exponential synapse module with a Torch backend
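The filtering an exponential synapse applies can be sketched directly. This is an illustrative sketch of the synaptic dynamics, not Rockpool's module; the function name and parameter values are assumptions for the example.

```python
# Illustrative sketch (plain Python): an exponential synapse. Each input
# spike injects charge, which then decays exponentially with time
# constant tau.
import math

def exp_syn(spikes, dt=1e-3, tau=10e-3):
    """Filter a spike train into a continuous synaptic current trace."""
    decay = math.exp(-dt / tau)
    i_syn, trace = 0.0, []
    for s in spikes:
        i_syn = i_syn * decay + s
        trace.append(i_syn)
    return trace

trace = exp_syn([1, 0, 0, 0, 1, 0])
# The trace jumps at each input spike, then decays between spikes
print(trace)
```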
A Jax-backed module implementing a smoothed weighted softmax, compatible with spiking inputs
A Jax-backed module implementing a smoothed weighted softmax, compatible with spiking inputs
Define a Butterworth filter bank (mel spacing) filtering layer with continuous sampled output
Define a Butterworth filter bank filtering layer with continuous output
Layer subclasses from Rockpool v1
These classes are deprecated, but are still usable via the high-level API, until they are converted to the v2 API.
Base class for Layers in Rockpool
Spiking feedforward layer with spiking inputs and outputs
Spiking recurrent layer with spiking inputs and outputs, and a Brian2 backend
Define an exponential synapse layer (spiking input), with a Brian2 backend
Standard networks
Implement a WaveSense network
Conversion utilities
Wrap a low-level Rockpool …
An adapter class to wrap a Rockpool v1 …
Convert a Rockpool v1 class to a v2 class
Jax training utilities
Jax functions useful for training networks using Jax Modules.
Functions to implement adversarial training approaches using Jax
Performs the PGA (projected gradient ascent) attack on the parameters of the network, given inputs.
Implement a hybrid task / adversarial robustness loss
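The core of a projected-gradient-ascent attack on parameters can be sketched in a few lines. This is an illustrative sketch of the PGA idea in plain Python, not Rockpool's Jax implementation; the function name, step sizes and epsilon are assumptions for the example.

```python
# Illustrative sketch (plain Python): projected gradient ascent (PGA) on
# parameters. Starting from theta, take steps along the sign of the loss
# gradient, then project back into an L-inf epsilon-ball around the
# original parameters.

def pga_attack(theta, grad_fn, epsilon=0.1, step=0.04, n_steps=5):
    """Maximise the loss w.r.t. theta within an L-inf ball of radius epsilon."""
    theta_adv = list(theta)
    for _ in range(n_steps):
        g = grad_fn(theta_adv)
        for i in range(len(theta_adv)):
            # Ascend along the gradient sign, then clip to the epsilon-ball
            theta_adv[i] += step * (1 if g[i] >= 0 else -1)
            theta_adv[i] = min(max(theta_adv[i], theta[i] - epsilon),
                               theta[i] + epsilon)
    return theta_adv

# Toy loss L(theta) = sum(theta); its gradient is all ones, so PGA pushes
# every parameter up to the edge of the epsilon-ball.
theta0 = [0.0, 1.0]
adv = pga_attack(theta0, grad_fn=lambda th: [1.0] * len(th))
print(adv)  # each parameter ends at theta0[i] + epsilon
```

A hybrid task / robustness loss then combines the task loss at the nominal parameters with the loss at such adversarially perturbed parameters.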
PyTorch training utilities
Torch loss functions and regularizers useful for training networks using Torch Modules.
PyTorch transformation API (beta)
Xylo hardware support and simulation
Convert a full network specification to a Xylo config and validate it
Read a Xylo configuration from disk in JSON format
Save a Xylo configuration to disk in JSON format
A spiking neuron …
A spiking neuron …
Interface to the Audio Front-End module on a Xylo-A2 HDK
A digital divisive normalisation block
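The JSON read/save helpers above amount to round-tripping a configuration structure through disk, which can be sketched with the standard library. This is an illustrative sketch only; the file name and the fields in `config` are hypothetical, not the actual Xylo configuration schema.

```python
# Illustrative sketch (plain Python): round-tripping a hardware
# configuration as JSON, as the read/save helpers above do.
import json
import os
import tempfile

# Hypothetical configuration fields, for illustration only
config = {"weights": [[1, 0], [0, 1]], "dash_mem": 4}

path = os.path.join(tempfile.gettempdir(), "xylo_config_example.json")
with open(path, "w") as f:
    json.dump(config, f)

with open(path) as f:
    loaded = json.load(f)

print(loaded == config)  # True: the configuration survives the round trip
```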
Xylo-family device simulations, deployment and HDK support |
Package to support the Xylo HW SYNS61300 (Xylo-1 core; "Pollen")
Package to support the Xylo HW SYNS61201 (Xylo-A2)
Package to support the Xylo HW SYNS65300 (Xylo-A1)
Map a computational graph onto the Xylo v2 (SYNS61201) architecture
Graph tracing and mapping
Base modules
Base class for graph modules
Describe a module of computation in a graph
Describe a node connecting …
Encapsulate a …
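The module/node structure described above can be sketched in plain Python. This is an illustrative sketch of the graph representation, not Rockpool's graph classes; the class and attribute names are assumptions for the example.

```python
# Illustrative sketch (plain Python): computational modules connected
# through named nodes. A node records which modules write to it
# (sources) and which modules read from it (sinks).

class Node:
    """A connection point between graph modules."""
    def __init__(self, name):
        self.name, self.sources, self.sinks = name, [], []

class GraphModule:
    """A unit of computation, reading from input nodes and writing to outputs."""
    def __init__(self, name, inputs, outputs):
        self.name, self.inputs, self.outputs = name, inputs, outputs
        for n in inputs:
            n.sinks.append(self)
        for n in outputs:
            n.sources.append(self)

a, b, c = Node("a"), Node("b"), Node("c")
lin = GraphModule("linear", [a], [b])
lif = GraphModule("lif", [b], [c])
# Node b connects the output of "linear" to the input of "lif"
print(b.sources[0].name, "->", b.sinks[0].name)  # linear -> lif
```

Tracing such a graph is what allows a trained network to be mapped onto hardware architectures like Xylo.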
Computational graph modules
Utilities for generating and manipulating computational graphs
General Utilities
Utility functions for working with trees.
Tree manipulation utilities with no external dependencies
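A typical dependency-free tree utility of the kind described above can be sketched as follows. This is an illustrative sketch, not Rockpool's implementation; the function name `tree_map` is an assumption for the example.

```python
# Illustrative sketch (plain Python, no external dependencies): map a
# function over every leaf of a nested dict while preserving structure.

def tree_map(fn, tree):
    """Apply fn to every leaf of a nested dict, keeping the structure."""
    if isinstance(tree, dict):
        return {k: tree_map(fn, v) for k, v in tree.items()}
    return fn(tree)

params = {"w": {"weight": 1.0, "bias": 2.0}, "tau": 3.0}
scaled = tree_map(lambda x: 2 * x, params)
print(scaled)  # {'w': {'weight': 2.0, 'bias': 4.0}, 'tau': 6.0}
```

Utilities like this are useful for applying a transformation (scaling, casting, initialisation) uniformly across a nested parameter dictionary.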