Full API summary for Rockpool
Package structure summary
rockpool - A machine learning library for training and deploying spiking neural network applications
Base classes
Module - The native Python / numpy Module base class
ModuleBase - The Rockpool base class for all Module subclasses
Attribute types
ParameterBase - Base class for Rockpool registered attributes
Parameter - Represent a module parameter
State - Represent a module state
SimulationParameter - Represent a module simulation parameter
Constant - Identify an initialisation argument as a constant (non-trainable) parameter
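These attribute types come together when defining a custom Module subclass. A minimal sketch, assuming the numpy backend (the leaky-integrator dynamics here are illustrative, not a library class):

    import numpy as np
    from rockpool.nn.modules.module import Module
    from rockpool.parameters import Parameter, State, SimulationParameter

    class LeakyIntegrator(Module):
        # Illustrative module; not a built-in Rockpool class
        def __init__(self, shape, tau: float = 20e-3, dt: float = 1e-3):
            super().__init__(shape=shape)
            self.tau = Parameter(np.full(self.size_out, tau))   # trainable
            self.v = State(np.zeros(self.size_out))             # reset with reset_state()
            self.dt = SimulationParameter(dt)                   # neither trained nor state

        def evolve(self, input_data, record: bool = False):
            alpha = np.exp(-self.dt / self.tau)
            output = []
            for x_t in np.atleast_2d(input_data):
                self.v = alpha * self.v + x_t
                output.append(self.v)
            return np.stack(output), self.state(), {}

    # Modules return (output, new_state, record_dict) when evolved
    mod = LeakyIntegrator(shape=4)
    out, state, _ = mod(np.random.rand(100, 4))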
Alternative base classes
JaxModule - Base class for Module subclasses with a Jax backend
TorchModule - Base class for modules that are compatible with both Torch and Rockpool
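A TorchModule is written like a standard torch.nn.Module, with forward() providing the computation; the Rockpool attribute types above are registered in the same way. A minimal sketch (the gain module is illustrative, not a library class):

    import torch
    from rockpool.nn.modules.torch import TorchModule
    from rockpool.parameters import Parameter

    class Gain(TorchModule):
        # Illustrative module; not a built-in Rockpool class
        def __init__(self, shape):
            super().__init__(shape=shape)
            self.gain = Parameter(torch.ones(self.size_out))

        def forward(self, input_data):
            return input_data * self.gain

    # Called as a Rockpool module, it returns the usual (output, state, record) triple
    mod = Gain(shape=8)
    out, state, rec = mod(torch.rand(1, 100, 8))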
Combinator modules
FFwdStack - Assemble modules into a feed-forward stack, with linear weights in between
Sequential - Build a sequential stack of modules by connecting them end-to-end
Residual - Build a residual block over a sequential stack of modules
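For example, a small feed-forward spiking network can be assembled from the modules below (a sketch using the numpy backend):

    import numpy as np
    from rockpool.nn.modules import Linear, LIF
    from rockpool.nn.combinators import Sequential

    # Weights -> spiking neurons -> weights -> spiking neurons
    net = Sequential(
        Linear((16, 32)),
        LIF(32),
        Linear((32, 8)),
        LIF(8),
    )

    # Evolve over a (T, N_in) input; Residual is used the same way
    out, state, rec = net(np.random.rand(100, 16))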
Time series classes
TimeSeries - Base class to represent a continuous or event-based time series
TSContinuous - Represents a continuously-sampled time series
TSEvent - Represents a discrete time series, composed of binary events (present or absent)
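A short sketch of constructing and using both concrete classes:

    import numpy as np
    from rockpool import TSContinuous, TSEvent

    # Continuous series: sample times plus samples, one column per channel
    times = np.arange(100) * 1e-3
    ts_cont = TSContinuous(times, np.sin(2 * np.pi * 10 * times))

    # Event series: event times plus the channel of each event
    ts_spikes = TSEvent(times=[0.01, 0.02, 0.05], channels=[0, 1, 0], t_stop=0.1)

    ts_cont(0.0105)            # continuous series interpolate between samples
    ts_spikes.raster(dt=1e-3)  # rasterise events to a (T, N) boolean array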
Module subclasses
Rate - Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules
RateJax - Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules, with a Jax backend
RateTorch - Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules, with a Torch backend
LIF - A leaky integrate-and-fire spiking neuron model
LIFJax - A leaky integrate-and-fire spiking neuron model, with a Jax backend
LIFTorch - A leaky integrate-and-fire spiking neuron model, with a Torch backend
aLIFTorch - A leaky integrate-and-fire spiking neuron model with adaptive hyperpolarisation, with a Torch backend
LIFNeuronTorch - A leaky integrate-and-fire spiking neuron model, with a Torch backend
UpDownTorch - Feedforward layer that converts each analogue input channel to one spiking up and one spiking down channel
Linear - Encapsulates a linear weight matrix
LinearJax - Encapsulates a linear weight matrix, with a Jax backend
LinearTorch - Applies a linear transformation to the incoming data: \(y = xA + b\)
Instant - Wrap a callable function as an instantaneous Rockpool module
InstantJax - Wrap a callable function as an instantaneous Rockpool module, with a Jax backend
InstantTorch - Wrap a callable function as an instantaneous Rockpool module, with a Torch backend
ExpSyn - Exponential synapse module
ExpSynJax - Exponential synapse module with a Jax backend
ExpSynTorch - Exponential synapse module with a Torch backend
SoftmaxJax - A Jax-backed module implementing a smoothed weighted softmax, compatible with spiking inputs
LogSoftmaxJax - A Jax-backed module implementing a smoothed weighted log softmax, compatible with spiking inputs
ButterMelFilter - Define a Butterworth filter bank (mel spacing) filtering layer with continuous sampled output
ButterFilter - Define a Butterworth filter bank filtering layer with continuous output
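These classes share the common evolution interface, so backends can usually be swapped by changing only the class name. A sketch with the numpy LIF:

    import numpy as np
    from rockpool.nn.modules import LIF

    lif = LIF(shape=4, tau_mem=20e-3, tau_syn=10e-3, dt=1e-3)

    # Evolve over a (T, N) input raster; record=True also returns internal signals
    spikes, state, rec = lif(np.random.rand(500, 4), record=True)
    print(spikes.shape)   # output spike raster
    print(rec.keys())     # recorded membrane and synaptic variables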
Layer subclasses from Rockpool v1
These classes are deprecated, but remain usable via the high-level API until they are converted to the v2 API.
Layer - Base class for Layers in Rockpool
FFIAFSpkInBrian - Spiking feedforward layer with spiking inputs and outputs
RecIAFSpkInBrian - Spiking recurrent layer with spiking in- and outputs, and a Brian2 backend
FFExpSynBrian - Define an exponential synapse layer (spiking input), with a Brian2 backend
Standard networks
WaveSenseNet - Implement a WaveSense network
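A construction sketch; the argument names follow the WaveSense tutorial and are assumptions that may differ between versions:

    from rockpool.nn.networks import WaveSenseNet

    net = WaveSenseNet(
        dilations=[2, 4, 8],   # one dilated WaveBlock per entry
        n_classes=2,
        n_channels_in=16,
        n_channels_res=16,
        n_channels_skip=32,
        n_hidden=32,
        kernel_size=2,
        dt=1e-3,
    )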
Conversion utilities
TimedModuleWrapper - Wrap a low-level Rockpool Module as a high-level TimedModule
LayerToTimedModule - An adapter class to wrap a Rockpool v1 Layer as a high-level TimedModule
astimedmodule - Convert a Rockpool v1 class to a v2 class
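A sketch of wrapping a low-level module so that it accepts and returns TimeSeries objects, assuming the one-argument wrapper form:

    import numpy as np
    from rockpool import TSContinuous
    from rockpool.nn.modules import LIF
    from rockpool.nn.modules.timed_module import TimedModuleWrapper

    tmod = TimedModuleWrapper(LIF(4))   # wrap a low-level module

    times = np.arange(1000) * 1e-3
    ts_in = TSContinuous(times, np.random.rand(1000, 4))
    ts_out, state, rec = tmod(ts_in)    # evolve directly over a time series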
Jax training utilities
jax_loss - Jax functions useful for training networks using Jax Modules
adversarial_jax - Functions to implement adversarial training approaches using Jax
pga_attack - Performs the PGA (projected gradient ascent) based attack on the parameters of the network, given inputs
adversarial_loss - Implement a hybrid task / adversarial robustness loss
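A minimal gradient computation using jax_loss, assuming a Jax-backed network built from the modules above:

    import jax
    import numpy as np
    from rockpool.nn.modules import LinearJax, LIFJax
    from rockpool.nn.combinators import Sequential
    from rockpool.training import jax_loss as jl

    net = Sequential(LinearJax((2, 8)), LIFJax(8))

    def loss_fn(params, net, inputs, target):
        net = net.set_attributes(params)   # functional parameter assignment
        output, _, _ = net(inputs)
        return jl.mse(output, target)

    inputs = np.random.rand(100, 2)
    target = np.zeros((100, 8))

    # Differentiate the loss w.r.t. the trainable parameters
    loss, grads = jax.value_and_grad(loss_fn)(net.parameters(), net, inputs, target)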
PyTorch training utilities
torch_loss - Torch loss functions and regularizers useful for training networks using Torch Modules
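Torch-backed networks train with standard torch optimizers; a sketch using a plain MSE loss in place of the torch_loss helpers (the .astorch() call, which exposes the parameters as torch tensors, follows the torch training tutorial):

    import torch
    from rockpool.nn.modules import LinearTorch, LIFTorch
    from rockpool.nn.combinators import Sequential

    net = Sequential(LinearTorch((2, 8)), LIFTorch(8))
    optimizer = torch.optim.Adam(net.parameters().astorch(), lr=1e-3)

    inputs = torch.rand(1, 100, 2)      # (batch, time, channels)
    target = torch.zeros(1, 100, 8)

    output, _, _ = net(inputs)
    loss = torch.nn.functional.mse_loss(output, target)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()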
PyTorch transformation API (beta)
Xylo hardware support and simulation
config_from_specification - Convert a full network specification to a Xylo config and validate it
load_config - Read a Xylo configuration from disk in JSON format
save_config - Save a Xylo configuration to disk in JSON format
XyloSim - A Module simulating a digital SNN on Xylo
XyloSamna - A spiking neuron Module backed by the Xylo hardware, via samna
XyloMonitor - A spiking neuron Module backed by the Xylo hardware, via samna, with real-time monitoring
AFESim - A Module simulating the Xylo analog audio front-end
AFESamna - Interface to the Audio Front-End module on a Xylo-A2 HDK
DivisiveNormalisation - A digital divisive normalization block
Xylo-family device simulations, deployment and HDK support |
syns61300 - Package to support the Xylo HW SYNS61300 (Xylo-1 core; "Pollen")
syns61201 - Package to support the Xylo HW SYNS61201 (Xylo-A2)
syns65300 - Package to support the Xylo HW SYNS65300 (Xylo-A1)
mapper - Map a computational graph onto the Xylo v2 (SYNS61201) architecture
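The deployment pipeline traces a trained network to a graph, maps it, quantizes the result and builds a validated configuration. A sketch for SYNS61201, assuming net is a trained Rockpool module that fits the Xylo architecture:

    from rockpool.devices.xylo.syns61201 import mapper, config_from_specification, XyloSim
    from rockpool.transform import quantize_methods as q

    spec = mapper(net.as_graph())          # map the traced graph onto Xylo
    spec.update(q.global_quantize(spec))   # quantize the floating-point specification

    config, is_valid, msg = config_from_specification(**spec)
    assert is_valid, msg

    mod = XyloSim.from_config(config)      # simulate the configured chip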
Dynap-SE2 hardware support and simulation
See also the tutorial: Dynap-SE2 Application Programming Interface (API)
Simulation
Dynap-SE2 simulation module
DynapSim solves the dynamical chip equations for the DPI neuron and synapse models
Mismatch
mismatch_generator returns a function which simulates the analog device mismatch effect
frozen_mismatch_prototype processes the module attribute tree and returns a frozen mismatch prototype, which indicates the values to be deviated
dynamic_mismatch_prototype processes the module attribute tree and returns a dynamic mismatch prototype, which indicates the values to be deviated at run-time
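A sketch of how these functions compose, based on the descriptions above; the argument names and the signature of the returned function are assumptions to check against the Dynap-SE2 tutorial, and net is assumed to be a DynapSim-based module:

    import jax
    from rockpool.devices.dynapse import frozen_mismatch_prototype, mismatch_generator

    prototype = frozen_mismatch_prototype(net)   # which attributes to deviate
    # NOTE: percent_deviation and the regenerate() signature are assumptions
    regenerate = mismatch_generator(prototype, percent_deviation=0.30)
    new_params = regenerate(net.parameters(), jax.random.PRNGKey(0))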
Simulation to Device
mapper maps a computational graph onto the Dynap-SE2 architecture
autoencoder_quantization executes the unsupervised weight-configuration learning approach
config_from_specification takes a specification and creates a samna configuration object for the Dynap-SE2 chip
Computer Interface
find_dynapse_boards identifies the Dynap-SE2 boards plugged into the system
DynapseSamna bridges the gap between the chip and the computer
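A connection sketch, assuming config is a samna configuration produced by config_from_specification; the DynapseSamna constructor arguments are assumptions:

    from rockpool.devices.dynapse import find_dynapse_boards, DynapseSamna

    boards = find_dynapse_boards()           # enumerate attached Dynap-SE2 HDKs
    # NOTE: constructor arguments are assumptions; see the Dynap-SE2 tutorial
    se2 = DynapseSamna(boards[0], config)    # open an interface to the first board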
Device to Simulation
dynapsim_net_from_specification takes a specification and creates a sequential DynapSim network consisting of a linear layer (virtual connections) and a recurrent layer (hardware connections)
dynapsim_net_from_config constructs a DynapSim network from a samna configuration object
More
DynapseNeurons stores the core computational properties of a Dynap-SE network
DynapSimCore stores the simulation currents and manages the conversion from configuration objects
Graph tracing and mapping
Base modules
GraphModuleBase - Base class for graph modules
GraphModule - Describe a module of computation in a graph
GraphNode - Describe a node connecting GraphModule objects
GraphHolder - A GraphModule that encapsulates other graphs
as_GraphHolder - Encapsulate a GraphModule inside a GraphHolder
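Graphs are normally obtained by tracing a composed module with as_graph(); the result is what the hardware mappers above consume:

    from rockpool.nn.modules import Linear, LIF
    from rockpool.nn.combinators import Sequential

    net = Sequential(Linear((16, 32)), LIF(32))

    # Trace the network into a connected graph of GraphModule objects
    graph = net.as_graph()
    print(graph.input_nodes, graph.output_nodes)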
Computational graph modules
LinearWeights - A GraphModule that encapsulates a set of linear weights
GenericNeurons - A GraphModule that represents a generic set of neurons
AliasConnection - A GraphModule that specifies an aliasing connection between sets of nodes
LIFNeuronWithSynsRealValue - A GraphModule that represents a layer of LIF spiking neurons with synapses, using real-valued parameters
RateNeuronWithSynsRealValue - A GraphModule that represents a layer of rate neurons with synapses, using real-valued parameters
utils - Utilities for generating and manipulating computational graphs
General Utilities
backend_management - Utility functionality for managing backends
tree_utils - Utility functions for working with trees
type_handling - Convenience functions for checking and converting object types