Full API summary for Rockpool

rockpool

A machine learning library for training and deploying spiking neural network applications

Base classes

nn.modules.Module(*args, **kwargs)

The native Python / numpy Module base class for Rockpool

nn.modules.TimedModule(*args, **kwargs)

The Rockpool base class for all TimedModule modules

Attribute types

parameters.Parameter(data, family, ...)

Represent a module parameter

parameters.State(data, family, init_func, ...)

Represent a module state

parameters.SimulationParameter(data, family, ...)

Represent a module simulation parameter

parameters.Constant(obj)

Identify an initialisation argument as a constant (non-trainable) parameter
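The distinction between these attribute roles can be illustrated with a minimal sketch. The classes below are hypothetical stand-ins, not the real rockpool.parameters classes; they only show how tagging attributes by role lets a module separate trainable parameters from state and simulation settings.

```python
import numpy as np

# Hypothetical stand-ins for the three attribute roles (NOT the Rockpool
# classes): each simply tags a piece of data with its role.
class Parameter:            # trainable; gathered for optimisation
    def __init__(self, data): self.data = data

class State:                # evolves during simulation; reset between trials
    def __init__(self, data): self.data = data

class SimulationParameter:  # fixed configuration; never trained
    def __init__(self, data): self.data = data

def collect(attrs, kind):
    """Gather all attributes of a given role from a module's attribute dict."""
    return {k: v.data for k, v in attrs.items() if isinstance(v, kind)}

attrs = {
    "tau": Parameter(np.array([0.02, 0.02])),  # time constants: trainable
    "v": State(np.zeros(2)),                   # membrane state: resettable
    "dt": SimulationParameter(1e-3),           # time step: fixed
}

trainable = collect(attrs, Parameter)          # only "tau" is trainable
```

In this pattern, an optimiser sees only the `Parameter` attributes, while resetting a module touches only the `State` attributes.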

Alternative base classes

nn.modules.JaxModule(*args, **kwargs)

Base class for Module subclasses that use a Jax backend.

nn.modules.TorchModule(*args, **kwargs)

Base class for modules that are compatible with both Torch and Rockpool

Combinator modules

nn.combinators.FFwdStack(*args, **kwargs)

Assemble modules into a feed-forward stack, with linear weights in between

nn.combinators.Sequential(*args, **kwargs)

Build a sequential stack of modules by connecting them end-to-end

nn.combinators.Residual(*args, **kwargs)

Build a residual block over a sequential stack of modules
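The semantics of these combinators can be sketched with plain function composition. This is illustrative only: the real combinators accept Rockpool modules, manage their state, and (for FFwdStack) insert linear weights between stages.

```python
import numpy as np

def sequential(*fns):
    """Connect stages end-to-end, like nn.combinators.Sequential."""
    def stack(x):
        for f in fns:
            x = f(x)
        return x
    return stack

def residual(*fns):
    """Add the stack's input to its output, like nn.combinators.Residual."""
    inner = sequential(*fns)
    return lambda x: x + inner(x)

relu = lambda x: np.maximum(x, 0.0)
double = lambda x: 2.0 * x

net = sequential(double, relu)   # y = relu(2x)
res = residual(double)           # y = x + 2x = 3x

x = np.array([-1.0, 2.0])
```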

Time series classes

timeseries.TimeSeries([times, periodic, ...])

Base class to represent a continuous or event-based time series.

timeseries.TSContinuous([times, samples, ...])

Represents a continuously-sampled time series.

timeseries.TSEvent([times, channels, ...])

Represents a discrete time series, composed of binary events (present or absent).
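The two data models these classes wrap can be sketched in plain numpy: a continuous series pairs sample times with sample values, while an event series pairs event times with the channel on which each event occurred. This is illustrative only; the real classes add resampling, clipping, and plotting methods.

```python
import numpy as np

# Continuous series: a 50 Hz sine sampled at 1 ms, stored as (times, samples)
times = np.arange(0.0, 0.01, 1e-3)
samples = np.sin(2 * np.pi * 50 * times)

# Event series: three spikes on two channels, stored as (times, channels)
spike_times = np.array([0.002, 0.003, 0.007])
spike_channels = np.array([0, 1, 0])

# A common operation: rasterise the events onto the continuous time base
dt = 1e-3
raster = np.zeros((len(times), 2))
raster[np.round(spike_times / dt).astype(int), spike_channels] = 1.0
```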

Module subclasses

nn.modules.Rate(*args, **kwargs)

Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules

nn.modules.RateJax(*args, **kwargs)

Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules, with a Jax backend

nn.modules.RateTorch(*args, **kwargs)

Encapsulates a population of rate neurons, supporting feed-forward and recurrent modules, with a Torch backend

nn.modules.LIF(*args, **kwargs)

A leaky integrate-and-fire spiking neuron model

nn.modules.LIFJax(*args, **kwargs)

A leaky integrate-and-fire spiking neuron model, with a Jax backend

nn.modules.LIFTorch(*args, **kwargs)

A leaky integrate-and-fire spiking neuron model with a Torch backend

nn.modules.aLIFTorch(*args, **kwargs)

A leaky integrate-and-fire spiking neuron model with adaptive hyperpolarisation, with a Torch backend

nn.modules.LIFNeuronTorch(*args, **kwargs)

A leaky integrate-and-fire spiking neuron model
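The neuron dynamics shared by the LIF family can be sketched in plain numpy. The parameter values and the subtractive reset below are illustrative; the Rockpool modules expose configurable time constants, thresholds, and reset behaviour.

```python
import numpy as np

# Leaky integrate-and-fire sketch:
#   v[t+1] = alpha * v[t] + i_in[t];  spike and reset when v crosses threshold.
dt, tau, threshold = 1e-3, 20e-3, 1.0
alpha = np.exp(-dt / tau)          # membrane leak per time step

i_in = np.full(100, 0.1)           # constant input current, 100 steps
v = 0.0
spikes = np.zeros_like(i_in)

for t, i in enumerate(i_in):
    v = alpha * v + i
    if v >= threshold:             # fire and subtract the threshold
        spikes[t] = 1.0
        v -= threshold

n_spikes = int(spikes.sum())       # regular firing driven by constant input
```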

nn.modules.UpDownTorch(*args, **kwargs)

Feedforward layer that converts each analogue input channel to one spiking up and one spiking down channel.
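The up/down conversion idea can be sketched as delta modulation: emit an "up" event when the signal has risen by a threshold since the last event, and a "down" event when it has fallen by the same amount. This is illustrative; the module's exact algorithm and parameters may differ.

```python
import numpy as np

thr = 0.5                          # illustrative delta threshold
signal = np.array([0.0, 0.3, 0.7, 1.2, 0.9, 0.4, 0.1])

ref = signal[0]                    # last "acknowledged" level
up, down = [], []                  # event times on the two output channels
for t, x in enumerate(signal):
    while x - ref >= thr:          # signal rose by thr: up event
        up.append(t)
        ref += thr
    while ref - x >= thr:          # signal fell by thr: down event
        down.append(t)
        ref -= thr
```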

nn.modules.Linear(*args, **kwargs)

Encapsulates a linear weight matrix

nn.modules.LinearJax(*args, **kwargs)

Encapsulates a linear weight matrix, with a Jax backend

nn.modules.LinearTorch(*args, **kwargs)

Applies a linear transformation to the incoming data: \(y = xA + b\)

nn.modules.Instant(*args, **kwargs)

Wrap a callable function as an instantaneous Rockpool module

nn.modules.InstantJax(*args, **kwargs)

Wrap a callable function as an instantaneous Rockpool module, with a Jax backend

nn.modules.InstantTorch(*args, **kwargs)

Wrap a callable function as an instantaneous Rockpool module, with a Torch backend

nn.modules.ExpSyn(*args, **kwargs)

Exponential synapse module

nn.modules.ExpSynJax(*args, **kwargs)

Exponential synapse module with a Jax backend

nn.modules.ExpSynTorch(*args, **kwargs)

Exponential synapse module with a Torch backend
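The dynamics behind the ExpSyn family can be sketched in plain numpy: each input spike kicks the synaptic current, which then decays exponentially with time constant tau. The values below are illustrative.

```python
import numpy as np

dt, tau = 1e-3, 10e-3
beta = np.exp(-dt / tau)           # decay factor per time step

spikes = np.zeros(50)
spikes[[5, 6, 20]] = 1.0           # three input spikes

i_syn = np.zeros_like(spikes)
i = 0.0
for t, s in enumerate(spikes):
    i = beta * i + s               # kick on spike, then decay
    i_syn[t] = i
```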

nn.modules.SoftmaxJax(*args, **kwargs)

A Jax-backed module implementing a smoothed weighted softmax, compatible with spiking inputs

nn.modules.LogSoftmaxJax(*args, **kwargs)

A Jax-backed module implementing a smoothed weighted log softmax, compatible with spiking inputs

nn.modules.ButterMelFilter(*args, **kwargs)

Define a Butterworth filter bank (mel spacing) filtering layer with continuous sampled output

nn.modules.ButterFilter(*args, **kwargs)

Define a Butterworth filter bank filtering layer with continuous output

nn.modules.LIFExodus

Placeholder (MissingBackendShim): the Exodus backend was unavailable when this summary was generated. With the sinabs.exodus backend installed, this resolves to a LIF spiking neuron module with an Exodus backend

nn.modules.LIFMembraneExodus

Placeholder (MissingBackendShim): the Exodus backend was unavailable when this summary was generated. With the sinabs.exodus backend installed, this resolves to a LIF membrane-dynamics module with an Exodus backend

Layer subclasses from Rockpool v1

These classes are deprecated, but are still usable via the high-level API, until they are converted to the v2 API.

nn.layers.Layer(weights[, dt, noise_std, name])

Base class for Layers in rockpool

nn.layers.FFIAFSpkInBrian(*args, **kwargs)

Spiking feedforward layer with spiking inputs and outputs

nn.layers.RecIAFSpkInBrian(*args, **kwargs)

Spiking recurrent layer with spiking in- and outputs, and a Brian2 backend

nn.layers.FFExpSynBrian(*args, **kwargs)

Define an exponential synapse layer (spiking input), with a Brian2 backend

Conversion utilities

nn.modules.timed_module.TimedModuleWrapper(...)

Wrap a low-level Rockpool Module automatically into a TimedModule object

nn.modules.timed_module.LayerToTimedModule(...)

An adapter class to wrap a Rockpool v1 Layer object, converting the object to support the TimedModule high-level Rockpool v2 API

nn.modules.timed_module.astimedmodule([...])

Convert a Rockpool v1 class to a v2 class

Jax training utilities

training.jax_loss

Jax functions useful for training networks using Jax Modules.

training.adversarial_jax

Functions to implement adversarial training approaches using Jax

training.adversarial_jax.pga_attack(...[, ...])

Perform a projected gradient ascent (PGA) attack on the network parameters, given inputs
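The projected gradient ascent idea can be sketched on a toy loss. This is illustrative only: the real pga_attack operates on a network's parameters under a task loss, whereas here `grad_fn` and the quadratic loss are stand-ins.

```python
import numpy as np

def pga(theta0, grad_fn, eps=0.1, step=0.05, n_steps=20):
    """Toy projected gradient ascent: step up the loss gradient, then
    project back into an eps-ball (here a box) around the original
    parameters theta0."""
    theta = theta0.copy()
    for _ in range(n_steps):
        theta = theta + step * np.sign(grad_fn(theta))        # ascend
        theta = np.clip(theta, theta0 - eps, theta0 + eps)    # project
    return theta

# Example loss L(theta) = sum(theta**2), whose gradient is 2 * theta
theta0 = np.array([1.0, -2.0])
worst = pga(theta0, lambda th: 2.0 * th)   # pushed to the ball's boundary
```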

training.adversarial_jax.adversarial_loss(...)

Implement a hybrid task / adversarial robustness loss

PyTorch training utilities

training.torch_loss

Torch loss functions and regularizers useful for training networks using Torch Modules.

Xylo hardware support and simulation

devices.xylo.config_from_specification(...)

Convert a full network specification to a Xylo config and validate it

devices.xylo.load_config(filename)

Read a Xylo configuration from disk in JSON format

devices.xylo.save_config(config, filename)

Save a Xylo configuration to disk in JSON format
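The save/load round trip can be sketched with a plain dict standing in for the configuration object. The real functions take and return a Xylo configuration; the field names below are purely illustrative.

```python
import json, os, tempfile

# Illustrative stand-in for a Xylo configuration (field names hypothetical)
config = {"weights_in": [[1, 0], [0, 1]], "dash_mem": [4, 4]}

path = os.path.join(tempfile.mkdtemp(), "xylo_config.json")
with open(path, "w") as f:
    json.dump(config, f)           # analogous to devices.xylo.save_config

with open(path) as f:
    loaded = json.load(f)          # analogous to devices.xylo.load_config
```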

devices.xylo.XyloSim(*args, **kwargs)

A Module simulating a digital SNN on Xylo, using XyloSim as a backend.

devices.xylo.XyloSamna(*args, **kwargs)

A spiking neuron Module backed by the Xylo hardware, via samna.

devices.xylo.AFESim(*args, **kwargs)

A Module that simulates analog hardware for preprocessing audio

devices.xylo.AFESamna(*args, **kwargs)

Interface to the Audio Front-End module on a Xylo-A2 HDK

devices.xylo.DivisiveNormalisation(*args, ...)

A digital divisive normalization block

devices.xylo

Xylo-family device simulations, deployment and HDK support

devices.xylo.xylo_devkit_utils

Utilities for working with the Xylo HDK.

devices.xylo.syns61201

Package to support the Xylo HW SYNS61201 (Xylo-A2)

devices.xylo.syns61300

Package to support the Xylo HW SYNS61300 (Xylo-1 core; "Pollen")

devices.xylo.syns65300

Package to support the Xylo HW SYNS65300 (Xylo-A1)

devices.xylo.mapper(graph[, weight_dtype, ...])

Map a computational graph onto the Xylo v1 architecture

devices.xylo.XyloHiddenNeurons(input_nodes, ...)

A graph.GraphModule encapsulating Xylo v1 hidden neurons

devices.xylo.XyloOutputNeurons(input_nodes, ...)

A graph.GraphModule encapsulating Xylo v1 output neurons

Graph tracing and mapping

Base modules

graph.GraphModuleBase(input_nodes, ...)

Base class for graph modules

graph.GraphModule(input_nodes, output_nodes, ...)

Describe a module of computation in a graph

graph.GraphNode(source_modules, sink_modules)

Describe a node connecting GraphModule objects

graph.GraphHolder(input_nodes, output_nodes, ...)

A GraphModule that encapsulates other graphs

graph.graph_base.as_GraphHolder(g)

Encapsulate a GraphModule inside a GraphHolder

Computational graph modules

graph.LinearWeights(input_nodes, ...[, biases])

A GraphModule that encapsulates a single set of linear weights

graph.GenericNeurons(input_nodes, ...)

A GraphModule that encapsulates a set of generic neurons

graph.AliasConnection(input_nodes, ...)

A GraphModule that encapsulates a set of alias connections

graph.LIFNeuronWithSynsRealValue(...)

A GraphModule that encapsulates a set of LIF spiking neurons with synaptic and membrane dynamics, and with real-valued parameters

graph.RateNeuronWithSynsRealValue(...)

A GraphModule that encapsulates a set of rate neurons, with synapses, and with real-valued parameters

graph.utils

Utilities for generating and manipulating computational graphs