Change log
All notable changes between Rockpool releases will be documented in this file.
[v3.0.1] - 2025-10-09
Changed
Update release instructions to add logo color information
Update functions and methods to be compatible with Python 3.12 and NumPy >= 2.0
Update Docker image for the CI pipeline
[v3.0.0.1 hotfix] - 2025-07-02
Fixed
Improve precision of power measurement by starting and stopping power measurement within the evolve call of XyloMonitor/XyloSamna
[v3.0.0] - 2025-06-17
Removed
Remove support for XyloTestBoard, XyloDevKit and XyloAudio 1, following updates in Samna 0.46.0. This breaks compatibility with older versions.
Updated
The as_graph method of the ahp_lif module was changed to make it compatible with the mapper of XyloAudio 2
Added
Add missing audio samples for the XyloAudio 3 tutorial
Using XyloSamna and XyloMonitor to deploy a model on the XyloAudio 3 HDK
Fixed
Initialization of XyloSamna or XyloMonitor in a loop was crashing when recording power.
[v2.9.2.2 hotfix] - 2025-05-26
Fixed
Restrict the maximum version of samna, to prevent unexpected updates breaking Rockpool
[v2.9.2.1 hotfix] - 2025-05-09
Fixed
The audio front end using PDM input data was not configured correctly for XyloAudio 3
[v2.9.2] - 2025-04-17
Added
Power measurement for XyloAudio 3
AFESamna module for XyloAudio 3, allowing recording of input spikes from a live microphone
Complete set of tutorials for XyloAudio 3
Fixed
Remove dependency on the imp module, which was removed in Python 3.12
Fixed bug in multiplying TSContinuous objects with differing numbers of channels
[v2.9.1 hotfix] - 2024-10-14
Fixed
The Rockpool package was not generated on conda-forge. Updated package build requirements.
[v2.9] - 2024-10-11
Added
Support for the Xylo™ Audio 3 development kit:
Hardware interface via samna
Digital microphone input and simulation package
Cycles model
Simulation support for the audio front end: AFESimExternal, AFESimAGC and AFESimPDM, with all the necessary sub-modules
Tutorial and documentation for the SynNet architecture, to improve visibility
Changed
Update release notes for developers in the documentation
Add check for version
Add check for copyright
Update dependency version of Jax to >=0.4.28
Move instructions for building the documentation into the Contributing section
[v2.8] - 2024-06-24
Added
Add cycles model for Xylo Audio and Xylo IMU, enabling users to calculate the required master clock frequency for Xylo
Add support for NIR, for importing and exporting Rockpool torch networks
Changed
LIFExodus now supports vectors as the threshold parameter
Standard LIF modules now have w_rec as a simulation parameter when in non-recurrent mode
Fixed
Fixed a TypeError when using LIFExodus
Updated jax.config usage
Power measurement for xyloA2 was not considering AFE channels
Removed check_grads from Jax tests, since this will fail for LIF neurons due to surrogate gradients
Fixed a bug in AFESim on Windows, where the maximum int32 value would be exceeded when seeding the AFE simulation
Fixed stochasticity in some unit tests
Fixed a bug in channel_quantize, where quantization would be incorrectly applied for Xylo IMU networks with Nien < Nhid
Fixed a bug in channel_quantize, where hidden unit biases would be incorrectly used in place of output unit biases
Fixed a non-handled buffer bug in LIFJax, where non-recurrent modules would sometimes have garbage in w_rec instead of all zeros
Fixed a bug in TorchSequential.as_graph(), where torch module functions would be called instead of rockpool modules, leading to a failing call to .as_graph()
Deprecated
Brian2 tests are not running; the Brian2 backend will be removed soon
[v2.7.1 hotfix] - 2024-01-19
Fixed
Bug in the Xylo IMU mapper, where networks with more than 128 hidden neurons could not be mapped
[v2.7] - 2023-09-25
Added
Dependency on pytest-random-order v1.1.0 for test order randomization
New HowTo tutorial for performing constrained optimisation with torch and jax
Xylo IMU application software support:
mapper, config_from_specification and graph mapping support
XyloSim module: SNN core simulation for Xylo IMU
IMUIFSim module: simulation of the input-encoding interface, with sub-modules BandPassFilter, FilterBank, RotationRemoval, IAFSpikeEncoder and ScaleSpikeEncoder
XyloIMUMonitor module: real-time hardware monitoring for Xylo IMU
XyloSamna module: interface to the SNN core
IMUIFSamna module: interface to IMUIF, utilizing neurons in the SNN core
IMUData module: collection of sensor data from the onboard IMU sensor
Utility functions for network mapping to the Xylo IMU HDK, interfacing, and data processing
Introductory documentation providing an overview of Xylo IMU and instructions on configuring preprocessing
New losses, with structure similar to PyTorch:
PeakLoss, which can be imported as peak_loss = rockpool.nn.losses.PeakLoss()
MSELoss, which can be imported as mse_loss = rockpool.nn.losses.MSELoss()
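A minimal usage sketch of the new loss classes. The import lines are taken from the entries above; the call convention mirroring torch.nn losses, and the tensor shapes, are assumptions:

```python
import torch
import rockpool.nn.losses

# Instantiate the losses as described above
mse_loss = rockpool.nn.losses.MSELoss()
peak_loss = rockpool.nn.losses.PeakLoss()

# Illustrative (batch, time, channel) tensors; requires_grad so the loss can backpropagate
prediction = torch.rand(1, 100, 4, requires_grad=True)
target = torch.rand(1, 100, 4)

# Assumed call convention, mirroring torch.nn loss modules
loss = mse_loss(prediction, target)
loss.backward()
```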
Changed
Update dependency version of pytest-xdist to >=3.2.1
Update to the Sequential API: Sequential now permits instantiation with an OrderedDict to specify module names, and supports an .append() method to append new modules, optionally specifying a module name (see the sketch after this list)
Cleaned up tree manipulation libraries and added them to the documentation. Implemented unit tests
Removed obsolete unit tests
Changed semantics of transformation configurations for QAT, to only include attributes which will be transformed, rather than all attributes. This fixes an incompatibility with torch >= 2.0
Added support for the latest torch versions
New fine-grained installation options
Renamed the power measurement dict keys returned by the Xylo Audio 2 (syns61201) XyloSamna module, to be more descriptive
Upgrade minimum Python version supported to 3.8
Upgrade minimum JAX version supported to 0.4.10
Rearranged Xylo documentation to separate the overview, Xylo Audio and Xylo IMU
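A brief sketch of the updated Sequential API described in this list. The OrderedDict instantiation and .append() behaviour follow the entry above; the module classes, shapes and exact .append() signature are illustrative assumptions:

```python
from collections import OrderedDict
from rockpool.nn.combinators import Sequential
from rockpool.nn.modules import Linear, LIF

# Instantiate with an OrderedDict to specify module names
net = Sequential(OrderedDict([
    ("lin0", Linear((16, 32))),
    ("lif0", LIF(32)),
]))

# Append a new module, optionally specifying a module name (argument form assumed)
net.append(Linear((32, 4)), "readout")
```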
Fixed
Fixed bug in initialising access to the MC3620 IMU sensor on the Xylo IMU HDK, where it would fail with an error on the second initialisation
Removed
NEST backend completely removed
Removed the spiking output surrogate "U" from LIF modules
[v2.6] - 2023-03-22
Added
Dynap-SE2 application software support (jax backend):
DynapSim neuron model with its own custom surrogate gradient implementation
DynapSamna module handling the low-level HDK interface under the hood
rockpool network <-> hardware configuration bi-directional conversion utilities
Network mapping: mapper() and config_from_specification(), sequentially combined
LinearJax + DynapSim network getters: dynapsim_net_from_config() and dynapsim_net_from_spec()
Transistor lookup tables to ease high-level parameters <-> currents <-> DAC (coarse, fine values) conversions
Dynap-SE2 specific auto-encoder quantization: autoencoder_quantization()
Custom DigitalAutoEncoder implementation and training pipeline
samna alias classes compensating for the missing documentation support
Unit tests + tutorials + developer docs
DynapseNeuron graph module which supports conversion from and to LIFNeuronWithSynsRealValue graphs
Hardcoded frozen and dynamic mismatch prototypes
Mismatch transformation (jax)
LIFExodus now supports training time constants, and multiple time constants
Improved API for LIFTorch
Implemented ExpSynExodus for accelerated training of exponential synapse modules
Added initial developer documentation
Added MNIST tutorial
Fixed notebook links to MyBinder.org
Changed
Updated Samna version requirement to >=0.19.0
User explicitly defines the CUDA device for LIFExodus, ExpDynExodus and LIFMembraneExodus
Improved error message when a backend is missing
Improved transient removal in syns61201.AFESim
Fixed
Weight scaling was too different for output layers and hidden layers
Hotfix: regression in LIFExodus
[v2.5] - 2022-11-29
Added
Added support for Xylo-Audio v2 (SYNS61201) devices and HDK
Added hardware versioning for Xylo devices
Added a beta implementation of Quantisation-Aware Training for the Torch backend, in rockpool.transform.torch_transform
Added support for parameter boundary constraints in rockpool.training.torch_loss (see the sketch after this list)
Added tutorial for Spiking Heidelberg Digits audio classification
Added tutorial and documentation for the WaveSense network architecture
Added support to LIFTorch for training decays and bitshift parameters
Added a new utility package rockpool.utilities.tree_utils
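A sketch of the parameter boundary constraint support mentioned in this list, assuming rockpool.training.torch_loss mirrors the make_bounds() / bounds_cost() pattern of Rockpool's jax_loss package; the torch_loss helper names and signatures here are assumptions:

```python
from rockpool.nn.modules import LIFTorch
from rockpool.training import torch_loss as tl

net = LIFTorch(4)

# Build bounds dictionaries matching the network parameters (assumed helper),
# then constrain the membrane time constants
lower, upper = tl.make_bounds(net.parameters())
lower["tau_mem"] = 1e-3
upper["tau_mem"] = 100e-3

# Penalty term to add to the task loss during training (assumed helper)
penalty = tl.bounds_cost(net.parameters(), lower, upper)
```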
Changed
Updated support for Exodus v1.1
Updated XyloSim.from_specification to handle NIEN ≠ NRSN ≠ NOEN for Xylo devices
Updated LIFTorch to provide proper taus for .as_graph() in case of decay and bitshift training
Improved backend management, to test torch version requirements
Fixed
Fixed usage of Jax optimisers in tutorial notebooks to reflect Jax API changes
Fixed issues with LIFTorch and aLIFTorch which prevented the deepcopy protocol
Fixed bug in tree_utils, where Tree was used instead of dict in an isinstance check
Replaced outdated reference from FFRateEuler to the Rate module in the high-level API tutorial
Fixed seeds in torch and numpy to avoid a nan loss problem while training in the tutorials
Fixed bug in TorchModule, where assigning to an existing registered attribute would clear the family of the attribute
Fixed a bug in Constant handling for torch.Tensors, which would raise errors in torch 1.12
Fixed bug in LIFTorch, which would cause recorded state to hang around post-evolution, causing errors from deepcopy
Fixed bug in Module._register_module(), where replacing an existing submodule would cause the string representation to be incorrect
[v2.4.2] - 2022-10-06
Hotfix
Improved handling of weights when using XyloSim.from_specification
[v2.4] - 2022-08
Major changes
Linear... modules now do not have a bias parameter, by default.
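A sketch of the new default, assuming the Linear... modules expose a has_bias constructor flag (the flag name is an assumption); a bias must now be requested explicitly:

```python
from rockpool.nn.modules import LinearTorch

# Default: no bias parameter
lin = LinearTorch(shape=(16, 8))

# Request a bias explicitly (flag name assumed)
lin_biased = LinearTorch(shape=(16, 8), has_bias=True)
print(list(lin_biased.parameters().keys()))  # expect 'weight' and 'bias'
```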
Added
Support for Xylo SNN core v2, via XyloSim, including biases and quantisation support; mapping and deployment for Xylo SNN core v2 (SYNS61201)
Added support for the Xylo-A2 test board, with audio recording support from the Xylo AFE (AFESamna and XyloSamna)
Support for an LIF neuron with a trainable adaptive threshold (aLIFTorch), deployable to Xylo
New module BooleanState, which maintains a boolean state
Support for membrane potential training using LIFExodus
Changed
Xylo package support for HW versioning (SYNS61300; SYNS61201)
Ability to return events, membrane potentials or synaptic currents as output from XyloSim and XyloSamna
Enhanced the Xylo mapper to be more lenient about weight matrix size; missing weights are now assumed to be zero
The Xylo mapper is now more lenient about HW constraints, permitting larger numbers of input and output channels than supported by existing HDKs
The Xylo mapper supports a configurable maximum number of hidden and output neurons
Running black is enforced by the CI pipeline
Linear... modules now export bias parameters, if they are present
Linear... modules now do not include bias parameters by default
The Xylo mapper now raises a warning if any linear weights have biases
LIFSlayer renamed to LIFExodus, corresponding to the sinabs.exodus library name change
Periodic exponential surrogate function now supports training thresholds
Fixed
Fixes related to torch modules moved to simulation devices
Fixed issue in dropout.py, where an ImportError was raised if jax was missing
Fixed an issue with Constant torch parameters, where deepcopy would raise an error
Fixed issue with newer versions of torch; torch v1.12 is now supported
Updated to support changes in the latest jax API
Fixed bug in WavesenseNet, where the neuron class would not be checked properly
Fixed bug in channel_quantize, where unquantized weights were returned instead of quantized weights
Deprecated
LIFSlayer is now deprecated
[v2.3.1] - 2022-03-24
Hotfix
Improved CI pipeline, such that the pipeline is not blocked when sinabs.exodus cannot be installed
Fixed UserWarning raised by some torch-backed modules
Improved some unit tests
[v2.3] - 2022-03-16
Added
Standard dynamics introduced for LIF, Rate, Linear, Instant, ExpSyn. These are standardised across the Jax, Torch and Numpy backends. We make efforts to guarantee identical dynamics for the standard modules across these backends, down to numerical precision
LIF modules can now train thresholds and biases as well as time constants
New JaxODELIF module, which implements a trainable LIF neuron following common dynamical equations for LIF neurons
New addition of the WaveSense network architecture, for temporal signal processing with SNNs. This is available in rockpool.networks, and is documented with a tutorial
A new system for managing computational graphs, and mapping these graphs onto hardware architectures, was introduced. These are documented in the Xylo quick-start tutorial, and in more detail in tutorials covering Computational Graphs and Graph Mapping. The mapping system performs design-rule checks for the Xylo HDK
Included methods for post-training quantisation for Xylo, in rockpool.transform
Added simulation of a divisive normalisation block for Xylo audio applications
Added a Residual combinator, for convenient generation of networks with residual blocks
Support for sinabs layers and Exodus
Module, JaxModule and TorchModule provide facility for auto-batching of input data. Input data shape is (B, T, Nin), or (T, Nin) when only a single batch is provided (see the sketch after this list)
Expanded documentation on parameters and type-hinting
Changed
Python > 3.6 is now required
Improved import handling when various computational back-ends are missing
Updated for new versions of samna
Renamed Cimulator -> XyloSim
Better parameter handling and rockpool/torch parameter registration for Torch modules
(Most) modules can accept batched input data
Improved / additional documentation for Xylo
Fixed
Improved type casting and device handling for Torch modules
Fixed bug in Module, where modules() would return a non-ordered dict. This caused issues with JaxModule
Removed
Removed several obsolete Layers and Networks from Rockpool v1
[v2.2] - 2021-09-09
Added
Added support for the Xylo development kit in .devices.xylo, including several tutorials
Added CTC loss implementations in .training.ctc_loss
New trainable torch modules: LIFTorch and others in .nn.modules.torch, including an asynchronous delta modulator UpDownTorch
Added torch training utilities and loss functions in .training.torch_loss
New TorchSequential class to support the Sequential combinator for torch modules
Added a FFwdStackTorch class to support the FFwdStack combinator for torch modules
Changed
Existing LIFTorch module renamed to LIFBitshiftTorch; updated the module to align better with the Rockpool API
Improvements to the .typehints package
TorchModule now raises an error if submodules are not Torch modules
Fixed
Updated the LIF torch training tutorial to use the new LIFBitshiftTorch module
Improved installation instructions for zsh
[v2.1] - 2021-07-20
Added
Adversarial training of parameters using the Jax back-end, including a tutorial
"Easter" tutorial demonstrating an SNN trained to generate images
Torch tutorials for training non-spiking and spiking networks with Torch back-ends
Added new method nn.Module.timed(), to automatically convert a module to a TimedModule
New LIFTorch module that permits training of neuron and synaptic time constants in addition to other network parameters
New ExpSynTorch module: exponential leak synapses with Torch back-end
New LinearTorch module: linear model with Torch back-end
New LowPass module: exponential smoothing with Torch back-end
New ExpSmoothJax module: single time-constant exponential smoothing layer, supporting arbitrary transfer functions on output
New softmax and log_softmax losses in the jax_loss package
New utilities.jax_tree_utils package containing useful parameter tree handling functions
New TSContinuous.to_clocked() convenience method, to easily rasterise a continuous time series (see the sketch after this list)
Alpha: optional _wrap_recorded_state() method added to the nn.Module base class, which supports wrapping recorded state dictionaries as TimeSeries objects when using the high-level TimeSeries API
Support for the add_events flag for the time-series wrapper class
New Parameter dictionary classes to simplify conversion and handling of Torch and Jax module parameters
Added astorch() method to parameter dictionaries returned from TorchModule
Improved type hinting
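A short sketch of the new TSContinuous.to_clocked() method flagged above; the exact signature and return type are assumptions:

```python
import numpy as np
from rockpool import TSContinuous

# A continuous time series from irregularly-sampled data
times = np.array([0.0, 0.12, 0.31, 0.48, 0.95])
ts = TSContinuous(times, np.sin(times))

# Rasterise onto a regular 10 ms clock (assumed: takes the sample period,
# returns samples on a regular time base)
clocked = ts.to_clocked(10e-3)
```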
Changed
Old LIFTorch module renamed to LIFBitshiftTorch
Kaiming and Xavier initialisation support for Linear modules
Linear modules provide a bias by default
Moved the filter_bank package from V1 layers into nn.modules
Update Jax requirement to > v2.13
Fixed
Fixed binder links for tutorial notebooks
Fixed bug in Module for multiple inheritance, where the incorrect __repr__() method would be called
Fixed TimedModuleWrapper.reset_state() method
Fixed axis limit bug in the TSEvent.plot() method
Removed page width constraint for docs
Enable the FFExpSyn module by making it independent of the old RRTrainedLayer
Deprecated
Removed rpyc dependency
[v2.0] - 2021-03-24
New Rockpool API. Breaking change from v1.x
Documentation for the new API
Native support for Jax and Torch backends
Many v1 Layers transferred
[v1.1.0.4] - 2020-11-06
Hotfix to remove references to ctxctl and aiCTX
Hotfix to include NEST documentation in CI-built docs
Hotfix to include the change log in built docs
[v1.1] - 2020-09-12
Added
Considerably expanded support for Denève-Machens spike-timing networks, including training arbitrary dynamical systems in a new RecFSSpikeADS layer. Added tutorials for standard D-M networks for linear dynamical systems, as well as a tutorial for training ADS networks
Added a new "Intro to SNNs" getting-started guide
A new "sharp points of Rockpool" tutorial collects the tricks and traps for new users and old
A new Network class, JaxStack, supports stacking and end-to-end gradient-based training of all Jax-based layers. A new tutorial has been added for this functionality
TimeSeries classes now support best-practices creation from clock or rasterised data. TSContinuous provides a .from_clocked() method, and TSEvent provides a .from_raster() method for this purpose. .from_clocked() uses sample-and-hold interpolation, for intuitive generation of time series from periodically-sampled data. TSContinuous now supports a .fill_value property, which permits extrapolation using scipy.interpolate (see the sketch after this list)
New TSDictOnDisk class for storing TimeSeries objects transparently on disk
Allow ignoring data points for specific readout units in ridge regression Fisher relabelling. To be used, for example, with all-vs-all classification
Added exponential synapse Jax layers
Added the RecLIFCurrentIn_SO layer
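A minimal sketch of the two creation methods described in this list, assuming from_clocked(samples, dt) and from_raster(raster, dt) call conventions (the exact signatures are assumptions):

```python
import numpy as np
from rockpool import TSContinuous, TSEvent

# Periodically-sampled data -> continuous series, with sample-and-hold interpolation
samples = np.random.rand(100, 2)        # (T, channels), sampled every 1 ms
ts_cont = TSContinuous.from_clocked(samples, dt=1e-3)

# Boolean spike raster -> event time series
raster = np.random.rand(100, 2) > 0.9   # (T, channels)
ts_evt = TSEvent.from_raster(raster, dt=1e-3)
```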
Changed
TSEvent time series no longer support creation without explicitly setting t_stop. The previous default of taking the final event time as t_stop was causing too much confusion. For related reasons, TSEvent now forbids events to occur at t_stop
TimeSeries classes by default no longer permit sampling outside of the time range for which they are defined, raising a ValueError exception if this occurs. This renders safe several traps that new users were falling into. This behaviour is selectable per time series, and can be converted to a warning instead of an exception using the beyond_range_exception flag
Jax trainable layers now import from a new mixin class JaxTrainer. The class provides a default loss function, which can be overridden in each sub-class to provide suitable regularisation. The training interface now returns loss value and gradients directly, rather than requiring an extra function call and additional evolution
Improved training method for JAX rate layers, to permit parameterisation of the loss function and optimiser
Improved the ._prepare_input...() methods in the Layer class, such that all Layers that inherit from this superclass are consistent in the number of time steps returned from evolution
The Network.load() method is now a class method
Test suite now uses multiple cores for faster testing
Changed company branding from aiCTX -> SynSense
Documentation is now hosted at https://rockpool.ai
Fixed
Fixed bugs in the precise spike-timing layer RecSpikeBT
Fixed behavior of the Layer class when passing weights in the wrong format
Stability improvements in DynapseControl
Fixed faulty z_score_standardization and Fisher relabelling in RidgeRegrTrainer. Fisher relabelling now has better handling of differently sized batches
Fixed bugs in saving and loading several layers
More sensible default values for VirtualDynapse baseweights
Fixed handling of an empty channels argument in the TSEvent._matching_channels() method
Fixed bug in Layer._prepare_input, where it would raise an AssertionError when no input TS was provided
Fixed a bug in train_output_target, where the gradient would be incorrectly handled if no batching was performed
Fixed the to_dict method for FFExpSynJax classes
Removed redundant _prepare_input() method from the Torch layer
Many small documentation improvements
[v1.0.8] - 2020-01-17
Added
Introduced new TimeSeries class method concatenate_t(), which permits construction of a new time series by concatenating a set of existing time series in the time dimension (see the sketch after this list)
The Network class now provides a to_dict() method for export. Network can now also treat sub-Networks as layers
Training methods for spiking LIF Jax-backed layers in rockpool.layers.training. Tutorial demonstrating SGD training of a feed-forward LIF network. Improvements in JAX LIF layers
Added filter_bank layers, providing Layer subclasses which act as filter banks with spike-based output
Added a filter_width parameter for Butterworth filters
Added a convenience function start_at_zero() to delay a TimeSeries so that it starts at 0
Added a change log in CHANGELOG.md
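A quick sketch of the new concatenate_t() class method mentioned above, assuming it can be called on the class with a set of compatible time series (the exact call convention is an assumption):

```python
import numpy as np
from rockpool import TSContinuous

ts_a = TSContinuous(np.arange(3) * 0.1, np.random.rand(3, 2))
ts_b = TSContinuous(np.arange(3) * 0.1, np.random.rand(3, 2))

# Concatenate in the time dimension (assumed call convention)
ts_long = TSContinuous.concatenate_t([ts_a, ts_b])
```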
Changed
Improved TSEvent.raster() to make it more intuitive. Rasters are now produced in line with time bases that can be created easily with numpy.arange() (see the sketch after this list)
Updated conda_merge_request.sh to work for the conda feedstock
TimeSeries.concatenate() renamed to concatenate_t()
RecRateEuler warns if tau is too small, instead of silently changing dt
Fixed or improved
Fixed issue in Layer, where an internal property was used when accessing ._dt. This causes issues with layers that have an unusual internal type for ._dt (e.g. if data is stored in a JAX variable on GPU)
Reduced the memory footprint of .TSContinuous by approximately half
Reverted a regression in the layer class .RecLIFJax_IO, where dt was by default set to 1.0, instead of being determined by tau_...
Fixed incorrect use of Optional[] type hints
Allow for small numerical differences in comparison between weights in the NEST test test_setWeightsRec
Improvements in inline documentation
Increased memory efficiency of FFExpSyn._filter_data by reducing the kernel size
Implemented numerically stable timestep count for TSEvent rasterisation
Fixed bugs in RidgeRegrTrainer
Fixed plotting issue in time series
Fixed bug of RecRateEuler not handling the dt argument in __init__()
Fixed scaling between torch and nest weight parameters
Moved the contains() method from TSContinuous to the TimeSeries parent class
Fixed warning in RRTrainedLayer._prepare_training_data() when times of target and input are not aligned
Brian layers: replaced np.asscalar with float
[v1.0.7.post1] - 2019-11-28
Added
New .Layer superclass .RRTrainedLayer. This superclass implements ridge regression for layers that support ridge regression training
.TimeSeries subclasses now add axes labels on plotting
New spiking LIF JAX layers, with documentation and tutorials: .RecLIFJax, .RecLIFJax_IO, .RecLIFCurrentInJax, .RecLIFCurrentInJAX_IO
Added save and load facilities to .Network objects
._matching_channels() now accepts an arbitrary list of event channels, which is used when analysing a periodic time series
Changed
Documentation improvements
The .TSContinuous.plot method now supports stagger and skip arguments
.Layer and .Network now deal with a .Layer.size_out attribute. This is used to determine whether two layers are compatible to connect, rather than using .size
Extended unit test for periodic event time series to check non-periodic time series as well
Fixed
Fixed bug in TSEvent.plot(), where stop times were not correctly handled
Fix bug in Layer._prepare_input_events(), where if only a duration was provided, the method would return an input raster with an incorrect number of time steps
Fixed bugs in handling of periodic event time series .TSEvent
Bug fix: .Layer._prepare_input_events was failing for .Layers with spiking input
TSEvent.__call__() now correctly handles periodic event time series
[v1.0.6] - 2019-11-01
CI build and deployment improvements
[v1.0.5] - 2019-10-30
CI build and deployment improvements
[v1.0.4] - 2019-10-28
Remove deployment dependency on docs
Hotfix: fix link to Black
Add links to gitlab docs
[v1.0.3] - 2019-10-28
Hotfix for incorrect license text
Updated installation instructions
Included some status indicators in readme and docs
Improved CI
Extra meta-data detail in setup.py
Added more detail for contributing
Update README.md
[v1.0.2] - 2019-10-25
First public release