Feed-forward neural network for Python / News: Recent posts

ffnet-0.8.3 released

In this version we switched to the LGPL. Only minor code updates were made, with no API changes. Along with this version, a new graphical user interface, ffnetui-0.8.3, has been released under the GPL-3 licence.

Posted by Marek 2016-02-15

ffnet-0.8.0 released

This version supports Python 3. Only minor code changes (and no API changes) were made compared to the previous release; all scripts should run without problems.

Posted by Marek 2015-02-16

ffnet-0.7.1 released

This release contains mainly documentation improvements and changes in the examples. Also take a look at the new Sphinx-based website: http://ffnet.sourceforge.net.

Posted by Marek 2012-01-06

citing ffnet

You can now browse the growing list of publications that cite ffnet:


Posted by Marek 2011-12-27

ffnet-0.7 released

This release contains a couple of important changes:
- the neural network can now be trained using the power of multi-processor
systems (see the example mptrain.py);
- attributes needed for the calculation of network derivatives
are now generated only on demand;
- data normalization limits are no longer changed when retraining with a new
data set; net.renormalize = True has to be set first;
- compatibility with the newest versions of numpy, scipy and networkx
is improved;
- support for *export to java* and *drawing network with drawffnet*
is dropped.
However, the basic API is almost untouched. Exactly the same training scripts
as for older versions should work without problems.
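The idea behind multi-processor training is to split the evaluation of the training error across worker processes and sum the partial results. This is only a generic sketch of that pattern, not ffnet's actual implementation; the model (a one-parameter linear fit) and all function names here are illustrative:

```python
from multiprocessing import Pool

import numpy as np

def chunk_sse(args):
    """Sum-of-squares error of a fixed linear model over one data chunk."""
    w, x, t = args
    return float(np.sum((x * w - t) ** 2))

def parallel_sse(w, x, t, nproc=2):
    """Split the data set into nproc chunks and sum the partial errors."""
    xs = np.array_split(x, nproc)
    ts = np.array_split(t, nproc)
    with Pool(nproc) as pool:
        parts = pool.map(chunk_sse, [(w, xc, tc) for xc, tc in zip(xs, ts)])
    return sum(parts)
```

Because the total error is a plain sum over training patterns, the parallel result is identical to the serial one; only the wall-clock time changes on a multicore machine.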

Posted by Marek 2011-08-08

new ffnet release is on the way

ffnet-0.7 will appear soon. It will contain a couple of enhancements, the most exciting of which is parallel training on multicore systems.

Posted by Marek 2011-07-10

ffnet-0.6.2 released!

ffnet version 0.6.2 has been released and is available
for download at:


This release contains minor enhancements and compatibility improvements:
- ffnet now works with networkx-0.99 and later;
- the neural network can now be called with a 2D array of inputs;
it also returns a numpy array instead of a Python list;
- the readdata function is now an alias for numpy.loadtxt;
- docstrings are improved.
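Since readdata is just an alias for numpy.loadtxt, reading a whitespace-separated ASCII training file and slicing it into inputs and targets is a one-liner. A small sketch with made-up XOR data (the in-memory StringIO stands in for a file on disk):

```python
import io

import numpy as np

# Four XOR patterns: two input columns, one target column.
data = np.loadtxt(io.StringIO("0 0 0\n0 1 1\n1 0 1\n1 1 0\n"))

inputs = data[:, :2]   # first two columns: network inputs
targets = data[:, 2:]  # last column: training targets
```

Per the release notes above, a trained network called with the 2D `inputs` array returns a numpy array of outputs, one row per training pattern.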

Posted by Marek 2009-10-27

ffnet 0.6.1 released!

ffnet version 0.6.1 is released! Source packages,
Gentoo ebuilds and Windows binaries are available
for download at:


This is mainly a bugfix release:

- added the 'readdata' function (simplifies reading training data
from ASCII files);
- fixed a bug preventing ffnet from working with scipy-0.6.0;
- importing ffnet no longer requires matplotlib (really);
- corrections in the fortran code generators.

Posted by Marek 2007-10-24

ffnet 0.6 released!

ffnet version 0.6 is now released. Source packages,
Gentoo ebuilds and Windows binaries are available
for download at:


The last public release was 0.5.

If you are unfamiliar with this package, see the end of
this message for a description.

- the trained network can now be exported to fortran source
code and compiled;
- added a new architecture generator (imlgraph);
- added the rprop training algorithm;
- added a draft network plotting facility (based on networkx
and matplotlib).
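Rprop (resilient backpropagation) uses only the sign of each partial derivative: the per-weight step size grows while the sign is stable and shrinks when it flips. A minimal one-parameter sketch of the update rule (not ffnet's code; all names and the example quadratic are illustrative):

```python
import numpy as np

def rprop_minimize(grad, w0, steps=100, step0=0.1,
                   eta_plus=1.2, eta_minus=0.5,
                   step_max=1.0, step_min=1e-6):
    """Minimal Rprop: only the gradient's sign drives the update."""
    w, step, prev_g = w0, step0, 0.0
    for _ in range(steps):
        g = grad(w)
        if g * prev_g > 0:        # same sign as before: accelerate
            step = min(step * eta_plus, step_max)
        elif g * prev_g < 0:      # sign flip: we overshot, back off
            step = max(step * eta_minus, step_min)
        w -= np.sign(g) * step
        prev_g = g
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3):
w = rprop_minimize(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

Because the step size adapts independently of the gradient's magnitude, Rprop is insensitive to error-surface scaling, which is one reason it is popular for training sigmoid networks.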

Posted by Marek 2007-03-23

On-line documentation available!

Documentation of the ffnet modules (automatically generated with epydoc) is now available online. You can browse it by following the link:

Posted by Marek 2007-02-02

Ebuilds for Gentoo Linux users available for download

Ebuilds for ffnet-0.5 and networkx-0.33 have been made available for download.

Posted by Marek 2006-12-13

ffnet release 0.5 files replaced

Release 0.5 files have been replaced with a subversion revision 38 snapshot. A small
change has been applied allowing the code to run with version 0.33 of networkx (the newest one).

Posted by Marek 2006-12-10

ffnet 0.5 released

This is the first release of ffnet.
ffnet is a fast and easy-to-use feed-forward neural network
training solution for Python. With it you can
train/test/save/load and use artificial neural networks
with sigmoid activation functions.

Unique features of ffnet:
1. Any acyclic network connectivity is allowed
(not only layered).
2. Training can be performed with several optimization
schemes, including genetic algorithm based optimization.
3. Exact partial derivatives of the network outputs
with respect to its inputs are accessible.
4. Data normalization is handled automatically by ffnet.
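Feature 3 is possible because a sigmoid network is smooth, so output-vs-input derivatives follow exactly from the chain rule. A tiny pure-numpy illustration (not ffnet's code) with one sigmoid hidden unit feeding one sigmoid output, using the identity s' = s * (1 - s):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def net_output(w, v, x):
    """Tiny layered net: input -> one sigmoid hidden unit -> sigmoid output."""
    h = sigmoid(w * x)
    return sigmoid(v * h)

def net_derivative(w, v, x):
    """Exact d(output)/d(input) by the chain rule."""
    h = sigmoid(w * x)
    y = sigmoid(v * h)
    return y * (1.0 - y) * v * h * (1.0 - h) * w
```

The exact derivative can be cross-checked against a central finite difference of `net_output`; the two agree to numerical precision, which is exactly what "exact partial derivatives" buys over numerical differentiation of a trained net.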

Posted by Marek 2006-12-06
