This release contains mainly documentation improvements and changes in the examples. Also take a look at the new Sphinx-based website: http://ffnet.sourceforge.net.
You can now browse the growing list of publications which cite ffnet:
This release contains a couple of important changes:
- the neural network can now be trained using the power of multi-processor
systems (see the example mptrain.py);
- attributes which are necessary for the calculation of network derivatives
are now generated only on demand;
- data normalization limits are no longer changed when retraining with a new
data set; net.renormalize = True has to be set first;
- compatibility with the newest versions of numpy, scipy and networkx;
- support for *export to java* and *drawing network with drawffnet*.
However, the basic API is almost untouched. Exactly the same training scripts
as for older versions should work without problems.
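The idea behind multi-processor training can be sketched with Python's standard multiprocessing module: split the training set into chunks and sum the error contributions in parallel. This is an illustration of the concept only, not ffnet's code; see the actual mptrain.py example for the real API.

```python
from multiprocessing import Pool

import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def chunk_error(args):
    """Sum-of-squares error of a one-layer sigmoid net on one data chunk."""
    weights, inputs, targets = args
    outputs = sigmoid(inputs @ weights)
    return float(np.sum((outputs - targets) ** 2))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    weights = rng.normal(size=(3, 1))
    inputs = rng.normal(size=(200, 3))
    targets = rng.random((200, 1))

    # Split the training set into chunks and evaluate the error in parallel;
    # an optimizer would call this repeatedly while updating the weights.
    chunks = [(weights, i, t) for i, t in
              zip(np.array_split(inputs, 2), np.array_split(targets, 2))]
    with Pool(2) as pool:
        total = sum(pool.map(chunk_error, chunks))

    # The parallel result matches the serial one.
    serial = chunk_error((weights, inputs, targets))
    print(abs(total - serial) < 1e-9)
```

Because the error is a sum over training patterns, the chunked evaluations are independent and the speedup is close to linear in the number of processes for large data sets.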
ffnet-0.7 will appear soon. It will contain a couple of enhancements, among which the most exciting is parallel training on multicore systems.
ffnet version 0.6.2 is released and is available
for download at:
This release contains minor enhancements and compatibility improvements:
- ffnet now works with networkx-0.99 and later;
- the neural network can now be called with a 2D array of inputs;
it also returns a numpy array instead of a Python list;
- the readdata function is now an alias for numpy.loadtxt;
- docstrings are improved.
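Since readdata is simply numpy.loadtxt, reading columns of ASCII training data reduces to a one-liner; the XOR data below is a made-up illustration:

```python
import io

import numpy as np

# readdata is an alias for numpy.loadtxt, so training data stored as plain
# ASCII columns can be read directly into a 2D numpy array.
data = io.StringIO("0 0 0\n0 1 1\n1 0 1\n1 1 0\n")  # XOR patterns
table = np.loadtxt(data)

inputs = table[:, :2]   # first two columns: network inputs
targets = table[:, 2]   # last column: target outputs
print(inputs.shape, targets.shape)  # (4, 2) (4,)
```

The resulting 2D inputs array matches the new calling convention: the whole pattern set can be passed to the network in one call.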
ffnet version 0.6.1 is released! Source packages,
Gentoo ebuilds and Windows binaries are available
for download at:
This is mainly a bugfix release.
- added the 'readdata' function (simplifies reading training data
from ASCII files)
CHANGES & BUG FIXES
- fixed a bug preventing ffnet from working with scipy-0.6.0;
- importing ffnet no longer requires matplotlib (really);
- corrections in the Fortran code generators.
ffnet version 0.6 is now released. Source packages,
Gentoo ebuilds and Windows binaries are now available
for download at:
The last public release was 0.5.
If you are unfamiliar with this package, see the end of
this message for a description.
- a trained network can now be exported to Fortran source
code and compiled;
- added a new architecture generator (imlgraph);
- added the rprop training algorithm;
- added a draft network plotting facility (based on networkx
and matplotlib).
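The core rprop idea — grow a per-weight step size while the gradient sign stays stable, shrink it when the sign flips, and ignore the gradient magnitude — can be sketched in plain numpy. This is a minimal illustration of the algorithm, not ffnet's implementation:

```python
import numpy as np


def rprop_minimize(grad, w, n_iter=50, eta_plus=1.2, eta_minus=0.5,
                   step_min=1e-6, step_max=1.0):
    """Minimal rprop loop: each parameter keeps its own step size,
    grown while its gradient sign is stable and shrunk on a sign flip."""
    w = np.asarray(w, dtype=float).copy()
    steps = np.full_like(w, 0.1)
    prev_g = np.zeros_like(w)
    for _ in range(n_iter):
        g = grad(w)
        same = np.sign(g) * np.sign(prev_g)
        steps = np.where(same > 0, np.minimum(steps * eta_plus, step_max), steps)
        steps = np.where(same < 0, np.maximum(steps * eta_minus, step_min), steps)
        w -= np.sign(g) * steps          # move by the step, not the gradient
        prev_g = g
    return w


# Minimize f(w) = sum((w - 3)^2); its gradient is 2*(w - 3).
w_opt = rprop_minimize(lambda w: 2 * (w - 3.0), np.zeros(2))
print(np.allclose(w_opt, 3.0, atol=0.05))  # prints True
```

Because only the gradient sign is used, rprop is insensitive to the badly scaled gradients that plain backpropagation struggles with, which is why it is a popular batch-training choice.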
Documentation of the ffnet modules (automatically generated with epydoc) is now available online. You may browse it by following the link:
Ebuilds for ffnet-0.5 and networkx-0.33 have been added to the download area.
Release 0.5 files have been replaced with a subversion revision 38 snapshot. A small
change has been applied allowing the code to run with version 0.33 of networkx (the newest one).
This is the first release of ffnet.
ffnet is a fast and easy-to-use feed-forward neural network
training solution for Python. Using it you are able to
train/test/save/load and use artificial neural networks
with sigmoid activation functions.
Unique features present in ffnet:
1. Any network connectivity without cycles is allowed
(not only layered).
2. Training can be performed with several optimization
schemes, including genetic-algorithm-based optimization.
3. There is access to exact partial derivatives of the network outputs
with respect to its inputs.
4. Normalization of data is handled automatically by ffnet.
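Point 3 can be illustrated by hand on a single sigmoid unit: the exact Jacobian from the chain rule agrees with a finite-difference check. This is a sketch of the idea only; ffnet itself provides derivative access for whole networks.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def forward(w, x):
    """Single sigmoid unit: y = sigmoid(w . x)."""
    return sigmoid(np.dot(w, x))


def exact_jacobian(w, x):
    """dy/dx via the chain rule: sigmoid'(s) * w, where s = w . x
    and sigmoid'(s) = y * (1 - y)."""
    y = forward(w, x)
    return y * (1.0 - y) * w


w = np.array([0.5, -1.0])
x = np.array([1.0, 2.0])

# Central finite differences approximate the same partial derivatives.
eps = 1e-6
fd = np.array([(forward(w, x + eps * np.eye(2)[i]) -
                forward(w, x - eps * np.eye(2)[i])) / (2 * eps)
               for i in range(2)])
print(np.allclose(exact_jacobian(w, x), fd, atol=1e-8))  # prints True
```

Exact derivatives of outputs with respect to inputs are useful for sensitivity analysis of a trained network, without the noise and cost of numerical differentiation.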