ffnet-0.6.2 released! (Tue, 27 Oct 2009)

<div class="markdown_content"><p>ffnet version 0.6.2 is released and is available<br />
for download at:</p>
<p><a href="http://ffnet.sourceforge.net">http://ffnet.sourceforge.net</a></p>
<p>This release contains minor enhancements and compatibility improvements:<br />
- ffnet now works with networkx-0.99 and later;<br />
- the neural network can now be called with a 2D array of inputs,<br />
and it returns a numpy array instead of a python list;<br />
- the readdata function is now an alias for numpy.loadtxt;<br />
- docstrings have been improved.</p>
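<p>Since readdata is now just an alias for numpy.loadtxt, any loadtxt-compatible text source can be used to read training data. A minimal sketch (using an in-memory file in place of a real data file; the column split into inputs and targets is illustrative):</p>

```python
import io
import numpy as np

# readdata in ffnet >= 0.6.2 is an alias for numpy.loadtxt,
# so the same whitespace-separated text format applies.
data = io.StringIO("0 0 1\n0 1 0\n1 0 0\n1 1 1\n")
patterns = np.loadtxt(data)                    # shape (4, 3)

# Split columns into network inputs and targets.
inputs, targets = patterns[:, :2], patterns[:, 2:]
```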
<p>What is ffnet?<br />
--------------<br />
ffnet is a fast and easy-to-use feed-forward neural<br />
network training solution for python.</p>
<p>Unique features<br />
---------------<br />
1. Any network connectivity without cycles is allowed.<br />
2. Training can be performed using several optimization<br />
schemes, including standard backpropagation with momentum, rprop,<br />
conjugate gradient, bfgs, tnc, and genetic-algorithm-based optimization.<br />
3. Exact partial derivatives of the network outputs<br />
with respect to its inputs are available.<br />
4. Automatic normalization of data.</p>
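<p>Exact output derivatives (feature 3) can be illustrated on a single sigmoid unit, where the analytic derivative matches a finite-difference estimate. This is a pure-numpy sketch of the idea, not ffnet's actual internals:</p>

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One sigmoid unit with two inputs: y = sigmoid(w . x + b)
# (weights and inputs below are arbitrary illustrative values)
w, b = np.array([0.5, -1.2]), 0.3
x = np.array([1.0, 2.0])

y = sigmoid(w @ x + b)
# Exact partial derivatives dy/dx_i via the chain rule:
# sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))
exact = y * (1.0 - y) * w

# Forward finite-difference check of the same derivatives
eps = 1e-6
numeric = np.array([
    (sigmoid(w @ (x + eps * np.eye(2)[i]) + b) - y) / eps
    for i in range(2)
])
```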
<p>Basic assumptions and limitations:<br />
----------------------------------<br />
1. Network has feed-forward architecture.<br />
2. Input units have an identity activation function;<br />
all other units have a sigmoid activation function.<br />
3. Provided data, both inputs and outputs, are automatically normalized<br />
with a linear mapping to the range (0.15, 0.85).<br />
Each input and output is treated separately (i.e. the linear map is<br />
unique for each input and output).<br />
4. Function minimized during training is a sum of squared errors <br />
of each output for each training pattern.</p>
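<p>The per-column linear normalization of point 3 and the sum-of-squared-errors cost of point 4 can be sketched as follows (an illustration of the stated mapping in plain numpy, not ffnet's own code):</p>

```python
import numpy as np

def normalize(data, lo=0.15, hi=0.85):
    """Map each column linearly onto [lo, hi]; one map per column,
    as described for both inputs and outputs."""
    data = np.asarray(data, dtype=float)
    dmin, dmax = data.min(axis=0), data.max(axis=0)
    return lo + (hi - lo) * (data - dmin) / (dmax - dmin)

def sse(outputs, targets):
    """Sum of squared errors over all outputs and training patterns."""
    diff = np.asarray(outputs) - np.asarray(targets)
    return float((diff ** 2).sum())

# Two input columns on very different scales
X = np.array([[0.0, 10.0], [1.0, 20.0], [2.0, 30.0]])
Xn = normalize(X)   # each column now spans exactly [0.15, 0.85]
```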
<p>Performance<br />
-----------<br />
Excellent computational performance is achieved by implementing core<br />
functions in fortran 77 and wrapping them with f2py. ffnet outperforms<br />
pure python training packages and is competitive with<br />
'compiled language' software. Moreover, a trained network can be<br />
exported to fortran sources, compiled and called in many<br />
programming languages.</p>
<p>Usage<br />
-----<br />
Basic usage of the package is outlined below:</p>
<p>>>> from ffnet import ffnet, mlgraph, savenet, loadnet, exportnet<br />
>>> conec = mlgraph( (2,2,1) )<br />
>>> net = ffnet(conec)<br />
>>> input = [ [0.,0.], [0.,1.], [1.,0.], [1.,1.] ]<br />
>>> target = [ [1.], [0.], [0.], [1.] ]<br />
>>> net.train_tnc(input, target, maxfun = 1000)<br />
>>> net.test(input, target, iprint = 2)<br />
>>> savenet(net, "xor.net")<br />
>>> exportnet(net, "xor.f")<br />
>>> net = loadnet("xor.net")<br />
>>> answer = net( [ 0., 0. ] )<br />
>>> partial_derivatives = net.derivative( [ 0., 0. ] )</p>
<p>Usage examples with full descriptions can be found in the<br />
examples directory of the source distribution or browsed<br />
at <a href="http://ffnet.sourceforge.net">http://ffnet.sourceforge.net</a>.</p></div>

Marek, Tue, 27 Oct 2009