I just found a fixed-point NN library on sourceforge, with a paper describing how it works:

http://fann.sourceforge.net/report/report.html
See especially:
Section 3.2 - the author assumes that the NN will be trained on a PC, so only the execution (not the training) needs to be done with fixed-point arithmetic.  Since the weights are known before execution, this allows some optimizations around integer overflow checking (the position of the decimal point can be chosen up front so that overflow can't occur at run time).
Section 3.3.8 - reproduced here in its entirety: "As seen in figure 3, the activation functions are often very close to zero for small values and close to one for large values. This leaves a relatively small span where the output is not zero or one. This span can be represented as a lookup table, with a reasonable resolution. It is hard to tell whether this lookup table will be faster than actually calculating the activation function, but it is worth a try."  (A rough sketch of the lookup-table idea follows this list.)
Section 6.2.3 - benchmark summary: it works pretty well.
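
To make 3.3.8 concrete, here's a minimal sketch of the lookup-table approach.  The span, table size, and float math are my own choices for illustration - the report leaves them open, and on a Gumstix you'd probably want a fixed-point variant:

#include <math.h>

#define LUT_SIZE 1024      /* table resolution - chosen arbitrarily here */
#define LUT_MIN  -8.0f     /* sigmoid(-8) ~ 0.0003, close enough to zero */
#define LUT_MAX   8.0f     /* sigmoid(+8) ~ 0.9997, close enough to one  */

static float sigmoid_lut[LUT_SIZE];

/* Fill the table once at startup. */
void sigmoid_lut_init(void)
{
    for (int i = 0; i < LUT_SIZE; i++) {
        float x = LUT_MIN + (LUT_MAX - LUT_MIN) * i / (LUT_SIZE - 1);
        sigmoid_lut[i] = 1.0f / (1.0f + expf(-x));
    }
}

/* Clamp to 0 or 1 outside the span, index the table inside it. */
float sigmoid_approx(float x)
{
    if (x <= LUT_MIN) return 0.0f;
    if (x >= LUT_MAX) return 1.0f;
    int i = (int)((x - LUT_MIN) * (LUT_SIZE - 1) / (LUT_MAX - LUT_MIN));
    return sigmoid_lut[i];
}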

Anyway, if you want the NN on your Gumstix to train on the Gumstix itself, the fixed-point trick above won't apply directly, but at least there's reason to believe that the activation function can be fudged without much penalty.  And in my admittedly non-expert opinion, that suggests you could approximate the feed-forward multiplications as well.
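
To illustrate, the feed-forward inner loop in fixed point might look something like the sketch below.  The Q16.16 format is my own choice; the FANN report instead positions the decimal point per-network from the trained weights, which is what lets it skip run-time overflow checks.

#include <stdint.h>

/* Q16.16 fixed point: value = raw / 65536 (format chosen for illustration). */
typedef int32_t fix16;
#define FIX_SHIFT 16

/* The basic operation: widen to 64 bits so the raw product can't
 * overflow before the shift brings it back into range. */
static inline fix16 fix_mul(fix16 a, fix16 b)
{
    return (fix16)(((int64_t)a * b) >> FIX_SHIFT);
}

/* One neuron's weighted sum - the feed-forward inner loop.  The wide
 * accumulator here stands in for the overflow guarantees the report
 * derives from knowing the trained weights. */
fix16 neuron_sum(const fix16 *weights, const fix16 *inputs, int n)
{
    int64_t acc = 0;
    for (int i = 0; i < n; i++)
        acc += (int64_t)weights[i] * inputs[i];
    return (fix16)(acc >> FIX_SHIFT);
}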

On 12/8/06, Nate W <delaminator@gmail.com> wrote:
It's my understanding that the activation function need not be precise - many different activation functions, even a simple step function (input < 0 yields -1, else +1), will often work.  Given the 'fuzzy' nature of NNs, using approximations won't really hurt.  I'd be more concerned about the CPU cost of the matrix multiplications that propagate activation from one layer to the next - but that too might be sped up using approximations without seriously impacting the network's functionality (e.g. use rough log tables and then do addition rather than multiplication).  I don't know whether there's been research on this kind of thing, but again, since NNs are basically heuristic rather than algorithmic, it seems plausible that you could make such optimizations and still get good results.
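
To make the log-table idea above concrete, here's a minimal sketch of the arithmetic identity it relies on.  In a real implementation log2f and exp2f would themselves be coarse lookup tables and the weight logs would be precomputed offline; the names and struct here are just for illustration.

#include <math.h>

/* Precomputed once per weight, e.g. on the PC after training. */
typedef struct {
    float log2_mag;   /* log2f(fabsf(w)) */
    int   sign;       /* +1 or -1 */
} log_weight;

/* w * x == sign * 2^(log2|w| + log2|x|): one add replaces the multiply.
 * Here log2f/exp2f stand in for the "rough log tables". */
float log_mul(log_weight w, float x)
{
    if (x == 0.0f)
        return 0.0f;
    float sign = (x < 0.0f) ? -w.sign : w.sign;
    return sign * exp2f(w.log2_mag + log2f(fabsf(x)));
}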


On 12/7/06, Benjamin Burns <benjamin.burns@widgetry.org> wrote:
If a table doesn't provide the accuracy necessary, are there any rapidly
converging expansions for these functions?  The only expansion along
these lines that I've used is the Taylor expansion for atan, so forgive
me if I'm assuming too much...  Either way, a thrifty programmer should
be able to get around FP limitations, and if not, the worst-case
scenario would be adding an FPU daughterboard (IEEE 754?).  If you're
not up to it, but this is something you're willing to sink some
resources into, find someone well versed in low-level optimizations, or
someone who can implement the daughterboard.

-Ben
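
One candidate for the rapidly converging expansion Ben asks about: the Pade [3/2] approximant of tanh, x*(15 + x^2) / (15 + 6*x^2), which agrees with the Taylor series through the x^5 term but behaves much better away from zero.  A minimal sketch - my own, not from the thread - with a clamp added because the approximant overshoots +/-1 once |x| passes roughly 2.3:

/* Pade [3/2] approximant of tanh(x): x*(15 + x^2) / (15 + 6*x^2). */
float tanh_pade(float x)
{
    float x2 = x * x;
    float y = x * (15.0f + x2) / (15.0f + 6.0f * x2);
    /* The approximant overshoots +/-1 for |x| beyond ~2.3, so clamp. */
    if (y >  1.0f) return  1.0f;
    if (y < -1.0f) return -1.0f;
    return y;
}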

Craig Hughes wrote:
> On Dec 6, 2006, at 3:33 AM, Dan Taylor wrote:
>
>
>> I think the 'stix
>> might have trouble with sigmoid or tanh activation functions, since
>> they're very floating point intensive.
>>
>
> They're only FP-intensive if you don't implement them as a lookup
> table ;)
>
> C
>


--
Nate Waddoups
Redmond WA USA
http://www.natew.com/   <== for nerds
http://www.featherforum.com/    <== for birds