I've been working on a library that includes a neural net. It's currently being developed under Windows (for unit testing) but was previously developed under Windows/Linux and will eventually be ported back. (It probably works there now; I just haven't had time to test it.)

Of course it uses floating point, so it may not be useful to you.

It's available at http://code.google.com/p/crylib/

As an example of use, part of the unit test for it trains XOR; the code is below.

    CryBPNetContainer *bp = new CryBPNetContainer();

    bp->SetAlpha(0.9);
    bp->SetEta(0.25);
    bp->SetGain(1);
    bp->RandomWeights();

    double _InData[4][2] = { {0,0},{0,1},{1,0},{1,1} };    // 2-bit inputs
    double *InData = &_InData[0][0];
    double OutData[4];
    double TargetData[4] = {0,1,1,0};    // xor
    //double TargetData[4] = {0,0,0,1};    // and
    //double TargetData[4] = {0,1,1,1};    // or
    //double TargetData[4] = {0,0,0,0};
    if ((Verbose) && (!CallBack(Verbose,"\nInitial Training (will take some time)\n",Fail)))
        return false;

    // Start the training
    // STTrainNet(int Epochs,int LengthIn,double *SampleIn,int LengthOut,double *SampleOut)
    bp->STTrainNet(5000,4,InData,4,TargetData);

    bp->printWeights();
    for(int i=0;i<4;i++)
    {
        // set OutData to something other than the result
        OutData[i] = 4500;

        // ask the network what OutData should be, based on the InData
        bp->SimulateNet(&InData[i*2],&OutData[i],&TargetData[i],0);

        // clean up the result by rounding to the nearest int (will be either 0 or 1)
        int v = OutData[i]+0.5;

        // check for failure
        Fail = v!=TargetData[i];
        sprintf(Result,"In %f,%f Expected %f Out %f\n",_InData[i][0],_InData[i][1],TargetData[i],OutData[i]);
        if (!CallBack(Verbose,Result,Fail))
            return false;
    }
 

Craig Hughes wrote:
On Dec 6, 2006, at 3:33 AM, Dan Taylor wrote:

  
I think the 'stix might have trouble with sigmoid or tanh activation
functions, since they're very floating point intensive.

They're only FP-intensive if you don't implement them as a lookup table ;)

C

_______________________________________________
gumstix-users mailing list
gumstix-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/gumstix-users