From: info <in...@ad...> - 2006-12-06 04:40:01
At 08:50 PM 12/5/2006, Joe Wass wrote:

>I've noticed a message or two about cameras. Spotted this:
>http://www.robot-electronics.co.uk/shop/Camerax39s2081.htm
>
>It may be of some use to someone.
>
>ALSO can I take I2C directly off the motherboards (are there easy
>solder pads)?
>
>Cheers
>
>Joe

The camera is available in the USA from Acroname:
http://www.acroname.com/robotics/parts/R245-CMUCAM2-PLUS.html

Price is advertised at US$169.00. Rock.
From: Colin S. <cj...@ab...> - 2006-12-06 11:04:51
>From: "Erik D. Rodriguez" <eri...@ce...>
>Subject: [Gumstix-users] AI/Neural Networks
>
>Has anyone done any work with neural networks on the gumstix? I remember
>seeing that a couple of people have used FANN. I would be interested in
>seeing more information on your projects: specifically, how you
>implemented your NN, what you have as its inputs and outputs, and how
>you handled training sets and such.

I am intending to implement some neural nets on the gumstix, although I
haven't made many decisions about the exact implementation to date. I
don't see how a gumstix implementation of a neural net will differ from
an implementation on any other hardware platform. In my case I will be
using the neural net to control the heading of a robot, so the input
will be a compass and the output a motor. At the moment I'm still at the
hardware implementation stage and haven't had a chance to get any AI
working yet; it will be a few months until this happens.

--
Colin Sauze
PhD Student, Intelligent Robotics Group
Department of Computer Science
University of Wales, Aberystwyth
http://users.aber.ac.uk/cjs06
From: Dan T. <da...@lo...> - 2006-12-06 11:39:56
Since the Gumstix uses soft floating point, it might be a good idea to
look into integer neural networks with simple threshold activation
functions, rather than the usual real-valued NNs. I think the 'stix
might have trouble with sigmoid or tanh activation functions, since
they're very floating point intensive. Of course, using a threshold
activation function rules out training with back propagation, so maybe
you could look at using a GA instead?

I've found that a 2D array of weight values (integer or floating point)
is the quickest and simplest way to represent a network's topology and
weights: the weight connecting N1 to N2 is Weights[N1][N2], and a weight
of zero means there's no link from N1 to N2. This is lightning fast
because you can quickly walk the weights connecting into a node by
looping through Weights[*][N2]. Because it uses a dense 2D array,
however, it doesn't scale well if you want large numbers (100+?) of
neurons. A minimal sketch of this representation is below.

I haven't implemented anything on the Gumstix yet, since I'm snowed
under with PhD work and a full time job, but it's definitely on my list
of things to do in the new year - perhaps with a mobile robot and
camera. I'd love to know how you get on, please keep us informed!
There's more info on my work with NNs and GAs on my website (see link in
my sig).

Cheers,
Dan

--
Dan Taylor BSc MIET
Software Development Engineer
PhD Student, Heriot Watt University, UK
http://www.logicalgenetics.com
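A minimal sketch of the representation described above - integer weights
in a dense 2D array with zero meaning "no link", and a simple step
activation, so no floating point anywhere. All names here (N_NODES,
THRESHOLD and so on) are illustrative, not from any particular library:

    #include <stdio.h>

    #define N_NODES   8
    #define THRESHOLD 0

    /* Dense 2D weight array: Weights[from][to], zero = no connection. */
    static int Weights[N_NODES][N_NODES];
    static int Activation[N_NODES];            /* each node holds -1 or +1 */

    /* Recompute node n2's activation from all incoming links. */
    static void update_node(int n2)
    {
        int sum = 0;
        for (int n1 = 0; n1 < N_NODES; n1++)
            sum += Weights[n1][n2] * Activation[n1]; /* zero weights drop out */
        Activation[n2] = (sum < THRESHOLD) ? -1 : 1; /* step activation */
    }

    int main(void)
    {
        Activation[0] = 1;
        Activation[1] = -1;
        Weights[0][2] = 3;    /* link node 0 -> node 2, weight 3  */
        Weights[1][2] = -2;   /* link node 1 -> node 2, weight -2 */
        update_node(2);
        printf("node 2 -> %d\n", Activation[2]);  /* 3*1 + (-2)*(-1) = 5 -> +1 */
        return 0;
    }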
From: Craig H. <cr...@gu...> - 2006-12-07 16:41:07
On Dec 6, 2006, at 3:33 AM, Dan Taylor wrote:

> I think the 'stix might have trouble with sigmoid or tanh activation
> functions, since they're very floating point intensive.

They're only FP-intensive if you don't implement them as a lookup
table ;)

C
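The lookup-table idea in a minimal form: precompute sigmoid(x) over a
fixed input range once at startup, then index into the table at run
time. The table size and range here are illustrative choices; on a
soft-float target the entries would more likely be fixed-point integers,
but doubles keep the sketch short:

    #include <math.h>
    #include <stdio.h>

    #define LUT_SIZE 256
    #define LUT_MIN  (-8.0)
    #define LUT_MAX  (8.0)

    static double sigmoid_lut[LUT_SIZE];

    /* Fill the table once; the expensive exp() stays off the hot path. */
    static void init_lut(void)
    {
        for (int i = 0; i < LUT_SIZE; i++) {
            double x = LUT_MIN + (LUT_MAX - LUT_MIN) * i / (LUT_SIZE - 1);
            sigmoid_lut[i] = 1.0 / (1.0 + exp(-x));
        }
    }

    static double sigmoid_approx(double x)
    {
        if (x <= LUT_MIN) return 0.0;   /* saturated below the table */
        if (x >= LUT_MAX) return 1.0;   /* saturated above it        */
        int i = (int)((x - LUT_MIN) / (LUT_MAX - LUT_MIN) * (LUT_SIZE - 1));
        return sigmoid_lut[i];
    }

    int main(void)
    {
        init_lut();
        printf("sigmoid(0.5) ~ %f\n", sigmoid_approx(0.5));
        return 0;
    }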
From: Chris B. <cb...@qu...> - 2006-12-07 21:23:29
I've been working on a library that includes a neural net. It's
currently being developed under Windows (for unit testing), but it was
previously developed under Windows/Linux and will eventually be ported
back (it probably works there now, I just haven't had time to test it).
It uses floating point, though, so it may not be useful to you. It's
available at http://code.google.com/p/crylib/

As an example of use, part of the unit test for it trains XOR; the code
is below.

    CryBPNetContainer *bp = new CryBPNetContainer();
    bp->SetAlpha(0.9);
    bp->SetEta(0.25);
    bp->SetGain(1);
    bp->RandomWeights();

    double _InData[4][2] = { {0,0},{0,1},{1,0},{1,1} };  // 2-bit inputs
    double *InData = &_InData[0][0];
    double OutData[4];
    double TargetData[4] = {0,1,1,0};    // xor
    //double TargetData[4] = {0,0,0,1};  // and
    //double TargetData[4] = {0,1,1,1};  // or
    //double TargetData[4] = {0,0,0,0};

    if ((Verbose) &&
        (!CallBack(Verbose,"\nInitial Training (will take some time)\n",Fail)))
        return false;

    // Start the training
    // STTrainNet(int Epochs, int LengthIn, double *SampleIn,
    //            int LengthOut, double *SampleOut)
    bp->STTrainNet(5000,4,InData,4,TargetData);
    bp->printWeights();

    for (int i = 0; i < 4; i++)
    {
        // set OutData to something other than the result
        OutData[i] = 4500;
        // ask the network what OutData should be, based on the InData
        bp->SimulateNet(&InData[i*2],&OutData[i],&TargetData[i],0);
        // round to the nearest int (will be either 1 or 0)
        int v = OutData[i] + 0.5;
        // check for failure
        Fail = v != TargetData[i];
        sprintf(Result,"In %f,%f Expected %f Out %f\n",
                _InData[i][0],_InData[i][1],TargetData[i],OutData[i]);
        if (!CallBack(Verbose,Result,Fail)) return false;
    }

Craig Hughes wrote:
> On Dec 6, 2006, at 3:33 AM, Dan Taylor wrote:
>> I think the 'stix might have trouble with sigmoid or tanh activation
>> functions, since they're very floating point intensive.
>
> They're only FP-intensive if you don't implement them as a lookup
> table ;)
From: Benjamin B. <ben...@wi...> - 2006-12-08 09:03:39
If a table doesn't provide the accuracy necessary, are there any rapidly
converging expansions for these functions? The only expansion along
these lines that I've used is the Taylor expansion for atan, so forgive
me if I'm assuming too much... Either way, a thrifty programmer should
be able to get around FP limitations, and if not, the worst-case
scenario would be adding an FPU daughterboard (IEEE 754?). If you're not
up to it, but this is something you're willing to sink some resources
into, find someone well versed in low-level optimizations, or someone
who can implement the daughterboard.

-Ben

Craig Hughes wrote:
> They're only FP-intensive if you don't implement them as a lookup
> table ;)
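For what it's worth, one rapidly converging form does exist for tanh:
its continued-fraction expansion, tanh(x) = x / (1 + x^2/(3 + x^2/(5 +
...))). Truncating after the "5" term gives the small rational
approximation sketched below (my own illustrative code, not from this
thread); it agrees with tanh to roughly three decimal places for |x| up
to about 1.5 and degrades gently beyond that. The same rational form
also works in fixed point, since it needs only multiplies, adds, and
one divide:

    #include <math.h>
    #include <stdio.h>

    /* Truncated continued fraction for tanh:
       x / (1 + x^2/(3 + x^2/5)) = x*(15 + x^2) / (15 + 6*x^2) */
    static double tanh_cf(double x)
    {
        double x2 = x * x;
        return x * (15.0 + x2) / (15.0 + 6.0 * x2);
    }

    int main(void)
    {
        for (double x = -2.0; x <= 2.0; x += 0.5)
            printf("x=% .1f  tanh=% .5f  approx=% .5f\n",
                   x, tanh(x), tanh_cf(x));
        return 0;
    }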
From: Nate W <del...@gm...> - 2006-12-09 03:26:50
It's my understanding that the activation function need not be precise -
many activation functions, even a simple step function (input < 0
yields -1, else +1), will work for some problems. Given the 'fuzzy'
nature of NNs, it won't really hurt to use approximations. I'd be more
concerned about the CPU cost of the matrix multiplications that
propagate activation from one layer to the next - but that too might be
sped up using approximations without seriously impacting the network's
functionality (e.g. use rough log tables and then do addition rather
than multiplication - see the sketch below). I don't know if there's
been research on this kind of thing, but again, since NNs are basically
heuristic rather than algorithmic, it seems plausible enough that you
could do such optimizations and still get good results.

On 12/7/06, Benjamin Burns <ben...@wi...> wrote:
> If a table doesn't provide the accuracy necessary, are there any
> rapidly converging expansions for these functions?

--
Nate Waddoups
Redmond WA USA
http://www.natew.com/ <== for nerds
http://www.featherforum.com/ <== for birds
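A toy sketch of the log-table trick mentioned above: multiply magnitudes
by adding their tabulated log2 values and looking the sum back up in an
exp table, handling sign separately (the log of a negative number is
undefined). The table sizes and the 8-bit fractional scaling are
illustrative guesses, not taken from any published NN implementation.
Building the tables still uses floating point, but only once at startup;
the per-connection work is all integer:

    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define MAX_VAL  64                        /* magnitudes in [1, MAX_VAL) */
    #define SCALE    8                         /* fractional bits in log domain */
    #define EXP_SIZE (2 * 12 * (1 << SCALE))   /* covers 2*log2(63), with slack */

    static int log_tbl[MAX_VAL];   /* round(log2(i) * 2^SCALE) */
    static int exp_tbl[EXP_SIZE];  /* round(2^(s / 2^SCALE))   */

    static void init_tables(void)
    {
        for (int i = 1; i < MAX_VAL; i++)
            log_tbl[i] = (int)(log2((double)i) * (1 << SCALE) + 0.5);
        for (int s = 0; s < EXP_SIZE; s++)
            exp_tbl[s] = (int)(pow(2.0, (double)s / (1 << SCALE)) + 0.5);
    }

    /* Approximate a*b with two lookups and one integer addition. */
    static int log_mul(int a, int b)
    {
        if (a == 0 || b == 0)
            return 0;
        int sign = ((a < 0) ^ (b < 0)) ? -1 : 1;
        return sign * exp_tbl[log_tbl[abs(a)] + log_tbl[abs(b)]];
    }

    int main(void)
    {
        init_tables();
        printf("7 * -6 = %d, approx %d\n", 7 * -6, log_mul(7, -6));
        printf("13 * 5 = %d, approx %d\n", 13 * 5, log_mul(13, 5));
        return 0;
    }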
From: Nate W <del...@gm...> - 2006-12-09 03:54:54
I just found a fixed-point NN library on sourceforge, with a paper
describing how it works:

http://fann.sourceforge.net/report/report.html

See especially:

Section 3.2 - the author assumes that the NN will be trained on a PC, so
only the execution (not the training) needs to be done with fixed-point
arithmetic. This allows some optimizations around integer overflow
checking.

Section 3.3.8 - reproduced here in its entirety: "As seen in figure 3,
the activation functions are often very close to zero for small values
and close to one for large values. This leaves a relatively small span
where the output is not zero or one. This span can be represented as a
lookup table, with a reasonable resolution. It is hard to tell whether
this lookup table will be faster than actually calculating the
activation function, but it is worth a try."

Section 6.2.3 - benchmark summary: it works pretty well.

Anyway, if you want the NN on your Gumstix to train on the Gumstix
itself, the fixed-point trick above won't apply, but at least there's
reason to believe that the activation function can be fudged without too
much penalty. And in my admittedly non-expert opinion, I think this
suggests that you could approximate the feed-forward multiplications as
well.

On 12/8/06, Nate W <del...@gm...> wrote:
> It's my understanding that the activation function need not be
> precise... I'd be more concerned about the CPU cost of the matrix
> multiplications that propagate activation from one layer to the next.

--
Nate Waddoups
Redmond WA USA
http://www.natew.com/ <== for nerds
http://www.featherforum.com/ <== for birds
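The train-on-the-PC, run-in-fixed-point split the report describes comes
down to something like this sketch: scale the float weights learned
offline to a Q-format integer, then do only integer multiply-adds on the
target. The Q16.16 format and all the names here are my own illustrative
choices, not FANN's actual API:

    #include <stdint.h>
    #include <stdio.h>

    #define FRAC_BITS 16   /* Q16.16 fixed point */

    /* Offline conversion of a PC-trained float weight to fixed point. */
    static int32_t to_fixed(double x)
    {
        return (int32_t)(x * (1 << FRAC_BITS));
    }

    /* Fixed-point dot product for one neuron's input sum. */
    static int32_t dot_fixed(const int32_t *w, const int32_t *a, int n)
    {
        int64_t acc = 0;                     /* wide accumulator */
        for (int i = 0; i < n; i++)
            acc += (int64_t)w[i] * a[i];     /* Q16.16 * Q16.16 = Q32.32 */
        return (int32_t)(acc >> FRAC_BITS);  /* back down to Q16.16 */
    }

    int main(void)
    {
        double wf[3] = {0.5, -1.25, 2.0};    /* weights trained on the PC */
        int32_t w[3], a[3];
        for (int i = 0; i < 3; i++) w[i] = to_fixed(wf[i]);
        a[0] = to_fixed(1.0); a[1] = to_fixed(0.5); a[2] = to_fixed(-0.25);
        printf("sum = %f\n", dot_fixed(w, a, 3) / (double)(1 << FRAC_BITS));
        return 0;
    }

The 64-bit accumulator is the important detail: each Q16.16 product
carries 32 fractional bits before the shift back down, and summing many
of them would quickly overflow 32 bits - the kind of overflow
bookkeeping section 3.2 of the report alludes to.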