bayesclasses-general Mailing List for Bayes+Estimate
| 2004 | Jan (2) | Feb (2) | Mar | Apr | May (1) | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2005 | Jan (2) | Feb (4) | Mar | Apr | May | Jun | Jul | Aug | Sep (14) | Oct (6) | Nov | Dec |
| 2006 | Jan (1) | Feb | Mar (1) | Apr (2) | May (2) | Jun | Jul (2) | Aug (1) | Sep | Oct | Nov (15) | Dec (5) |
| 2007 | Jan (10) | Feb (4) | Mar (1) | Apr | May (8) | Jun | Jul | Aug | Sep (11) | Oct (2) | Nov (1) | Dec |
| 2008 | Jan | Feb (5) | Mar | Apr | May | Jun | Jul | Aug (2) | Sep | Oct (2) | Nov (1) | Dec |
| 2009 | Jan | Feb | Mar | Apr | May | Jun | Jul (3) | Aug | Sep | Oct | Nov | Dec |
| 2010 | Jan | Feb (1) | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
| 2011 | Jan | Feb | Mar | Apr | May | Jun | Jul (1) | Aug | Sep | Oct | Nov | Dec |
| 2012 | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep (2) | Oct | Nov | Dec |
| 2014 | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep (1) | Oct | Nov | Dec |
| 2015 | Jan | Feb | Mar (1) | Apr | May | Jun | Jul (1) | Aug | Sep (1) | Oct | Nov | Dec |
| 2017 | Jan | Feb | Mar | Apr | May (1) | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
| 2021 | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep (1) | Oct | Nov (1) | Dec |
| 2022 | Jan | Feb | Mar (1) | Apr | May | Jun | Jul | Aug | Sep | Oct (1) | Nov | Dec |
From: Matt H. <mwh...@gm...> - 2022-10-14 19:20:11
|
Bayesclasses https://www.google.com/search?q=bay...@li... Matt Matt Hazard |
From: Matt H. <mwh...@gm...> - 2022-03-30 02:27:56
|
Bayesclasses https://bit.ly/3DjF5Aj |
From: Matt H. <mwh...@gm...> - 2021-11-14 00:25:16
|
Bayesclasses https://bitly.com/3FbiXaX Matt |
From: Matt H. <mwh...@gm...> - 2021-09-08 07:41:09
|
Bayesclasses https://bitly.com/3kNDQ3A |
From: Julius Z. <zi...@at...> - 2017-05-18 13:36:03
|
Since this mailing list is not very busy, I thought I'd drop some words of appreciation. I first used your library in the early 2000s (it was still MTL based then), on my first job as a student assistant. Now, 15 years and 5 jobs later, I need to implement a GPS/IMU filter; I stumbled over it again, and even saw that it got a new release in 2017. In a flash of nostalgia, I thought I'd drop some kind words of appreciation. I have used your library on at least these projects: Microrobots (early 2000s): https://rob.ipr.kit.edu/english/297.php Motion tracking (my master thesis, 2006): https://cvhci.anthropomatik.kit.edu/%7Estiefel/papers/Ziegler_CVPR06.pdf An autonomous car (Bayes++ was used in the localisation system), 2011: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.640.1737 https://www.youtube.com/watch?v=GfXg9ux4xUw Your library was also the first example of a heavily templated C++ library that I used, and I think it taught me a lot about C++ and shaped the way I code. So thank you for this unique library, and may many releases follow :-) -- Julius |
From: Matt H. <mwh...@gm...> - 2015-07-03 15:57:49
|
Hey bayesclasses http://certanurse.com/understanding.php?cover=hus8pu4s96pdxtn Matt Hazard Sent from my iPhone |
From: Matt H. <mwh...@gm...> - 2015-03-23 20:00:42
|
Hi bayesclasses http://harvestcharlotte.com/winter.php?happen=q6bfef8tccrph4ft9 mwh...@gm... Sent from my iPhone |
From: Matt H. <mwh...@gm...> - 2014-09-30 19:43:35
|
Hello bayesclasses http://racingcartoons.com/feel.php?cool=nbcyxmes3178mckr mwh...@gm... |
From: Michael S. <ma...@mi...> - 2012-09-29 19:56:10
|
Hi Hanno, On 09/28/2012 12:02 PM, Hanno Stock wrote: > Hi, > > I have a question regarding contributions. We have a few fixes and > additions and would like to provide them upstream for consideration. Thanks. I would be happy to look at them. > > What is the preferred way? Just patches against the current release? > > The Subversion repo seems out of date and the git repo also is not > accessible. (We use both systems here...) I just checked. The Git browse on SourceForge is giving 404s, which is most annoying. I have now disabled the SVN feature in the SourceForge project to avoid confusion. I generally develop everything under Git now. I think a diff against master would be the way to go. I was thinking of moving to GitHub, which has much better collaborative coding support than SF. > > Btw. thanks for open-sourcing this library! Cheers. What are you using Bayes++ for? It always interests me. Regards, Michael |
From: Hanno S. <han...@in...> - 2012-09-28 10:18:50
|
Hi, I have a question regarding contributions. We have a few fixes and additions and would like to provide them upstream for consideration. What is the preferred way? Just patches against the current release? The Subversion repo seems out of date and the git repo also is not accessible. (We use both systems here...) Btw. thanks for open-sourcing this library! Best regards Hanno |
From: Oliver <oli...@gm...> - 2011-07-04 13:45:21
|
Hey, I'm trying to use the Bayes++ lib, but for me it's very difficult to use due to the sparse documentation. I want to set up a standard SIR implementation, first linear with additive Gaussian noise. Is there any example of this? I am stuck implementing a derived class of Likelihood_observe_model. Is there a sample implementation anywhere? It would be great if anyone could help me. I would like to use this lib, but once again, it's not so well documented and it is not very clear what has to be implemented and what works out of the box. Last question: is there any other lib for particle filtering? I need one really urgently. Thanks a lot, Oli |
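A note on the 'Likelihood_observe_model' question: the central quantity any SIR/particle-filter observation model has to supply is the likelihood p(z | x) evaluated at each particle state. Below is a library-independent sketch for a linear measurement with additive Gaussian noise; the function and parameter names are illustrative assumptions, not the actual Bayes++ 'Likelihood_observe_model' interface, whose exact virtual function should be taken from the Bayes++ headers.

```cpp
// Sketch only: the likelihood a SIR/particle filter needs from an observation
// model with additive Gaussian noise, z = H*x + v, v ~ N(0, sigma2), for a
// scalar measurement. All names here are illustrative, not Bayes++ API.
#include <cmath>
#include <cstddef>
#include <vector>

double gaussian_likelihood(const std::vector<double>& x,  // particle state
                           const std::vector<double>& H,  // observation row vector
                           double z,                      // actual measurement
                           double sigma2)                 // measurement variance
{
    // Predicted measurement h(x) = H * x (linear case)
    double hx = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i)
        hx += H[i] * x[i];

    // Unnormalised Gaussian density of the innovation z - h(x); the
    // normalising constant can be dropped because particle weights are
    // renormalised anyway.
    const double innov = z - hx;
    return std::exp(-0.5 * innov * innov / sigma2);
}
```

In a class derived from 'Likelihood_observe_model', this is the value that would be returned for each particle given the stored observation z.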
From: Ben P. <bpo...@gm...> - 2010-02-21 07:11:15
|
Hi all, First of all, I want to say that I am fairly new to filtering, so please let me know if I am making conceptual mistakes here. I have read through the old mailing list archives, but I still have a few questions about how I should go about implementing two filters. So I am working on setting up Bayes++ to implement the two filters I mentioned above, an EKF and a UKF. The state of the system I am dealing with is described in differential form (the specific equations are generated at runtime): Xk' = f(Xk) + noise. f in general can be a highly nonlinear function. The measurement model is linear. The first filter I set about implementing is the Unscented Kalman Filter. Using Runge-Kutta numerical methods I can accurately do a state update between measurements (which may be quite far apart), so I extended Addative_predict_model and overrode f(X) with a method to do the numerical integration to advance the state a time step, and then just instantiated an Unscented_filter and passed it this as a predict model. Is this the correct way to go about implementing an Unscented filter for this system? It seems to perform well, but I'm hardly a trained eye... My goal is to next implement the Extended Kalman Filter, but I am a bit stuck here. Given that f represents the state of my system with respect to time, I'm not sure how I'd go about finding the derivative with respect to my state variables (e.g. to calculate the covariance update). What I think I need are update equations of the form here: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=85891&userType=inst ("Design and Implementation of an Extended Kalman Filter for the State Estimation of a Permanent Magnet Synchronous Motor", the paper basically describes the situation I'm facing.) I'm not sure how I would go about extending the EKF here to use numerical methods for state update and covariance update... how would you suggest doing this? Sorry if what I've asked doesn't make sense. Let me know if I should clarify anything I've written here. Thanks in advance, Ben Podgursky |
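A library-independent sketch of the kind of propagation Ben describes: a fixed-step RK4 integration of x' = f(x) that a predict model's f(X) could call to advance the state between measurements. Types and names here are illustrative assumptions, not the Bayes++ predict-model interface. For the EKF part of the question, the same routine can be differenced numerically around x (perturb one state element at a time) to approximate the Jacobian needed for the covariance update.

```cpp
// Sketch only: classic fixed-step RK4 integration of xdot = f(x), the kind of
// state propagation described above. Types and names are illustrative,
// not Bayes++ API.
#include <cstddef>
#include <functional>
#include <vector>

using State = std::vector<double>;
using Deriv = std::function<State(const State&)>;   // continuous-time dynamics f(x)

// Helper: return x + a * k, element by element.
static State axpy(const State& x, const State& k, double a)
{
    State r(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) r[i] = x[i] + a * k[i];
    return r;
}

// Advance x over an interval dt_total using n_steps fixed RK4 sub-steps.
State rk4_propagate(const Deriv& f, State x, double dt_total, int n_steps)
{
    const double h = dt_total / n_steps;
    for (int s = 0; s < n_steps; ++s) {
        State k1 = f(x);
        State k2 = f(axpy(x, k1, 0.5 * h));
        State k3 = f(axpy(x, k2, 0.5 * h));
        State k4 = f(axpy(x, k3, h));
        for (std::size_t i = 0; i < x.size(); ++i)
            x[i] += (h / 6.0) * (k1[i] + 2.0 * k2[i] + 2.0 * k3[i] + k4[i]);
    }
    return x;
}
```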
From: Michael S. <ma...@mi...> - 2009-07-08 08:59:10
|
Hi All, I recently submitted a patch to fix the problem in Boost, which has been committed. If you need to use Boost 1.39 then you can fix it by using the uBLAS library from the Boost SVN Head. It would be good if someone else can confirm that the patch works for them. The next release of Boost will obviously have the fix in it. Michael -- ___________________________________ Michael Stevens Systems Engineering 34128 Kassel, Germany Phone/Fax: +49 561 5218038 Mobile: +49 1577 7807325 Navigation Systems, Estimation and Bayesian Filtering http://bayesclasses.sf.net ___________________________________ |
From: Rishi R. M. <ris...@ma...> - 2009-07-07 20:15:15
|
Solution found: reverting to an older version of Boost (v1.36.0) solved this. ________________________________________ From: Rishi Rajalingham, Mr Sent: Tuesday, July 07, 2009 3:07 PM To: bay...@li... Subject: Error building Bayes++ Hi, I've successfully built Boost 1.39.0 (using boost-jam-3.1.17-1 and boost-build) and tested it on given examples to confirm that it is indeed working. I'm fairly certain I've properly set up boost-build, and correctly compiling the Bayes++ library (bjam --v2 -sBOOST_ROOT="../boost_1_39_0" > output). The target folder is created, but for the most part empty (only contains a few .o files) and I get the attached error message. Has anyone experienced this before? Help would be greatly appreciated. Best, Rishi |
From: Rishi R. M. <ris...@ma...> - 2009-07-07 19:07:40
|
Hi, I've successfully built Boost 1.39.0 (using boost-jam-3.1.17-1 and boost-build) and tested it on the given examples to confirm that it is indeed working. I'm fairly certain I've properly set up boost-build and am correctly compiling the Bayes++ library (bjam --v2 -sBOOST_ROOT="../boost_1_39_0" > output). The target folder is created but is for the most part empty (it only contains a few .o files), and I get the attached error message. Has anyone experienced this before? Help would be greatly appreciated. Best, Rishi |
From: Michael S. <ma...@mi...> - 2008-10-16 16:14:38
|
Hi Jan, On Thursday 16 October 2008, Jan Ploski wrote: > Hi, > > I'm trying to use Bayes++ to implement a linear Kalman filter which > improves the coefficients of a linear multiple regression model (z(k) = > b0(k) + b1(k)*x1(k) + b2(k)*x2(k)) as new data (y) comes in incrementally. > The initial coefficients b0, b1, b2 which the filter's state represents > are computed based on a set of historical data. On input, I have as new > "observation" the actual measured scalar value z(k), and also the values > of the 2 independent variables (3 if you also count the constant term) > with which my regression model works. I suppose I should put these values > into Hx before each observe step. For my initial tests, I started with the > Simple example and parametrized it as follows: > > Linear_predict_model(3,3) with > // identity > Fx(0,0) = 1.; > Fx(0,1) = 1.; > Fx(0,2) = 1.; I suspect you want the identity matrix here: Fx(0,0) = 1.; Fx(1,1) = 1.; Fx(2,2) = 1.; The coefficients' predicted values are their previous values. > // no noise in process step from k->k+1 > q[0] = q[1] = q[2] = 0; > G(0,0) = 1.; > G(1,1) = 1.; > G(2,2) = 1.; OK > Linear_uncorrelated_observe_model(3,1) with > Hx(0,0) = 1; // constant term > Hx(0,1) = 77; // value of independent variable x1 > Hx(0,2) = 92; // value of independent variable x2 > // no noise in observation > Zv[0] = 0.; Hmmm... but I suspect there really is noise in the observation, and you should model this to get correct answers. > My initial state (the coefficients) is > x_init[0] = 1.63684; > x_init[1] = 2.02357; > x_init[2] = -0.0381261; OK > with the covariance matrix X being 0. This is where things go really wrong. > What does it mean and how can I avoid it? The exception 'S not PD in observe' tells you that you have done something numerically, and in this case mathematically, wrong. You have set up a system where the state has no uncertainty: Fx. X. Fx' + G.q.G' = 0 and also the observation has no uncertainty: Z = 0 This is not possible! To get started you will need to place some initial uncertainty in X. After that you will also need to guess what the noise is in the observed variable. You will then have a solvable problem. The better your guesses of these noises, the better the solution will be! > Is my choice of the Simple > example as the basis for my model appropriate? Yep, no problems. Hope that helps a bit. One thing I find useful is to write out the Kalman filter equations for a 1D problem so there are no matrices to confuse things. That way you can get some idea of how state uncertainty X, prediction noise Q, and observation noise Z affect the result. All the best, Michael -- ___________________________________ Michael Stevens Systems Engineering 34128 Kassel, Germany Phone/Fax: +49 561 5218038 Navigation Systems, Estimation and Bayesian Filtering http://bayesclasses.sf.net ___________________________________ |
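Following Michael's suggestion, a hand-written 1D Kalman filter (plain C++, independent of Bayes++; all numbers are illustrative) makes the roles of the initial uncertainty X, the process noise q and the observation noise Z visible, and shows why X = 0 together with Z = 0 gives an innovation variance S = 0, which is the situation 'S not PD in observe' rejects:

```cpp
// 1D Kalman filter written out by hand, to see how the initial uncertainty X,
// the process noise q and the observation noise Z interact.
// Plain C++, independent of Bayes++; numbers are illustrative only.
#include <cstdio>

int main()
{
    double x = 1.6;          // state estimate
    double X = 1.0;          // state variance (must be > 0 to get started)
    const double q = 0.01;   // process noise variance
    const double Z = 0.5;    // observation noise variance (must be > 0)

    const double z[] = {2.1, 1.9, 2.3, 2.0};   // illustrative measurements of x

    for (double zk : z) {
        // Predict: x stays the same (F = 1), uncertainty grows by q
        X += q;

        // Observe (H = 1): innovation variance S = X + Z.
        // If X and Z were both zero, S would be zero and the update
        // would be undefined; that is what the filter must reject.
        const double S = X + Z;
        const double K = X / S;      // Kalman gain
        x += K * (zk - x);           // state update
        X *= (1.0 - K);              // variance update
        std::printf("x = %.4f  X = %.4f\n", x, X);
    }
}
```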
From: Jan P. <Jan...@of...> - 2008-10-16 12:47:53
|
Hi, I'm trying to use Bayes++ to implement a linear Kalman filter which improves the coefficients of a linear multiple regression model (z(k) = b0(k) + b1(k)*x1(k) + b2(k)*x2(k)) as new data (y) comes in incrementally. The initial coefficients b0, b1, b2 which the filter's state represents are computed based on a set of historical data. On input, I have as new "observation" the actual measured scalar value z(k), and also the values of the 2 independent variables (3 if you also count the constant term) with which my regression model works. I suppose I should put these values into Hx before each observe step. For my initial tests, I started with the Simple example and parametrized it as follows: Linear_predict_model(3,3) with // identity Fx(0,0) = 1.; Fx(0,1) = 1.; Fx(0,2) = 1.; // no noise in process step from k->k+1 q[0] = q[1] = q[2] = 0; G(0,0) = 1.; G(1,1) = 1.; G(2,2) = 1.; Linear_uncorrelated_observe_model(3,1) with Hx(0,0) = 1; // constant term Hx(0,1) = 77; // value of independent variable x1 Hx(0,2) = 92; // value of independent variable x2 // no noise in observation Zv[0] = 0.; My initial state (the coefficients) is x_init[0] = 1.63684; x_init[1] = 2.02357; x_init[2] = -0.0381261; with the covariance matrix X being 0. Now, when I run the first predict-update-observe sequence, feeding it the observed value 285.35 (the multiple regression model would at this point quite inaccurately predict 153.944), I get the exception: Initial [3](1.6368,2.0236,-0.0381) *** [3,3]((0.0000,0.0000,0.0000),(0.0000,0.0000,0.0000),(0.0000,0.0000,0.0000)) Predict [3](3.6223,0.0000,0.0000) *** [3,3]((0.0000,0.0000,0.0000),(0.0000,0.0000,0.0000),(0.0000,0.0000,0.0000)) terminate called after throwing an instance of 'Bayesian_filter::Numeric_exception' what(): S not PD in observe What does it mean and how can I avoid it? Is my choice of the Simple example as the basis for my model appropriate? Regards, Jan Ploski |
From: Michael S. <ma...@mi...> - 2008-08-27 19:44:19
|
On Wednesday 27 August 2008, Carles Fernandez wrote: > Hi everybody, > > I just discovered Bayes++ surfing the web and I am beginning to play with > it. This mail is just for saluting the community and to report that I've > been able to compile Bayes++ on Ubuntu 8.04 with gcc-4-2-3, boost 1.36.0, > boost-jam-3.1.16-1-linuxx86 and boost-build-2.0-m12. > > Cheers, > Carles. Thanks, I occasionally check if there are any problems compiling with newer compilers and Boost versions. It is nice to hear when things work without difficulty. All the best, Michael -- ___________________________________ Michael Stevens Systems Engineering 34128 Kassel, Germany Phone/Fax: +49 561 5218038 Navigation Systems, Estimation and Bayesian Filtering http://bayesclasses.sf.net ___________________________________ |
From: Carles F. <car...@gm...> - 2008-08-27 12:57:23
|
Hi everybody, I just discovered Bayes++ surfing the web and I am beginning to play with it. This mail is just to salute the community and to report that I've been able to compile Bayes++ on Ubuntu 8.04 with gcc-4-2-3, boost 1.36.0, boost-jam-3.1.16-1-linuxx86 and boost-build-2.0-m12. Cheers, Carles. |
From: Nicola B. <nb...@es...> - 2008-02-25 12:21:17
|
Hello Zhang, I don't think Bayes++ can deal with data association and unfortunately I don't think there is any library with such capabilities. My suggestion is to have a look at the following book: Bar-Shalom, Y. & Li, X.R. Multitarget-Multisensor Tracking: Principles and Techniques Y. Bar-Shalom, 1995 which explains all the most famous algorithms for data association, including NN, JPDA and MHT. Then you can implement them according to your specific application. Hope it helps. Regards, Nicola On Sunday 24 February 2008 12:59:04 Zhang Xinzheng wrote: > Dear Nicola, > > Thanks for your reply. As I know from your personal website, your PhD Topic > is Multisensor Data Fusion for Simultaneous People Tracking and Recognition > with Service Robots. The data association is a very important issue in > target tracking, would you please tell me whether Bayes++ class can cope > with this problem or not? Many popular data association algorithms, such as > nearest neighbour, require the extra statistical techniques. I have read > the PV and PV_SIR examples, there is not any code related to data > association. Does the Bayes++ include some DA algorithm? Could you give me > some explaination and how you can deal with the data association by the > Bayse++ class? > > Many thanks. > > Yours sincerely, > Zhang Xinzheng > ============================ > The Hong Kong Polytechnic University > Department of Electrical Engineering > Hung Hom, Kowloon > Hong Kong > Tel: +852-2766 4276 > E-Mail: eex...@po... > > > > > -------- Original Message -------- > > Subject: Re: Some questions on SLAM class of Bayes++ > Date: 2008-02-24 04:25:34 > From: Nicola Bellotto > To: Zhang Xinzheng > CC: bay...@li... > > Dear Zhang, > > Sorry for the late reply, I was busy with some paper deadlines. > Unfortunately I am afraid I cannot help you with this as I don't know the > class you mentioned. The examples PV and PV_SIR might be simpler to > understand and more useful for a new Bayes++ user (at least they were for > me). > > If you have more specific questions about the Covariance_scheme (EKF), > Unscented_scheme (UKF) or SIR_scheme (particle filter), which are the > classes I use the most, I'm glad to help. Otherwise, you already wrote on > the right place, as probably you'll get your answer soon from Michael on > the mailing-list. > > Regards, > > Nicola > > On Wednesday 20 February 2008 04:11:39 you wrote: > > Dear Nicola, > > > > I am sorry to bother you. I hope to use the SLAM class of Bayes++ to my > > research. As a freshman, I have read the mailing lists of Bayes++. Some > > of your topics are very helpful for me, but a few questions on SLAM class > > confused me. From the source files of this class, I found there is no > > data association (DA) for the observed features, and in the SLAM example > > 'observe' and 'observe_new' seem to contribute to the DA. I am not sure > > my thought is right or not. Actually, it is not clear for me to > > understand the observation models 'Feature_observe' and > > 'Feature_observe_inverse', and the functions 'obseve' and 'observe_new'. > > > > Could you give me some advice on these questions? Thanks. > > I look forward to hearing your reply. > > > > Yours sincerely, > > Zhang Xinzheng > > ============================ > > The Hong Kong Polytechnic University > > Department of Electrical Engineering > > Hung Hom, Kowloon > > Hong Kong > > Tel: +852-2766 4276 > > E-Mail: eex...@po... -- ------------------------------------------ Nicola Bellotto University of Essex Dept. of Computing and Electronic Systems Wivenhoe Park Colchester CO4 3SQ United Kingdom Room: 1N1.2.8 Tel. +44 (0)1206 872477 URL: http://privatewww.essex.ac.uk/~nbello ------------------------------------------ |
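As a rough illustration of the nearest-neighbour rule Nicola mentions (Bayes++ itself leaves data association to the application): associate each track with the measurement whose Mahalanobis distance from the predicted measurement is smallest, subject to a gate. The sketch below is library independent; the diagonal innovation covariance and the 99% two-degree-of-freedom gate value of 9.21 are illustrative assumptions.

```cpp
// Illustrative sketch of nearest-neighbour data association with a
// Mahalanobis gate; not part of Bayes++. Uses 2-D measurements and a
// diagonal innovation covariance S for simplicity.
#include <cstddef>
#include <limits>
#include <vector>

struct Meas { double x, y; };

// Return the index of the measurement closest (in Mahalanobis distance) to
// the predicted measurement, or -1 if none falls inside the gate.
int nearest_neighbour(const Meas& predicted,
                      const std::vector<Meas>& measurements,
                      double Sx, double Sy,   // diagonal innovation variances
                      double gate2)           // gate on squared distance (e.g. 9.21 for 99%, 2 dof)
{
    int best = -1;
    double best_d2 = std::numeric_limits<double>::max();
    for (std::size_t i = 0; i < measurements.size(); ++i) {
        const double dx = measurements[i].x - predicted.x;
        const double dy = measurements[i].y - predicted.y;
        const double d2 = dx * dx / Sx + dy * dy / Sy;  // squared Mahalanobis distance
        if (d2 < gate2 && d2 < best_d2) {
            best_d2 = d2;
            best = static_cast<int>(i);
        }
    }
    return best;
}
```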
From: Zhang X. <eex...@po...> - 2008-02-24 13:00:31
|
Dear Nicola, Thanks for your reply. As I know from your personal website, your PhD topic is Multisensor Data Fusion for Simultaneous People Tracking and Recognition with Service Robots. Data association is a very important issue in target tracking; would you please tell me whether the Bayes++ classes can cope with this problem or not? Many popular data association algorithms, such as nearest neighbour, require extra statistical techniques. I have read the PV and PV_SIR examples; there is no code related to data association. Does Bayes++ include any DA algorithm? Could you give me some explanation of how to deal with data association using the Bayes++ classes? Many thanks. Yours sincerely, Zhang Xinzheng ============================ The Hong Kong Polytechnic University Department of Electrical Engineering Hung Hom, Kowloon Hong Kong Tel: +852-2766 4276 E-Mail: eex...@po... -------- Original Message -------- Subject: Re: Some questions on SLAM class of Bayes++ Date: 2008-02-24 04:25:34 From: Nicola Bellotto To: Zhang Xinzheng CC: bay...@li... Dear Zhang, Sorry for the late reply, I was busy with some paper deadlines. Unfortunately I am afraid I cannot help you with this as I don't know the class you mentioned. The examples PV and PV_SIR might be simpler to understand and more useful for a new Bayes++ user (at least they were for me). If you have more specific questions about the Covariance_scheme (EKF), Unscented_scheme (UKF) or SIR_scheme (particle filter), which are the classes I use the most, I'm glad to help. Otherwise, you already wrote on the right place, as probably you'll get your answer soon from Michael on the mailing-list. Regards, Nicola On Wednesday 20 February 2008 04:11:39 you wrote: > Dear Nicola, > > I am sorry to bother you. I hope to use the SLAM class of Bayes++ to my > research. As a freshman, I have read the mailing lists of Bayes++. Some of > your topics are very helpful for me, but a few questions on SLAM class > confused me. From the source files of this class, I found there is no data > association (DA) for the observed features, and in the SLAM example > 'observe' and 'observe_new' seem to contribute to the DA. I am not sure my > thought is right or not. Actually, it is not clear for me to understand the > observation models 'Feature_observe' and 'Feature_observe_inverse', and the > functions 'obseve' and 'observe_new'. > > Could you give me some advice on these questions? Thanks. > I look forward to hearing your reply. > > Yours sincerely, > Zhang Xinzheng > ============================ > The Hong Kong Polytechnic University > Department of Electrical Engineering > Hung Hom, Kowloon > Hong Kong > Tel: +852-2766 4276 > E-Mail: eex...@po... -- ------------------------------------------ Nicola Bellotto Dept. of Computing and Electronic Systems University of Essex Wivenhoe Park Colchester CO4 3SQ United Kingdom Room: 1N1.2.8 Tel. +44 (0)1206 872477 URL: http://privatewww.essex.ac.uk/~nbello ------------------------------------------ |
From: Nicola B. <nb...@es...> - 2008-02-23 20:26:50
|
Dear Zhang, Sorry for the late reply, I was busy with some paper deadlines. Unfortunately I am afraid I cannot help you with this as I don't know the class you mentioned. The examples PV and PV_SIR might be simpler to understand and more useful for a new Bayes++ user (at least they were for me). If you have more specific questions about the Covariance_scheme (EKF), Unscented_scheme (UKF) or SIR_scheme (particle filter), which are the classes I use the most, I'm glad to help. Otherwise, you already wrote on the right place, as probably you'll get your answer soon from Michael on the mailing-list. Regards, Nicola On Wednesday 20 February 2008 04:11:39 you wrote: > Dear Nicola, > > I am sorry to bother you. I hope to use the SLAM class of Bayes++ to my > research. As a freshman, I have read the mailing lists of Bayes++. Some of > your topics are very helpful for me, but a few questions on SLAM class > confused me. From the source files of this class, I found there is no data > association (DA) for the observed features, and in the SLAM example > 'observe' and 'observe_new' seem to contribute to the DA. I am not sure my > thought is right or not. Actually, it is not clear for me to understand the > observation models 'Feature_observe' and 'Feature_observe_inverse', and the > functions 'obseve' and 'observe_new'. > > Could you give me some advice on these questions? Thanks. > I look forward to hearing your reply. > > Yours sincerely, > Zhang Xinzheng > ============================ > The Hong Kong Polytechnic University > Department of Electrical Engineering > Hung Hom, Kowloon > Hong Kong > Tel: +852-2766 4276 > E-Mail: eex...@po... -- ------------------------------------------ Nicola Bellotto Dept. of Computing and Electronic Systems University of Essex Wivenhoe Park Colchester CO4 3SQ United Kingdom Room: 1N1.2.8 Tel. +44 (0)1206 872477 URL: http://privatewww.essex.ac.uk/~nbello ------------------------------------------ |
From: Robert Z. <eer...@ya...> - 2008-02-20 02:33:38
|
Dear all, My research focuses on SLAM and I am confused by two member functions in the SLAM class, the "observe" and "observe_new". Could anyone explain them more in detail? Many thanks. Regards, Zhang Xinzheng |
From: Robert Z. <eer...@ya...> - 2007-11-01 02:53:03
|
Thanks for your help, Michael. Actually, my research focuses on SLAM and I hope to use the SLAM class that you have done. I am confused by two member functions in this class, the "observe" and "observe_new". Could you explain them more in detail? I look forward to hearing your reply. Regards, Zhang Xinzheng |
From: Michael S. <ma...@mi...> - 2007-10-31 11:19:13
|
On Tuesday 30 October 2007, Robert Zhang wrote: > Dear all, > > I have read the post name "Covariance matrix Q" in this mail list. I have a > question on the covariance matrix R which is used in the linearize observe > model when the EKF is employed. Is it generated similar to Q? However, I do > not find any Jacobian matrix same as G in the > "Linrz_uncorrelated_observe_model. Thanks for any help. The observation covariance matrix R is used directly in Bayes++. For notational consistency it is called 'Z' in the Bayes++ observe models. Therefore 'Linrz_correlated_observe_model' has a symmetric matrix (SymMatrix) called 'Z'. It is very common that the additive noise in observation models is uncorrelated. In this case 'Z' is a diagonal matrix. You can then use the 'Linrz_uncorrelated_observe_model', where the vector 'Zv' represents the observation variances. For example, if you have a range+angle measurement (such as a radar) you can normally model the measurement as having additive noise in range and additive noise in angle. These are usually uncorrelated. You can then measure/guess their variances and place them in Zv[0] and Zv[1]. Regards, Michael -- ___________________________________ Michael Stevens Systems Engineering 34128 Kassel, Germany Phone/Fax: +49 561 5218038 Navigation Systems, Estimation and Bayesian Filtering http://bayesclasses.sf.net ___________________________________ |
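To tie this to code: a sketch that mirrors the usage already shown in this thread (Jan Ploski's Linear_uncorrelated_observe_model(3,1) setup with Hx and Zv). Everything beyond what the messages themselves state, in particular the include path and the example variance values, is an assumption and should be checked against the Bayes++ sources.

```cpp
// Sketch based only on the usage shown in this thread: an uncorrelated
// linear observation model for a 2-element measurement, with one variance
// per measurement element placed in Zv as described above.
// The header path below is an assumption; check the Bayes++ source tree.
#include "BayesFilter/bayesFlt.hpp"

void setup_observe_model()
{
    // 3 state variables observed through a 2-element measurement
    Bayesian_filter::Linear_uncorrelated_observe_model obs(3, 2);

    // Observation matrix Hx (2 x 3): each row maps the state to one measurement
    obs.Hx(0, 0) = 1.;  obs.Hx(0, 1) = 0.;  obs.Hx(0, 2) = 0.;
    obs.Hx(1, 0) = 0.;  obs.Hx(1, 1) = 1.;  obs.Hx(1, 2) = 0.;

    // Uncorrelated observation noise: measured/guessed variance per element
    // (illustrative values only).
    obs.Zv[0] = 0.25;   // variance of the first measurement element
    obs.Zv[1] = 0.04;   // variance of the second measurement element
}
```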